Development of an object-oriented ORIGEN for advanced nuclear fuel modeling applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skutnik, S.; Havloej, F.; Lago, D.
2013-07-01
The ORIGEN package serves as the core depletion and decay calculation module within the SCALE code system. A recent major refactoring of the ORIGEN code architecture, carried out as part of an overall modernization of the SCALE code system, has both greatly enhanced its maintainability and afforded several new capabilities useful for incorporating depletion analysis into other code frameworks. This paper presents an overview of the improved ORIGEN code architecture (including the methods and data structures introduced) as well as current and potential future applications utilizing the new ORIGEN framework. (authors)
Improvements of MCOR: A Monte Carlo depletion code system for fuel assembly reference calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tippayakul, C.; Ivanov, K.; Misu, S.
2006-07-01
This paper presents the improvements of MCOR, a Monte Carlo depletion code system for fuel assembly reference calculations. The improvements to MCOR were initiated through cooperation between Penn State University and AREVA NP to enhance the original Penn State University MCOR version for use as a new Monte Carlo depletion analysis tool. Essentially, a new depletion module using KORIGEN replaces the existing ORIGEN-S depletion module in MCOR. Furthermore, online burnup cross-section generation by the Monte Carlo calculation is implemented in the improved version instead of using a burnup cross-section library pre-generated by a transport code. Other code features have also been added to make the new MCOR version easier to use. This paper also presents comparisons of the original and improved MCOR versions against CASMO-4 and OCTOPUS. The comparisons showed quite significant improvements of the results in terms of k-infinity, fission rate distributions, and isotopic contents. (authors)
Status Report on NEAMS PROTEUS/ORIGEN Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieselquist, William A
2016-02-18
The US Department of Energy’s Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program has contributed significantly to the development of the PROTEUS neutron transport code at Argonne National Laboratory and of the Oak Ridge Isotope Generation and Depletion (ORIGEN) depletion/decay code at Oak Ridge National Laboratory. PROTEUS’s key capability is its efficient and scalable (up to hundreds of thousands of cores) neutron transport solver on general, unstructured, three-dimensional finite-element-type meshes. The scalability and mesh generality enable the transfer of neutron and power distributions to other codes in the NEAMS toolkit for advanced multiphysics analysis. Recently, ORIGEN has received considerable modernization to provide the high-performance depletion/decay capability within the NEAMS toolkit. This work presents a description of the initial integration of ORIGEN in PROTEUS, mainly performed during FY 2015, with minor updates in FY 2016.
Nuclear Fuel Depletion Analysis Using Matlab Software
NASA Astrophysics Data System (ADS)
Faghihi, F.; Nematollahi, M. R.
Coupled first-order IVPs arise frequently in many areas of engineering and science. In this article, we present a code consisting of three computer programs, used together with the Matlab software, to solve and plot the solutions of first-order coupled stiff or non-stiff IVPs. Several engineering and scientific problems involving IVPs are given, and fuel depletion (production of the 239Pu isotope) in a pressurized water reactor (PWR) is computed with the present code.
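As an illustration of the kind of calculation described above, the sketch below solves a simplified U-238 → U-239 → Np-239 → 239Pu production chain under constant flux with a stiff ODE solver. It is written in Python (SciPy) rather than Matlab, and the one-group cross sections, flux, and initial density are assumed illustrative values, not data from the article.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative one-group constants (assumptions, not values from the article)
phi      = 3.0e13                           # neutron flux, n/cm^2/s
sig_c238 = 2.7e-24                          # U-238 capture cross section, cm^2
sig_a239 = 1.0e-21                          # Pu-239 absorption cross section, cm^2
lam_u239  = np.log(2) / (23.45 * 60)        # U-239 decay constant, 1/s
lam_np239 = np.log(2) / (2.356 * 86400)     # Np-239 decay constant, 1/s

def bateman(t, n):
    """Coupled production/destruction chain U-238 -> U-239 -> Np-239 -> Pu-239."""
    u238, u239, np239, pu239 = n
    return [
        -sig_c238 * phi * u238,
         sig_c238 * phi * u238 - lam_u239 * u239,
         lam_u239 * u239 - lam_np239 * np239,
         lam_np239 * np239 - sig_a239 * phi * pu239,
    ]

n0 = [2.2e22, 0.0, 0.0, 0.0]                # initial U-238 density, atoms/cm^3
t_end = 3.0e7                               # roughly one year of irradiation, s
sol = solve_ivp(bateman, (0.0, t_end), n0, method="BDF", rtol=1e-8, atol=1.0)

print(f"Pu-239 number density after {t_end/86400:.0f} days: "
      f"{sol.y[3, -1]:.3e} atoms/cm^3")
```

The widely different decay and transmutation rates make the system stiff, which is why an implicit method (BDF) is used here.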
Nuclide Depletion Capabilities in the Shift Monte Carlo Code
Davidson, Gregory G.; Pandya, Tara M.; Johnson, Seth R.; ...
2017-12-21
A new depletion capability has been developed in the Exnihilo radiation transport code suite. This capability enables massively parallel domain-decomposed coupling between the Shift continuous-energy Monte Carlo solver and the nuclide depletion solvers in ORIGEN to perform high-performance Monte Carlo depletion calculations. This paper describes this new depletion capability and discusses its various features, including a multi-level parallel decomposition, high-order transport-depletion coupling, and energy-integrated power renormalization. Several test problems are presented to validate the new capability against other Monte Carlo depletion codes, and the parallel performance of the new capability is analyzed.
Turtle 24.0 diffusion depletion code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Altomare, S.; Barry, R.F.
1971-09-01
TURTLE is a two-group, two-dimensional (x-y, x-z, r-z) neutron diffusion code featuring a direct treatment of the nonlinear effects of xenon, enthalpy, and Doppler. Fuel depletion is allowed. TURTLE was written for the study of azimuthal xenon oscillations, but the code is useful for general analysis. The input is simple, fuel management is handled directly, and a boron criticality search is allowed. Ten thousand space points are allowed (over 20,000 with diagonal symmetry). TURTLE is written in FORTRAN IV and is tailored for the present CDC-6600. The program is core-contained. Provision is made to save data on tape for future reference.
NASA Astrophysics Data System (ADS)
Tsilanizara, A.; Gilardi, N.; Huynh, T. D.; Jouanne, C.; Lahaye, S.; Martinez, J. M.; Diop, C. M.
2014-06-01
Knowledge of the decay heat and the associated uncertainties is an important issue for the safety of nuclear facilities. Many codes are available to estimate the decay heat; ORIGEN, FISPACT, and DARWIN/PEPIN2 are among them. MENDEL is a new depletion code developed at CEA, with a new software architecture, devoted to the calculation of physical quantities related to fuel cycle studies, in particular decay heat. The purpose of this paper is to present a probabilistic approach to assess decay heat uncertainty due to the decay data uncertainties from nuclear data evaluations such as JEFF-3.1.1 or ENDF/B-VII.1. This probabilistic approach is based both on the MENDEL code and on the URANIE software, which is a CEA uncertainty analysis platform. As preliminary applications, single thermal fission of uranium-235 and plutonium-239 and a PWR UOx spent fuel cell are investigated.
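A minimal sketch of the probabilistic idea described above is given below, assuming a toy two-nuclide inventory and assumed (not evaluated) decay-data uncertainties; it samples decay constants and mean decay energies and propagates them to the decay heat, in the spirit of, but far simpler than, the MENDEL/URANIE calculation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy inventory after shutdown: two fictitious nuclides (all values assumed)
N0   = np.array([1.0e20, 5.0e19])          # atoms
lam0 = np.array([1.0e-4, 2.0e-6])          # decay constants, 1/s (nominal)
Q0   = np.array([1.2, 0.4])                # mean decay energy per decay, MeV (nominal)
u_lam, u_Q = 0.02, 0.05                    # assumed relative 1-sigma uncertainties

MEV_TO_W = 1.602e-13                       # MeV/s -> W
t_cool = 3600.0                            # cooling time, s
n_samples = 10000

heat = np.empty(n_samples)
for k in range(n_samples):
    lam = lam0 * (1.0 + u_lam * rng.normal(size=lam0.size))  # sampled decay data
    Q   = Q0   * (1.0 + u_Q   * rng.normal(size=Q0.size))
    N_t = N0 * np.exp(-lam * t_cool)                         # decay (no chains here)
    heat[k] = np.sum(lam * N_t * Q) * MEV_TO_W               # decay heat, W

print(f"decay heat at t = {t_cool:.0f} s: "
      f"{heat.mean():.3e} W +/- {heat.std(ddof=1):.3e} W (1 sigma)")
```

The spread of the sampled decay-heat values is the propagated uncertainty due to the decay data alone; a full analysis would also sample fission yields and treat complete decay chains.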
Gadolinia depletion analysis by CASMO-4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobayashi, Y.; Saji, E.; Toba, A.
1993-01-01
CASMO-4 is the most recent version of the lattice physics code CASMO introduced by Studsvik. The principal aspects of the CASMO-4 model that differ from the models in previous CASMO versions are as follows: (1) a heterogeneous model for two-dimensional transport theory calculations; and (2) a microregion depletion model for burnable absorbers, such as gadolinia. Of these aspects, the first has previously been benchmarked against measured data from critical experiments and Monte Carlo calculations, verifying its high degree of accuracy. To proceed with CASMO-4 benchmarking, it is desirable to benchmark the microregion depletion model, which enables CASMO-4 to calculate gadolinium depletion directly without the need for precalculated MICBURN cross-section data. This paper presents the benchmarking results for the microregion depletion model in CASMO-4 using the measured data of depleted gadolinium rods.
NASA Astrophysics Data System (ADS)
Rizzo, Axel; Vaglio-Gaudard, Claire; Martin, Julie-Fiona; Noguère, Gilles; Eschbach, Romain
2017-09-01
DARWIN2.3 is the reference package used for fuel cycle applications in France. It solves the Boltzmann and Bateman equations in a coupled manner, with the European JEFF-3.1.1 nuclear data library, to compute the fuel cycle quantities of interest. It includes both the deterministic transport codes APOLLO2 (for light water reactors) and ERANOS2 (for fast reactors) and the DARWIN/PEPIN2 depletion code, each of them developed by CEA/DEN with the support of its industrial partners. The DARWIN2.3 package has been experimentally validated for pressurized and boiling water reactors, as well as for sodium fast reactors; this experimental validation relies on the analysis of post-irradiation experiments (PIE). The DARWIN2.3 experimental validation work points out some isotopes for which the calculation of depleted concentrations can be improved. Some other nuclides have no available experimental validation, and their concentration calculation uncertainty is provided by the propagation of a priori nuclear data uncertainties. This paper describes the work plan of studies initiated this year to improve the accuracy of the DARWIN2.3 depleted material balance calculation for some nuclides of interest for the fuel cycle.
INL Results for Phases I and III of the OECD/NEA MHTGR-350 Benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom; Javier Ortensi; Sonat Sen
2013-09-01
The Idaho National Laboratory (INL) Very High Temperature Reactor (VHTR) Technology Development Office (TDO) Methods Core Simulation group led the construction of the Organization for Economic Cooperation and Development (OECD) Modular High Temperature Reactor (MHTGR) 350 MW benchmark for comparing and evaluating prismatic VHTR analysis codes. The benchmark is sponsored by the OECD's Nuclear Energy Agency (NEA), and the project will yield a set of reference steady-state, transient, and lattice depletion problems that can be used by the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and vendors to assess their code suites. The Methods group is responsible for defining the benchmark specifications, leading the data collection and comparison activities, and chairing the annual technical workshops. This report summarizes the latest INL results for Phase I (steady state) and Phase III (lattice depletion) of the benchmark. The INSTANT, Pronghorn, and RattleSnake codes were used for the standalone core neutronics modeling of Exercise 1, and the results obtained from these codes are compared in Section 4. Exercise 2 of Phase I requires the standalone steady-state thermal fluids modeling of the MHTGR-350 design, and the results for the systems code RELAP5-3D are discussed in Section 5. The coupled neutronics and thermal fluids steady-state solution for Exercise 3 is reported in Section 6, utilizing the newly developed Parallel and Highly Innovative Simulation for INL Code System (PHISICS)/RELAP5-3D code suite. Finally, the lattice depletion models and results obtained for Phase III are compared in Section 7. The MHTGR-350 benchmark proved to be a challenging set of problems to model accurately, and even with the simplifications introduced in the benchmark specification, this activity is an important step in the code-to-code verification of modern prismatic VHTR codes. A final OECD/NEA comparison report will compare the Phase I and III results of all international participants in 2014, while the remaining Phase II transient case results will be reported in 2015.
NASA Astrophysics Data System (ADS)
Dieudonne, Cyril; Dumonteil, Eric; Malvagi, Fausto; M'Backé Diop, Cheikh
2014-06-01
For several years, Monte Carlo burnup/depletion codes have been developed that couple Monte Carlo codes, which simulate the neutron transport, to deterministic methods, which handle the medium depletion due to the neutron flux. Solving the Boltzmann and Bateman equations in this way makes it possible to track fine three-dimensional effects and to avoid the multi-group approximations made by deterministic solvers. The counterpart is the prohibitive calculation time due to the Monte Carlo solver being called at each time step. In this paper we present a methodology to avoid these repetitive and time-expensive Monte Carlo simulations and to replace them with perturbation calculations: the successive burnup steps may be seen as perturbations of the isotopic concentrations of an initial Monte Carlo simulation. We first present this method and provide details on the perturbative technique used, namely correlated sampling. We then discuss the implementation of this method in the TRIPOLI-4® code, as well as the calculation scheme intended to bring an important speed-up of the depletion calculation. Finally, this technique is used to calculate the depletion of a PWR-like assembly at the beginning of its cycle. After validating the method against a reference calculation, we show that it can speed up standard Monte Carlo depletion calculations by nearly an order of magnitude.
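The following toy calculation illustrates the correlated-sampling idea on a one-group, one-dimensional slab: a single set of analog histories is run at a nominal composition, and each history carries a weight that re-expresses it under a perturbed absorber concentration (as might result from a burnup step). The geometry, cross sections, and source are hypothetical and unrelated to TRIPOLI-4.

```python
import numpy as np

rng = np.random.default_rng(42)

L = 4.0          # slab thickness, cm (assumed)
sig_s = 0.30     # scattering cross section, 1/cm (same in both cases)
sig_a0 = 0.10    # nominal absorption cross section, 1/cm
sig_a1 = 0.12    # perturbed absorption (e.g. after a burnup step)

def run(n_hist):
    """Analog transport at the nominal composition; correlated-sampling
    weights carry the estimate for the perturbed composition."""
    abs0 = abs1 = 0.0
    st0, st1 = sig_s + sig_a0, sig_s + sig_a1
    for _ in range(n_hist):
        x, mu, w = 0.0, 1.0, 1.0                     # enter at left face
        while True:
            s = -np.log(rng.random()) / st0          # flight length (nominal)
            d = (L - x) / mu if mu > 0.0 else x / -mu  # distance to a boundary
            if s >= d:                               # leaks out, no collision
                w *= np.exp(-(st1 - st0) * d)
                break
            x += mu * s
            w *= (st1 / st0) * np.exp(-(st1 - st0) * s)  # flight-to-collision factor
            if rng.random() < sig_a0 / st0:          # analog absorption
                abs0 += 1.0
                abs1 += w * (sig_a1 / st1) / (sig_a0 / st0)
                break
            w *= (sig_s / st1) / (sig_s / st0)       # scattering-selection factor
            mu = 2.0 * rng.random() - 1.0            # isotropic re-emission
            if mu == 0.0:
                mu = 1e-12                           # avoid a measure-zero division
    return abs0 / n_hist, abs1 / n_hist

p0, p1 = run(200_000)
print(f"absorption probability, nominal composition:   {p0:.4f}")
print(f"absorption probability, perturbed (correlated): {p1:.4f}")
```

Each weight factor is the likelihood ratio of the sampled event under the perturbed versus nominal cross sections, so the same histories yield an estimate for the perturbed composition without a second simulation.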
The Modeling of Advanced BWR Fuel Designs with the NRC Fuel Depletion Codes PARCS/PATHS
Ward, Andrew; Downar, Thomas J.; Xu, Y.; ...
2015-04-22
The PATHS (PARCS Advanced Thermal Hydraulic Solver) code was developed at the University of Michigan in support of U.S. Nuclear Regulatory Commission research to solve the steady-state, two-phase, thermal-hydraulic equations for a boiling water reactor (BWR) and to provide thermal-hydraulic feedback for BWR depletion calculations with the neutronics code PARCS (Purdue Advanced Reactor Core Simulator). The simplified solution methodology, including a three-equation drift flux formulation and an optimized iteration scheme, yields very fast run times in comparison to conventional thermal-hydraulic systems codes used in the industry, while still retaining sufficient accuracy for applications such as BWR depletion calculations. The capability to model advanced BWR fuel designs with part-length fuel rods and heterogeneous axial channel flow geometry has been implemented in PATHS, and the code has been validated against previously benchmarked advanced core simulators as well as BWR plant and experimental data. We describe the modifications to the codes and the results of the validation in this paper.
NASA Astrophysics Data System (ADS)
Fensin, Michael Lorne
Monte Carlo-linked depletion methods have gained recent interest due to their ability to more accurately model complex three-dimensional geometries and better track the evolution of the temporal nuclide inventory by simulating the actual physical process using continuous-energy coefficients. The integration of CINDER90 into the MCNPX Monte Carlo radiation transport code provides a high-fidelity, completely self-contained, Monte Carlo-linked depletion capability in a well-established, widely accepted Monte Carlo radiation transport code that is compatible with most nuclear criticality (KCODE) particle tracking features in MCNPX. MCNPX depletion tracks all necessary reaction rates and follows as many isotopes as cross-section data permit in order to achieve a highly accurate temporal nuclide inventory solution. This work chronicles relevant nuclear history, surveys current methodologies of depletion theory, details the methodology applied in MCNPX, and provides benchmark results for three independent OECD/NEA benchmarks. Relevant nuclear history, from the Oklo reactor two billion years ago to the current major United States nuclear fuel cycle development programs, is addressed in order to supply the motivation for the development of this technology. A survey of current reaction rate and temporal nuclide inventory techniques is then provided to offer justification for the depletion strategy applied within MCNPX. The MCNPX depletion strategy is then dissected, and each code feature is detailed, chronicling the methodology development from the original linking of MONTEBURNS and MCNP to the most recent public release of the integrated capability (MCNPX 2.6.F). Calculation results for the OECD/NEA Phase IB benchmark, the H. B. Robinson benchmark, and the OECD/NEA Phase IVB benchmark are then provided. The acceptable results of these calculations offer sufficient confidence in the predictive capability of the MCNPX depletion method. This capability establishes a significant foundation, in a well-established and supported radiation transport code, for further development of a Monte Carlo-linked depletion methodology, which is essential to the future development of advanced reactor technologies that exceed the limitations of current deterministic-based methods.
Impact of Reactor Operating Parameters on Cask Reactivity in BWR Burnup Credit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ilas, Germina; Betzler, Benjamin R; Ade, Brian J
This paper discusses the effect of reactor operating parameters used in fuel depletion calculations on spent fuel cask reactivity, with relevance for boiling-water reactor (BWR) burnup credit (BUC) applications. Assessments that used generic BWR fuel assembly and spent fuel cask configurations are presented. The considered operating parameters, which were independently varied in the depletion simulations for the assembly, included fuel temperature, bypass water density, specific power, and operating history. Different operating history scenarios were considered for the assembly depletion to determine the effect of relative power distribution during the irradiation cycles, as well as the downtime between cycles. Depletion, decay, and criticality simulations were performed using computer codes and associated nuclear data within the SCALE code system. Results quantifying the dependence of cask reactivity on the assembly depletion parameters are presented herein.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.
Cunningham, Michael R.; Baumeister, Roy F.
2016-01-01
The limited resource model states that self-control is governed by a relatively finite set of inner resources on which people draw when exerting willpower. Once self-control resources have been used up or depleted, they are less available for other self-control tasks, leading to a decrement in subsequent self-control success. The depletion effect has been studied for over 20 years, tested or extended in more than 600 studies, and supported in an independent meta-analysis (Hagger et al., 2010). Meta-analyses are supposed to reduce bias in literature reviews. Carter et al.’s (2015) meta-analysis, by contrast, included a series of questionable decisions involving sampling, methods, and data analysis. We provide quantitative analyses of key sampling issues: exclusion of many of the best depletion studies based on idiosyncratic criteria and the emphasis on mini meta-analyses with low statistical power as opposed to the overall depletion effect. We discuss two key methodological issues: failure to code for research quality, and the quantitative impact of weak studies by novice researchers. We discuss two key data analysis issues: questionable interpretation of the results of trim and fill and Funnel Plot Asymmetry test procedures, and the use and misinterpretation of the untested Precision Effect Test and Precision Effect Estimate with Standard Error (PEESE) procedures. Despite these serious problems, the Carter et al. (2015) meta-analysis results actually indicate that there is a real depletion effect – contrary to their title. PMID:27826272
Richard, Joshua; Galloway, Jack; Fensin, Michael; ...
2015-04-04
A novel object-oriented modular mapping methodology for externally coupled neutronics–thermal hydraulics multiphysics simulations was developed. The Simulator using MCNP with Integrated Thermal-Hydraulics for Exploratory Reactor Studies (SMITHERS) code performs on-the-fly mapping of material-wise power distribution tallies implemented by MCNP-based neutron transport/depletion solvers for use in estimating coolant temperature and density distributions with a separate thermal-hydraulic solver. The key development of SMITHERS is that it reconstructs the hierarchical geometry structure of the material-wise power generation tallies from the depletion solver automatically, with only a modicum of additional information required from the user. In addition, it performs the basis mapping from the combinatorial geometry of the depletion solver to the required geometry of the thermal-hydraulic solver in a generalizable manner, such that it can transparently accommodate varying levels of thermal-hydraulic solver geometric fidelity, from the nodal geometry of multi-channel analysis solvers to the pin-cell level of discretization for sub-channel analysis solvers.
MPACT Standard Input User s Manual, Version 2.2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, Benjamin S.; Downar, Thomas; Fitzgerald, Andrew
The MPACT (Michigan PArallel Characteristics based Transport) code is designed to perform high-fidelity light water reactor (LWR) analysis using whole-core pin-resolved neutron transport calculations on modern parallel-computing hardware. The code consists of several libraries which provide the functionality necessary to solve steady-state eigenvalue problems. Several transport capabilities are available within MPACT, including both 2-D and 3-D Method of Characteristics (MOC). A three-dimensional whole-core solution based on the 2D-1D solution method provides the capability for full-core depletion calculations.
1986-09-30
[Garbled front-matter table list; recoverable entries: Table I, "SA3240 Single Event Upset Test, 1140-MeV Krypton, 9/18/84"; Table II, "CRUP Simulation".] The cosmic ray interaction analysis described in the remainder of this report was calculated using the CRUP computer code modified for funneling. The CRUP code requires, as inputs, the size of a depletion region specified as a rectangular parallelepiped with dimensions a x b x c, and the effective funnel ...
Methods used to calculate doses resulting from inhalation of Capstone depleted uranium aerosols.
Miller, Guthrie; Cheng, Yung Sung; Traub, Richard J; Little, Tom T; Guilmette, Raymond A
2009-03-01
The methods used to calculate radiological and toxicological doses to hypothetical persons inside either a U.S. Army Abrams tank or Bradley Fighting Vehicle that has been perforated by depleted uranium munitions are described. Data from time- and particle-size-resolved measurements of depleted uranium aerosol as well as particle-size-resolved measurements of aerosol solubility in lung fluids for aerosol produced in the breathing zones of the hypothetical occupants were used. The aerosol was approximated as a mixture of nine monodisperse (single particle size) components corresponding to particle size increments measured by the eight stages plus the backup filter of the cascade impactors used. A Markov Chain Monte Carlo Bayesian analysis technique was employed, which straightforwardly calculates the uncertainties in doses. Extensive quality control checking of the various computer codes used is described.
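A highly simplified sketch of a Markov chain Monte Carlo (Metropolis) inference of an inhaled intake from noisy bioassay-like data is shown below; the single-exponential retention model, lognormal errors, and all parameter values are assumptions for illustration and do not represent the Capstone assessment models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy forward model (assumption: single-exponential retention)
def excretion_per_unit_intake(t_days, k=0.05):
    """Predicted daily excretion for a unit intake at time t after intake."""
    return k * np.exp(-k * t_days)

# Synthetic "measurements" with lognormal error (all values hypothetical)
true_intake = 500.0                          # Bq
t_obs = np.array([1.0, 3.0, 7.0, 14.0, 30.0])
gsd = 1.4                                    # geometric standard deviation of the assay
y_obs = true_intake * excretion_per_unit_intake(t_obs) \
        * np.exp(rng.normal(0.0, np.log(gsd), t_obs.size))

# Log-posterior: lognormal likelihood, broad lognormal prior on the intake
def log_post(log_intake):
    pred = np.exp(log_intake) * excretion_per_unit_intake(t_obs)
    loglik = -0.5 * np.sum(((np.log(y_obs) - np.log(pred)) / np.log(gsd)) ** 2)
    logprior = -0.5 * ((log_intake - np.log(100.0)) / np.log(10.0)) ** 2
    return loglik + logprior

# Metropolis random walk on log(intake)
chain = np.empty(20000)
x = np.log(100.0)
lp = log_post(x)
for i in range(chain.size):
    prop = x + rng.normal(0.0, 0.2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        x, lp = prop, lp_prop
    chain[i] = x

intake_samples = np.exp(chain[5000:])        # discard burn-in
print(f"posterior median intake: {np.median(intake_samples):.0f} Bq")
print(f"95% interval: {np.percentile(intake_samples, [2.5, 97.5]).round(0)}")
```

In an actual assessment, the posterior intake samples would be combined with dose coefficients to produce a dose distribution, which is how the uncertainty in dose follows directly from the chain.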
Cross-site comparison of ribosomal depletion kits for Illumina RNAseq library construction.
Herbert, Zachary T; Kershner, Jamie P; Butty, Vincent L; Thimmapuram, Jyothi; Choudhari, Sulbha; Alekseyev, Yuriy O; Fan, Jun; Podnar, Jessica W; Wilcox, Edward; Gipson, Jenny; Gillaspy, Allison; Jepsen, Kristen; BonDurant, Sandra Splinter; Morris, Krystalynne; Berkeley, Maura; LeClerc, Ashley; Simpson, Stephen D; Sommerville, Gary; Grimmett, Leslie; Adams, Marie; Levine, Stuart S
2018-03-15
Ribosomal RNA (rRNA) comprises at least 90% of total RNA extracted from mammalian tissue or cell line samples. Informative transcriptional profiling using massively parallel sequencing technologies requires either enrichment of mature poly-adenylated transcripts or targeted depletion of the rRNA fraction. The latter method is of particular interest because it is compatible with degraded samples such as those extracted from FFPE and also captures transcripts that are not poly-adenylated, such as some non-coding RNAs. Here we provide a cross-site study that evaluates the performance of ribosomal RNA removal kits from Illumina, Takara/Clontech, Kapa Biosystems, Lexogen, New England Biolabs, and Qiagen on intact and degraded RNA samples. We find that all of the kits are capable of performing significant ribosomal depletion, though there are differences in their ease of use. All kits were able to remove ribosomal RNA to below 20% with intact RNA and identify ~14,000 protein-coding genes from the Universal Human Reference RNA sample at >1 FPKM. Analysis of differentially detected genes between kits suggests that transcript length may be a key factor in library production efficiency. These results provide a roadmap for labs on the strengths of each of these methods and how best to utilize them.
75 FR 38182 - Proposed Collection; Comment Request for Regulation Project
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-01
... Deplete the Ozone Layer and on Products Containing Such Chemicals (Sec. Sec. 52.4682-1(b), 52.4682-2(b....gov . SUPPLEMENTARY INFORMATION: Title: Excise Tax on Chemicals That Deplete the Ozone Layer and on... Revenue Code sections 4681 and 4682 relating to the tax on chemicals that deplete the ozone layer and on...
Development of a new lattice physics code ROBIN for PWR application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, S.; Chen, G.
2013-07-01
This paper presents a description of the methodologies and preliminary verification results of a new lattice physics code, ROBIN, being developed for PWR application at Shanghai NuStar Nuclear Power Technology Co., Ltd. The methods used in ROBIN to fulfill the various tasks of lattice physics analysis are an integration of historical methods and methods that have emerged very recently. Not only are well-established methods such as equivalence theory for resonance treatment and the method of characteristics for neutron transport calculation adopted, as they are in many of today's production-level LWR lattice codes, but very useful new methods, such as the enhanced neutron current method for Dancoff correction in large and complicated geometry and the log-linear rate constant power depletion method for Gd-bearing fuel, are also implemented in the code. A small sample of verification results is provided to illustrate the type of accuracy achievable using ROBIN. It is demonstrated that ROBIN is capable of satisfying most of the needs for PWR lattice analysis and has the potential to become a production-quality code in the future. (authors)
Petrova, Olga E.; Garcia-Alcalde, Fernando; Zampaloni, Claudia; Sauer, Karin
2017-01-01
Global transcriptomic analysis via RNA-seq is often hampered by the high abundance of ribosomal (r)RNA in bacterial cells. To remove rRNA and enrich coding sequences, subtractive hybridization procedures have become the approach of choice prior to RNA-seq, with their efficiency varying in a manner dependent on sample type and composition. Yet, despite an increasing number of RNA-seq studies, comparative evaluation of bacterial rRNA depletion methods has remained limited. Moreover, no such study has utilized RNA derived from bacterial biofilms, which have potentially higher rRNA:mRNA ratios and higher rRNA carryover during RNA-seq analysis. Presently, we evaluated the efficiency of three subtractive hybridization-based kits in depleting rRNA from samples derived from biofilm, as well as planktonic cells of the opportunistic human pathogen Pseudomonas aeruginosa. Our results indicated different rRNA removal efficiency for the three procedures, with the Ribo-Zero kit yielding the highest degree of rRNA depletion, which translated into enhanced enrichment of non-rRNA transcripts and increased depth of RNA-seq coverage. The results indicated that, in addition to improving RNA-seq sensitivity, efficient rRNA removal enhanced detection of low abundance transcripts via qPCR. Finally, we demonstrate that the Ribo-Zero kit also exhibited the highest efficiency when P. aeruginosa/Staphylococcus aureus co-culture RNA samples were tested. PMID:28117413
NASA Astrophysics Data System (ADS)
Baraffe, I.; Pratt, J.; Goffrey, T.; Constantino, T.; Folini, D.; Popov, M. V.; Walder, R.; Viallet, M.
2017-08-01
We study lithium depletion in low-mass and solar-like stars as a function of time, using a new diffusion coefficient describing extra-mixing taking place at the bottom of a convective envelope. This new form is motivated by multi-dimensional fully compressible, time-implicit hydrodynamic simulations performed with the MUSIC code. Intermittent convective mixing at the convective boundary in a star can be modeled using extreme value theory, a statistical analysis frequently used for finance, meteorology, and environmental science. In this Letter, we implement this statistical diffusion coefficient in a one-dimensional stellar evolution code, using parameters calibrated from multi-dimensional hydrodynamic simulations of a young low-mass star. We propose a new scenario that can explain observations of the surface abundance of lithium in the Sun and in clusters covering a wide range of ages, from ˜50 Myr to ˜4 Gyr. Because it relies on our physical model of convective penetration, this scenario has a limited number of assumptions. It can explain the observed trend between rotation and depletion, based on a single additional assumption, namely, that rotation affects the mixing efficiency at the convective boundary. We suggest the existence of a threshold in stellar rotation rate above which rotation strongly prevents the vertical penetration of plumes and below which rotation has small effects. In addition to providing a possible explanation for the long-standing problem of lithium depletion in pre-main-sequence and main-sequence stars, the strength of our scenario is that its basic assumptions can be tested by future hydrodynamic simulations.
Code of Federal Regulations, 2011 CFR
2011-04-01
... TAXES (CONTINUED) Natural Resources § 1.613-7 Application of percentage depletion rates provided in... Code). In the case of mines, wells, or other natural deposits listed in section 613(b), the election...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greene, N.M.; Petrie, L.M.; Westfall, R.M.
SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.
Characterization and Remediation of Contaminated Sites:Modeling, Measurement and Assessment
NASA Astrophysics Data System (ADS)
Basu, N. B.; Rao, P. C.; Poyer, I. C.; Christ, J. A.; Zhang, C. Y.; Jawitz, J. W.; Werth, C. J.; Annable, M. D.; Hatfield, K.
2008-05-01
The complexity of natural systems makes it impossible to estimate parameters at the required level of spatial and temporal detail. Thus, it becomes necessary to transition from spatially distributed parameters to spatially integrated parameters that are capable of adequately capturing the system dynamics, without always accounting for local process behavior. Contaminant flux across the source control plane is proposed as an integrated metric that captures source behavior and links it to plume dynamics. Contaminant fluxes were measured using an innovative technology, the passive flux meter, at field sites contaminated with dense non-aqueous phase liquids (DNAPLs) in the US and Australia. Flux distributions were observed to be positively or negatively correlated with the conductivity distribution, depending on the source characteristics of the site. The impact of partial source depletion on the mean contaminant flux and flux architecture was investigated in three-dimensional complex heterogeneous settings using the multiphase transport code UTCHEM and the reactive transport code ISCO3D. Source mass depletion reduced the mean contaminant flux approximately linearly, while the contaminant flux standard deviation decreased proportionally with the mean (i.e., the coefficient of variation of the flux distribution is constant with time). Similar analysis was performed using data from field sites, and the results confirmed the numerical simulations. The linearity of the mass depletion-flux reduction relationship indicates the ability to design remediation systems that deplete mass to achieve a target reduction in source strength. Stability of the flux distribution indicates the ability to characterize the distributions in time once the initial distribution is known. Lagrangian techniques were used to predict contaminant flux behavior during source depletion in terms of the statistics of the hydrodynamic and DNAPL distribution. The advantage of the Lagrangian techniques lies in their small computation time and their inclusion of spatially integrated parameters that can be measured in the field using tracer tests. Analytical models that couple source depletion to plume transport were used for optimization of source and plume treatment. These models are being used for the development of decision and management tools (for DNAPL sites) that consider uncertainty assessments as an integral part of the decision-making process for contaminated site remediation.
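The approximately linear relation between source mass depletion and flux reduction reported above is often summarized with a power-function source-depletion model of the form below; the exponent Γ is a site-specific fitting parameter, and this particular parameterization is a common convention rather than a result quoted in the abstract.

```latex
\frac{J(t)}{J_0} \;=\; \left(\frac{M(t)}{M_0}\right)^{\Gamma},
\qquad \Gamma \approx 1 \ \text{for approximately linear flux--mass behavior,}
```

where J is the contaminant mass flux across the source control plane, M is the remaining DNAPL source mass, and the subscript 0 denotes initial values.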
Analysis of protein-coding genetic variation in 60,706 humans.
Lek, Monkol; Karczewski, Konrad J; Minikel, Eric V; Samocha, Kaitlin E; Banks, Eric; Fennell, Timothy; O'Donnell-Luria, Anne H; Ware, James S; Hill, Andrew J; Cummings, Beryl B; Tukiainen, Taru; Birnbaum, Daniel P; Kosmicki, Jack A; Duncan, Laramie E; Estrada, Karol; Zhao, Fengmei; Zou, James; Pierce-Hoffman, Emma; Berghout, Joanne; Cooper, David N; Deflaux, Nicole; DePristo, Mark; Do, Ron; Flannick, Jason; Fromer, Menachem; Gauthier, Laura; Goldstein, Jackie; Gupta, Namrata; Howrigan, Daniel; Kiezun, Adam; Kurki, Mitja I; Moonshine, Ami Levy; Natarajan, Pradeep; Orozco, Lorena; Peloso, Gina M; Poplin, Ryan; Rivas, Manuel A; Ruano-Rubio, Valentin; Rose, Samuel A; Ruderfer, Douglas M; Shakir, Khalid; Stenson, Peter D; Stevens, Christine; Thomas, Brett P; Tiao, Grace; Tusie-Luna, Maria T; Weisburd, Ben; Won, Hong-Hee; Yu, Dongmei; Altshuler, David M; Ardissino, Diego; Boehnke, Michael; Danesh, John; Donnelly, Stacey; Elosua, Roberto; Florez, Jose C; Gabriel, Stacey B; Getz, Gad; Glatt, Stephen J; Hultman, Christina M; Kathiresan, Sekar; Laakso, Markku; McCarroll, Steven; McCarthy, Mark I; McGovern, Dermot; McPherson, Ruth; Neale, Benjamin M; Palotie, Aarno; Purcell, Shaun M; Saleheen, Danish; Scharf, Jeremiah M; Sklar, Pamela; Sullivan, Patrick F; Tuomilehto, Jaakko; Tsuang, Ming T; Watkins, Hugh C; Wilson, James G; Daly, Mark J; MacArthur, Daniel G
2016-08-18
Large-scale reference data sets of human genetic variation are critical for the medical and functional interpretation of DNA sequence changes. Here we describe the aggregation and analysis of high-quality exome (protein-coding region) DNA sequence data for 60,706 individuals of diverse ancestries generated as part of the Exome Aggregation Consortium (ExAC). This catalogue of human genetic diversity contains an average of one variant every eight bases of the exome, and provides direct evidence for the presence of widespread mutational recurrence. We have used this catalogue to calculate objective metrics of pathogenicity for sequence variants, and to identify genes subject to strong selection against various classes of mutation; identifying 3,230 genes with near-complete depletion of predicted protein-truncating variants, with 72% of these genes having no currently established human disease phenotype. Finally, we demonstrate that these data can be used for the efficient filtering of candidate disease-causing variants, and for the discovery of human 'knockout' variants in protein-coding genes.
Depletion optimization of lumped burnable poisons in pressurized water reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kodah, Z.H.
1982-01-01
Techniques were developed to construct a set of basic poison depletion curves which deplete in a monotonic manner. These curves were combined to match a required optimized depletion profile by utilizing either linear or non-linear programming methods. Three computer codes, LEOPARD, XSDRN, and EXTERMINATOR-2, were used in the analyses. A depletion routine was developed and incorporated into the XSDRN code to allow the depletion of fuel, fission products, and burnable poisons. The Three Mile Island Unit-1 reactor core was used in this work as a typical PWR core. Two fundamental burnable poison rod designs were studied: a solid cylindrical poison rod and an annular cylindrical poison rod with water filling the central region. These two designs have either a uniform mixture of burnable poisons or lumped spheroids of burnable poisons in the poison region. Boron and gadolinium are the two burnable poisons which were investigated in this project. Thermal self-shielding factor calculations for solid and annular poison rods were conducted. Also, expressions for overall thermal self-shielding factors for one or more size groups of poison spheroids inside solid and annular poison rods were derived and studied. Poison spheroids deplete at a slower rate than the poison mixture because each spheroid exhibits some self-shielding effects of its own. The larger the spheroid, the higher the self-shielding effects due to the increase in poison concentration.
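For orientation, a standard first-flight expression for a purely absorbing spherical lump in an isotropic flux is sketched below; it is representative of the kind of lump self-shielding factor discussed above, not necessarily the exact formulation derived in this work.

```latex
P_c \;=\; 1 - \frac{1}{2x^{2}}\left[\,1 - (1 + 2x)\,e^{-2x}\right], \qquad x = \Sigma_a R,
\qquad
f \;=\; \frac{\bar{\phi}_{\mathrm{lump}}}{\phi_0} \;\approx\; \frac{P_c}{\Sigma_a\,\bar{\ell}},
\qquad \bar{\ell} = \frac{4V}{S} = \frac{4R}{3},
```

where P_c is the first-flight collision probability for neutrons entering a sphere of radius R with macroscopic absorption cross section Σ_a, ℓ̄ is the mean chord length, and f is the ratio of the average flux inside the lump to the unperturbed flux outside it. The larger ΣR becomes, the smaller f is, which is the self-shielding trend described in the abstract.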
NASA Astrophysics Data System (ADS)
Nelson, Rebecca L.
2012-01-01
Groundwater pumping has caused excessive groundwater depletion around the world, yet regulating pumping remains a profound challenge. California uses more groundwater than any other U.S. state, and serves as a microcosm of the adverse effects of pumping felt worldwide—land subsidence, impaired water quality, and damaged ecosystems, all against the looming threat of climate change. The state largely entrusts the control of depletion to the local level. This study uses internationally accepted water resources planning theories systematically to investigate three key aspects of controlling groundwater depletion in California, with an emphasis on local-level action: (a) making decisions and engaging stakeholders; (b) monitoring groundwater; and (c) using mandatory, fee-based and voluntary approaches to control groundwater depletion (e.g., pumping restrictions, pumping fees, and education about water conservation, respectively). The methodology used is the social science-derived technique of content analysis, which involves using a coding scheme to record these three elements in local rules and plans, and State legislation, then analyzing patterns and trends. The study finds that Californian local groundwater managers rarely use, or plan to use, mandatory and fee-based measures to control groundwater depletion. Most use only voluntary approaches or infrastructure to attempt to reduce depletion, regardless of whether they have more severe groundwater problems, or problems which are more likely to have irreversible adverse effects. The study suggests legal reforms to the local groundwater planning system, drawing upon its empirical findings. Considering the content of these recommendations may also benefit other jurisdictions that use a local groundwater management planning paradigm.
Hydrologic Drought Decision Support System (HyDroDSS)
Granato, Gregory E.
2014-01-01
The hydrologic drought decision support system (HyDroDSS) was developed by the U.S. Geological Survey (USGS) in cooperation with the Rhode Island Water Resources Board (RIWRB) for use in the analysis of hydrologic variables that may indicate the risk for streamflows to be below user-defined flow targets at a designated site of interest, which is defined herein as a data-collection site on a stream that may be adversely affected by pumping. Hydrologic drought is defined for this study as a period of lower than normal streamflows caused by precipitation deficits and (or) water withdrawals. The HyDroDSS is designed to provide water managers with risk-based information for balancing water-supply needs and aquatic-habitat protection goals to mitigate potential effects of hydrologic drought. This report describes the theory and methods for retrospective streamflow-depletion analysis, rank correlation analysis, and drought-projection analysis. All three methods are designed to inform decisions made by drought steering committees and decisionmakers on the basis of quantitative risk assessment. All three methods use estimates of unaltered streamflow, which is the measured or modeled flow without major withdrawals or discharges, to approximate a natural low-flow regime. Retrospective streamflow-depletion analysis can be used by water-resource managers to evaluate relations between withdrawal plans and the potential effects of withdrawal plans on streams at one or more sites of interest in an area. Retrospective streamflow-depletion analysis indicates the historical risk of being below user-defined flow targets if different pumping plans were implemented for the period of record. Retrospective streamflow-depletion analysis also indicates the risk for creating hydrologic drought conditions caused by use of a pumping plan. Retrospective streamflow-depletion analysis is done by calculating the net streamflow depletions from withdrawals and discharges and applying these depletions to a simulated record of unaltered streamflow. Rank correlation analysis in the HyDroDSS indicates the persistence of hydrologic measurements from month to month for the prediction of developing hydrologic drought conditions and quantitatively indicates which hydrologic variables may be used to indicate the onset of hydrologic drought conditions. Rank correlation analysis also indicates the potential use of each variable for estimating the monthly minimum unaltered flow at a site of interest for use in the drought-projection analysis. Rank correlation analysis in the HyDroDSS is done by calculating Spearman’s rho for paired samples and the 95-percent confidence limits of this rho value. Rank correlation analysis can be done by using precipitation, groundwater levels, measured streamflows, and estimated unaltered streamflows. Serial correlation analysis, which indicates relations between current and future values, can be done for a single site. Cross correlation analysis, which indicates relations among current values at one site and current and future values at a second site, also can be done. Drought-projection analysis in the HyDroDSS indicates the risk for being in a hydrologic drought condition during the current month and the five following months with and without pumping.
Drought-projection analysis also indicates the potential effectiveness of water-conservation methods for mitigating the effect of withdrawals in the coming months on the basis of the amount of depletion caused by different pumping plans and on the risk of unaltered flows being below streamflow targets. Drought-projection analysis in the HyDroDSS is done with Monte Carlo methods by using the position analysis method. In this method the initial value of estimated unaltered streamflows is calculated by correlation to a measured hydrologic variable (monthly precipitation, groundwater levels, or streamflows from an index station identified with the rank correlation analysis). Then a pseudorandom number generator is used to create 251 six-month-long flow traces by using a bootstrap method. Serial correlation of the estimated unaltered monthly minimum streamflows determined from the rank correlation analysis is preserved within each flow trace. The sample of unaltered streamflows indicates the risk of being below flow targets in the coming months under simulated natural conditions (without historic withdrawals). The streamflow-depletion algorithms are then used to estimate risks of flow being below targets if selected pumping plans are used. This report also describes the implementation of the HyDroDSS. The HyDroDSS was developed as a Microsoft Access® database application to facilitate storage, handling, and use of hydrologic datasets with a simple graphical user interface. The program is implemented in the database by using the Visual Basic for Applications® (VBA) programming language. Program source code for the analytical techniques is provided in the HyDroDSS and in electronic text files accompanying this report. Program source code for the graphical user interface and for data-handling code, which is specific to Microsoft Access® and the HyDroDSS, is provided in the database. An installation package with a run-time version of the software is available with this report for potential users who do not have a compatible copy of Microsoft Access®. Administrative rights are needed to install this version of the HyDroDSS. A case study, to demonstrate the use of HyDroDSS and interpretation of results for a site of interest, is detailed for the USGS streamgage on the Hunt River (station 01117000) near East Greenwich in central Rhode Island. The Hunt River streamgage was used because it has a long record of streamflow and is in a well-studied basin with a substantial amount of hydrologic and water-use data including groundwater pumping for municipal water supply.
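A minimal sketch of the position-analysis idea described above is given below in Python (the HyDroDSS itself is implemented in VBA within Microsoft Access). It generates 251 bootstrap six-month flow traces from a synthetic monthly record and tallies the fraction of traces below a flow target; the data, the month-to-month resampling scheme, and all numbers are assumptions for illustration rather than the HyDroDSS algorithms.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical inputs (not HyDroDSS data)
years = 40
hist = np.exp(rng.normal(3.0, 0.5, size=(years, 12)))  # synthetic monthly-minimum flows, cfs
target = 12.0            # user-defined monthly flow target, cfs
q_now = 18.0             # current-month estimated unaltered flow, cfs
month_now = 6            # July (0-based), start of the projection
n_traces, horizon = 251, 6

# Month-to-month log-flow changes observed historically, keyed by calendar month
steps = {m: np.log(hist[:, (m + 1) % 12]) - np.log(hist[:, m]) for m in range(12)}

# Bootstrap flow traces (position analysis)
traces = np.empty((n_traces, horizon))
for i in range(n_traces):
    q, m = q_now, month_now
    for h in range(horizon):
        q = np.exp(np.log(q) + rng.choice(steps[m]))    # resampled monthly change
        traces[i, h] = q
        m = (m + 1) % 12

risk = (traces < target).mean(axis=0)    # fraction of traces below the target
for h, r in enumerate(risk, start=1):
    print(f"month +{h}: risk of flow below target = {r:.2f}")
```

Applying a pumping plan's net depletion to each trace before the comparison with the target would give the corresponding with-pumping risk, which is the quantity the drought-projection analysis reports.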
NASA Technical Reports Server (NTRS)
Lahti, G. P.; Mueller, R. A.
1973-01-01
Measurements of MeV neutrons were made at the surface of a lithium hydride and depleted uranium shielded reactor. Four shield configurations were considered; these were assembled progressively with cylindrical shells of 5-centimeter-thick depleted uranium, 13-centimeter-thick lithium hydride, 5-centimeter-thick depleted uranium, 13-centimeter-thick lithium hydride, 5-centimeter-thick depleted uranium, and 3-centimeter-thick depleted uranium. Measurements were made with an NE-218 scintillation spectrometer; proton pulse height distributions were differentiated to obtain neutron spectra. Calculations were made using the two-dimensional discrete ordinates code DOT and ENDF/B (version 3) cross sections. Good agreement between measured and calculated spectral shape was observed. Absolute measured and calculated fluxes were within 50 percent of one another; the observed discrepancies in absolute flux may be due to cross-section errors.
Champier, Jacques; Claustrat, Francine; Nazaret, Nicolas; Fèvre Montange, Michelle; Claustrat, Bruno
2012-02-01
Folate is essential for purine and thymidylate biosynthesis and in methyl transfer for DNA methylation. Folate deficiency alters the secretion of melatonin, a hormone involved in circadian rhythm entrainment, and causes hyperhomocysteinemia because of disruption of homocysteine metabolism. Adverse effects of homocysteine include the generation of free radicals, activation of proliferation or apoptosis, and alteration of gene expression. The liver is an important organ for folate metabolism, and its genome analysis has revealed numerous clock-regulated genes. The variations at the level of their expression during folate deficiency are not known. The aim of our study was to investigate the effects of folate deficiency on gene expression in the mouse liver. A control group receiving a synthetic diet and a folate-depleted group were housed for 4 weeks on a 12-hour/12-hour light/dark cycle. Three mice from each group were euthanized under dim red light at the beginning of the light cycle, and 3, at the beginning of the dark period. Gene expression was studied in a microarray analysis. Of the 53 genes showing modified daily expression in the controls, 52 showed a less marked or no difference after folate depletion. Only 1, lpin1, showed a more marked difference. Ten genes coding for proteins involved in lipid metabolism did not show a morning/evening difference in controls but did after folate depletion. This study shows that, in the mouse liver, dietary folate depletion leads to major changes in expression of several genes involved in fatty acid metabolism, DNA synthesis, and expression of circadian genes. Copyright © 2012 Elsevier Inc. All rights reserved.
Monte Carlo capabilities of the SCALE code system
Rearden, Bradley T.; Petrie, Jr., Lester M.; Peplow, Douglas E.; ...
2014-09-12
SCALE is a broadly used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. An overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kavenoky, A.
1973-01-01
From the national topical meeting on mathematical models and computational techniques for analysis of nuclear systems; Ann Arbor, Michigan, USA (8 Apr 1973). APOLLO calculates the space- and energy-dependent flux for a one-dimensional medium in the multigroup approximation of the transport equation. For a one-dimensional medium, refined collision probabilities have been developed for the resolution of the integral form of the transport equation; these collision probabilities increase accuracy and save computing time. The interaction between a few cells can also be treated by the multicell option of APOLLO. The diffusion coefficient and the material buckling can be computed in the various B and P approximations with a linearly anisotropic scattering law, even in the thermal range of the spectrum. Eventually this coefficient is corrected for streaming by use of Benoist's theory. The self-shielding of the heavy isotopes is treated by a new and accurate technique which preserves the reaction rates of the fundamental fine-structure flux. APOLLO can perform a depletion calculation for one cell, a group of cells, or a complete reactor. The results of an APOLLO calculation are the space- and energy-dependent flux, the material buckling, or any reaction rate; these results can also be macroscopic cross sections used as input data for a 2D or 3D depletion and diffusion code in reactor geometry. 10 references.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.
Transfers of proven oil and gas properties from individuals to controlled corporations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cash, L.S.; Dickens, T.L.
1985-12-01
Code Section 613A(c)(10) sets forth an exception, for transfers by individuals to their controlled corporations, to the general rule of section 613A(c)(9) denying percentage depletion to transferees of proven oil and gas properties. The proposed regulations attempt to provide guidelines to help taxpayers comply, and, although these regulations are not all-inclusive, they should be helpful to taxpayers who must rely on these provisions to prevent the loss of deductions for percentage depletion. Because of some apparent ambiguities in this area of the Internal Revenue Code and Treasury's inability to flesh out these ambiguities in its proposed regulations, affected taxpayers should be cautious. 2 tables.
Hirota, Ryuichi; Kuroda, Akio; Ikeda, Tsukasa; Takiguchi, Noboru; Ohtake, Hisao; Kato, Junichi
2006-08-01
The nitrifying bacterium Nitrosomonas sp. strain ENI-11 has three copies of the gene encoding hydroxylamine oxidoreductase (hao(1), hao(2), and hao(3)) on its genome. Broad-host-range reporter plasmids containing transcriptional fusion genes between hao copies and lacZ were constructed to analyze the expression of each hydroxylamine oxidoreductase gene (hao) copy individually and quantitatively. beta-Galactosidase assays of ENI-11 harboring reporter plasmids revealed that all hao copies were transcribed in the wild-type strain. Promoter analysis of hao copies revealed that transcription of hao(3) was highest among the hao copies. Expression levels of hao(1) and hao(2) were 40% and 62% of that of hao(3), respectively. Transcription of hao(1) was negatively regulated, whereas a portion of hao(3) transcription was read-through transcription from the rpsT promoter. When energy-depleted cells were incubated in the growth medium, only hao(3) expression increased. This result suggests that it is hao(3) that is responsible for recovery from energy-depleted conditions in Nitrosomonas sp. strain ENI-11.
Hybrid reduced order modeling for assembly calculations
Bang, Youngsuk; Abdel-Khalik, Hany S.; Jessee, Matthew A.; ...
2015-08-14
While the accuracy of assembly calculations has greatly improved due to the increase in computer power enabling more refined description of the phase space and use of more sophisticated numerical algorithms, the computational cost continues to increase, which limits the full utilization of their effectiveness for routine engineering analysis. Reduced order modeling is a mathematical vehicle that scales down the dimensionality of large-scale numerical problems to enable their repeated execution on small computing environments, often available to end users. This is done by capturing the most dominant underlying relationships between the model's inputs and outputs. Previous works demonstrated the use of reduced order modeling for a single physics code, such as a radiation transport calculation. This paper extends those works to coupled code systems as currently employed in assembly calculations. Finally, numerical tests are conducted using realistic SCALE assembly models with resonance self-shielding, neutron transport, and nuclide transmutation/depletion models representing the components of the coupled code system.
The stopping powers and energy straggling of heavy ions in polymer foils
NASA Astrophysics Data System (ADS)
Mikšová, R.; Macková, A.; Malinský, P.; Hnatowicz, V.; Slepička, P.
2014-07-01
The stopping power and energy straggling of 7Li, 12C and 16O ions in thin poly(etheretherketone) (PEEK), polyethylene terephthalate (PET) and polycarbonate (PC) foils were measured in the incident beam energy range of 9.4-11.8 MeV using an indirect transmission method. Ions scattered from a thin gold target at an angle of 150° were registered by a partially depleted PIPS detector, partly shielded with a polymer foil placed in front of the detector. Therefore, the signals from both direct and slowed down ions were visible in the same energy spectrum, which was evaluated by the ITAP code, developed at our laboratory. The ITAP code was employed to perform a Gaussian-fitting procedure to provide a complete analysis of each measured spectrum. The measured stopping powers were compared with the predictions obtained from the SRIM-2008 and MSTAR codes and with previous experimental data. The energy straggling data were compared with those calculated by using Bohr's, Lindhard-Scharff and Bethe-Livingston theories.
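The ITAP code itself is an in-house tool and is not reproduced here; as a hedged illustration of the kind of Gaussian-fitting step it performs on a measured peak, the following Python sketch fits a Gaussian to a synthetic backscattered-ion energy peak and extracts the centroid and width, the quantities from which energy loss and straggling are derived. All numbers are synthetic.

# Fit a Gaussian to a synthetic energy-spectrum peak and report centroid and FWHM.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(E, amplitude, centroid, sigma):
    return amplitude * np.exp(-0.5 * ((E - centroid) / sigma) ** 2)

rng = np.random.default_rng(0)
energy = np.linspace(8.0, 10.0, 200)                      # MeV, synthetic axis
counts = gaussian(energy, 500.0, 9.1, 0.05) + rng.poisson(5, energy.size)

p0 = [counts.max(), energy[counts.argmax()], 0.1]         # initial guess from the data
popt, pcov = curve_fit(gaussian, energy, counts, p0=p0)
amplitude, centroid, sigma = popt
fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * abs(sigma)
print(f"centroid = {centroid:.3f} MeV, FWHM = {fwhm * 1e3:.1f} keV")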
ORIGEN-based Nuclear Fuel Inventory Module for Fuel Cycle Assessment: Final Project Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skutnik, Steven E.
The goal of this project, "ORIGEN-based Nuclear Fuel Depletion Module for Fuel Cycle Assessment", is to create a physics-based reactor depletion and decay module for the Cyclus nuclear fuel cycle simulator in order to assess nuclear fuel inventories over a broad space of reactor operating conditions. The overall goal of this approach is to facilitate evaluations of nuclear fuel inventories for a broad space of scenarios, including extended used nuclear fuel storage and cascading impacts on fuel cycle options such as actinide recovery in used nuclear fuel, particularly for multiple recycle scenarios. The advantage of a physics-based approach (compared to a recipe-based approach, which has typically been employed for fuel cycle simulators) is its inherent flexibility; such an approach can more readily accommodate the broad space of potential isotopic vectors that may be encountered under advanced fuel cycle options. In order to develop this flexible reactor analysis capability, we are leveraging the Origen nuclear fuel depletion and decay module from SCALE to produce a standalone "depletion engine" which will serve as the kernel of a Cyclus-based reactor analysis module. The ORIGEN depletion module is a rigorously benchmarked and extensively validated tool for nuclear fuel analysis, and thus its incorporation into the Cyclus framework can bring these capabilities to bear on the problem of evaluating long-term impacts of fuel cycle option choices on relevant metrics of interest, including materials inventories and availability (for multiple recycle scenarios), long-term waste management and repository impacts, etc. Developing this Origen-based analysis capability for Cyclus requires the refinement of the Origen analysis sequence to the point where it can reasonably be compiled as a standalone sequence outside of SCALE; i.e., wherein all of the computational aspects of Origen (including reactor cross-section library processing and interpolation, input and output processing, and depletion/decay solvers) are self-contained in a single executable sequence. Further, embedding this capability into other software environments (such as the Cyclus fuel cycle simulator) requires that Origen's capabilities be encapsulated into a portable, self-contained library which other codes can then call directly through function calls, thereby directly accessing the solver and data processing capabilities of Origen. Additional components relevant to this work include modernization of the reactor data libraries used by Origen for conducting nuclear fuel depletion calculations. This work has included the development of new fuel assembly lattices not previously available (such as for CANDU heavy-water reactor assemblies) as well as validation of updated lattices for light-water reactors, updated to employ modern nuclear data evaluations. The CyBORG reactor analysis module as developed under this work scope is fully capable of dynamic calculation of depleted fuel compositions for all commercial U.S. reactor assembly types as well as a number of international fuel types, including MOX, VVER, MAGNOX, and PHWR CANDU fuel assemblies. In addition, the Origen-based depletion engine allows CyBORG to evaluate novel fuel assembly and reactor design types via creation of Origen reactor data libraries in SCALE.
The establishment of this new modeling capability affords fuel cycle modelers a substantially improved ability to model dynamically changing fuel cycle and reactor conditions, including recycled fuel compositions from fuel cycle scenarios involving material recycle into thermal-spectrum systems.
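As a sketch of what a "depletion engine" callable through function calls might look like (the class and method names below are invented for illustration and are not the actual ORIGEN or CyBORG API), a fuel cycle simulator could hold an object like the following and query it whenever a facility needs depleted or decayed compositions.

# Hypothetical depletion-engine interface: a toy transition matrix advanced in time.
import numpy as np
from scipy.linalg import expm

class DepletionEngine:
    """Toy engine: holds a burnup-independent transition matrix (1/s)."""
    def __init__(self, transition_matrix, nuclides):
        self.A = np.asarray(transition_matrix, dtype=float)
        self.nuclides = list(nuclides)

    def deplete(self, composition, days):
        n0 = np.array([composition.get(n, 0.0) for n in self.nuclides])
        n1 = expm(self.A * days * 86400.0) @ n0
        return dict(zip(self.nuclides, n1))

# Two-nuclide toy problem: a parent that decays into a stable daughter.
half_life_days = 30.0
lam = np.log(2) / (half_life_days * 86400.0)
engine = DepletionEngine([[-lam, 0.0], [lam, 0.0]], ["parent", "daughter"])
print(engine.deplete({"parent": 1.0}, days=60.0))   # ~0.25 of the parent remains after two half-lives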
Progress of IRSN R&D on ITER Safety Assessment
NASA Astrophysics Data System (ADS)
Van Dorsselaere, J. P.; Perrault, D.; Barrachin, M.; Bentaib, A.; Gensdarmes, F.; Haeck, W.; Pouvreau, S.; Salat, E.; Seropian, C.; Vendel, J.
2012-08-01
The French "Institut de Radioprotection et de Sûreté Nucléaire" (IRSN), in support of the French "Autorité de Sûreté Nucléaire", is analysing the safety of the ITER fusion installation on the basis of the ITER operator's safety file. IRSN set up a multi-year R&D program in 2007 to support this safety assessment process. Priority has been given to four technical issues, and the main outcomes of the work done in 2010 and 2011 are summarized in this paper: for simulation of accident scenarios in the vacuum vessel, adaptation of the ASTEC system code; for the risk of explosion of gas-dust mixtures in the vacuum vessel, adaptation of the TONUS-CFD code for gas distribution, development of the DUST code for dust transport, and preparation of IRSN experiments on gas inerting, dust mobilization, and hydrogen-dust mixture explosions; for evaluation of the efficiency of the detritiation systems, thermo-chemical calculations of tritium speciation during transport in the gas phase and preparation of future experiments to evaluate the most influential factors on detritiation; for material neutron activation, adaptation of the VESTA Monte Carlo depletion code. The first results of these tasks were used in 2011 for the analysis of the ITER safety file. In the near future, this global R&D programme may be reoriented to account for feedback from that analysis or for new knowledge.
Cenik, Can; Chua, Hon Nian; Singh, Guramrit; Akef, Abdalla; Snyder, Michael P; Palazzo, Alexander F; Moore, Melissa J; Roth, Frederick P
2017-03-01
Introns are found in 5' untranslated regions (5'UTRs) for 35% of all human transcripts. These 5'UTR introns are not randomly distributed: Genes that encode secreted, membrane-bound and mitochondrial proteins are less likely to have them. Curiously, transcripts lacking 5'UTR introns tend to harbor specific RNA sequence elements in their early coding regions. To model and understand the connection between coding-region sequence and 5'UTR intron status, we developed a classifier that can predict 5'UTR intron status with >80% accuracy using only sequence features in the early coding region. Thus, the classifier identifies transcripts with 5' proximal-intron-minus-like coding regions ("5IM" transcripts). Unexpectedly, we found that the early coding sequence features defining 5IM transcripts are widespread, appearing in 21% of all human RefSeq transcripts. The 5IM class of transcripts is enriched for non-AUG start codons, more extensive secondary structure both preceding the start codon and near the 5' cap, greater dependence on eIF4E for translation, and association with ER-proximal ribosomes. 5IM transcripts are bound by the exon junction complex (EJC) at noncanonical 5' proximal positions. Finally, N1-methyladenosines are specifically enriched in the early coding regions of 5IM transcripts. Taken together, our analyses point to the existence of a distinct 5IM class comprising ∼20% of human transcripts. This class is defined by depletion of 5' proximal introns, presence of specific RNA sequence features associated with low translation efficiency, N1-methyladenosines in the early coding region, and enrichment for noncanonical binding by the EJC. © 2017 Cenik et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
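As a purely illustrative sketch of the general approach (not the authors' actual feature set, data or model), the following Python example trains a logistic-regression classifier to predict a binary label from k-mer counts of an early coding-region sequence, using synthetic sequences.

# Toy classifier: 3-mer counts of an "early coding region" -> binary label.
# Sequences and labels are synthetic; accuracy here says nothing about real transcripts.
import random
from collections import Counter
from itertools import product
from sklearn.linear_model import LogisticRegression

KMERS = ["".join(p) for p in product("ACGU", repeat=3)]

def featurize(seq, k=3):
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    return [counts.get(kmer, 0) for kmer in KMERS]

random.seed(0)
def synthetic_sequence(gc_biased):
    pool = "GCGCGCAU" if gc_biased else "ACGU"   # crude compositional bias
    return "".join(random.choice(pool) for _ in range(90))

X = [featurize(synthetic_sequence(True)) for _ in range(200)]
X += [featurize(synthetic_sequence(False)) for _ in range(200)]
y = [1] * 200 + [0] * 200

clf = LogisticRegression(max_iter=1000).fit(X, y)
print("training accuracy:", clf.score(X, y))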
Understanding the Haling power depletion (HPD) method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levine, S.; Blyth, T.; Ivanov, K.
2012-07-01
The Pennsylvania State Univ. (PSU) is using the university version of the Studsvik Scandpower Code System (CMS) for research and education purposes. Preparations have been made to incorporate the CMS into the PSU Nuclear Engineering graduate 'Nuclear Fuel Management' course. The information presented in this paper was developed during the preparation of the material for the course. The Haling Power Depletion (HPD) method was presented in the course for the first time. The HPD method has been criticized as not valid by many in the field even though it has been successfully applied at PSU for the past 20 years. It was noticed that the radial power distribution (RPD) for low leakage cores during depletion remained similar to that of the HPD during most of the cycle. Thus, the HPD may be used conveniently, mainly for low leakage cores. Studies were then made to better understand the HPD, and the results are presented in this paper. Many different core configurations can be computed quickly with the HPD without using Burnable Poisons (BP), producing several excellent low leakage core configurations that are viable for power production. Once the HPD core configuration is chosen for further analysis, techniques are available for establishing the BP design to prevent violating any of the safety constraints in such HPD-calculated cores. In summary, this paper has shown that the HPD method can be used to guide the design of low leakage cores. (authors)
Ganguli, Dwaipayan; Chereji, Răzvan V.; Iben, James R.; Cole, Hope A.
2014-01-01
RSC and SWI/SNF are related ATP-dependent chromatin remodeling machines that move nucleosomes, regulating access to DNA. We addressed their roles in nucleosome phasing relative to transcription start sites in yeast. SWI/SNF has no effect on phasing at the global level. In contrast, RSC depletion results in global nucleosome repositioning: Both upstream and downstream nucleosomal arrays shift toward the nucleosome-depleted region (NDR), with no change in spacing, resulting in a narrower and partly filled NDR. The global picture of RSC-depleted chromatin represents the average of a range of chromatin structures, with most genes showing a shift of the +1 or the −1 nucleosome into the NDR. Using RSC ChIP data reported by others, we show that RSC occupancy is highest on the coding regions of heavily transcribed genes, though not at their NDRs. We propose that RSC has a role in restoring chromatin structure after transcription. Analysis of gene pairs in different orientations demonstrates that phasing patterns reflect competition between phasing signals emanating from neighboring NDRs. These signals may be in phase, resulting in constructive interference and a regular array, or out of phase, resulting in destructive interference and fuzzy positioning. We propose a modified barrier model, in which a stable complex located at the NDR acts as a bidirectional phasing barrier. In RSC-depleted cells, this barrier has a smaller footprint, resulting in narrower NDRs. Thus, RSC plays a critical role in organizing yeast chromatin. PMID:25015381
Ganguli, Dwaipayan; Chereji, Răzvan V; Iben, James R; Cole, Hope A; Clark, David J
2014-10-01
RSC and SWI/SNF are related ATP-dependent chromatin remodeling machines that move nucleosomes, regulating access to DNA. We addressed their roles in nucleosome phasing relative to transcription start sites in yeast. SWI/SNF has no effect on phasing at the global level. In contrast, RSC depletion results in global nucleosome repositioning: Both upstream and downstream nucleosomal arrays shift toward the nucleosome-depleted region (NDR), with no change in spacing, resulting in a narrower and partly filled NDR. The global picture of RSC-depleted chromatin represents the average of a range of chromatin structures, with most genes showing a shift of the +1 or the -1 nucleosome into the NDR. Using RSC ChIP data reported by others, we show that RSC occupancy is highest on the coding regions of heavily transcribed genes, though not at their NDRs. We propose that RSC has a role in restoring chromatin structure after transcription. Analysis of gene pairs in different orientations demonstrates that phasing patterns reflect competition between phasing signals emanating from neighboring NDRs. These signals may be in phase, resulting in constructive interference and a regular array, or out of phase, resulting in destructive interference and fuzzy positioning. We propose a modified barrier model, in which a stable complex located at the NDR acts as a bidirectional phasing barrier. In RSC-depleted cells, this barrier has a smaller footprint, resulting in narrower NDRs. Thus, RSC plays a critical role in organizing yeast chromatin. Published by Cold Spring Harbor Laboratory Press.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, William BJ J; Ade, Brian J; Bowman, Stephen M
2015-01-01
Oak Ridge National Laboratory and the United States Nuclear Regulatory Commission have initiated a multiyear project to investigate application of burnup credit for boiling-water reactor (BWR) fuel in storage and transportation casks. This project includes two phases. The first phase (1) investigates applicability of peak reactivity methods currently used in spent fuel pools (SFPs) to storage and transportation systems and (2) evaluates validation of both reactivity (keff) calculations and burnup credit nuclide concentrations within these methods. The second phase will focus on extending burnup credit beyond peak reactivity. This paper documents the first phase, including an analysis of lattice design parameters and depletion effects, as well as both validation components. Initial efforts related to extended burnup credit are discussed in a companion paper. Peak reactivity analyses have been used in criticality analyses for licensing of BWR fuel in SFPs over the last 20 years. These analyses typically combine credit for the gadolinium burnable absorber present in the fuel with a modest amount of burnup credit. Gadolinium burnable absorbers are used in BWR assemblies to control core reactivity. The burnable absorber significantly reduces assembly reactivity at beginning of life, potentially leading to significant increases in assembly reactivity for burnups less than 15–20 GWd/MTU. The reactivity of each fuel lattice is dependent on gadolinium loading. The number of gadolinium-bearing fuel pins lowers initial lattice reactivity but has a small impact on the burnup and reactivity of the peak. The gadolinium concentration in each pin has a small impact on initial lattice reactivity but a significant effect on the reactivity of the peak and the burnup at which the peak occurs. The importance of the lattice parameters and depletion conditions is primarily determined by their impact on the gadolinium depletion. Criticality code validation for BWR burnup credit at peak reactivity requires a different set of experiments than pressurized-water reactor burnup credit analysis because of differences in actinide compositions, presence of residual gadolinium absorber, and lower fission product concentrations. A survey of available critical experiments is presented along with a sample criticality code validation and determination of undercoverage penalties for some nuclides. The validation of depleted fuel compositions at peak reactivity presents many challenges, which largely result from a lack of radiochemical assay data applicable to BWR fuel in this burnup range. In addition, none of the existing low burnup measurement data include residual gadolinium measurements. An example bias and uncertainty associated with validation of actinide-only fuel compositions is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simunovic, Srdjan
2015-02-16
CASL's modeling and simulation technology, the Virtual Environment for Reactor Applications (VERA), incorporates coupled physics and science-based models, state-of-the-art numerical methods, modern computational science, integrated uncertainty quantification (UQ), and validation against data from operating pressurized water reactors (PWRs), single-effect experiments, and integral tests. The computational simulation component of VERA is the VERA Core Simulator (VERA-CS). The core simulator is the specific collection of multi-physics computer codes used to model and deplete an LWR core over multiple cycles. The core simulator has a single common input file that drives all of the different physics codes. The parser code, VERAIn, converts VERA input into an XML file that is used as input to the different VERA codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Banerjee, Kaushik; Clarity, Justin B; Cumberland, Riley M
This will be licensed via RSICC. A new, integrated data and analysis system has been designed to simplify and automate the performance of accurate and efficient evaluations for characterizing the input to the overall nuclear waste management system: the UNF-Storage, Transportation & Disposal Analysis Resource and Data System (UNF-ST&DARDS). A relational database within UNF-ST&DARDS provides a standard means by which UNF-ST&DARDS can succinctly store and retrieve modeling and simulation (M&S) parameters for specific spent nuclear fuel analyses. A library of analysis model templates provides the ability to communicate the various sets of M&S parameters to the most appropriate M&S application. Interactive visualization capabilities facilitate data analysis and results interpretation. UNF-ST&DARDS' current analysis capabilities include (1) assembly-specific depletion and decay and (2) spent nuclear fuel cask-specific criticality and shielding. Currently, UNF-ST&DARDS uses the SCALE nuclear analysis code system for performing nuclear analysis.
Ionospheric modification - An initial report on artificially created equatorial Spread F
NASA Technical Reports Server (NTRS)
Ossakow, S. L.; Zalesak, S. T.; Mcdonald, B. E.
1978-01-01
A numerical simulation code for investigating equatorial Spread F in the collisional Rayleigh-Taylor regime is utilized to follow the evolution of artificial plasma density depletions injected into the bottomside nighttime equatorial F region. The 70 km diameter hole rapidly rises and steepens, forming plasma density enhancements at altitudes below the rising hole. The distribution of enhancements and depletions is similar to natural equatorial Spread F phenomena, except it occurs on a much faster time scale. These predictions warrant carrying out artificial injection experiments in the nighttime equatorial F region.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.
NASA Astrophysics Data System (ADS)
Burdo, James S.
This research is based on the concept that detecting the diversion of nuclear fuel pins from Light Water Reactor (LWR) spent fuel assemblies is feasible through a careful comparison of spontaneous fission neutron and gamma levels in the guide tube locations of the fuel assemblies. The goal is to be able to determine whether some of the assembly fuel pins are either missing or have been replaced with dummy or fresh fuel pins. It is known that for typical commercial power spent fuel assemblies, the dominant spontaneous neutron emissions come from Cm-242 and Cm-244. Because of the shorter half-life of Cm-242 (0.45 yr) relative to that of Cm-244 (18.1 yr), Cm-244 is practically the only neutron source contributing to the neutron source term after the spent fuel assemblies are more than two years old. Initially, this research focused upon developing MCNP5 models of PWR fuel assemblies, modeling their depletion using the MONTEBURNS code, and carrying out a preliminary depletion of a ¼ model 17x17 assembly from the TAKAHAMA-3 PWR. Later, the depletion and more accurate isotopic distribution in the pins at discharge were modeled using the TRITON depletion module of the SCALE computer code. Benchmarking comparisons were performed with the MONTEBURNS and TRITON results. Subsequently, the neutron flux in each of the guide tubes of the TAKAHAMA-3 PWR assembly at two years after discharge, as calculated by the MCNP5 computer code, was determined for various scenarios. Cases were considered for all spent fuel pins present and for replacement of a single pin at a position near the center of the assembly (10,9) and at the corner (17,1). Some scenarios were duplicated with a gamma flux calculation for high energies associated with Cm-244. For each case, the difference between the flux (neutron or gamma) with all spent fuel pins present and with a pin removed or replaced was calculated for each guide tube. Different detection criteria were established. The first was whether the relative error of the difference was less than 1.00, allowing for the existence of the difference within the margin of error. The second was whether the difference between the two values was large enough to prevent their error bars from overlapping. Error analysis was performed using both a one-second count and pseudo-Maxwell statistics for a projected 60-second count, giving four criteria for detection. The number of guide tubes meeting these criteria was compared and graphed for each case. Further analysis at extremes of high and low enrichment and long and short burnup times was done using data from assemblies at the Beaver Valley 1 and 2 PWRs. In all neutron flux cases, at least two guide tube locations meet all the criteria for detection of pin diversion. At least one location does in almost all of the gamma flux cases. These results show that placing detectors in the empty guide tubes of spent fuel bundles to identify possible pin diversion is feasible.
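The two detection criteria described above can be summarized in a few lines; in the sketch below the flux values and relative errors are made-up numbers standing in for the MCNP5 guide-tube tallies of the intact and pin-removed assemblies.

# Apply the two detection criteria to one guide tube (illustrative values only).
import math

def detectable(flux_ref, rel_err_ref, flux_mod, rel_err_mod):
    sig_ref = flux_ref * rel_err_ref
    sig_mod = flux_mod * rel_err_mod
    diff = abs(flux_ref - flux_mod)
    sig_diff = math.hypot(sig_ref, sig_mod)                        # propagated uncertainty of the difference
    criterion1 = (sig_diff / diff) < 1.0 if diff > 0 else False    # relative error of the difference < 1
    criterion2 = diff > (sig_ref + sig_mod)                        # error bars do not overlap
    return criterion1, criterion2

# Guide-tube flux with all pins present vs. one pin replaced (arbitrary units).
print(detectable(flux_ref=1.00e5, rel_err_ref=0.02, flux_mod=0.93e5, rel_err_mod=0.02))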
Mahn, Andrea; Ismail, Maritza
2011-11-15
Ammonium sulfate precipitation (ASP) was explored as a method for depleting some highly abundant proteins from blood plasma, in order to reduce the dynamic range of protein concentration and to improve the detection of low abundance proteins by 2D-PAGE. Saturation at 40% ammonium sulfate was chosen since it depleted 39% of albumin and 82% of α-1-antitrypsin. ASP-depletion showed high reproducibility in 2D-PAGE analysis (4.2% variation in relative abundance of albumin), similar to that offered by commercial affinity-depletion columns. In addition, it allowed detection of 59 spots per gel, very close to the number of spots detected in immuno-affinity-depleted plasma. Thus, ASP at 40% saturation is a reliable depletion method that may help in proteomic analysis of blood plasma. Finally, ASP-depletion seems to be complementary to hydrophobic interaction chromatography (HIC)-depletion, and therefore an ASP step followed by a HIC step could probably deplete the most highly abundant plasma proteins, thus improving the detection of low abundance proteins by 2D-PAGE. Copyright © 2011 Elsevier B.V. All rights reserved.
αCP Poly(C) Binding Proteins Act as Global Regulators of Alternative Polyadenylation
Ji, Xinjun; Wan, Ji; Vishnu, Melanie
2013-01-01
We have previously demonstrated that the KH-domain protein αCP binds to a 3′ untranslated region (3′UTR) C-rich motif of the nascent human alpha-globin (hα-globin) transcript and enhances the efficiency of 3′ processing. Here we assess the genome-wide impact of αCP RNA-protein (RNP) complexes on 3′ processing with a specific focus on its role in alternative polyadenylation (APA) site utilization. The major isoforms of αCP were acutely depleted from a human hematopoietic cell line, and the impact on mRNA representation and poly(A) site utilization was determined by direct RNA sequencing (DRS). Bioinformatic analysis revealed 357 significant alterations in poly(A) site utilization that could be specifically linked to the αCP depletion. These APA events correlated strongly with the presence of C-rich sequences in close proximity to the impacted poly(A) addition sites. The most significant linkage was the presence of a C-rich motif within a window 30 to 40 bases 5′ to poly(A) signals (AAUAAA) that were repressed upon αCP depletion. This linkage is consistent with a general role for αCPs as enhancers of 3′ processing. These findings predict a role for αCPs in posttranscriptional control pathways that can alter the coding potential and/or levels of expression of subsets of mRNAs in the mammalian transcriptome. PMID:23629627
Eberle, Andrea B; Jordán-Pla, Antonio; Gañez-Zapater, Antoni; Hessle, Viktoria; Silberberg, Gilad; von Euler, Anne; Silverstein, Rebecca A; Visa, Neus
2015-09-01
RNA surveillance factors are involved in heterochromatin regulation in yeast and plants, but less is known about the possible roles of ribonucleases in the heterochromatin of animal cells. Here we show that RRP6, one of the catalytic subunits of the exosome, is necessary for silencing heterochromatic repeats in the genome of Drosophila melanogaster. We show that a fraction of RRP6 is associated with heterochromatin, and the analysis of the RRP6 interaction network revealed physical links between RRP6 and the heterochromatin factors HP1a, SU(VAR)3-9 and RPD3. Moreover, genome-wide studies of RRP6 occupancy in cells depleted of SU(VAR)3-9 demonstrated that SU(VAR)3-9 contributes to the tethering of RRP6 to a subset of heterochromatic loci. Depletion of the exosome ribonucleases RRP6 and DIS3 stabilizes heterochromatic transcripts derived from transposons and repetitive sequences, and renders the heterochromatin less compact, as shown by micrococcal nuclease and proximity-ligation assays. Such depletion also increases the amount of HP1a bound to heterochromatic transcripts. Taken together, our results suggest that SU(VAR)3-9 targets RRP6 to a subset of heterochromatic loci where RRP6 degrades chromatin-associated non-coding RNAs in a process that is necessary to maintain the packaging of the heterochromatin.
Eberle, Andrea B.; Jordán-Pla, Antonio; Gañez-Zapater, Antoni; Hessle, Viktoria; Silberberg, Gilad; von Euler, Anne; Silverstein, Rebecca A.; Visa, Neus
2015-01-01
RNA surveillance factors are involved in heterochromatin regulation in yeast and plants, but less is known about the possible roles of ribonucleases in the heterochromatin of animal cells. Here we show that RRP6, one of the catalytic subunits of the exosome, is necessary for silencing heterochromatic repeats in the genome of Drosophila melanogaster. We show that a fraction of RRP6 is associated with heterochromatin, and the analysis of the RRP6 interaction network revealed physical links between RRP6 and the heterochromatin factors HP1a, SU(VAR)3-9 and RPD3. Moreover, genome-wide studies of RRP6 occupancy in cells depleted of SU(VAR)3-9 demonstrated that SU(VAR)3-9 contributes to the tethering of RRP6 to a subset of heterochromatic loci. Depletion of the exosome ribonucleases RRP6 and DIS3 stabilizes heterochromatic transcripts derived from transposons and repetitive sequences, and renders the heterochromatin less compact, as shown by micrococcal nuclease and proximity-ligation assays. Such depletion also increases the amount of HP1a bound to heterochromatic transcripts. Taken together, our results suggest that SU(VAR)3-9 targets RRP6 to a subset of heterochromatic loci where RRP6 degrades chromatin-associated non-coding RNAs in a process that is necessary to maintain the packaging of the heterochromatin. PMID:26389589
CESAR5.3: Isotopic depletion for Research and Testing Reactor decommissioning
NASA Astrophysics Data System (ADS)
Ritter, Guillaume; Eschbach, Romain; Girieud, Richard; Soulard, Maxime
2018-05-01
CESAR stands in French for "simplified depletion applied to reprocessing". The current version is 5.3, the product of a 30-year cooperation with ORANO, co-owner of the code with CEA. The code can characterize many types of nuclear fuel assemblies, from the most common PWR power plant fuel to gas-cooled, graphite-moderated designs from early research facilities. Each fuel type can also cover numerous composition ranges, such as UOX, MOX, LEU or HEU. This versatility comes from a broad catalog of cross section libraries, each corresponding to a specific reactor and fuel matrix design. CESAR goes beyond fuel characterization and can also provide an evaluation of structural materials activation. The cross-section libraries are generated using the most refined assembly- or core-level transport calculation schemes (CEA APOLLO2 or ERANOS), based on the European JEFF-3.1.1 nuclear data base. Each new CESAR self-shielded cross section library incorporates the most recent CEA recommendations for deterministic physics options. The resulting cross sections are organized as functions of burnup and initial fuel enrichment, which allows this costly process to be condensed into a series of Legendre polynomials. The final outcome is a fast, accurate and compact CESAR cross section library. Each library is fully validated, against a stochastic transport code (CEA TRIPOLI 4) if needed and against a reference depletion code (CEA DARWIN). Using CESAR does not require any of the neutron physics expertise embedded in the cross section library generation. It is based on top-quality nuclear data (JEFF-3.1.1 for about 400 isotopes) and includes up-to-date Bateman equation solving algorithms. Defining a CESAR computation case is nevertheless straightforward: most results are only three steps away, namely the initial composition, the in-core depletion and the pool decay scenario. In addition to its simple architecture, CESAR includes a portable Graphical User Interface which can be broadly deployed in R&D or industrial facilities. Aging facilities currently face decommissioning and dismantling issues. This end of the nuclear fuel cycle requires a careful assessment of source terms in the fuel, core structures and all parts of a facility that must be disposed of under "industrial nuclear" constraints. In that perspective, several CESAR cross section libraries were constructed for early CEA Research and Testing Reactors (RTRs). The aim of this paper is to describe how CESAR operates and how it can be used to help these facilities with waste disposal, nuclear materials transport or basic safety cases. The test case is based on the PHEBUS facility located at CEA Cadarache.
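As an illustration of the condensation idea only (this is not CESAR's library format; the data points are synthetic), the following Python sketch fits a burnup-dependent one-group cross section with a low-order Legendre series and evaluates it at an arbitrary burnup.

# Condense a tabulated one-group cross section vs. burnup into Legendre coefficients.
import numpy as np
from numpy.polynomial import legendre as L

burnup = np.linspace(0.0, 60.0, 13)                 # GWd/t grid
sigma = 50.0 + 8.0 * np.exp(-burnup / 20.0)         # synthetic one-group cross section (barn)

# Map burnup onto [-1, 1], the natural Legendre domain, and fit a degree-4 series.
x = 2.0 * burnup / burnup.max() - 1.0
coeffs = L.legfit(x, sigma, deg=4)

def sigma_of_burnup(bu):
    return L.legval(2.0 * bu / burnup.max() - 1.0, coeffs)

print(f"sigma(35 GWd/t) ~ {sigma_of_burnup(35.0):.2f} barn "
      f"(max error on the grid: {np.max(np.abs(L.legval(x, coeffs) - sigma)):.3f} barn)")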
lncRNA requirements for mouse acute myeloid leukemia and normal differentiation
Knott, Simon RV; Munera Maravilla, Ester; Jackson, Benjamin T; Wild, Sophia A; Kovacevic, Tatjana; Stork, Eva Maria; Zhou, Meng; Erard, Nicolas; Lee, Emily; Kelley, David R; Roth, Mareike; Barbosa, Inês AM; Zuber, Johannes; Rinn, John L
2017-01-01
A substantial fraction of the genome is transcribed in a cell-type-specific manner, producing long non-coding RNAs (lncRNAs), rather than protein-coding transcripts. Here, we systematically characterize transcriptional dynamics during hematopoiesis and in hematological malignancies. Our analysis of annotated and de novo assembled lncRNAs showed many are regulated during differentiation and mis-regulated in disease. We assessed lncRNA function via an in vivo RNAi screen in a model of acute myeloid leukemia. This identified several lncRNAs essential for leukemia maintenance, and found that a number act by promoting leukemia stem cell signatures. Leukemia blasts show a myeloid differentiation phenotype when these lncRNAs were depleted, and our data indicates that this effect is mediated via effects on the MYC oncogene. Bone marrow reconstitutions showed that a lncRNA expressed across all progenitors was required for the myeloid lineage, whereas the other leukemia-induced lncRNAs were dispensable in the normal setting. PMID:28875933
lncRNA requirements for mouse acute myeloid leukemia and normal differentiation.
Delás, M Joaquina; Sabin, Leah R; Dolzhenko, Egor; Knott, Simon Rv; Munera Maravilla, Ester; Jackson, Benjamin T; Wild, Sophia A; Kovacevic, Tatjana; Stork, Eva Maria; Zhou, Meng; Erard, Nicolas; Lee, Emily; Kelley, David R; Roth, Mareike; Barbosa, Inês Am; Zuber, Johannes; Rinn, John L; Smith, Andrew D; Hannon, Gregory J
2017-09-06
A substantial fraction of the genome is transcribed in a cell-type-specific manner, producing long non-coding RNAs (lncRNAs), rather than protein-coding transcripts. Here, we systematically characterize transcriptional dynamics during hematopoiesis and in hematological malignancies. Our analysis of annotated and de novo assembled lncRNAs showed many are regulated during differentiation and mis-regulated in disease. We assessed lncRNA function via an in vivo RNAi screen in a model of acute myeloid leukemia. This identified several lncRNAs essential for leukemia maintenance, and found that a number act by promoting leukemia stem cell signatures. Leukemia blasts show a myeloid differentiation phenotype when these lncRNAs were depleted, and our data indicates that this effect is mediated via effects on the MYC oncogene. Bone marrow reconstitutions showed that a lncRNA expressed across all progenitors was required for the myeloid lineage, whereas the other leukemia-induced lncRNAs were dispensable in the normal setting.
Analysis of beryllium and depleted uranium: An overview of detection methods in aerosols and soils
DOE Office of Scientific and Technical Information (OSTI.GOV)
Camins, I.; Shinn, J.H.
We conducted a survey of commercially available methods for analysis of beryllium and depleted uranium in aerosols and soils to find a reliable, cost-effective, and sufficiently precise method for researchers involved in environmental testing at the Yuma Proving Ground, Yuma, Arizona. Criteria used for evaluation include cost, method of analysis, specificity, sensitivity, reproducibility, applicability, and commercial availability. We found that atomic absorption spectrometry with graphite furnace meets these criteria for testing samples for beryllium. We found that this method can also be used to test samples for depleted uranium. However, atomic absorption with graphite furnace is not as sensitive a measurement method for depleted uranium as it is for beryllium, so we recommend that quality control of depleted uranium analysis be maintained by testing 10 of every 1000 samples by neutron activation analysis. We also evaluated 45 companies and institutions that provide analyses of beryllium and depleted uranium. 5 refs., 1 tab.
A collision probability analysis of the double-heterogeneity problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hebert, A.
1993-10-01
A practical collision probability model is presented for the description of geometries with many levels of heterogeneity. Regular regions of the macrogeometry are assumed to contain a stochastic mixture of spherical grains or cylindrical tubes. Simple expressions for the collision probabilities in the global geometry are obtained as a function of the collision probabilities in the macro- and microgeometries. This model was successfully implemented in the collision probability kernel of the APOLLO-1, APOLLO-2, and DRAGON lattice codes for the description of a broad range of reactor physics problems. Resonance self-shielding and depletion calculations in the microgeometries are possible because each microregion is explicitly represented.
Development of SSUBPIC code for modeling the neutral gas depletion effect in helicon discharges
NASA Astrophysics Data System (ADS)
Kollasch, Jeffrey; Sovenic, Carl; Schmitz, Oliver
2017-10-01
The SSUBPIC (steady-state unstructured-boundary particle-in-cell) code is being developed to model helicon plasma devices. The envisioned modeling framework incorporates (1) a kinetic neutral particle model, (2) a kinetic ion model, (3) a fluid electron model, and (4) an RF power deposition model. The models are loosely coupled and iterated until convergence to steady state. Of the four required solvers, the kinetic ion and neutral particle simulations can now be done within the SSUBPIC code. Recent SSUBPIC modifications include implementation and testing of a Coulomb collision model (Lemons et al., JCP, 228(5), pp. 1391-1403) allowing efficient coupling of kinetically treated ions to fluid electrons, and implementation of a neutral particle tracking mode with charge-exchange and electron impact ionization physics. These new simulation capabilities are demonstrated working independently and coupled to "dummy" profiles for RF power deposition, converging on steady-state plasma and neutral profiles. The geometry and conditions considered are similar to those of the MARIA experiment at UW-Madison. Initial results qualitatively show the expected neutral gas depletion effect, in which neutrals in the plasma core are not replenished at a sufficient rate to sustain a higher plasma density. This work is funded by the NSF CAREER award PHY-1455210 and NSF Grant PHY-1206421.
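As a schematic of the loose-coupling strategy described above (not SSUBPIC's actual solvers; the two "modules" below are toy algebraic balances with made-up coefficients), the following Python sketch iterates simplified plasma and neutral models to a self-consistent steady state, showing how neutral depletion feeds back on plasma density.

# Loosely coupled fixed-point iteration between two toy modules.
def plasma_density(n_neutral, power=1.0, k_ion=1.0, loss=0.5):
    # Toy balance: ionization source proportional to the neutral density, linear losses.
    return power * k_ion * n_neutral / loss

def neutral_density(n_plasma, feed=1.0, k_ion=1.0, pump=1.0):
    # Toy balance: gas feed versus ionization sink and pumping.
    return feed / (k_ion * n_plasma + pump)

n_p, n_n = 1.0, 1.0
for iteration in range(200):
    n_p_new = plasma_density(n_n)
    n_n_new = neutral_density(n_p_new)
    change = abs(n_p_new - n_p) + abs(n_n_new - n_n)
    n_p, n_n = n_p_new, n_n_new
    if change < 1e-10:
        break
print(f"converged after {iteration + 1} iterations: n_plasma={n_p:.4f}, n_neutral={n_n:.4f}")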
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jessee, Matthew Anderson
The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 provides many new capabilities and significant improvements of existing features.
New capabilities include:
• ENDF/B-VII.1 nuclear data libraries (CE and MG) with enhanced group structures,
• Neutron covariance data based on ENDF/B-VII.1 and supplemented with ORNL data,
• Covariance data for fission product yields and decay constants,
• Stochastic uncertainty and correlation quantification for any SCALE sequence with Sampler,
• Parallel calculations with KENO,
• Problem-dependent temperature corrections for CE calculations,
• CE shielding and criticality accident alarm system analysis with MAVRIC,
• CE depletion with TRITON (T5-DEPL/T6-DEPL),
• CE sensitivity/uncertainty analysis with TSUNAMI-3D,
• Simplified and efficient LWR lattice physics with Polaris,
• Large scale detailed spent fuel characterization with ORIGAMI and ORIGAMI Automator,
• Advanced fission source convergence acceleration capabilities with Sourcerer,
• Nuclear data library generation with AMPX, and
• Integrated user interface with Fulcrum.
Enhanced capabilities include:
• Accurate and efficient CE Monte Carlo methods for eigenvalue and fixed source calculations,
• Improved MG resonance self-shielding methodologies and data,
• Resonance self-shielding with modernized and efficient XSProc integrated into most sequences,
• Accelerated calculations with TRITON/NEWT (generally 4x faster than SCALE 6.1),
• Spent fuel characterization with 1470 new reactor-specific libraries for ORIGEN,
• Modernization of ORIGEN (Chebyshev Rational Approximation Method [CRAM] solver, API for high-performance depletion, new keyword input format),
• Extension of the maximum mixture number to values well beyond the previous limit of 2147 to ~2 billion,
• Nuclear data formats enabling the use of more than 999 energy groups,
• Updated standard composition library to provide more accurate use of natural abundances, and
• Numerous other enhancements for improved usability and stability.
A semi-empirical model for the formation and depletion of the high burnup structure in UO2
Pizzocri, D.; Cappia, F.; Luzzi, L.; ...
2017-01-31
In the rim zone of UO2 nuclear fuel pellets, the combination of high burnup and low temperature drives a microstructural change, leading to the formation of the high burnup structure (HBS). In this work, we propose a semi-empirical model to describe the formation of the HBS, which embraces the polygonisation/recrystallization process and the depletion of intra-granular fission gas, describing them as inherently related. To this end, we performed grain-size measurements on samples at radial positions in which the restructuring was incomplete. Moreover, based on these new experimental data, we assume an exponential reduction of the average grain size with local effective burnup, paired with a simultaneous depletion of intra-granular fission gas driven by diffusion. The comparison with currently used models indicates the applicability of the herein developed model within integral fuel performance codes.
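As a sketch of the functional form described above (the parameter values below are illustrative placeholders, not the fitted constants of the cited model), the average grain size can be written as an exponential of local effective burnup, alongside a crude first-order depletion of the intra-granular gas.

# Illustrative HBS-style correlations: grain size and retained gas vs. effective burnup.
import numpy as np

def grain_size(bu_eff, d0=10.0, d_inf=0.3, bu_char=20.0):
    """Average grain diameter (um) vs. local effective burnup (GWd/t); placeholder constants."""
    return d_inf + (d0 - d_inf) * np.exp(-bu_eff / bu_char)

def intragranular_gas(bu_eff, c0=1.0, bu_char=25.0):
    """Retained intra-granular gas fraction, crudely lumped as an exponential depletion."""
    return c0 * np.exp(-bu_eff / bu_char)

for bu in (0.0, 40.0, 80.0, 120.0):
    print(f"{bu:5.0f} GWd/t: grain ~ {grain_size(bu):5.2f} um, "
          f"retained gas fraction ~ {intragranular_gas(bu):4.2f}")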
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vidal, Jean-Marc; Eschbach, Romain; Launay, Agnes
CEA and AREVA-NC have developed and used a depletion code named CESAR for 30 years. This user-friendly industrial tool provides fast characterizations for all types of nuclear fuel (PWR UOX, MOX or reprocessed uranium, BWR UOX or MOX, MTR and SFR) and the associated wastes. CESAR can evaluate 100 heavy nuclides, 200 fission products and 150 activation products (with helium and tritium formation). It can also characterize the structural material of the fuel (Zircaloy, stainless steel, M5 alloy). CESAR provides depletion calculations for any reactor irradiation history and from 3 months to 1 million years of cooling time. CESAR5.3 is based on the latest calculation schemes recommended by the CEA and on an international nuclear data base (JEFF-3.1.1). It is constantly checked against the CEA referenced and qualified depletion code DARWIN. CESAR incorporates the CEA qualification based on the dissolution analyses of fuel rod samples and the 'La Hague' reprocessing plant feedback experience. AREVA-NC uses CESAR intensively at the 'La Hague' plant, not only for prospective studies but also for characterizations at different industrial facilities all along the reprocessing process and waste conditioning (nearly 150,000 calculations per year). CESAR is the reference code for AREVA-NC and is used directly or indirectly with other software, data banks or special equipment in many parts of the La Hague plants. The great flexibility of CESAR quickly attracted interest from other projects, and CESAR became a 'tool' directly integrated into other software. Finally, coupled with a Graphical User Interface, it can easily be used independently, meeting many needs for prospective studies in support of nuclear facilities or transport. An English version is available. For the principal isotopes of U and Pu, CESAR5 benefits from the CEA experimental validation for PWR UOX fuels up to a burnup of 60 GWd/t and for PWR MOX fuels up to 45 GWd/t. CESAR version 5.3 uses the CEA reference calculation codes for neutron physics with the JEFF-3.1.1 nuclear data set. (authors)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benoit, J. C.; Bourdot, P.; Eschbach, R.
2012-07-01
A Decay Heat (DH) experiment on the whole core of the French Sodium-Cooled Fast Reactor PHENIX was conducted in May 2008. The measurements began an hour and a half after the shutdown of the reactor and lasted twelve days. It is one of the experiments used for the experimental validation of the depletion code DARWIN, thereby confirming the excellent performance of that code. Discrepancies between measured and calculated decay heat do not exceed 8%. (authors)
Method for depleting BWRs using optimal control rod patterns
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taner, M.S.; Levine, S.H.; Hsiao, M.Y.
1991-01-01
Control rod (CR) programming is an essential core management activity for boiling water reactors (BWRs). After establishing a core reload design for a BWR, CR programming is performed to develop a sequence of exposure-dependent CR patterns that assure the safe and effective depletion of the core through a reactor cycle. A time-variant target power distribution approach has been assumed in this study. The authors have developed OCTOPUS to implement a new two-step method for designing semioptimal CR programs for BWRs. The optimization procedure of OCTOPUS is based on the method of approximation programming and uses the SIMULATE-E code for nucleonics calculations.
Comparative analysis of LWR and FBR spent fuels for nuclear forensics evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Permana, Sidik; Suzuki, Mitsutoshi; Su'ud, Zaki
2012-06-06
Several interesting issues are associated with the nuclide compositions of spent fuels from thermal as well as fast reactors, such as the potential to reuse them as recycled fuel and the possibility that they could be diverted as material for destructive devices. In addition, nuclear forensic analysis based on spent fuel compositions has become an interesting topic for evaluating the origin and composition of spent fuels from their foot-prints. Spent fuel compositions of different fuel types leave typical foot-prints, from which the origin of the material can be estimated. Several techniques and methods have been developed based on scientific and technological capabilities, including experimental and modeling or theoretical aspects of the analyses. Nuclear forensic foot-prints identify typical information about spent fuel compositions such as enrichment, burnup or irradiation time and reactor type, as well as the cooling time, which is related to the age of the spent fuel. This paper evaluates typical spent fuel compositions of light water reactors (LWR) and fast breeder reactors (FBR) from the viewpoint of nuclear forensic foot-prints. The established depletion code ORIGEN is adopted to analyze LWR spent fuel (SF) for several burnup values and decay times. For analyzing FBR spent fuel compositions, coupled codes such as SLAROM, JOINT and CITATION have been adopted, with JFS-3-J-3.2R as the nuclear data library. Enriched uranium oxide fuel is used as the fresh fuel of the LWR and mixed oxide fuel (MOX) as the FBR fresh fuel. The FBR MOX fuel is derived from LWR spent fuel. Typical spent fuels from both the LWR and the FBR are compared to distinguish typical SF foot-prints based on nuclear forensic analysis.
Validation of a Three-Dimensional Method for Counting and Sizing Podocytes in Whole Glomeruli
van der Wolde, James W.; Schulze, Keith E.; Short, Kieran M.; Wong, Milagros N.; Bensley, Jonathan G.; Cullen-McEwen, Luise A.; Caruana, Georgina; Hokke, Stacey N.; Li, Jinhua; Firth, Stephen D.; Harper, Ian S.; Nikolic-Paterson, David J.; Bertram, John F.
2016-01-01
Podocyte depletion is sufficient for the development of numerous glomerular diseases and can be absolute (loss of podocytes) or relative (reduced number of podocytes per volume of glomerulus). Commonly used methods to quantify podocyte depletion introduce bias, whereas gold standard stereologic methodologies are time consuming and impractical. We developed a novel approach for assessing podocyte depletion in whole glomeruli that combines immunofluorescence, optical clearing, confocal microscopy, and three-dimensional analysis. We validated this method in a transgenic mouse model of selective podocyte depletion, in which we determined dose-dependent alterations in several quantitative indices of podocyte depletion. This new approach provides a quantitative tool for the comprehensive and time-efficient analysis of podocyte depletion in whole glomeruli. PMID:26975438
How to establish, maintain and use timber depletion accounts
William C. Siegel
2001-01-01
Section 1221 of the Internal Revenue Code defines capital expenditures. In general, these are amounts spent to acquire real estate or equipment, or to make improvements that increase the value of real estate or equipment already owned. Forestry examples include land, buildings, standing timber, reforestation costs, and tractors and trucks. Property owners who incur...
Mineralogy of the Martian Surface: Crustal Composition to Surface Processes
NASA Technical Reports Server (NTRS)
Mustard, John F.
1999-01-01
Over the course of this award we have: 1) Completed and published the results of a study of the effects of hyperfine particles on reflectance spectra of olivine and quartz, which included the development of scattering codes. Research has also progressed in the analysis of the effects of fine particle sizes on clay spectra. 2) Completed the analysis of the mineralogy of dark regions, showed that the in situ compositions are highly correlated with the SNC meteorites, and determined that the Martian mantle was depleted in aluminum prior to 2-3 Ga ago; studies of the mineralogic heterogeneity of surficial materials on Mars have also been conducted. 3) Performed initial work on the study of the physical and chemical processes likely to form and modify duricrust. This includes assessments of erosion rates, solubility and transport of iron in soil environments, and models of pedogenic crust formation.
Estimates of radiological risk from depleted uranium weapons in war scenarios.
Durante, Marco; Pugliese, Mariagabriella
2002-01-01
Several weapons used during the recent conflict in Yugoslavia contain depleted uranium, including missiles and armor-piercing incendiary rounds. Health concern is related to the use of these weapons because of the heavy-metal toxicity and radioactivity of uranium. Although chemical toxicity is considered the more important source of health risk related to uranium, radiation exposure has been allegedly related to cancers among veterans of the Balkan conflict, and uranium munitions are a possible source of contamination in the environment. Actual measurements of radioactive contamination are needed to assess the risk. In this paper, a computer simulation is proposed to estimate radiological risk related to different exposure scenarios. The dose caused by inhalation of radioactive aerosols and the ground contamination induced by a Tomahawk missile impact are simulated using a Gaussian plume model (HOTSPOT code). Environmental contamination and committed dose to the population resident in contaminated areas are predicted by a food-web model (RESRAD code). Small values of committed effective dose equivalent appear to be associated with missile impacts (50-y CEDE < 5 mSv) or population exposure by water-independent pathways (50-y CEDE < 80 mSv). The greatest hazard is related to water contamination under conditions of effective leaching of uranium into the groundwater (50-y CEDE < 400 mSv). Even in this worst case scenario, the chemical toxicity largely predominates over the radiological risk. These computer simulations suggest that little radiological risk is associated with the use of depleted uranium weapons.
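As a minimal illustration of the kind of relation a Gaussian plume code such as HOTSPOT evaluates (with far more complete physics), the following Python sketch computes a ground-level centerline air concentration; the release rate, wind speed and dispersion-coefficient fits are illustrative assumptions.

# Ground-level Gaussian plume concentration for a ground release (illustrative values).
import math

def ground_concentration(Q, u, y, sigma_y, sigma_z):
    """chi(x, y, 0) for a ground-level release, including ground reflection."""
    return (Q / (math.pi * u * sigma_y * sigma_z)) * math.exp(-0.5 * (y / sigma_y) ** 2)

Q = 1.0e9      # source term, Bq/s (illustrative)
u = 3.0        # wind speed, m/s
for x in (100.0, 1000.0, 10000.0):                    # downwind distance, m
    sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)    # crude neutral-stability fits (assumption)
    sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)
    chi = ground_concentration(Q, u, y=0.0, sigma_y=sigma_y, sigma_z=sigma_z)
    print(f"x = {x:7.0f} m: air concentration ~ {chi:9.3e} Bq/m^3")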
Nuclear data uncertainty propagation by the XSUSA method in the HELIOS2 lattice code
NASA Astrophysics Data System (ADS)
Wemple, Charles; Zwermann, Winfried
2017-09-01
Uncertainty quantification has been extensively applied to nuclear criticality analyses for many years and has recently begun to be applied to depletion calculations. However, regulatory bodies worldwide are trending toward requiring such analyses for reactor fuel cycle calculations, which also requires uncertainty propagation for isotopics and nuclear reaction rates. XSUSA is a proven methodology for cross section uncertainty propagation based on random sampling of the nuclear data according to covariance data in multi-group representation; HELIOS2 is a lattice code widely used for commercial and research reactor fuel cycle calculations. This work describes a technique to automatically propagate the nuclear data uncertainties via the XSUSA approach through fuel lattice calculations in HELIOS2. Application of the XSUSA methodology in HELIOS2 presented some unusual challenges because of the highly-processed multi-group cross section data used in commercial lattice codes. Currently, uncertainties based on the SCALE 6.1 covariance data file are being used, but the implementation can be adapted to other covariance data in multi-group structure. Pin-cell and assembly depletion calculations, based on models described in the UAM-LWR Phase I and II benchmarks, are performed and uncertainties in multiplication factor, reaction rates, isotope concentrations, and delayed-neutron data are calculated. With this extension, it will be possible for HELIOS2 users to propagate nuclear data uncertainties directly from the microscopic cross sections to subsequent core simulations.
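The random-sampling idea described above can be pictured in a few lines: draw correlated perturbations of multigroup nuclear data from a covariance matrix, push each sample through the lattice calculation, and read the output spread as the propagated uncertainty. The toy Python sketch below stands in for that workflow with a two-group infinite-medium multiplication factor; the data values and covariances are invented for illustration and the function is not the XSUSA or HELIOS2 interface.

    import numpy as np

    rng = np.random.default_rng(42)

    # Nominal two-group data [nu*Sigma_f1, nu*Sigma_f2, Sigma_a1, Sigma_a2] (1/cm), illustrative.
    mean = np.array([0.012, 0.25, 0.018, 0.10])
    rel_cov = np.array([[4.0e-4, 1.0e-4, 0.0,    0.0   ],
                        [1.0e-4, 9.0e-4, 0.0,    0.0   ],
                        [0.0,    0.0,    2.5e-3, 5.0e-4],
                        [0.0,    0.0,    5.0e-4, 1.6e-3]])
    cov = rel_cov * np.outer(mean, mean)

    def k_inf(x, scatter_12=0.02):
        # Two-group infinite-medium multiplication factor (stand-in for the lattice code).
        nu_sf1, nu_sf2, abs1, abs2 = x
        return (nu_sf1 + nu_sf2 * scatter_12 / abs2) / (abs1 + scatter_12)

    samples = rng.multivariate_normal(mean, cov, size=1000)
    k = np.array([k_inf(s) for s in samples])
    print(f"k_inf = {k.mean():.5f} +/- {k.std(ddof=1):.5f} (1 sigma from nuclear data)")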
Lv, Yuanda; Liang, Zhikai; Ge, Min; Qi, Weicong; Zhang, Tifu; Lin, Feng; Peng, Zhaohua; Zhao, Han
2016-05-11
Nitrogen (N) is an essential and often limiting nutrient to plant growth and development. Previous studies have shown that the mRNA expressions of numerous genes are regulated by nitrogen supplies; however, little is known about the expressed non-coding elements, for example long non-coding RNAs (lncRNAs) that control the response of maize (Zea mays L.) to nitrogen. LncRNAs are a class of non-coding RNAs larger than 200 bp, which have emerged as key regulators in gene expression. In this study, we surveyed the intergenic/intronic lncRNAs in maize B73 leaves at the V7 stage under conditions of N-deficiency and N-sufficiency using ribosomal RNA depletion and ultra-deep total RNA sequencing approaches. By integration with mRNA expression profiles and physiological evaluations, 7245 lncRNAs and 637 nitrogen-responsive lncRNAs were identified that exhibited unique expression patterns. Co-expression network analysis showed that the nitrogen-responsive lncRNAs were enriched mainly in one of the three co-expressed modules. The genes in the enriched module are mainly involved in NADH dehydrogenase activity, oxidative phosphorylation and the nitrogen compounds metabolic process. We identified a large number of lncRNAs in maize and illustrated their potential regulatory roles in response to N stress. The results lay the foundation for further in-depth understanding of the molecular mechanisms of lncRNAs' role in response to nitrogen stresses.
Ego Depletion and the Strength Model of Self-Control: A Meta-Analysis
ERIC Educational Resources Information Center
Hagger, Martin S.; Wood, Chantelle; Stiff, Chris; Chatzisarantis, Nikos L. D.
2010-01-01
According to the strength model, self-control is a finite resource that determines capacity for effortful control over dominant responses and, once expended, leads to impaired self-control task performance, known as "ego depletion". A meta-analysis of 83 studies tested the effect of ego depletion on task performance and related outcomes,…
Radulescu, Georgeta; Gauld, Ian C.; Ilas, Germina; ...
2014-11-01
This paper describes a depletion code validation approach for criticality safety analysis using burnup credit for actinide and fission product nuclides in spent nuclear fuel (SNF) compositions. The technical basis for determining the uncertainties in the calculated nuclide concentrations is comparison of calculations to available measurements obtained from destructive radiochemical assay of SNF samples. Probability distributions developed for the uncertainties in the calculated nuclide concentrations were applied to the SNF compositions of a criticality safety analysis model by the use of a Monte Carlo uncertainty sampling method to determine bias and bias uncertainty in effective neutron multiplication factor. Application of the Monte Carlo uncertainty sampling approach is demonstrated for representative criticality safety analysis models of pressurized water reactor spent fuel pool storage racks and transportation packages using burnup-dependent nuclide concentrations calculated with SCALE 6.1 and the ENDF/B-VII nuclear data. Furthermore, the validation approach and results support a recent revision of the U.S. Nuclear Regulatory Commission Interim Staff Guidance 8.
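A minimal sketch of the sampling step is given below: nuclide concentrations are perturbed with correction factors drawn from assumed calculation-to-measurement distributions, a stand-in response function plays the role of the criticality calculation, and the mean shift and spread of the sampled k values give the bias and bias uncertainty. The nuclide set, distributions, and surrogate are hypothetical; they are not the SCALE models or the distributions developed in the paper.

    import numpy as np

    rng = np.random.default_rng(3)

    nominal = {"u235": 8.0e-4, "pu239": 6.0e-5, "sm149": 1.5e-7}   # atoms/b-cm, illustrative
    mean_factor = {"u235": 1.000, "pu239": 0.985, "sm149": 1.05}   # assumed calc-to-measurement means
    rel_sigma = {"u235": 0.02, "pu239": 0.04, "sm149": 0.08}       # assumed 1-sigma relative spreads

    def keff_surrogate(n):
        # Crude linearized stand-in for a criticality calculation.
        return (0.94 + 600.0 * (n["u235"] - nominal["u235"])
                     + 2500.0 * (n["pu239"] - nominal["pu239"])
                     - 2.0e4 * (n["sm149"] - nominal["sm149"]))

    k_nom = keff_surrogate(nominal)
    k = np.array([keff_surrogate({iso: val * rng.normal(mean_factor[iso], rel_sigma[iso])
                                  for iso, val in nominal.items()})
                  for _ in range(2000)])
    print(f"bias = {k.mean() - k_nom:.5f}, bias uncertainty (1 sigma) = {k.std(ddof=1):.5f}")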
CESAR: A Code for Nuclear Fuel and Waste Characterisation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vidal, J.M.; Grouiller, J.P.; Launay, A.
2006-07-01
CESAR (Simplified Evolution Code Applied to Reprocessing) is a depletion code developed through a joint program between CEA and COGEMA. In the late 1980s, the first use of this code dealt with nuclear measurements at the laboratories of the La Hague reprocessing plant. The use of CESAR was then extended to the characterisation of all entrance materials and, via tracers, of all produced waste. The code can distinguish more than 100 heavy nuclides, 200 fission products and 100 activation products, and it can characterise both the fuel and the structural material of the fuel. CESAR can also perform depletion calculations from 3 months to 1 million years of cooling time. Between 2003 and 2005, the fifth version of the code was developed; the modifications were related to the harmonisation of the code's nuclear data with the JEF2.2 nuclear data file. This paper describes the code and explains its extensive use at the La Hague reprocessing plant and for prospective studies. The second part focuses on the modifications of the latest version and describes the application field and the qualification of the code. Many companies and the IAEA use CESAR today. CESAR offers a very user-friendly graphical user interface. (authors)
GALFIT-CORSAIR: Implementing the Core-Sérsic Model Into GALFIT
NASA Astrophysics Data System (ADS)
Bonfini, Paolo
2014-10-01
We introduce GALFIT-CORSAIR: a publicly available, fully backward-compatible modification of the 2D fitting software GALFIT (v.3) that adds an implementation of the core-Sérsic model. We demonstrate the software by fitting the images of NGC 5557 and NGC 5813, which have previously been identified as core-Sérsic galaxies from their 1D radial light profiles. These two examples are representative of different dust obscuration conditions and of bulge/disk decomposition. To perform the analysis, we obtained deep Hubble Legacy Archive (HLA) mosaics in the F555W filter (~V-band). We successfully reproduce the results of the previous 1D analysis, modulo the intrinsic differences between the 1D and the 2D fitting procedures. The code and the analysis procedure described here have been developed for the first coherent 2D analysis of a sample of core-Sérsic galaxies, which will be presented in a forthcoming paper. As the 2D analysis provides better constraints on multi-component fitting and is fully seeing-corrected, it will yield complementary constraints on the missing mass in depleted galaxy cores.
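For readers unfamiliar with the profile being fitted, one common parameterization of the core-Sérsic surface-brightness law (an inner power-law core joined to an outer Sérsic envelope) can be written as a short function. The Python sketch below illustrates that formula; it is not an excerpt from GALFIT-CORSAIR, the parameter values are arbitrary, and the b(n) relation is the usual approximation.

    import numpy as np

    def core_sersic(r, r_b, r_e, n, alpha, gamma, I_b):
        # Core-Sersic profile: break radius r_b, effective radius r_e, Sersic index n,
        # transition sharpness alpha, inner slope gamma, intensity I_b at the break.
        b = 2.0 * n - 1.0 / 3.0   # common approximation to the Sersic b_n constant
        I_prime = I_b * 2.0**(-gamma / alpha) * np.exp(b * (2.0**(1.0 / alpha) * r_b / r_e)**(1.0 / n))
        core = (1.0 + (r_b / r)**alpha)**(gamma / alpha)
        envelope = np.exp(-b * ((r**alpha + r_b**alpha) / r_e**alpha)**(1.0 / (alpha * n)))
        return I_prime * core * envelope

    r = np.logspace(-1, 2, 200)                       # radii, arbitrary units
    mu = core_sersic(r, r_b=1.0, r_e=20.0, n=4.0, alpha=5.0, gamma=0.1, I_b=100.0)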
Dietary arginine depletion reduces depressive-like responses in male, but not female, mice.
Workman, Joanna L; Weber, Michael D; Nelson, Randy J
2011-09-30
Previous behavioral studies have manipulated nitric oxide (NO) production either by pharmacological inhibition of its synthetic enzyme, nitric oxide synthase (NOS), or by deletion of the genes that code for NOS. However, manipulation of dietary intake of the NO precursor, L-arginine, has been understudied in regard to behavioral regulation. L-Arginine is a common amino acid present in many mammalian diets and is essential during development. In the brain, L-arginine is converted into NO and citrulline by the enzyme neuronal NOS (nNOS). In Experiment 1, paired mice were fed a diet containing an L-arginine-depleted, L-arginine-supplemented, or standard level of L-arginine during pregnancy. Offspring were continuously fed the same diets and were tested in adulthood in elevated plus maze, forced swim, and resident-intruder aggression tests. L-Arginine depletion reduced depressive-like responses in male, but not female, mice and failed to significantly alter anxiety-like or aggressive behaviors. Arginine depletion throughout life reduced body mass overall and eliminated the sex difference in body mass. Additionally, arginine depletion significantly increased corticosterone concentrations, which negatively correlated with time spent floating. In Experiment 2, adult mice were fed arginine-defined diets two weeks prior to and during behavioral testing, and were again tested in the aforementioned tests. Arginine depletion reduced depressive-like responses in the forced swim test but did not alter behavior in the elevated plus maze or the resident-intruder aggression test. Corticosterone concentrations were not altered by arginine diet manipulation in adulthood. These results indicate that arginine depletion throughout development, as well as during a discrete period in adulthood, ameliorates depressive-like responses. These results may yield new insights into the etiology and sex differences of depression. Copyright © 2011 Elsevier B.V. All rights reserved.
Three-dimensional modeling of the neutral gas depletion effect in a helicon discharge plasma
NASA Astrophysics Data System (ADS)
Kollasch, Jeffrey; Schmitz, Oliver; Norval, Ryan; Reiter, Detlev; Sovinec, Carl
2016-10-01
Helicon discharges provide an attractive radio-frequency driven regime for plasma, but neutral-particle dynamics present a challenge to extending performance. A neutral gas depletion effect occurs when neutrals in the plasma core are not replenished at a sufficient rate to sustain a higher plasma density. The Monte Carlo neutral particle tracking code EIRENE was set up for the MARIA helicon experiment at UW-Madison to study its neutral particle dynamics. Prescribed plasma temperature and density profiles similar to those in the MARIA device are used in EIRENE to investigate the main causes of the neutral gas depletion effect. The most dominant plasma-neutral interactions are included so far, namely electron impact ionization of neutrals, charge exchange interactions of neutrals with plasma ions, and recycling at the wall. Parameter scans show how the neutral depletion effect depends on parameters such as Knudsen number, plasma density and temperature, and gas-surface interaction accommodation coefficients. Results are compared to similar analytic studies in the low Knudsen number limit. Plans to incorporate a similar Monte Carlo neutral model into a larger helicon modeling framework are discussed. This work is funded by the NSF CAREER Award PHY-1455210.
16 CFR 260.7 - Environmental marketing claims.
Code of Federal Regulations, 2011 CFR
2011-01-01
....” Also printed on the bag is a disclosure that the bag is not designed for use in home compost piles. The... the Society of the Plastics Industry (SPI) code (which consists of a design of arrows in a triangular...% less ozone depletion. The qualified comparative claim is not likely to be deceptive. [57 FR 36363, Aug...
16 CFR 260.7 - Environmental marketing claims.
Code of Federal Regulations, 2012 CFR
2012-01-01
....” Also printed on the bag is a disclosure that the bag is not designed for use in home compost piles. The... the Society of the Plastics Industry (SPI) code (which consists of a design of arrows in a triangular...% less ozone depletion. The qualified comparative claim is not likely to be deceptive. [57 FR 36363, Aug...
Kay, Richard; Barton, Chris; Ratcliffe, Lucy; Matharoo-Ball, Balwir; Brown, Pamela; Roberts, Jane; Teale, Phil; Creaser, Colin
2008-10-01
A rapid acetonitrile (ACN)-based extraction method has been developed that reproducibly depletes high-abundance and high-molecular-weight proteins from serum prior to mass spectrometric analysis. A nanoflow liquid chromatography/tandem mass spectrometry (nano-LC/MS/MS) multiple reaction monitoring (MRM) method for 57 high- to medium-abundance serum proteins was used to characterise the ACN-depleted fraction after tryptic digestion. Of the 57 targeted proteins, 29 were detected, and albumin, the most abundant protein in serum and plasma, was identified as only the 20th most abundant protein in the extract. The combination of ACN depletion and one-dimensional nano-LC/MS/MS enabled the detection of the low-abundance serum protein insulin-like growth factor-I (IGF-I), which has a serum concentration in the region of 100 ng/mL. One-dimensional sodium dodecyl sulfate/polyacrylamide gel electrophoresis (SDS-PAGE) analysis of the depleted serum showed no bands corresponding to proteins of molecular mass over 75 kDa after extraction, demonstrating the efficiency of the method for the depletion of high-molecular-weight proteins. Total protein analysis of the ACN extracts showed that approximately 99.6% of all protein is removed from the serum. The ACN-depletion strategy offers a viable alternative to the immunochemistry-based protein-depletion techniques commonly used for removing high-abundance proteins from serum prior to MS-based proteomic analyses.
Children's Models of the Ozone Layer and Ozone Depletion.
ERIC Educational Resources Information Center
Christidou, Vasilia; Koulaidis, Vasilis
1996-01-01
The views of 40 primary students on ozone and its depletion were recorded through individual, semi-structured interviews. The data analysis resulted in the formation of a limited number of models concerning the distribution and role of ozone in the atmosphere, the depletion process, and the consequences of ozone depletion. Identifies five target…
Monte Carlo Techniques for Nuclear Systems - Theory Lectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear engineering review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; Doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state of the art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate-level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures and hands-on computer use for a variety of Monte Carlo calculations. Beginning MCNP users are encouraged to review LA-UR-09-00380, "Criticality Calculations with MCNP: A Primer (3rd Edition)" (available at http://mcnp.lanl.gov under "Reference Collection") prior to the class. No Monte Carlo class can be complete without having students write their own simple Monte Carlo routines for basic random sampling, use of the random number generator, and simplified particle transport simulation.
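In the spirit of the remark that Monte Carlo transport is "just simulate the particle behavior," a minimal, self-contained example is sketched below: mono-energetic particles in a 1-D slab with sampled flight distances, isotropic scattering, and absorption, tallying the transmission probability. The cross sections and slab thickness are arbitrary illustrative numbers; this is not MCNP and not an excerpt from the lecture materials.

    import numpy as np

    rng = np.random.default_rng(1)
    sigma_t, sigma_s = 1.0, 0.6        # total and scattering cross sections (1/cm), illustrative
    thickness, n_hist = 5.0, 20000     # slab thickness (cm) and number of histories

    transmitted = 0
    for _ in range(n_hist):
        x, mu = 0.0, 1.0               # birth on the left face, travelling right
        while True:
            x += mu * (-np.log(rng.random()) / sigma_t)   # sampled distance to next collision
            if x >= thickness:
                transmitted += 1
                break
            if x < 0.0:
                break                                      # leaked out of the left face
            if rng.random() < sigma_s / sigma_t:
                mu = 2.0 * rng.random() - 1.0              # isotropic scatter
            else:
                break                                      # absorbed

    p = transmitted / n_hist
    print(f"transmission = {p:.4f} +/- {np.sqrt(p * (1 - p) / n_hist):.4f}")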
Klingenberg, Marcel; Groß, Matthias; Goyal, Ashish; Polycarpou-Schwarz, Maria; Miersch, Thilo; Ernst, Anne-Sophie; Leupold, Jörg; Patil, Nitin; Warnken, Uwe; Allgayer, Heike; Longerich, Thomas; Schirmacher, Peter; Boutros, Michael; Diederichs, Sven
2018-05-23
The identification of viability-associated long non-coding RNAs (lncRNA) might be a promising rationale for new therapeutic approaches in liver cancer. Here, we applied the first RNAi screening approach in hepatocellular carcinoma (HCC) cell lines to find viability-associated lncRNAs. Among the multiple identified lncRNAs with a significant impact on HCC cell viability, we selected CASC9 (Cancer Susceptibility 9) due to the strength of its phenotype, expression, and upregulation in HCC versus normal liver. CASC9 regulated viability across multiple HCC cell lines as shown by CRISPR interference, single siRNA- and siPOOL-mediated depletion of CASC9. Further, CASC9 depletion caused an increase in apoptosis and decrease of proliferation. We identified the RNA binding protein heterogeneous nuclear ribonucleoprotein L (HNRNPL) as a CASC9 interacting protein by RNA affinity purification (RAP) and validated it by native RNA immunoprecipitation (RIP). Knockdown of HNRNPL mimicked the loss-of-viability phenotype observed upon CASC9 depletion. Analysis of the proteome (SILAC) of CASC9- and HNRNPL-depleted cells revealed a set of co-regulated genes which implied a role of the CASC9:HNRNPL complex in AKT-signaling and DNA damage sensing. CASC9 expression levels were elevated in patient-derived tumor samples compared to normal control tissue and had a significant association with overall survival of HCC patients. In a xenograft chicken chorioallantoic membrane model, we measured a decreased tumor size after knockdown of CASC9. Taken together, we provide a comprehensive list of viability-associated lncRNAs in HCC. We identified the CASC9:HNRNPL complex as a clinically relevant viability-associated lncRNA/protein complex which affects AKT-signaling and DNA damage sensing in HCC. This article is protected by copyright. All rights reserved. © 2018 by the American Association for the Study of Liver Diseases.
Numerical study of phase conjugation in stimulated Brillouin scattering from an optical waveguide
NASA Astrophysics Data System (ADS)
Lehmberg, R. H.
1983-05-01
Stimulated Brillouin scattering (SBS) in a multimode optical waveguide is examined, and the parameters that affect the wavefront conjugation fidelity are studied. The nonlinear propagation code is briefly described and the calculated quantities are defined. The parameter study in the low-reflectivity limit is described, and the effects of pump depletion are considered. The waveguide produced significantly higher fidelities than the focused configuration, in agreement with several experimental studies. The light scattered back through the phase aberrator exhibited a far-field intensity profile closely matching that of the incident beam; however, the near-field intensity exhibited large and rapid spatial inhomogeneities across the entire aberrator, even for conjugation fidelities as high as 98 percent. In the absence of pump depletion, the fidelity increased with average pump intensity for amplitude gains up to around e^10 and then decreased slowly and monotonically with higher intensity. For all cases, pump depletion significantly enhanced the fidelity of the wavefront conjugation by inhibiting the small-scale pulling effect.
The Long Noncoding RNA Transcriptome of Dictyostelium discoideum Development.
Rosengarten, Rafael D; Santhanam, Balaji; Kokosar, Janez; Shaulsky, Gad
2017-02-09
Dictyostelium discoideum live in the soil as single cells, engulfing bacteria and growing vegetatively. Upon starvation, tens of thousands of amoebae enter a developmental program that includes aggregation, multicellular differentiation, and sporulation. Major shifts across the protein-coding transcriptome accompany these developmental changes. However, no study has presented a global survey of long noncoding RNAs (ncRNAs) in D. discoideum. To characterize the antisense and long intergenic noncoding RNA (lncRNA) transcriptome, we analyzed previously published developmental time course samples using an RNA-sequencing (RNA-seq) library preparation method that selectively depletes ribosomal RNAs (rRNAs). We detected the accumulation of transcripts for 9833 protein-coding messenger RNAs (mRNAs), 621 lncRNAs, and 162 putative antisense RNAs (asRNAs). The noncoding RNAs were interspersed throughout the genome, and were distinct in expression level, length, and nucleotide composition. The noncoding transcriptome displayed a temporal profile similar to the coding transcriptome, with stages of gradual change interspersed with larger leaps. The transcription profiles of some noncoding RNAs were strongly correlated with known differentially expressed coding RNAs, hinting at a functional role for these molecules during development. Examining the mitochondrial transcriptome, we modeled two novel antisense transcripts. We applied yet another ribosomal depletion method to a subset of the samples to better retain transfer RNA (tRNA) transcripts. We observed polymorphisms in tRNA anticodons that suggested a post-transcriptional means by which D. discoideum compensates for codons missing in the genomic complement of tRNAs. We concluded that the prevalence and characteristics of long ncRNAs indicate that these molecules are relevant to the progression of molecular and cellular phenotypes during development. Copyright © 2017 Rosengarten et al.
Activating RNAs associate with Mediator to enhance chromatin architecture and transcription.
Lai, Fan; Orom, Ulf A; Cesaroni, Matteo; Beringer, Malte; Taatjes, Dylan J; Blobel, Gerd A; Shiekhattar, Ramin
2013-02-28
Recent advances in genomic research have revealed the existence of a large number of transcripts devoid of protein-coding potential in multiple organisms. Although the functional role for long non-coding RNAs (lncRNAs) has been best defined in epigenetic phenomena such as X-chromosome inactivation and imprinting, different classes of lncRNAs may have varied biological functions. We and others have identified a class of lncRNAs, termed ncRNA-activating (ncRNA-a), that function to activate their neighbouring genes using a cis-mediated mechanism. To define the precise mode by which such enhancer-like RNAs function, we depleted factors with known roles in transcriptional activation and assessed their role in RNA-dependent activation. Here we report that depletion of the components of the co-activator complex, Mediator, specifically and potently diminished the ncRNA-induced activation of transcription in a heterologous reporter assay using human HEK293 cells. In vivo, Mediator is recruited to ncRNA-a target genes and regulates their expression. We show that ncRNA-a interact with Mediator to regulate its chromatin localization and kinase activity towards histone H3 serine 10. The Mediator complex harbouring disease-associated mutations displays diminished ability to associate with activating ncRNAs. Chromosome conformation capture confirmed the presence of DNA looping between the ncRNA-a loci and its targets. Importantly, depletion of Mediator subunits or ncRNA-a reduced the chromatin looping between the two loci. Our results identify the human Mediator complex as the transducer of activating ncRNAs and highlight the importance of Mediator and activating ncRNA association in human disease.
Parameter estimation and sensitivity analysis in an agent-based model of Leishmania major infection
Jones, Douglas E.; Dorman, Karin S.
2009-01-01
Computer models of disease take a systems biology approach toward understanding host-pathogen interactions. In particular, data-driven computer model calibration is the basis for inference of immunological and pathogen parameters, assessment of model validity, and comparison between alternative models of immune or pathogen behavior. In this paper, we describe the calibration and analysis of an agent-based model of Leishmania major infection. A model of macrophage loss following uptake of necrotic tissue is proposed to explain macrophage depletion following peak infection. Using Gaussian processes to approximate the computer code, we perform a sensitivity analysis to identify important parameters and to characterize their influence on the simulated infection. The analysis indicates that increasing growth rate can favor or suppress pathogen loads, depending on the infection stage and the pathogen's ability to avoid detection. Subsequent calibration of the model against previously published biological observations suggests that L. major has a relatively slow growth rate and can replicate for an extended period of time before damaging the host cell. PMID:19837088
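The surrogate-based sensitivity step can be illustrated compactly: train a Gaussian process on a modest design of simulator runs, then vary one input at a time over its range and compare how much the emulator's prediction moves. Everything below is a hypothetical stand-in assuming scikit-learn; the toy simulator, parameter names, and ranges are not those of the published agent-based model.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    rng = np.random.default_rng(0)
    lo_bound, hi_bound = np.array([0.1, 0.0]), np.array([1.0, 0.9])

    def toy_simulator(theta):
        # Stand-in for one agent-based model run: scalar summary (e.g., peak pathogen load)
        # as a function of [growth_rate, detection_prob], plus simulator noise.
        growth, detect = theta
        return growth * 80.0 * (1.0 - detect) + rng.normal(0.0, 1.0)

    X = rng.uniform(lo_bound, hi_bound, size=(60, 2))   # random space-filling design
    y = np.array([toy_simulator(x) for x in X])

    gp = GaussianProcessRegressor(ConstantKernel() * RBF([0.2, 0.2]), normalize_y=True)
    gp.fit(X, y)

    # Crude main-effect sensitivity: prediction variance when one input sweeps its range
    # while the other is fixed at its midpoint.
    grid = np.linspace(0.0, 1.0, 50)
    midpoint = (lo_bound + hi_bound) / 2.0
    for i, name in enumerate(["growth_rate", "detection_prob"]):
        pts = np.tile(midpoint, (50, 1))
        pts[:, i] = lo_bound[i] + grid * (hi_bound[i] - lo_bound[i])
        print(name, "main-effect variance:", float(gp.predict(pts).var()))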
NASA Astrophysics Data System (ADS)
Díez, C. J.; Cabellos, O.; Martínez, J. S.
2014-04-01
The uncertainties in the isotopic composition throughout burnup due to nuclear data uncertainties are analysed. The different sources of uncertainty (decay data, fission yields, and cross sections) are propagated individually, and their effects assessed. Two applications are studied: EFIT (an ADS-like reactor) and ESFR (Sodium Fast Reactor). The impacts of the cross-section uncertainties provided by the EAF-2010, SCALE6.1 and COMMARA-2.0 libraries are compared. These Uncertainty Quantification (UQ) studies have been carried out with a Monte Carlo sampling approach implemented in the depletion/activation code ACAB. The implementation has been improved to overcome depletion/activation problems with variations of the neutron spectrum.
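The Monte Carlo sampling approach can be pictured with a two-nuclide chain: draw a capture cross section from its assumed uncertainty, integrate the depletion equations, and repeat to obtain the spread of the product inventory. The sketch below assumes SciPy and uses invented flux, cross-section, and decay values; it is not the ACAB code or any of the cited libraries.

    import numpy as np
    from scipy.integrate import solve_ivp

    rng = np.random.default_rng(5)
    phi = 1.0e14 * 1.0e-24              # neutron flux (n/cm^2/s) times barn-to-cm^2 conversion
    lam = 1.0e-8                        # decay constant of the product (1/s), assumed
    sigma_mean, sigma_rel = 2.0, 0.10   # capture cross section (barn) and 1-sigma relative uncertainty

    def deplete(sigma, t_end=3.0e7, n0=1.0e24):
        # dN_parent/dt = -sigma*phi*N_parent ; dN_product/dt = sigma*phi*N_parent - lambda*N_product
        def rhs(t, n):
            return [-sigma * phi * n[0], sigma * phi * n[0] - lam * n[1]]
        return solve_ivp(rhs, (0.0, t_end), [n0, 0.0], rtol=1e-8).y[1, -1]

    samples = np.array([deplete(rng.normal(sigma_mean, sigma_rel * sigma_mean)) for _ in range(300)])
    print(f"product inventory: {samples.mean():.3e} +/- {samples.std(ddof=1):.3e} atoms")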
Abundances and Depletions of Neutron-capture Elements in the Interstellar Medium
NASA Astrophysics Data System (ADS)
Ritchey, A. M.; Federman, S. R.; Lambert, D. L.
2018-06-01
We present an extensive analysis of the gas-phase abundances and depletion behaviors of neutron-capture elements in the interstellar medium (ISM). Column densities (or upper limits to the column densities) of Ga II, Ge II, As II, Kr I, Cd II, Sn II, and Pb II are determined for a sample of 69 sight lines with high- and/or medium-resolution archival spectra obtained with the Space Telescope Imaging Spectrograph on board the Hubble Space Telescope. An additional 59 sight lines with column density measurements reported in the literature are included in our analysis. Parameters that characterize the depletion trends of the elements are derived according to the methodology developed by Jenkins. (In an appendix, we present similar depletion results for the light element B.) The depletion patterns exhibited by Ga and Ge comport with expectations based on the depletion results obtained for many other elements. Arsenic exhibits much less depletion than expected, and its abundance in low-depletion sight lines may even be supersolar. We confirm a previous finding by Jenkins that the depletion of Kr increases as the overall depletion level increases from one sight line to another. Cadmium shows no such evidence of increasing depletion. We find a significant amount of scatter in the gas-phase abundances of Sn and Pb. For Sn, at least, the scatter may be evidence of real intrinsic abundance variations due to s-process enrichment combined with inefficient mixing in the ISM.
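The depletion-trend fits referred to above reduce, element by element, to a straight-line fit of gas-phase abundance against a sight-line depletion strength factor. The sketch below illustrates that step with invented data points; the functional form follows the Jenkins-style parameterization named in the abstract, but the numbers, and the choice of reference point z_X, are placeholders.

    import numpy as np

    # Hypothetical gas-phase depletions [X_gas/H] for one element along several sight
    # lines, each with a sight-line depletion strength factor F* (values illustrative).
    F_star = np.array([0.10, 0.25, 0.40, 0.55, 0.70, 0.85, 1.00])
    depletion = np.array([-0.35, -0.42, -0.55, -0.60, -0.72, -0.80, -0.95])

    # Linear trend [X/H] = B_X + A_X * (F* - z_X); with z_X fixed, the slope A_X and
    # intercept B_X follow from ordinary least squares.
    z_X = 0.5
    A_X, B_X = np.polyfit(F_star - z_X, depletion, 1)
    print(f"A_X (slope) = {A_X:.2f}, B_X (depletion at F* = z_X) = {B_X:.2f}")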
Xia, Bairong; Hou, Yan; Chen, Hong; Yang, Shanshan; Liu, Tianbo; Lin, Mei; Lou, Ge
2017-03-21
In this study, we report that the long non-coding RNA ZFAS1 was upregulated in epithelial ovarian cancer tissues and was negatively correlated with the overall survival rate of patients with epithelial ovarian cancer. While depletion of ZFAS1 inhibited proliferation, migration, and the development of chemoresistance, overexpression of ZFAS1 led to an even higher proliferation rate, migration activity, and chemoresistance in epithelial ovarian cancer cell lines. We further found that miR-150-5p is a potential target of ZFAS1 and is downregulated in epithelial ovarian cancer tissue. MiR-150-5p subsequently inhibited expression of the transcription factor Sp1, as evidenced by luciferase assays. Inhibition of miR-150-5p at least partly rescued the suppressed proliferation and migration induced by depletion of ZFAS1 in epithelial ovarian cancer cells. Taken together, our findings reveal a critical role of the ZFAS1/miR-150-5p/Sp1 axis in promoting proliferation, migration, and the development of chemoresistance in epithelial ovarian cancer, and ZFAS1/miR-150-5p may serve as novel markers and therapeutic targets of epithelial ovarian cancer.
Collins, Richard A; Stajich, Jason E; Field, Deborah J; Olive, Joan E; DeAbreu, Diane M
2015-05-01
When we expressed a small (0.9 kb) nonprotein-coding transcript derived from the mitochondrial VS plasmid in the nucleus of Neurospora we found that it was efficiently spliced at one or more of eight 5' splice sites and ten 3' splice sites, which are present apparently by chance in the sequence. Further experimental and bioinformatic analyses of other mitochondrial plasmids, random sequences, and natural nuclear genes in Neurospora and other fungi indicate that fungal spliceosomes recognize a wide range of 5' splice site and branchpoint sequences and predict introns to be present at high frequency in random sequence. In contrast, analysis of intronless fungal nuclear genes indicates that branchpoint, 5' splice site and 3' splice site consensus sequences are underrepresented compared with random sequences. This underrepresentation of splicing signals is sufficient to deplete the nuclear genome of splice sites at locations that do not comprise biologically relevant introns. Thus, the splicing machinery can recognize a wide range of splicing signal sequences, but splicing still occurs with great accuracy, not because the splicing machinery distinguishes correct from incorrect introns, but because incorrect introns are substantially depleted from the genome. © 2015 Collins et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scaglione, John M; Mueller, Don; Wagner, John C
2011-01-01
One of the most significant remaining challenges associated with expanded implementation of burnup credit in the United States is the validation of depletion and criticality calculations used in the safety evaluation - in particular, the availability and use of applicable measured data to support validation, especially for fission products. Applicants and regulatory reviewers have been constrained by both a scarcity of data and a lack of clear technical basis or approach for use of the data. U.S. Nuclear Regulatory Commission (NRC) staff have noted that the rationale for restricting their Interim Staff Guidance on burnup credit (ISG-8) to actinide-only is based largely on the lack of clear, definitive experiments that can be used to estimate the bias and uncertainty for computational analyses associated with using burnup credit. To address the issue of validation, the NRC initiated a project with the Oak Ridge National Laboratory to (1) develop and establish a technically sound validation approach (both depletion and criticality) for commercial spent nuclear fuel (SNF) criticality safety evaluations based on best-available data and methods and (2) apply the approach for representative SNF storage and transport configurations/conditions to demonstrate its usage and applicability, as well as to provide reference bias results. The purpose of this paper is to describe the criticality (k{sub eff}) validation approach, and resulting observations and recommendations. Validation of the isotopic composition (depletion) calculations is addressed in a companion paper at this conference. For criticality validation, the approach is to utilize (1) available laboratory critical experiment (LCE) data from the International Handbook of Evaluated Criticality Safety Benchmark Experiments and the French Haut Taux de Combustion (HTC) program to support validation of the principal actinides and (2) calculated sensitivities, nuclear data uncertainties, and the limited available fission product LCE data to predict and verify individual biases for relevant minor actinides and fission products. This paper (1) provides a detailed description of the approach and its technical bases, (2) describes the application of the approach for representative pressurized water reactor and boiling water reactor safety analysis models to demonstrate its usage and applicability, (3) provides reference bias results based on the prerelease SCALE 6.1 code package and ENDF/B-VII nuclear cross-section data, and (4) provides recommendations for application of the results and methods to other code and data packages.
FISPACT-II: An Advanced Simulation System for Activation, Transmutation and Material Modelling
NASA Astrophysics Data System (ADS)
Sublet, J.-Ch.; Eastwood, J. W.; Morgan, J. G.; Gilbert, M. R.; Fleming, M.; Arter, W.
2017-01-01
Fispact-II is a code system and library database for modelling activation-transmutation processes, depletion-burn-up, time dependent inventory and radiation damage source terms caused by nuclear reactions and decays. The Fispact-II code, written in object-style Fortran, follows the evolution of material irradiated by neutrons, alphas, gammas, protons, or deuterons, and provides a wide range of derived radiological output quantities to satisfy most needs for nuclear applications. It can be used with any ENDF-compliant group library data for nuclear reactions, particle-induced and spontaneous fission yields, and radioactive decay (including but not limited to TENDL-2015, ENDF/B-VII.1, JEFF-3.2, JENDL-4.0u, CENDL-3.1 processed into fine-group-structure files, GEFY-5.2 and UKDD-16), as well as resolved and unresolved resonance range probability tables for self-shielding corrections and updated radiological hazard indices. The code has many novel features including: extension of the energy range up to 1 GeV; additional neutron physics including self-shielding effects, temperature dependence, thin and thick target yields; pathway analysis; and sensitivity and uncertainty quantification and propagation using full covariance data. The latest ENDF libraries such as TENDL encompass thousands of target isotopes. Nuclear data libraries for Fispact-II are prepared from these using processing codes PREPRO, NJOY and CALENDF. These data include resonance parameters, cross sections with covariances, probability tables in the resonance ranges, PKA spectra, kerma, dpa, gas and radionuclide production and energy-dependent fission yields, supplemented with all 27 decay types. All such data for the five most important incident particles are provided in evaluated data tables. The Fispact-II simulation software is described in detail in this paper, together with the nuclear data libraries. The Fispact-II system also includes several utility programs for code-use optimisation, visualisation and production of secondary radiological quantities. Included in the paper are summaries of results from the suite of verification and validation reports available with the code.
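At its core, the inventory problem such activation-transmutation codes solve is a coupled linear system dN/dt = A N, where A collects reaction and decay rates. A minimal sketch of that calculation for a three-nuclide target-product-daughter chain, using a matrix exponential, is given below; the rates are assumed illustrative values (a Co-60-like 5.27-year product half-life) and the code is in no way an excerpt of FISPACT-II.

    import numpy as np
    from scipy.linalg import expm

    sigma_phi = 1.0e-9                       # effective capture rate sigma*phi (1/s), assumed
    lam_p = np.log(2.0) / (5.27 * 3.156e7)   # product decay constant, 5.27 y half-life
    lam_d = 1.0e-12                          # daughter decay constant (1/s), assumed

    # dN/dt = A N for [target, product, daughter]
    A = np.array([[-sigma_phi, 0.0,    0.0   ],
                  [ sigma_phi, -lam_p, 0.0   ],
                  [ 0.0,        lam_p, -lam_d]])

    N0 = np.array([1.0e24, 0.0, 0.0])        # initial atom numbers
    t = 3.156e7                              # one year of irradiation (s)
    N = expm(A * t) @ N0
    print("inventories:", N, "product activity (Bq):", lam_p * N[1])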
Comparison of Depletion Strategies for the Enrichment of Low-Abundance Proteins in Urine
Filip, Szymon; Vougas, Konstantinos; Zoidakis, Jerome; Latosinska, Agnieszka; Mullen, William; Spasovski, Goce; Mischak, Harald; Vlahou, Antonia; Jankowski, Joachim
2015-01-01
Proteome analysis of complex biological samples for biomarker identification remains challenging, in part because of the extended range of protein concentrations. High-abundance proteins such as albumin and IgG in plasma and urine may interfere with the detection of potential disease biomarkers. Currently, several options are available for the depletion of abundant proteins in plasma; however, the applicability of these methods in urine has not been thoroughly investigated. In this study, we compared different commercially available immunodepletion and ion-exchange based approaches on urine samples from both healthy subjects and CKD patients, for their reproducibility and efficiency in protein depletion. A starting urine volume of 500 μL was used to simulate the conditions of a multi-institutional biomarker discovery study. All depletion approaches showed satisfactory reproducibility (n=5) in protein identification as well as protein abundance. Comparison of the depletion efficiency between the unfractionated and fractionated samples and the different depletion strategies showed efficient depletion in all cases, with the exception of the ion-exchange kit. The depletion efficiency was slightly higher in normal than in CKD samples, and normal samples yielded more protein identifications than CKD samples when using both the initial and the corresponding depleted fractions. Along these lines, a decrease in the amount of albumin and other targets, as applicable, was observed following depletion. Nevertheless, these depletion strategies did not yield a higher number of identifications in urine from either normal subjects or CKD patients. Collectively, when analyzing urine in the context of CKD biomarker identification, no added value of depletion strategies can be observed, and analysis of unfractionated starting urine appears to be preferable. PMID:26208298
Environmental performance of green building code and certification systems.
Suh, Sangwon; Tomar, Shivira; Leighton, Matthew; Kneifel, Joshua
2014-01-01
We examined the potential life-cycle environmental impact reduction of three green building code and certification (GBCC) systems: LEED, ASHRAE 189.1, and IgCC. A recently completed whole-building life cycle assessment (LCA) database from NIST was applied to a prototype building model specified by NREL. EPA's TRACI 2.0 was used for life cycle impact assessment (LCIA). The results showed that the baseline building model generates about 18 thousand metric tons CO2-equiv. of greenhouse gases (GHGs) and consumes 6 terajoules (TJ) of primary energy and 328 million liters of water over its life cycle. Overall, GBCC-compliant building models generated 0% to 25% lower environmental impacts than the baseline case (average 14% reduction). The largest reductions were associated with acidification (25%), human health-respiratory (24%), and global warming (GW) (22%), while no reductions were observed for ozone layer depletion (OD) and land use (LU). The performances of the three GBCC-compliant building models, measured in life-cycle impact reduction, were comparable. A sensitivity analysis showed that the comparative results were reasonably robust, although some results were relatively sensitive to the behavioral parameters, including employee transportation and purchased electricity during the occupancy phase (average sensitivity coefficients 0.26-0.29).
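For readers unfamiliar with the sensitivity coefficients quoted at the end of the abstract, one common definition is the relative change in an output per relative change in an input. The sketch below shows that arithmetic with invented numbers chosen only to land in the 0.26-0.29 range mentioned; it does not use the study's actual model or data.

    def sensitivity_coefficient(y_base, y_perturbed, x_base, x_perturbed):
        # Normalized (elasticity-style) sensitivity: (dy/y) / (dx/x).
        return ((y_perturbed - y_base) / y_base) / ((x_perturbed - x_base) / x_base)

    # Hypothetical: a +10% change in occupant commuting distance raises life-cycle GHGs by 2.7%.
    s = sensitivity_coefficient(y_base=18000.0, y_perturbed=18486.0,
                                x_base=100.0, x_perturbed=110.0)
    print(f"sensitivity coefficient = {s:.2f}")   # ~0.27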
Protecting the Ozone Shield: A New Public Policy
1991-04-01
Report keywords: public policy issue; alternatives; risk management; Clean Air Act; global warming. The surviving text fragments discuss the pattern of global warming commonly known as "the greenhouse effect," an overview of the ozone depletion public policy issue (which the report traces to 1974), and the protection of inhabitants from the harmful effects of increased UV-B radiation and global warming.
NASA Astrophysics Data System (ADS)
Sloma, Tanya Noel
When representing the behavior of commercial spent nuclear fuel (SNF), credit is sought for the reduced reactivity associated with the net depletion of fissile isotopes and the creation of neutron-absorbing isotopes, a process that begins when a commercial nuclear reactor is first operated at power. Burnup credit accounts for the reduced reactivity potential of a fuel assembly and varies with the fuel burnup, cooling time, and the initial enrichment of fissile material in the fuel. With regard to long-term SNF disposal and transportation, tremendous benefits, such as increased capacity, flexibility of design and system operations, and reduced overall costs, provide an incentive to seek burnup credit for criticality safety evaluations. The Nuclear Regulatory Commission issued Interim Staff Guidance 8, Revision 2 in 2002, endorsing burnup credit for actinide composition changes only; credit due to actinides encompasses approximately 30% of the existing pressurized water reactor SNF inventory and could potentially be increased to 90% if fission product credit were accepted. However, one significant issue for utilizing full burnup credit, compensating for actinide and fission product composition changes, is establishing a set of depletion parameters that produce an adequately conservative representation of the fuel's isotopic inventory. Depletion parameters can have a significant effect on the isotopic inventory of the fuel, and thus the residual reactivity. This research seeks to quantify the reactivity impact on a system from dominant depletion parameters (i.e., fuel temperature, moderator density, burnable poison rod, burnable poison rod history, and soluble boron concentration). Bounding depletion parameters were developed by statistical evaluation of a database containing reactor operating histories; the database was generated from summary reports of commercial reactor criticality data. Through depletion calculations utilizing the SCALE 6 code package, several light water reactor assembly designs and in-core locations are analyzed to establish a combination of depletion parameters that conservatively represents the fuel's isotopic inventory, as an initiative to take credit for fuel burnup in criticality safety evaluations for transportation and storage of SNF.
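The statistical step described above, deriving bounding values of each depletion parameter from a database of operating histories, can be sketched very simply: collect the observed values of a parameter and take a suitably conservative quantile of the distribution. The example below is a generic illustration with a synthetic dataset and an arbitrary 95th-percentile choice; the parameter, its distribution, and the statistic actually used in this research are not reproduced here.

    import numpy as np

    rng = np.random.default_rng(7)
    # Synthetic cycle-average soluble boron records (ppm), standing in for the
    # reactor operating-history database (values illustrative only).
    boron_ppm = rng.normal(loc=650.0, scale=60.0, size=500)

    bounding_boron = np.percentile(boron_ppm, 95.0)
    print(f"bounding soluble boron for depletion calculations: {bounding_boron:.0f} ppm")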
Effects of developer depletion on image quality of Kodak Insight and Ektaspeed Plus films.
Casanova, M S; Casanova, M L S; Haiter-Neto, F
2004-03-01
To evaluate the effect of processing solution depletion on the image quality of F-speed dental X-ray film (Insight), compared with Ektaspeed Plus. The films were exposed with a phantom and developed under manual and automatic conditions, in fresh and progressively depleted solutions. The comparison was based on densitometric analysis and subjective appraisal. Processing solution depletion behaved differently depending on whether the manual or automatic technique was used, and the two films were affected to different degrees by depleted processing solutions. Developer depletion was faster in automatic than in manual processing. Insight film was more resistant than Ektaspeed Plus to the effects of processing solution depletion. In the present study there was agreement between the objective and subjective appraisals.
Development of Selective Clk1 and -4 Inhibitors for Cellular Depletion of Cancer-Relevant Proteins.
ElHady, Ahmed K; Abdel-Halim, Mohammad; Abadi, Ashraf H; Engel, Matthias
2017-07-13
In cancer cells, kinases of the Clk family control the supply of full-length, functional mRNAs coding for a variety of proteins essential to cell growth and survival. Thus, inhibition of Clks might become a novel anticancer strategy, leading to a selective depletion of cancer-relevant proteins after turnover. On the basis of a Weinreb amide hit compound, we designed and synthesized a diverse set of methoxybenzothiophene-2-carboxamides, of which the N-benzylated derivative showed enhanced Clk1 inhibitory activity. Introduction of a m-fluorine in the benzyl moiety eventually led to the discovery of compound 21b, a potent inhibitor of Clk1 and -4 (IC50 = 7 and 2.3 nM, respectively), exhibiting an unprecedented selectivity over Dyrk1A. 21b triggered the depletion of EGFR, HDAC1, and p70S6 kinase from the cancer cells, with potencies in line with the measured GI50 values. In contrast, the cellular effects of congener 21a, which inhibited Clk1 only weakly, were substantially lower.
Sadhukhan, Ratan; Chowdhury, Priyanka; Ghosh, Sourav; Ghosh, Utpal
2018-06-01
Telomeric DNA can form specialized nucleoprotein structures with telomere-associated proteins that hide free DNA ends, or G-quadruplex structures under certain conditions, especially in the presence of a G-quadruplex ligand. Telomeric DNA is transcribed into the non-coding telomere repeat-containing RNA (TERRA), whose biogenesis and function are poorly understood. Our aim was to determine the role of telomere-associated proteins and telomere structures in TERRA transcription. We silenced four telomere-associated genes [two shelterin (TRF1, TRF2) and two non-shelterin (PARP-1, SLX4)] using siRNA and verified depletion at the protein level. Knockdown of one gene modulated the expression of the other telomere-associated genes and increased TERRA from the 10q, 15q, XpYp and XqYq chromosomes in A549 cells. Telomeres were destabilized or damaged by the G-quadruplex ligand pyridostatin (PDS) and by bleomycin. Telomere dysfunction-induced foci (TIFs) were observed in each case of protein depletion or treatment with PDS or bleomycin. TERRA levels were elevated by PDS and bleomycin treatment alone or in combination with depletion of telomere-associated proteins.
Meta-analysis of depleted uranium levels in the Balkan region.
Besic, Larisa; Muhovic, Imer; Asic, Adna; Kurtovic-Kozaric, Amina
2017-06-01
In recent years, contradictory data have been published on the connection between the presence of depleted uranium and an increased cancer incidence among military personnel deployed in the Balkans during the 1992-1999 wars. This has led to numerous research articles investigating possible depleted uranium contamination of the afflicted regions of the Balkan Peninsula, namely Bosnia & Herzegovina, Serbia, Kosovo and Montenegro. The aim of this study was to collect data from previously published reports investigating the levels of depleted uranium in the Balkans and to present the data in the form of a meta-analysis, providing a clear picture of the extent of depleted uranium contamination after the Balkan conflict. In addition, we tested the hypothesis that there is a correlation between the levels of depleted uranium and the assumed depleted uranium-related health effects. Our results suggest that the majority of the examined sites contain natural uranium, while the area of Kosovo appears to be most heavily afflicted by depleted uranium pollution, followed by Bosnia & Herzegovina. Furthermore, the results indicate that it is not possible to establish a valid correlation between the health effects and depleted uranium-contaminated areas. We therefore suggest a structured collaborative plan of action in which long-term monitoring of the residents of depleted uranium-afflicted areas would be performed. In conclusion, while the possibility of depleted uranium toxicity in post-conflict regions appears to exist, there is currently no definitive proof of such effects, owing to insufficient studies of potentially afflicted populations and the lack of a common epidemiological approach in the reviewed literature. Copyright © 2017 Elsevier Ltd. All rights reserved.
Public money and human purpose: The future of taxes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roodman, D.M.
1995-09-01
Most countries use taxes and subsidies that undermine the well-being of both taxpayers and the environment, but there are some positive, and now proven, alternatives. One of the most powerful tools that a government can use to guide its economy is its tax code. What politicians often overlook is that even though taxes are inevitable, distortionary ones are not. In fact, some taxes do no harm to the economy, and others, such as pollution taxes, actually help it to work better. However, there is a chronic tendency to undertax destructive activities such as pollution and resource depletion, activities which can threaten long-term economic security. By making environmental destruction cheap or even free, governments let people and businesses ignore the costs they impose on others and on the future. This article explores the possibilities of turning today's taxing philosophy and subsidizing priorities completely around. To shore up economic security and brake economic decline, good activities need to be taxed less; to preserve the environmental viability of modern economies over the long term, bad activities need to be taxed more. The topics discussed include what tax codes should do: (1) shift from taxing income and sales to taxing exploitation of resources, when that exploitation generates windfall profits; (2) calibrate the new taxes so polluters and depleters feel the costs of the harm they do to others of their own and future generations; and (3) shape the tax code to help people participate and survive in the modern economy. The article also considers what fiscal reform will do to businesses. 2 tabs.
NASA Astrophysics Data System (ADS)
Lee, Hyomin; Jung, Yeonsu; Park, Sungmin; Kim, Ho-Young; Kim, Sung Jae
2016-11-01
Generally, an ion depletion region near a permselective medium is induced by a predominant ion flux through the medium. An external electric field or hydraulic pressure has typically been reported as the driving force. Here, imbibition through the nanoporous medium was chosen as the mechanism to spontaneously generate the ion depletion region: the water-absorbing process leads to the predominant ion flux, so that spontaneous formation of the ion depletion zone is expected even without additional driving forces beyond the inherent capillary action. In this presentation, we derive analytical solutions for this spontaneous phenomenon using a perturbation method and asymptotic analysis. Using the analysis, we found that there is also a spontaneous accumulation regime, depending on the mobility of the dissolved electrolytic species. A rigorous analysis of the spontaneous ion depletion and accumulation phenomena would therefore provide a key perspective for the control of ion transport in nanofluidic systems such as desalinators, preconcentrators, and energy harvesting devices. Samsung Research Funding Center of Samsung Electronics (SRFC-MA1301-02) and BK21 plus program of Creative Research Engineer Development IT, Seoul National University.
A Multilab Preregistered Replication of the Ego-Depletion Effect.
Hagger, Martin S; Chatzisarantis, Nikos L D; Alberts, Hugo; Anggono, Calvin Octavianus; Batailler, Cédric; Birt, Angela R; Brand, Ralf; Brandt, Mark J; Brewer, Gene; Bruyneel, Sabrina; Calvillo, Dustin P; Campbell, W Keith; Cannon, Peter R; Carlucci, Marianna; Carruth, Nicholas P; Cheung, Tracy; Crowell, Adrienne; De Ridder, Denise T D; Dewitte, Siegfried; Elson, Malte; Evans, Jacqueline R; Fay, Benjamin A; Fennis, Bob M; Finley, Anna; Francis, Zoë; Heise, Elke; Hoemann, Henrik; Inzlicht, Michael; Koole, Sander L; Koppel, Lina; Kroese, Floor; Lange, Florian; Lau, Kevin; Lynch, Bridget P; Martijn, Carolien; Merckelbach, Harald; Mills, Nicole V; Michirev, Alexej; Miyake, Akira; Mosser, Alexandra E; Muise, Megan; Muller, Dominique; Muzi, Milena; Nalis, Dario; Nurwanti, Ratri; Otgaar, Henry; Philipp, Michael C; Primoceri, Pierpaolo; Rentzsch, Katrin; Ringos, Lara; Schlinkert, Caroline; Schmeichel, Brandon J; Schoch, Sarah F; Schrama, Michel; Schütz, Astrid; Stamos, Angelos; Tinghög, Gustav; Ullrich, Johannes; vanDellen, Michelle; Wimbarti, Supra; Wolff, Wanja; Yusainy, Cleoputri; Zerhouni, Oulmann; Zwienenberg, Maria
2016-07-01
Good self-control has been linked to adaptive outcomes such as better health, cohesive personal relationships, success in the workplace and at school, and less susceptibility to crime and addictions. In contrast, self-control failure is linked to maladaptive outcomes. Understanding the mechanisms by which self-control predicts behavior may assist in promoting better regulation and outcomes. A popular approach to understanding self-control is the strength or resource depletion model. Self-control is conceptualized as a limited resource that becomes depleted after a period of exertion, resulting in self-control failure. The model has typically been tested using a sequential-task experimental paradigm, in which people completing an initial self-control task have reduced self-control capacity and poorer performance on a subsequent task, a state known as ego depletion. Although a meta-analysis of ego-depletion experiments found a medium-sized effect, subsequent meta-analyses have questioned the size and existence of the effect and identified instances of possible bias. The analyses served as a catalyst for the current Registered Replication Report of the ego-depletion effect. Multiple laboratories (k = 23, total N = 2,141) conducted replications of a standardized ego-depletion protocol based on a sequential-task paradigm by Sripada et al. Meta-analysis of the studies revealed that the size of the ego-depletion effect was small, with 95% confidence intervals (CIs) that encompassed zero (d = 0.04, 95% CI [-0.07, 0.15]). We discuss implications of the findings for the ego-depletion effect and the resource depletion model of self-control. © The Author(s) 2016.
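For orientation, the pooled effect size and confidence interval reported by such a multilab analysis come from inverse-variance weighting of the per-laboratory standardized mean differences. The fragment below is a minimal fixed-effect sketch with invented study values; the replication project's actual per-site data and meta-analytic model are not reproduced here.

    import numpy as np

    def pooled_effect(d, n1, n2):
        # Fixed-effect meta-analysis of Cohen's d with the usual large-sample variance.
        d, n1, n2 = (np.asarray(a, float) for a in (d, n1, n2))
        var = (n1 + n2) / (n1 * n2) + d**2 / (2.0 * (n1 + n2))
        w = 1.0 / var
        d_bar = np.sum(w * d) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        return d_bar, d_bar - 1.96 * se, d_bar + 1.96 * se

    # Hypothetical per-laboratory effects and group sizes, for illustration only.
    d_bar, lo, hi = pooled_effect(d=[0.15, -0.05, 0.08, 0.00, 0.10],
                                  n1=[45, 50, 48, 52, 47], n2=[44, 49, 50, 51, 46])
    print(f"pooled d = {d_bar:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")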
Bennett, Steffany A. L.; Valenzuela, Nicolas; Xu, Hongbin; Franko, Bettina; Fai, Stephen; Figeys, Daniel
2013-01-01
Not all of the mysteries of life lie in our genetic code. Some can be found buried in our membranes. These shells of fat, sculpted in the central nervous system into the cellular (and subcellular) boundaries of neurons and glia, are themselves complex systems of information. The diversity of neural phospholipids, coupled with their chameleon-like capacity to transmute into bioactive molecules, provides a vast repertoire of immediate response second messengers. The effects of compositional changes on synaptic function have only begun to be appreciated. Here, we mined 29 neurolipidomic datasets for changes in neuronal membrane phospholipid metabolism in Alzheimer's Disease (AD). Three overarching metabolic disturbances were detected. We found that an increase in the hydrolysis of platelet activating factor precursors and ethanolamine-containing plasmalogens, coupled with a failure to regenerate relatively rare alkyl-acyl and alkenyl-acyl structural phospholipids, correlated with disease severity. Accumulation of specific bioactive metabolites [i.e., PC(O-16:0/2:0) and PE(P-16:0/0:0)] was associated with aggravating tau pathology, enhancing vesicular release, and signaling neuronal loss. Finally, depletion of PI(16:0/20:4), PI(16:0/22:6), and PI(18:0/22:6) was implicated in accelerating Aβ42 biogenesis. Our analysis further suggested that converging disruptions in platelet activating factor, plasmalogen, phosphoinositol, phosphoethanolamine (PE), and docosahexaenoic acid metabolism may contribute mechanistically to catastrophic vesicular depletion, impaired receptor trafficking, and morphological dendritic deformation. Together, this analysis supports an emerging hypothesis that aberrant phospholipid metabolism may be one of multiple critical determinants required for Alzheimer disease conversion. PMID:23882219
Constitutional trisomy 8 mosaicism as a model for epigenetic studies of aneuploidy
2013-01-01
Background To investigate epigenetic patterns associated with aneuploidy we used constitutional trisomy 8 mosaicism (CT8M) as a model, enabling analyses of single cell clones, harboring either trisomy or disomy 8, from the same patient; this circumvents any bias introduced by using cells from unrelated, healthy individuals as controls. We profiled gene and miRNA expression as well as genome-wide and promoter specific DNA methylation and hydroxymethylation patterns in trisomic and disomic fibroblasts, using microarrays and methylated DNA immunoprecipitation. Results Trisomy 8-positive fibroblasts displayed a characteristic expression and methylation phenotype distinct from disomic fibroblasts, with the majority (65%) of chromosome 8 genes in the trisomic cells being overexpressed. However, 69% of all deregulated genes and non-coding RNAs were not located on this chromosome. Pathway analysis of the deregulated genes revealed that cancer, genetic disorder, and hematopoiesis were top ranked. The trisomy 8-positive cells displayed depletion of 5-hydroxymethylcytosine and global hypomethylation of gene-poor regions on chromosome 8, thus partly mimicking the inactivated X chromosome in females. Conclusions Trisomy 8 affects genes situated also on other chromosomes which, in cooperation with the observed chromosome 8 gene dosage effect, has an impact on the clinical features of CT8M, as demonstrated by the pathway analysis revealing key features that might explain the increased incidence of hematologic malignancies in CT8M patients. Furthermore, we hypothesize that the general depletion of hydroxymethylation and global hypomethylation of chromosome 8 may be unrelated to gene expression regulation, instead being associated with a general mechanism of chromatin processing and compartmentalization of additional chromosomes. PMID:23816241
Compound specific isotope analysis was combined with phospholipid fatty acid (PLFA) analysis to identify methanotrophic activity in members of the sedimentary microbial community in the Altamaha and Savannah River estuaries in Georgia. 13C-depleted PLFAs indicate methane utilizat...
Quantum Mechanical Modeling of Ballistic MOSFETs
NASA Technical Reports Server (NTRS)
Svizhenko, Alexei; Anantram, M. P.; Govindan, T. R.; Biegel, Bryan (Technical Monitor)
2001-01-01
The objective of this project was to develop theory, approximations, and computer code to model quasi 1D structures such as nanotubes, DNA, and MOSFETs: (1) Nanotubes: Influence of defects on ballistic transport, electro-mechanical properties, and metal-nanotube coupling; (2) DNA: Model electron transfer (biochemistry) and transport experiments, and sequence dependence of conductance; and (3) MOSFETs: 2D doping profiles, polysilicon depletion, source to drain and gate tunneling, understand ballistic limit.
XPOSE: the Exxon Nuclear revised LEOPARD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skogen, F.B.
1975-04-01
Main differences between XPOSE and LEOPARD codes used to generate fast and thermal neutron spectra and cross sections are presented. Models used for fast and thermal spectrum calculations as well as the depletion calculations considering U-238 chain, U-235 chain, xenon and samarium, fission products and boron-10 are described. A detailed description of the input required to run XPOSE and a description of the output are included. (FS)
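The depletion step described above (tracking the U-238 and U-235 chains plus xenon, samarium, and boron-10) amounts to solving coupled Bateman equations once the spectrum code has collapsed cross sections. The sketch below is not taken from XPOSE: it solves a deliberately short chain (235U fission feeding 135I and 135Xe) with a matrix exponential, and the one-group flux, cross sections, and yields are illustrative values only.

import numpy as np
from scipy.linalg import expm

phi = 3e13               # one-group flux, n/cm^2/s (assumed)
barn = 1e-24
sig_f_u235 = 50 * barn   # illustrative one-group fission cross section, cm^2
sig_a_xe135 = 2.6e6 * barn
y_i135, y_xe135 = 0.063, 0.003          # illustrative cumulative/direct yields
lam_i135 = np.log(2) / (6.57 * 3600)    # decay constants, 1/s
lam_xe135 = np.log(2) / (9.14 * 3600)

# State vector [N_U235, N_I135, N_Xe135]; dN/dt = A N
A = np.array([
    [-sig_f_u235 * phi,           0.0,        0.0],
    [y_i135 * sig_f_u235 * phi,  -lam_i135,   0.0],
    [y_xe135 * sig_f_u235 * phi,  lam_i135,  -(lam_xe135 + sig_a_xe135 * phi)],
])

N0 = np.array([1.0e21, 0.0, 0.0])       # atoms/cm^3 at start of step
dt = 24 * 3600                          # one-day depletion step, s
N = expm(A * dt) @ N0
print("U-235, I-135, Xe-135 after 1 day:", N)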
Performance upgrades to the MCNP6 burnup capability for large scale depletion calculations
Fensin, M. L.; Galloway, J. D.; James, M. R.
2015-04-11
The first MCNP-based inline Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. With the merger of MCNPX and MCNP5, MCNP6 combined the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. The new MCNP6 depletion capability was first showcased at the International Congress for Advancements in Nuclear Power Plants (ICAPP) meeting in 2012. At that conference, the new capabilities addressed included the combined distributed and shared memory parallel architecture for the burnup capability, improved memory management, physics enhancements, and new predictability as compared to the H. B. Robinson benchmark. At Los Alamos National Laboratory, a special-purpose cluster named "tebow" was constructed to maximize available RAM per CPU and to leverage swap space on solid-state hard drives, allowing larger scale depletion calculations (with significantly more burnable regions than previously examined). As the MCNP6 burnup capability was scaled to larger numbers of burnable regions, a noticeable slowdown was observed. To combat this slowdown, new performance upgrades were developed and integrated into MCNP6 1.2. This paper details two specific computational performance strategies for improving calculation speed: (1) retrieval of cross sections during transport; and (2) the tallying mechanisms specific to burnup in MCNP.
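As a rough illustration of the first strategy named above (avoiding repeated cross-section retrieval during transport), the sketch below memoizes a lookup keyed on nuclide, temperature, and reaction. The lookup function and its keys are hypothetical stand-ins; MCNP6's internal data structures are not represented here.

from functools import lru_cache

@lru_cache(maxsize=None)
def one_group_xs(nuclide_zaid: int, temperature_K: float, mt: int) -> float:
    # Stand-in for an expensive fetch-and-collapse of a pointwise cross section
    return 1.0e-24 * (nuclide_zaid % 100) * (temperature_K / 300.0) ** -0.5

# Repeated calls with the same (nuclide, temperature, reaction) hit the cache,
# i.e. the cross section is retrieved once per material state rather than
# once per collision.
for _ in range(1_000_000):
    one_group_xs(92235, 600.0, 18)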
CREME96 and Related Error Rate Prediction Methods
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.
2012-01-01
Predicting the rate of occurrence of single event effects (SEEs) in space requires knowledge of the radiation environment and the response of electronic devices to that environment. Several analytical models have been developed over the past 36 years to predict SEE rates. The first error rate calculations were performed by Binder, Smith and Holman. Bradford, and Pickel and Blandford in their CRIER (Cosmic-Ray-Induced-Error-Rate) analysis code, introduced the basic Rectangular Parallelepiped (RPP) method for error rate calculations. For the radiation environment at the part, both made use of the cosmic ray LET (Linear Energy Transfer) spectra calculated by Heinrich for various absorber depths. A more detailed model for the space radiation environment within spacecraft was developed by Adams and co-workers. This model, together with a reformulation of the RPP method published by Pickel and Blandford, was used to create the CREME (Cosmic Ray Effects on Micro-Electronics) code. About the same time, Shapiro wrote the CRUP (Cosmic Ray Upset Program) based on the RPP method published by Bradford. It was the first code to specifically take into account charge collection from outside the depletion region due to deformation of the electric field caused by the incident cosmic ray. Other early rate prediction methods and codes include the Single Event Figure of Merit, NOVICE, the Space Radiation code, and the effective flux method of Binder, which is the basis of the SEFA (Scott Effective Flux Approximation) model. By the early 1990s it was becoming clear that CREME and the other early models needed revision. This revision, CREME96, was completed and released as a WWW-based tool, one of the first of its kind. The revisions in CREME96 included improved environmental models and improved models for calculating single event effects. The need for a revision of CREME also stimulated the development of the CHIME (CRRES/SPACERAD Heavy Ion Model of the Environment) and MACREE (Modeling and Analysis of Cosmic Ray Effects in Electronics). The Single Event Figure of Merit method was also revised to use the solar minimum galactic cosmic ray spectrum and extended to circular orbits down to 200 km at any inclination. More recently, a series of commercial codes was developed by TRAD (Test & Radiations), which includes the OMERE code for calculating single event effects. There are other error rate prediction methods which use Monte Carlo techniques. In this chapter the analytic methods for estimating the environment within spacecraft will be discussed.
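A heavily simplified sketch of the rate integral underlying these methods is given below: an assumed differential LET flux is folded with a step-function device cross section that turns on above a threshold LET. The full RPP method additionally integrates over the chord-length distribution of the sensitive volume, which is omitted here, and all numbers are illustrative rather than CREME96 data.

import numpy as np

let = np.logspace(-1, 2, 200)            # LET grid, MeV*cm^2/mg (assumed)
flux_diff = 1.0e-2 * let ** -3           # assumed differential flux,
                                         # particles/(cm^2*s*(MeV*cm^2/mg))
let_threshold = 5.0                      # device threshold LET (assumed)
sigma_sat = 1.0e-6                       # saturated cross section, cm^2/bit (assumed)

sigma = np.where(let >= let_threshold, sigma_sat, 0.0)
rate_per_bit = np.trapz(flux_diff * sigma, let)   # upsets/(bit*s)
print(f"~{rate_per_bit * 86400:.3e} upsets/bit/day")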
Villada, Juan C.; Brustolini, Otávio José Bernardes
2017-01-01
Gene codon optimization may be impaired by the misinterpretation of the frequency and optimality of codons. Although recent studies have revealed the effects of codon usage bias (CUB) on protein biosynthesis, an integrated perspective of the biological role of individual codons remains unknown. Unlike previous studies, we show, through an integrated framework, that attributes of codons such as frequency, optimality and positional dependency should be combined to unveil individual codon contribution to protein biosynthesis. We designed a codon quantification method for assessing CUB as a function of position within genes with a novel constraint: the relativity of position-dependent codon usage shaped by coding sequence length. Thus, we propose a new way of identifying the enrichment, depletion and non-uniform positional distribution of codons in different regions of yeast genes. We clustered codons that shared attributes of frequency and optimality. The cluster of non-optimal codons with rare occurrence displayed two remarkable characteristics: higher codon decoding time than the frequent–non-optimal cluster and enrichment at the 5′-end region, where optimal codons with the highest frequency are depleted. Interestingly, frequent codons with non-optimal adaptation to tRNAs are uniformly distributed in Saccharomyces cerevisiae genes, suggesting their determinant role as a speed regulator in protein elongation. PMID:28449100
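A minimal sketch of the kind of position-dependent codon counting described above is shown below: codons are binned by length-normalized position so that 5'-end enrichment or depletion can be compared across genes of different lengths. The sequences are toy examples and the binning scheme is an assumption, not the authors' exact method.

from collections import defaultdict

def positional_codon_counts(cds_list, n_bins=10):
    # Count codons in length-normalized positional bins (bin 0 = 5' end)
    counts = [defaultdict(int) for _ in range(n_bins)]
    for cds in cds_list:
        codons = [cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3)]
        for idx, codon in enumerate(codons):
            b = min(int(n_bins * idx / len(codons)), n_bins - 1)
            counts[b][codon] += 1
    return counts

toy_cds = ["ATGGCTGCTAAAGAAGAATTTTAA", "ATGAAAGCTGAAGCTGCTGCTTAA"]
for b, c in enumerate(positional_codon_counts(toy_cds, n_bins=4)):
    print(f"bin {b}: {dict(c)}")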
Comparative functional characterization of the CSR-1 22G-RNA pathway in Caenorhabditis nematodes
Tu, Shikui; Wu, Monica Z.; Wang, Jie; Cutter, Asher D.; Weng, Zhiping; Claycomb, Julie M.
2015-01-01
As a champion of small RNA research for two decades, Caenorhabditis elegans has revealed the essential Argonaute CSR-1 to play key nuclear roles in modulating chromatin, chromosome segregation and germline gene expression via 22G-small RNAs. Despite CSR-1 being preserved among diverse nematodes, the conservation and divergence in function of the targets of small RNA pathways remains poorly resolved. Here we apply comparative functional genomic analysis between C. elegans and Caenorhabditis briggsae to characterize the CSR-1 pathway, its targets and their evolution. C. briggsae CSR-1-associated small RNAs that we identified by immunoprecipitation-small RNA sequencing overlap with 22G-RNAs depleted in cbr-csr-1 RNAi-treated worms. By comparing 22G-RNAs and target genes between species, we defined a set of CSR-1 target genes with conserved germline expression, enrichment in operons and more slowly evolving coding sequences than other genes, along with a small group of evolutionarily labile targets. We demonstrate that the association of CSR-1 with chromatin is preserved, and show that depletion of cbr-csr-1 leads to chromosome segregation defects and embryonic lethality. This first comparative characterization of a small RNA pathway in Caenorhabditis establishes a conserved nuclear role for CSR-1 and highlights its key role in germline gene regulation across multiple animal species. PMID:25510497
Villada, Juan C; Brustolini, Otávio José Bernardes; Batista da Silveira, Wendel
2017-08-01
Gene codon optimization may be impaired by the misinterpretation of the frequency and optimality of codons. Although recent studies have revealed the effects of codon usage bias (CUB) on protein biosynthesis, an integrated perspective of the biological role of individual codons remains unknown. Unlike previous studies, we show, through an integrated framework, that attributes of codons such as frequency, optimality and positional dependency should be combined to unveil individual codon contribution to protein biosynthesis. We designed a codon quantification method for assessing CUB as a function of position within genes with a novel constraint: the relativity of position-dependent codon usage shaped by coding sequence length. Thus, we propose a new way of identifying the enrichment, depletion and non-uniform positional distribution of codons in different regions of yeast genes. We clustered codons that shared attributes of frequency and optimality. The cluster of non-optimal codons with rare occurrence displayed two remarkable characteristics: higher codon decoding time than the frequent-non-optimal cluster and enrichment at the 5'-end region, where optimal codons with the highest frequency are depleted. Interestingly, frequent codons with non-optimal adaptation to tRNAs are uniformly distributed in Saccharomyces cerevisiae genes, suggesting their determinant role as a speed regulator in protein elongation. © The Author 2017. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
Is Ego Depletion Real? An Analysis of Arguments.
Friese, Malte; Loschelder, David D; Gieseler, Karolin; Frankenbach, Julius; Inzlicht, Michael
2018-03-01
An influential line of research suggests that initial bouts of self-control increase the susceptibility to self-control failure (ego depletion effect). Despite seemingly abundant evidence, some researchers have suggested that evidence for ego depletion was the sole result of publication bias and p-hacking, with the true effect being indistinguishable from zero. Here, we examine (a) whether the evidence brought forward against ego depletion will convince a proponent that ego depletion does not exist and (b) whether arguments that could be brought forward in defense of ego depletion will convince a skeptic that ego depletion does exist. We conclude that despite several hundred published studies, the available evidence is inconclusive. Both additional empirical and theoretical works are needed to make a compelling case for either side of the debate. We discuss necessary steps for future work toward this aim.
Nonlinear pulse propagation and phase velocity of laser-driven plasma waves
NASA Astrophysics Data System (ADS)
Benedetti, Carlo; Rossi, Francesco; Schroeder, Carl; Esarey, Eric; Leemans, Wim
2014-10-01
We investigate and characterize the laser evolution and plasma wave excitation by a relativistically intense, short-pulse laser propagating in a preformed parabolic plasma channel, including the effects of pulse steepening, frequency redshifting, and energy depletion. We derived in 3D, and in the weakly relativistic intensity regime, analytical expressions for the laser energy depletion, the pulse self-steepening rate, the laser intensity centroid velocity, and the phase velocity of the plasma wave. Analytical results have been validated numerically using the 2D-cylindrical, ponderomotive code INF&RNO. We also discuss the extension of these results to the nonlinear regime, where an analytical theory of the nonlinear wake phase velocity is lacking. Work supported by the Office of Science, Office of High Energy Physics, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.
The Impact of Operating Parameters and Correlated Parameters for Extended BWR Burnup Credit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ade, Brian J.; Marshall, William B. J.; Ilas, Germina
Applicants for certificates of compliance for spent nuclear fuel (SNF) transportation and dry storage systems perform analyses to demonstrate that these systems are adequately subcritical per the requirements of Title 10 of the Code of Federal Regulations (10 CFR) Parts 71 and 72. For pressurized water reactor (PWR) SNF, these analyses may credit the reduction in assembly reactivity caused by depletion of fissile nuclides and buildup of neutron-absorbing nuclides during power operation. This credit for reactivity reduction during depletion is commonly referred to as burnup credit (BUC). US Nuclear Regulatory Commission (NRC) staff review BUC analyses according to the guidance in the Division of Spent Fuel Storage and Transportation Interim Staff Guidance (ISG) 8, Revision 3, Burnup Credit in the Criticality Safety Analyses of PWR Spent Fuel in Transportation and Storage Casks.
In-Depth, Reproducible Analysis of Human Plasma Using IgY 14 and SuperMix Immunodepletion.
Beer, Lynn A; Ky, Bonnie; Barnhart, Kurt T; Speicher, David W
2017-01-01
Identification of cancer and other disease biomarkers in human plasma has been exceptionally challenging due to the complex nature of plasma and the presence of a moderate number of high- and medium-abundance proteins which mask low-abundance proteins of interest. As a result, immunoaffinity depletion formats combining multiple antibodies to target the most abundant plasma proteins have become the first stage in most plasma proteome discovery schemes. This protocol describes the use of tandem IgY 14 and SuperMix immunoaffinity depletion to reproducibly remove >99% of total plasma protein. This greatly increases the depth of analysis of human plasma proteomes. Depleted plasma samples can then be analyzed in a single high-resolution LC-MS/MS run on a Q Exactive Plus mass spectrometer, followed by label-free quantitation. If greater depth of analysis is desired, the depleted plasma can be further fractionated by separating the sample for a short distance on a 1D SDS gel and cutting the gel into uniform slices prior to trypsin digestion. Alternatively, the depleted plasma can be reduced, alkylated, and digested with trypsin followed by high-pH reversed-phase HPLC separation.
NASA Astrophysics Data System (ADS)
Malley, C. S.; Braban, C. F.; Dumitrean, P.; Cape, J. N.; Heal, M. R.
2015-03-01
The impact of 27 volatile organic compounds (VOC) on the regional O3 increment was investigated using measurements made at the UK EMEP supersites Harwell (1999-2001 and 2010-2012) and Auchencorth (2012). Ozone at these sites is representative of rural O3 in south-east England and northern UK, respectively. Monthly-diurnal regional O3 increment was defined as the difference between the regional and hemispheric background O3 concentrations, respectively derived from oxidant vs. NOx correlation plots, and cluster analysis of back trajectories arriving at Mace Head, Ireland. At Harwell, which had substantially greater regional ozone increments than at Auchencorth, variation in the regional O3 increment mirrored afternoon depletion of VOCs due to photochemistry (after accounting for diurnal changes in boundary layer mixing depth, and weighting VOC concentrations according to their photochemical ozone creation potential). A positive regional O3 increment occurred consistently during the summer, during which time afternoon photochemical depletion was calculated for the majority of measured VOCs, and to the greatest extent for ethene and m + p-xylene. This indicates that, of the measured VOCs, ethene and m + p-xylene emissions reduction would be most effective in reducing the regional O3 increment, but that reductions in a larger number of VOCs would be required for further improvement. The VOC diurnal photochemical depletion was linked to the sources of the VOC emissions through the integration of gridded VOC emissions estimates over 96 h air-mass back trajectories. This demonstrated that the effectiveness of VOC gridded emissions for use in measurement and modelling studies is limited by the highly aggregated nature of the 11 SNAP source sectors in which they are reported, as monthly variation in speciated VOC trajectory emissions did not reflect monthly changes in individual VOC diurnal photochemical depletion. Additionally, the major VOC emission source sectors during elevated regional O3 increment at Harwell were more narrowly defined through disaggregation of the SNAP emissions to 91 NFR codes (i.e. sectors 3D2 (domestic solvent use), 3D3 (other product use) and 2D2 (food and drink)). However, spatial variation in the contribution of NFR sectors to parent SNAP emissions could only be accounted for at the country level. Hence, the future reporting of gridded VOC emissions in source sectors more highly disaggregated than currently (e.g. to NFR codes) would facilitate a more precise identification of those VOC sources most important for mitigation of the impact of VOCs on O3 formation. In summary, this work presents a clear methodology for achieving a coherent VOC regional-O3-impact chemical climate using measurement data and explores the effect of limited emission and measurement species on the understanding of the regional VOC contribution to O3 concentrations.
NASA Astrophysics Data System (ADS)
Malley, C. S.; Braban, C. F.; Dumitrean, P.; Cape, J. N.; Heal, M. R.
2015-07-01
The impact of 27 volatile organic compounds (VOCs) on the regional O3 increment was investigated using measurements made at the UK EMEP supersites Harwell (1999-2001 and 2010-2012) and Auchencorth (2012). Ozone at these sites is representative of rural O3 in south-east England and northern UK, respectively. The monthly-diurnal regional O3 increment was defined as the difference between the regional and hemispheric background O3 concentrations, respectively, derived from oxidant vs. NOx correlation plots, and cluster analysis of back trajectories arriving at Mace Head, Ireland. At Harwell, which had substantially greater regional O3 increments than Auchencorth, variation in the regional O3 increment mirrored afternoon depletion of anthropogenic VOCs due to photochemistry (after accounting for diurnal changes in boundary layer mixing depth, and weighting VOC concentrations according to their photochemical ozone creation potential). A positive regional O3 increment occurred consistently during the summer, during which time afternoon photochemical depletion was calculated for the majority of measured VOCs, and to the greatest extent for ethene and m+p-xylene. This indicates that, of the measured VOCs, ethene and m+p-xylene emissions reduction would be most effective in reducing the regional O3 increment but that reductions in a larger number of VOCs would be required for further improvement. The VOC diurnal photochemical depletion was linked to anthropogenic sources of the VOC emissions through the integration of gridded anthropogenic VOC emission estimates over 96 h air-mass back trajectories. This demonstrated that one factor limiting the effectiveness of VOC gridded emissions for use in measurement and modelling studies is the highly aggregated nature of the 11 SNAP (Selected Nomenclature for Air Pollution) source sectors in which they are reported, as monthly variation in speciated VOC trajectory emissions did not reflect monthly changes in individual VOC diurnal photochemical depletion. Additionally, the major VOC emission source sectors during elevated regional O3 increment at Harwell were more narrowly defined through disaggregation of the SNAP emissions to 91 NFR (Nomenclature for Reporting) codes (i.e. sectors 3D2 (domestic solvent use), 3D3 (other product use) and 2D2 (food and drink)). However, spatial variation in the contribution of NFR sectors to parent SNAP emissions could only be accounted for at the country level. Hence, the future reporting of gridded VOC emissions in source sectors more highly disaggregated than currently (e.g. to NFR codes) would facilitate a more precise identification of those VOC sources most important for mitigation of the impact of VOCs on O3 formation. In summary, this work presents a clear methodology for achieving a coherent VOC, regional-O3-impact chemical climate using measurement data and explores the effect of limited emission and measurement species on the understanding of the regional VOC contribution to O3 concentrations.
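The following sketch illustrates, with invented numbers, the kind of diagnostic described in the two abstracts above: an afternoon photochemical depletion for each VOC estimated as the concentration drop beyond what boundary-layer dilution alone would explain, weighted by a photochemical ozone creation potential (POCP). It is not the authors' code, and the concentrations, mixing depths, and POCP values are placeholders.

vocs = {
    #  name        morning ppb  afternoon ppb  POCP (ethene = 100)
    "ethene":      (1.20,        0.25,          100.0),
    "m+p-xylene":  (0.40,        0.08,          106.0),
    "ethane":      (2.50,        0.80,          12.0),
}
mix_depth_morning, mix_depth_afternoon = 400.0, 1200.0   # metres (assumed)
dilution = mix_depth_morning / mix_depth_afternoon       # dilution-only expectation

for name, (c_am, c_pm, pocp) in vocs.items():
    expected_pm = c_am * dilution                 # what dilution alone would give
    photochem_loss = max(expected_pm - c_pm, 0.0) # loss attributed to photochemistry
    print(f"{name:11s} POCP-weighted depletion = {photochem_loss * pocp:6.1f}")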
Spent fuel pool storage calculations using the ISOCRIT burnup credit tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kucukboyaci, Vefa; Marshall, William BJ J
2012-01-01
In order to conservatively apply burnup credit in spent fuel pool criticality safety analyses, Westinghouse has developed a software tool, ISOCRIT, for generating depletion isotopics. This tool is used to create isotopics data based on specific reactor input parameters, such as design basis assembly type, bounding power/burnup profiles, reactor-specific moderator temperature profiles, pellet percent theoretical density, burnable absorbers, axial blanket regions, and bounding ppm boron concentration. ISOCRIT generates burnup-dependent isotopics using PARAGON, Westinghouse's state-of-the-art and licensed lattice physics code. Generation of isotopics and passing of the data to the subsequent 3D KENO calculations are performed in an automated fashion, thus reducing the chance for human error. Furthermore, ISOCRIT provides the means for responding to any customer request regarding re-analysis due to changed parameters (e.g., power uprate, exit temperature changes, etc.) with a quick turnaround.
Simulation Analysis of Computer-Controlled Pressurization for Mixture Ratio Control
NASA Technical Reports Server (NTRS)
Alexander, Leslie A.; Bishop-Behel, Karen; Benfield, Michael P. J.; Kelley, Anthony; Woodcock, Gordon R.
2005-01-01
A procedural code (C++) simulation was developed to investigate potentials for mixture ratio control of pressure-fed spacecraft rocket propulsion systems by measuring propellant flows, tank liquid quantities, or both, and using feedback from these measurements to adjust propellant tank pressures to set the correct operating mixture ratio for minimum propellant residuals. The pressurization system eliminated mechanical regulators in favor of a computer-controlled, servo-driven throttling valve. We found that a quasi-steady state simulation (pressure and flow transients in the pressurization systems resulting from changes in flow control valve position are ignored) is adequate for this purpose. Monte Carlo methods are used to obtain simulated statistics on propellant depletion. Mixture ratio control algorithms based on proportional-integral-differential (PID) controller methods were developed. These algorithms actually set target tank pressures; the tank pressures are controlled by another PID controller. Simulation indicates this approach can provide reductions in residual propellants.
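A minimal sketch of the nested scheme described above is given below: an outer controller turns mixture-ratio error into a target tank pressure, and an inner controller drives the throttling valve to track that pressure. The PID class supports full PID action, but only the integral gains are exercised here to keep the toy stable, and the flow and pressurization models are crude placeholders, not the C++ simulation reported in the paper.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def step(self, error):
        self.integral += error * self.dt
        deriv = (error - self.prev_err) / self.dt
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

dt = 0.1
mr_target = 1.6                               # oxidizer/fuel mass-flow ratio (assumed)
outer = PID(kp=0.0, ki=200.0, kd=0.0, dt=dt)  # MR error -> oxidizer tank pressure target
inner = PID(kp=0.0, ki=0.01, kd=0.0, dt=dt)   # pressure error -> throttling-valve position
inner.integral = 50.0                         # pre-load so the valve starts near mid-stroke

p_ox, mdot_fuel = 300.0, 2.0                  # psia and kg/s, both assumed
mr = 0.01 * p_ox / mdot_fuel                  # crude flow model: ox flow ~ tank pressure
for _ in range(200):
    p_target = 300.0 + outer.step(mr_target - mr)             # outer loop: target pressure
    valve = min(max(inner.step(p_target - p_ox), 0.0), 1.0)   # inner loop: valve command
    p_ox = 600.0 * valve                      # crude, instantaneous pressurization model
    mr = 0.01 * p_ox / mdot_fuel
print(f"final mixture ratio ~ {mr:.3f} (target {mr_target})")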
A long and abundant non-coding RNA in Lactobacillus salivarius.
Cousin, Fabien J; Lynch, Denise B; Chuat, Victoria; Bourin, Maxence J B; Casey, Pat G; Dalmasso, Marion; Harris, Hugh M B; McCann, Angela; O'Toole, Paul W
2017-09-01
Lactobacillus salivarius, found in the intestinal microbiota of humans and animals, is studied as an example of the sub-dominant intestinal commensals that may impart benefits upon their host. Strains typically harbour at least one megaplasmid that encodes functions contributing to contingency metabolism and environmental adaptation. RNA sequencing (RNA-seq) transcriptomic analysis of L. salivarius strain UCC118 identified the presence of a novel, unusually abundant long non-coding RNA (lncRNA) encoded by the megaplasmid, which represented more than 75 % of the total RNA-seq reads after depletion of rRNA species. The expression level of this 520 nt lncRNA in L. salivarius UCC118 exceeded that of the 16S rRNA; it accumulated during growth, was very stable over time and was also expressed during intestinal transit in a mouse. This lncRNA sequence is specific to the L. salivarius species; however, among 45 L. salivarius genomes analysed, not all (only 34) harboured the sequence for the lncRNA. This lncRNA was produced in 27 tested L. salivarius strains, but at strain-specific expression levels. High-level lncRNA expression correlated with high megaplasmid copy number. Transcriptome analysis of a deletion mutant lacking this lncRNA identified altered expression levels of genes in a number of pathways, but a definitive function of this new lncRNA was not identified. This lncRNA presents distinctive and unique properties, and suggests potential basic and applied scientific developments of this phenomenon.
Transequatorial Propagation and Depletion Precursors
NASA Astrophysics Data System (ADS)
Miller, E. S.; Bust, G. S.; Kaeppler, S. R.; Frissell, N. A.; Paxton, L. J.
2014-12-01
The bottomside equatorial ionosphere in the afternoon and evening sector frequently evolves rapidly from smoothly stratified to violently unstable with large wedges of depleted plasma growing through to the topside on timescales of a few tens of minutes. These depletions have numerous practical impacts on radio propagation, including amplitude scintillation, field-aligned irregularity scatter, HF blackouts, and long-distance transequatorial propagation at frequencies above the MUF. Practical impacts notwithstanding, the pathways and conditions under which depletions form remain a topic of vigorous inquiry some 80 years after their first report. Structuring of the pre-sunset ionosphere---morphology of the equatorial anomalies and long-wavelength undulations of the isodensity contours on the bottomside---are likely to hold some clues to conditions that are conducive to depletion formation. The Conjugate Depletion Experiment is an upcoming transequatorial forward-scatter HF/VHF experiment to investigate pre-sunset undulations and their connection with depletion formation. We will present initial results from the Conjugate Depletion Experiment, as well as a companion analysis of a massive HF propagation data set.
Carter, Evan C; McCullough, Michael E
2014-01-01
Few models of self-control have generated as much scientific interest as has the limited strength model. One of the entailments of this model, the depletion effect, is the expectation that acts of self-control will be less effective when they follow prior acts of self-control. Results from a previous meta-analysis concluded that the depletion effect is robust and medium in magnitude (d = 0.62). However, when we applied methods for estimating and correcting for small-study effects (such as publication bias) to the data from this previous meta-analysis effort, we found very strong signals of publication bias, along with an indication that the depletion effect is actually no different from zero. We conclude that until greater certainty about the size of the depletion effect can be established, circumspection about the existence of this phenomenon is warranted, and that rather than elaborating on the model, research efforts should focus on establishing whether the basic effect exists. We argue that the evidence for the depletion effect is a useful case study for illustrating the dangers of small-study effects as well as some of the possible tools for mitigating their influence in psychological science.
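One common way to estimate and correct for small-study effects of the kind described above is a precision-effect test: regress the observed effect sizes on their standard errors and read the intercept as the effect expected for an infinitely precise study. The sketch below uses invented effect sizes, not the meta-analytic dataset discussed in the abstract.

import numpy as np

d = np.array([0.90, 0.75, 0.60, 0.30, 0.10, 0.05])    # observed effect sizes (toy)
se = np.array([0.45, 0.40, 0.32, 0.18, 0.10, 0.08])   # their standard errors (toy)

w = 1.0 / se**2                                # inverse-variance weights
X = np.column_stack([np.ones_like(se), se])    # intercept + standard-error predictor
W = np.diag(w)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ d)
print(f"bias-corrected estimate (intercept): {beta[0]:.3f}, slope: {beta[1]:.3f}")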
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sublet, J.-Ch.; Koning, A.J.; Forrest, R.A.
The reasons for the conversion of the European Activation File (EAF) into ENDF-6 format are threefold. First, it significantly enhances the JEFF-3.0 release by the addition of an activation file. Second, it considerably increases its usage by adopting a recognized, official file format, allowing existing plug-in processes to be effective. Third, it moves towards a universal nuclear data file, in contrast to the current separate general- and special-purpose files. The format chosen for the JEFF-3.0/A file uses reaction cross sections (MF-3), cross sections (MF-10), and multiplicities (MF-9). Having the data in ENDF-6 format allows the ENDF suite of utilities and checker codes to be used alongside many other utility, visualizing, and processing codes. It is based on the EAF activation file used for many applications from fission to fusion, including dosimetry, inventories, depletion-transmutation, and geophysics. JEFF-3.0/A takes advantage of four generations of EAF files. Extensive benchmarking activities on these files provide feedback and validation with integral measurements. These, in parallel with a detailed graphical analysis based on EXFOR, have been applied, stimulating new measurements and significantly increasing the quality of this activation file. The next step is to include the EAF uncertainty data for all channels into JEFF-3.0/A.
Disruption of the zinc metabolism in rat fœtal brain after prenatal exposure to cadmium.
Ben Mimouna, Safa; Boughammoura, Sana; Chemek, Marouane; Haouas, Zohra; Banni, Mohamed; Messaoudi, Imed
2018-04-25
This study was carried out to investigate the effects of maternal Cd and/or Zn exposure on some parameters of Zn metabolism in the fetal brain of Wistar rats. Female controls and females exposed by the oral route during the gestation period to Cd (50 mg CdCl2/L) and/or Zn (60 mg ZnCl2/L) were used. The male fetuses at 20 days of gestation (GD20) were sacrificed and their brains were taken for histological, chemical and molecular analysis. Zn depletion was observed in the brains of fetuses from mothers exposed to Cd. Histological analysis showed that Cd exposure induces pyknosis in the cortical region and the CA1 region of the hippocampus compared to controls. Under Cd exposure, we noted an overexpression of the gene coding for the membrane transporter involved in the intracellular incorporation of Zn (ZIP6), associated with inhibition of those encoding the transporters involved in the export of Zn into the extracellular medium (ZnT1 and ZnT3). A decrease in the expression of the gene encoding the neurotrophic factor BDNF, associated with overexpression of that encoding the metal regulatory transcription factor 1 (MTF1), a factor involved in Zn homeostasis, was also noted in the Cd group. Interestingly, Zn supply provided a total or partial restoration of the changes induced by Cd exposure. The depletion of brain Zn content, as well as the modification of the expression profile of genes encoding membrane Zn transporters, suggests that the toxic effects of Cd observed in the fetal brain are mediated, in part, by impairment of Zn metabolism. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Liffman, Kurt
1990-01-01
The effects of catastrophic collisional fragmentation and diffuse medium accretion on the interstellar dust system are computed using a Monte Carlo computer model. The Monte Carlo code has as its basis an analytic solution of the bulk chemical evolution of a two-phase interstellar medium, described by Liffman and Clayton (1989). The model dust is subjected to numerous different interstellar processes as it transfers from one interstellar phase to another. Collisional fragmentation was found to be the dominant physical process that shapes the size spectrum of interstellar dust. It was found that, in the diffuse cloud phase, 90 percent of the refractory material is locked up in the dust grains, primarily due to accretion in the molecular medium. This result is consistent with the observed depletions of silicon. Depletions were found to be affected only slightly by diffuse cloud accretion.
Understanding Oxyaquic Classification in Light of Field Data
NASA Astrophysics Data System (ADS)
Lindbo, David L.; Anderson, Debbie; Vick, Roy; Vepraskas, Michael; Amoozegar, Aziz
2014-05-01
Hydropedologic studies related to seasonal saturation and hydraulic conductivity add to our knowledge to make accurate land use interpretations, particularly as related to land application of waste (liquid and solids) and many urban land uses. Soils mapped in the Carolina Slate Belt in the southeastern region of the United States, including the benchmark Tatum and Chewacla Series, are no exception to this and proper identification of seasonal saturation in these soils is critical as urban and suburban development increases in this region. Soils related to the catena may lack the typical 2 chroma redox depletions commonly used to identify seasonal saturation even though high water table is often directly observed in these soils. When a seasonal high water table is determined, the soil may be classified as oxyaquic. However, if 2 chroma depletions are absent (or present at deeper depths than seasonal saturation) local or state land use codes may misidentify the depth to saturation. The hydropedologic data from this study has shown that the redox depletions in this area are indeed related to saturation. This fact has been debated by consultants and local health departments. Prior to this study one prevailing view was that the low chroma features were simply due to stripping or leaching of Fe in old cotton or tobacco fields and in no way was related to saturation. Based on the evidence in this study the interpretation of the redox depletions, oxyaquic conditions, and occurrence of episaturation will need to be reconsidered.
Molecular Pathways: Disrupting polyamine homeostasis as a therapeutic strategy for neuroblastoma
Evageliou, Nicholas F.; Hogarty, Michael D.
2009-01-01
MYC genes are deregulated in a plurality of human cancers. Through direct and indirect mechanisms the MYC network regulates the expression of >15% of the human genome, including both protein-coding and non-coding RNAs. This complexity has complicated efforts to define the principal pathways mediating MYC’s oncogenic activity. MYC plays a central role providing for the bioenergetic and biomass needs of proliferating cells, and polyamines are essential cell constituents supporting many of these functions. The rate-limiting enzyme in polyamine biosynthesis, ODC, is a bona fide MYC target, as are other regulatory enzymes in this pathway. A wealth of data link enhanced polyamine biosynthesis to cancer progression, and polyamine-depletion may limit malignant transformation of pre-neoplastic lesions. Studies using transgenic cancer models also supports that the effect of MYC on tumor initiation and progression can be attenuated through repression of polyamine production. High-risk neuroblastomas (an often lethal embryonal tumor in which MYC activation is paramount) deregulate numerous polyamine enzymes to promote expansion of intracellular polyamine pools. Selective inhibition of key enzymes in this pathway, e.g., using DFMO and/or SAM486, reduces tumorigenesis and synergizes with chemotherapy to regress tumors in pre-clinical models. Here we review the potential clinical application of these and additional polyamine-depletion agents to neuroblastoma and other advanced cancers in which MYC is operative. PMID:19789308
A genome-wide survey of maternal and embryonic transcripts during Xenopus tropicalis development.
Paranjpe, Sarita S; Jacobi, Ulrike G; van Heeringen, Simon J; Veenstra, Gert Jan C
2013-11-06
Dynamics of polyadenylation vs. deadenylation determine the fate of several developmentally regulated genes. Decay of a subset of maternal mRNAs and new transcription define the maternal-to-zygotic transition, but the full complement of polyadenylated and deadenylated coding and non-coding transcripts has not yet been assessed in Xenopus embryos. To analyze the dynamics and diversity of coding and non-coding transcripts during development, both polyadenylated mRNA and ribosomal RNA-depleted total RNA were harvested across six developmental stages and subjected to high throughput sequencing. The maternally loaded transcriptome is highly diverse and consists of both polyadenylated and deadenylated transcripts. Many maternal genes show peak expression in the oocyte and include genes which are known to be the key regulators of events like oocyte maturation and fertilization. Of all the transcripts that increase in abundance between early blastula and larval stages, about 30% of the embryonic genes are induced by fourfold or more by the late blastula stage and another 35% by late gastrulation. Using a gene model validation and discovery pipeline, we identified novel transcripts and putative long non-coding RNAs (lncRNA). These lncRNA transcripts were stringently selected as spliced transcripts generated from independent promoters, with limited coding potential and a codon bias characteristic of noncoding sequences. Many lncRNAs are conserved and expressed in a developmental stage-specific fashion. These data reveal dynamics of transcriptome polyadenylation and abundance and provides a high-confidence catalogue of novel and long non-coding RNAs.
Integral experiments on thorium assemblies with D-T neutron source
NASA Astrophysics Data System (ADS)
Liu, Rong; Yang, Yiwei; Feng, Song; Zheng, Lei; Lai, Caifeng; Lu, Xinxin; Wang, Mei; Jiang, Li
2017-09-01
To validate nuclear data and codes used in the neutronics design of a hybrid reactor with thorium, integral experiments on two kinds of benchmark thorium assemblies with a D-T fusion neutron source have been performed. The first kind, 1D assemblies, consists of polyethylene and depleted uranium shells. The second kind, 2D assemblies, consists of three thorium oxide cylinders. The capture reaction rates, fission reaction rates, and (n, 2n) reaction rates of 232Th in the assemblies are measured with ThO2 foils. The leakage neutron spectra from the ThO2 cylinders are measured with a liquid scintillation detector. The experimental uncertainties in all the results are analyzed. The measured results are compared to those calculated with the MCNP code and ENDF/B-VII.0 library data.
NASA Technical Reports Server (NTRS)
Caruso, Salvadore V.; Clark-Ingram, Marceia A.
2000-01-01
This paper presents a memorandum of agreement on Clean Air regulations. NASA Headquarters (Code JE and Code M) has asked MSFC to serve as the principal center for review of Clean Air Act (CAA) regulations. The purpose of the principal center is to provide centralized support to NASA Headquarters for the management and leadership of NASA's CAA regulation review process and to identify the potential impact of proposed CAA regulations on NASA program hardware and supporting facilities. The materials and processes utilized in the manufacture of NASA's programmatic hardware contain HAPs (Hazardous Air Pollutants), VOCs (Volatile Organic Compounds), and ODCs (Ozone Depleting Chemicals). This paper is presented in viewgraph form.
Impact of Americium-241 (n,γ) Branching Ratio on SFR Core Reactivity and Spent Fuel Characteristics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hiruta, Hikaru; Youinou, Gilles J.; Dixon, Brent W.
An accurate prediction of core physics and fuel cycle parameters largely depends on the level of detail and accuracy in the nuclear data taken into account in actual calculations. 241Am is a major gateway nuclide for most minor actinides and thus an important nuclide for core physics and fuel-cycle calculations. The 241Am(n,γ) branching ratio (BR) is in fact energy dependent (see Fig. 1); therefore, it is necessary to take into account the spectrum effect when calculating the average BR for full-core depletion calculations. Moreover, the accuracy of the BR used in the depletion calculations could significantly influence the core physics performance and post-irradiation fuel compositions. The BR of 241Am(n,γ) in the ENDF/B-VII.0 library is relatively small and flat in the thermal energy range, gradually increases within the intermediate energy range, and becomes even larger in the fast energy range. This indicates that the properly collapsed BR for fast reactors could be significantly different from that of thermal reactors. The evaluated BRs also differ from one evaluation to another. As seen in Table I, average BRs for several evaluated libraries calculated by means of a fast spectrum are similar but have some differences. Most currently available depletion codes use a pre-determined single-value BR for each library. However, ideally it should be determined on-the-fly, like the one-group cross sections. These issues provide a strong incentive to investigate the effect of different 241Am(n,γ) BRs on core and spent fuel parameters. This paper investigates the impact of the 241Am(n,γ) BR on the results of SFR full-core based fuel-cycle calculations. The analysis is performed by gradually increasing the value of the BR from 0.15 to 0.25 and studying its impact on the core reactivity and the characteristics of SFR spent fuels over extended storage times (~10,000 years).
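The spectrum effect mentioned above can be made concrete with the sketch below, which collapses an energy-dependent branching ratio to a single value using a flux spectrum as the weight. The branching-ratio curve and the two spectral shapes are illustrative only, not evaluated nuclear data.

import numpy as np

e = np.logspace(-3, 7, 500)                      # energy grid, eV

def br_241am(energy_eV):
    # Illustrative shape: flat near 0.15 at thermal energies, rising toward
    # 0.25 in the fast range (roughly the span examined in the paper).
    return 0.15 + 0.10 * np.clip(np.log10(energy_eV) / 7.0, 0.0, 1.0)

thermal_flux = e * np.exp(-e / 0.0253)                       # Maxwellian-like weight
fast_flux = np.exp(-0.5 * ((np.log10(e) - 5.7) / 0.8) ** 2)  # fast-peaked weight

for name, phi in (("thermal-like", thermal_flux), ("fast-like", fast_flux)):
    br_avg = np.trapz(br_241am(e) * phi, e) / np.trapz(phi, e)
    print(f"{name:12s} spectrum-averaged BR = {br_avg:.3f}")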
Modeling and Simulations for the High Flux Isotope Reactor Cycle 400
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ilas, Germina; Chandler, David; Ade, Brian J
2015-03-01
A concerted effort over the past few years has been focused on enhancing the core model for the High Flux Isotope Reactor (HFIR), as part of a comprehensive study for HFIR conversion from high-enriched uranium (HEU) to low-enriched uranium (LEU) fuel. At this time, the core model used to perform analyses in support of HFIR operation is an MCNP model for the beginning of Cycle 400, which was documented in detail in a 2005 technical report. A HFIR core depletion model based on current state-of-the-art methods and nuclear data was needed to serve as a reference for the design of an LEU fuel for HFIR. The recent enhancements in modeling and simulations for HFIR that are discussed in the present report include: (1) revision of the 2005 MCNP model for the beginning of Cycle 400 to improve the modeling data and assumptions as necessary based on appropriate primary reference sources (HFIR drawings and reports); (2) improvement of the fuel region model, including an explicit representation of the involute fuel plate geometry that is characteristic of HFIR fuel; and (3) revision of the Monte Carlo-based depletion model for HFIR in use since 2009 but never documented in detail, with the development of a new depletion model for the HFIR explicit fuel plate representation. The new HFIR models for Cycle 400 are used to determine various metrics of relevance to reactor performance and safety assessments. The calculated metrics are compared, where possible, with measurement data from preconstruction critical experiments at HFIR, data included in the current HFIR safety analysis report, and/or data from previous calculations performed with different methods or codes. The results of the analyses show that the models presented in this report provide a robust and reliable basis for HFIR analyses.
Hyung, Seok Won; Piehowski, Paul D.; Moore, Ronald J.; ...
2014-09-06
Removal of highly abundant proteins in plasma is often carried out using immunoaffinity depletion to extend the dynamic range of measurements to lower abundance species. While commercial depletion columns are available for this purpose, they generally are not applicable to limited sample quantities (<20 µL) due to low yields stemming from losses caused by nonspecific binding to the column matrix. Additionally, the cost of the depletion media can be prohibitive for larger scale studies. Modern LC-MS instrumentation provides the sensitivity necessary to scale down depletion methods with minimal sacrifice to proteome coverage, which makes smaller volume depletion columns desirable for maximizing sample recovery when samples are limited, as well as for reducing the expense of large scale studies. We characterized the performance of a 346 µL column volume micro-scale depletion system, using four different flow rates to determine the most effective depletion conditions for ~6 μL injections of human plasma proteins, and then evaluated depletion reproducibility at the optimum flow rate condition. Depletion of plasma using a commercial 10 mL depletion column served as the control. Results showed that the depletion efficiency of the micro-scale column increased as flow rate decreased and that the micro-depletion was reproducible. In an initial application, a 600 µL sample of human cerebrospinal fluid (CSF) pooled from multiple sclerosis patients was depleted and then analyzed using reversed-phase liquid chromatography-mass spectrometry to demonstrate the utility of the system for this important biofluid, where sample quantities are more commonly limited.
NASA Astrophysics Data System (ADS)
Denton, M. H.; Kivi, R.; Ulich, T.; Clilverd, M. A.; Rodger, C. J.; von der Gathen, P.
2018-02-01
Ozonesonde data from four sites are analyzed in relation to 191 solar proton events (SPEs) from 1989 to 2016. Analysis shows ozone depletion (~10-35 km altitude) commencing following the SPEs. Seasonally corrected ozone data demonstrate that depletions occur only in winter/early spring above sites where the northern hemisphere polar vortex (PV) can be present. A rapid reduction in stratospheric ozone is observed, with the maximum decrease occurring 10-20 days after solar proton events. Ozone levels remain depleted in excess of 30 days. No depletion is observed above sites completely outside the PV. No depletion is observed in relation to 191 random epochs at any site at any time of year. Results point to the role of indirect ozone destruction, most likely via the rapid descent of long-lived NOx species in the PV during the polar winter.
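The epoch analysis described above (comparing ozone around SPE onsets against random epochs) can be sketched as a superposed-epoch average. The code below builds a synthetic daily ozone series, injects an artificial post-event depletion, and recovers it in the event-aligned mean curve; none of the numbers correspond to the ozonesonde data.

import numpy as np

rng = np.random.default_rng(0)
days = 5000
ozone = 300 + 20 * np.sin(2 * np.pi * np.arange(days) / 365.25) + rng.normal(0, 5, days)
events = rng.choice(np.arange(100, days - 100), size=50, replace=False)
for e in events:                       # inject an artificial 30-day depletion
    ozone[e + 10:e + 40] -= 15

window = np.arange(-30, 61)            # days relative to each event
stack = np.array([ozone[e + window] for e in events])
mean_curve = stack.mean(axis=0) - stack[:, window < 0].mean()   # remove pre-event baseline
print("minimum of mean curve:", round(float(mean_curve.min()), 1),
      "at day", int(window[mean_curve.argmin()]))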
Gu, Henry Y.; Marks, Neil D.; Winter, Alan D.; Weir, William; Tzelos, Thomas; McNeilly, Tom N.; Britton, Collette
2017-01-01
microRNAs are small non-coding RNAs that are important regulators of gene expression in a range of animals, including nematodes. We have analysed a cluster of four miRNAs from the pathogenic nematode species Haemonchus contortus that are closely linked in the genome. We find that the cluster is conserved only in clade V parasitic nematodes and in some ascarids, but not in other clade III species nor in clade V free-living nematodes. Members of the cluster are present in parasite excretory-secretory products and can be detected in the abomasum and draining lymph nodes of infected sheep, indicating their release in vitro and in vivo. As observed for other parasitic nematodes, H. contortus adult worms release extracellular vesicles (EV). Small RNA libraries were prepared from vesicle-enriched and vesicle-depleted supernatants from both adult worms and L4 stage larvae. Comparison of the miRNA species in the different fractions indicated that specific miRNAs are packaged within vesicles, while others are more abundant in vesicle-depleted supernatant. Hierarchical clustering analysis indicated that the gut is the likely source of vesicle-associated miRNAs in the L4 stage, but not in the adult worm. These findings add to the growing body of work demonstrating that miRNAs released from parasitic helminths may play an important role in host-parasite interactions. PMID:29145392
Comparative functional characterization of the CSR-1 22G-RNA pathway in Caenorhabditis nematodes.
Tu, Shikui; Wu, Monica Z; Wang, Jie; Cutter, Asher D; Weng, Zhiping; Claycomb, Julie M
2015-01-01
As a champion of small RNA research for two decades, Caenorhabditis elegans has revealed the essential Argonaute CSR-1 to play key nuclear roles in modulating chromatin, chromosome segregation and germline gene expression via 22G-small RNAs. Despite CSR-1 being preserved among diverse nematodes, the conservation and divergence in function of the targets of small RNA pathways remains poorly resolved. Here we apply comparative functional genomic analysis between C. elegans and Caenorhabditis briggsae to characterize the CSR-1 pathway, its targets and their evolution. C. briggsae CSR-1-associated small RNAs that we identified by immunoprecipitation-small RNA sequencing overlap with 22G-RNAs depleted in cbr-csr-1 RNAi-treated worms. By comparing 22G-RNAs and target genes between species, we defined a set of CSR-1 target genes with conserved germline expression, enrichment in operons and more slowly evolving coding sequences than other genes, along with a small group of evolutionarily labile targets. We demonstrate that the association of CSR-1 with chromatin is preserved, and show that depletion of cbr-csr-1 leads to chromosome segregation defects and embryonic lethality. This first comparative characterization of a small RNA pathway in Caenorhabditis establishes a conserved nuclear role for CSR-1 and highlights its key role in germline gene regulation across multiple animal species. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
NASA Astrophysics Data System (ADS)
Gehrke, T.; Burigo, L.; Arico, G.; Berke, S.; Jakubek, J.; Turecek, D.; Tessonnier, T.; Mairani, A.; Martišíková, M.
2017-04-01
In the field of ion-beam radiotherapy and space applications, measurements of the energy deposition of single ions in thin layers are of interest for dosimetry and imaging. The present work investigates the capability of a pixelated detector Timepix to measure the energy deposition of single ions in therapeutic proton, helium- and carbon-ion beams in a 300 μm-thick sensitive silicon layer. For twelve different incident beams, the measured energy deposition distributions of single ions are compared to the expected energy deposition spectra, which were predicted by detailed Monte Carlo simulations using the FLUKA code. A methodology for the analysis of the measured data is introduced in order to identify and reject signals that are either degraded or caused by multiple overlapping ions. Applying a newly proposed linear recalibration, the energy deposition measurements are in good agreement with the simulations. The twelve measured mean energy depositions between 0.72 MeV/mm and 56.63 MeV/mm in a partially depleted silicon sensor do not deviate more than 7% from the corresponding simulated values. Measurements of energy depositions above 10 MeV/mm with a fully depleted sensor are found to suffer from saturation effects due to the too high per-pixel signal. The utilization of thinner sensors, in which a lower signal is induced, could further improve the performance of the Timepix detector for energy deposition measurements.
NASA Astrophysics Data System (ADS)
Karriem, Veronica V.
Nuclear reactor design incorporates the study and application of nuclear physics, nuclear thermal hydraulics and nuclear safety. Theoretical models and numerical methods implemented in computer programs are utilized to analyze and design nuclear reactors. The focus of this PhD study is the development of an advanced high-fidelity multi-physics code system to perform reactor core analysis for design and safety evaluations of research TRIGA-type reactors. The fuel management and design code system TRIGSIMS was further developed to fulfill the function of a reactor design and analysis code system for the Pennsylvania State Breazeale Reactor (PSBR). TRIGSIMS, which is currently in use at the PSBR, is a fuel management tool that incorporates the depletion code ORIGEN-S (part of the SCALE system) and the Monte Carlo neutronics solver MCNP. The diffusion theory code ADMARC-H is used within TRIGSIMS to accelerate the MCNP calculations. It manages the data and fuel isotopic content and stores it for future burnup calculations. The contribution of this work is the development of an improved version of TRIGSIMS, named TRIGSIMS-TH. TRIGSIMS-TH incorporates a thermal hydraulic module based on the advanced sub-channel code COBRA-TF (CTF). CTF provides the temperature feedback needed in the multi-physics calculations as well as the thermal hydraulics modeling capability of the reactor core. The temperature feedback model uses the CTF-provided local moderator and fuel temperatures in the cross-section modeling for the ADMARC-H and MCNP calculations. To perform efficient critical control rod calculations, a methodology for applying a control rod position was implemented in TRIGSIMS-TH, making this code system a modeling and design tool for future core loadings. The new TRIGSIMS-TH is a computer program that interlinks various other functional reactor analysis tools. It consists of MCNP5, ADMARC-H, ORIGEN-S, and CTF. CTF was coupled with both MCNP and ADMARC-H to provide the heterogeneous temperature distribution throughout the core. Each of these codes is written in its own computer language, performing its function and producing a set of output data. TRIGSIMS-TH provides effective data manipulation and transfer between the different codes. With the implementation of the feedback and control-rod-position modeling methodologies, the TRIGSIMS-TH calculations are more accurate and in better agreement with measured data. The PSBR is unique in many ways, and there are no "off-the-shelf" codes which can model this design in its entirety. In particular, PSBR has an open core design, which is cooled by natural convection. Combining several codes into a unique system brings many challenges. It also requires substantial knowledge of both the operation and the core design of the PSBR. This reactor has been in operation for decades, and there is a fair amount of prior study and development in both PSBR thermal hydraulics and neutronics. Measured data are also available for various core loadings and can be used for validation activities. The previous studies and developments in PSBR modeling also serve as a guide to assess the findings of the work herein. In order to incorporate new methods and codes into the existing TRIGSIMS, a re-evaluation of various components of the code was performed to assure the accuracy and efficiency of the existing CTF/MCNP5/ADMARC-H multi-physics coupling. A new set of ADMARC-H diffusion coefficients and cross sections was generated using the SERPENT code.
This was needed because the previous data had not been generated with thermal hydraulic feedback and the ARO (all-rods-out) position had been used as the critical rod position. The B4C was re-evaluated for this update. The data exchange between ADMARC-H and MCNP5 was modified. The basic core model was given flexibility to allow for various changes within the core model, and this feature was implemented in TRIGSIMS-TH. The PSBR core in the new code model can be expanded and changed. This allows the new code to be used as a modeling tool for design and analysis of future core loadings.
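The critical control rod search described above can be sketched as a simple bracketing root find on k_eff = 1. The compute_keff callable below is a hypothetical stand-in for one full TRIGSIMS-TH evaluation (MCNP5/ADMARC-H with CTF feedback); the toy reactivity model at the end exists only so the sketch runs.

def find_critical_rod_position(compute_keff, lo, hi, tol=1e-4, max_iter=30):
    # Bisection search for the rod position (e.g., cm withdrawn) giving k_eff = 1.
    # Assumes k_eff increases monotonically as the rod is withdrawn from lo to hi.
    k_lo, k_hi = compute_keff(lo), compute_keff(hi)
    assert k_lo < 1.0 < k_hi, "the critical position must be bracketed"
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        k_mid = compute_keff(mid)
        if abs(k_mid - 1.0) < tol:
            return mid
        if k_mid < 1.0:
            lo = mid          # still subcritical: withdraw further
        else:
            hi = mid          # supercritical: insert further
    return 0.5 * (lo + hi)

# Toy linear reactivity model standing in for a coupled neutronics evaluation.
print(find_critical_rod_position(lambda z: 0.95 + 0.002 * z, lo=0.0, hi=50.0))

In practice each compute_keff call is an expensive coupled calculation, and a Monte Carlo-based search would also have to account for the statistical uncertainty of k_eff.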
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vo, Mai-Tram; Ko, Myoung Seok; Lee, Unn Hwa
Mitochondrial dynamics, including constant fusion and fission, play critical roles in maintaining mitochondrial morphology and function. Here, we report that developmentally regulated GTP-binding protein 2 (DRG2) regulates mitochondrial morphology by modulating the expression of the mitochondrial fission gene dynamin-related protein 1 (Drp1). shRNA-mediated silencing of DRG2 induced mitochondrial swelling, whereas expression of an shRNA-resistant version of DRG2 decreased mitochondrial swelling in DRG2-depleted cells. Analysis of the expression levels of genes involved in mitochondrial fusion and fission revealed that DRG2 depletion significantly decreased the level of Drp1. Overexpression of Drp1 rescued the defect in mitochondrial morphology induced by DRG2 depletion. DRG2 depletion reduced the mitochondrial membrane potential, oxygen consumption rate (OCR), and amount of mitochondrial DNA (mtDNA), whereas it increased reactive oxygen species (ROS) production and apoptosis. Taken together, our data demonstrate that DRG2 acts as a regulator of mitochondrial fission by controlling the expression of Drp1. - Highlights: • DRG2 depletion increased mitochondrial swelling. • DRG2 depletion inhibited the expression of Drp1. • Overexpression of DRG2 or Drp1 rescued mitochondrial shape in DRG2 depleted cells. • DRG2 depletion induced mitochondrial dysfunction.
Advanced multiphysics coupling for LWR fuel performance analysis
Hales, J. D.; Tonks, M. R.; Gleicher, F. N.; ...
2015-10-01
Even the most basic nuclear fuel analysis is a multiphysics undertaking, as a credible simulation must consider at a minimum coupled heat conduction and mechanical deformation. The need for more realistic fuel modeling under a variety of conditions invariably leads to a desire to include coupling between a more complete set of the physical phenomena influencing fuel behavior, including neutronics, thermal hydraulics, and mechanisms occurring at lower length scales. This paper covers current efforts toward coupled multiphysics LWR fuel modeling in three main areas. The first area covered in this paper concerns thermomechanical coupling. The interaction of these two physics, particularly related to the feedback effect associated with heat transfer and mechanical contact at the fuel/clad gap, provides numerous computational challenges. An outline is provided of an effective approach used to manage the nonlinearities associated with an evolving gap in BISON, a nuclear fuel performance application. A second type of multiphysics coupling described here is that of coupling neutronics with thermomechanical LWR fuel performance. DeCART, a high-fidelity core analysis program based on the method of characteristics, has been coupled to BISON. DeCART provides sub-pin level resolution of the multigroup neutron flux, with resonance treatment, during a depletion or a fast transient simulation. Two-way coupling between these codes was achieved by mapping fission rate density and fast neutron flux fields from DeCART to BISON and the temperature field from BISON to DeCART while employing a Picard iterative algorithm. Finally, the need for multiscale coupling is considered. Fission gas production and evolution significantly impact fuel performance by causing swelling, a reduction in the thermal conductivity, and fission gas release. The mechanisms involved occur at the atomistic and grain scale and are therefore not the domain of a fuel performance code. However, it is possible to use lower length scale models such as those used in the mesoscale MARMOT code to compute average properties, e.g. swelling or thermal conductivity. These may then be used by an engineering-scale model. Examples of this type of multiscale, multiphysics modeling are shown.
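The Picard (fixed-point) exchange between the neutronics and fuel performance solutions can be sketched as below. The two callables are hypothetical placeholders for the mapped DeCART and BISON solves, and the toy one-dimensional fields exist only to make the loop executable; the real coupling exchanges sub-pin fission rate density, fast flux, and temperature fields.

import numpy as np

def picard_couple(solve_neutronics, solve_fuel, T0, tol=1e-3, max_iter=20):
    # Fixed-point iteration:
    #   solve_neutronics(T)      -> (fission_rate_density, fast_flux)
    #   solve_fuel(q, fast_flux) -> temperature field
    T = np.asarray(T0, dtype=float)
    for it in range(1, max_iter + 1):
        q, fast_flux = solve_neutronics(T)          # DeCART-like step
        T_new = solve_fuel(q, fast_flux)            # BISON-like step
        change = np.max(np.abs(T_new - T)) / np.max(np.abs(T_new))
        T = T_new
        if change < tol:
            return T, it
    return T, max_iter

# Toy stand-ins: power rises mildly with temperature, temperature rises with power.
neutronics = lambda T: (1.0 + 0.001 * (T - 600.0), np.full_like(T, 1.0e14))
fuel       = lambda q, phi: 600.0 + 150.0 * q
T, iters = picard_couple(neutronics, fuel, T0=np.full(10, 600.0))
print(iters, round(float(T[0]), 1))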
Hyperspectral stimulated emission depletion microscopy and methods of use thereof
Timlin, Jerilyn A; Aaron, Jesse S
2014-04-01
A hyperspectral stimulated emission depletion ("STED") microscope system for high-resolution imaging of samples labeled with multiple fluorophores (e.g., two to ten fluorophores). The hyperspectral STED microscope includes a light source, optical systems configured for generating an excitation light beam and a depletion light beam, optical systems configured for focusing the excitation and depletion light beams on a sample, and systems for collecting and processing data generated by interaction of the excitation and depletion light beams with the sample. Hyperspectral STED data may be analyzed using multivariate curve resolution analysis techniques to deconvolute emission from the multiple fluorophores. The hyperspectral STED microscope described herein can be used for multi-color, subdiffraction imaging of samples (e.g., materials and biological materials) and for analyzing a tissue by Forster Resonance Energy Transfer ("FRET").
Hyung, Seok-Won; Piehowski, Paul D; Moore, Ronald J; Orton, Daniel J; Schepmoes, Athena A; Clauss, Therese R; Chu, Rosalie K; Fillmore, Thomas L; Brewer, Heather; Liu, Tao; Zhao, Rui; Smith, Richard D
2014-11-01
Removal of highly abundant proteins in plasma is often carried out using immunoaffinity depletion to extend the dynamic range of measurements to lower abundance species. While commercial depletion columns are available for this purpose, they generally are not applicable to limited sample quantities (<20 μL) due to low yields stemming from losses caused by nonspecific binding to the column matrix and concentration of large eluent volumes. Additionally, the cost of the depletion media can be prohibitive for larger-scale studies. Modern LC-MS instrumentation provides the sensitivity necessary to scale-down depletion methods with minimal sacrifice to proteome coverage, which makes smaller volume depletion columns desirable for maximizing sample recovery when samples are limited, as well as for reducing the expense of large-scale studies. We characterized the performance of a 346 μL column volume microscale depletion system, using four different flow rates to determine the most effective depletion conditions for ∼6-μL injections of human plasma proteins and then evaluated depletion reproducibility at the optimum flow rate condition. Depletion of plasma using a commercial 10-mL depletion column served as the control. Results showed depletion efficiency of the microscale column increased as flow rate decreased, and that our microdepletion was reproducible. In an initial application, a 600-μL sample of human cerebrospinal fluid (CSF) pooled from multiple sclerosis patients was depleted and then analyzed using reversed phase liquid chromatography-mass spectrometry to demonstrate the utility of the system for this important biofluid where sample quantities are more commonly limited.
Novoa-Herran, Susana; Umaña-Perez, Adriana; Canals, Francesc; Sanchez-Gomez, Myriam
2016-01-01
How nutrition and growth factor restriction due to serum depletion affect trophoblast function remains poorly understood. We performed a proteomic differential study of the effects of serum depletion on a first trimester human immortalized trophoblast cell line. The viability of HTR-8/SVneo trophoblast cells in culture with 0, 0.5, and 10% fetal bovine serum (FBS) was assayed via MTT at 24, 48 and 64 h. A comparative proteomic analysis of the cells grown with those FBS levels for 24 h was performed using two-dimensional electrophoresis (2DE), followed by mass spectrometry for protein spot identification, and a database search and bioinformatics analysis of the expressed proteins. Differential spots were identified using the Kolmogorov-Smirnov test (n = 3, significance level 0.10, D > 0.642) and/or ANOVA (n = 3, p < 0.05). The results showed that low serum doses or serum depletion differentially affect cell growth and protein expression. Differential expression was seen in 25% of the protein spots grown with 0.5% FBS and in 84% of those grown with 0% FBS, using 10% serum as the physiological control. In 0.5% FBS, this difference was related to biological processes typically affected by the serum, such as cell cycle, regulation of apoptosis and proliferation. In addition to these changes, in the serum-depleted proteome we observed downregulation of keratin 8, and upregulation of vimentin, the glycolytic enzymes enolase and pyruvate kinase (PKM2) and the tumor progression-related inosine-5'-monophosphate dehydrogenase 2 (IMPDH2) enzyme. The proteins regulated by total serum depletion, but not affected by growth in 0.5% serum, are members of the glycolytic and nucleotide metabolic pathways and the epithelial-to-mesenchymal transition (EMT), suggesting an adaptive switch characteristic of malignant cells. This comparative proteomic analysis and the identified proteins are the first evidence of a protein expression response to serum depletion in a trophoblast cell model. Our results show that serum depletion induces specific changes in protein expression concordant with main cell metabolic adaptations and EMT, resembling the progression to a malignant phenotype.
Global analysis of depletion and recovery of seabed biota after bottom trawling disturbance.
Hiddink, Jan Geert; Jennings, Simon; Sciberras, Marija; Szostek, Claire L; Hughes, Kathryn M; Ellis, Nick; Rijnsdorp, Adriaan D; McConnaughey, Robert A; Mazor, Tessa; Hilborn, Ray; Collie, Jeremy S; Pitcher, C Roland; Amoroso, Ricardo O; Parma, Ana M; Suuronen, Petri; Kaiser, Michel J
2017-08-01
Bottom trawling is the most widespread human activity affecting seabed habitats. Here, we collate all available data for experimental and comparative studies of trawling impacts on whole communities of seabed macroinvertebrates on sedimentary habitats and develop widely applicable methods to estimate depletion and recovery rates of biota after trawling. Depletion of biota and trawl penetration into the seabed are highly correlated. Otter trawls caused the least depletion, removing 6% of biota per pass and penetrating the seabed on average down to 2.4 cm, whereas hydraulic dredges caused the most depletion, removing 41% of biota and penetrating the seabed on average 16.1 cm. Median recovery times posttrawling (from 50 to 95% of unimpacted biomass) ranged between 1.9 and 6.4 y. By accounting for the effects of penetration depth, environmental variation, and uncertainty, the models explained much of the variability of depletion and recovery estimates from single studies. Coupled with large-scale, high-resolution maps of trawling frequency and habitat, our estimates of depletion and recovery rates enable the assessment of trawling impacts on unprecedented spatial scales.
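The quoted recovery times (50% to 95% of unimpacted biomass) can be illustrated with a minimal logistic recovery model, dB/dt = r·B·(1 − B/K), in which the carrying capacity K cancels out of the recovery-time expression. The intrinsic rates below are placeholders chosen to bracket the reported 1.9-6.4 y range; the study's actual statistical model is considerably more elaborate.

import numpy as np

def recovery_time(r, frac_from=0.50, frac_to=0.95):
    # Time to go from frac_from*K to frac_to*K under logistic growth; K cancels.
    return (1.0 / r) * np.log(frac_to * (1.0 - frac_from) / (frac_from * (1.0 - frac_to)))

for r in (0.46, 1.00, 1.55):          # illustrative intrinsic recovery rates, per year
    print(f"r = {r:4.2f} /y  ->  50% -> 95% recovery in {recovery_time(r):.1f} y")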
LITHIUM DEPLETION IS A STRONG TEST OF CORE-ENVELOPE RECOUPLING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Somers, Garrett; Pinsonneault, Marc H., E-mail: somers@astronomy.ohio-state.edu
2016-09-20
Rotational mixing is a prime candidate for explaining the gradual depletion of lithium from the photospheres of cool stars during the main sequence. However, previous mixing calculations have relied primarily on treatments of angular momentum transport in stellar interiors incompatible with solar and stellar data in the sense that they overestimate the internal differential rotation. Instead, recent studies suggest that stars are strongly differentially rotating at young ages but approach solid-body rotation over their lifetimes. We modify our rotating stellar evolution code to include an additional source of angular momentum transport, a necessary ingredient for explaining the open cluster rotation pattern, and examine the consequences for mixing. We confirm that core-envelope recoupling with a ∼20 Myr timescale is required to explain the evolution of the mean rotation pattern along the main sequence, and demonstrate that it also provides a more accurate description of the Li depletion pattern seen in open clusters. Recoupling produces a characteristic pattern of efficient mixing at early ages and little mixing at late ages, thus predicting a flattening of Li depletion at a few Gyr, in agreement with the observed late-time evolution. Using Li abundances we argue that the timescale for core-envelope recoupling during the main sequence decreases sharply with increasing mass. We discuss the implications of this finding for stellar physics, including the viability of gravity waves and magnetic fields as agents of angular momentum transport. We also raise the possibility of intrinsic differences in initial conditions in star clusters using M67 as an example.
Long, Lin; He, Jian-Zhong; Chen, Ye; Xu, Xiu-E; Liao, Lian-Di; Xie, Yang-Min; Li, En-Min; Xu, Li-Yan
2018-05-07
Riboflavin is an essential component of the human diet and its derivative cofactors play an established role in oxidative metabolism. Riboflavin deficiency has been linked with various human diseases. The objective of this study was to identify whether riboflavin depletion promotes tumorigenesis. HEK293T and NIH3T3 cells were cultured in riboflavin-deficient or riboflavin-sufficient medium and passaged every 48 h. Cells were collected every 5 generations and plate colony formation assays were performed to observe cell proliferation. Subcutaneous tumorigenicity assays in NU/NU mice were used to observe tumorigenicity of riboflavin-depleted HEK293T cells. Mechanistically, gene expression profiling and gene ontology analysis were used to identify abnormally expressed genes induced by riboflavin depletion. Western blot analyses, cell cycle analyses, and chromatin immunoprecipitation were used to validate the expression of cell cycle-related genes. Plate colony formation of NIH3T3 and HEK293T cell lines was enhanced >2-fold when cultured in riboflavin-deficient medium for 10-20 generations. Moreover, we observed enhanced subcutaneous tumorigenicity in NU/NU mice following injection of riboflavin-depleted compared with normal HEK293T cells (55.6% compared with 0.0% tumor formation, respectively). Gene expression profiling and gene ontology analysis revealed that riboflavin depletion induced the expression of cell cycle-related genes. Validation experiments also found that riboflavin depletion decreased p21 and p27 protein levels by ∼20%, and increased cell cycle-related and expression-elevated protein in tumor (CREPT) protein expression >2-fold, resulting in cyclin D1 and CDK4 levels being increased ∼1.5-fold, and cell cycle acceleration. We also observed that riboflavin depletion decreased intracellular riboflavin levels by 20% and upregulated expression of riboflavin transporter genes, particularly SLC52A3, and that the changes in CREPT and SLC52A3 correlated with specific epigenetic changes in their promoters in riboflavin-depleted HEK293T cells. Riboflavin depletion contributes to HEK293T and NIH3T3 cell tumorigenesis and may be a risk factor for tumor development.
Neural code alterations and abnormal time patterns in Parkinson’s disease
NASA Astrophysics Data System (ADS)
Andres, Daniela Sabrina; Cerquetti, Daniel; Merello, Marcelo
2015-04-01
Objective. The neural code used by the basal ganglia is a current question in neuroscience, relevant for the understanding of the pathophysiology of Parkinson’s disease. While a rate code is known to participate in the communication between the basal ganglia and the motor thalamus/cortex, different lines of evidence have also favored the presence of complex time patterns in the discharge of the basal ganglia. To gain insight into the way the basal ganglia code information, we studied the activity of the globus pallidus pars interna (GPi), an output node of the circuit. Approach. We implemented the 6-hydroxydopamine model of Parkinsonism in Sprague-Dawley rats, and recorded the spontaneous discharge of single GPi neurons, in head-restrained conditions at full alertness. Analyzing the temporal structure function, we looked for characteristic scales in the neuronal discharge of the GPi. Main results. At a low scale, we observed the presence of dynamic processes, which allow the transmission of time patterns. Conversely, at a middle scale, stochastic processes force the use of a rate code. Regarding the time patterns transmitted, we measured the word length and found that it is increased in Parkinson’s disease. Furthermore, it showed a positive correlation with the frequency of discharge, indicating that an exacerbation of this abnormal time pattern length can be expected as the dopamine depletion progresses. Significance. We conclude that a rate code and a time pattern code can co-exist in the basal ganglia at different temporal scales. However, their normal balance is progressively altered and replaced by pathological time patterns in Parkinson’s disease.
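The temporal structure function analysis referred to above can be sketched for a firing-rate series as follows. The first-order definition S(τ) = <|x(t+τ) − x(t)|> used here is a common choice but an assumption on our part, and the synthetic series only stands in for recorded GPi activity; characteristic scales would show up as changes of slope of S(τ) versus τ.

import numpy as np

def structure_function(x, lags):
    # First-order temporal structure function S(tau) = <|x[t+tau] - x[t]|>.
    x = np.asarray(x, dtype=float)
    return np.array([np.mean(np.abs(x[lag:] - x[:-lag])) for lag in lags])

rng = np.random.default_rng(0)
# Synthetic rate series: slow drift plus fast noise, standing in for single-unit data.
rate = 60.0 + 0.05 * np.cumsum(rng.normal(0.0, 0.5, 5000)) + rng.normal(0.0, 2.0, 5000)

lags = np.unique(np.logspace(0, 3, 20).astype(int))
for lag, s in zip(lags, structure_function(rate, lags)):
    print(f"tau = {lag:4d} samples   S = {s:.2f}")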
Evaluation of three high abundance protein depletion kits for umbilical cord serum proteomics
2011-01-01
Background High abundance protein depletion is a major challenge in the study of serum/plasma proteomics. Prior to this study, most commercially available kits for depletion of highly abundant proteins had only been tested and evaluated in adult serum/plasma, while the depletion efficiency on umbilical cord serum/plasma had not been clarified. Structural differences between some adult and fetal proteins (such as albumin) make it likely that depletion approaches for adult and umbilical cord serum/plasma will be variable. Therefore, the primary purposes of the present study are to investigate the efficiencies of several commonly-used commercial kits during high abundance protein depletion from umbilical cord serum and to determine which kit yields the most effective and reproducible results for further proteomics research on umbilical cord serum. Results The immunoaffinity based kits (PROTIA-Sigma and 5185-Agilent) displayed higher depletion efficiency than the immobilized dye based kit (PROTBA-Sigma) in umbilical cord serum samples. Both the PROTIA-Sigma and 5185-Agilent kit maintained high depletion efficiency when used three consecutive times. Depletion by the PROTIA-Sigma Kit improved 2DE gel quality by reducing smeared bands produced by the presence of high abundance proteins and increasing the intensity of other protein spots. During image analysis using the identical detection parameters, 411 ± 18 spots were detected in crude serum gels, while 757 ± 43 spots were detected in depleted serum gels. Eight spots unique to depleted serum gels were identified by MALDI-TOF/TOF MS, seven of which were low abundance proteins. Conclusions The immunoaffinity based kits exceeded the immobilized dye based kit in high abundance protein depletion of umbilical cord serum samples and dramatically improved 2DE gel quality for detection of trace biomarkers. PMID:21554704
Effects of depletion of dihydropyrimidine dehydrogenase on focus formation and RPA phosphorylation.
Someya, Masanori; Sakata, Koh-ichi; Matsumoto, Yoshihisa; Tauchi, Hiroshi; Kai, Masahiro; Hareyama, Masato; Fukushima, Masakazu
2012-01-01
Gimeracil, an inhibitor of dihydropyrimidine dehydrogenase (DPYD), partially inhibits homologous recombination (HR) repair, has a radiosensitizing effect, and enhances sensitivity to Camptothecin (CPT). DPYD is the target protein for radiosensitization by Gimeracil. We investigated the mechanisms of sensitization to radiation and CPT by DPYD inhibition using DLD-1 cells treated with siRNA for DPYD. We investigated the focus formation of various kinds of proteins involved in HR and examined the phosphorylation of RPA by irradiation using Western blot analysis. DPYD depletion by siRNA significantly restrained the formation of radiation-induced foci of Rad51 and RPA, whereas it increased the number of foci of NBS1. The numbers of colocalized NBS1 and RPA foci in DPYD-depleted cells after radiation were significantly smaller than in the control cells. These results suggest that the effect of DPYD depletion is attributable to decreased single-stranded DNA generated by the Mre11/Rad50/NBS1 complex-dependent resection of DNA double-strand break ends. The phosphorylation of RPA by irradiation was partially suppressed in DPYD-depleted cells, suggesting that DPYD depletion may partially inhibit DNA repair by HR through suppressed phosphorylation of RPA. DPYD depletion showed a radiosensitizing effect as well as enhanced sensitivity to CPT. The radiosensitizing effect of DPYD depletion plus CPT was the additive effect of DPYD depletion and CPT. DPYD depletion did not have a cell-killing effect, suggesting that DPYD depletion may not be so toxic. Considering these results, the combination of CPT and drugs that inhibit DPYD may prove useful for radiotherapy as a method of radiosensitization.
Incorporation of copper ions into crystals of T2 copper-depleted laccase from Botrytis aclada
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osipov, E. M., E-mail: e.m.osipov@gmail.com; Polyakov, K. M.; Engelhardt Institute of Molecular Biology, Vavilova str. 32, Moscow 119991
2015-11-18
The restoration of the native form of laccase from B. aclada from the type 2 copper-depleted form of the enzyme was investigated. Copper ions were found to be incorporated into the active site after soaking the depleted enzyme in a Cu⁺-containing solution. Laccases belong to the class of multicopper oxidases catalyzing the oxidation of phenols accompanied by the reduction of molecular oxygen to water without the formation of hydrogen peroxide. The activity of laccases depends on the number of Cu atoms per enzyme molecule. The structure of type 2 copper-depleted laccase from Botrytis aclada has been solved previously. With the aim of obtaining the structure of the native form of the enzyme, crystals of the depleted laccase were soaked in Cu⁺- and Cu²⁺-containing solutions. Copper ions were found to be incorporated into the active site only when Cu⁺ was used. A comparative analysis of the native and depleted forms of the enzymes was performed.
Radial Diffusion study of the 1 June 2013 CME event using MHD simulations.
NASA Astrophysics Data System (ADS)
Patel, M.; Hudson, M.; Wiltberger, M. J.; Li, Z.; Boyd, A. J.
2016-12-01
The June 1, 2013 storm was a CME-shock driven geomagnetic storm (Dst = -119 nT) that caused a dropout affecting all radiation belt electron energies measured by the Energetic Particle, Composition and Thermal Plasma Suite (ECT) instrument on the Van Allen Probes at higher L-shells following a dynamic pressure enhancement in the solar wind. Lower energies (up to about 700 keV) were enhanced by the storm while MeV electrons were depleted throughout the belt. We focus on depletion through radial diffusion caused by the enhanced ULF wave activity due to the CME-shock. This study utilizes the Lyon-Fedder-Mobarry (LFM) model, a 3D global magnetospheric simulation code based on the ideal MHD equations, coupled with the Magnetosphere Ionosphere Coupler (MIX) and the Rice Convection Model (RCM). The MHD electric and magnetic fields, with the equations described by Fei et al. [JGR, 2006], are used to calculate radial diffusion coefficients (DLL). These DLL values are input into a radial diffusion code to recreate the dropouts observed by the Van Allen Probes. Using MHD simulations to obtain diffusion coefficients, together with the initial phase space density and outer boundary condition from the ECT instrument suite, in a radial diffusion model reproduces observed fluxes that compare favorably with Van Allen Probes ECT measurements, and helps clarify the complex role that ULF waves play in radial transport and the effects of CME-driven storms on relativistic electrons in the radiation belts.
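A minimal sketch of the radial diffusion step is given below. It evolves the phase space density f(L) under df/dt = L² ∂/∂L (D_LL/L² ∂f/∂L) with an explicit finite-difference scheme; the Kp-parameterized D_LL is a standard published stand-in (of the Brautigam-and-Albert type), not the MHD-derived coefficients of this study, and the initial profile and boundary values are placeholders rather than ECT data.

import numpy as np

L = np.linspace(3.0, 6.5, 71)
dL = L[1] - L[0]
Kp = 6.0
D_LL = 10.0 ** (0.506 * Kp - 9.325) * L ** 10     # per day; illustrative stand-in

f = np.exp(-0.5 * ((L - 5.0) / 0.5) ** 2)         # initial phase space density (arb. units)
dt = 1.0e-5                                       # days; explicit scheme needs a small step
for _ in range(int(0.2 / dt)):                    # evolve for 0.2 days
    flux = D_LL / L ** 2 * np.gradient(f, dL)
    f = f + dt * L ** 2 * np.gradient(flux, dL)
    f[0] = f[1]                                   # crude zero-gradient inner boundary
    f[-1] = 0.2                                   # fixed outer boundary ("data-driven" value)

print(f"peak PSD after 0.2 d: {f.max():.3f} at L = {L[np.argmax(f)]:.2f}")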
Muscle Mass Depletion Associated with Poor Outcome of Sepsis in the Emergency Department.
Lee, YoonJe; Park, Hyun Kyung; Kim, Won Young; Kim, Myung Chun; Jung, Woong; Ko, Byuk Sung
2018-05-08
Muscle mass depletion has been suggested to predict morbidity and mortality in various diseases. However, it is not well known whether muscle mass depletion is associated with poor outcome in sepsis. We hypothesized that muscle mass depletion is associated with poor outcome in sepsis. A retrospective observational study was conducted in an emergency department during a 9-year period. Medical records of 627 patients with sepsis were reviewed. We divided the patients into 2 groups according to 28-day mortality and compared the presence of muscle mass depletion assessed by the cross-sectional area of the psoas muscle at the level of the third lumbar vertebra on abdominal CT scans. Univariate and multivariate logistic regression analyses were conducted to examine the association of sarcopenia with the outcome of sepsis. A total of 274 patients with sepsis were finally included in the study: 45 (16.4%) did not survive to 28 days, and 77 patients (28.1%) were identified as having muscle mass depletion. The presence of muscle mass depletion was independently associated with 28-day mortality on multivariate logistic analysis (OR 2.79; 95% CI 1.35-5.74, p = 0.01). Muscle mass depletion evaluated by CT scan was associated with poor outcome of sepsis patients. Further studies on the appropriateness of specific treatment for muscle mass depletion with sepsis are needed. © 2018 S. Karger AG, Basel.
Neutron-Irradiated Samples as Test Materials for MPEX
Ellis, Ronald James; Rapp, Juergen
2015-10-09
Plasma Material Interaction (PMI) is a major concern in fusion reactor design and analysis. The Material-Plasma Exposure eXperiment (MPEX) will explore PMI under fusion reactor plasma conditions. Samples with accumulated displacements per atom (DPA) damage produced by fast neutron irradiations in the High Flux Isotope Reactor (HFIR) at Oak Ridge National Laboratory (ORNL) will be studied in the MPEX facility. This paper presents assessments of the calculated induced radioactivity and resulting radiation dose rates of a variety of potential fusion reactor plasma-facing materials (such as tungsten). The scientific code packages MCNP and SCALE were used to simulate irradiation of the samples in HFIR including the generation and depletion of nuclides in the material and the subsequent composition, activity levels, gamma radiation fields, and resultant dose rates as a function of cooling time. A challenge of the MPEX project is to minimize the radioactive inventory in the preparation of the samples and the sample dose rates for inclusion in the MPEX facility.
Origins of Energetic Ions in the Earth's Magnetosheath
NASA Technical Reports Server (NTRS)
Fuselier, S. A.; Shelley, E. G.; Klumpar, D. M.
1992-01-01
The analysis and interpretation of the combined scientific data from the Hot Plasma Composition Experiment (HPCE) and the Charge Energy Mass (CHEM) spectrometer on the Active Magnetospheric Particle Tracer Explorers (AMPTE) Charge Composition Explorer (CCE) spacecraft are discussed. These combined data sets have been, and will continue to be, used to survey the energetic ion environment in the Earth's magnetosheath to determine the origins and relative strengths of the energetic ion populations found there. A computer code was developed to analyze and interpret the data sets. The focus of the first year was on the determination of the contribution of leaked magnetospheric protons to the total energetic proton population. Emphasis was placed on intervals when the AMPTE spacecraft was in the plasma depletion layer because it was argued that in this region only the leaked population contributes to the energetic ion population. Manipulation of the CHEM data and comparison of the CHEM and HPCE data over their common energy range near the magnetopause also contributed directly to a second study of that region.
New developments and prospects on COSI, the simulation software for fuel cycle analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eschbach, R.; Meyer, M.; Coquelet-Pascal, C.
2013-07-01
COSI, software developed by the Nuclear Energy Division of the CEA, is a code simulating a pool of nuclear power plants with its associated fuel cycle facilities. This code has been designed to study various short, medium and long term options for the introduction of various types of nuclear reactors and for the use of associated nuclear materials. In the frame of the French Act for waste management, scenario studies are carried out with COSI to compare different options of evolution of the French reactor fleet and options of partitioning and transmutation of plutonium and minor actinides. Those studies aim in particular at evaluating the sustainability of Sodium cooled Fast Reactor (SFR) deployment and the possibility to transmute minor actinides. The COSI6 version is a completely renewed software released in 2006. COSI6 is now coupled with the last version of CESAR (CESAR5.3 based on JEFF3.1.1 nuclear data) allowing the calculations on irradiated fuel with 200 fission products and 100 heavy nuclides. A new release is planned in 2013, including in particular the coupling with a recommended database of reactors. An exercise of validation of COSI6, carried out on the French PWR historic nuclear fleet, has been performed. During this exercise, quantities such as cumulative natural uranium consumption, cumulative depleted uranium, UOX/MOX spent fuel storage, stocks of reprocessed uranium, plutonium content in fresh MOX fuel, and the annual production of high-level waste have been computed by COSI6 and compared to industrial data. The results have allowed us to validate the essential phases of the fuel cycle computation and reinforce the credibility of the results provided by the code.
Kelsen, Judith R.; Dawany, Noor; Moran, Christopher J.; Petersen, Britt-Sabina; Sarmady, Mahdi; Sasson, Ariella; Pauly-Hubbard, Helen; Martinez, Alejandro; Maurer, Kelly; Soong, Joanne; Rappaport, Eric; Franke, Andre; Keller, Andreas; Winter, Harland S.; Mamula, Petar; Piccoli, David; Artis, David; Sonnenberg, Gregory F.; Daly, Mark; Sullivan, Kathleen E.; Baldassano, Robert N.; Devoto, Marcella
2016-01-01
Background & Aims Very early onset inflammatory bowel disease (VEO-IBD), IBD diagnosed ≤5 y of age, frequently presents with a different and more severe phenotype than older-onset IBD. We investigated whether patients with VEO-IBD carry rare or novel variants in genes associated with immunodeficiencies that might contribute to disease development. Methods Patients with VEO-IBD and parents (when available) were recruited from the Children's Hospital of Philadelphia from March 2013 through July 2014. We analyzed DNA from 125 patients with VEO-IBD (ages 3 weeks to 4 y) and 19 parents, 4 of whom also had IBD. Exome capture was performed by Agilent SureSelect V4, and sequencing was performed using the Illumina HiSeq platform. Alignment to human genome GRCh37 was achieved followed by post-processing and variant calling. Following functional annotation, candidate variants were analyzed for change in protein function, minor allele frequency <0.1%, and scaled combined annotation dependent depletion scores ≤10. We focused on genes associated with primary immunodeficiencies and related pathways. An additional 210 exome samples from patients with pediatric IBD (n=45) or adult-onset Crohn's disease (n=20) and healthy individuals (controls, n=145) were obtained from the University of Kiel, Germany and used as control groups. Results Four-hundred genes and regions associated with primary immunodeficiency, covering approximately 6500 coding exons totaling > 1 Mbp of coding sequence, were selected from the whole exome data. Our analysis revealed novel and rare variants within these genes that could contribute to the development of VEO-IBD, including rare heterozygous missense variants in IL10RA and previously unidentified variants in MSH5 and CD19. Conclusions In an exome sequence analysis of patients with VEO-IBD and their parents, we identified variants in genes that regulate B- and T-cell functions and could contribute to pathogenesis. Our analysis could lead to the identification of previously unidentified IBD-associated variants. PMID:26193622
DOE Office of Scientific and Technical Information (OSTI.GOV)
Downar, Thomas
This report summarizes the current status of VERA-CS Verification and Validation for PWR Core Follow operation and proposes a multi-phase plan for continuing VERA-CS V&V in FY17 and FY18. The proposed plan recognizes the hierarchical nature of a multi-physics code system such as VERA-CS and the importance of first achieving an acceptable level of V&V on each of the single physics codes before focusing on the V&V of the coupled physics solution. The report summarizes the V&V of each of the single physics code systems currently used for core follow analysis (i.e., MPACT, CTF, Multigroup Cross Section Generation, and BISON / Fuel Temperature Tables) and proposes specific actions to achieve a uniformly acceptable level of V&V in FY17. The report also recognizes the ongoing development of other codes important for PWR Core Follow (e.g., TIAMAT, MAMBA3D) and proposes Phase II (FY18) VERA-CS V&V activities in which those codes will also reach an acceptable level of V&V. The report then summarizes the current status of VERA-CS multi-physics V&V for PWR Core Follow and the ongoing PWR Core Follow V&V activities for FY17. An automated procedure and output data format is proposed for standardizing the output for core follow calculations and automatically generating tables and figures for the VERA-CS LaTeX file. A set of acceptance metrics is also proposed for the evaluation and assessment of core follow results that would be used within the script to automatically flag any results which require further analysis or more detailed explanation prior to being added to the VERA-CS validation base. After the Automation Scripts have been completed and tested using BEAVRS, the VERA-CS plan proposes that the Watts Bar cycle depletion cases should be performed with the new cross section library and be included in the first draft of the new VERA-CS manual for release at the end of PoR15. Also, within the constraints imposed by the proprietary nature of plant data, as many as possible of the FY17 AMA Plant Core Follow cases should also be included in the VERA-CS manual at the end of PoR15. After completion of the ongoing development of TIAMAT for fully coupled, full core calculations with VERA-CS / BISON 1.5D, and after the completion of the refactoring of MAMBA3D for CIPS analysis in FY17, selected cases from the VERA-CS validation base should be performed, beginning with the legacy cases of Watts Bar and BEAVRS in PoR16. Finally, as potential Phase III future work some additional considerations are identified for extending the VERA-CS V&V to other reactor types such as the BWR.
Cloutier, Sara C; Wang, Siwen; Ma, Wai Kit; Al Husini, Nadra; Dhoondia, Zuzer; Ansari, Athar; Pascuzzi, Pete E; Tran, Elizabeth J
2016-02-04
Long non-coding (lnc)RNAs, once thought to merely represent noise from imprecise transcription initiation, have now emerged as major regulatory entities in all eukaryotes. In contrast to the rapidly expanding identification of individual lncRNAs, mechanistic characterization has lagged behind. Here we provide evidence that the GAL lncRNAs in the budding yeast S. cerevisiae promote transcriptional induction in trans by formation of lncRNA-DNA hybrids or R-loops. The evolutionarily conserved RNA helicase Dbp2 regulates formation of these R-loops as genomic deletion or nuclear depletion results in accumulation of these structures across the GAL cluster gene promoters and coding regions. Enhanced transcriptional induction is manifested by lncRNA-dependent displacement of the Cyc8 co-repressor and subsequent gene looping, suggesting that these lncRNAs promote induction by altering chromatin architecture. Moreover, the GAL lncRNAs confer a competitive fitness advantage to yeast cells because expression of these non-coding molecules correlates with faster adaptation in response to an environmental switch. Copyright © 2016 Elsevier Inc. All rights reserved.
Tarasova, Irina A; Lobas, Anna A; Černigoj, Urh; Solovyeva, Elizaveta M; Mahlberg, Barbara; Ivanov, Mark V; Panić-Janković, Tanja; Nagy, Zoltan; Pridatchenko, Marina L; Pungor, Andras; Nemec, Blaž; Vidic, Urška; Gašperšič, Jernej; Krajnc, Nika Lendero; Vidič, Jana; Gorshkov, Mikhail V; Mitulović, Goran
2016-09-01
Affinity depletion of abundant proteins such as HSA is an important stage in routine sample preparation prior to MS/MS analysis of biological samples with a high range of concentrations. Due to charge competition effects in the electrospray ion source, which result in discrimination of low-abundance species, as well as the limited dynamic range of MS/MS, typically restricted to three orders of magnitude, the identification of low-abundance proteins becomes a challenge unless the sample is depleted of high-concentration compounds. This dictates a need for developing efficient separation technologies allowing fast and automated protein depletion. In this study, we evaluated a novel immunoaffinity-based Convective Interaction Media analytical (CIMac) depletion column with specificity for HSA (CIMac-αHSA). Because of the convective flow-through channels, the polymethacrylate CIMac monoliths afford flow-rate-independent binding capacity and resolution, which results in relatively short analysis times compared with traditional chromatographic supports. The Seppro IgY14 depletion kit was used as a benchmark to control the results of depletion. A bottom-up proteomic approach followed by label-free quantitation using normalized spectral indexes was employed for protein quantification in G1/G2 and cleavage/blastocyst in vitro fertilization culture media widely utilized in clinics for embryo growth in vitro. The results revealed approximately equal HSA levels of 100 ± 25% in albumin-enriched fractions relative to the nondepleted samples for both the CIMac-αHSA column and the Seppro kit. In the albumin-free fractions concentrated 5.5-fold by volume, serum albumin was identified at levels of 5-30% and 20-30% for the CIMac-αHSA and Seppro IgY14 spin columns, respectively. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Bilichak, Andriy; Ilnystkyy, Yaroslav; Hollunder, Jens; Kovalchuk, Igor
2012-01-01
Plants are able to acclimate to new growth conditions on a relatively short time-scale. Recently, we showed that the progeny of plants exposed to various abiotic stresses exhibited changes in genome stability, methylation patterns and stress tolerance. Here, we performed a more detailed analysis of methylation patterns in the progeny of Arabidopsis thaliana (Arabidopsis) plants exposed to 25 and 75 mM sodium chloride. We found that the majority of gene promoters exhibiting changes in methylation were hypermethylated, and this group was overrepresented by regulators of the chromatin structure. The analysis of DNA methylation at gene bodies showed that hypermethylation in the progeny of stressed plants was primarily due to changes in the 5′ and 3′ ends as well as in exons rather than introns. All but one hypermethylated gene tested had lower gene expression. The analysis of histone modifications in the promoters and coding sequences showed that hypermethylation and lower gene expression correlated with the enrichment of H3K9me2 and depletion of H3K9ac histones. Thus, our work demonstrated a high degree of correlation between changes in DNA methylation, histone modifications and gene expression in the progeny of salt-stressed plants. PMID:22291972
NASA Astrophysics Data System (ADS)
Guber, C. R.; Richter, P.; Wendt, M.
2018-01-01
Aims: To investigate the dust depletion properties of optically thick gas in and around galaxies and its origin, we study in detail the dust depletion patterns of Ti, Mn, and Ca in the multi-component damped Lyman-α (DLA) absorber at z_abs = 0.313 toward the quasar PKS 1127-145. Methods: We performed a detailed spectral analysis of the absorption profiles of Ca II, Mn II, Ti II, and Na I associated with the DLA toward PKS 1127-145, based on optical high-resolution data obtained with the UVES instrument at the Very Large Telescope. We obtained column densities and Doppler parameters for the ions listed above and determined their gas-phase abundances, from which we infer their dust depletion properties. We compared the Ca and Ti depletion properties of this DLA with those of other DLAs. Results: One of the six analyzed absorption components (component 3) shows a striking underabundance of Ti and Mn in the gas phase, indicating the effect of dust depletion for these elements and a locally enhanced dust-to-gas ratio. In this DLA and in other similar absorbers, the Mn II abundance follows that of Ti II very closely, implying that both ions are equally sensitive to dust depletion effects. Conclusions: Our analysis indicates that the DLA toward PKS 1127-145 has multiple origins. With its narrow line width and its strong dust depletion, component 3 points toward the presence of a neutral gas disk from a faint LSB galaxy in front of PKS 1127-145, while the other, more diffuse and dust-poor, absorption components are possibly related to tidal gas features from the interaction between the various optically confirmed galaxy-group members. In general, the Mn/Ca II ratio in sub-DLAs and DLAs possibly serves as an important indicator to discriminate between dust-rich and dust-poor neutral gas in and around galaxies.
Wu, Chia-Lung; McNeill, Jenna; Goon, Kelsey; Little, Dianne; Kimmerling, Kelly; Huebner, Janet; Kraus, Virginia; Guilak, Farshid
2017-09-01
To investigate whether short-term, systemic depletion of macrophages can mitigate osteoarthritis (OA) following injury in the setting of obesity. CSF-1R-GFP+ macrophage Fas-induced apoptosis (MaFIA)-transgenic mice that allow conditional depletion of macrophages were placed on a high-fat diet and underwent surgery to induce knee OA. A small molecule (AP20187) was administered to deplete macrophages in MaFIA mice. The effects of macrophage depletion on acute joint inflammation, OA severity, and arthritic bone changes were evaluated using histology and micro-computed tomography. Immunohistochemical analysis was performed to identify various immune cells. The levels of serum and synovial fluid cytokines were also measured. Macrophage-depleted mice had significantly fewer M1 and M2 macrophages in the surgically operated joints relative to controls and exhibited decreased osteophyte formation immediately following depletion. Surprisingly, macrophage depletion did not attenuate the severity of OA in obese mice; instead, it induced systemic inflammation and led to a massive infiltration of CD3+ T cells and particularly neutrophils, but not B cells, into the injured joints. Macrophage-depleted mice also demonstrated markedly increased levels of proinflammatory cytokines including granulocyte colony-stimulating factor, interleukin-1β (IL-1β), IL-6, IL-8, and tumor necrosis factor in both serum and joint synovial fluid, although the mice showed a trend toward decreased levels of insulin and leptin in serum after macrophage depletion. Our findings indicate that macrophages are vital for modulating homeostasis of immune cells in the setting of obesity and suggest that more targeted approaches of depleting specific macrophage subtypes may be necessary to mitigate inflammation and OA in the setting of obesity. © 2017, American College of Rheumatology.
Snail1 transcription factor controls telomere transcription and integrity
Mazzolini, Rocco; Gonzàlez, Núria; Garcia-Garijo, Andrea; Millanes-Romero, Alba; Peiró, Sandra; Smith, Susan
2018-01-01
Besides controlling epithelial-to-mesenchymal transition (EMT) and cell invasion, the Snail1 transcription factor also provides cells with cancer stem cell features. Since telomere maintenance is essential for stemness, we have examined the control of telomere integrity by Snail1. Fluorescence in situ hybridization (FISH) analysis indicates that Snail1-depleted mouse mesenchymal stem cells (MSC) have both a dramatic increase of telomere alterations and shorter telomeres. Remarkably, Snail1-deficient MSC present higher levels of both telomerase activity and the long non-coding RNA called telomeric repeat-containing RNA (TERRA), an RNA that controls telomere integrity. Accordingly, Snail1 expression downregulates expression of the telomerase gene (TERT) as well as of TERRA 2q, 11q and 18q. TERRA and TERT are transiently downregulated during TGFβ-induced EMT in NMuMG cells, correlating with Snail1 expression. Global transcriptome analysis indicates that ectopic expression of TERRA affects the transcription of some genes induced during EMT, such as fibronectin, whereas that of TERT does not modify those genes. We propose that Snail1 repression of TERRA is required not only for telomere maintenance but also for the expression of a subset of mesenchymal genes. PMID:29059385
Colombo, Lívia Tavares; de Oliveira, Marcelo Nagem Valério; Carneiro, Deisy Guimarães; de Souza, Robson Assis; Alvim, Mariana Caroline Tocantins; Dos Santos, Josenilda Carlos; da Silva, Cynthia Canêdo; Vidigal, Pedro Marcus Pereira; da Silveira, Wendel Batista; Passos, Flávia Maria Lopes
2016-09-01
Environments where lignocellulosic biomass is naturally decomposed are sources for discovery of new hydrolytic enzymes that can reduce the high cost of enzymatic cocktails for second-generation ethanol production. Metagenomic analysis was applied to discover genes coding carbohydrate-depleting enzymes from a microbial laboratory subculture using a mix of sugarcane bagasse and cow manure in the thermophilic composting phase. From a fosmid library, 182 clones had the ability to hydrolyse carbohydrate. Sequencing of 30 fosmids resulted in 12 contigs encoding 34 putative carbohydrate-active enzymes belonging to 17 glycosyl hydrolase (GH) families. One third of the putative proteins belong to the GH3 family, which includes β-glucosidase enzymes known to be important in the cellulose-deconstruction process but present with low activity in commercial enzyme preparations. Phylogenetic analysis of the amino acid sequences of seven selected proteins, including three β-glucosidases, showed low relatedness with protein sequences deposited in databases. These findings highlight microbial consortia obtained from a mixture of decomposing biomass residues, such as sugar cane bagasse and cow manure, as a rich resource of novel enzymes potentially useful in biotechnology for saccharification of lignocellulosic substrate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strydom, Gerhard; Bostelmann, F.
The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained.) SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on the HTGR Uncertainty Analysis in Modelling (UAM) be implemented. This CRP is a continuation of the previous IAEA and Organization for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) international activities on Verification and Validation (V&V) of available analytical capabilities for HTGR simulation for design and safety evaluations. Within the framework of these activities different numerical and experimental benchmark problems were performed and insight was gained about specific physics phenomena and the adequacy of analysis methods.
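A stochastic-sampling flavour of the SA/UA described above can be sketched in a few lines: sample the uncertain inputs, evaluate a response for each sample, and rank inputs by their correlation with the output. The input names, relative uncertainties, and the linear toy response below are assumptions standing in for a full lattice or core calculation driven by sampled nuclear data.

import numpy as np

rng = np.random.default_rng(1)
names = ["capture_xs", "fission_xs", "scatter_xs", "graphite_density"]
rel_sd = np.array([0.02, 0.01, 0.03, 0.005])      # assumed 1-sigma relative uncertainties

n = 500
samples = 1.0 + rng.normal(0.0, rel_sd, size=(n, len(names)))   # uncorrelated sampling

# Toy multiplication-factor response standing in for a transport/depletion run.
k_eff = (1.05 - 0.8 * (samples[:, 0] - 1.0) + 0.9 * (samples[:, 1] - 1.0)
         + 0.1 * (samples[:, 2] - 1.0) - 0.05 * (samples[:, 3] - 1.0))

for i, name in enumerate(names):
    r = np.corrcoef(samples[:, i], k_eff)[0, 1]
    print(f"{name:18s} Pearson r = {r:+.2f}")
print(f"k_eff mean = {k_eff.mean():.5f}, 1-sigma = {k_eff.std():.5f}")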
Equilibrium cycle pin by pin transport depletion calculations with DeCART
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kochunas, B.; Downar, T.; Taiwo, T.
As the Advanced Fuel Cycle Initiative (AFCI) program has matured it has become more important to utilize more advanced simulation methods. The work reported here was performed as part of the AFCI fellowship program to develop and demonstrate the capability of performing high fidelity equilibrium cycle calculations. As part of the work here, a new multi-cycle analysis capability was implemented in the DeCART code which included modifying the depletion modules to perform nuclide decay calculations, implementing an assembly shuffling pattern description, and modifying iteration schemes. During the work, stability issues were uncovered with respect to converging simultaneously the neutron flux, isotopics, and fluid density and temperature distributions in 3-D. Relaxation factors were implemented which considerably improved the stability of the convergence. To demonstrate the capability two core designs were utilized, a reference UOX core and a CORAIL core. Full core equilibrium cycle calculations were performed on both cores and the discharge isotopics were compared. From this comparison it was noted that the improved modeling capability was not drastically different in its prediction of the discharge isotopics when compared to 2-D single assembly or 2-D core models. For fissile isotopes such as U-235, Pu-239, and Pu-241 the relative differences were 1.91%, 1.88%, and 0.59%, respectively. While this difference may not seem large, it translates to mass differences on the order of tens of grams per assembly, which may be significant for the purposes of accounting of special nuclear material. (authors)
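The relaxation factors mentioned above amount to an under-relaxed fixed-point update on each exchanged field. The toy map below diverges under a plain Picard iteration but converges once the update is damped; the relaxation factor of 0.5 is illustrative, not the value used in DeCART.

def relaxed_update(old, new, omega=0.5):
    # Under-relaxed Picard update: keep a fraction (1 - omega) of the previous
    # iterate to damp oscillations in coupled flux/isotopics/thermal iterations.
    return (1.0 - omega) * old + omega * new

g = lambda x: 2.0 - 1.2 * x        # |g'| > 1: the plain iteration x <- g(x) diverges
x = 0.0
for _ in range(50):
    x = relaxed_update(x, g(x), omega=0.5)
print(round(x, 4))                 # approaches the fixed point x* = 2/2.2 ≈ 0.9091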
NASA Astrophysics Data System (ADS)
Wang, Lei; Bai, Bing; Li, Xiaochun; Liu, Mingze; Wu, Haiqing; Hu, Shaobin
2016-07-01
Induced seismicity and fault reactivation associated with fluid injection and depletion were reported in hydrocarbon, geothermal, and waste fluid injection fields worldwide. Here, we establish an analytical model to assess fault reactivation surrounding a reservoir during fluid injection and extraction that considers the stress concentrations at the fault tips and the effects of fault length. In this model, induced stress analysis in a full-space under the plane strain condition is implemented based on Eshelby's theory of inclusions in terms of a homogeneous, isotropic, and poroelastic medium. The stress intensity factor concept in linear elastic fracture mechanics is adopted as an instability criterion for pre-existing faults in surrounding rocks. To characterize the fault reactivation caused by fluid injection and extraction, we define a new index, the "fault reactivation factor" η, which can be interpreted as an index of fault stability in response to a unit fluid pressure change within a reservoir resulting from injection or extraction. The critical fluid pressure change within a reservoir is also determined by the superposition principle using the in situ stress surrounding a fault. Our parameter sensitivity analyses show that the fault reactivation tendency is strongly sensitive to fault location, fault length, fault dip angle, and Poisson's ratio of the surrounding rock. Our case study demonstrates that the proposed model focuses on the mechanical behavior of the whole fault, unlike conventional methodologies. The proposed method can be applied to engineering cases related to injection and depletion within a reservoir owing to its efficient computational implementation.
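A highly simplified version of the stress-intensity instability check can be sketched as follows. It uses the textbook plane-strain expression K_II = Δτ·sqrt(π·a) for a uniformly loaded crack together with a frictional reduction of the driving shear, rather than the paper's full poroelastic inclusion solution, and every number (fault size, stress changes, critical K_II) is a placeholder.

import numpy as np

def mode2_sif(delta_tau_eff, half_length):
    # Mode-II stress intensity factor for a crack of half-length a under a
    # uniform effective shear stress change (plane strain, textbook form).
    return delta_tau_eff * np.sqrt(np.pi * half_length)

a = 500.0                  # fault half-length, m
mu_f = 0.6                 # fault friction coefficient
d_sigma_n = 0.3e6          # increase in fault-normal compression, Pa
d_tau = 0.5e6              # induced shear stress change on the fault plane, Pa
K_IIc = 3.0e6              # assumed critical stress intensity, Pa*sqrt(m)

d_tau_eff = max(d_tau - mu_f * d_sigma_n, 0.0)   # shear minus added frictional resistance
K_II = mode2_sif(d_tau_eff, a)
print(f"K_II = {K_II / 1e6:.1f} MPa*sqrt(m); reactivation predicted: {K_II >= K_IIc}")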
Analysis of the depletion of a stored aerosol in low gravity
NASA Technical Reports Server (NTRS)
Squires, P.
1977-01-01
The depletion of an aerosol stored in a container has been studied in 1-g and in low gravity. Models were developed for sedimentation, coagulation and diffusional losses to the walls. The overall depletion caused by these three mechanisms is predicted to be of order 5 to 8 percent per hour in terrestrial conditions, which agrees with laboratory experience. Applying the models to a low gravity situation indicates that there only coagulation will be significant. (Gravity influences diffusional losses because of convection currents caused by random temperature gradients). For the types of aerosol studied, the rate of depletion of particles should be somewhat less than 0.001 N percent per hour, where N is the concentration per cu cm.
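The combined depletion can be sketched as a single rate equation with a first-order term lumping sedimentation and diffusional wall losses and a second-order coagulation term. The rate constants below are placeholders chosen only so that the 1-g case lands near the few-percent-per-hour regime quoted above; they are not the report's fitted values.

def deplete(N0, hours, k_first, K_coag=1.0e-6, dt=1.0e-3):
    # Integrate dN/dt = -k_first*N - K_coag*N**2  (N in cm^-3, time in hours).
    # k_first lumps sedimentation + wall diffusion (set to ~0 in low gravity);
    # K_coag is a coagulation coefficient in cm^3 per hour.
    N = float(N0)
    for _ in range(int(hours / dt)):
        N += dt * (-k_first * N - K_coag * N * N)
    return N

N0 = 1.0e4                                   # particles per cm^3
for label, k in (("1-g (walls + sedimentation)", 0.06), ("low gravity", 0.0)):
    N1 = deplete(N0, hours=1.0, k_first=k)
    print(f"{label:28s} depletion after 1 h: {100.0 * (1.0 - N1 / N0):.2f} %")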
Remanent Activation in the Mini-SHINE Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Micklich, Bradley J.
2015-04-16
Argonne National Laboratory is assisting SHINE Medical Technologies in developing a domestic source of the medical isotope 99Mo through the fission of low-enrichment uranium in a uranyl sulfate solution. In Phase 2 of these experiments, electrons from a linear accelerator create neutrons by interacting in a depleted uranium target, and these neutrons are used to irradiate the solution. The resulting neutron and photon radiation activates the target, the solution vessels, and a shielded cell that surrounds the experimental apparatus. When the experimental campaign is complete, the target must be removed into a shielding cask, and the experimental components must be disassembled. The radiation transport code MCNPX and the transmutation code CINDER were used to calculate the radionuclide inventories of the solution, the target assembly, and the shielded cell, and to determine the dose rates and shielding requirements for selected removal scenarios for the target assembly and the solution vessels.
Stochastic interactions of two Brownian hard spheres in the presence of depletants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karzar-Jeddi, Mehdi; Fan, Tai-Hsi, E-mail: thfan@engr.uconn.edu; Tuinier, Remco
2014-06-07
A quantitative analysis is presented for the stochastic interactions of a pair of Brownian hard spheres in non-adsorbing polymer solutions. The hard spheres are hypothetically trapped by optical tweezers and allowed for random motion near the trapped positions. The investigation focuses on the long-time correlated Brownian motion. The mobility tensor altered by the polymer depletion effect is computed by the boundary integral method, and the corresponding random displacement is determined by the fluctuation-dissipation theorem. From our computations it follows that the presence of depletion layers around the hard spheres has a significant effect on the hydrodynamic interactions and particle dynamics as compared to pure solvent and uniform polymer solution cases. The probability distribution functions of random walks of the two interacting hard spheres that are trapped clearly shift due to the polymer depletion effect. The results show that the reduction of the viscosity in the depletion layers around the spheres and the entropic force due to the overlapping of depletion zones have a significant influence on the correlated Brownian interactions.
Seligmann, Hervé
2013-03-01
Usual DNA→RNA transcription exchanges T→U. Assuming different systematic symmetric nucleotide exchanges during translation, some GenBank RNAs match exactly human mitochondrial sequences (exchange rules listed in decreasing transcript frequencies): C↔U, A↔U, A↔U+C↔G (two nucleotide pairs exchanged), G↔U, A↔G, C↔G, none for A↔C, A↔G+C↔U, and A↔C+G↔U. Most unusual transcripts involve exchanging uracil. Independent measures of rates of rare replicational enzymatic DNA nucleotide misinsertions predict frequencies of RNA transcripts systematically exchanging the corresponding misinserted nucleotides. Exchange transcripts self-hybridize less than other gene regions, self-hybridization increases with length, suggesting endoribonuclease-limited elongation. Blast detects stop codon depleted putative protein coding overlapping genes within exchange-transcribed mitochondrial genes. These align with existing GenBank proteins (mainly metazoan origins, prokaryotic and viral origins underrepresented). These GenBank proteins frequently interact with RNA/DNA, are membrane transporters, or are typical of mitochondrial metabolism. Nucleotide exchange transcript frequencies increase with overlapping gene densities and stop densities, indicating finely tuned counterbalancing regulation of expression of systematic symmetric nucleotide exchange-encrypted proteins. Such expression necessitates combined activities of suppressor tRNAs matching stops, and nucleotide exchange transcription. Two independent properties confirm predicted exchanged overlap coding genes: discrepancy of third codon nucleotide contents from replicational deamination gradients, and codon usage according to circular code predictions. Predictions from both properties converge, especially for frequent nucleotide exchange types. Nucleotide exchanging transcription apparently increases coding densities of protein coding genes without lengthening genomes, revealing unsuspected functional DNA coding potential. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Rotureau, Elise; Billard, Patrick; Duval, Jérôme F L
2015-01-20
Bioavailability of trace metals is a key parameter for assessment of toxicity on living organisms. Proper evaluation of metal bioavailability requires monitoring the various interfacial processes that control metal partitioning dynamics at the biointerface, which includes metal transport from solution to cell membrane, adsorption at the biosurface, internalization, and possible excretion. In this work, a methodology is proposed to quantitatively describe the dynamics of Cd(II) uptake by Pseudomonas putida. The analysis is based on the kinetic measurement of Cd(II) depletion from bulk solution at various initial cell concentrations using electroanalytical probes. On the basis of a recent formalism on the dynamics of metal uptake by complex biointerphases, the cell concentration-dependent depletion time scales and plateau values reached by metal concentrations at long exposure times (>3 h) are successfully rationalized in terms of limiting metal uptake flux, rate of excretion, and metal affinity to internalization sites. The analysis shows the limits of approximate depletion models valid in the extremes of high and weak metal affinities. The contribution of conductive diffusion transfer of metals from the solution to the cell membrane in governing the rate of Cd(II) uptake is further discussed on the basis of estimated resistances for metal membrane transfer and extracellular mass transport.
Observations Regarding Use of Advanced CFD Analysis, Sensitivity Analysis, and Design Codes in MDO
NASA Technical Reports Server (NTRS)
Newman, Perry A.; Hou, Gene J. W.; Taylor, Arthur C., III
1996-01-01
Observations regarding the use of advanced computational fluid dynamics (CFD) analysis, sensitivity analysis (SA), and design codes in gradient-based multidisciplinary design optimization (MDO) reflect our perception of the interactions required of CFD and our experience in recent aerodynamic design optimization studies using CFD. Sample results from these latter studies are summarized for conventional optimization (analysis - SA codes) and simultaneous analysis and design optimization (design code) using both Euler and Navier-Stokes flow approximations. The amount of computational resources required for aerodynamic design using CFD via analysis - SA codes is greater than that required for design codes. Thus, an MDO formulation that utilizes the more efficient design codes where possible is desired. However, in the aerovehicle MDO problem, the various disciplines that are involved have different design points in the flight envelope; therefore, CFD analysis - SA codes are required at the aerodynamic 'off design' points. The suggested MDO formulation is a hybrid multilevel optimization procedure that consists of both multipoint CFD analysis - SA codes and multipoint CFD design codes that perform suboptimizations.
Ryu, Moon-Suhn; Langkamp-Henken, Bobbi; Chang, Shou-Mei; Shankar, Meena N; Cousins, Robert J
2011-12-27
Implementation of zinc interventions for subjects suspected of being zinc-deficient is a global need, but is limited due to the absence of reliable biomarkers. To discover molecular signatures of human zinc deficiency, a combination of transcriptome, cytokine, and microRNA analyses was applied to a dietary zinc depletion/repletion protocol with young male human subjects. Concomitant with a decrease in serum zinc concentration, changes in buccal and blood gene transcripts related to zinc homeostasis occurred with zinc depletion. Microarray analyses of whole blood RNA revealed zinc-responsive genes, particularly those associated with cell cycle regulation and immunity. Responses of potential signature genes of dietary zinc depletion were further assessed by quantitative real-time PCR. The diagnostic properties of specific serum microRNAs for dietary zinc deficiency were identified by acute responses to zinc depletion, which were reversible by subsequent zinc repletion. Depression of immune-stimulated TNFα secretion by blood cells was observed after low zinc consumption and may serve as a functional biomarker. Our findings introduce numerous novel candidate biomarkers for dietary zinc status assessment using a variety of contemporary technologies; these biomarkers identify changes that occur earlier, or with greater sensitivity, than the serum zinc concentration, the current marker of zinc status. In addition, the results of gene network analysis reveal potential clinical outcomes attributable to suboptimal zinc intake, including immune function defects and predisposition to cancer. These results demonstrate, through a controlled depletion/repletion dietary protocol, that the elusive zinc biomarker(s) can be identified and applied to assessment and intervention strategies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, S.M.
1995-01-01
The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized-water reactors. The analysis methodology selected for all the calculations reported herein is based on the codes and data provided in the SCALE-4 code system. The isotopic densities for the spent fuel assemblies in the critical configurations were calculated using the SAS2H analytical sequence of the SCALE-4 system. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code module was used to extract the necessary isotopic densities from the SAS2H results and provide the data in the format required by the SCALE criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of the cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k{sub eff}) of each case. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for all the calculations. This volume of the report documents the SCALE system analysis of three reactor critical configurations for the Sequoyah Unit 2 Cycle 3. This unit and cycle were chosen because of the relevance in spent fuel benchmark applications: (1) the unit had a significantly long downtime of 2.7 years during the middle of cycle (MOC) 3, and (2) the core consisted entirely of burned fuel at the MOC restart. The first benchmark critical calculation was the MOC restart at hot, full-power (HFP) critical conditions. The other two benchmark critical calculations were the beginning-of-cycle (BOC) startup at both hot, zero-power (HZP) and HFP critical conditions. These latter calculations were used to check for consistency in the calculated results for different burnups and downtimes. The k{sub eff} results were in the range of 1.00014 to 1.00259 with a standard deviation of less than 0.001.
NASA Astrophysics Data System (ADS)
Korkut, A.
It is well known that semiconductor surfaces are easily oxidized by exposure to air over time. This work studies the characterization of Schottky diodes and the changes in depletion capacitance caused by air exposure of a group of Cu/n-Si/Al Schottky diodes. First, current-voltage and capacitance-voltage data were taken, and then the ideality factor, barrier height, built-in potential (Vbi), donor concentration, Fermi level, interfacial oxide thickness, and interface state density were calculated. The depletion capacitance was then calculated, and the built-in potential was found to play an important role in the Schottky diode characteristics. The built-in potential directly affects the characteristic of the Schottky diodes, and a turning point occurs. For forward and reverse bias, the depletion capacitance versus voltage curves match, but in opposite directions. Under forward bias, the differential depletion capacitance begins from negative values, rises to the first Vbi, then decreases to the second Vbi while remaining negative; it then goes sharply up to a positive apex and falls sharply back toward zero, but takes positive values depending on the DC voltage. Under reverse bias, the differential depletion capacitance takes small positive values. In other respects, we see that the depletion characteristics change considerably under DC voltage.
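The donor concentration and built-in potential mentioned above are conventionally extracted from the linear region of a 1/C² versus V plot of the reverse-bias depletion capacitance. The sketch below illustrates that standard analysis on synthetic data; the diode area, doping level, and built-in potential used to generate the data are hypothetical and are not values from this work.

```python
import numpy as np

# Standard Schottky-diode C-V (Mott-Schottky) analysis: 1/C^2 is linear in V,
# and the slope/intercept give the donor concentration and built-in potential.
q, eps0 = 1.602e-19, 8.854e-12
eps_s = 11.7 * eps0               # silicon permittivity [F/m]
A = 7.85e-7                       # diode area [m^2] (hypothetical, ~1 mm diameter)
Nd_true, Vbi_true = 1e21, 0.65    # hypothetical donor density [1/m^3] and V_bi [V]

# Synthetic C-V data from the abrupt-junction depletion-capacitance formula
V = np.linspace(-2.0, 0.3, 24)
C = A * np.sqrt(q * eps_s * Nd_true / (2.0 * (Vbi_true - V)))

# Linear fit of 1/C^2 vs V, then recover N_d and V_bi from slope and intercept
slope, intercept = np.polyfit(V, 1.0 / C**2, 1)
Nd = -2.0 / (q * eps_s * A**2 * slope)
Vbi = -intercept / slope
print(f"N_d ≈ {Nd:.2e} m^-3, V_bi ≈ {Vbi:.2f} V")
```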
Moschonis, George; Mavrogianni, Christina; Giannopoulou, Angeliki; Damianidi, Louisa; Chrousos, George P.; Manios, Yannis
2013-01-01
The aim of the present study was to investigate the associations of iron depletion (ID) with menstrual blood losses, lifestyle, and dietary habits, in pubertal girls. The study sample comprised 1222 girls aged 9–13 years old. Biochemical, anthropometrical, dietary, clinical, and physical activity data were collected. Out of 274 adolescent girls with menses, 33.5% were found to be iron depleted (defined as serum ferritin < 12 μg/L) compared to 15.9% out of 948 girls without menses. Iron-depleted girls without menses were found to have lower consumption of poultry (P = 0.017) and higher consumption of fruits (P = 0.044) and fast food (P = 0.041) compared to their peers having normal iron status. Multivariate logistic regression analysis showed that girls with menses were 2.57 (95% CI: 1.37, 4.81) times more likely of being iron depleted compared to girls with no menses. Iron depletion was found to be associated with high calcium intake, high consumption of fast foods, and low consumption of poultry and fruits. Menses was the only factor that was found to significantly increase the likelihood of ID in these girls. More future research is probably needed in order to better understand the role of diet and menses in iron depletion. PMID:24455693
NASA Astrophysics Data System (ADS)
Ameur, Fatah; Amichi, Hichem; Kuper, Marcel; Hammani, Ali
2017-09-01
Much attention has been paid to the issue of groundwater depletion linked to intensive groundwater-based agriculture in (semi-)arid areas. Often referred to as the "overexploitation" of aquifers, groundwater depletion is generally attributed to the entire agricultural sector without distinguishing between different uses and users. Although it expresses a general concern for future users, the ambiguous term of "overexploitation" does not acknowledge the contested nature of groundwater use and emerging inequalities. Also, the impact of inequality on groundwater depletion is rarely questioned. The aim of this article is to investigate how and by whom groundwater is depleted, and in turn, how unequal access to groundwater fuels the socioeconomic differentiation of farms and groundwater depletion. Based on a detailed analysis of groundwater use from a user perspective in two irrigated areas in North Africa (Morocco and Algeria), this study shows how the context of groundwater depletion exacerbates—and is exacerbated by—existing inequalities. The paper concludes that knowing how much is withdrawn, where, and by whom provides helpful information for more informed groundwater management by a better understanding of the response of users to declining groundwater conditions and the interests and incentives of different social categories of farmers to contribute to groundwater management.
Evaluation of SDS depletion using an affinity spin column and IMS-MS detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hengel, Shawna M.; Floyd, Erica A.; Baker, Erin Shammel
2012-11-01
While the use of detergents is necessary for a variety of protein isolation and preparation protocols, often prior to mass spectral (MS) analysis, they are not compatible with MS analysis due to ion suppression and adduct formation. This manuscript describes optimization of detergent removal, using commercially available SDS depletion spin columns containing an affinity resin, providing for both increased protein recovery and thorough SDS removal. Ion mobility spectrometry coupled with mass spectrometry (IMS-MS) allowed for a concurrent analysis of both analyte and detergent. For both proteins and peptides, higher detergent concentrations than previously reported provided an increase in sample recovery; however, there was a limit, as SDS was detected by IMS-MS at higher SDS levels, indicating incomplete detergent depletion. The results also suggest that optimal conditions for SDS removal are dependent on the sample concentration. Overall, this study provides a useful guide for proteomic studies where SDS is required for efficient sample preparation.
Emergent rules for codon choice elucidated by editing rare arginine codons in Escherichia coli
Napolitano, Michael G.; Landon, Matthieu; Gregg, Christopher J.; Lajoie, Marc J.; Govindarajan, Lakshmi; Mosberg, Joshua A.; Kuznetsov, Gleb; Goodman, Daniel B.; Vargas-Rodriguez, Oscar; Isaacs, Farren J.; Söll, Dieter; Church, George M.
2016-01-01
The degeneracy of the genetic code allows nucleic acids to encode amino acid identity as well as noncoding information for gene regulation and genome maintenance. The rare arginine codons AGA and AGG (AGR) present a case study in codon choice, with AGRs encoding important transcriptional and translational properties distinct from the other synonymous alternatives (CGN). We created a strain of Escherichia coli with all 123 instances of AGR codons removed from all essential genes. We readily replaced 110 AGR codons with the synonymous CGU codons, but the remaining 13 “recalcitrant” AGRs required diversification to identify viable alternatives. Successful replacement codons tended to conserve local ribosomal binding site-like motifs and local mRNA secondary structure, sometimes at the expense of amino acid identity. Based on these observations, we empirically defined metrics for a multidimensional “safe replacement zone” (SRZ) within which alternative codons are more likely to be viable. To evaluate synonymous and nonsynonymous alternatives to essential AGRs further, we implemented a CRISPR/Cas9-based method to deplete a diversified population of a wild-type allele, allowing us to evaluate exhaustively the fitness impact of all 64 codon alternatives. Using this method, we confirmed the relevance of the SRZ by tracking codon fitness over time in 14 different genes, finding that codons that fall outside the SRZ are rapidly depleted from a growing population. Our unbiased and systematic strategy for identifying unpredicted design flaws in synthetic genomes and for elucidating rules governing codon choice will be crucial for designing genomes exhibiting radically altered genetic codes. PMID:27601680
Van Rechem, Capucine; Black, Joshua C; Greninger, Patricia; Zhao, Yang; Donado, Carlos; Burrowes, Paul D; Ladd, Brendon; Christiani, David C; Benes, Cyril H; Whetstine, Johnathan R
2015-03-01
SNPs occur within chromatin-modulating factors; however, little is known about how these variants within the coding sequence affect cancer progression or treatment. Therefore, there is a need to establish their biochemical and/or molecular contribution, their use in subclassifying patients, and their impact on therapeutic response. In this report, we demonstrate that coding SNP-A482 within the lysine tridemethylase gene KDM4A/JMJD2A has different allelic frequencies across ethnic populations, associates with differential outcome in patients with non-small cell lung cancer (NSCLC), and promotes KDM4A protein turnover. Using an unbiased drug screen against 87 preclinical and clinical compounds, we demonstrate that homozygous SNP-A482 cells have increased mTOR inhibitor sensitivity. mTOR inhibitors significantly reduce SNP-A482 protein levels, which parallels the increased drug sensitivity observed with KDM4A depletion. Our data emphasize the importance of using variant status as candidate biomarkers and highlight the importance of studying SNPs in chromatin modifiers to achieve better targeted therapy. This report documents the first coding SNP within a lysine demethylase that associates with worse outcome in patients with NSCLC. We demonstrate that this coding SNP alters the protein turnover and associates with increased mTOR inhibitor sensitivity, which identifies a candidate biomarker for mTOR inhibitor therapy and a therapeutic target for combination therapy. ©2015 American Association for Cancer Research.
A deep search for H2D+ in protoplanetary disks. Perspectives for ALMA
NASA Astrophysics Data System (ADS)
Chapillon, E.; Parise, B.; Guilloteau, S.; Du, F.
2011-09-01
Context. The structure in density and temperature of protoplanetary disks surrounding low-mass stars is not well known yet. The protoplanetary disk midplanes are expected to be very cold and thus depleted in gas-phase molecules, especially CO. Recent observations of molecules at very low apparent temperatures (~6 K) challenge this current picture of the protoplanetary disk structures. Aims: We aim at constraining the physical conditions and, in particular, the gas-phase CO abundance in the midplane of protoplanetary disks. Methods: The light molecule H2D+ is a tracer of cold, CO-depleted environments. It is therefore a good candidate for exploring the disk midplanes. We performed a deep search for H2D+ in the two well-known disks surrounding TW Hya and DM Tau using the APEX and JCMT telescopes. The analysis of the observations was done with DISKFIT, a radiative transfer code dedicated to disks. In addition, we used a chemical model describing deuterium chemistry to infer the implications of our observations on the level of CO depletion and on the ionization rate in the disk midplane. Results: The ortho-H2D+ (11,0-11,1) line at 372 GHz was not detected. Although our limit is three times better than previous observations, comparison with the chemical modeling indicates that it is still insufficient for putting useful constraints on the CO abundance in the disk midplane. Conclusions: Even with ALMA, the detection of H2D+ may not be straightforward, and H2D+ may not be sensitive enough to trace the protoplanetary disk midplanes. Based on observations carried out with the Atacama Pathfinder Experiment and the James Clerk Maxwell Telescope. APEX is a collaboration between the Max-Planck-Institut für Radioastronomie, the European Southern Observatory, and the Onsala Space Observatory. The JCMT is operated by the Joint Astronomy Centre on behalf of the Science and Technology Facilities Council of the United Kingdom, the Netherlands Organisation for Scientific Research, and the National Research Council of Canada.
Electrophoretic analysis of cyanide depletion by Pseudomonas alcaligenes.
Zaugg, S E; Davidson, R A; Walker, J C; Walker, E B
1997-02-01
Bacterial-facilitated depletion of cyanide is under development for remediation of heap leach operations in the gold mining industry. Capillary electrophoresis was found to be a powerful tool for quantifying cyanide depletion. Changes in cyanide concentration in aqueous suspensions of Pseudomonas alcaligenes bacteria and cyanide at elevated pH were easily monitored by capillary electrophoresis. The resulting data can be used to study rates of cyanide depletion by this strain of bacteria. Concentrations of these bacteria at 10(5) cells/mL were found to reduce cyanide from 100 ppm to less than 8 ppm in four days. In addition, other ions of interest in cyanide metabolism, such as formate, can be simultaneously analyzed. Direct UV detection of cyanide at 192 nm further simplifies the analytical method for these ions.
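For a sense of scale, if the reported drop from 100 ppm to below 8 ppm over four days is treated, purely as an illustrative assumption rather than a claim of the paper, as first-order kinetics, the apparent rate constant follows directly:

```python
import math

# Abstract values: cyanide falls from 100 ppm to below 8 ppm in four days.
# Assuming first-order depletion (an illustration, not the paper's kinetic model):
c0, c, t_days = 100.0, 8.0, 4.0
k = math.log(c0 / c) / t_days                 # apparent rate constant [1/day]
print(f"k ≈ {k:.2f} per day, half-life ≈ {math.log(2) / k:.2f} days")
```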
Lee, Hye Min; Gupta, Ravi; Kim, Sun Hyung; Wang, Yiming; Rakwal, Randeep; Agrawal, Ganesh Kumar; Kim, Sun Tae
2015-05-01
High-abundance proteins (HAPs) hamper in-depth proteome study necessitating development of a HAPs depletion method. Here, we report a novel ethanol precipitation method (EPM) for HAPs depletion from total tuber proteins. Ethanol showed a dose-dependent effect on depletion of sporamin from sweet potato and patatin from potato tubers, respectively. The 50% ethanol was an optimal concentration. 2DE analysis of EPM-prepared sweet potato proteins also revealed enrichment of storage proteins (SPs) in ethanol supernatant (ES) resulting in detection of new low-abundance proteins in ethanol pellet (EP), compared to total fraction. The ES fraction showed even higher trypsin inhibitor activity than total proteins, further showing the efficacy of EPM in enrichment of sporamin in ES fraction. Application of this method was demonstrated for comparative proteomics of two sweet potato cultivars (Hwang-geum and Ho-bac) and purification of SP (sporamin) in its native form, as examples. Comparative proteomics identified many cultivar specific protein spots and selected spots were confidently assigned for their protein identity using MALDI-TOF-TOF analysis. Overall, the EPM is simple, reproducible, and economical for depletion of SPs and is suitable for downstream proteomics study. This study opens a door for its potential application to other tuber crops or fruits rich in carbohydrates. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Laberge, Monique; Huang, Qing; Schweitzer-Stenner, Reinhard; Fidy, Judit
2003-01-01
Horseradish peroxidase C (HRPC) binds 2 mol calcium per mol of enzyme with binding sites located distal and proximal to the heme group. The effect of calcium depletion on the conformation of the heme was investigated by combining polarized resonance Raman dispersion spectroscopy with normal coordinate structural decomposition analysis of the hemes extracted from models of Ca2+-bound and Ca2+-depleted HRPC generated and equilibrated using molecular dynamics simulations. Results show that calcium removal causes reorientation of heme pocket residues. We propose that these rearrangements significantly affect both the in-plane and out-of-plane deformations of the heme. Analysis of the experimental depolarization ratios is clearly consistent with increased B1g- and B2g-type distortions in the Ca2+-depleted species, while the normal coordinate structural decomposition results are indicative of increased planarity for the heme of Ca2+-depleted HRPC and of significant changes in the relative contributions of three of the six lowest frequency deformations. Most noteworthy is the decrease of the strong saddling deformation that is typical of all peroxidases, and an increase in ruffling. Our results confirm previous work proposing that calcium is required to maintain the structural integrity of the heme, in that we show that the preferred geometry for catalysis is lost upon calcium depletion. PMID:12668462
A method for the measurement of protein thiols (PrSH), un-reacted as well as oxidized, i.e. dithiothreitol recoverable, was adapted for the determination of PrSH depletion in isolated rainbow trout hepatocytes exposed to an arylating agent, 1,4-benzoquinone (BQ). Toxicant analysi...
Bayesian analysis of multimethod ego-depletion studies favours the null hypothesis.
Etherton, Joseph L; Osborne, Randall; Stephenson, Katelyn; Grace, Morgan; Jones, Chas; De Nadai, Alessandro S
2018-04-01
Ego-depletion refers to the purported decrease in performance on a task requiring self-control after engaging in a previous task involving self-control, with self-control proposed to be a limited resource. Despite many published studies consistent with this hypothesis, recurrent null findings within our laboratory and indications of publication bias have called into question the validity of the depletion effect. This project used three depletion protocols, involving three different depleting initial tasks followed by three different self-control tasks as dependent measures (total n = 840). For each method, effect sizes were not significantly different from zero. When data were aggregated across the three different methods and examined meta-analytically, the pooled effect size was not significantly different from zero (for all priors evaluated, Hedges' g = 0.10 with 95% credibility interval of [-0.05, 0.24]) and Bayes factors reflected strong support for the null hypothesis (Bayes factor > 25 for all priors evaluated). © 2018 The British Psychological Society.
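The pooled estimate quoted above combines the per-protocol effect sizes. A minimal sketch of fixed-effect inverse-variance pooling is shown below; the per-method Hedges' g values and standard errors are hypothetical placeholders, not the study's actual estimates, and the study's Bayesian analysis over multiple priors is not reproduced.

```python
import numpy as np

# Hypothetical per-method Hedges' g values and standard errors (illustration only)
g  = np.array([0.05, 0.12, 0.13])
se = np.array([0.12, 0.11, 0.12])

# Fixed-effect (inverse-variance) meta-analytic pooling
w = 1.0 / se**2
g_pooled = np.sum(w * g) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))
ci = (g_pooled - 1.96 * se_pooled, g_pooled + 1.96 * se_pooled)
print(f"pooled g = {g_pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```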
Methylphenidate blocks effort-induced depletion of regulatory control in healthy volunteers.
Sripada, Chandra; Kessler, Daniel; Jonides, John
2014-06-01
A recent wave of studies--more than 100 conducted over the last decade--has shown that exerting effort at controlling impulses or behavioral tendencies leaves a person depleted and less able to engage in subsequent rounds of regulation. Regulatory depletion is thought to play an important role in everyday problems (e.g., excessive spending, overeating) as well as psychiatric conditions, but its neurophysiological basis is poorly understood. Using a placebo-controlled, double-blind design, we demonstrated that the psychostimulant methylphenidate (commonly known as Ritalin), a catecholamine reuptake blocker that increases dopamine and norepinephrine at the synaptic cleft, fully blocks effort-induced depletion of regulatory control. Spectral analysis of trial-by-trial reaction times revealed specificity of methylphenidate effects on regulatory depletion in the slow-4 frequency band. This band is associated with the operation of resting-state brain networks that produce mind wandering, which raises potential connections between our results and recent brain-network-based models of control over attention. © The Author(s) 2014.
Myopathic mtDNA Depletion Syndrome Due to Mutation in TK2 Gene.
Martín-Hernández, Elena; García-Silva, María Teresa; Quijada-Fraile, Pilar; Rodríguez-García, María Elena; Rivera, Henry; Hernández-Laín, Aurelio; Coca-Robinot, David; Fernández-Toral, Joaquín; Arenas, Joaquín; Martín, Miguel A; Martínez-Azorín, Francisco
2017-01-01
Whole-exome sequencing was used to identify the disease gene(s) in a Spanish girl with failure to thrive, muscle weakness, mild facial weakness, elevated creatine kinase, deficiency of mitochondrial complex III, and depletion of mtDNA. With the whole-exome sequencing data, it was possible to obtain the complete mtDNA sequence and rule out any pathogenic variant in this genome. The analysis of the whole exome uncovered a homozygous pathogenic mutation in the thymidine kinase 2 gene (TK2; NM_004614.4:c.323C>T, p.T108M). TK2 mutations have been identified mainly in patients with the myopathic form of mtDNA depletion syndromes. This patient presents an atypical TK2-related myopathic form of mtDNA depletion syndrome, because despite having a very low mtDNA content (<20%), she presents a slower and less severe evolution of the disease. In conclusion, our data confirm the role of the TK2 gene in mtDNA depletion syndromes and expand the phenotypic spectrum.
Hardening neutron spectrum for advanced actinide transmutation experiments in the ATR.
Chang, G S; Ambrosek, R G
2005-01-01
The most effective method for transmuting long-lived isotopes contained in spent nuclear fuel into shorter-lived fission products is in a fast neutron spectrum reactor. In the absence of a fast test reactor in the United States, initial irradiation testing of candidate fuels can be performed in a thermal test reactor that has been modified to produce a test region with a hardened neutron spectrum. Such a test facility, with a spectrum similar to, but somewhat softer than, that of the liquid-metal fast breeder reactor (LMFBR), has been constructed in the INEEL's Advanced Test Reactor (ATR). The radial fission power distribution of the actinide fuel pin, which is an important parameter in fission gas release modelling, needs to be accurately predicted, and the hardened neutron spectrum in the ATR is therefore compared with the LMFBR fast neutron spectrum. The comparison analyses in this study are performed using MCWO, a well-developed tool that couples the Monte Carlo transport code MCNP with the isotope depletion and build-up code ORIGEN-2. MCWO analysis yields time-dependent and neutron-spectrum-dependent minor actinide and Pu concentrations and detailed radial fission power profile calculations for a typical fast reactor (LMFBR) neutron spectrum and the hardened neutron spectrum test region in the ATR. The MCWO-calculated results indicate that the cadmium basket used in the advanced fuel test assembly in the ATR can effectively depress the linear heat generation rate in the experimental fuels and harden the neutron spectrum in the test region.
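The coupling pattern used by tools of this kind, alternating a transport solution (flux and spectrum) with a depletion step (isotope update), can be sketched in a few lines. The functions below are stand-ins for illustration only; they are not the MCNP or ORIGEN-2 interfaces driven by MCWO, and the one-group, single-nuclide burnout model and all numbers are hypothetical.

```python
import numpy as np

def transport_solve(n_fissile):
    """Stand-in for a transport solution: return a one-group flux [n/cm^2/s]
    that scales weakly with the remaining fissile content."""
    return 3e14 * (n_fissile / 1e21) ** 0.1

def deplete(n_fissile, flux, dt, sigma_a=1.0e-24):
    """Stand-in for a depletion step: exponential burnout of one fissile
    nuclide with a one-group absorption cross section over a step dt [s]."""
    return n_fissile * np.exp(-sigma_a * flux * dt)

n = 1.0e21          # fissile atom density [atoms/cm^3] (hypothetical)
dt = 30 * 86400.0   # 30-day burn steps
for step in range(6):
    phi = transport_solve(n)   # spectrum/flux from the transport code
    n = deplete(n, phi, dt)    # isotope update from the depletion code
    print(f"step {step}: flux = {phi:.3e}, N_fissile = {n:.3e}")
```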
New Approach For Prediction Groundwater Depletion
NASA Astrophysics Data System (ADS)
Moustafa, Mahmoud
2017-01-01
Current approaches to quantifying groundwater depletion involve water balance and satellite gravity. However, the water balance technique includes uncertain estimation of parameters such as evapotranspiration and runoff. The satellite method consumes time and effort. The work reported in this paper proposes using failure theory in a novel way to predict groundwater saturated thickness depletion. An important issue in the proposed failure theory is to determine the failure point (depletion case). The proposed technique uses depth of water, as the net result of recharge/discharge processes in the aquifer, to calculate the remaining saturated thickness resulting from the applied pumping rates in an area and thereby evaluate the groundwater depletion. The Weibull function and Bayesian analysis were used to model and analyze data collected from 1962 to 2009. The proposed methodology was tested in a nonrenewable aquifer with no recharge. Consequently, the continuous decline in water depth has been the main criterion used to estimate the depletion. The value of the proposed approach is to predict the probable effect of the current applied pumping rates on the saturated thickness based on the remaining saturated thickness data. The limitation of the suggested approach is that it assumes the applied management practices are constant during the prediction period. The study predicted that after 300 years there would be an 80% probability that the saturated aquifer would be depleted. Lifetime (failure) theory can thus offer a simple alternative way to predict the depletion of the remaining saturated thickness without time-consuming processes or sophisticated software.
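A minimal sketch of the lifetime-analysis idea is given below. It assumes a hypothetical remaining saturated thickness and an uncertain constant decline rate (standing in for the 1962-2009 water-depth record, which is not reproduced here), fits a Weibull lifetime model to the implied times to depletion, and evaluates the probability of depletion at a given horizon.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(0)

# Hypothetical inputs for illustration only: remaining saturated thickness and
# an uncertain net decline rate under the current (assumed constant) pumping regime.
thickness0 = 120.0                                       # m
decline = rng.normal(loc=0.45, scale=0.12, size=5000)    # m/year
decline = decline[decline > 0]

# Time until the saturated thickness would be exhausted, then a Weibull lifetime fit
time_to_depletion = thickness0 / decline                 # years
shape, loc, scale = weibull_min.fit(time_to_depletion, floc=0.0)

t = 300.0   # prediction horizon quoted in the abstract [years]
p = weibull_min.cdf(t, shape, loc=loc, scale=scale)
print(f"Weibull shape = {shape:.2f}, scale = {scale:.0f} yr; P(depleted by {t:.0f} yr) = {p:.2f}")
```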
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rouxelin, Pascal Nicolas; Strydom, Gerhard
Best-estimate plus uncertainty analysis of reactors is replacing the traditional conservative (stacked uncertainty) method for safety and licensing analysis. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High temperature gas cooled reactors (HTGRs) have several features that require techniques not used in light-water reactor analysis (e.g., coated-particle design and large graphite quantities at high temperatures). The International Atomic Energy Agency has therefore launched the Coordinated Research Project on HTGR Uncertainty Analysis in Modeling to study uncertainty propagation in the HTGR analysis chain. The benchmark problem defined for the prismatic design is represented by the General Atomics Modular HTGR 350. The main focus of this report is the compilation and discussion of the results obtained for various permutations of Exercise I 2c and the use of the cross section data in Exercise II 1a of the prismatic benchmark, which is defined as the last and first steps of the lattice and core simulation phases, respectively. The report summarizes the Idaho National Laboratory (INL) best estimate results obtained for Exercise I 2a (fresh single-fuel block), Exercise I 2b (depleted single-fuel block), and Exercise I 2c (super cell) in addition to the first results of an investigation into the cross section generation effects for the super-cell problem. The two dimensional deterministic code known as the New ESC based Weighting Transport (NEWT) included in the Standardized Computer Analyses for Licensing Evaluation (SCALE) 6.1.2 package was used for the cross section evaluation, and the results obtained were compared to the three dimensional stochastic SCALE module KENO VI. The NEWT cross section libraries were generated for several permutations of the current benchmark super-cell geometry and were then provided as input to the Phase II core calculation of the stand alone neutronics Exercise II 1a. The steady state core calculations were simulated with the INL coupled-code system known as the Parallel and Highly Innovative Simulation for INL Code System (PHISICS) and the system thermal-hydraulics code known as the Reactor Excursion and Leak Analysis Program (RELAP) 5 3D using the nuclear data libraries previously generated with NEWT. It was observed that significant differences in terms of multiplication factor and neutron flux exist between the various permutations of the Phase I super-cell lattice calculations. The use of these cross section libraries only leads to minor changes in the Phase II core simulation results for fresh fuel but shows significantly larger discrepancies for spent fuel cores. Furthermore, large incongruities were found between the SCALE NEWT and KENO VI results for the super cells, and while some trends could be identified, a final conclusion on this issue could not yet be reached. This report will be revised in mid 2016 with more detailed analyses of the super-cell problems and their effects on the core models, using the latest version of SCALE (6.2). The super-cell models seem to show substantial improvements in terms of neutron flux as compared to single-block models, particularly at thermal energies.
Quantitative Hydrocarbon Surface Analysis
NASA Technical Reports Server (NTRS)
Douglas, Vonnie M.
2000-01-01
The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.
Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz
2018-01-18
In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-Diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.
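In the negligible-depletion regime described above, the amount extracted by the coating is proportional to the free (unbound) concentration, so protein binding can be estimated by comparing fiber uptake from the biofluid with uptake from a protein-free buffer spiked to the same total concentration. The numbers below are illustrative only and are not results from this study.

```python
# Negligible-depletion SPME estimate of the fraction unbound: fiber uptake from the
# biofluid divided by uptake from a protein-free buffer at the same total concentration.
# Hypothetical extracted amounts, for illustration only.
n_buffer = 1.25   # amount extracted from buffer [ng]
n_plasma = 0.30   # amount extracted from the protein-containing biofluid [ng]

f_unbound = n_plasma / n_buffer
print(f"fraction unbound ≈ {f_unbound:.2f}, protein binding ≈ {100 * (1 - f_unbound):.0f}%")
```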
Nagai, Yuri; Nogami, Satoru; Kumagai-Sano, Fumi; Ohya, Yoshikazu
2003-03-01
VMA1-derived endonuclease (VDE), a site-specific endonuclease in Saccharomyces cerevisiae, enters the nucleus to generate a double-strand break in the VDE-negative allelic locus, mediating the self-propagating gene conversion called homing. Although VDE is excluded from the nucleus in mitotic cells, it relocalizes at premeiosis, becoming localized in both the nucleus and the cytoplasm in meiosis. The nuclear localization of VDE is induced by inactivation of TOR kinases, which constitute central regulators of cell differentiation in S. cerevisiae, and by nutrient depletion. A functional genomic approach revealed that at least two karyopherins, Srp1p and Kap142p, are required for the nuclear localization pattern. Genetic and physical interactions between Srp1p and VDE imply direct involvement of karyopherin-mediated nuclear transport in this process. Inactivation of TOR signaling or acquisition of an extra nuclear localization signal in the VDE coding region leads to artificial nuclear localization of VDE and thereby induces homing even during mitosis. These results serve as evidence that VDE utilizes the host systems of nutrient signal transduction and nucleocytoplasmic transport to ensure the propagation of its coding region.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, X. G.; Kim, Y. S.; Choi, K. Y.
2012-07-01
An SBO (station blackout) experiment named SBO-01 was performed at the full-pressure IET (Integral Effect Test) facility ATLAS (Advanced Test Loop for Accident Simulation), which is scaled down from the APR1400 (Advanced Power Reactor 1400 MWe). In this study, the SBO-01 transient is discussed and subdivided into three phases: the SG fluid loss phase, the RCS fluid loss phase, and the core coolant depletion and core heatup phase. In addition, the typical phenomena in the SBO-01 test - SG dryout, natural circulation, core coolant boiling, pressurizer (PRZ) filling, and core heat-up - are identified. Furthermore, the SBO-01 test is reproduced by the MARS code calculation with the ATLAS model, which represents the ATLAS test facility. The experimental and calculated transients are then compared and discussed. The comparison reveals malfunctions of equipment: SG leakage through an SG MSSV and a measurement error of the loop flow meter. As the ATLAS model is validated against the experimental results, it can be further employed to investigate other possible SBO scenarios and to study the scaling distortions in the ATLAS. (authors)
NASA Astrophysics Data System (ADS)
Angeliu, Thomas M.; Was, Gary S.
1990-08-01
Grain boundary composition and carbide composition and structure were characterized for various microstructures of controlled purity alloy 690. Heat treatments produced varying degrees of grain boundary chromium depletion and precipitate distributions which were characterized via scanning transmission electron microscopy (STEM). Convergent beam electron diffraction revealed that the dominant carbide is M23C6, and energy dispersive X-ray analysis (EDAX) determined that the metallic content was about 90 at. pct chromium. A discontinuous precipitation reaction was observed and is attributed to a high degree of carbon supersaturation. Grain boundary composition measurements confirm that chromium depletion is controlled by volume diffusion of chromium to chromium-rich grain boundary carbides in the temperature range of 873 to 1073 K. Grain boundary chromium levels as low as 18.8 at. pct were obtained by thermal treatment at 873 K for 250 hours and 973 K for 1 hour. A thermodynamic and kinetic model developed for alloy 600 was modified to describe the development of the chromium depletion profile in alloy 690 during thermal treatment. Experimentally measured chromium profiles agree well with the model results for the dependence of the chromium depletion zone width and depth on various input parameters. The establishment of the model for alloy 690 allows the chromium depletion zone width and depth to be computed as a function of alloy composition, grain size, and temperature. The chromium depletion profiles and the precipitate structure and composition of controlled purity 690 are compared to those of controlled purity 600. A thermodynamic analysis of the carbide stability indicates that other factors, such as favorable orientation relationships, play an important role in controlling the precipitation of Cr23C6 in nickel-base alloys.
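The volume-diffusion-controlled depletion described above is often approximated, outside the authors' full thermodynamic and kinetic model, by a semi-infinite error-function profile with a fixed carbide/matrix interface concentration. The sketch below uses that textbook solution; the bulk chromium content and diffusivity are hypothetical illustration values, and only the 18.8 at. pct interface value and the 250 h at 873 K treatment are taken from the abstract.

```python
import numpy as np
from scipy.special import erf

def cr_profile(x, t, c_bulk=30.0, c_gb=18.8, D=1.0e-21):
    """Chromium concentration [at.%] at distance x [m] from the grain boundary
    after time t [s], assuming semi-infinite volume-diffusion-controlled
    depletion toward a grain-boundary carbide with fixed interface
    concentration c_gb. Textbook erf solution; c_bulk and D are hypothetical."""
    return c_gb + (c_bulk - c_gb) * erf(x / (2.0 * np.sqrt(D * t)))

t = 250 * 3600.0   # 250 h at 873 K, as in the abstract
for x_nm in (0, 10, 50, 100, 500):
    print(f"x = {x_nm:4d} nm : Cr = {cr_profile(x_nm * 1e-9, t):.1f} at.%")
```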
Wind Turbine Blade Design System - Aerodynamic and Structural Analysis
NASA Astrophysics Data System (ADS)
Dey, Soumitr
2011-12-01
The ever-increasing need for energy and the depletion of non-renewable energy resources have led to more advancement in the "Green Energy" field, including wind energy. An improvement in the performance of a wind turbine will enhance its economic viability, which can be achieved by better aerodynamic designs. In the present study, a design system that has been under development for gas turbine turbomachinery has been modified for designing wind turbine blades. This is a very different approach for wind turbine blade design, but it allows the design to benefit from the geometry flexibility and broad design space inherent in the presented system. It starts with key overall design parameters and a low-fidelity model that is used to create the initial geometry parameters. The low-fidelity system includes the axisymmetric solver with loss models, T-Axi (Turbomachinery-AXIsymmetric), the MISES blade-to-blade solver, and the 2D wing analysis code XFLR5. The geometry parameters are used to define sections along the span of the blade and are connected to the CAD model of the wind turbine blade through CAPRI (Computational Analysis PRogramming Interface), a CAD-neutral API that facilitates the use of parametric geometry definition with CAD. Either the sections or the CAD geometry is then available for CFD and Finite Element Analysis. The GE 1.5sle MW wind turbine and the NREL NASA Phase VI wind turbine have been used as test cases. Details of the design system application are described, and the resulting wind turbine geometry and conditions are compared to the published results of the GE and NREL wind turbines. The 2D wing analysis code XFLR5 is used to compare results from the 2D analysis with the blade-to-blade analysis and the 3D CFD analysis. This comparison indicates that from the hub to 25% of the span, blade-to-blade (cascade) effects have to be considered; from 25% to 75%, the blade acts as a 2D wing; and from 75% to the tip, 3D and tip effects have to be taken into account for design considerations. In addition, the benefits of this approach for wind turbine design and future efforts are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jung, Y. S.; Joo, H. G.; Yoon, J. I.
The nTRACER direct whole core transport code, employing the planar MOC solution based 3-D calculation method, the subgroup method for resonance treatment, the Krylov matrix exponential method for depletion, and a subchannel thermal/hydraulic calculation solver, was developed for practical high-fidelity simulation of power reactors. Its accuracy and performance are verified by comparison with the measurement data obtained for three pressurized water reactor cores. It is demonstrated that accurate and detailed multi-physics simulation of power reactors is practically realizable without any prior calculations or adjustments. (authors)
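The matrix-exponential depletion step mentioned above solves the Bateman equations dN/dt = A N over a time step by evaluating N(t+Δt) = exp(AΔt) N(t). A minimal sketch on a toy three-nuclide chain is given below; this is the generic method rather than nTRACER's Krylov implementation, and all cross sections, decay constants, and number densities are hypothetical.

```python
import numpy as np
from scipy.linalg import expm

# Toy 3-nuclide chain A -> B -> C, transmuted by a constant one-group flux.
phi = 3.0e14                                 # flux [n/cm^2/s]
sigma = np.array([5.0e-24, 2.0e-24, 0.0])    # one-group absorption cross sections [cm^2]
lam = np.array([0.0, 1.0e-7, 0.0])           # decay constants [1/s]

# Burnup matrix: total losses on the diagonal, production from the parent on the sub-diagonal.
A = np.diag(-(sigma * phi + lam))
A[1, 0] = sigma[0] * phi
A[2, 1] = sigma[1] * phi + lam[1]

N0 = np.array([1.0e21, 0.0, 0.0])   # initial number densities [atoms/cm^3]
dt = 30 * 86400.0                   # 30-day depletion step
N = expm(A * dt) @ N0               # matrix-exponential solution of dN/dt = A N
print("N after one step:", N)
```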
Solar powered multipurpose remotely powered aircraft
NASA Technical Reports Server (NTRS)
Alexandrou, A. N.; Durgin, W. W.; Cohn, R. F.; Olinger, D. J.; Cody, Charlotte K.; Chan, Agnes; Cheung, Kwok-Hung; Conley, Kristin; Crivelli, Paul M.; Javorski, Christian T.
1992-01-01
Increasing energy demands coupled with the rapid depletion of natural energy resources have made solar energy an attractive alternative source of power. The focus was to design and construct a solar powered, remotely piloted vehicle to demonstrate the feasibility of solar energy as an effective, alternate source of power. The final design focused on minimizing the power requirements and maximizing the strength-to-weight and lift-to-drag ratios. Given the design constraints, Surya (the code name given to the aircraft) is a lightweight aircraft built primarily from composite materials and capable of achieving level flight powered entirely by solar energy.
Aberrations in stimulated emission depletion (STED) microscopy
NASA Astrophysics Data System (ADS)
Antonello, Jacopo; Burke, Daniel; Booth, Martin J.
2017-12-01
Like all methods of super-resolution microscopy, stimulated emission depletion (STED) microscopy can suffer from the effects of aberrations. The most important aspect of a STED microscope is that the depletion focus maintains a minimum, ideally zero, intensity point that is surrounded by a region of higher intensity. It follows that aberrations that cause a non-zero value of this minimum intensity are the most detrimental, as they inhibit fluorescence emission even at the centre of the depletion focus. We present analysis that elucidates the nature of these effects in terms of the different polarisation components at the focus for two-dimensional and three-dimensional STED resolution enhancement. It is found that only certain low-order aberration modes can affect the minimum intensity at the Gaussian focus. This has important consequences for the design of adaptive optics aberration correction systems.
Efficient ultrafiltration-based protocol to deplete extracellular vesicles from fetal bovine serum
Kornilov, Roman; Puhka, Maija; Mannerström, Bettina; Hiidenmaa, Hanna; Peltoniemi, Hilkka; Siljander, Pia; Seppänen-Kaijansinkko, Riitta; Kaur, Sippy
2018-01-01
Fetal bovine serum (FBS) is the most commonly used supplement in studies involving cell-culture experiments. However, FBS contains large numbers of bovine extracellular vesicles (EVs), which hamper the analyses of secreted EVs from the cell type of preference and, thus, also the downstream analyses. Therefore, a prior elimination of EVs from FBS is crucial. However, the current methods of EV depletion by ultracentrifugation are cumbersome and the commercial alternatives expensive. In this study, our aim was to develop a protocol to completely deplete EVs from FBS, which may have wide applicability in cell-culture applications. We investigated different EV-depleted FBS prepared by our novel ultrafiltration-based protocol, by conventionally used overnight ultracentrifugation, or commercially available depleted FBS, and compared them with regular FBS. All sera were characterized by nanoparticle tracking analysis, electron microscopy, Western blotting and RNA quantification. Next, adipose-tissue mesenchymal stem cells (AT-MSCs) and cancer cells were grown in the media supplemented with the three different EV-depleted FBS and compared with cells grown in regular FBS media to assess the effects on cell proliferation, stress, differentiation and EV production. The novel ultrafiltration-based protocol depleted EVs from FBS clearly more efficiently than ultracentrifugation and commercial methods. Cell proliferation, stress, differentiation and EV production of AT-MSCs and cancer cell lines were similarly maintained in all three EV-depleted FBS media up to 96 h. In summary, our ultrafiltration protocol efficiently depletes EVs, is easy to use and maintains cell growth and metabolism. Since the method is also cost-effective and easy to standardize, it could be used in a wide range of cell-culture applications helping to increase comparability of EV research results between laboratories. PMID:29410778
de Jesus, Jemmyson Romário; da Silva Fernandes, Rafael; de Souza Pessôa, Gustavo; Raimundo, Ivo Milton; Arruda, Marco Aurélio Zezzi
2017-08-01
The efficiency of three different depletion methods for removing the most abundant proteins, thereby enriching the low-abundance human serum proteins, is evaluated in order to make the search for and discovery of biomarkers more efficient. These methods use magnetic nanoparticles (MNPs), chemical reagents (sequential application of dithiothreitol and acetonitrile, DTT/ACN), and a commercial apparatus based on immunoaffinity (ProteoMiner, PM). The comparison between methods shows significant removal of abundant proteins, which remain in the supernatant at concentrations of 4.6±0.2, 3.6±0.1, and 3.3±0.2 µg µL⁻¹ (n=3) for MNPs, DTT/ACN, and PM, respectively, from a total protein content of 54 µg µL⁻¹. Using GeLC-MS/MS analysis, MNP depletion shows good efficiency in removing high molecular weight proteins (>80 kDa). Due to the synergic effect between the reagents DTT and ACN, DTT/ACN-based depletion offers good performance in the depletion of thiol-rich proteins, such as albumin and transferrin (DTT action), as well as of high molecular weight proteins (ACN action). Furthermore, PM equalization confirms its efficiency in concentrating low-abundance proteins, decreasing the dynamic range of protein levels in human serum. Direct comparison between the treatments reveals 72 proteins identified when using MNP depletion (43 of them exclusively by this method), but only 20 proteins using DTT/ACN (seven exclusively by this method). Additionally, after PM treatment 30 proteins were identified, seven exclusively by this method. Thus, MNP and DTT/ACN depletion can be simple, quick, cheap, and robust alternatives for immunochemistry-based protein depletion, providing a potential strategy in the search for disease biomarkers. Copyright © 2017 Elsevier B.V. All rights reserved.
Simulation of stimulated Brillouin scattering and stimulated Raman scattering in shock ignition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hao, L.; Li, J.; Liu, W. D.
2016-04-15
We study stimulated Brillouin scattering (SBS) and stimulated Raman scattering (SRS) in shock ignition by comparing fluid and particle-in-cell (PIC) simulations. Under typical parameters for the OMEGA experiments [Theobald et al., Phys. Plasmas 19, 102706 (2012)], a series of 1D fluid simulations with laser intensities ranging between 2 × 10{sup 15} and 2 × 10{sup 16} W/cm{sup 2} finds that SBS is the dominant instability, which increases significantly with the incident intensity. Strong pump depletion caused by SBS and SRS limits the transmitted intensity at 0.17n{sub c} to be less than 3.5 × 10{sup 15} W/cm{sup 2}. The PIC simulations show similar physics but with higher saturation levels for SBS and SRS convective modes and stronger pump depletion due to higher seed levels for the electromagnetic fields in PIC codes. Plasma flow profiles are found to be important in proper modeling of SBS and limiting its reflectivity in both the fluid and PIC simulations.
Long Non-coding RNA, PANDA, Contributes to the Stabilization of p53 Tumor Suppressor Protein.
Kotake, Yojiro; Kitagawa, Kyoko; Ohhata, Tatsuya; Sakai, Satoshi; Uchida, Chiharu; Niida, Hiroyuki; Naemura, Madoka; Kitagawa, Masatoshi
2016-04-01
P21-associated noncoding RNA DNA damage-activated (PANDA) is induced in response to DNA damage and represses apoptosis by inhibiting the function of the nuclear transcription factor Y subunit alpha (NF-YA) transcription factor. Herein, we report that PANDA affects regulation of the p53 tumor-suppressor protein. U2OS cells were transfected with PANDA siRNAs. At 72 h post-transfection, cells were subjected to immunoblotting and quantitative reverse transcription-polymerase chain reaction. Depletion of PANDA was associated with decreased levels of p53 protein, but not p53 mRNA. The stability of p53 protein was markedly reduced by PANDA silencing. Degradation of p53 protein by silencing PANDA was prevented by treatment with MG132, a proteasome inhibitor. Moreover, depletion of PANDA prevented the accumulation of p53 protein induced by DNA damage from the genotoxic agent etoposide. These results suggest that PANDA stabilizes p53 protein in response to DNA damage, and provide new insight into the regulatory mechanisms of p53. Copyright© 2016 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.
Simple Model of Macroscopic Instability in XeCl Discharge Pumped Lasers
NASA Astrophysics Data System (ADS)
Belasri, Ahmed; Harrache, Zoheir
2003-10-01
The aim of this work is to study the development of macroscopic nonuniformity in the electron density of high-pressure discharges for excimer lasers, and eventually its propagation, due to the kinetic phenomena of the medium. The study uses a transverse one-dimensional model in which the plasma is represented by a set of resistances in parallel. The model is implemented in a numerical code comprising three strongly coupled parts: the electric circuit equations, the electron Boltzmann equation, and the kinetics equations (chemical kinetics model). The time variations of the electron density in each plasma element are obtained by solving a set of ordinary differential equations describing the plasma kinetics and the external circuit. The present model gives a good understanding of the halogen depletion phenomenon, which is the principal cause of laser pulse termination, and allows a simple study of large-scale nonuniformity in the preionization density and its effects on the electrical and chemical properties of the plasma. The results clearly indicate that about 50% of the halogen is consumed by the end of the pulse. Key words: excimer laser, XeCl, modeling, cold plasma, kinetics, halogen depletion, macroscopic instability.
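To illustrate the kind of coupled kinetics integration described (time stepping of ordinary differential equations for species densities), the following minimal C++ sketch integrates a two-equation surrogate in which electrons are produced by ionization and lost by attachment to a halogen donor, which is thereby depleted. All species, rate coefficients, and values are placeholders for illustration only, not the model or data of the paper.

#include <cstdio>

// Illustrative only: 0-D two-equation kinetics sketch, forward-Euler integration.
// n_e grows by ionization and is lost by attachment to the halogen donor n_F,
// which is consumed in the process (placeholder rate coefficients throughout).
int main() {
    double n_e = 1.0e7;            // electron density [cm^-3] (placeholder seed)
    double n_F = 1.0e17;           // halogen donor density [cm^-3] (placeholder)
    const double nu_ion = 1.2e8;   // effective ionization frequency [s^-1] (placeholder)
    const double k_att  = 1.0e-9;  // attachment rate coefficient [cm^3 s^-1] (placeholder)
    const double dt = 1.0e-12;     // time step [s]
    for (int i = 0; i <= 100000; ++i) {            // integrate over 100 ns
        if (i % 20000 == 0)
            std::printf("t = %5.1f ns   n_e = %.3e   n_F = %.3e\n",
                        i * dt * 1e9, n_e, n_F);
        const double loss = k_att * n_e * n_F;      // attachment consumes electrons and halogen
        n_e += (nu_ion * n_e - loss) * dt;
        n_F += (-loss) * dt;
    }
    return 0;
}

A production kinetics solver for a stiff chemistry set would of course use an implicit or adaptive integrator rather than fixed-step forward Euler; the sketch only shows the structure of the coupled update.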
An RCM-E simulation of a steady magnetospheric convection event
NASA Astrophysics Data System (ADS)
Yang, J.; Toffoletto, F.; Wolf, R.; Song, Y.
2009-12-01
We present simulation results of an idealized steady magnetospheric convection (SMC) event using the Rice Convection Model coupled with an equilibrium magnetic field solver (RCM-E). The event is modeled by placing a plasma distribution with substantially depleted entropy parameter PV5/3 on the RCM's high-latitude boundary. The calculated magnetic field shows a highly depressed configuration due to the enhanced westward current around geosynchronous orbit, where the resulting partial ring current is stronger and more symmetric than in a typical substorm growth phase. The magnitude of the BZ component in the mid plasma sheet is large compared to empirical magnetic field models. Contrary to some previous results, there is no deep BZ minimum in the near-Earth plasma sheet. This suggests that the magnetosphere could transition into a strong adiabatic earthward convection mode without significant stretching of the plasma-sheet magnetic field, when flux tubes with depleted plasma content continuously enter the inner magnetosphere from the mid-tail. Virtual AU/AL and Dst indices are also calculated using a synthetic magnetogram code and are compared to typical features in published observations.
Calcium inputs and transport in a base-poor forest ecosystem as interpreted by Sr isotopes
Scott W. Bailey; James W. Hornbeck; Charles T. Driscoll; Henri E. Gaudette
1996-01-01
Depletion of Ca in forests and its effects on forest health are poorly quantified. Depletion has been difficult to document due to limitations in determining rates at which Ca becomes available for ecosystem processes through weathering, and difficulty in determining changes in ecosystem storage. We coupled a detailed analysis of Sr isotopic composition with a mass...
NASA Technical Reports Server (NTRS)
Adams, Gregory R.; Baldwin, Kenneth M.
1995-01-01
This study was designed to test the hypothesis that myosin heavy chain (MHC) plasticity resulting from creatine depletion is an age-dependent process. At weaning (age 28 days), rat pups were placed on either standard rat chow (normal diet juvenile group) or the same chow supplemented with 1% wt/wt of the creatine analogue beta-guanidinopropionic acid (creatine depletion juvenile (CDJ) group). Two groups of adult rats (age approximately 8 wk) were placed on the same diet regimens (normal diet adult and creatine depletion adult (CDA) groups). After 40 days (CDJ and normal diet juvenile groups) and 60 days (CDA and normal diet adult groups), animals were killed and several skeletal muscles were removed for analysis of creatine content or MHC distribution. In the CDJ group, creatine depletion (78%) was accompanied by significant shifts toward expression of slower MHC isoforms in two slow and three fast skeletal muscles. In contrast, creatine depletion in adult animals did not result in similar shifts toward slow MHC isoform expression in either muscle type. The results of this study indicate that there is a differential effect of creatine depletion on MHC transitions that appears to be age dependent. These results strongly suggest that investigators contemplating experimental designs involving the use of the creatine analogue beta-guanidinopropionic acid should consider the age of the animals to be used.
Comparison of Ultra-Conserved Elements in Drosophilids and Vertebrates
Makunin, Igor V.; Shloma, Viktor V.; Stephen, Stuart J.; Pheasant, Michael; Belyakin, Stepan N.
2013-01-01
Metazoan genomes contain many ultra-conserved elements (UCEs), long sequences identical between distant species. In this study we identified UCEs in drosophilid and vertebrate species with a similar level of phylogenetic divergence measured at protein-coding regions, and demonstrated that both the length and number of UCEs are larger in vertebrates. The proportion of non-exonic UCEs declines in distant drosophilids whilst an opposite trend was observed in vertebrates. We generated a set of 2,126 Sophophora UCEs by merging elements identified in several Drosophila species and compared these to the eutherian UCEs identified in placental mammals. In contrast to vertebrates, the Sophophora UCEs are depleted around transcription start sites. Analysis of 52,954 P-element, piggyBac and Minos insertions in the D. melanogaster genome revealed depletion of the P-element and piggyBac insertions in and around the Sophophora UCEs. We examined eleven fly strains with transposon insertions into the intergenic UCEs and identified associated phenotypes in five strains. Four insertions behave as recessive lethals, and in one case we observed a suppression of the marker gene within the transgene, presumably by silenced chromatin around the integration site. To confirm the lethality is caused by integration of transposons we performed a phenotype rescue experiment for two stocks and demonstrated that the excision of the transposons from the intergenic UCEs restores viability. Sequencing of DNA after the transposon excision in one fly strain with the restored viability revealed a 47 bp insertion at the original transposon integration site suggesting that the nature of the mutation is important for the appearance of the phenotype. Our results suggest that the UCEs in flies and vertebrates have both common and distinct features, and demonstrate that a significant proportion of intergenic Drosophila UCEs are sensitive to disruption. PMID:24349264
Two-dimensional numerical simulation of O-mode to Z-mode conversion in the ionosphere
NASA Astrophysics Data System (ADS)
Cannon, P. D.; Honary, F.; Borisov, N.
2016-03-01
Experiments in the illumination of the F region of the ionosphere via radio frequency waves polarized in the ordinary mode (O-mode) have revealed that the magnitude of artificial heating-induced effects depends strongly on the inclination angle of the pump beam, with a greater modification to the plasma observed when the heating beam is directed close to or along the magnetic zenith direction. Numerical simulations performed using a recently developed finite-difference time-domain (FDTD) code are used to investigate the contribution of the O-mode to Z-mode conversion process to this effect. The aspect angle dependence and angular size of the radio window for which conversion of an O-mode pump wave to the Z-mode occurs is simulated for a variety of plasma density profiles including 2-D linear gradients representative of large-scale plasma depletions, density-depleted plasma ducts, and periodic field-aligned irregularities. The angular shape of the conversion window is found to be strongly influenced by the background plasma profile. If the Z-mode wave is reflected, it can propagate back toward the O-mode reflection region leading to resonant enhancement of the electric field in this region. Simulation results presented in this paper demonstrate that this process can make a significant contribution to the magnitude of electron density depletion and temperature enhancement around the resonance height and contributes to a strong dependence of the magnitude of plasma perturbation with the direction of the pump wave.
Nanavaty, Vishal; Sandhu, Ranjodh; Jehi, Sanaa E; Pandya, Unnati M; Li, Bibo
2017-06-02
Trypanosoma brucei causes human African trypanosomiasis and regularly switches its major surface antigen, VSG, thereby evading the host's immune response. VSGs are monoallelically expressed from subtelomeric expression sites (ESs), and VSG switching exploits subtelomere plasticity. However, subtelomere integrity is essential for T. brucei viability. The telomeric transcript, TERRA, was detected in T. brucei previously. We now show that the active ES-adjacent telomere is transcribed. We find that TbRAP1, a telomere protein essential for VSG silencing, suppresses VSG gene conversion-mediated switching. Importantly, TbRAP1 depletion increases the TERRA level, which appears to result from longer read-through into the telomere downstream of the active ES. Depletion of TbRAP1 also results in more telomeric RNA:DNA hybrids and more double strand breaks (DSBs) at telomeres and subtelomeres. In TbRAP1-depleted cells, expression of excessive TbRNaseH1, which cleaves the RNA strand of the RNA:DNA hybrid, brought telomeric RNA:DNA hybrids, telomeric/subtelomeric DSBs and VSG switching frequency back to WT levels. Therefore, TbRAP1-regulated appropriate levels of TERRA and telomeric RNA:DNA hybrid are fundamental to subtelomere/telomere integrity. Our study revealed for the first time an important role of a long, non-coding RNA in antigenic variation and demonstrated a link between telomeric silencing and subtelomere/telomere integrity through TbRAP1-regulated telomere transcription. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Rotational Fourier tracking of diffusing polygons.
Mayoral, Kenny; Kennair, Terry P; Zhu, Xiaoming; Milazzo, James; Ngo, Kathy; Fryd, Michael M; Mason, Thomas G
2011-11-01
We use optical microscopy to measure the rotational Brownian motion of polygonal platelets that are dispersed in a liquid and confined by depletion attractions near a wall. The depletion attraction inhibits out-of-plane translational and rotational Brownian fluctuations, thereby facilitating in-plane imaging and video analysis. By taking fast Fourier transforms (FFTs) of the images and analyzing the angular position of rays in the FFTs, we determine an isolated particle's rotational trajectory, independent of its position. The measured in-plane rotational diffusion coefficients are significantly smaller than estimates for the bulk; this difference is likely due to the close proximity of the particles to the wall arising from the depletion attraction.
Toward Describing the Effects of Ozone Depletion on Marine Primary Productivity and Carbon Cycling
NASA Technical Reports Server (NTRS)
Cullen, John J.
1995-01-01
This project was aimed at improved predictions of the effects of UVB and ozone depletion on marine primary productivity and carbon flux. A principal objective was to incorporate a new analytical description of photosynthesis as a function of UV and photosynthetically available radiation (Cullen et al., Science 258:646) into a general oceanographic model. We made significant progress: new insights into the kinetics of photoinhibition were used in the analysis of experiments on Antarctic phytoplankton to generate a general model of UV-induced photoinhibition under the influence of ozone depletion and vertical mixing. The way has been paved for general models on a global scale.
NASA Astrophysics Data System (ADS)
Asoka-Kumar, P.; Leung, T. C.; Lynn, K. G.; Nielsen, B.; Forcier, M. P.; Weinberg, Z. A.; Rubloff, G. W.
1992-06-01
The centroid shifts of positron annihilation spectra are reported from the depletion regions of metal-oxide-semiconductor (MOS) capacitors at room temperature and at 35 K. The centroid shift measurement can be explained by the variation of the electric field strength and depletion layer thickness as a function of the applied gate bias. An estimate of the relevant MOS quantities is obtained by fitting the centroid shift versus beam energy data with a steady-state diffusion-annihilation equation and a derivative-Gaussian positron implantation profile. The inadequacy of the present analysis scheme is evident from the derived quantities, and alternative methods are required for better predictions.
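For reference, analyses of this kind commonly combine a steady-state positron diffusion-annihilation equation with a derivative-Gaussian (Makhovian, m = 2) implantation profile. A generic form is sketched below; the exact parameterization used in the paper may differ, and drift terms for the field in the depletion region are often added:

\[
D_{+}\,\frac{d^{2}n(z)}{dz^{2}}-\frac{n(z)}{\tau_{\mathrm{eff}}}+P(z,E)=0,
\qquad
P(z,E)=\frac{2z}{z_{0}^{2}}\exp\!\left[-\left(\frac{z}{z_{0}}\right)^{2}\right],
\]

where n(z) is the positron density, D+ the positron diffusion coefficient, tau_eff an effective lifetime, and z0 a scale depth that increases with the positron beam energy E.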
Hargreaves, P; Rahman, S; Guthrie, P; Taanman, J W; Leonard, J V; Land, J M; Heales, S J R
2002-02-01
Mitochondrial DNA (mtDNA) depletion syndrome (McKusick 251880) is characterized by a progressive quantitative loss of mtDNA resulting in severe mitochondrial dysfunction. A diagnosis of mtDNA depletion can only be confirmed after Southern blot analysis of affected tissue. Only a limited number of centres have the facilities to offer this service, and this is frequently on an irregular basis. There is therefore a need for a test that can refine sample selection as well as complement the molecular analysis. In this study we compared the activity of the nuclear-encoded succinate ubiquinone reductase (complex II) to the activities of the combined mitochondrial- and nuclear-encoded mitochondrial electron transport chain (ETC) complexes: NADH:ubiquinone reductase (complex I), ubiquinol-cytochrome-c reductase (complex III), and cytochrome-c oxidase (complex IV), in skeletal muscle biopsies from 7 patients with confirmed mtDNA depletion. In one patient there was no evidence of an ETC defect. However, the remaining 6 patients exhibited reduced complex I and IV activities. Five of these patients also displayed reduced complex II-III (succinate:cytochrome-c reductase) activity. Individual measurement of complex II and complex III activities demonstrated normal levels of complex II activity compared to complex III, which was reduced in the 5 biopsies assayed. These findings suggest a possible diagnostic value for the detection of normal complex II activity in conjunction with reduced complex I, III and IV activity in the identification of likely candidates for mtDNA depletion syndrome.
NASA Astrophysics Data System (ADS)
Tai, Kong Fai; Kamada, Rui; Yagioka, Takeshi; Kato, Takuya; Sugimoto, Hiroki
2017-08-01
A certified efficiency of 22.3% has been achieved for a Cu(In,Ga)(Se,S)2 solar cell. Compared to our previous record cell with 20.9% efficiency, the major breakthrough is the increased Voc, a benefit of the potassium treatment. A lower reverse saturation current and a longer carrier collection length deduced from electron-beam-induced current indicate that the degree of carrier recombination at the heterojunction and in the depletion region is lower for the 22.3% cell. Further characterization (capacitance-voltage profiling, temperature-dependent Voc, Suns-Voc) and analysis indicate that the recombination coefficients in all regions were reduced, especially at the interface and in the depletion region. Device simulation was performed assuming varying defect densities to model the current-voltage curve of the 22.3% cell. The best model was also used to estimate the achievable Voc if defect densities were further reduced. Furthermore, by using higher-bandgap Cd-free buffer layers, a higher Jsc was achieved, giving an in-house solar cell efficiency of 22.8%. Recombination analysis of the 22.8% cell indicates that the interface recombination is further reduced, but the recombination coefficient in the depletion region is higher, indicating that further improvement of depletion-region recombination could help achieve a higher Voc and therefore an efficiency beyond 23%.
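The link drawn between a lower reverse saturation current and a higher Voc follows the standard single-diode estimate (a textbook relation, not the specific device model of the paper):

\[
V_{\mathrm{oc}}\;\approx\;\frac{n k T}{q}\,\ln\!\left(\frac{J_{\mathrm{sc}}}{J_{0}}+1\right),
\]

so reducing the recombination-driven saturation current J0 at fixed Jsc raises the open-circuit voltage logarithmically.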
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zoller, J.N.; Rosen, R.S.; Holliday, M.A.
With the publication of a Request for Recommendations and Advance Notice of Intent in the November 10, 1994 Federal Register, the Department of Energy initiated a program to assess alternative strategies for the long-term management or use of depleted uranium hexafluoride. This Request was made to help ensure that, by seeking as many recommendations as possible, Department management considers reasonable options in the long-range management strategy. The Depleted Uranium Hexafluoride Management Program consists of three major program elements: Engineering Analysis, Cost Analysis, and an Environmental Impact Statement. This Technology Assessment Report is the first part of the Engineering Analysis Project, and assesses recommendations from interested persons, industry, and Government agencies for potential uses for the depleted uranium hexafluoride stored at the gaseous diffusion plants in Paducah, Kentucky, and Portsmouth, Ohio, and at the Oak Ridge Reservation in Tennessee. Technologies that could facilitate the long-term management of this material are also assessed. The purpose of the Technology Assessment Report is to present the results of the evaluation of these recommendations. Department management will decide which recommendations will receive further study and evaluation.
Quantitative phosphoproteomics reveals new roles for the protein phosphatase PP6 in mitotic cells.
Rusin, Scott F; Schlosser, Kate A; Adamo, Mark E; Kettenbach, Arminja N
2015-10-13
Protein phosphorylation is an important regulatory mechanism controlling mitotic progression. Protein phosphatase 6 (PP6) is an essential enzyme with conserved roles in chromosome segregation and spindle assembly from yeast to humans. We applied a baculovirus-mediated gene silencing approach to deplete HeLa cells of the catalytic subunit of PP6 (PP6c) and analyzed changes in the phosphoproteome and proteome in mitotic cells by quantitative mass spectrometry-based proteomics. We identified 408 phosphopeptides on 272 proteins that increased and 298 phosphopeptides on 220 proteins that decreased in phosphorylation upon PP6c depletion in mitotic cells. Motif analysis of the phosphorylated sites combined with bioinformatics pathway analysis revealed previously unknown PP6c-dependent regulatory pathways. Biochemical assays demonstrated that PP6c opposed casein kinase 2-dependent phosphorylation of the condensin I subunit NCAP-G, and cellular analysis showed that depletion of PP6c resulted in defects in chromosome condensation and segregation in anaphase, consistent with dysregulation of condensin I function in the absence of PP6 activity. Copyright © 2015, American Association for the Advancement of Science.
Snail1 transcription factor controls telomere transcription and integrity.
Mazzolini, Rocco; Gonzàlez, Núria; Garcia-Garijo, Andrea; Millanes-Romero, Alba; Peiró, Sandra; Smith, Susan; García de Herreros, Antonio; Canudas, Sílvia
2018-01-09
Besides controlling epithelial-to-mesenchymal transition (EMT) and cell invasion, the Snail1 transcription factor also provides cells with cancer stem cell features. Since telomere maintenance is essential for stemness, we have examined the control of telomere integrity by Snail1. Fluorescence in situ hybridization (FISH) analysis indicates that Snail1-depleted mouse mesenchymal stem cells (MSC) show both a dramatic increase in telomere alterations and shorter telomeres. Remarkably, Snail1-deficient MSC present higher levels of both telomerase activity and the long non-coding RNA called telomeric repeat-containing RNA (TERRA), an RNA that controls telomere integrity. Accordingly, Snail1 expression downregulates expression of the telomerase gene (TERT) as well as of TERRA 2q, 11q and 18q. TERRA and TERT are transiently downregulated during TGFβ-induced EMT in NMuMG cells, correlating with Snail1 expression. Global transcriptome analysis indicates that ectopic expression of TERRA affects the transcription of some genes induced during EMT, such as fibronectin, whereas that of TERT does not modify those genes. We propose that Snail1 repression of TERRA is required not only for telomere maintenance but also for the expression of a subset of mesenchymal genes. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Conesa, Christine; Ruotolo, Roberta; Soularue, Pascal; Simms, Tiffany A; Donze, David; Sentenac, André; Dieci, Giorgio
2005-10-01
We used genome-wide expression analysis in Saccharomyces cerevisiae to explore whether and how the expression of protein-coding, RNA polymerase (Pol) II-transcribed genes is influenced by a decrease in RNA Pol III-dependent transcription. The Pol II transcriptome was characterized in four thermosensitive, slow-growth mutants affected in different components of the RNA Pol III transcription machinery. Unexpectedly, we found only a modest correlation between altered expression of Pol II-transcribed genes and their proximity to class III genes, a result also confirmed by the analysis of single tRNA gene deletants. Instead, the transcriptome of all of the four mutants was characterized by increased expression of genes known to be under the control of the Gcn4p transcriptional activator. Indeed, GCN4 was found to be translationally induced in the mutants, and deleting the GCN4 gene eliminated the response. The Gcn4p-dependent expression changes did not require the Gcn2 protein kinase and could be specifically counteracted by an increased gene dosage of initiator tRNA(Met). Initiator tRNA(Met) depletion thus triggers a GCN4-dependent reprogramming of genome expression in response to decreased Pol III transcription. Such an effect might represent a key element in the coordinated transcriptional response of yeast cells to environmental changes.
EASY-II Renaissance: n, p, d, α, γ-induced Inventory Code System
NASA Astrophysics Data System (ADS)
Sublet, J.-Ch.; Eastwood, J. W.; Morgan, J. G.
2014-04-01
The European Activation SYstem has been re-engineered and re-written in modern programming languages to answer today's and tomorrow's needs in terms of activation, transmutation, depletion, decay and processing of radioactive materials. The new FISPACT-II inventory code development project has allowed us to embed many more features in terms of energy range: up to GeV; incident particles: alpha, gamma, proton, deuteron and neutron; and neutron physics: self-shielding effects, temperature dependence and covariance, so as to cover all anticipated application needs: nuclear fission and fusion, accelerator physics, isotope production, stockpile and fuel cycle stewardship, materials characterization, and life and storage cycle management. In parallel, the maturity of modern, truly general-purpose libraries encompassing thousands of target isotopes, such as TENDL-2012, the evolution of the ENDF-6 format, and the capabilities of the latest generation of processing codes PREPRO, NJOY and CALENDF have allowed the activation code to be fed with more robust, complete and appropriate data: cross sections with covariance, probability tables in the resonance ranges, kerma, dpa, gas and radionuclide production, and 24 decay types. All such data for the five most important incident particles (n, p, d, α, γ) are placed in evaluated data files up to an incident energy of 200 MeV. The resulting code system, EASY-II, is designed as a functional replacement for the previous European Activation System, EASY-2010. It includes many new features and enhancements, and already benefits from feedback from the extensive validation and verification activities performed with its predecessor.
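For context, an inventory code of this kind integrates the coupled first-order rate equations for the nuclide number densities; a generic statement (not the specific formulation of EASY-II) is:

\[
\frac{dN_{i}}{dt}\;=\;\sum_{j\neq i}\bigl(\lambda_{j\rightarrow i}+\sigma_{j\rightarrow i}\,\phi\bigr)N_{j}\;-\;\bigl(\lambda_{i}+\sigma_{i}\,\phi\bigr)N_{i},
\]

where the lambda terms are decay constants (with branching into nuclide i), the sigma terms are flux-averaged reaction cross sections, and phi is the incident-particle flux.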
Army Net Zero Installation Initiative and Cost Benefit Analysis Activity
2011-10-31
...freshwater resources and returns water back to the same watershed so as not to deplete the groundwater and surface water resources of that region in quantity... Goals: reduce freshwater demand through water efficiency and conservation; access/develop alternate water sources to offset freshwater demand; develop...
Interstellar absorption lines in the spectrum of sigma Sco using Copernicus observations
NASA Technical Reports Server (NTRS)
Allen, M. M.; Snow, T. P.
1986-01-01
Since the launch of Copernicus in 1972, studies have been made of the depletion of gas-phase elements onto dust grains. A few stars have been studied in detail, resulting in a standard depletion pattern which has since been used for comparison. Recent developments, however, have suggested that this standard pattern may need to be re-examined. Some weak, semi-forbidden lines were detected recently which may be able to resolve some of the ambiguities. Studies of single elements have shown that the depletions of carbon and oxygen are much smaller than previously determined. The high-resolution ultraviolet spectral scans of sigma Sco were originally made in 1973, but have only recently been analyzed. All these stars are bright and moderately reddened. All four stars will be analyzed in detail, but sigma Sco is the first one completed. The data have broad coverage of ions, making these stars excellent candidates for the determination of accurate depletions. A profile-fitting analysis was used rather than curves of growth in order to determine separate abundances and depletions in components separated by several km/sec.
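Gas-phase depletions of this kind are conventionally quoted logarithmically with respect to a reference (typically solar) abundance; the standard definition is:

\[
\left[\mathrm{X}/\mathrm{H}\right]\;=\;\log_{10}\!\left(\frac{N(\mathrm{X})}{N(\mathrm{H})}\right)_{\mathrm{gas}}\;-\;\log_{10}\!\left(\frac{\mathrm{X}}{\mathrm{H}}\right)_{\odot},
\]

with negative values indicating that element X has been removed from the gas phase, presumably onto dust grains.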
NASA Astrophysics Data System (ADS)
Strahan, Susan E.; Douglass, Anne R.
2018-01-01
Attribution of Antarctic ozone recovery to the Montreal Protocol requires evidence that (1) Antarctic chlorine levels are declining and (2) there is a reduction in ozone depletion in response to a chlorine decline. We use Aura Microwave Limb Sounder measurements of O3, HCl, and N2O to demonstrate that inorganic chlorine (Cly) from 2013 to 2016 was 223 ± 93 parts per trillion lower in the Antarctic lower stratosphere than from 2004 to 2007 and that column ozone depletion declined in response. The mean Cly decline rate, 0.8%/yr, agrees with the expected rate based on chlorofluorocarbon lifetimes. N2O measurements are crucial for identifying changes in stratospheric Cly loading independent of dynamical variability. From 2005 to 2016, the ozone depletion and Cly time series show matching periods of decline, stability, and increase. The observed sensitivity of O3 depletion to changing Cly agrees with the sensitivity simulated by the Global Modeling Initiative chemistry transport model integrated with Modern Era Retrospective Analysis for Research and Applications 2 meteorology.
Accelerated SDS depletion from proteins by transmembrane electrophoresis: Impacts of Joule heating.
Unterlander, Nicole; Doucette, Alan Austin
2018-02-08
SDS plays a key role in proteomics workflows, including protein extraction, solubilization and mass-based separations (e.g. SDS-PAGE, GELFrEE). However, SDS interferes with mass spectrometry and so must be removed prior to analysis. We recently introduced an electrophoretic platform, termed transmembrane electrophoresis (TME), enabling extensive depletion of SDS from proteins in solution with exceptional protein yields. However, our prior TME runs required 1 h to complete, limited by Joule heating, which causes protein aggregation at higher operating currents. Here, we demonstrate effective strategies to maintain lower TME sample temperatures, permitting accelerated SDS depletion. Among these strategies, the use of a magnetic stir bar to continuously agitate a model protein system (BSA) allows SDS to be depleted below 100 ppm (>98% removal) within 10 min of TME operation, while maintaining exceptional protein recovery (>95%). Moreover, these modifications allow TME to operate without any user intervention, improving the throughput and robustness of the approach. Through fits of our time-course SDS depletion curves to an exponential model, we calculate SDS depletion half-lives as low as 1.2 min. This promising electrophoretic platform should provide proteomics researchers with an effective purification strategy to enable MS characterization of SDS-containing proteins. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
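The exponential model referred to has the usual first-order form, from which the quoted half-life follows directly (a standard relation, with the 1.2 min figure taken from the abstract):

\[
C(t)=C_{0}\,e^{-kt},\qquad t_{1/2}=\frac{\ln 2}{k},\qquad
t_{1/2}=1.2\ \mathrm{min}\;\Rightarrow\;k\approx 0.58\ \mathrm{min}^{-1}.
\]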
2016-01-01
Microglia are the primary immune cells of the brain and function in multiple ways to facilitate proper brain development. However, our current understanding of how these cells influence the later expression of normal behaviors is lacking. Using the laboratory rat, we administered liposomal clodronate centrally to selectively deplete microglia in the developing postnatal brain. We then assessed a range of developmental, juvenile, and adult behaviors. Liposomal clodronate treatment on postnatal days 0, 2, and 4 depleted microglia with recovery by about 10 days of age and induced a hyperlocomotive phenotype, observable in the second postnatal week. Temporary microglia depletion also increased juvenile locomotion in the open field test and decreased anxiety-like behaviors in the open field and elevated plus maze. These same rats displayed reductions in predator odor-induced avoidance behavior, but increased their risk assessment behaviors compared with vehicle-treated controls. In adulthood, postnatal microglia depletion resulted in significant deficits in male-specific sex behaviors. Using factor analysis, we identified two underlying traits, behavioral disinhibition and locomotion, as being significantly altered by postnatal microglia depletion. These findings further implicate microglia as being critically important to the development of juvenile and adult behavior. PMID:27957532
Kelsen, Judith R; Dawany, Noor; Moran, Christopher J; Petersen, Britt-Sabina; Sarmady, Mahdi; Sasson, Ariella; Pauly-Hubbard, Helen; Martinez, Alejandro; Maurer, Kelly; Soong, Joanne; Rappaport, Eric; Franke, Andre; Keller, Andreas; Winter, Harland S; Mamula, Petar; Piccoli, David; Artis, David; Sonnenberg, Gregory F; Daly, Mark; Sullivan, Kathleen E; Baldassano, Robert N; Devoto, Marcella
2015-11-01
Very early onset inflammatory bowel disease (VEO-IBD), IBD diagnosed at 5 years of age or younger, frequently presents with a different and more severe phenotype than older-onset IBD. We investigated whether patients with VEO-IBD carry rare or novel variants in genes associated with immunodeficiencies that might contribute to disease development. Patients with VEO-IBD and parents (when available) were recruited from the Children's Hospital of Philadelphia from March 2013 through July 2014. We analyzed DNA from 125 patients with VEO-IBD (age, 3 wk to 4 y) and 19 parents, 4 of whom also had IBD. Exome capture was performed by Agilent SureSelect V4, and sequencing was performed using the Illumina HiSeq platform. Alignment to human genome GRCh37 was achieved followed by postprocessing and variant calling. After functional annotation, candidate variants were analyzed for change in protein function, minor allele frequency less than 0.1%, and scaled combined annotation-dependent depletion scores of 10 or less. We focused on genes associated with primary immunodeficiencies and related pathways. An additional 210 exome samples from patients with pediatric IBD (n = 45) or adult-onset Crohn's disease (n = 20) and healthy individuals (controls, n = 145) were obtained from the University of Kiel, Germany, and used as control groups. Four hundred genes and regions associated with primary immunodeficiency, covering approximately 6500 coding exons totaling more than 1 Mbp of coding sequence, were selected from the whole-exome data. Our analysis showed novel and rare variants within these genes that could contribute to the development of VEO-IBD, including rare heterozygous missense variants in IL10RA and previously unidentified variants in MSH5 and CD19. In an exome sequence analysis of patients with VEO-IBD and their parents, we identified variants in genes that regulate B- and T-cell functions and could contribute to pathogenesis. Our analysis could lead to the identification of previously unidentified IBD-associated variants. Copyright © 2015 AGA Institute. Published by Elsevier Inc. All rights reserved.
Immunologic Control of Mus musculus Papillomavirus Type 1
Peng, Shiwen; Chang, Yung-Nien; Hung, Chien-Fu; Roden, Richard B. S.
2015-01-01
Persistent papillomas developed in ~10% of out-bred immune-competent SKH-1 mice following MusPV1 challenge of their tail, and in a similar fraction the papillomas were transient, suggesting potential as a model. However, papillomas only occurred in BALB/c or C57BL/6 mice depleted of T cells with anti-CD3 antibody, and they completely regressed within 8 weeks after depletion was stopped. Neither CD4+ nor CD8+ T cell depletion alone in BALB/c or C57BL/6 mice was sufficient to permit visible papilloma formation. However, low levels of MusPV1 were sporadically detected by either genomic DNA-specific PCR analysis of local skin swabs or in situ hybridization of the challenge site with an E6/E7 probe. After switching to CD3+ T cell depletion, papillomas appeared on 14/15 of mice that had been CD4+ T cell depleted throughout the challenge phase, on 1/15 of CD8+ T cell depleted mice, and on none of the mice without any prior T cell depletion. Both control animals and those depleted with CD8-specific antibody generated MusPV1 L1 capsid-specific antibodies, but not those depleted with CD4-specific antibody prior to T cell depletion with CD3 antibody. Thus, normal BALB/c or C57BL/6 mice eliminate the challenge dose, whereas infection is suppressed but not completely cleared if their CD4 or CD8 T cells are depleted, and recrudescence of MusPV1 is much greater in the former following treatment with CD3 antibody, possibly reflecting their failure to generate capsid antibody. Systemic vaccination of C57BL/6 mice with DNA vectors expressing MusPV1 E6 or E7 fused to calreticulin elicits potent CD8 T cell responses, and these immunodominant CD8 T cell epitopes were mapped. Adoptive transfer of a MusPV1 E6-specific CD8+ T cell line controlled established MusPV1 infection and papilloma in RAG1-knockout mice. These findings suggest the potential of immunotherapy for HPV-related disease and the importance of host immunogenetics in the outcome of infection. PMID:26495972
X-Antenna: A graphical interface for antenna analysis codes
NASA Technical Reports Server (NTRS)
Goldstein, B. L.; Newman, E. H.; Shamansky, H. T.
1995-01-01
This report serves as the user's manual for the X-Antenna code. X-Antenna is intended to simplify the analysis of antennas by giving the user graphical interfaces in which to enter all relevant antenna and analysis code data. Essentially, X-Antenna creates a Motif interface to the user's antenna analysis codes. A command-file allows new antennas and codes to be added to the application. The menu system and graphical interface screens are created dynamically to conform to the data in the command-file. Antenna data can be saved and retrieved from disk. X-Antenna checks all antenna and code values to ensure they are of the correct type, writes an output file, and runs the appropriate antenna analysis code. Volumetric pattern data may be viewed in 3D space with an external viewer run directly from the application. Currently, X-Antenna includes analysis codes for thin wire antennas (dipoles, loops, and helices), rectangular microstrip antennas, and thin slot antennas.
Advanced nodal neutron diffusion method with space-dependent cross sections: ILLICO-VX
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rajic, H.L.; Ougouag, A.M.
1987-01-01
Advanced transverse integrated nodal methods for neutron diffusion developed since the 1970s require that node- or assembly-homogenized cross sections be known. The underlying structural heterogeneity can be accurately accounted for in homogenization procedures by the use of heterogeneity or discontinuity factors. Other (milder) types of heterogeneity, burnup-induced or due to thermal-hydraulic feedback, can be resolved by explicitly accounting for the spatial variations of material properties. This can be done during the nodal computations via nonlinear iterations. The new method has been implemented in the code ILLICO-VX (ILLICO variable cross-section method). Numerous numerical tests were performed. As expected, the convergence rate of ILLICO-VX is lower than that of ILLICO, requiring approx. 30% more outer iterations per k{sub eff} computation. The methodology has also been implemented as the NOMAD-VX option of the NOMAD, multicycle, multigroup, two- and three-dimensional nodal diffusion depletion code. The burnup-induced heterogeneities (space dependence of cross sections) are calculated during the burnup steps.
Blanket activation and afterheat for the Compact Reversed-Field Pinch Reactor
NASA Astrophysics Data System (ADS)
Davidson, J. W.; Battat, M. E.
A detailed assessment has been made of the activation and afterheat for a Compact Reversed-Field Pinch Reactor (CRFPR) blanket using a two-dimensional model that included the limiter, the vacuum ducts, and the manifolds and headers for cooling the limiter and the first and second walls. Region-averaged multigroup fluxes and prompt gamma-ray/neutron heating rates were calculated using the two-dimensional, discrete-ordinates code TRISM. Activation and depletion calculations were performed with the code FORIG using one-group cross sections generated with the TRISM region-averaged fluxes. Afterheat calculations were performed for regions near the plasma (i.e., the limiter, first wall, etc.), assuming a 10-day irradiation. Decay heats were computed for decay periods up to 100 minutes. For the activation calculations, the irradiation period was taken to be one year, and blanket activity inventories were computed for decay times to 4 x 10 years. These activities were also calculated as the toxicity-weighted biological hazard potential (BHP).
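The one-group cross sections referred to are flux-weighted collapses of the region-averaged multigroup data; the standard prescription is:

\[
\sigma_{1g}\;=\;\frac{\sum_{g}\sigma_{g}\,\phi_{g}}{\sum_{g}\phi_{g}},
\]

where phi_g is the region-averaged flux in energy group g.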
Wang, Jing; Tergel, Tergel; Chen, Jianhua; Yang, Ju; Kang, Yan; Qi, Zhi
2015-02-01
Ecological evidence indicates a worldwide trend of dramatically decreased soil Ca(2+) levels caused by increased acid deposition and massive timber harvesting. Little is known about the genetic and cellular mechanisms of plants' responses to Ca(2+) depletion. In this study, transcriptional profiling analysis helped identify multiple extracellular Ca(2+) ([Ca(2+)]ext) depletion-responsive genes in Arabidopsis thaliana L., many of which are involved in responses to other environmental stresses. Interestingly, a group of genes encoding putative cytosolic Ca(2+) ([Ca(2+)]cyt) sensors were significantly upregulated, implying that [Ca(2+)]cyt has a role in sensing [Ca(2+)]ext depletion. Consistent with this observation, [Ca(2+)]ext depletion stimulated a transient rise in [Ca(2+)]cyt that was negatively influenced by [K(+)]ext, suggesting the involvement of a membrane potential-sensitive component. The [Ca(2+)]cyt response to [Ca(2+)]ext depletion was significantly desensitized after the initial treatment, which is typical of a receptor-mediated signaling event. The response was insensitive to an animal Ca(2+) sensor antagonist, but was suppressed by neomycin, an inhibitor of phospholipase C. Gd(3+), an inhibitor of Ca(2+) channels, suppressed the [Ca(2+)]ext-triggered rise in [Ca(2+)]cyt and downstream changes in gene expression. Taken together, this study demonstrates that [Ca(2+)]cyt plays an important role in the putative receptor-mediated cellular and transcriptional response of plant cells to [Ca(2+)]ext depletion. © 2014 Institute of Botany, Chinese Academy of Sciences.
NASA Astrophysics Data System (ADS)
Pesnell, W. Dean; Goldberg, Richard A.; Jackman, Charles H.; Chenette, D. L.; Gaines, E. E.
1999-01-01
Highly relativistic electron precipitation (HRE) events containing significant fluxes of electrons with E > 1 MeV have been predicted by models to deplete mesospheric ozone. For the electron fluxes measured during the great HRE of May 1992, depletions were predicted to occur between altitudes of 55 and 80 km, where HOx reactions cause a local minimum in the ozone number density and mixing ratio. Measurements of the precipitating electron fluxes by the particle environment monitor (PEM) tend to underestimate their intensity; thus the predictions of ozone depletion should be considered an estimate of a lower limit. Since the horizontal distribution of the electron precipitation follows the terrestrial magnetic field, it would show a distinct boundary equatorward of the L=3 magnetic shell and be readily distinguished from material that was not affected by the HRE precipitation. To search for possible ozone depletion effects, we have analyzed data from the Cryogenic Limb Array Etalon Spectrometer and Microwave Limb Sounder instruments on UARS for the above HRE. A simplified diurnal model is proposed to understand the ozone data from UARS, also illustrating the limitations of the UARS instruments for seeing the ozone depletions caused by HRE events. This diurnal analysis limits the relative ozone depletion at around 60 km altitude to values of <10% during the very intense May 1992 event, consistent with our prediction using an improved Goddard Space Flight Center two-dimensional model.
NASA Astrophysics Data System (ADS)
Kramer, Kevin James
This study investigates the neutronics design aspects of a hybrid fusion-fission energy system called the Laser Fusion-Fission Hybrid (LFFH). An LFFH combines current laser inertial confinement fusion technology with advanced fission reactor technology to produce a system that eliminates many of the negative aspects of pure fusion or pure fission systems. When examining the LFFH energy mission, a significant portion of the United States and world energy production could be supplied by LFFH plants. The LFFH engine described utilizes a central fusion chamber surrounded by multiple layers of multiplying and moderating media. These layers, or blankets, include coolant plenums, a beryllium (Be) multiplier layer, a fertile fission blanket and a graphite-pebble reflector. Each layer is separated from the next by perforated oxide-dispersion-strengthened (ODS) ferritic steel walls. The central fusion chamber is surrounded by an ODS ferritic steel first wall, coated with 250-500 μm of tungsten to mitigate x-ray damage. The first wall is cooled by Li17Pb83 eutectic, chosen for its neutron multiplication and good heat transfer properties. The Li17Pb83 flows in a jacket around the first wall to an extraction plenum. The main coolant injection plenum is immediately behind the Li17Pb83, separated from it by a solid ODS wall. This main system coolant is the molten salt flibe (2LiF-BeF2), chosen for its beneficial neutronics and heat transfer properties. The use of flibe enables both fusion fuel production (tritium) and neutron moderation and multiplication for the fission blanket. A Be pebble (1 cm diameter) multiplier layer surrounds the coolant injection plenum, and the coolant flows radially through perforated walls across the bed. Outside the Be layer lies a fission fuel layer of depleted uranium contained in tristructural-isotropic (TRISO) fuel particles with a packing fraction of 20% in 2 cm diameter fuel pebbles. The fission blanket is cooled by the same radial flibe flow, which travels through perforated ODS walls to the reflector blanket. This reflector blanket is 75 cm thick and comprises 2 cm diameter graphite pebbles cooled by flibe. The flibe extraction plenum surrounds the reflector bed. Detailed neutronics design studies were performed to arrive at the described design. The LFFH engine thermal power is controlled by adjusting the 6Li/7Li enrichment in the primary and secondary coolants; the enrichment adjusts system thermal power by increasing tritium production while reducing fission. To perform the simulations and design of the LFFH engine, a new software program named LFFH Nuclear Control (LNC) was developed in C++ to extend the functionality of existing neutron transport and depletion software programs. Neutron transport calculations are performed with MCNP5. Depletion calculations are performed using Monteburns 2.0, which utilizes ORIGEN 2.0 and MCNP5 to perform a burnup calculation. LNC supports many design parameters and is capable of performing a full 3D system simulation from initial startup to full burnup. It is able to iteratively search for coolant 6Li enrichments and resulting material compositions that meet user-defined performance criteria. LNC is utilized throughout this study for time-dependent simulation of the LFFH engine. Two additional methods were developed to improve the computational efficiency of LNC calculations.
These methods, termed adaptive time stepping and adaptive mesh refinement, were incorporated into a separate stand-alone C++ library named the Adaptive Burnup Library (ABL). The ABL allows other client codes to call and utilize its functionality. Adaptive time stepping is useful for automatically maximizing the size of the depletion time step while maintaining a desired level of accuracy. Adaptive meshing allows for analysis of fixed fuel configurations that would normally require a computationally burdensome number of depletion zones: adaptive mesh refinement (AMR) adjusts the depletion zone size according to the variation in flux across the zone or its fractional contribution to total absorption or fission. A parametric analysis of a fully mixed fuel core was performed using the LNC and ABL code suites. The resulting system parameters are found to optimize the performance metrics for a 20 MT DU fuel load with a 20% TRISO packing fraction and a 300 μm kernel diameter, operated with a fusion input power of 500 MW and a fission blanket gain of 4.0. LFFH potentially offers a proliferation-resistant technology relative to other nuclear energy systems, primarily because it requires neither fuel enrichment nor reprocessing. A figure of merit for material attractiveness is examined, and it is found that the fuel is effectively contaminated to an unattractive level shortly after the system is started, due to fission product and minor actinide build-up.
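As an illustration of the adaptive time-stepping idea (not the actual ABL interface; the function names, tolerance criterion, and surrogate model here are hypothetical), a depletion driver can halve the trial step whenever the predicted change in a monitored quantity exceeds a tolerance, and grow it when the change is comfortably small:

#include <cmath>
#include <cstdio>

// Illustrative stand-in for one depletion step: returns the relative change of a
// monitored quantity (e.g. a nuclide inventory) over the trial step dt. A simple
// analytic burnup surrogate is used purely so the sketch is self-contained.
double relative_change(double t, double dt) {
    const double lambda = 0.002;   // placeholder depletion rate [1/day]
    return std::fabs(std::exp(-lambda * (t + dt)) - std::exp(-lambda * t))
           / std::exp(-lambda * t);
}

int main() {
    const double tol   = 0.02;     // maximum accepted relative change per step
    const double t_end = 180.0;    // total depletion time [days]
    double t = 0.0, dt = 30.0;     // current time and trial step size [days]
    while (t < t_end) {
        const double change = relative_change(t, dt);
        if (change > tol && dt > 1.0) {   // reject: change too large, halve the step
            dt *= 0.5;
            continue;
        }
        t += dt;                          // accept the step
        std::printf("t = %6.1f d   dt = %5.2f d   change = %.4f\n", t, dt, change);
        if (change < 0.5 * tol) dt *= 1.5;       // change comfortably small: grow the step
        if (t + dt > t_end) dt = t_end - t;      // do not overshoot the end time
    }
    return 0;
}

In a real coupled transport-depletion workflow the "relative change" would come from the transport/depletion solve itself (for instance, the change in k-eff or in selected nuclide inventories), but the accept/reject-and-resize loop has the same structure.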
Methodology for fast detection of false sharing in threaded scientific codes
Chung, I-Hsin; Cong, Guojing; Murata, Hiroki; Negishi, Yasushi; Wen, Hui-Fang
2014-11-25
A profiling tool identifies a code region with a false sharing potential. A static analysis tool classifies variables and arrays in the identified code region. A mapping detection library correlates memory access instructions in the identified code region with variables and arrays in the identified code region while a processor is running the identified code region. The mapping detection library identifies one or more instructions at risk, in the identified code region, which are subject to an analysis by a false sharing detection library. A false sharing detection library performs a run-time analysis of the one or more instructions at risk while the processor is re-running the identified code region. The false sharing detection library determines, based on the performed run-time analysis, whether two different portions of the cache memory line are accessed by the generated binary code.
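To make the phenomenon being detected concrete, the following small C++ example (unrelated to the tool described above; the structure names and iteration count are arbitrary) exhibits classic false sharing: two threads update adjacent counters that live on the same cache line, and padding the counters onto separate lines removes the contention:

#include <chrono>
#include <cstdio>
#include <thread>

constexpr long kIters = 50'000'000;

struct Packed { long a; long b; };                          // a and b share a cache line
struct Padded { alignas(64) long a; alignas(64) long b; };  // a and b on separate lines

template <typename T>
double time_increments(T& counters) {
    const auto start = std::chrono::steady_clock::now();
    std::thread t1([&] { for (long i = 0; i < kIters; ++i) ++counters.a; });
    std::thread t2([&] { for (long i = 0; i < kIters; ++i) ++counters.b; });
    t1.join();
    t2.join();
    return std::chrono::duration<double>(std::chrono::steady_clock::now() - start).count();
}

int main() {
    Packed packed{};
    Padded padded{};
    // The packed version is often noticeably slower because each increment
    // invalidates the other core's copy of the shared cache line.
    std::printf("adjacent counters (false sharing): %.3f s\n", time_increments(packed));
    std::printf("padded counters  (no sharing)    : %.3f s\n", time_increments(padded));
    return 0;
}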
Coupled petrological-geodynamical modeling of a compositionally heterogeneous mantle plume
NASA Astrophysics Data System (ADS)
Rummel, Lisa; Kaus, Boris J. P.; White, Richard W.; Mertz, Dieter F.; Yang, Jianfeng; Baumann, Tobias S.
2018-01-01
Self-consistent geodynamic modeling that includes melting is challenging as the chemistry of the source rocks continuously changes as a result of melt extraction. Here, we describe a new method to study the interaction between physical and chemical processes in an uprising heterogeneous mantle plume by combining a geodynamic code with a thermodynamic modeling approach for magma generation and evolution. We pre-computed hundreds of phase diagrams, each of them for a different chemical system. After melt is extracted, the phase diagram with the closest bulk rock chemistry to the depleted source rock is updated locally. The petrological evolution of rocks is tracked via evolving chemical compositions of source rocks and extracted melts using twelve oxide compositional parameters. As a result, a wide variety of newly generated magmatic rocks can in principle be produced from mantle rocks with different degrees of depletion. The results show that a variable geothermal gradient, the amount of extracted melt and plume excess temperature affect the magma production and chemistry by influencing decompression melting and the depletion of rocks. Decompression melting is facilitated by a shallower lithosphere-asthenosphere boundary and an increase in the amount of extracted magma is induced by a lower critical melt fraction for melt extraction and/or higher plume temperatures. Increasing critical melt fractions activates the extraction of melts triggered by decompression at a later stage and slows down the depletion process from the metasomatized mantle. Melt compositional trends are used to determine melting related processes by focusing on K2O/Na2O ratio as indicator for the rock type that has been molten. Thus, a step-like-profile in K2O/Na2O might be explained by a transition between melting metasomatized and pyrolitic mantle components reproducible through numerical modeling of a heterogeneous asthenospheric mantle source. A potential application of the developed method is shown for the West Eifel volcanic field.
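The "closest bulk rock chemistry" lookup described above can be pictured as a nearest-neighbour search in the twelve-dimensional oxide composition space. A minimal C++ sketch follows; the distance metric, oxide ordering, and data structures are assumptions for illustration, not necessarily those used by the authors:

#include <array>
#include <cmath>
#include <cstdio>
#include <limits>
#include <vector>

constexpr int kOxides = 12;                      // twelve oxide parameters (ordering assumed)
using Composition = std::array<double, kOxides>; // bulk composition, e.g. wt% of each oxide

// Squared Euclidean distance between two bulk compositions.
double distance2(const Composition& a, const Composition& b) {
    double d2 = 0.0;
    for (int i = 0; i < kOxides; ++i) d2 += (a[i] - b[i]) * (a[i] - b[i]);
    return d2;
}

// Index of the pre-computed phase diagram whose bulk composition is closest
// to the depleted source-rock composition left after melt extraction.
std::size_t closest_diagram(const Composition& depleted,
                            const std::vector<Composition>& library) {
    std::size_t best = 0;
    double best_d2 = std::numeric_limits<double>::max();
    for (std::size_t i = 0; i < library.size(); ++i) {
        const double d2 = distance2(depleted, library[i]);
        if (d2 < best_d2) { best_d2 = d2; best = i; }
    }
    return best;
}

int main() {
    std::vector<Composition> library = {};  // hundreds of pre-computed compositions in practice
    Composition depleted{};                 // residue composition after melt extraction
    if (!library.empty())
        std::printf("use phase diagram %zu\n", closest_diagram(depleted, library));
    return 0;
}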
Both cladribine and alemtuzumab may effect MS via B-cell depletion.
Baker, David; Herrod, Samuel S; Alvarez-Gonzalez, Cesar; Zalewski, Lukasz; Albor, Christo; Schmierer, Klaus
2017-07-01
To understand the efficacy of cladribine (CLAD) treatment in MS, we analyzed lymphocyte subsets collected, but not reported, in the pivotal phase III trials of cladribine and alemtuzumab induction therapies. The regulatory submissions of the CLAD Tablets Treating Multiple Sclerosis Orally (CLARITY) (NCT00213135) cladribine and Comparison of Alemtuzumab and Rebif Efficacy in Multiple Sclerosis, study one (CARE-MS I) (NCT00530348) alemtuzumab trials were obtained from the European Medicine Agency through Freedom of Information requests. Data were extracted and statistically analyzed. Either dose of cladribine (3.5 mg/kg; 5.25 mg/kg) tested in CLARITY reduced the annualized relapse rate to 0.16-0.18 over 96 weeks, and both doses were similarly effective in reducing the risk of MRI lesions and disability. Surprisingly, however, T-cell depletion was rather modest. Cladribine 3.5 mg/kg depleted CD4+ cells by 40%-45% and CD8+ cells by 15%-30%, whereas alemtuzumab suppressed CD4+ cells by 70%-95% and CD8+ cells by 47%-55%. However, either dose of cladribine induced 70%-90% CD19+ B-cell depletion, similar to alemtuzumab (90%). CD19+ cells slowly repopulated to 15%-25% of baseline before cladribine redosing. However, alemtuzumab induced hyperrepopulation of CD19+ B cells 6-12 months after infusion, which probably forms the substrate for the B-cell autoimmunities associated with alemtuzumab. Cladribine induced only modest depletion of T cells, which may not be consistent with a marked influence on MS, based on previous CD4+ T-cell depletion studies. The therapeutic drug-response relationship with cladribine is more consistent with lasting B-cell depletion and, coupled with the success seen with monoclonal CD20+ depletion, suggests that B-cell suppression could be the major direct mechanism of action.
VlincRNAs controlled by retroviral elements are a hallmark of pluripotency and cancer.
St Laurent, Georges; Shtokalo, Dmitry; Dong, Biao; Tackett, Michael R; Fan, Xiaoxuan; Lazorthes, Sandra; Nicolas, Estelle; Sang, Nianli; Triche, Timothy J; McCaffrey, Timothy A; Xiao, Weidong; Kapranov, Philipp
2013-07-22
The function of the non-coding portion of the human genome remains one of the most important questions of our time. Its vast complexity is exemplified by the recent identification of an unusual and notable component of the transcriptome - very long intergenic non-coding RNAs, termed vlincRNAs. Here we identify 2,147 vlincRNAs covering 10 percent of our genome. We show they are present not only in cancerous cells, but also in primary cells and normal human tissues, and are controlled by canonical promoters. Furthermore, vlincRNA promoters frequently originate from within endogenous retroviral sequences. Strikingly, the number of vlincRNAs expressed from endogenous retroviral promoters strongly correlates with pluripotency or the degree of malignant transformation. These results suggest a previously unknown connection between the pluripotent state and cancer via retroviral repeat-driven expression of vlincRNAs. Finally, we show that vlincRNAs can be syntenically conserved in humans and mouse and their depletion using RNAi can cause apoptosis in cancerous cells. These intriguing observations suggest that vlincRNAs could create a framework that combines many existing short ESTs and lincRNAs into a landscape of very long transcripts functioning in the regulation of gene expression in the nucleus. Certain types of vlincRNAs participate at specific stages of normal development and, based on analysis of a limited set of cancerous and primary cell lines, they appear to be co-opted by cancer-associated transcriptional programs. This provides additional understanding of transcriptome regulation during the malignant state, and could lead to additional targets and options for its reversal.
Jiang, Qian; Meng, Xing; Meng, Lingwei; Chang, Nannan; Xiong, Jingwei; Cao, Huiqing; Liang, Zicai
2014-01-01
MicroRNA knockout by genome editing technologies is promising. In order to extend the application of the technology and to investigate the function of a specific miRNA, we used CRISPR/Cas9 to deplete human miR-93 from a cluster by targeting its 5' region in HeLa cells. Various small indels were induced in the targeted region containing the Drosha processing site and seed sequences. Interestingly, we found that even a single nucleotide deletion led to complete knockout of the target miRNA with high specificity. Functional knockout was confirmed by phenotype analysis. Furthermore, de novo microRNAs were not found by RNA-seq. Nevertheless, expression of the pri-microRNAs was increased. When combined with structural analysis, the data indicated that biogenesis was impaired. Altogether, we showed that small indels in the 5' region of a microRNA result in sequence depletion as well as Drosha processing retard.
Network analysis for the visualization and analysis of qualitative data.
Pokorny, Jennifer J; Norman, Alex; Zanesco, Anthony P; Bauer-Wu, Susan; Sahdra, Baljinder K; Saron, Clifford D
2018-03-01
We present a novel manner in which to visualize the coding of qualitative data that enables representation and analysis of connections between codes using graph theory and network analysis. Network graphs are created from codes applied to a transcript or audio file using the code names and their chronological location. The resulting network is a representation of the coding data that characterizes the interrelations of codes. This approach enables quantification of qualitative codes using network analysis and facilitates examination of associations of network indices with other quantitative variables using common statistical procedures. Here, as a proof of concept, we applied this method to a set of interview transcripts that had been coded in 2 different ways and the resultant network graphs were examined. The creation of network graphs allows researchers an opportunity to view and share their qualitative data in an innovative way that may provide new insights and enhance transparency of the analytical process by which they reach their conclusions. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
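As a sketch of the construction described (a chronological sequence of qualitative codes turned into a weighted network), adjacent codes in a transcript can be linked and edge weights accumulated. The code names and data structures below are illustrative only:

#include <cstdio>
#include <map>
#include <string>
#include <vector>

int main() {
    // Codes applied to one transcript, in chronological order (toy example).
    std::vector<std::string> codes = {"awareness", "emotion", "awareness",
                                      "insight", "emotion", "insight"};
    // Edge weights: number of times one code directly follows another.
    std::map<std::pair<std::string, std::string>, int> edges;
    for (std::size_t i = 0; i + 1 < codes.size(); ++i)
        ++edges[{codes[i], codes[i + 1]}];
    for (const auto& [edge, weight] : edges)
        std::printf("%s -> %s : %d\n", edge.first.c_str(), edge.second.c_str(), weight);
    return 0;
}

The resulting edge list can then be loaded into any network-analysis package to compute the graph indices (degree, centrality, and so on) that are correlated with other quantitative variables.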
Di Gregorio, Simona; Siracusa, Giovanna; Becarelli, Simone; Mariotti, Lorenzo; Gentini, Alessandro; Lorenzi, Roberto
2016-06-01
Seven new hydrocarbonoclastic bacterial strains were isolated from dredged sediments of a river estuary in Italy. The sediments had been contaminated by shipyard activities for decades, mainly through the exploitation of diesel oil as the fuel for recreational and commercial navigation of watercrafts. The bacterial isolates were able to utilize diesel oil as the sole carbon source. Their metabolic capacities were evaluated by GC-MS analysis, with reference to the depletion of both the normal and branched alkanes, the nC18 fatty acid methyl ester and the unresolved complex mixture of organic compounds. They were taxonomically identified as different species of Stenotrophomonas and Pseudomonas by the combination of amplified ribosomal DNA restriction analysis (ARDRA) and repetitive sequence-based PCR (REP-PCR) analysis. The metabolic activities of interest were analyzed both for the single bacterial strains and for their combination as a multispecies bacterial system. After 6 days of incubation in mineral medium with diesel oil as the sole carbon source, the Stenotrophomonas sp. M1 strain depleted 43-46 % of the Cn-alkanes from C28 up to C30, 70 % of the nC18 fatty acid methyl ester and 46 % of the unresolved complex mixture of organic compounds. The Pseudomonas sp. NM1 strain depleted 76 % of the nC18 fatty acid methyl ester and 50 % of the unresolved complex mixture of organic compounds. The bacterial multispecies system was able to completely deplete the Cn-alkanes from C28 up to C30 and to deplete 95 % of the unresolved complex mixture of organic compounds. The isolates, both as single strains and as a bacterial multispecies system, are proposed as candidates for bioaugmentation in bio-based processes for the decontamination of dredged sediments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salko, Robert K; Sung, Yixing; Kucukboyaci, Vefa
The Virtual Environment for Reactor Applications core simulator (VERA-CS) being developed by the Consortium for the Advanced Simulation of Light Water Reactors (CASL) includes coupled neutronics, thermal-hydraulics, and fuel temperature components with an isotopic depletion capability. The neutronics capability employed is based on MPACT, a three-dimensional (3-D) whole core transport code. The thermal-hydraulics and fuel temperature models are provided by the COBRA-TF (CTF) subchannel code. As part of the CASL development program, the VERA-CS (MPACT/CTF) code system was applied to model and simulate reactor core response with respect to departure from nucleate boiling ratio (DNBR) at the limiting time step of a postulated pressurized water reactor (PWR) main steamline break (MSLB) event initiated at the hot zero power (HZP), either with offsite power available and the reactor coolant pumps in operation (high-flow case) or without offsite power where the reactor core is cooled through natural circulation (low-flow case). The VERA-CS simulation was based on core boundary conditions from the RETRAN-02 system transient calculations and STAR-CCM+ computational fluid dynamics (CFD) core inlet distribution calculations. The evaluation indicated that the VERA-CS code system is capable of modeling and simulating quasi-steady state reactor core response under the steamline break (SLB) accident condition, the results are insensitive to uncertainties in the inlet flow distributions from the CFD simulations, and the high-flow case is more DNB limiting than the low-flow case.
Posttest analysis of the FFTF inherent safety tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Padilla, A. Jr.; Claybrook, S.W.
Inherent safety tests were performed during 1986 in the 400-MW (thermal) Fast Flux Test Facility (FFTF) reactor to demonstrate the effectiveness of an inherent shutdown device called the gas expansion module (GEM). The GEM device provided a strong negative reactivity feedback during loss-of-flow conditions by increasing the neutron leakage as a result of an expanding gas bubble. The best-estimate pretest calculations for these tests were performed using the IANUS plant analysis code (Westinghouse Electric Corporation proprietary code) and the MELT/SIEX3 core analysis code. These two codes were also used to perform the required operational safety analyses for the FFTF reactor and plant. Although it was intended to also use the SASSYS systems (core and plant) analysis code, the calibration of the SASSYS code for FFTF core and plant analysis was not completed in time to perform pretest analyses. The purpose of this paper is to present the results of the posttest analysis of the 1986 FFTF inherent safety tests using the SASSYS code.
An accurate and efficient laser-envelope solver for the modeling of laser-plasma accelerators
Benedetti, C.; Schroeder, C. B.; Geddes, C. G. R.; ...
2017-10-17
Detailed and reliable numerical modeling of laser-plasma accelerators (LPAs), where a short and intense laser pulse interacts with an underdense plasma over distances of up to a meter, is a formidably challenging task. This is due to the great disparity among the length scales involved in the modeling, ranging from the micron scale of the laser wavelength to the meter scale of the total laser-plasma interaction length. The use of the time-averaged ponderomotive force approximation, where the laser pulse is described by means of its envelope, enables efficient modeling of LPAs by removing the need to model the details of electron motion at the laser wavelength scale. Furthermore, it allows simulations in cylindrical geometry which captures relevant 3D physics at 2D computational cost. A key element of any code based on the time-averaged ponderomotive force approximation is the laser envelope solver. In this paper we present the accurate and efficient envelope solver used in the code INF & RNO (INtegrated Fluid & paRticle simulatioN cOde). The features of the INF & RNO laser solver enable an accurate description of the laser pulse evolution deep into depletion even at a reasonably low resolution, resulting in significant computational speed-ups.
RELAP5-3D Results for Phase I (Exercise 2) of the OECD/NEA MHTGR-350 MW Benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom
2012-06-01
The coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been initiated at the Idaho National Laboratory (INL) to provide a fully coupled prismatic Very High Temperature Reactor (VHTR) system modeling capability as part of the NGNP methods development program. The PHISICS code consists of three modules: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. As part of the verification and validation activities, steady state results have been obtained for Exercise 2 of Phase I of the newly-defined OECD/NEA MHTGR-350 MW Benchmark. This exercise requires participants to calculate a steady-state solution for an End of Equilibrium Cycle 350 MW Modular High Temperature Reactor (MHTGR), using the provided geometry, material, and coolant bypass flow description. The paper provides an overview of the MHTGR Benchmark and presents typical steady state results (e.g. solid and gas temperatures, thermal conductivities) for Phase I Exercise 2. Preliminary results are also provided for the early test phase of Exercise 3 using a two-group cross-section library and the Relap5-3D model developed for Exercise 2.
RELAP5-3D results for phase I (Exercise 2) of the OECD/NEA MHTGR-350 MW benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strydom, G.; Epiney, A. S.
2012-07-01
The coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been initiated at the Idaho National Laboratory (INL) to provide a fully coupled prismatic Very High Temperature Reactor (VHTR) system modeling capability as part of the NGNP methods development program. The PHISICS code consists of three modules: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. As part of the verification and validation activities, steady state results have been obtained for Exercise 2 of Phase I of the newly-defined OECD/NEA MHTGR-350 MW Benchmark. This exercise requires participants to calculate a steady-state solution for an End of Equilibrium Cycle 350 MW Modular High Temperature Reactor (MHTGR), using the provided geometry, material, and coolant bypass flow description. The paper provides an overview of the MHTGR Benchmark and presents typical steady state results (e.g. solid and gas temperatures, thermal conductivities) for Phase I Exercise 2. Preliminary results are also provided for the early test phase of Exercise 3 using a two-group cross-section library and the Relap5-3D model developed for Exercise 2. (authors)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adigun, Babatunde John; Fensin, Michael Lorne; Galloway, Jack D.
Our burnup study examined the effect of a predicted critical control rod position on the nuclide predictability of several axial and radial locations within a 4×4 graphite moderated gas cooled reactor fuel cluster geometry. To achieve this, a control rod position estimator (CRPE) tool was developed within the framework of the linkage code Monteburns between the transport code MCNP and depletion code CINDER90, and four methodologies were proposed within the tool for maintaining criticality. Two of the proposed methods used an inverse multiplication approach - where the amount of fissile material in a set configuration is slowly altered until criticality is attained - in estimating the critical control rod position. Another method carried out several MCNP criticality calculations at different control rod positions, then used a linear fit to estimate the critical rod position. The final method used a second-order polynomial fit of several MCNP criticality calculations at different control rod positions to estimate the critical rod position. The results showed that consistency in prediction of power densities as well as uranium and plutonium isotopics was mutual among methods within the CRPE tool that predicted the critical position consistently well. Finally, while the CRPE tool is currently limited to manipulating a single control rod, future work could be geared toward implementing additional criticality search methodologies along with additional features.
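As a rough illustration of the curve-fitting variants described above, the sketch below fits a second-order polynomial to k-eff values computed at several rod positions and solves for the position where k-eff equals unity; the positions and k-eff values are invented, and the actual CRPE tool drives MCNP criticality calculations through Monteburns rather than using canned numbers.

```python
# A minimal sketch of one of the strategies described above: fit keff values
# computed at several control rod positions and solve for the position where
# keff = 1. The sample data are invented placeholders.
import numpy as np

rod_positions = np.array([0.0, 25.0, 50.0, 75.0, 100.0])      # % withdrawn (assumed)
keff_values   = np.array([0.972, 0.988, 1.001, 1.012, 1.021])  # from transport solves (assumed)

# Second-order polynomial fit of keff versus rod position (the "quadratic" method).
coeffs = np.polyfit(rod_positions, keff_values, deg=2)
roots = np.roots(np.polyadd(coeffs, [0.0, 0.0, -1.0]))  # solve keff(x) = 1

# Keep the physically meaningful root inside the travel range.
critical_position = [r.real for r in roots
                     if abs(r.imag) < 1e-9 and 0.0 <= r.real <= 100.0]
print("Estimated critical rod position:", critical_position)
```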
An accurate and efficient laser-envelope solver for the modeling of laser-plasma accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benedetti, C.; Schroeder, C. B.; Geddes, C. G. R.
Detailed and reliable numerical modeling of laser-plasma accelerators (LPAs), where a short and intense laser pulse interacts with an underdense plasma over distances of up to a meter, is a formidably challenging task. This is due to the great disparity among the length scales involved in the modeling, ranging from the micron scale of the laser wavelength to the meter scale of the total laser-plasma interaction length. The use of the time-averaged ponderomotive force approximation, where the laser pulse is described by means of its envelope, enables efficient modeling of LPAs by removing the need to model the details of electron motion at the laser wavelength scale. Furthermore, it allows simulations in cylindrical geometry which captures relevant 3D physics at 2D computational cost. A key element of any code based on the time-averaged ponderomotive force approximation is the laser envelope solver. In this paper we present the accurate and efficient envelope solver used in the code INF & RNO (INtegrated Fluid & paRticle simulatioN cOde). The features of the INF & RNO laser solver enable an accurate description of the laser pulse evolution deep into depletion even at a reasonably low resolution, resulting in significant computational speed-ups.
An accurate and efficient laser-envelope solver for the modeling of laser-plasma accelerators
NASA Astrophysics Data System (ADS)
Benedetti, C.; Schroeder, C. B.; Geddes, C. G. R.; Esarey, E.; Leemans, W. P.
2018-01-01
Detailed and reliable numerical modeling of laser-plasma accelerators (LPAs), where a short and intense laser pulse interacts with an underdense plasma over distances of up to a meter, is a formidably challenging task. This is due to the great disparity among the length scales involved in the modeling, ranging from the micron scale of the laser wavelength to the meter scale of the total laser-plasma interaction length. The use of the time-averaged ponderomotive force approximation, where the laser pulse is described by means of its envelope, enables efficient modeling of LPAs by removing the need to model the details of electron motion at the laser wavelength scale. Furthermore, it allows simulations in cylindrical geometry which captures relevant 3D physics at 2D computational cost. A key element of any code based on the time-averaged ponderomotive force approximation is the laser envelope solver. In this paper we present the accurate and efficient envelope solver used in the code INF&RNO (INtegrated Fluid & paRticle simulatioN cOde). The features of the INF&RNO laser solver enable an accurate description of the laser pulse evolution deep into depletion even at a reasonably low resolution, resulting in significant computational speed-ups.
NASA Astrophysics Data System (ADS)
Lin, Y.; Wang, X.; Fok, M. C. H.; Buzulukova, N.; Perez, J. D.; Chen, L. J.
2017-12-01
The interaction between the Earth's inner and outer magnetospheric regions associated with the tail fast flows is calculated by coupling the Auburn 3-D global hybrid simulation code (ANGIE3D) to the Comprehensive Inner Magnetosphere/Ionosphere (CIMI) model. The global hybrid code solves fully kinetic equations governing the ions and a fluid model for electrons in the self-consistent electromagnetic field of the dayside and night side outer magnetosphere. In the integrated computation model, the hybrid simulation provides the CIMI model with field data in the CIMI 3-D domain and particle data at its boundary, and the transport in the inner magnetosphere is calculated by the CIMI model. By joining the two existing codes, effects of the solar wind on particle transport through the outer magnetosphere into the inner magnetosphere are investigated. Our simulation shows that fast flows and flux ropes are localized transients in the magnetotail plasma sheet and their overall structures have a dawn-dusk asymmetry. Strong perpendicular ion heating is found at the fast flow braking, which affects the earthward transport of entropy-depleted bubbles. We report on the impacts from the temperature anisotropy and non-Maxwellian ion distributions associated with the fast flows on the ring current and the convection electric field.
Alkaitis, Matthew S.; Wang, Honghui; Ikeda, Allison K.; Rowley, Carol A.; MacCormick, Ian J. C.; Chertow, Jessica H.; Billker, Oliver; Suffredini, Anthony F.; Roberts, David J.; Taylor, Terrie E.; Seydel, Karl B.; Ackerman, Hans C.
2016-01-01
Background. Plasmodium infection depletes arginine, the substrate for nitric oxide synthesis, and impairs endothelium-dependent vasodilation. Increased conversion of arginine to ornithine by parasites or host arginase is a proposed mechanism of arginine depletion. Methods. We used high-performance liquid chromatography to measure plasma arginine, ornithine, and citrulline levels in Malawian children with cerebral malaria and in mice infected with Plasmodium berghei ANKA with or without the arginase gene. Heavy isotope–labeled tracers measured by quadrupole time-of-flight liquid chromatography–mass spectrometry were used to quantify the in vivo rate of appearance and interconversion of plasma arginine, ornithine, and citrulline in infected mice. Results. Children with cerebral malaria and P. berghei–infected mice demonstrated depletion of plasma arginine, ornithine, and citrulline. Knockout of Plasmodium arginase did not alter arginine depletion in infected mice. Metabolic tracer analysis demonstrated that plasma arginase flux was unchanged by P. berghei infection. Instead, infected mice exhibited decreased rates of plasma arginine, ornithine, and citrulline appearance and decreased conversion of plasma citrulline to arginine. Notably, plasma arginine use by nitric oxide synthase was decreased in infected mice. Conclusions. Simultaneous arginine and ornithine depletion in malaria parasite–infected children cannot be fully explained by plasma arginase activity. Our mouse model studies suggest that plasma arginine depletion is driven primarily by a decreased rate of appearance. PMID:27923948
Prieto, DaRue A; Chan, King C; Johann, Donald J; Ye, Xiaoying; Whitely, Gordon; Blonder, Josip
2017-01-01
The discovery of novel drug targets and biomarkers via mass spectrometry (MS)-based proteomic analysis of clinical specimens has proven to be challenging. The wide dynamic range of protein concentration in clinical specimens and the high background/noise originating from highly abundant proteins in tissue homogenates and serum/plasma encompass two major analytical obstacles. Immunoaffinity depletion of highly abundant blood-derived proteins from serum/plasma is a well-established approach adopted by numerous researchers; however, the utilization of this technique for immunodepletion of tissue homogenates obtained from fresh frozen clinical specimens is lacking. We first developed immunoaffinity depletion of highly abundant blood-derived proteins from tissue homogenates, using renal cell carcinoma as a model disease, and followed this study by applying it to different tissue types. Tissue homogenate immunoaffinity depletion of highly abundant proteins may be equally important as is the recognized need for depletion of serum/plasma, enabling more sensitive MS-based discovery of novel drug targets, and/or clinical biomarkers from complex clinical samples. Provided is a detailed protocol designed to guide the researcher through the preparation and immunoaffinity depletion of fresh frozen tissue homogenates for two-dimensional liquid chromatography, tandem mass spectrometry (2D-LC-MS/MS)-based molecular profiling of tissue specimens in the context of drug target and/or biomarker discovery.
Depletion of the Complex Multiple Aquifer System of Jordan
NASA Astrophysics Data System (ADS)
Rödiger, T.; Siebert, C.; Geyer, S.; Merz, R.
2017-12-01
In many countries worldwide, water scarcity poses a significant risk to the environment and the socio-economy. Particularly in countries where the available water resources are strongly limited by climatic conditions, an accurate determination of the available water resources is of high priority, especially when water supply predominantly relies on groundwater resources and their recharge. If groundwater abstraction exceeds the natural groundwater recharge in heavily used well field areas, overexploitation or persistent groundwater depletion occurs. This is the case in the Kingdom of Jordan, where a multi-layer aquifer complex forms the eastern subsurface catchment of the Dead Sea basin. Since the beginning of the industrial and agricultural development of the country, dramatically falling groundwater levels, the disappearance of springs and saltwater intrusions from deeper aquifers have been documented nation-wide. The total water budget is influenced by (i) a high climatic gradient from hyperarid to semiarid and (ii) intense anthropogenic abstraction. For this multi-layered aquifer system we developed a methodology to evaluate groundwater depletion by linking a hydrological model and a numerical flow model, including estimates of groundwater abstraction. We define groundwater depletion as the rate of groundwater abstraction in excess of the natural recharge rate. Our analysis yields groundwater depletion ranging from 0% in the eastern Hamad basin to around 40% in central Jordan and to extreme values of 100% in the Azraq and Disi basins.
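Under the depletion definition used above (abstraction in excess of natural recharge), a toy calculation might look like the following; the per-basin numbers and the exact normalization are assumptions for illustration, not values from the study.

```python
# A toy calculation of the depletion measure defined above: groundwater
# abstraction in excess of natural recharge, expressed as a percentage of
# abstraction. Basin names and volumes (million cubic meters) are invented.
def depletion_percent(abstraction_mcm, recharge_mcm):
    """Depletion as the share of abstraction not covered by natural recharge."""
    if abstraction_mcm <= recharge_mcm:
        return 0.0
    return 100.0 * (abstraction_mcm - recharge_mcm) / abstraction_mcm

basins = {"Hamad": (20.0, 25.0), "Central": (90.0, 55.0), "Azraq": (60.0, 0.5)}
for name, (abstraction, recharge) in basins.items():
    print(name, round(depletion_percent(abstraction, recharge), 1), "%")
```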
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Mark D.; McPherson, Brian J.; Grigg, Reid B.
Numerical simulation is an invaluable analytical tool for scientists and engineers in making predictions about the fate of carbon dioxide injected into deep geologic formations for long-term storage. Current numerical simulators for assessing storage in deep saline formations have capabilities for modeling strongly coupled processes involving multifluid flow, heat transfer, chemistry, and rock mechanics in geologic media. Except for moderate pressure conditions, numerical simulators for deep saline formations only require the tracking of two immiscible phases and a limited number of phase components, beyond those comprising the geochemical reactive system. The requirements for numerically simulating the utilization and storage of carbon dioxide in partially depleted petroleum reservoirs are more numerous than those for deep saline formations. The minimum number of immiscible phases increases to three, the number of phase components may easily increase fourfold, and the coupled processes of heat transfer, geochemistry, and geomechanics remain. Public and scientific confidence in the numerical simulators used for carbon dioxide sequestration in deep saline formations has advanced via a natural progression of the simulators being proven against benchmark problems, code comparisons, laboratory-scale experiments, pilot-scale injections, and commercial-scale injections. This paper describes a new numerical simulator for the scientific investigation of carbon dioxide utilization and storage in partially depleted petroleum reservoirs, with an emphasis on its unique features for scientific investigations, and documents the numerical simulation of carbon dioxide utilization for enhanced oil recovery in the western section of the Farnsworth Unit, which represents an early stage in the progression of numerical simulators for carbon utilization and storage in depleted oil reservoirs.
Demuth, Ilja; Digweed, Martin; Concannon, Patrick
2004-11-11
DNA interstrand crosslinks (ICLs) are critical lesions for the mammalian cell since they affect both DNA strands and block transcription and replication. The repair of ICLs in the mammalian cell involves components of different repair pathways such as nucleotide-excision repair and the double-strand break/homologous recombination repair pathways. However, the mechanistic details of mammalian ICL repair have not been fully delineated. We describe here the complete coding sequence and the genomic organization of hSNM1B, one of at least three human homologs of the Saccharomyces cerevisiae PSO2 gene. Depletion of hSNM1B by RNA interference rendered cells hypersensitive to ICL-inducing agents. This requirement for hSNM1B in the cellular response to ICL has been hypothesized before but never experimentally verified. In addition, siRNA knockdown of hSNM1B rendered cells sensitive to ionizing radiation, suggesting the possibility of hSNM1B involvement in homologous recombination repair of double-strand breaks arising as intermediates of ICL repair. Monoubiquitination of FANCD2, a key step in the FANC/BRCA pathway, is not affected in hSNM1B-depleted HeLa cells, indicating that hSNM1B is probably not a part of the Fanconi anemia core complex. Nonetheless, similarities in the phenotype of hSNM1B-depleted cells and cultured cells from patients suffering from Fanconi anemia make hSNM1B a candidate for one of the as yet unidentified Fanconi anemia genes not involved in monoubiquitination of FANCD2.
Burwitz, Benjamin J; Reed, Jason S; Hammond, Katherine B; Ohme, Merete A; Planer, Shannon L; Legasse, Alfred W; Ericsen, Adam J; Richter, Yoram; Golomb, Gershon; Sacha, Jonah B
2014-09-01
Nonhuman primates are critical animal models for the study of human disorders and disease and offer a platform to assess the role of immune cells in pathogenesis via depletion of specific cellular subsets. However, this model is currently hindered by the lack of reagents that safely and specifically ablate myeloid cells of the monocyte/macrophage Lin. Given the central importance of macrophages in homeostasis and host immunity, development of a macrophage-depletion technique in nonhuman primates would open new avenues of research. Here, using LA at i.v. doses as low as 0.1 mg/kg, we show a >50% transient depletion of circulating monocytes and tissue-resident macrophages in RMs by an 11-color flow cytometric analysis. Diminution of monocytes was followed rapidly by emigration of monocytes from the bone marrow, leading to a rebound of monocytes to baseline levels. Importantly, LA was well-tolerated, as no adverse effects or changes in gross organ function were observed during depletion. These results advance the ex vivo study of myeloid cells by flow cytometry and pave the way for in vivo studies of monocyte/macrophage biology in nonhuman primate models of human disease. © 2014 Society for Leukocyte Biology.
Burwitz, Benjamin J.; Reed, Jason S.; Hammond, Katherine B.; Ohme, Merete A.; Planer, Shannon L.; Legasse, Alfred W.; Ericsen, Adam J.; Richter, Yoram; Golomb, Gershon; Sacha, Jonah B.
2014-01-01
Nonhuman primates are critical animal models for the study of human disorders and disease and offer a platform to assess the role of immune cells in pathogenesis via depletion of specific cellular subsets. However, this model is currently hindered by the lack of reagents that safely and specifically ablate myeloid cells of the monocyte/macrophage Lin. Given the central importance of macrophages in homeostasis and host immunity, development of a macrophage-depletion technique in nonhuman primates would open new avenues of research. Here, using LA at i.v. doses as low as 0.1 mg/kg, we show a >50% transient depletion of circulating monocytes and tissue-resident macrophages in RMs by an 11-color flow cytometric analysis. Diminution of monocytes was followed rapidly by emigration of monocytes from the bone marrow, leading to a rebound of monocytes to baseline levels. Importantly, LA was well-tolerated, as no adverse effects or changes in gross organ function were observed during depletion. These results advance the ex vivo study of myeloid cells by flow cytometry and pave the way for in vivo studies of monocyte/macrophage biology in nonhuman primate models of human disease. PMID:24823811
NASA Astrophysics Data System (ADS)
Pinto, Victor A.; Kim, Hee-Jeong; Lyons, Larry R.; Bortnik, Jacob
2018-02-01
We have identified 61 relativistic electron enhancement events and 21 relativistic electron persistent depletion events during 1996 to 2006 from the Geostationary Operational Environmental Satellite (GOES) 8 and 10 using data from the Energetic Particle Sensor (EPS) >2 MeV fluxes. We then performed a superposed epoch time analysis of the events to find the characteristic solar wind parameters that determine the occurrence of such events, using the OMNI database. We found that there are clear differences between the enhancement events and the persistent depletion events, and we used these to establish a set of threshold values in solar wind speed, proton density and interplanetary magnetic field (IMF) Bz that can potentially be useful to predict sudden increases in flux. Persistent depletion events are characterized by a low solar wind speed, a sudden increase in proton density that remains elevated for a few days, and a northward turning of IMF Bz shortly after the depletion starts. We have also found that all relativistic electron enhancement or persistent depletion events occur when some geomagnetic disturbance is present, either a coronal mass ejection or a corotational interaction region; however, the storm index, SYM-H, does not show a strong connection with relativistic electron enhancement events or persistent depletion events. We have tested a simple threshold method for predictability of relativistic electron enhancement events using data from GOES 11 for the years 2007-2010 and found that around 90% of large increases in electron fluxes can be identified with this method.
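A minimal sketch of the kind of threshold test described above is given below; the threshold values and the sample solar wind readings are placeholders, not the thresholds derived from the GOES/OMNI analysis.

```python
# A hedged sketch of the simple threshold idea described above: flag a possible
# relativistic electron enhancement when solar wind speed, proton density and
# IMF Bz satisfy fixed thresholds. The thresholds below are placeholders,
# not those derived in the study.
def enhancement_candidate(speed_km_s, density_cm3, bz_nT,
                          v_thresh=500.0, n_thresh=5.0, bz_thresh=-2.0):
    """Return True if the solar wind sample crosses all assumed thresholds."""
    return (speed_km_s > v_thresh) and (density_cm3 < n_thresh) and (bz_nT < bz_thresh)

# Example hourly OMNI-like samples (speed km/s, density cm^-3, Bz nT) - invented.
samples = [(620.0, 3.1, -4.5), (380.0, 9.8, 1.2), (540.0, 4.0, -1.0)]
print([enhancement_candidate(*s) for s in samples])  # [True, False, False]
```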
Potential and timescales for oxygen depletion in coastal upwelling systems: A box-model analysis
NASA Astrophysics Data System (ADS)
Harrison, C. S.; Hales, B.; Siedlecki, S.; Samelson, R. M.
2016-05-01
A simple box model is used to examine oxygen depletion in an idealized ocean-margin upwelling system. Near-bottom oxygen depletion is controlled by a competition between flushing with oxygenated offshore source waters and respiration of particulate organic matter produced near the surface and retained near the bottom. Upwelling-supplied nutrients are consumed in the surface box, and some surface particles sink to the bottom, where their respiration consumes oxygen. Steady states characterize the potential for hypoxic near-bottom oxygen depletion; this potential is greatest for faster sinking rates, and largely independent of production timescales except in that faster production allows faster sinking. Timescales for oxygen depletion depend on upwelling and productivity differently, however, as oxygen depletion can only be reached in meaningfully short times when productivity is rapid. Hypoxia thus requires fast production, to capture upwelled nutrients, and fast sinking, to deliver the respiration potential to model bottom waters. Combining timescales allows generalizations about tendencies toward hypoxia. If timescales of sinking are comparable to or smaller than the sum of those for respiration and flushing, the steady state will generally be hypoxic, and results indicate optimal timescales and conditions exist to generate hypoxia. For example, the timescale for approach to hypoxia lengthens with stronger upwelling, since surface particles and nutrients are shunted off-shelf, in turn reducing subsurface respiration and oxygen depletion. This suggests that if upwelling winds intensify with climate change, the increased forcing could offer mitigation of coastal hypoxia, even as the oxygen levels in upwelled source waters decline.
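The competition between flushing and respiration can be illustrated with a one-equation bottom-box sketch like the one below; the rate constants, units, and initial conditions are invented for illustration and are not the parameters of the study's box model.

```python
# A minimal bottom-box sketch of the competition described above: near-bottom
# oxygen is flushed in from an oxygenated offshore source and consumed by
# respiration of sinking organic matter. All constants are placeholders chosen
# only to illustrate the timescale competition, not the study's values.
import numpy as np

def bottom_oxygen(t_days, o2_offshore=250.0, flush_rate=0.05,
                  respiration_rate=10.0, o2_initial=250.0):
    """Analytic solution of dO2/dt = flush_rate*(O2_off - O2) - respiration_rate."""
    o2_steady = o2_offshore - respiration_rate / flush_rate
    return o2_steady + (o2_initial - o2_steady) * np.exp(-flush_rate * t_days)

for t in (0.0, 10.0, 30.0, 60.0):
    print(f"day {t:4.0f}: O2 = {bottom_oxygen(t):6.1f} mmol m^-3")
```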
Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
David W. Nigg, Principal Investigator; Kevin A. Steuhm, Project Manager
Legacy computational reactor physics software tools and protocols currently used for support of Advanced Test Reactor (ATR) core fuel management and safety assurance, and to some extent, experiment management, are inconsistent with the state of modern nuclear engineering practice, and are difficult, if not impossible, to properly verify and validate (V&V) according to modern standards. Furthermore, the legacy staff knowledge required for application of these tools and protocols from the 1960s and 1970s is rapidly being lost due to staff turnover and retirements. In late 2009, the Idaho National Laboratory (INL) initiated a focused effort, the ATR Core Modeling Update Project, to address this situation through the introduction of modern high-fidelity computational software and protocols. This aggressive computational and experimental campaign will have a broad strategic impact on the operation of the ATR, both in terms of improved computational efficiency and accuracy for support of ongoing DOE programs as well as in terms of national and international recognition of the ATR National Scientific User Facility (NSUF). The ATR Core Modeling Update Project, targeted for full implementation in phase with the next anticipated ATR Core Internals Changeout (CIC) in the 2014-2015 time frame, began during the last quarter of Fiscal Year 2009, and has just completed its third full year. Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL under various licensing arrangements. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009, Cycle 145A through Cycle 151B, was successfully completed during 2012. This major effort supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR Core Safety Analysis Package (CSAP) preparation process, in parallel with the established PDQ-based methodology, beginning late in Fiscal Year 2012. Acquisition of the advanced SERPENT (VTT-Finland) and MC21 (DOE-NR) Monte Carlo stochastic neutronics simulation codes was also initiated during the year and some initial applications of SERPENT to ATRC experiment analysis were demonstrated. These two new codes will offer significant additional capability, including the possibility of full-3D Monte Carlo fuel management support capabilities for the ATR at some point in the future. Finally, a capability for rigorous sensitivity analysis and uncertainty quantification based on the TSUNAMI system has been implemented and initial computational results have been obtained. This capability will have many applications as a tool for understanding the margins of uncertainty in the new models as well as for validation experiment design and interpretation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zittel, P.F.
1994-09-10
The solid-fuel rocket motors of large space launch vehicles release gases and particles that may significantly affect stratospheric ozone densities along the vehicle's path. In this study, standard rocket nozzle and flowfield computer codes have been used to characterize the exhaust gases and particles through the afterburning region of the solid-fuel motors of the Titan IV launch vehicle. The models predict that a large fraction of the HCl gas exhausted by the motors is converted to Cl and Cl2 in the plume afterburning region. Estimates of the subsequent chemistry suggest that on expansion into the ambient daytime stratosphere, the highly reactive chlorine may significantly deplete ozone in a cylinder around the vehicle track that ranges from 1 to 5 km in diameter over the altitude range of 15 to 40 km. The initial ozone depletion is estimated to occur on a time scale of less than 1 hour. After the initial effects, the dominant chemistry of the problem changes, and new models are needed to follow the further expansion, or closure, of the ozone hole on a longer time scale.
GRABGAM Analysis of Ultra-Low-Level HPGe Gamma Spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winn, W.G.
The GRABGAM code has been used successfully for ultra-low level HPGe gamma spectrometry analysis since its development in 1985 at Savannah River Technology Center (SRTC). Although numerous gamma analysis codes existed at that time, reviews of institutional and commercial codes indicated that none addressed all features that were desired by SRTC. Furthermore, it was recognized that development of an in-house code would better facilitate future evolution of the code to address SRTC needs based on experience with low-level spectra. GRABGAM derives its name from Gamma Ray Analysis BASIC Generated At MCA/PC.
Thermo-chemical evolution of a one-plate planet: application to Mars
NASA Astrophysics Data System (ADS)
Plesa, A.-C.; Breuer, D.
2012-04-01
Little attention has been devoted so far to finding a modelling framework able to explain the geophysical implications of the Martian meteorites, the so-called SNC meteorites. Geochemical analysis of the SNC meteorites implies the rapid formation, i.e. before ~4.5 Ga, of three to four isotopically distinct reservoirs that did not remix since then [3]. In [4] the authors argue that a fast overturn of an early fractionated magma ocean may have given origin to a stably stratified mantle with a large density gradient capable of keeping the mantle heterogeneous and preventing mixing due to thermal convection. This model, albeit capable of providing a plausible explanation for the SNC meteorites, suggests a conductive mantle after the overturn, which is clearly at odds with the volcanic history of Mars. This is best explained by assuming a convective mantle and partial melting as the principal agents responsible for the generation and evolution of Martian volcanism. In this work, we present an alternative scenario assuming a homogeneous mantle and accounting for compositional changes and melting temperature variations due to mantle depletion, dehydration stiffening of the mantle material due to water partitioning from the minerals into the melt, redistribution of radioactive heat sources between mantle and crust, and thermal conductivity decrease in crustal regions. We use the 2D cylindrical - 3D spherical convection code Gaia [1, 2], and to model the above mentioned effects of partial melting we use a Lagrangian, particle-based method. Simulation results show that chemical reservoirs, which can be formed due to partial melting when accounting for compositional changes and dehydration stiffening, remain stable over the entire thermal evolution of Mars. However, an initially depleted (i.e. buoyant harzburgite) layer of about 200 km is needed. This depleted layer in an otherwise homogeneous mantle may be the consequence of equilibrium fractionation of a freezing magma ocean where only the residual melt rises to the surface. If the heat released by accretion never allowed for a magma ocean to build up, a large amount of partial melting of about 20% in the earliest stage is required to form such a buoyant layer. These models show an active convective interior and long-lived partial melt production, which agrees with the volcanic history of Mars [5].
Countering the Consequences of Ego Depletion: The Effects of Self-Talk on Selective Attention.
Gregersen, Jón; Hatzigeorgiadis, Antonis; Galanis, Evangelos; Comoutos, Nikos; Papaioannou, Athanasios
2017-06-01
This study examined the effects of a self-talk intervention on selective attention in a state of ego depletion. Participants were 62 undergraduate students with a mean age of 20.02 years (SD = 1.17). The experiment was conducted in four consecutive sessions. Following baseline assessment, participants were randomly assigned into experimental and control groups. A two-session training was conducted for the two groups, with the experimental group using self-talk. In the final assessment, participants performed a selective attention test, including visual and auditory components, following a task inducing a state of ego depletion. The analysis showed that participants of the experimental group achieved a higher percentage of correct responses on the visual test and produced faster reaction times in both the visual and the auditory test compared with participants of the control group. The results of this study suggest that the use of self-talk can benefit selective attention for participants in states of ego depletion.
Integrated Composite Analyzer (ICAN): Users and programmers manual
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Chamis, C. C.
1986-01-01
The use of and relevant equations programmed in a computer code designed to carry out a comprehensive linear analysis of multilayered fiber composites is described. The analysis contains the essential features required to effectively design structural components made from fiber composites. The inputs to the code are constituent material properties, factors reflecting the fabrication process, and composite geometry. The code performs micromechanics, macromechanics, and laminate analysis, including the hygrothermal response of fiber composites. The code outputs are the various ply and composite properties, composite structural response, and composite stress analysis results with details on failure. The code is in Fortran IV and can be used efficiently as a package in complex structural analysis programs. The input-output format is described extensively through the use of a sample problem. The program listing is also included. The code manual consists of two parts.
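As a hedged illustration of the micromechanics step such a code performs, the sketch below applies the textbook rule of mixtures to estimate ply moduli from constituent properties; these relations and the constituent values are generic assumptions, not the specific equations programmed in ICAN.

```python
# A hedged illustration of the micromechanics step mentioned above. This is
# the textbook rule of mixtures for a unidirectional ply, not the exact
# relations coded in ICAN; the constituent properties are generic values.
def ply_longitudinal_modulus(E_fiber, E_matrix, fiber_volume_fraction):
    """Rule of mixtures: E1 = Vf*Ef + (1 - Vf)*Em."""
    return fiber_volume_fraction * E_fiber + (1.0 - fiber_volume_fraction) * E_matrix

def ply_transverse_modulus(E_fiber, E_matrix, fiber_volume_fraction):
    """Inverse rule of mixtures: 1/E2 = Vf/Ef + (1 - Vf)/Em."""
    vf = fiber_volume_fraction
    return 1.0 / (vf / E_fiber + (1.0 - vf) / E_matrix)

E_f, E_m, Vf = 230.0e9, 3.5e9, 0.6   # Pa, Pa, dimensionless (assumed)
print(ply_longitudinal_modulus(E_f, E_m, Vf) / 1e9, "GPa")
print(ply_transverse_modulus(E_f, E_m, Vf) / 1e9, "GPa")
```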
2016-10-01
KHDRBS3 and SRSF12 on tumor progression and metastasis (Task1). We analyzed the effect of KHDRBS3 depletion on the growth and migration properties of the...tumor growth during the second year of this project. Continued analysis of the splicing factor expression in primary tumor samples further supports...depletion on tumor initiation, growth and metastasis. Keywords Pre-mRNA splicing, breast cancer, KHDRBS3, SRPK1, SRSF12, metastasis
Holewinski, Ronald J; Jin, Zhicheng; Powell, Matthew J; Maust, Matthew D; Van Eyk, Jennifer E
2013-03-01
Analysis of serum and plasma proteomes is a common approach for biomarker discovery, and the removal of high-abundant proteins, such as albumin and immunoglobins, is usually the first step in the analysis. However, albumin binds peptides and proteins, which raises concerns as to how the removal of albumin could impact the outcome of the biomarker study while ignoring the possibility that this could be a biomarker subproteome itself. The first goal of this study was to test a new commercially available affinity capture reagent from Protea Biosciences and to compare the efficiency and reproducibility to four other commercially available albumin depletion methods. The second goal of this study was to determine if there is a highly efficient albumin depletion/isolation system that minimizes sample handling and would be suitable for large numbers of samples. Two of the methods tested (Sigma and ProteaPrep) showed an albumin depletion efficiency of 97% or greater for both serum and cerebrospinal fluid (CSF). Isolated serum and CSF albuminomes from ProteaPrep spin columns were analyzed directly by LC-MS/MS, identifying 128 serum (45 not previously reported) and 94 CSF albuminome proteins (17 unique to the CSF albuminome). Serum albuminome was also isolated using Vivapure anti-HSA columns for comparison, identifying 105 proteins, 81 of which overlapped with the ProteaPrep method. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Rocha, Raquel; Santana, Genoile Oliveira; Almeida, Neogélia; Lyra, Andre Castro
2009-03-01
Inflammatory bowel disease (IBD) is often associated with malnutrition. The aim of this study was to compare the body composition of outpatients with IBD during remission and the active phase. In order to evaluate disease activity we used the Crohn's Disease Activity Index for Crohn's disease (CD) patients and Lichtiger's Index for ulcerative colitis (UC) patients. All patients underwent analysis of BMI, arm muscle area (AMA) and triceps plus subscapular skinfold thickness (TST+SST) to assess total, muscle and fat mass, respectively. In total 102 patients were evaluated (CD, n = 50; UC, n = 52), and the majority were young women. Malnutrition according to BMI was found in 14.0 % of patients with CD and 5.7 % of UC patients. Muscle mass depletion was detected in more than half of the CD and UC patients. The BMI, TST+SST and AMA values were lower in the active phase only in CD patients (P < 0.05). Fat mass depletion was associated with the active phase in both CD and UC patients. Body composition parameters obtained using BMI, TST+SST and AMA were not correlated with the presence of fistula in CD patients (P > 0.05). In conclusion, patients without signs of malnutrition had fat mass depletion, especially in the active phase, and muscle mass depletion occurred in both CD and UC patients.
NASA Technical Reports Server (NTRS)
Collinet, M.; Medard, E.; Devouard, B.; Peslier, A.
2012-01-01
Martian basalts can be classified into at least two geochemically different families: enriched and depleted shergottites. Enriched shergottites are characterized by higher incompatible element concentrations and initial Sr-87/Sr-86 and lower initial Nd-143/Nd-144 and Hf-176/Hf-177 than depleted shergottites [e.g. 1, 2]. It is now generally accepted that shergottites result from the melting of at least two distinct mantle reservoirs [e.g. 2, 3]. Some of the olivine-phyric shergottites (either depleted or enriched), the most magnesian Martian basalts, could represent primitive melts, which are of considerable interest for constraining mantle sources. Two depleted olivine-phyric shergottites, Yamato (Y) 980459 and Northwest Africa (NWA) 5789, are in equilibrium with their most magnesian olivine (Fig. 1) and their bulk rock compositions are inferred to represent primitive melts [4, 5]. Larkman Nunatak (LAR) 06319 [3, 6, 7] and NWA 1068 [8], the most magnesian enriched basalts, have bulk Mg# that are too high to be in equilibrium with their olivine megacryst cores. Parental melt compositions have been estimated by subtracting the most magnesian olivine from the bulk rock composition, assuming that olivine megacrysts have partially accumulated [3, 9]. However, because this technique does not account for the actual petrography of these meteorites, we used image analysis to study the history of these rocks, reconstruct their parent magmas and understand the nature of the olivine megacrysts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brantley, P S
2006-09-27
We describe an asymptotic analysis of the coupled nonlinear system of equations describing time-dependent three-dimensional monoenergetic neutron transport and isotopic depletion and radioactive decay. The classic asymptotic diffusion scaling of Larsen and Keller [1], along with a consistent small scaling of the terms describing the radioactive decay of isotopes, is applied to this coupled nonlinear system of equations in a medium of specified initial isotopic composition. The analysis demonstrates that to leading order the neutron transport equation limits to the standard time-dependent neutron diffusion equation with macroscopic cross sections whose number densities are determined by the standard system of ordinary differential equations, the so-called Bateman equations, describing the temporal evolution of the nuclide number densities.
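For readers unfamiliar with the Bateman equations mentioned above, the sketch below integrates a generic three-nuclide linear chain dN/dt = A N with a matrix exponential; the chain, decay constants, and one-group reaction rates are illustrative assumptions only.

```python
# A small sketch of the Bateman-type system referred to above: nuclide number
# densities evolve under decay and neutron absorption, dN/dt = A N, with A
# built from decay constants and one-group reaction rates. The chain and all
# coefficients here are a generic three-nuclide example, not real data.
import numpy as np
from scipy.linalg import expm

lam = np.array([1.0e-5, 4.0e-6, 0.0])            # decay constants (1/s), assumed
sigma_phi = np.array([2.0e-9, 1.0e-9, 5.0e-10])  # one-group absorption rates (1/s), assumed

# Simple linear chain: nuclide 0 -> 1 -> 2 by decay/capture (illustrative only).
A = np.zeros((3, 3))
A[0, 0] = -(lam[0] + sigma_phi[0])
A[1, 0] = lam[0] + sigma_phi[0]
A[1, 1] = -(lam[1] + sigma_phi[1])
A[2, 1] = lam[1] + sigma_phi[1]
A[2, 2] = -sigma_phi[2]

N0 = np.array([1.0e24, 0.0, 0.0])        # initial number densities (1/cm^3)
N_after = expm(A * 86400.0 * 30) @ N0    # matrix exponential over 30 days
print(N_after)
```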
Reliable absolute analog code retrieval approach for 3D measurement
NASA Astrophysics Data System (ADS)
Yu, Shuang; Zhang, Jing; Yu, Xiaoyang; Sun, Xiaoming; Wu, Haibin; Chen, Deyun
2017-11-01
The wrapped phase of the phase-shifting approach can be unwrapped by using Gray code, but both the wrapped phase error and the Gray code decoding error can result in period jump errors, which lead to gross measurement error. Therefore, this paper presents a reliable absolute analog code retrieval approach. The combination of unequal-period Gray code and phase shifting patterns at high frequencies is used to obtain a high-frequency absolute analog code, and at low frequencies, the same unequal-period combination patterns are used to obtain the low-frequency absolute analog code. Next, the difference between the two absolute analog codes is employed to eliminate period jump errors, so that a reliable unwrapped result can be obtained. Error analysis was used to determine the applicable conditions, and the approach was verified both theoretically and experimentally. Theoretical analysis and experimental results demonstrate that the proposed approach can perform reliable analog code unwrapping.
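The basic building block of such approaches, forming an absolute phase from a wrapped phase plus a Gray-code period index, can be sketched as follows; the pixel values are synthetic, and the published method additionally differences two absolute analog codes obtained at different pattern periods to suppress period-jump errors.

```python
# A schematic sketch of combining a wrapped phase with a decoded Gray-code
# period index to form an absolute (unwrapped) phase, the basic step that the
# approach above builds on. Values are synthetic and purely illustrative.
import numpy as np

def gray_to_binary(g):
    """Decode a standard reflected Gray code integer to its binary value."""
    b = g
    while g:
        g >>= 1
        b ^= g
    return b

wrapped_phase = np.array([5.9, 0.4, 3.2, 1.0])          # radians in [0, 2*pi), assumed
gray_period   = np.array([0b000, 0b001, 0b001, 0b011])  # decoded Gray pattern per pixel, assumed

period_index = np.array([gray_to_binary(int(g)) for g in gray_period])
absolute_phase = wrapped_phase + 2.0 * np.pi * period_index
print(absolute_phase)
```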
Gombeau, Kewin; Murat El Houdigui, Sophia; Floriani, Magali; Camilleri, Virginie; Cavalie, Isabelle; Adam-Guillermin, Christelle
2017-01-01
Uranium is an actinide naturally found in the environment. Anthropogenic activities lead to the release of increasing amounts of uranium and depleted uranium (DU) into the environment, posing potential risks to aquatic organisms due to the radiological and chemical toxicity of this radionuclide. Although environmental contaminations with high levels of uranium have already been observed, chronic exposures of non-human species to levels close to the environmental quality standards remain scarcely characterized. The present study focused on the identification of the molecular pathways impacted by a chronic exposure of zebrafish to 20 μg/L of DU during 10 days. The transcriptomic effects were evaluated by mRNA-seq analysis in three organs of adult zebrafish, the brain, the testis and the ovaries, and at two developmental stages of the adult fish progeny, two-cell embryos and four-day larvae. The results highlight generic effects on the cell adhesion process, but also specific transcriptomic responses depending on the organ or the developmental stage investigated. The analysis of the transgenerational effects of DU exposure on the four-day zebrafish larvae demonstrates an induction of genes involved in the oxidative response (cat, mpx, sod1 and sod2), a decrease in expression of the two hatching enzymes (he1a and he1b), deregulation of the expression of genes coding for the ATPase complex and the induction of cellular stress. Electron microscopy analysis of skeletal muscles of the four-day larvae highlights significant histological impacts on the ultrastructure of both the mitochondria and the myofibres. In addition, the comparison with the transcriptomic data obtained for the acetylcholinesterase mutant reveals the induction of protein chaperones in the skeletal muscles of the progeny of fish chronically exposed to DU, pointing towards long-lasting effects of this chemical in the muscles. The results presented in this study support the hypothesis that a chronic parental exposure to an environmentally relevant concentration of DU could impair progeny development, with significant effects observed both at the molecular level and on the histological ultrastructure of organs. This study provides a comprehensive transcriptomic dataset useful for ecotoxicological studies on other fish species at the molecular level. It also provides a key DU-responsive gene, egr1, which may be a candidate biomarker for monitoring aquatic pollution by heavy metals. PMID:28531178
A Semantic Analysis Method for Scientific and Engineering Code
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.
1998-01-01
This paper develops a procedure to statically analyze aspects of the meaning or semantics of scientific and engineering code. The analysis involves adding semantic declarations to a user's code and parsing this semantic knowledge with the original code using multiple expert parsers. These semantic parsers are designed to recognize formulae in different disciplines including physical and mathematical formulae and geometrical position in a numerical scheme. In practice, a user would submit code with semantic declarations of primitive variables to the analysis procedure, and its semantic parsers would automatically recognize and document some static, semantic concepts and locate some program semantic errors. A prototype implementation of this analysis procedure is demonstrated. Further, the relationship between the fundamental algebraic manipulations of equations and the parsing of expressions is explained. This ability to locate some semantic errors and document semantic concepts in scientific and engineering code should reduce the time, risk, and effort of developing and using these codes.
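A toy example of what parsing semantic declarations might enable is dimensional-consistency checking, sketched below; the declaration format, variable names, and the check itself are hypothetical simplifications, not the paper's expert parsers.

```python
# A toy illustration (not the paper's parsers) of checking one physical formula
# against semantic declarations: each variable carries declared SI dimension
# exponents and a product/assignment is checked for consistency.
from collections import Counter

# Semantic declarations (assumed format): variable name -> dimension exponents.
declarations = {
    "force":        Counter({"kg": 1, "m": 1, "s": -2}),
    "mass":         Counter({"kg": 1}),
    "acceleration": Counter({"m": 1, "s": -2}),
}

def dims_of_product(*names):
    """Sum dimension exponents of the named variables (i.e. multiply quantities)."""
    total = Counter()
    for n in names:
        total.update(declarations[n])
    return {k: v for k, v in total.items() if v != 0}

def canonical(counter):
    return {k: v for k, v in counter.items() if v != 0}

# Check the statement "force = mass * acceleration" for dimensional consistency.
print(dims_of_product("mass", "acceleration") == canonical(declarations["force"]))  # True
```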
NASA Astrophysics Data System (ADS)
Kraljić, K.; Strüngmann, L.; Fimmel, E.; Gumbel, M.
2018-01-01
The genetic code is degenerate and it is assumed that redundancy provides error detection and correction mechanisms in the translation process. However, the biological meaning of the code's structure is still under current research. This paper presents a Genetic Code Analysis Toolkit (GCAT) which provides workflows and algorithms for the analysis of the structure of nucleotide sequences. In particular, sets or sequences of codons can be transformed and tested for circularity, comma-freeness, dichotomic partitions and others. GCAT comes with a fertile editor custom-built to work with the genetic code and a batch mode for multi-sequence processing. With the ability to read FASTA files or load sequences from GenBank, the tool can be used for the mathematical and statistical analysis of existing sequence data. GCAT is Java-based and provides a plug-in concept for extensibility. Availability: open source. Homepage: http://www.gcat.bio/
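One of the properties GCAT can test, comma-freeness of a codon set, can be sketched in a few lines; the example codon sets are arbitrary and the implementation below is an independent illustration rather than GCAT's Java code.

```python
# A small sketch of one of the tests mentioned above: checking whether a set of
# codons is comma-free, i.e. no codon of the set occurs at a shifted position
# inside the concatenation of any two codons of the set. The example sets are
# arbitrary and chosen only to demonstrate the check.
from itertools import product

def is_comma_free(codons):
    codon_set = set(codons)
    for a, b in product(codon_set, repeat=2):
        pair = a + b
        # Examine the two out-of-frame trinucleotides inside the concatenation.
        for shift in (1, 2):
            if pair[shift:shift + 3] in codon_set:
                return False
    return True

print(is_comma_free({"ACG", "TAC"}))   # True for this small example set
print(is_comma_free({"AAA"}))          # False: AAA overlaps itself out of frame
```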
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gleicher, Frederick; Ortensi, Javier; DeHart, Mark
Accurate calculation of desired quantities to predict fuel behavior requires the solution of interlinked equations representing different physics. Traditional fuels performance codes often rely on internal empirical models for the pin power density and a simplified boundary condition on the cladding edge. These simplifications are performed because of the difficulty of coupling applications or codes on differing domains and mapping the required data. To demonstrate an approach closer to first principles, the neutronics application Rattlesnake and the thermal hydraulics application RELAP-7 were coupled to the fuels performance application BISON under the master application MAMMOTH. A single fuel pin was modeled based on the dimensions of a Westinghouse 17x17 fuel rod. The simulation consisted of a depletion period of 1343 days, roughly equal to three full operating cycles, followed by a station blackout (SBO) event. The fuel rod was depleted for 1343 days at a near-constant total power of 65.81 kW. After 1343 days the fission power was reduced to zero (simulating a reactor shut-down). Decay heat calculations provided the time-varying energy source after this time. For this problem, Rattlesnake, BISON, and RELAP-7 are coupled under MAMMOTH in a split operator approach. Each system solves its physics on a separate mesh and, for RELAP-7 and BISON, on only a subset of the full problem domain. Rattlesnake solves the neutronics over the whole domain, which includes the fuel, cladding, gaps, water, and top and bottom rod holders. BISON is applied to the fuel and cladding with a 2D axi-symmetric domain, and RELAP-7 is applied to the flow of the circular outer water channel with a set of 1D flow equations. The mesh on the Rattlesnake side can either be 3D (for low order transport) or 2D (for diffusion). BISON has a matching ring structure mesh for the fuel so both the power density and local burn-up are copied accurately from Rattlesnake. At each depletion time step, Rattlesnake calculates a power density, fission density rate, burn-up distribution and fast flux based on the current water density and fuel temperature. These are then mapped to the BISON mesh for a fuels performance solve. BISON calculates the fuel temperature and cladding surface temperature based upon the current power density and bulk fluid temperature. RELAP-7 then calculates the fluid temperature, water density fraction and water phase velocity based upon the cladding surface temperature. The fuel temperature and the fluid density are then passed back to Rattlesnake for another neutronics calculation. Six Picard or fixed-point style iterations are performed in this manner to obtain consistent, tightly coupled and stable results. For this paper, a set of results from the detailed calculation is provided both during depletion and for the SBO event. We demonstrate that a detailed calculation closer to first principles can be done under MAMMOTH between different applications on differing domains.
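The split-operator Picard iteration described above can be caricatured with three surrogate single-physics solves exchanging fields until the coupled state settles; every function and coefficient below is an invented stand-in for Rattlesnake, BISON, and RELAP-7, intended only to show the data flow of the six fixed-point iterations.

```python
# A highly simplified sketch of the split-operator Picard iteration described
# above: three single-physics "solves" exchange fields until the coupled state
# stops changing. The surrogate functions and coefficients are invented stand-ins.
def neutronics_solve(fuel_temp_K, coolant_density):
    # Power falls slightly with fuel temperature (Doppler-like feedback, toy model).
    return 65.81e3 * (1.0 - 1.0e-5 * (fuel_temp_K - 900.0)) * coolant_density / 0.7

def fuel_performance_solve(power_W, bulk_fluid_temp_K):
    # Fuel and cladding temperatures rise with power (toy thermal model).
    fuel_temp = bulk_fluid_temp_K + 4.0e-3 * power_W
    clad_temp = bulk_fluid_temp_K + 5.0e-4 * power_W
    return fuel_temp, clad_temp

def thermal_hydraulics_solve(clad_temp_K):
    bulk_fluid_temp = 560.0 + 0.02 * (clad_temp_K - 600.0)
    coolant_density = 0.75 - 1.0e-4 * (bulk_fluid_temp - 560.0)
    return bulk_fluid_temp, coolant_density

fuel_T, fluid_T, rho = 900.0, 560.0, 0.7
for it in range(6):  # six fixed-point iterations, as in the coupled calculation
    power = neutronics_solve(fuel_T, rho)
    fuel_T, clad_T = fuel_performance_solve(power, fluid_T)
    fluid_T, rho = thermal_hydraulics_solve(clad_T)
    print(f"iter {it}: power={power:10.1f} W, fuel T={fuel_T:7.1f} K, rho={rho:6.4f}")
```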
FEAMAC/CARES Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Bhatt, Ramakrishna
2016-01-01
Reported here is a coupling of two NASA developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/ Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code and an example problem (taken from the open literature) of a laminated CMC in off-axis loading is shown. FEAMAC/CARES performs stochastic-strength-based damage simulation response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.
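The stochastic-strength idea can be sketched as sampling element strengths from a Weibull distribution and knocking down the stiffness of elements whose stress exceeds their sampled strength; the parameters and the simple damage rule below are assumptions for illustration, not the CARES/MAC-GMC formulation.

```python
# A hedged sketch of the stochastic-strength idea behind the coupled code:
# element strengths are sampled from a Weibull distribution and the stiffness
# of any element whose stress exceeds its sampled strength is reduced.
# Parameters and the damage rule are illustrative only.
import numpy as np

rng = np.random.default_rng(seed=0)
n_elements = 8
weibull_modulus, scale_MPa = 10.0, 350.0   # assumed Weibull parameters

# Sample a strength for each element from the two-parameter Weibull distribution.
strengths = scale_MPa * rng.weibull(weibull_modulus, size=n_elements)

stiffness = np.full(n_elements, 1.0)            # normalized element stiffness
for applied_stress in (200.0, 300.0, 400.0):    # MPa, increasing load steps
    failed = applied_stress > strengths
    stiffness[failed] = 0.01                    # near-total stiffness reduction of failed elements
    print(f"{applied_stress:5.1f} MPa: {failed.sum()} of {n_elements} elements failed")
print("final stiffness:", stiffness)
```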
Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composite
NASA Technical Reports Server (NTRS)
Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu
2015-01-01
Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code and show an example problem (taken from the open literature) of a laminated CMC under off-axis loading. FEAMAC/CARES performs stochastic-strength-based damage simulation of the response of a CMC under multiaxial loading, using elastic stiffness reduction of the failed elements.
Qu, Wen; Cingolani, Pablo; Zeeberg, Barry R; Ruden, Douglas M
2017-01-01
Deep sequencing of cDNAs made from spliced mRNAs indicates that most coding genes in many animals and plants have pre-mRNA transcripts that are alternatively spliced. In pre-mRNAs, in addition to invariant exons that are present in almost all mature mRNA products, there are at least 6 additional types of exons, such as exons from alternative promoters or with alternative polyA sites, mutually exclusive exons, skipped exons, or exons with alternative 5' or 3' splice sites. Our bioinformatics-based hypothesis is that, in analogy to the genetic code, there is an "alternative-splicing code" in introns and flanking exon sequences that directs alternative splicing of many of the 36 types of introns. In humans, we identified 42 different consensus sequences that are each present in at least 100 human introns. Of these 42 top consensus sequences, 37 are significantly enriched or depleted in at least one of the 36 types of introns. We further supported our hypothesis by showing that 96 out of 96 analyzed human disease mutations that affect RNA splicing, and change alternative splicing from one class to another, can be partially explained by the mutation altering a consensus sequence from one type of intron to that of another type of intron. Some of the alternative-splicing consensus sequences, and presumably their small-RNA or protein targets, are evolutionarily conserved across 50 plant and animal species. We also noticed that the set of introns within a gene usually shares the same splicing codes, arguing that one sub-type of spliceosome might process all (or most) of the introns in a given gene. Our work sheds new light on a possible mechanism for generating the tremendous diversity in protein structure by alternative splicing of pre-mRNAs.
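As a toy illustration of the kind of enrichment test implied above (consensus sequences counted per intron class and tested for over- or under-representation), the sketch below compares motif occurrence between two sets of intron sequences with a Fisher exact test. The motif, sequences, and class labels are invented for illustration; they are not the consensus sequences identified in the study.

```python
# Toy enrichment test: is a candidate consensus motif over-represented in
# introns of one splicing class relative to another? Sequences and motif
# are invented for illustration only.
from scipy.stats import fisher_exact

def contains_motif(seq, motif):
    return motif in seq.upper()

def enrichment(introns_class_a, introns_class_b, motif):
    a_with = sum(contains_motif(s, motif) for s in introns_class_a)
    b_with = sum(contains_motif(s, motif) for s in introns_class_b)
    table = [[a_with, len(introns_class_a) - a_with],
             [b_with, len(introns_class_b) - b_with]]
    odds_ratio, p_value = fisher_exact(table)
    return odds_ratio, p_value

# Hypothetical data: skipped-exon introns vs constitutive introns.
skipped = ["GTAAGTTCTCTCTCAG", "GTAAGCTCTCTAAAAG", "GTGAGTTCTCTCGCAG"]
constitutive = ["GTAAGTAAGGACGCAG", "GTGAGTACGGTTGCAG", "GTAAGTGCCGTAACAG"]
print(enrichment(skipped, constitutive, "TCTCTC"))
```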
Aerothermo-Structural Analysis of Low Cost Composite Nozzle/Inlet Components
NASA Technical Reports Server (NTRS)
Shivakumar, Kuwigai; Challa, Preeli; Sree, Dave; Reddy, D.
1999-01-01
This research is a cooperative effort among the Turbomachinery and Propulsion Division of NASA Glenn, the CCMR of NC A&T State University, and Tuskegee University. NC A&T is the lead center and Tuskegee University is the participating institution. The objectives of the research were to develop an integrated aerodynamic, thermal, and structural analysis code for the design of aircraft engine components, such as nozzles and inlets, made of textile composites; to conduct design studies on typical inlets for hypersonic transportation vehicles and set up standard test examples; and finally to manufacture a scaled-down composite inlet. These objectives are accomplished through the following seven tasks: (1) identify the relevant public-domain codes for all three types of analysis; (2) evaluate the codes for accuracy of results and computational efficiency; (3) develop aero-thermal and thermal-structural mapping algorithms; (4) integrate all the codes into one single code; (5) write a graphical user interface to improve the user friendliness of the code; (6) conduct test studies for a rocket-based combined-cycle engine inlet; and finally (7) fabricate a demonstration inlet model using textile preform composites. Tasks one, two, and six are being pursued. NPARC was selected and evaluated for flow-field analysis, CSTEM for in-depth thermal analysis of inlets and nozzles, and FRAC3D for stress analysis. These codes have been independently verified for accuracy and performance. In addition, a graphical user interface based on micromechanics analysis for laminated as well as textile composites was developed. A demonstration of this code will be made at the conference. A rocket-based combined-cycle engine was selected for test studies. Flow-field analyses of various inlet geometries were performed. Integration of the codes is being continued. The codes developed are being applied to a candidate example of the trailblazer engine proposed for space transportation. Successful development of the code will provide a simpler, faster, and more user-friendly tool for conducting design studies of aircraft and spacecraft engines, applicable to high-speed civil transport and space missions.
An emulator for minimizing computer resources for finite element analysis
NASA Technical Reports Server (NTRS)
Melosh, R.; Utku, S.; Islam, M.; Salama, M.
1984-01-01
A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).
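SCOPE's internal cost model is not given in the abstract; the snippet below only illustrates the general idea of an emulator calibrated once per machine and analysis code, fitting coefficients that map problem-size descriptors to a CPU estimate. The descriptors, calibration data, and functional form are all invented assumptions for this sketch.

```python
# Illustrative emulator in the spirit of SCOPE: fit calibration coefficients
# once for a given machine + FE code, then predict resources for new problems.
# The problem descriptors and the model form are assumptions for this sketch.
import numpy as np

# Calibration runs: (number of equations, mean bandwidth) -> measured CPU seconds.
descriptors = np.array([[1.0e3, 50.0],
                        [5.0e3, 80.0],
                        [2.0e4, 120.0],
                        [1.0e5, 200.0]])
cpu_seconds = np.array([2.1, 14.0, 95.0, 1100.0])

# Simple model: cpu ~ c0 + c1 * n_eq + c2 * n_eq * bandwidth**2
def features(d):
    n_eq, bw = d[:, 0], d[:, 1]
    return np.column_stack([np.ones_like(n_eq), n_eq, n_eq * bw**2])

coeffs, *_ = np.linalg.lstsq(features(descriptors), cpu_seconds, rcond=None)

def predict_cpu(n_eq, bandwidth):
    return float((features(np.array([[n_eq, bandwidth]])) @ coeffs)[0])

print(f"Predicted CPU seconds: {predict_cpu(5.0e4, 150.0):.1f}")
```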
Juleff, Nicholas; Windsor, Miriam; Lefevre, Eric A.; Gubbins, Simon; Hamblin, Pip; Reid, Elizabeth; McLaughlin, Kerry; Beverley, Peter C. L.; Morrison, Ivan W.; Charleston, Bryan
2009-01-01
The role of T-lymphocyte subsets in recovery from foot-and-mouth disease virus (FMDV) infection in calves was investigated by administering subset-specific monoclonal antibodies. The depletion of circulating CD4+ or WC1+ γδ T cells was achieved for a period extending from before challenge to after resolution of viremia and peak clinical signs, whereas CD8+ cell depletion was only partial. The depletion of CD4+ cells was also confirmed by analysis of lymph node biopsy specimens 5 days postchallenge. Depletion with anti-WC1 and anti-CD8 antibodies had no effect on the kinetics of infection, clinical signs, or immune responses following FMDV infection. Three of the four CD4+ T-cell-depleted calves failed to generate an antibody response to the nonstructural polyprotein 3ABC but generated a neutralizing antibody response similar to that in the controls, including rapid isotype switching to immunoglobulin G antibody. We conclude that antibody responses to sites on the surface of the virus capsid are T cell independent, whereas those directed against the nonstructural proteins are T cell dependent. CD4 depletion was found to substantially inhibit antibody responses to the G-H peptide loop of VP1 (residues 135-156) on the viral capsid, indicating that responses to this particular site, which has a more mobile structure than other neutralizing sites on the virus capsid, are T cell dependent. The depletion of CD4+ T cells had no adverse effect on the magnitude or duration of clinical signs or clearance of virus from the circulation. Overall, we conclude that CD4+ T-cell-independent antibody responses play a major role in the resolution of foot-and-mouth disease in cattle. PMID:19176618
Park, Yun Yeon; Ahn, Ju-Hyun; Cho, Min-Guk; Lee, Jae-Ho
2018-04-27
ATP depletion inhibits cell cycle progression, especially during the G1 phase and the G2 to M transition. However, the effect of ATP depletion on mitotic progression remains unclear. We observed that the reduction of ATP after prometaphase by simultaneous treatment with 2-deoxyglucose and NaN3 did not arrest mitotic progression. Interestingly, ATP depletion during nocodazole-induced prometaphase arrest resulted in mitotic slippage, as indicated by a reduction in mitotic cells, APC/C-dependent degradation of cyclin B1, increased cell attachment, and increased nuclear membrane reassembly. Additionally, cells successfully progressed through the cell cycle after mitotic slippage, as indicated by EdU incorporation and time-lapse imaging. Although degradation of cyclin B during normal mitotic progression is primarily regulated by APC/C-Cdc20, we observed an unexpected decrease in Cdc20 prior to degradation of cyclin B during mitotic slippage. This decrease in Cdc20 was followed by a change in the binding partner preference of APC/C from Cdc20 to Cdh1; consequently, APC/C-Cdh1, but not APC/C-Cdc20, facilitated cyclin B degradation following ATP depletion. Pulse-chase analysis revealed that ATP depletion significantly abrogated global translation, including the translation of Cdc20 and Cdh1. Additionally, the half-life of Cdh1 was much longer than that of Cdc20. These data suggest that ATP depletion during mitotic arrest induces mitotic slippage facilitated by APC/C-Cdh1-dependent cyclin B degradation, which follows a decrease in Cdc20 resulting from reduced global translation and the differences in the half-lives of the Cdc20 and Cdh1 proteins.
Wake coupling to full potential rotor analysis code
NASA Technical Reports Server (NTRS)
Torres, Francisco J.; Chang, I-Chung; Oh, Byung K.
1990-01-01
The wake information from a helicopter forward flight code is coupled with two transonic potential rotor codes. The induced velocities for the near-, mid-, and far-wake geometries are extracted from a nonlinear rigid wake of a standard performance and analysis code. These, together with the corresponding inflow angles, computation points, and azimuth angles, are then incorporated into the transonic potential codes. The coupled codes can then provide an improved prediction of rotor blade loading at transonic speeds.
NMD3 regulates both mRNA and rRNA nuclear export in African trypanosomes via an XPO1-linked pathway
Bühlmann, Melanie; Walrad, Pegine; Rico, Eva; Ivens, Alasdair; Capewell, Paul; Naguleswaran, Arunasalam; Roditi, Isabel; Matthews, Keith R.
2015-01-01
Trypanosomes mostly regulate gene expression through post-transcriptional mechanisms, particularly mRNA stability. However, much mRNA degradation is cytoplasmic such that mRNA nuclear export must represent an important level of regulation. Ribosomal RNAs must also be exported from the nucleus and the trypanosome orthologue of NMD3 has been confirmed to be involved in rRNA processing and export, matching its function in other organisms. Surprisingly, we found that TbNMD3 depletion also generates mRNA accumulation of procyclin-associated genes (PAGs), these being co-transcribed by RNA polymerase I with the procyclin surface antigen genes expressed on trypanosome insect forms. By whole transcriptome RNA-seq analysis of TbNMD3-depleted cells we confirm the regulation of the PAG transcripts by TbNMD3 and using reporter constructs reveal that PAG1 regulation is mediated by its 5′UTR. Dissection of the mechanism of regulation demonstrates that it is not dependent upon translational inhibition mediated by TbNMD3 depletion nor enhanced transcription. However, depletion of the nuclear export factors XPO1 or MEX67 recapitulates the effects of TbNMD3 depletion on PAG mRNAs and mRNAs accumulated in the nucleus of TbNMD3-depleted cells. These results invoke a novel RNA regulatory mechanism involving the NMD3-dependent nuclear export of mRNA cargos, suggesting a shared platform for mRNA and rRNA export. PMID:25873624
Ogata, M; Noda, K; Akita, H; Ishibashi, H
2015-03-19
Rats with dopamine depletion caused by 6-hydroxydopamine (6-OHDA) treatment during adulthood and the neonatal period exhibit akinetic motor activity and spontaneous motor hyperactivity during adolescence, respectively, indicating that the behavioral effects of dopamine depletion depend on the period of lesion development. Dopamine depletion during adulthood induces a hyperalgesic response to mechanical, thermal, and/or chemical stimuli, whereas the effects of neonatal dopamine depletion on the nociceptive response in adolescent rats have yet to be examined. The latter aspect was addressed in this study, and behavioral responses were examined using von Frey, tail flick, and formalin tests. The formalin test revealed that rats with neonatal dopamine depletion exhibited a significant increase in nociceptive response during the interphase (6-15 min post formalin injection) and phase 2 (16-75 min post formalin injection). This increase in nociceptive response to the formalin injection was not reversed by pretreatment with methamphetamine, which ameliorates the motor hyperactivity observed in adolescent rats with neonatal 6-OHDA treatment. The von Frey filament and tail flick tests failed to reveal significant differences in withdrawal thresholds between neonatal 6-OHDA-treated and vehicle-treated rats. The spinal neuronal response to the formalin injection into the rat hind paw was also examined through immunohistochemical analysis of c-Fos protein. Significantly increased numbers of c-Fos-immunoreactive cells were observed in laminae I-II and V-VI of the spinal cord ipsilateral to the site of the formalin injection in rats with neonatal dopamine depletion compared with vehicle-treated rats. These results suggest that the dopaminergic neural system plays a crucial role in the development of a neural network for tonic pain, including the spinal neural circuit for nociceptive transmission, and that the mechanism underlying hyperalgesia to tonic pain is not always consistent with that of the spontaneous motor hyperactivity induced by neonatal dopamine depletion. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.
Landolina, Maurizio; Curnis, Antonio; Morani, Giovanni; Vado, Antonello; Ammendola, Ernesto; D'onofrio, Antonio; Stabile, Giuseppe; Crosato, Martino; Petracci, Barbara; Ceriotti, Carlo; Bontempi, Luca; Morosato, Martina; Ballari, Gian Paolo; Gasparini, Maurizio
2015-08-01
Device replacement at the time of battery depletion of implantable cardioverter-defibrillators (ICDs) may carry a considerable risk of complications and engenders costs for healthcare systems. Therefore, ICD longevity is extremely important from both a clinical and an economic standpoint. Cardiac resynchronization therapy defibrillator (CRT-D) battery longevity is shorter than that of ICDs. We determined the rate of replacements for battery depletion and identified possible determinants of early depletion in a series of patients who had undergone implantation of CRT-D devices. We retrieved data on 1726 consecutive CRT-D systems implanted from January 2008 to March 2010 in nine centres. Five years after a successful CRT-D implantation procedure, 46% of devices had been replaced due to battery depletion. The time to device replacement for battery depletion differed considerably among currently available CRT-D systems from different manufacturers, with the proportion of batteries still in service at 5 years ranging from 52 to 88% (log-rank test, P < 0.001). Left ventricular lead output and unipolar pacing configuration were independent determinants of early depletion [hazard ratio (HR): 1.96; 95% confidence interval (CI): 1.57-2.46; P < 0.001 and HR: 1.58; 95% CI: 1.25-2.01; P < 0.001, respectively]. The implantation of a recent-generation device (HR: 0.57; 95% CI: 0.45-0.72; P < 0.001), the battery chemistry, and the CRT-D manufacturer (HR: 0.64; 95% CI: 0.47-0.89; P = 0.008) were additional factors associated with replacement for battery depletion. Device longevity at 5 years was 54%. High left ventricular lead output and unipolar pacing configuration were associated with early battery depletion, while recent-generation CRT-Ds displayed better longevity. Significant differences emerged among currently available CRT-D systems from different manufacturers. © The Author 2015. Published by Oxford University Press on behalf of the European Society of Cardiology.
Park, Yun Yeon; Nam, Hyun-Ja; Do, Mihyang; Lee, Jae-Ho
2016-01-01
RSK2, also known as RPS6KA3 (ribosomal protein S6 kinase, 90 kDa, polypeptide 3), is a downstream kinase of the mitogen-activated protein kinase (MAPK) pathway, which is important in regulating survival, transcription, growth and proliferation. However, its biological role in mitotic progression is not well understood. In this study, we examined the potential involvement of RSK2 in the regulation of mitotic progression. Interestingly, depletion of RSK2, but not RSK1, caused the accumulation of mitotic cells. Time-lapse analysis revealed that mitotic duration, particularly the duration of the metaphase-to-anaphase transition, was prolonged in RSK2-depleted cells, suggesting activation of the spindle assembly checkpoint (SAC). Indeed, more BubR1 (Bub1-related kinase) was present on metaphase plate kinetochores in RSK2-depleted cells, and depletion of BubR1 abolished the mitotic accumulation caused by RSK2 depletion, confirming BubR1-dependent SAC activation. Together with the shortening of the inter-kinetochore distance, these data suggested that weakening of the tension across sister kinetochores upon RSK2 depletion led to activation of the SAC. To test this, we analyzed the effects of RSK2 on the stability of kinetochore-microtubule interactions and found that RSK2-depleted cells formed fewer kinetochore-microtubule fibers. Moreover, RSK2 depletion resulted in a decrease in the basal level of microtubules as well as an irregular distribution of mitotic spindles, which might underlie the several mitotic progression defects observed in these cells, such as an increase in unaligned chromosomes, defects in chromosome congression, and a decrease in pole-to-pole distance. Taken together, our data reveal that RSK2 affects mitotic progression by regulating the distribution, basal level, and stability of mitotic spindles. PMID:27491410
Comparative analysis of design codes for timber bridges in Canada, the United States, and Europe
James Wacker; James (Scott) Groenier
2010-01-01
The United States recently completed its transition from the allowable stress design code to the load and resistance factor design (LRFD) reliability-based code for the design of most highway bridges. For an international perspective on LRFD-based bridge codes, a comparative analysis is presented; the study addressed the national codes of the United States, Canada, and...
QIL1 is a novel mitochondrial protein required for MICOS complex stability and cristae morphology.
Guarani, Virginia; McNeill, Elizabeth M; Paulo, Joao A; Huttlin, Edward L; Fröhlich, Florian; Gygi, Steven P; Van Vactor, David; Harper, J Wade
2015-05-21
The mitochondrial contact site and cristae junction (CJ) organizing system (MICOS) dynamically regulates mitochondrial membrane architecture. Through systematic proteomic analysis of human MICOS, we identified QIL1 (C19orf70) as a novel conserved MICOS subunit. QIL1 depletion disrupted CJ structure in cultured human cells and in Drosophila muscle and neuronal cells in vivo. In human cells, mitochondrial disruption correlated with impaired respiration. Moreover, increased mitochondrial fragmentation was observed upon QIL1 depletion in flies. Using quantitative proteomics, we show that loss of QIL1 resulted in MICOS disassembly, with accumulation of a MIC60-MIC19-MIC25 sub-complex and degradation of MIC10, MIC26, and MIC27. Additionally, we demonstrated that in QIL1-depleted cells, overexpressed MIC10 fails to significantly restore its interaction with other MICOS subunits and SAMM50. Collectively, our work uncovers a previously unrecognized subunit of the MICOS complex, necessary for CJ integrity, cristae morphology, and mitochondrial function, and provides a resource for further analysis of MICOS architecture.
QIL1 is a novel mitochondrial protein required for MICOS complex stability and cristae morphology
Guarani, Virginia; McNeill, Elizabeth M; Paulo, Joao A; Huttlin, Edward L; Fröhlich, Florian; Gygi, Steven P; Van Vactor, David; Harper, J Wade
2015-01-01
The mitochondrial contact site and cristae junction (CJ) organizing system (MICOS) dynamically regulates mitochondrial membrane architecture. Through systematic proteomic analysis of human MICOS, we identified QIL1 (C19orf70) as a novel conserved MICOS subunit. QIL1 depletion disrupted CJ structure in cultured human cells and in Drosophila muscle and neuronal cells in vivo. In human cells, mitochondrial disruption correlated with impaired respiration. Moreover, increased mitochondrial fragmentation was observed upon QIL1 depletion in flies. Using quantitative proteomics, we show that loss of QIL1 resulted in MICOS disassembly, with accumulation of a MIC60-MIC19-MIC25 sub-complex and degradation of MIC10, MIC26, and MIC27. Additionally, we demonstrated that in QIL1-depleted cells, overexpressed MIC10 fails to significantly restore its interaction with other MICOS subunits and SAMM50. Collectively, our work uncovers a previously unrecognized subunit of the MICOS complex, necessary for CJ integrity, cristae morphology, and mitochondrial function, and provides a resource for further analysis of MICOS architecture. DOI: http://dx.doi.org/10.7554/eLife.06265.001 PMID:25997101
Neely, M. Diana; Schmidt, Dennis E.; Deutch, Ariel Y.
2007-01-01
The proximate cause of Parkinson’s Disease is striatal dopamine depletion. Although no overt toxicity to striatal neurons has been reported in Parkinson’s Disease, one of the consequences of striatal dopamine loss is a decrease in the number of dendritic spines on striatal medium spiny neurons (MSNs). Dendrites of these neurons receive cortical glutamatergic inputs onto the dendritic spine head and dopaminergic inputs from the substantia nigra onto the spine neck. This synaptic arrangement suggests that dopamine gates corticostriatal glutamatergic drive onto spines. Using triple organotypic slice cultures comprised of ventral mesencephalon, striatum, and cortex, we examined the role of the cortex in dopamine depletion-induced dendritic spine loss in MSNs. The striatal dopamine innervation was lesioned by treatment of the cultures with the dopaminergic neurotoxin MPP+ or by removing the mesencephalon. Both MPP+ and mesencephalic ablation decreased MSN dendritic spine density. Analysis of spine morphology revealed that thin spines were preferentially lost after dopamine depletion. Removal of the cortex completely prevented dopamine depletion-induced spine loss. These data indicate that the dendritic remodeling of MSNs seen in parkinsonism occurs secondary to increases in corticostriatal glutamatergic drive, and suggest that modulation of cortical activity may be a useful therapeutic strategy in Parkinson’s Disease. PMID:17888581
Visualization of stratospheric ozone depletion and the polar vortex
NASA Technical Reports Server (NTRS)
Treinish, Lloyd A.
1995-01-01
Direct analysis of spacecraft observations of stratospheric ozone yields information about the morphology of the annual austral depletion. Visual correlation of ozone with other atmospheric data illustrates the diurnal dynamics of the polar vortex and contributions from the upper troposphere, including the formation and breakup of the depletion region each spring. These data require care in their presentation to minimize the introduction of visualization artifacts that are erroneously interpreted as data features. Non-geographically registered data of differing mesh structures can be visually correlated via cartographic warping of base geometries without interpolation. Because this approach is independent of the realization technique, it provides a framework for experimenting with many visualization strategies. This methodology preserves the fidelity of the original data sets in a coordinate system suitable for three-dimensional, dynamic examination of atmospheric phenomena.
Interplay of Laser-Plasma Interactions and Inertial Fusion Hydrodynamics.
Strozzi, D J; Bailey, D S; Michel, P; Divol, L; Sepke, S M; Kerbel, G D; Thomas, C A; Ralph, J E; Moody, J D; Schneider, M B
2017-01-13
The effects of laser-plasma interactions (LPI) on the dynamics of inertial confinement fusion hohlraums are investigated via a new approach that self-consistently couples reduced LPI models into radiation-hydrodynamics numerical codes. The interplay between hydrodynamics and LPI, specifically stimulated Raman scatter and crossed-beam energy transfer (CBET), mostly occurs via momentum and energy deposition into Langmuir and ion acoustic waves. This spatially redistributes energy coupling to the target, which affects the background plasma conditions and thus modifies laser propagation. This model shows reduced CBET and significant laser energy depletion by Langmuir waves, which reduce the discrepancy between modeling and data from hohlraum experiments on wall x-ray emission and capsule implosion shape.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strozzi, D. J.; Bailey, D. S.; Michel, P.
The effects of laser-plasma interactions (LPI) on the dynamics of inertial confinement fusion hohlraums are investigated in this work via a new approach that self-consistently couples reduced LPI models into radiation-hydrodynamics numerical codes. The interplay between hydrodynamics and LPI, specifically stimulated Raman scatter and crossed-beam energy transfer (CBET), mostly occurs via momentum and energy deposition into Langmuir and ion acoustic waves. This spatially redistributes energy coupling to the target, which affects the background plasma conditions and thus modifies laser propagation. In conclusion, this model shows reduced CBET and significant laser energy depletion by Langmuir waves, which reduce the discrepancy between modeling and data from hohlraum experiments on wall x-ray emission and capsule implosion shape.
Stress granule formation via ATP depletion-triggered phase separation
NASA Astrophysics Data System (ADS)
Wurtz, Jean David; Lee, Chiu Fan
2018-04-01
Stress granules (SG) are droplets of proteins and RNA that form in the cell cytoplasm during stress conditions. We consider minimal models of stress granule formation based on the mechanism of phase separation regulated by ATP-driven chemical reactions. Motivated by experimental observations, we identify a minimal model of SG formation triggered by ATP depletion. Our analysis indicates that ATP is continuously hydrolysed to deter SG formation under normal conditions, and we provide specific predictions that can be tested experimentally.
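The abstract does not give the model equations, so the sketch below is a generic reaction-regulated phase-separation toy model of my own construction rather than the authors' model: an ATP-driven modification keeps the droplet-forming protein below its saturation concentration, and once ATP is depleted the back-reaction wins and droplet material appears. All rate constants and concentrations are invented.

```python
# Toy reaction-regulated phase-separation model (not the paper's model):
# an ATP-driven modification converts droplet-forming protein P into a
# soluble form P*. When ATP is depleted, free P rises above the saturation
# concentration c_sat and the excess is taken up by droplets (granules).

k_mod, k_rev = 5.0, 0.5      # invented rate constants (1/s, per unit ATP)
c_total, c_sat = 1.0, 0.6    # total protein and saturation concentration (a.u.)
dt, t_end, t_deplete = 0.01, 40.0, 20.0   # ATP is depleted at t_deplete (s)

p_star = c_total * k_mod / (k_mod + k_rev)  # start near the ATP-rich steady state
droplet_material = []
for step in range(int(t_end / dt)):
    t = step * dt
    atp = 1.0 if t < t_deplete else 0.0     # crude step-like ATP depletion
    p_free = c_total - p_star
    p_star += dt * (k_mod * atp * p_free - k_rev * p_star)   # forward Euler
    droplet_material.append(max(0.0, (c_total - p_star) - c_sat))

print("droplet material before depletion:",
      round(droplet_material[int(t_deplete / dt) - 1], 3))
print("droplet material at end:", round(droplet_material[-1], 3))
```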
NASA Astrophysics Data System (ADS)
Kervalishvili, Guram; Stolle, Claudia; Xiong, Chao
2016-04-01
ESA's constellation mission Swarm was successfully launched on 22 November 2013. The three satellites achieved their final constellation on 17 April 2014, and since then Swarm-A and Swarm-C have been orbiting the Earth at about 470 km altitude (flying side by side) and Swarm-B at about 520 km altitude. The satellites carry instruments to monitor the F-region electron density with a sampling frequency of 2 Hz. This paper presents a detection algorithm for low-latitude post-sunset plasma bubbles (depletions), which uses local minima and maxima to detect depletions directly from the Swarm electron density readings. Our analyses were performed in the magnetic latitude (MLat) and magnetic local time (MLT) coordinate system. The detection procedure also captures the amplitude of a depletion, which is called its depth in the following. The width of a bubble corresponds to the along-track length over which the satellite is located inside a depletion. We discuss the global distribution of the depth and width of plasma bubbles and its seasonal and local time dependence for all three Swarm satellites from April 2015 through September 2015. As expected, on global average the bubble occurrence rate is highest for the combined equinoxes (Mar, Apr, Sep, and Oct) and smallest for June solstice (May, Jun, Jul, and Aug). The MLT distribution of the bubble occurrence number shows a sharp increase at about 19 MLT and decreases towards post-midnight hours. Interestingly, there is an inverse relation between the depth and width of bubbles as a function of MLT. This is true for all seasons and for all Swarm satellites. The bubble depth (width) decreases (increases) from post-sunset to post-midnight for December solstice (Jan, Feb, Nov, and Dec) and the combined equinoxes, with about the same amplitude values for bubble depth (width). We therefore suggest that at post-midnight, when the depletions are less steep, the depletion structures are broader than early after sunset. For June solstice, however, the depletions are less deep, and the bubble depth and width do not change significantly throughout the evening. The deepest depletions occur at around +/- 10° magnetic latitude, that is, at the inner edge of the ionisation anomaly, whose density maxima lie at around 15° MLat. Therefore, the depth of a post-sunset depletion is not determined solely by the level of background electron density.
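The abstract states that depletions are detected from local minima and maxima in the along-track electron density, but does not give the exact criteria. The sketch below is therefore an illustrative detector rather than the Swarm algorithm: the extremum search window, the 30% relative-depth threshold, and the synthetic density profile are assumptions.

```python
# Illustrative detector for plasma-density depletions (bubbles) along a
# satellite track, using local minima/maxima as described above. The
# window size and the 30% relative-depth threshold are assumptions.
import numpy as np
from scipy.signal import argrelextrema

def detect_depletions(ne, along_track_km, rel_depth_min=0.3, order=5):
    """Return (depth, width_km) for each detected depletion in ne."""
    minima = argrelextrema(ne, np.less, order=order)[0]
    maxima = argrelextrema(ne, np.greater, order=order)[0]
    events = []
    for i in minima:
        left = maxima[maxima < i]
        right = maxima[maxima > i]
        if left.size == 0 or right.size == 0:
            continue
        l, r = left[-1], right[0]
        background = 0.5 * (ne[l] + ne[r])   # density at the bounding maxima
        depth = background - ne[i]
        if background > 0 and depth / background >= rel_depth_min:
            width = along_track_km[r] - along_track_km[l]
            events.append((depth, width))
    return events

# Synthetic example: smooth background with one superimposed depletion.
x = np.linspace(0.0, 2000.0, 400)                   # along-track distance, km
ne = 1.0e6 - 200.0 * (x - 1000.0) ** 2 / 1.0e6      # background density, cm^-3
ne -= 6.0e5 * np.exp(-((x - 1200.0) / 60.0) ** 2)   # the "bubble"
print(detect_depletions(ne, x))
```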
Comparison of two computer codes for crack growth analysis: NASCRAC Versus NASA/FLAGRO
NASA Technical Reports Server (NTRS)
Stallworth, R.; Meyers, C. A.; Stinson, H. C.
1989-01-01
Results are presented from the comparison study of two computer codes for crack growth analysis, NASCRAC and NASA/FLAGRO. The two computer codes gave compatible, conservative results when the part-through crack analysis solutions were compared against experimental test data. Results showed good correlation between the codes for the through-crack-at-a-lug solution, for which NASA/FLAGRO gave the most conservative results.
Optimization techniques using MODFLOW-GWM
Grava, Anna; Feinstein, Daniel T.; Barlow, Paul M.; Bonomi, Tullia; Buarne, Fabiola; Dunning, Charles; Hunt, Randall J.
2015-01-01
An important application of optimization codes such as MODFLOW-GWM is to maximize water supply from unconfined aquifers subject to constraints involving surface-water depletion and drawdown. In optimizing pumping for a fish hatchery in a bedrock aquifer system overlain by glacial deposits in eastern Wisconsin, various features of the GWM-2000 code were used to overcome difficulties associated with: 1) Non-linear response matrices caused by unconfined conditions and head-dependent boundaries; 2) Efficient selection of candidate well and drawdown constraint locations; and 3) Optimizing against water-level constraints inside pumping wells. Features of GWM-2000 were harnessed to test the effects of systematically varying the decision variables and constraints on the optimized solution for managing withdrawals. An important lesson of the procedure, similar to lessons learned in model calibration, is that the optimized outcome is non-unique, and depends on a range of choices open to the user. The modeler must balance the complexity of the numerical flow model used to represent the groundwater-flow system against the range of options (decision variables, objective functions, constraints) available for optimizing the model.
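GWM-2000's own formulations are not reproduced in the abstract; the sketch below only illustrates the generic response-matrix idea it alludes to, in which drawdown at constraint locations is treated as (locally) linear in the pumping rates and total withdrawal is maximized by linear programming. All response coefficients, drawdown limits, and well capacities below are invented.

```python
# Generic response-matrix optimization of the kind MODFLOW-GWM performs:
# maximize total pumping subject to linear drawdown constraints. The
# response coefficients, limits, and well capacities below are invented.
import numpy as np
from scipy.optimize import linprog

# drawdown[i] = sum_j R[i, j] * pumping[j]   (m of drawdown per m^3/d pumped)
R = np.array([[0.004, 0.001, 0.0005],
              [0.001, 0.003, 0.0010],
              [0.0005, 0.001, 0.0040]])
max_drawdown = np.array([2.0, 2.5, 2.0])     # m, at three constraint locations
well_capacity = [(0.0, 1500.0)] * 3          # m^3/d per candidate well

# linprog minimizes, so minimize the negative of total pumping.
res = linprog(c=-np.ones(3), A_ub=R, b_ub=max_drawdown,
              bounds=well_capacity, method="highs")
print("Optimal pumping rates (m^3/d):", np.round(res.x, 1))
print("Total withdrawal (m^3/d):", round(-res.fun, 1))
```

Note that, as the abstract stresses, the real problem is non-linear (unconfined conditions, head-dependent boundaries), so in practice the response matrix must be re-linearized iteratively; the single linear solve above is only the inner building block.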
Density Convection near Radiating ICRF Antennas and its Effect on the Coupling of Lower Hybrid Waves
NASA Astrophysics Data System (ADS)
Ekedahl, A.; Colas, L.; Mayoral, M.-L.; Beaumont, B.; Bibet, Ph.; Brémond, S.; Kazarian, F.; Mailloux, J.; Noterdaeme, J.-M.; Efda-Jet Contributors
2003-12-01
Combined operation of Lower Hybrid (LH) and Ion Cyclotron Resonance Frequency (ICRF) waves can result in a degradation of the LH wave coupling, as observed both in the Tore Supra and JET tokamaks. The reflection coefficient on the part of the LH launcher magnetically connected to the powered ICRF antenna increases, suggesting a local decrease in the electron density in the connecting flux tubes. This has been confirmed by Langmuir probe measurements on the LH launchers in the latest Tore Supra experiments. Moreover, recent experiments in JET indicate that the LH coupling degradation depends on the ICRF power and its launched k//-spectrum. The 2D density distribution around the Tore Supra ICRF antennas has been modelled with the CELLS-code, balancing parallel losses with diffusive transport and sheath induced E×B convection, obtained from RF field mapping using the ICANT-code. The calculations are in qualitative agreement with the experimental observations, i.e. density depletion is obtained, localised mainly in the antenna shadow, and dependent on ICRF power and antenna spectrum.
Control of Fur synthesis by the non-coding RNA RyhB and iron-responsive decoding.
Vecerek, Branislav; Moll, Isabella; Bläsi, Udo
2007-02-21
The Fe2+-dependent Fur protein serves as a negative regulator of iron uptake in bacteria. As only metallo-Fur acts as an autogenous repressor, Fe2+ scarcity would direct fur expression when continued supply is not obviously required. We show that in Escherichia coli, post-transcriptional regulatory mechanisms ensure that Fur synthesis remains steady under iron limitation. Our studies revealed that fur translation is coupled to that of an upstream open reading frame (uof), translation of which is downregulated by the non-coding RNA (ncRNA) RyhB. As RyhB transcription is negatively controlled by metallo-Fur, iron depletion creates a negative feedback loop. RyhB-mediated regulation of uof-fur provides the first example of indirect translational regulation by a trans-encoded ncRNA. In addition, we present evidence for an iron-responsive decoding mechanism of the uof-fur entity. It could serve as a backup mechanism for the RyhB circuitry, and it represents the first link between iron availability and the synthesis of an iron-containing protein.
Cheon, Dong Huey; Nam, Eun Ji; Park, Kyu Hyung; Woo, Se Joon; Lee, Hye Jin; Kim, Hee Cheol; Yang, Eun Gyeong; Lee, Cheolju; Lee, Ji Eun
2016-01-04
While human plasma serves as a great source for disease diagnosis, the low-molecular-weight (LMW) proteome (<30 kDa) has been shown to contain a rich source of diagnostic biomarkers. Here we employ top-down mass spectrometry to analyze the LMW proteoforms present in four types of human plasma samples pooled from three healthy controls (HCs): without immunoaffinity depletion and with depletion of the top two, six, and seven high-abundance proteins. The LMW proteoforms were first fractionated based on molecular weight using gel-eluted liquid fraction entrapment electrophoresis (GELFrEE). The GELFrEE fractions containing proteins up to 30 kDa were then subjected to nanocapillary-LC-MS/MS, and the high-resolution MS and MS/MS data were processed using ProSightPC 3.0. As a result, a total of 442 LMW proteins and cleaved products, including those with post-translational modifications and single amino acid variations, were identified. From an additional comparative analysis of plasma samples without immunoaffinity depletion between HCs and colorectal cancer (CRC) patients via the top-down approach, tens of LMW proteoforms, including platelet factor 4, were found to show >1.5-fold changes between the plasma samples of HCs and CRC patients, and six of the LMW proteins were verified by Western blot analysis.
Depleted uranium analysis in blood by inductively coupled plasma mass spectrometry
Todorov, T.I.; Xu, H.; Ejnik, J.W.; Mullick, F.G.; Squibb, K.; McDiarmid, M.A.; Centeno, J.A.
2009-01-01
In this study we report depleted uranium (DU) analysis in whole blood samples. Internal exposure to DU causes increased uranium levels as well as a change in the uranium isotopic composition of blood specimens. For identification of DU exposure we used the 235U/238U ratio in blood samples, which ranges from 0.00725 for natural uranium to 0.002 for depleted uranium. Uranium quantification and isotopic composition analysis were performed by inductively coupled plasma mass spectrometry. For method validation we used eight spiked blood samples with known uranium concentrations and isotopic compositions. The detection limit for quantification was determined to be 4 ng/L uranium in whole blood. The data were reproducible within 1-5% RSD with an accuracy of 1-4%. In order to achieve a 235U/238U ratio range of 0.00698-0.00752 with a 99.7% confidence limit, a minimum whole blood uranium concentration of 60 ng/L was required. An additional 10 samples from a cohort of veterans exposed to DU in Gulf War I were analyzed with no knowledge of their medical history. The measured 235U/238U ratios in the blood samples were used to identify the presence or absence of DU exposure within this patient group. © 2009 The Royal Society of Chemistry.
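As a minimal numerical illustration of the screening logic above (natural 235U/238U is about 0.00725, depleted uranium about 0.002), the snippet below flags a measured ratio as consistent with DU exposure when its uncertainty interval lies below the natural range quoted in the abstract. The +/-3-sigma uncertainty model is an assumption, not the paper's statistical treatment.

```python
# Minimal illustration of DU screening from a measured 235U/238U atom ratio:
# natural uranium is ~0.00725, depleted uranium ~0.002. The +/-3-sigma
# interval used here is an assumption for illustration.
NATURAL_RANGE = (0.00698, 0.00752)   # ~99.7% interval quoted in the abstract

def classify_exposure(measured_ratio, sigma):
    lower, upper = measured_ratio - 3 * sigma, measured_ratio + 3 * sigma
    if upper < NATURAL_RANGE[0]:
        return "consistent with depleted uranium exposure"
    if lower > NATURAL_RANGE[1]:
        return "ratio above natural range (check for enriched uranium)"
    return "consistent with natural uranium"

print(classify_exposure(0.0064, 0.0001))   # depleted signature
print(classify_exposure(0.0072, 0.0001))   # natural
```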
Full 3D visualization tool-kit for Monte Carlo and deterministic transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frambati, S.; Frignani, M.
2012-07-01
We propose a package of tools capable of translating the geometric inputs and outputs of many Monte Carlo and deterministic radiation transport codes into open source file formats. These tools are aimed at bridging the gap between trusted, widely-used radiation analysis codes and very powerful, more recent and commonly used visualization software, thus supporting the design process and helping with shielding optimization. Three main lines of development were followed: mesh-based analysis of Monte Carlo codes, mesh-based analysis of deterministic codes and Monte Carlo surface meshing. The developed kit is considered a powerful and cost-effective tool in the computer-aided design for radiation transport code users of the nuclear world, and in particular in the fields of core design and radiation analysis. (authors)
Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory
Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J
2007-01-01
Objective To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625
Qualitative data analysis for health services research: developing taxonomy, themes, and theory.
Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J
2007-08-01
To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. DATA SOURCES AND DESIGN: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.
Salmon, Stefanie J; Adriaanse, Marieke A; De Vet, Emely; Fennis, Bob M; De Ridder, Denise T D
2014-01-01
Self-control relies on a limited resource that can get depleted, a phenomenon that has been labeled ego-depletion. We argue that individuals may differ in their sensitivity to depleting tasks, and that consequently some people deplete their self-control resource at a faster rate than others. In three studies, we assessed individual differences in depletion sensitivity, and demonstrate that depletion sensitivity moderates ego-depletion effects. The Depletion Sensitivity Scale (DSS) was employed to assess depletion sensitivity. Study 1 employs the DSS to demonstrate that individual differences in sensitivity to ego-depletion exist. Study 2 shows moderate correlations of depletion sensitivity with related self-control concepts, indicating that these scales measure conceptually distinct constructs. Study 3 demonstrates that depletion sensitivity moderates the ego-depletion effect. Specifically, participants who are sensitive to depletion performed worse on a second self-control task, indicating a stronger ego-depletion effect, compared to participants less sensitive to depletion.
Salmon, Stefanie J.; Adriaanse, Marieke A.; De Vet, Emely; Fennis, Bob M.; De Ridder, Denise T. D.
2014-01-01
Self-control relies on a limited resource that can get depleted, a phenomenon that has been labeled ego-depletion. We argue that individuals may differ in their sensitivity to depleting tasks, and that consequently some people deplete their self-control resource at a faster rate than others. In three studies, we assessed individual differences in depletion sensitivity, and demonstrate that depletion sensitivity moderates ego-depletion effects. The Depletion Sensitivity Scale (DSS) was employed to assess depletion sensitivity. Study 1 employs the DSS to demonstrate that individual differences in sensitivity to ego-depletion exist. Study 2 shows moderate correlations of depletion sensitivity with related self-control concepts, indicating that these scales measure conceptually distinct constructs. Study 3 demonstrates that depletion sensitivity moderates the ego-depletion effect. Specifically, participants who are sensitive to depletion performed worse on a second self-control task, indicating a stronger ego-depletion effect, compared to participants less sensitive to depletion. PMID:25009523
Quantifying circular RNA expression from RNA-seq data using model-based framework.
Li, Musheng; Xie, Xueying; Zhou, Jing; Sheng, Mengying; Yin, Xiaofeng; Ko, Eun-A; Zhou, Tong; Gu, Wanjun
2017-07-15
Circular RNAs (circRNAs) are a class of non-coding RNAs that are widely expressed in various cell lines and tissues of many organisms. Although the exact function of many circRNAs is largely unknown, cell type- and tissue-specific circRNA expression implicates crucial functions in many biological processes. Hence, the quantification of circRNA expression from high-throughput RNA-seq data is becoming increasingly important. Although many model-based methods have been developed to quantify linear RNA expression from RNA-seq data, these methods are not applicable to circRNA quantification. Here, we propose a novel strategy that transforms circular transcripts to pseudo-linear transcripts and estimates the expression values of both circular and linear transcripts using an existing model-based algorithm, Sailfish. The new strategy can accurately estimate the expression of both linear and circular transcripts from RNA-seq data. Several factors, such as gene length, amount of expression, and the ratio of circular to linear transcripts, had an impact on the quantification performance for circular transcripts. In comparison to count-based tools, the new computational framework had superior performance in estimating the amount of circRNA expression from both simulated and real ribosomal-RNA-depleted (rRNA-depleted) RNA-seq datasets. On the other hand, considering circular transcripts in expression quantification from rRNA-depleted RNA-seq data substantially increased the accuracy of linear transcript expression estimates. Our proposed strategy was implemented in a program named Sailfish-cir. Sailfish-cir is freely available at https://github.com/zerodel/Sailfish-cir . tongz@medicine.nevada.edu or wanjun.gu@gmail.com. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
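The central trick described above is to transform each circular transcript into a pseudo-linear reference so that a linear quantifier such as Sailfish can assign back-splice-junction reads. One common construction, sketched below, appends the first read_length - 1 bases of the circle to its end so that junction-spanning reads map contiguously; whether this matches Sailfish-cir's exact construction is an assumption, and the example sequence is invented.

```python
# Sketch: build a "pseudo-linear" reference from a circular transcript so
# that reads spanning the back-splice junction can be mapped by a linear
# quantifier. Appending (read_length - 1) bases from the start makes every
# junction-spanning read contiguous in the reference. Exact details of
# Sailfish-cir's construction may differ; the sequence is invented.

def pseudo_linear(circular_seq, read_length):
    if read_length > len(circular_seq):
        # Short circles: unroll the circle enough times to cover a full read.
        copies = (read_length // len(circular_seq)) + 1
        return circular_seq * (copies + 1)
    return circular_seq + circular_seq[:read_length - 1]

circ = "ATGGCCTTAGGACGTTACCGGA"      # invented circular transcript (5'->3')
ref = pseudo_linear(circ, read_length=10)
print(len(circ), len(ref))
print(ref)
```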
Analysis of new measurements of Calvert Cliffs spent fuel samples using SCALE 6.2
Hu, Jianwei; Giaquinto, J. M.; Gauld, I. C.; ...
2017-04-28
High quality experimental data for isotopic compositions in irradiated fuel are important to spent fuel applications, including nuclear safeguards, spent fuel storage, transportation, and final disposal. The importance of these data has been increasingly recognized in recent years, particularly as countries like Finland and Sweden plan to open the world's first two spent fuel geological repositories in the 2020s, while other countries, including the United States, are considering extended dry fuel storage options. Destructive and nondestructive measurements of a spent fuel rod segment from a Combustion Engineering 14 × 14 fuel assembly of the Calvert Cliffs Unit 1 nuclear reactor have recently been performed at Oak Ridge National Laboratory (ORNL). These ORNL measurements included two samples selected from adjacent axial locations of a fuel rod with an initial enrichment of 3.038 wt% 235U, which achieved burnups close to 43.5 GWd/MTU. More than 50 different isotopes of 16 elements were measured using high precision measurement methods. Various investigations have assessed the quality of the new ORNL measurement data, including comparison to previous measurements and to calculation results. Previous measurement data for samples from the same fuel rod are available from experiments performed at Pacific Northwest National Laboratory in the United States and the Khlopin Radium Institute in Russia. Detailed assembly models were developed using the newly released SCALE 6.2 code package to simulate depletion and decay of the measured fuel samples. Furthermore, results from this work show that the new ORNL measurements provide a good quality radiochemical assay data set for spent fuel with relatively high burnup and long cooling time, and they can serve as good benchmark data for nuclear burnup code validation and spent fuel studies.
A complex approach to the blue-loop problem
NASA Astrophysics Data System (ADS)
Ostrowski, Jakub; Daszynska-Daszkiewicz, Jadwiga
2015-08-01
The problem of the blue loops during core helium burning, outstanding for almost fifty years, is one of the most difficult and poorly understood problems in stellar astrophysics. Most of the work on blue loops done so far has been performed with old stellar evolution codes and with limited computational resources, so the conclusions obtained were based on small samples of models and could not take into account more advanced effects and the interactions between them. The emergence of the blue loops depends on many details of the evolution calculations, in particular on chemical composition, opacity, mixing processes, etc. The non-linear interactions between these factors mean that in most cases it is hard to predict, without precise stellar modeling, whether a loop will emerge or not. The high sensitivity of the blue loops to even small changes in the internal structure of a star raises one more issue: sensitivity to numerical problems, which are common in calculations of stellar models at advanced stages of evolution. To tackle this problem we used the modern stellar evolution code MESA. We calculated a large grid of evolutionary tracks (about 8000 models) with masses in the range of 3.0 - 25.0 solar masses, from the zero age main sequence to the depletion of helium in the core. In order to make a comparative analysis, we varied the metallicity, helium abundance, and mixing parameters resulting from convective overshooting, rotation, etc. A better understanding of the properties of the blue loops is crucial for our knowledge of the population of blue supergiants and of pulsating variables such as Cepheids, α-Cygni, or Slowly Pulsating B-type supergiants. In the case of more massive models it is also of great importance for studies of the progenitors of supernovae.
GASP. III. JO36: A Case of Multiple Environmental Effects at Play?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fritz, Jacopo; Bruzual, Gustavo; Cervantes Sodi, Bernardo
The so-called jellyfish galaxies are objects exhibiting disturbed morphology, mostly in the form of tails of gas stripped from the main body of the galaxy. Several works have strongly suggested ram pressure stripping to be the mechanism driving this phenomenon. Here, we focus on one of these objects, drawn from a sample of optically selected jellyfish galaxies, and use it to validate sinopsis, the spectral fitting code that will be used for the analysis of the GASP (GAs Stripping Phenomena in galaxies with MUSE) survey, and to study the spatial distribution and physical properties of the gas and stellar populations in this galaxy. We compare the model spectra to those obtained with gandalf, a code with similar features widely used to interpret the kinematics of stars and gas in galaxies from IFU data. We find that sinopsis can reproduce the pixel-by-pixel spectra of this galaxy at least as well as gandalf does, providing reliable estimates of the underlying stellar absorption needed to properly correct the nebular gas emission. Using these results, we find strong evidence of a double effect of the ram pressure exerted by the intracluster medium onto the gas of the galaxy. A moderate burst of star formation, dating between 20 and 500 Myr ago and involving the outer parts of the galaxy more strongly than the inner regions, was likely induced by a first interaction of the galaxy with the intracluster medium. Stripping by ram pressure, plus probable gas depletion due to star formation, contributed to create a truncated ionized gas disk. The presence of an extended stellar tail on only one side of the disk points instead to another kind of process, likely gravitational interaction by a fly-by or a close encounter with another galaxy in the cluster.
Analysis of new measurements of Calvert Cliffs spent fuel samples using SCALE 6.2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Jianwei; Giaquinto, J. M.; Gauld, I. C.
High quality experimental data for isotopic compositions in irradiated fuel are important to spent fuel applications, including nuclear safeguards, spent fuel storage, transportation, and final disposal. The importance of these data has been increasingly recognized in recent years, particularly as countries like Finland and Sweden plan to open the world's first two spent fuel geological repositories in the 2020s, while other countries, including the United States, are considering extended dry fuel storage options. Destructive and nondestructive measurements of a spent fuel rod segment from a Combustion Engineering 14 × 14 fuel assembly of the Calvert Cliffs Unit 1 nuclear reactor have recently been performed at Oak Ridge National Laboratory (ORNL). These ORNL measurements included two samples selected from adjacent axial locations of a fuel rod with an initial enrichment of 3.038 wt% 235U, which achieved burnups close to 43.5 GWd/MTU. More than 50 different isotopes of 16 elements were measured using high precision measurement methods. Various investigations have assessed the quality of the new ORNL measurement data, including comparison to previous measurements and to calculation results. Previous measurement data for samples from the same fuel rod are available from experiments performed at Pacific Northwest National Laboratory in the United States and the Khlopin Radium Institute in Russia. Detailed assembly models were developed using the newly released SCALE 6.2 code package to simulate depletion and decay of the measured fuel samples. Furthermore, results from this work show that the new ORNL measurements provide a good quality radiochemical assay data set for spent fuel with relatively high burnup and long cooling time, and they can serve as good benchmark data for nuclear burnup code validation and spent fuel studies.
GASP. III. JO36: A Case of Multiple Environmental Effects at Play?
NASA Astrophysics Data System (ADS)
Fritz, Jacopo; Moretti, Alessia; Gullieuszik, Marco; Poggianti, Bianca; Bruzual, Gustavo; Vulcani, Benedetta; Nicastro, Fabrizio; Jaffé, Yara; Cervantes Sodi, Bernardo; Bettoni, Daniela; Biviano, Andrea; Fasano, Giovanni; Charlot, Stéphane; Bellhouse, Callum; Hau, George
2017-10-01
The so-called jellyfish galaxies are objects exhibiting disturbed morphology, mostly in the form of tails of gas stripped from the main body of the galaxy. Several works have strongly suggested ram pressure stripping to be the mechanism driving this phenomenon. Here, we focus on one of these objects, drawn from a sample of optically selected jellyfish galaxies, and use it to validate sinopsis, the spectral fitting code that will be used for the analysis of the GASP (GAs Stripping Phenomena in galaxies with MUSE) survey, and study the spatial distribution and physical properties of the gas and stellar populations in this galaxy. We compare the model spectra to those obtained with gandalf, a code with similar features widely used to interpret the kinematics of stars and gas in galaxies from IFU data. We find that sinopsis can reproduce the pixel-by-pixel spectra of this galaxy at least as well as gandalf does, providing reliable estimates of the underlying stellar absorption to properly correct the nebular gas emission. Using these results, we find strong evidence of a double effect of ram pressure exerted by the intracluster medium onto the gas of the galaxy. A moderate burst of star formation, dating between 20 and 500 Myr ago and involving the outer parts of the galaxy more strongly than the inner regions, was likely induced by a first interaction of the galaxy with the intracluster medium. Stripping by ram pressure, plus probable gas depletion due to star formation, contributed to create a truncated ionized gas disk. The presence of an extended stellar tail on only one side of the disk points instead to another kind of process, likely gravitational interaction by a fly-by or a close encounter with another galaxy in the cluster.
Combustion: Structural interaction in a viscoelastic material
NASA Technical Reports Server (NTRS)
Chang, T. Y.; Chang, J. P.; Kumar, M.; Kuo, K. K.
1980-01-01
The effect of interaction between combustion processes and structural deformation of solid propellant was considered. The combustion analysis was performed on the basis of deformed crack geometry, which was determined from the structural analysis. On the other hand, input data for the structural analysis, such as pressure distribution along the crack boundary and ablation velocity of the crack, were determined from the combustion analysis. The interaction analysis was conducted by combining two computer codes, a combustion analysis code and a general purpose finite element structural analysis code.
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Starnes, James H., Jr.; Newman, James C., Jr.
1995-01-01
NASA is developing a 'tool box' that includes a number of advanced structural analysis computer codes which, taken together, represent the comprehensive fracture mechanics capability required to predict the onset of widespread fatigue damage. These structural analysis tools have complementary and specialized capabilities ranging from a finite-element-based stress-analysis code for two- and three-dimensional built-up structures with cracks to a fatigue and fracture analysis code that uses stress-intensity factors and material-property data found in 'look-up' tables or from equations. NASA is conducting critical experiments necessary to verify the predictive capabilities of the codes, and these tests represent a first step in the technology-validation and industry-acceptance processes. NASA has established cooperative programs with aircraft manufacturers to facilitate the comprehensive transfer of this technology by making these advanced structural analysis codes available to industry.
Million, Matthieu; Tidjani Alou, Maryam; Khelaifia, Saber; Bachar, Dipankar; Lagier, Jean-Christophe; Dione, Niokhor; Brah, Souleymane; Hugon, Perrine; Lombard, Vincent; Armougom, Fabrice; Fromonot, Julien; Robert, Catherine; Michelle, Caroline; Diallo, Aldiouma; Fabre, Alexandre; Guieu, Régis; Sokhna, Cheikh; Henrissat, Bernard; Parola, Philippe; Raoult, Didier
2016-01-01
Severe acute malnutrition (SAM) is associated with inadequate diet, low levels of plasma antioxidants and gut microbiota alterations. The link between gut redox and microbial alterations, however, remains unexplored. By sequencing the gut microbiomes of 79 children of varying nutritional status from three centers in Senegal and Niger, we found a dramatic depletion of obligate anaerobes in malnutrition. This was confirmed in an individual patient data meta-analysis including 107 cases and 77 controls from 5 different African and Asian countries. Specifically, several species of the Bacteroidaceae, Eubacteriaceae, Lachnospiraceae and Ruminococcaceae families were consistently depleted while Enterococcus faecalis, Escherichia coli and Staphylococcus aureus were consistently enriched. Further analyses on our samples revealed increased fecal redox potential, decreased total bacterial number and dramatic Methanobrevibacter smithii depletion. Indeed, M. smithii was detected in more than half of the controls but in none of the cases. No causality was demonstrated but, based on our results, we propose a unifying theory linking microbiota specificity, lacking anaerobes and archaea, to low antioxidant nutrients, and lower food conversion. PMID:27183876
Hall, Lindsay J; Clare, Simon; Dougan, Gordon
2012-01-01
NK cells were found to be recruited in a temporally controlled manner to the nasal-associated lymphoid tissue and the cervical lymph nodes of mice following intranasal immunisation with Ag85B-ESAT6 antigen from Mycobacterium tuberculosis mixed with Escherichia coli heat-labile toxin as adjuvant. These NK cells were activated and they secreted a diverse range of cytokines and other immunomodulators. Using antibody-mediated depletion with anti-asialo GM1, we found evidence for altered trafficking, impaired activation and cytokine secretion of dendritic cells, macrophages and neutrophils in immunised NK cell depleted mice compared to control animals. Analysis of antigen-specific immune responses revealed an attenuated antibody and cytokine response in immunised NK cell depleted animals. Systemic administration of rIL-6 but not rIFN-γ significantly restored immune responses in mice depleted of NK cells. In conclusion, cytokine production, particularly IL-6, via NK cells and NK cell activated immune populations, plays an important role in the establishment of local innate immune responses and the consequent development of adaptive immunity after mucosal immunisation. PMID:20220095
Adigun, Babatunde John; Fensin, Michael Lorne; Galloway, Jack D.; ...
2016-10-01
Our burnup study examined the effect of a predicted critical control rod position on the nuclide predictability of several axial and radial locations within a 4×4 graphite moderated gas cooled reactor fuel cluster geometry. To achieve this, a control rod position estimator (CRPE) tool was developed within the framework of the linkage code Monteburns between the transport code MCNP and depletion code CINDER90, and four methodologies were proposed within the tool for maintaining criticality. Two of the proposed methods used an inverse multiplication approach - where the amount of fissile material in a set configuration is slowly altered until criticality is attained - in estimating the critical control rod position. Another method carried out several MCNP criticality calculations at different control rod positions, then used a linear fit to estimate the critical rod position. The final method used a second-order polynomial fit of several MCNP criticality calculations at different control rod positions to estimate the critical rod position. The results showed that power densities as well as uranium and plutonium isotopics were predicted consistently among the methods within the CRPE tool that predicted the critical position consistently well. Finally, while the CRPE tool is currently limited to manipulating a single control rod, future work could be geared toward implementing additional criticality search methodologies along with additional features.
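One of the CRPE strategies described above fits a second-order polynomial to k-eff values from several criticality calculations at different rod positions and solves for the position where k-eff equals one. A minimal sketch of that idea, using hypothetical k-eff values rather than actual MCNP output:

```python
import numpy as np

# Hypothetical (rod position [cm], k_eff) pairs from separate criticality calculations
positions = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
keff      = np.array([0.985, 0.994, 1.002, 1.009, 1.015])

# Second-order polynomial fit of k_eff versus rod position
a, b, c = np.polyfit(positions, keff, deg=2)

# Solve a*z**2 + b*z + (c - 1) = 0 and keep the real root inside the modeled range
roots = np.roots([a, b, c - 1.0])
critical = [z.real for z in roots
            if abs(z.imag) < 1e-9 and positions.min() <= z.real <= positions.max()]
print("Estimated critical rod position (cm):", critical)
```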
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmittroth, F.
1979-09-01
Documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is placed on the proper treatment of uncertainties and correlations and on providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples.
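The least-squares adjustment that a code of this kind performs can be illustrated by a generic generalized least-squares update combining prior calculated parameters with measurements through their covariances. The matrices below are illustrative placeholders, not FERRET's input format:

```python
import numpy as np

# Prior parameter estimate and its covariance (illustrative values)
x_prior = np.array([1.00, 2.00])
P = np.array([[0.04, 0.01],
              [0.01, 0.09]])

# Measurements y = H x + noise, with measurement covariance R
H = np.array([[1.0, 0.0],
              [1.0, 1.0]])
y = np.array([1.10, 3.20])
R = np.diag([0.02, 0.05])

# Generalized least-squares update of the parameters and their covariance
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
x_post = x_prior + K @ (y - H @ x_prior)
P_post = P - K @ H @ P

print("adjusted parameters:", x_post)
print("adjusted covariance:\n", P_post)
```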
Current and anticipated uses of thermalhydraulic and neutronic codes at PSI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aksan, S.N.; Zimmermann, M.A.; Yadigaroglu, G.
1997-07-01
The thermalhydraulic and/or neutronic codes in use at PSI mainly provide the capability to perform deterministic safety analysis for Swiss NPPs and also serve as analysis tools for experimental facilities for LWR and ALWR simulations. In relation to these applications, physical model development and improvements, and assessment of the codes are also essential components of the activities. In this paper, a brief overview is provided on the thermalhydraulic and/or neutronic codes used for safety analysis of LWRs, at PSI, and also of some experiences and applications with these codes. Based on these experiences, additional assessment needs are indicated, together with some model improvement needs. The future needs that could be used to specify both the development of a new code and also improvement of available codes are summarized.
Content Analysis Coding Schemes for Online Asynchronous Discussion
ERIC Educational Resources Information Center
Weltzer-Ward, Lisa
2011-01-01
Purpose: Researchers commonly utilize coding-based analysis of classroom asynchronous discussion contributions as part of studies of online learning and instruction. However, this analysis is inconsistent from study to study with over 50 coding schemes and procedures applied in the last eight years. The aim of this article is to provide a basis…
Stratospheric ozone depletion from future nitrous oxide increases
NASA Astrophysics Data System (ADS)
Wang, W.; Tian, W.; Dhomse, S.; Xie, F.; Shu, J.; Austin, J.
2014-12-01
We have investigated the impact of the assumed nitrous oxide (N2O) increases on stratospheric chemistry and dynamics using a series of idealized simulations with a coupled chemistry-climate model (CCM). In a future cooler stratosphere the net yield of NOy from N2O is shown to decrease in a reference run following the IPCC A1B scenario, but NOy can still be significantly increased by extra increases of N2O over 2001-2050. Over the last decade of simulations, 50% increases in N2O result in a maximal 6% reduction in ozone mixing ratios in the middle stratosphere at around 10 hPa and an average 2% decrease in the total ozone column (TCO) compared with the control run. This enhanced destruction could cause an ozone decline in the first half of this century in the middle stratosphere around 10 hPa, while global TCO still shows an increase at the same time. The results from a multiple linear regression analysis and sensitivity simulations with different forcings show that the chemical effect of N2O increases dominates the N2O-induced ozone depletion in the stratosphere, while the dynamical and radiative effects of N2O increases are overall insignificant. The analysis of the results reveals that the ozone depleting potential of N2O varies with the time period and is influenced by the environmental conditions. For example, carbon dioxide (CO2) increases can strongly offset the ozone depletion effect of N2O.
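A multiple linear regression of the kind used above to separate the contributions of different forcings can be sketched as an ordinary least-squares fit of an ozone anomaly time series onto candidate regressors. The data below are synthetic placeholders, not the CCM output:

```python
import numpy as np

# Synthetic annual-mean anomalies: an ozone response and two candidate drivers
n_years = 50
rng = np.random.default_rng(0)
n2o   = np.linspace(0.0, 1.0, n_years)           # normalized N2O forcing (made up)
co2   = np.linspace(0.0, 1.0, n_years) ** 1.2    # normalized CO2 forcing (made up)
noise = rng.normal(0.0, 0.05, n_years)
ozone = -0.06 * n2o + 0.02 * co2 + noise          # synthetic ozone anomaly

# Ordinary least-squares fit with an intercept column
X = np.column_stack([np.ones(n_years), n2o, co2])
coeffs, *_ = np.linalg.lstsq(X, ozone, rcond=None)
print("intercept, N2O coefficient, CO2 coefficient:", coeffs)
```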
Paredes, João A.; Zhou, Xiaoshan; Höglund, Stefan; Karlsson, Anna
2013-01-01
Loss of thymidine kinase 2 (TK2) causes a heterogeneous myopathic form of mitochondrial DNA (mtDNA) depletion syndrome (MDS) in humans that predominantly affects skeletal muscle tissue. In mice, TK2 deficiency also affects several tissues in addition to skeletal muscle, including brain, heart, adipose tissue and kidneys, and causes death about 3 weeks after birth. We analysed skeletal muscle and heart muscle tissues of Tk2 knockout mice during the postnatal development phase and observed that TK2-deficient pups grew slower and their skeletal muscles appeared significantly underdeveloped, whereas the heart was close to normal in size. Both tissues showed mtDNA depletion and mitochondria with altered ultrastructure, as revealed by transmission electron microscopy. Gene expression microarray analysis showed a strong down-regulation of genes involved in cell cycle and cell proliferation in both tissues, suggesting a lower pool of undifferentiated proliferating cells. Analysis of isolated primary myoblasts from Tk2 knockout mice showed slow proliferation, less ability to differentiate and signs of premature senescence, even in the absence of mtDNA depletion. Our data demonstrate that TK2 deficiency disturbs myogenic progenitor cell function in postnatal skeletal muscle, and we propose this as one of the causes of the underdeveloped phenotype and myopathic characteristics of the TK2-deficient mice, in addition to the progressive mtDNA depletion, mitochondrial damage and respiratory chain deficiency in post-mitotic differentiated tissue. PMID:23341978
Simms-Waldrip, Tiffany R; Sunkersett, Gauri; Coughlin, Laura A; Savani, Milan R; Arana, Carlos; Kim, Jiwoong; Kim, Minsoo; Zhan, Xiaowei; Greenberg, David E; Xie, Yang; Davies, Stella M; Koh, Andrew Y
2017-05-01
Adult stem cell transplantation (SCT) patients with graft-versus-host-disease (GVHD) exhibit significant disruptions in gut microbial communities. These changes are associated with higher overall mortality and appear to be driven by specific antibiotic therapies. It is unclear whether pediatric SCT patients who develop GVHD exhibit similar antibiotic-induced gut microbiota community changes. Here, we show that pediatric SCT patients (from Children's Medical Center Dallas, n = 8, and Cincinnati Children's Hospital, n = 7) who developed GVHD showed a significant decline, up to 10-log fold, in gut anti-inflammatory Clostridia (AIC) compared with those without GVHD. In fact, the development of GVHD is significantly associated with this AIC decline and with cumulative antibiotic exposure, particularly antibiotics effective against anaerobic bacteria (P = .003, Firth logistic regression analysis). Using metagenomic shotgun sequencing analysis, we were able to identify specific commensal bacterial species, including AIC, that were significantly depleted in GVHD patients. We then used a preclinical GVHD model to verify our clinical observations. Clindamycin depleted AIC and exacerbated GVHD in mice, whereas oral AIC supplementation increased gut AIC levels and mitigated GVHD in mice. Together, these data suggest that an antibiotic-induced AIC depletion in the gut microbiota is associated with the development of GVHD in pediatric SCT patients. Copyright © 2017 The American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.
Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code
NASA Technical Reports Server (NTRS)
Hendricks, Eric S.
2016-01-01
The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.
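The physics relevant to this flow regime is the standard compressible-flow result that mass flow per unit area is maximized when the throat reaches Mach one. The snippet below evaluates that textbook relation for an ideal gas; it is only an illustration of the choking condition, not OTAC code:

```python
import math

def choked_mass_flow(p0, T0, area, gamma=1.4, R=287.0):
    """Ideal-gas choked mass flow rate [kg/s] for stagnation pressure p0 [Pa],
    stagnation temperature T0 [K], and throat area [m^2]."""
    term = (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))
    return area * p0 * math.sqrt(gamma / (R * T0)) * term

# Example with illustrative turbine-like conditions: 1 MPa, 1200 K, 0.01 m^2 throat
print(f"choked mass flow: {choked_mass_flow(1.0e6, 1200.0, 0.01):.2f} kg/s")
```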
Nielsen, Martha G.; Locke, Daniel B.
2012-01-01
In order to evaluate water availability in the State of Maine, the U.S. Geological Survey (USGS) and the Maine Geological Survey began a cooperative investigation to provide the first rigorous evaluation of watersheds deemed "at risk" because of the combination of instream flow requirements and proportionally large water withdrawals. The study area for this investigation includes the Harvey and Merrill Brook watersheds and the Freeport aquifer in the towns of Freeport, Pownal, and Yarmouth, Maine. A numerical groundwater-flow model was used to evaluate groundwater withdrawals, groundwater-surface-water interactions, and the effect of water-management practices on streamflow. The water budget illustrates the effect that groundwater withdrawals have on streamflow and the movement of water within the system. Streamflow measurements were made following standard USGS techniques, from May through September 2009 at one site in the Merrill Brook watershed and four sites in the Harvey Brook watershed. A record-extension technique was applied to estimate long-term monthly streamflows at each of the five sites. The conceptual model of the groundwater system consists of a deep, confined aquifer (the Freeport aquifer) in a buried valley that trends through the middle of the study area, covered by a discontinuous confining unit, and topped by a thin upper saturated zone that is a mixture of sandy units, till, and weathered clay. Harvey and Merrill Brooks flow southward through the study area, and receive groundwater discharge from the upper saturated zone and from the deep aquifer through previously unknown discontinuities in the confining unit. The Freeport aquifer gets most of its recharge from local seepage around the edges of the confining unit; the remainder is received as inflow from the north within the buried valley. Groundwater withdrawals from the Freeport aquifer in the study area were obtained from the local water utility and estimated for other categories. Overall, the public-supply withdrawals (105.5 million gallons per year (Mgal/yr)) were much greater than those for any other category, being almost 7 times greater than all domestic well withdrawals (15.3 Mgal/yr). Industrial withdrawals in the study area (2.0 Mgal/yr) are mostly by a company that withdraws from an aquifer at the edge of the Merrill Brook watershed. Commercial withdrawals are very small (1.0 Mgal/yr), and no irrigation or other agricultural withdrawals were identified in this study area. A three-dimensional, steady-state groundwater-flow model was developed to evaluate stream-aquifer interactions and streamflow depletion from pumping, to help refine the conceptual model, and to predict changes in streamflow resulting from changes in pumping and recharge. Groundwater levels and flow in the Freeport aquifer study area were simulated with the three-dimensional, finite-difference groundwater-flow modeling code, MODFLOW-2005. Study area hydrology was simulated with a 3-layer model, under steady-state conditions. The groundwater model was used to evaluate changes that could occur in the water budgets of three parts of the local hydrologic system (the Harvey Brook watershed, the Merrill Brook watershed, and the buried aquifer from which pumping occurs) under several different climatic and pumping scenarios.
The scenarios were (1) no pumping well withdrawals; (2) current (2009) pumping, but simulated drought conditions (20-percent reduction in recharge); (3) current (2009) recharge, but a 50-percent increase in pumping well withdrawals for public supply; and (4) drought conditions and increased pumping combined. In simulated drought situations, the overall recharge to the buried valley is about 15 percent less and the total amount of streamflow in the model area is reduced by about 19 percent. Without pumping, infiltration to the buried valley aquifer around the confining unit decreased by a small amount (0.05 million gallons per day (Mgal/d)), and discharge to the streams increased by about 8 percent (0.3 Mgal/d). A 50-percent increase in pumping resulted in a simulated decrease in streamflow discharge of about 4 percent (0.14 Mgal/d). Streamflow depletion in Harvey Brook was evaluated by use of the numerical groundwater-flow model and an analytical model. The analytical model estimated negligible depletion from Harvey Brook under current (2009) pumping conditions, whereas the numerical model estimated that flow to Harvey Brook decreased 0.38 cubic feet per second (ft3/s) because of the pumping well withdrawals. A sensitivity analysis of the analytical model method showed that a cursory evaluation of streamflow depletion using an analytical model and available information may result in a very wide range of results, depending on how well the hydraulic conductivity variables and aquifer geometry of the system are known, and how well the aquifer fits the assumptions of the model. Using the analytical model to evaluate the streamflow depletion with an incomplete understanding of the hydrologic system gave results that seem unlikely to reflect actual streamflow depletion in the Freeport aquifer study area. In contrast, the groundwater-flow model was a more robust method of evaluating the amount of streamflow depletion that results from withdrawals in the Freeport aquifer, and could be used to evaluate streamflow depletion in both streams. Simulations of streamflow without pumping for each measurement site were compared to the calibrated-model streamflow (with pumping), the difference in the total being streamflow depletion. Simulations without pumping resulted in a simulated increase in the steady-state flow rate of 0.38 ft3/s in Harvey Brook and 0.01 ft3/s in Merrill Brook. This translates into a streamflow-depletion amount equal to about 8.5 percent of the steady-state base flow in Harvey Brook, and an unmeasurable amount of depletion in Merrill Brook. If pumping was increased by 50 percent and recharge reduced by 20 percent, the amount of streamflow depletion in Harvey Brook could reach 1.41 ft3/s.
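Analytical screening estimates of streamflow depletion such as the one discussed above are often based on the Glover-Balmer solution, which gives the fraction of the pumping rate captured from the stream as a function of time under idealized assumptions (fully penetrating stream, homogeneous aquifer). The sketch below uses illustrative parameters, not the Freeport aquifer values:

```python
from math import sqrt
from scipy.special import erfc

def depletion_fraction(d, T, S, t):
    """Glover-Balmer streamflow depletion fraction.
    d: well-to-stream distance [ft], T: transmissivity [ft^2/d],
    S: storativity [-], t: time since pumping began [d]."""
    return erfc(sqrt(d * d * S / (4.0 * T * t)))

# Illustrative values: well 500 ft from the stream, pumping steadily for one year
q_fraction = depletion_fraction(d=500.0, T=1000.0, S=0.1, t=365.0)
print(f"fraction of the pumping rate supplied by streamflow depletion: {q_fraction:.2f}")
```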
Chemical and Dynamical Impacts of Stratospheric Sudden Warmings on Arctic Ozone Variability
NASA Technical Reports Server (NTRS)
Strahan, S. E.; Douglass, A. R.; Steenrod, S. D.
2016-01-01
We use the Global Modeling Initiative (GMI) chemistry and transport model with Modern-Era Retrospective Analysis for Research and Applications (MERRA) meteorological fields to quantify heterogeneous chemical ozone loss in Arctic winters 2005-2015. Comparisons to Aura Microwave Limb Sounder N2O and O3 observations show the GMI simulation credibly represents the transport processes and net heterogeneous chemical loss necessary to simulate Arctic ozone. We find that the maximum seasonal ozone depletion varies linearly with the number of cold days and with wave driving (eddy heat flux) calculated from MERRA fields. We use this relationship and MERRA temperatures to estimate seasonal ozone loss from 1993 to 2004, when inorganic chlorine levels were in the same range as during the Aura period. Using these loss estimates and the observed March mean 63-90N column O3, we quantify the sensitivity of the ozone dynamical resupply to wave driving, separating it from the sensitivity of ozone depletion to wave driving. The results show that about 2/3 of the deviation of the observed March Arctic O3 from an assumed climatological mean is due to variations in O3 resupply and about 1/3 is due to depletion. Winters with a stratospheric sudden warming (SSW) before mid-February have about 1/3 the depletion of winters without one and export less depletion to the midlatitudes. However, a larger effect on the spring midlatitude ozone comes from dynamical differences between warm and cold Arctic winters, which can mask or add to the impact of exported depletion.
Panno, Angelo; Carrus, Giuseppe; Lafortezza, Raffaele; Mariani, Luigi; Sanesi, Giovanni
2017-11-01
Air temperatures are increasing because of global climate change. A warming phenomenon strongly related to global climate change is the urban heat island. It has been shown that the hotter temperatures occurring in cities during the summer negatively affect human wellbeing, but little is known about the potential mechanisms underlying the relationships between hotter temperatures, cognitive psychological resources and wellbeing. The aim of the present research is to understand whether, and how, spending time in urban green spaces, which can be considered as a specific kind of Nature-Based Solution (NBS), helps the recovery of cognitive resources and wellbeing. The main hypothesis is that contact with urban green is related to wellbeing through the depletion of cognitive resources (i.e., ego depletion). Moreover, we expected that individuals showing higher scores of ego depletion also report a higher estimate of the maximum temperature reached during the summer. The results of a survey (N = 115) conducted among visitors to Parco Nord Milano, a large urban park located in Milan (Italy), point out that people visiting the park during the summer show a higher level of wellbeing as well as a lower level of ego depletion. A mediation analysis shows that visiting urban green spaces is associated with greater wellbeing through less ego depletion. Our results also point out that, as expected, people showing a higher level of ego depletion tend to overestimate the maximum air temperature. Implications for future studies and applied interventions regarding the role of NBS to promote human wellbeing are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
Winek, Katarzyna; Engel, Odilo; Koduah, Priscilla; Heimesaat, Markus M.; Fischer, André; Bereswill, Stefan; Dames, Claudia; Kershaw, Olivia; Gruber, Achim D.; Curato, Caterina; Oyama, Naoki; Meisel, Christian; Meisel, Andreas
2016-01-01
Background and Purpose— Antibiotics disturbing microbiota are often used in treatment of poststroke infections. A bidirectional brain–gut microbiota axis was recently suggested as a modulator of nervous system diseases. We hypothesized that gut microbiota may be an important player in the course of stroke. Methods— We investigated the outcome of focal cerebral ischemia in C57BL/6J mice after an 8-week decontamination with quintuple broad-spectrum antibiotic cocktail. These microbiota-depleted animals were subjected to 60 minutes middle cerebral artery occlusion or sham operation. Infarct volume was measured using magnetic resonance imaging, and mice were monitored clinically throughout the whole experiment. At the end point, tissues were preserved for further analysis, comprising histology and immunologic investigations using flow cytometry. Results— We found significantly decreased survival in the middle cerebral artery occlusion microbiota-depleted mice when the antibiotic cocktail was stopped 3 days before surgery (compared with middle cerebral artery occlusion specific pathogen-free and sham-operated microbiota-depleted mice). Moreover, all microbiota-depleted animals in which antibiotic treatment was terminated developed severe acute colitis. This phenotype was rescued by continuous antibiotic treatment or colonization with specific pathogen-free microbiota before surgery. Further, infarct volumes on day one did not differ between any of the experimental groups. Conclusions— Conventional microbiota ensures intestinal protection in the mouse model of experimental stroke and prevents development of acute and severe colitis in microbiota-depleted mice not given antibiotic protection after cerebral ischemia. Our experiments raise the clinically important question as to whether microbial colonization or specific microbiota are crucial for stroke outcome. PMID:27056982
Alkaitis, Matthew S; Wang, Honghui; Ikeda, Allison K; Rowley, Carol A; MacCormick, Ian J C; Chertow, Jessica H; Billker, Oliver; Suffredini, Anthony F; Roberts, David J; Taylor, Terrie E; Seydel, Karl B; Ackerman, Hans C
2016-12-15
Plasmodium infection depletes arginine, the substrate for nitric oxide synthesis, and impairs endothelium-dependent vasodilation. Increased conversion of arginine to ornithine by parasites or host arginase is a proposed mechanism of arginine depletion. We used high-performance liquid chromatography to measure plasma arginine, ornithine, and citrulline levels in Malawian children with cerebral malaria and in mice infected with Plasmodium berghei ANKA with or without the arginase gene. Heavy isotope-labeled tracers measured by quadrupole time-of-flight liquid chromatography-mass spectrometry were used to quantify the in vivo rate of appearance and interconversion of plasma arginine, ornithine, and citrulline in infected mice. Children with cerebral malaria and P. berghei-infected mice demonstrated depletion of plasma arginine, ornithine, and citrulline. Knockout of Plasmodium arginase did not alter arginine depletion in infected mice. Metabolic tracer analysis demonstrated that plasma arginase flux was unchanged by P. berghei infection. Instead, infected mice exhibited decreased rates of plasma arginine, ornithine, and citrulline appearance and decreased conversion of plasma citrulline to arginine. Notably, plasma arginine use by nitric oxide synthase was decreased in infected mice. Simultaneous arginine and ornithine depletion in malaria parasite-infected children cannot be fully explained by plasma arginase activity. Our mouse model studies suggest that plasma arginine depletion is driven primarily by a decreased rate of appearance. Published by Oxford University Press for the Infectious Diseases Society of America 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Srivastava, R.; Mehmed, Oral
2002-01-01
An aeroelastic analysis system for flutter and forced response analysis of turbomachines based on a two-dimensional linearized unsteady Euler solver has been developed. The ASTROP2 code, an aeroelastic stability analysis program for turbomachinery, was used as a basis for this development. The ASTROP2 code uses strip theory to couple a two-dimensional aerodynamic model with a three-dimensional structural model. The code was modified to include forced response capability. The formulation was also modified to include aeroelastic analysis with mistuning. A linearized unsteady Euler solver, LINFLX2D, is added to model the unsteady aerodynamics in ASTROP2. By calculating the unsteady aerodynamic loads using LINFLX2D, it is possible to include the effects of transonic flow on flutter and forced response in the analysis. The stability is inferred from an eigenvalue analysis. The revised code, ASTROP2-LE (ASTROP2 using Linearized Euler aerodynamics), is validated by comparing the predictions with those obtained using linear unsteady aerodynamic solutions.
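Stability in an analysis of this type is judged from the eigenvalues of the coupled aeroelastic system: any eigenvalue with a positive real part indicates flutter. A generic sketch of that check for a small hypothetical state-space system (the matrix is made up and unrelated to ASTROP2-LE internals):

```python
import numpy as np

# Hypothetical linearized aeroelastic system x_dot = A x (2 structural DOF -> 4 states)
A = np.array([[  0.0,   0.0,  1.0,  0.0],
              [  0.0,   0.0,  0.0,  1.0],
              [-50.0,   5.0, -0.4,  0.1],
              [  4.0, -80.0,  0.2, -0.3]])

eigvals = np.linalg.eigvals(A)
unstable = [lam for lam in eigvals if lam.real > 0.0]
print("eigenvalues:", np.round(eigvals, 3))
print("flutter predicted" if unstable else "stable: all eigenvalues have negative real parts")
```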
Keshishian, Hasmik; Burgess, Michael W; Specht, Harrison; Wallace, Luke; Clauser, Karl R; Gillette, Michael A; Carr, Steven A
2017-08-01
Proteomic characterization of blood plasma is of central importance to clinical proteomics and particularly to biomarker discovery studies. The vast dynamic range and high complexity of the plasma proteome have, however, proven to be serious challenges and have often led to unacceptable tradeoffs between depth of coverage and sample throughput. We present an optimized sample-processing pipeline for analysis of the human plasma proteome that provides greatly increased depth of detection, improved quantitative precision and much higher sample analysis throughput as compared with prior methods. The process includes abundant protein depletion, isobaric labeling at the peptide level for multiplexed relative quantification and ultra-high-performance liquid chromatography coupled to accurate-mass, high-resolution tandem mass spectrometry analysis of peptides fractionated off-line by basic pH reversed-phase (bRP) chromatography. The overall reproducibility of the process, including immunoaffinity depletion, is high, with a process replicate coefficient of variation (CV) of <12%. Using isobaric tags for relative and absolute quantitation (iTRAQ) 4-plex, >4,500 proteins are detected and quantified per patient sample on average, with two or more peptides per protein and starting from as little as 200 μl of plasma. The approach can be multiplexed up to 10-plex using tandem mass tags (TMT) reagents, further increasing throughput, albeit with some decrease in the number of proteins quantified. In addition, we provide a rapid protocol for analysis of nonfractionated depleted plasma samples analyzed in 10-plex. This provides ∼600 quantified proteins for each of the ten samples in ∼5 h of instrument time.
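The quoted process reproducibility is a coefficient of variation computed across process replicates. A small sketch of how a per-protein CV distribution could be summarized from a quantification matrix, using synthetic numbers rather than the published data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic reporter-ion intensities: 1,000 proteins, each observed in 3 process replicates
abundance = rng.lognormal(mean=10.0, sigma=1.0, size=(1000, 1))       # per-protein abundance
intensities = abundance * rng.normal(1.0, 0.08, size=(1000, 3))       # ~8% process-level noise

cv = intensities.std(axis=1, ddof=1) / intensities.mean(axis=1) * 100.0
print(f"median process CV: {np.median(cv):.1f}%  "
      f"(fraction of proteins with CV < 12%: {(cv < 12).mean():.0%})")
```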
EAC: A program for the error analysis of STAGS results for plates
NASA Technical Reports Server (NTRS)
Sistla, Rajaram; Thurston, Gaylen A.; Bains, Nancy Jane C.
1989-01-01
A computer code is now available for estimating the error in results from the STAGS finite element code for a shell unit consisting of a rectangular orthotropic plate. This memorandum contains basic information about the computer code EAC (Error Analysis and Correction) and describes the connection between the input data for the STAGS shell units and the input data necessary to run the error analysis code. The STAGS code returns a set of nodal displacements and a discrete set of stress resultants; the EAC code returns a continuous solution for displacements and stress resultants. The continuous solution is defined by a set of generalized coordinates computed in EAC. The theory and the assumptions that determine the continuous solution are also outlined in this memorandum. An example application of the code is presented, and instructions for its use on the Cyber and VAX machines are provided.
Development of probabilistic multimedia multipathway computer codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, C.; LePoire, D.; Gnanapragasam, E.
2002-01-01
The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
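The probabilistic modules propagate input-parameter distributions through the deterministic dose model by sampling. A minimal sketch of that idea with a toy dose expression and hypothetical parameter distributions (these are not RESRAD models or default distributions):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical input distributions (placeholders, not RESRAD defaults)
soil_concentration = rng.lognormal(mean=0.0, sigma=0.3, size=n)     # pCi/g
ingestion_rate     = rng.triangular(50.0, 100.0, 200.0, size=n)     # mg/day
dose_factor        = 1.0e-4                                         # toy conversion factor

# Toy dose model: concentration x intake x conversion factor
dose = soil_concentration * ingestion_rate * dose_factor

print(f"mean dose:       {dose.mean():.4f}")
print(f"95th percentile: {np.percentile(dose, 95):.4f}")
```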
Deterministic methods for multi-control fuel loading optimization
NASA Astrophysics Data System (ADS)
Rahman, Fariz B. Abdul
We have developed a multi-control fuel loading optimization code for pressurized water reactors based on deterministic methods. The objective is to flatten the fuel burnup profile, which maximizes overall energy production. The optimal control problem is formulated using the method of Lagrange multipliers and the direct adjoining approach for treatment of the inequality power peaking constraint. The optimality conditions are derived for a multi-dimensional multi-group optimal control problem via calculus of variations. Because the Hamiltonian is linear in the control, the optimal control problem is solved using the gradient method to minimize the Hamiltonian and a Newton step formulation to obtain the optimal control. We are able to satisfy the power peaking constraint during depletion with the control at beginning of cycle (BOC) by building the proper burnup path forward in time and utilizing the adjoint burnup to propagate the information back to the BOC. Our test results show that we are able to achieve our objective and satisfy the power peaking constraint during depletion using either the fissile enrichment or burnable poison as the control. Our fuel loading designs show an increase of 7.8 equivalent full power days (EFPDs) in cycle length compared with 517.4 EFPDs for the AP600 first cycle.
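The gradient-plus-penalty idea behind this kind of optimization can be caricatured with a much simpler example: flattening a toy "burnup" profile by adjusting zone enrichments under an upper-bound penalty. This is only a schematic illustration of a projected-gradient step with a penalized constraint, not the thesis formulation:

```python
import numpy as np

n_zones = 8
enrich  = np.full(n_zones, 3.0)                # initial zone enrichments, wt% (toy values)
weights = np.linspace(1.2, 0.8, n_zones)       # toy zone importances
target  = 40.0                                 # target zone burnup (toy units)
limit   = 1.3 * target                         # power-peaking-style upper bound (toy)

def burnup(e):
    # Toy linear response: zone burnup proportional to enrichment times importance
    return 12.0 * e * weights

step = 1e-4
for _ in range(500):
    b = burnup(enrich)
    # Objective gradient: flatten burnup around the target
    grad_obj = 2.0 * (b - target) * 12.0 * weights
    # Penalty gradient: quadratic penalty on exceeding the upper bound
    grad_pen = 2.0 * 50.0 * np.maximum(b - limit, 0.0) * 12.0 * weights
    enrich = np.clip(enrich - step * (grad_obj + grad_pen), 1.5, 5.0)  # keep values physical

b = burnup(enrich)
print("final enrichments:", np.round(enrich, 2))
print("burnup flatness (max/mean):", round(b.max() / b.mean(), 3))
```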
Zhang, Liangyu; Ward, Jordan D.; Cheng, Ze; Dernburg, Abby F.
2015-01-01
Experimental manipulation of protein abundance in living cells or organisms is an essential strategy for investigation of biological regulatory mechanisms. Whereas powerful techniques for protein expression have been developed in Caenorhabditis elegans, existing tools for conditional disruption of protein function are far more limited. To address this, we have adapted the auxin-inducible degradation (AID) system discovered in plants to enable conditional protein depletion in C. elegans. We report that expression of a modified Arabidopsis TIR1 F-box protein mediates robust auxin-dependent depletion of degron-tagged targets. We document the effectiveness of this system for depletion of nuclear and cytoplasmic proteins in diverse somatic and germline tissues throughout development. Target proteins were depleted in as little as 20-30 min, and their expression could be re-established upon auxin removal. We have engineered strains expressing TIR1 under the control of various promoter and 3′ UTR sequences to drive tissue-specific or temporally regulated expression. The degron tag can be efficiently introduced by CRISPR/Cas9-based genome editing. We have harnessed this system to explore the roles of dynamically expressed nuclear hormone receptors in molting, and to analyze meiosis-specific roles for proteins required for germ line proliferation. Together, our results demonstrate that the AID system provides a powerful new tool for spatiotemporal regulation and analysis of protein function in a metazoan model organism. PMID:26552885
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tiwari, Pragya; Srivastava, A. K.; Khattak, B. Q.
Polymethyl methacrylate (PMMA) is characterized for electron beam interactions in the resist layer in lithographic applications. Free-standing PMMA thin films were prepared by the solvent casting method. These films were irradiated with a 30 keV electron beam at different doses. Structural and chemical properties of the films were studied by means of X-ray diffraction (XRD) and Fourier transform infrared (FTIR) spectroscopy. The XRD results showed that amorphization increases with electron beam irradiation dose. FTIR spectroscopic analysis reveals that electron beam irradiation promotes scission of the carbonyl group, depletes hydrogen, and converts the polymeric structure into a hydrogen-depleted carbon network.
Replacement of ozone depleting and toxic chemicals in gravimetric analysis of non-volatile residue
NASA Technical Reports Server (NTRS)
Arnold, G. S.; Uht, J. C.; Sinsheimer, F. B.
1995-01-01
The standard tests for determining nonvolatile residue accretion on spacecraft surfaces and in clean processing facilities rely on the use of halogenated solvents that are targeted for elimination because of their toxic or ozone-depleting natures. This paper presents a literature-based screening survey for candidate replacement solvents. Potential replacements were evaluated for their vapor pressure, toxicity, and solvent properties. Three likely candidates were identified: ethyl acetate, methyl acetate, and acetone. Laboratory tests are presented that evaluate the suitability of these candidate replacement solvents.
Fryar-Williams, Stephanie
2016-01-01
The Mental Health Biomarker Project (2010–2016) explored variables for psychosis in schizophrenia and schizoaffective disorder. Blood samples from 67 highly characterized symptomatic cases and 67 gender- and age-matched control participants were analyzed for methyl tetrahydrofolate reductase (MTHFR) 677C → T gene variants and for vitamin B6, B12 and D, folate, unbound copper, and zinc, cofactors for enzymes in the methylation cycle and related catecholamine pathways. Urine samples were analyzed for indole-catecholamines, their metabolites, and the oxidative-stress marker hydroxylpyrolline-2-one (HPL). Rating scales were the Brief Psychiatric Rating Scale, Positive and Negative Syndrome Scale, Global Assessment of Function scale, Clinical Global Impression (CGI) score, and Social and Occupational Functioning Assessment Scale (SOFAS). Analysis used Spearman’s correlations, receiver operating characteristics and structural equation modeling (SEM). The correlative pattern of variables in the overall participant sample strongly implicated monoamine oxidase (MAO) enzyme inactivity, so the significant role of MAO’s cofactor flavin adenine dinucleotide and its precursor flavin mononucleotide (FMN) within the biochemical pathways was investigated and confirmed at 71% on SEM of the total sample. Splitting the data sets by MTHFR 677C → T polymorphism variant coding for the MTHFR enzyme revealed that biochemistry variables relating to the wild-type enzyme differed markedly in pattern from those coded by the homozygous variant, and that the heterozygous-variant pattern resembled the wild-type-coded pattern. The MTHFR 677C → T-wild and -heterozygous gene variants have a pattern of depleted vitamin cofactors characteristic of flavin insufficiency with under-methylation and severe oxidative stress. The second, homozygous MTHFR 677TT pattern related to an elevated copper:zinc ratio and a vitamin pattern related to flavin sufficiency and risk of over-methylation. The two gene variants and their different biochemical phenotypes govern findings in relation to case identification, illness severity, duration of illness, and functional disability in schizophrenia and schizoaffective psychosis, and establish a basis for trials of gene-guided precision treatment for the management of psychosis. PMID:27881965
Energy Savings Analysis of the Proposed NYStretch-Energy Code 2018
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Bing; Zhang, Jian; Chen, Yan
This study was conducted by the Pacific Northwest National Laboratory (PNNL) in support of the stretch energy code development led by the New York State Energy Research and Development Authority (NYSERDA). In 2017 NYSERDA developed its 2016 Stretch Code Supplement to the 2016 New York State Energy Conservation Construction Code (hereinafter referred to as “NYStretch-Energy”). NYStretch-Energy is intended as a model energy code for statewide voluntary adoption that anticipates other code advancements culminating in the goal of a statewide Net Zero Energy Code by 2028. Since then, NYSERDA has continued to develop the NYStretch-Energy Code 2018 edition. To support the effort, PNNL conducted energy simulation analysis to quantify the energy savings of proposed commercial provisions of the NYStretch-Energy Code (2018) in New York. The focus of this project is the 20% improvement over existing commercial model energy codes. A key requirement of the proposed stretch code is that it be ‘adoptable’ as an energy code, meaning that it must align with current code scope and limitations, and primarily impact building components that are currently regulated by local building departments. It is largely limited to prescriptive measures, which are what most building departments and design projects are most familiar with. This report describes a set of energy-efficiency measures (EEMs) that demonstrate 20% energy savings over ANSI/ASHRAE/IES Standard 90.1-2013 (ASHRAE 2013) across a broad range of commercial building types and all three climate zones in New York. In collaboration with the New Building Institute, the EEMs were developed from national model codes and standards, high-performance building codes and standards, regional energy codes, and measures being proposed as part of the on-going code development process. PNNL analyzed these measures using whole building energy models for selected prototype commercial buildings and multifamily buildings representing buildings in New York. Section 2 of this report describes the analysis methodology, including the building types and construction area weights update for this analysis, the baseline, and the method to conduct the energy saving analysis. Section 3 provides detailed specifications of the EEMs and bundles. Section 4 summarizes the results of individual EEMs and EEM bundles by building type, energy end-use and climate zone. Appendix A documents detailed descriptions of the selected prototype buildings. Appendix B provides energy end-use breakdown results by building type for both the baseline code and stretch code in all climate zones.
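The 20% target is expressed relative to whole-building energy use simulated under the baseline standard, so the comparison itself reduces to a percent-savings calculation on simulated energy use intensities. The numbers below are made up for illustration:

```python
# Hypothetical simulated site energy use intensities, kBtu/ft2-yr (illustrative only)
baseline_eui = 58.4    # prototype modeled under the baseline standard
stretch_eui  = 45.9    # same prototype with the proposed EEM bundle applied

savings = (baseline_eui - stretch_eui) / baseline_eui * 100.0
print(f"simulated savings over the baseline: {savings:.1f}%")   # about 21.4%
```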
New technologies for advanced three-dimensional optimum shape design in aeronautics
NASA Astrophysics Data System (ADS)
Dervieux, Alain; Lanteri, Stéphane; Malé, Jean-Michel; Marco, Nathalie; Rostaing-Schmidt, Nicole; Stoufflet, Bruno
1999-05-01
The analysis of complex flows around realistic aircraft geometries is becoming more and more predictive. In order to obtain this result, the complexity of flow analysis codes has been constantly increasing, involving more refined fluid models and sophisticated numerical methods. These codes can only run on top computers, exhausting their memory and CPU capabilities. It is, therefore, difficult to introduce the best analysis codes into a shape optimization loop: most previous works in the optimum shape design field used only simplified analysis codes. Moreover, as the most popular optimization methods are the gradient-based ones, the more complex the flow solver, the more difficult it is to compute the sensitivity code. However, emerging technologies are making such an ambitious project, namely including a state-of-the-art flow analysis code in an optimization loop, feasible. Among those technologies, there are three important issues that this paper wishes to address: shape parametrization, automated differentiation and parallel computing. Shape parametrization allows faster optimization by reducing the number of design variables; in this work, it relies on a hierarchical multilevel approach. The sensitivity code can be obtained using automated differentiation. The automated approach is based on software manipulation tools, which allow the differentiation to be quick and the resulting differentiated code to be rather fast and reliable. In addition, the parallel algorithms implemented in this work allow the resulting optimization software to run on increasingly larger geometries.
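The automated differentiation mentioned here can be illustrated in miniature with forward-mode dual numbers, which carry a value and its derivative together through the computation. This is a generic sketch of the principle, not the source-transformation tools used in the paper:

```python
class Dual:
    """Minimal forward-mode AD value: carries f(x) and f'(x) together."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__


def drag_like(x):
    # Stand-in for a flow-solver output as a function of one shape parameter
    return 3.0 * x * x + 2.0 * x + 1.0

x = Dual(1.5, 1.0)                 # seed derivative dx/dx = 1
out = drag_like(x)
print("value:", out.value)         # 3*2.25 + 3 + 1 = 10.75
print("sensitivity:", out.deriv)   # 6*1.5 + 2 = 11.0
```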
Intrasystem Analysis Program (IAP) code summaries
NASA Astrophysics Data System (ADS)
Dobmeier, J. J.; Drozd, A. L. S.; Surace, J. A.
1983-05-01
This report contains detailed descriptions and capabilities of the codes that comprise the Intrasystem Analysis Program. The four codes are: Intrasystem Electromagnetic Compatibility Analysis Program (IEMCAP), General Electromagnetic Model for the Analysis of Complex Systems (GEMACS), Nonlinear Circuit Analysis Program (NCAP), and Wire Coupling Prediction Models (WIRE). IEMCAP is used for computer-aided evaluation of electromagnetic compatibility (EMC) at all stages of an Air Force system's life cycle, applicable to aircraft, space/missile, and ground-based systems. GEMACS utilizes a Method of Moments (MOM) formalism with the Electric Field Integral Equation (EFIE) for the solution of electromagnetic radiation and scattering problems. The code employs both full matrix decomposition and Banded Matrix Iteration solution techniques and is expressly designed for large problems. NCAP is a circuit analysis code which uses the Volterra approach to solve for the transfer functions and node voltages of weakly nonlinear circuits. The WIRE programs deal with the application of multiconductor transmission line theory to the prediction of cable coupling for specific classes of problems.
NASA Technical Reports Server (NTRS)
Shapiro, Wilbur
1991-01-01
The industrial codes will consist of modules of 2-D and simplified 2-D or 1-D codes, intended for expeditious parametric studies, analysis, and design of a wide variety of seals. Integration into a unified system is accomplished by the industrial Knowledge Based System (KBS), which will also provide user-friendly interaction, context-sensitive and hypertext help, design guidance, and an expandable database. The types of analysis to be included with the industrial codes are interfacial performance (leakage, load, stiffness, friction losses, etc.), thermoelastic distortions, and dynamic response to rotor excursions. The first of these codes to be completed, which are presently being incorporated into the KBS, are the incompressible cylindrical code, ICYL, and the compressible cylindrical code, GCYL.
Green Curriculum Analysis in Technological Education
ERIC Educational Resources Information Center
Chakraborty, Arpita; Singh, Manvendra Pratap; Roy, Mousumi
2018-01-01
With rapid industrialization and technological development, India is facing adverse effects of an unsustainable pattern of production and consumption. Education for sustainable development has been widely recognized to reduce the threat of environmental degradation and resource depletion. This paper used the content analysis method to explore the…
Cookson, Emma A; Conte, Ianina L; Dempster, John; Hannah, Matthew J; Carter, Tom
2013-12-01
Regulated secretion from endothelial cells is mediated by Weibel-Palade body (WPB) exocytosis. Plasma membrane cholesterol is implicated in regulating secretory granule exocytosis and fusion pore dynamics; however, its role in modulating WPB exocytosis is not clear. To address this we combined high-resolution electrochemical analysis of WPB fusion pore dynamics, by amperometry, with high-speed optical imaging of WPB exocytosis following cholesterol depletion or supplementation in human umbilical vein endothelial cells. We identified serotonin (5-HT) immunoreactivity in WPBs, and VMAT1 expression allowing detection of secreted 5-HT as discrete current spikes during exocytosis. A high proportion of spikes (∼75%) had pre-spike foot signals, indicating that WPB fusion proceeds via an initial narrow pore. Cholesterol depletion significantly reduced pre-spike foot signal duration and increased the rate of fusion pore expansion, whereas cholesterol supplementation had broadly the reverse effect. Cholesterol depletion slowed the onset of hormone-evoked WPB exocytosis, whereas its supplementation increased the rate of WPB exocytosis and hormone-evoked proregion secretion. Our results provide the first analysis of WPB fusion pore dynamics and highlight an important role for cholesterol in the regulation of WPB exocytosis.
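Amperometric spike analysis of the kind described here typically detects current transients above a noise threshold before measuring features such as pre-spike foot duration. The sketch below runs a simplified threshold detection on synthetic data; the thresholds and event shapes are illustrative, not the authors' analysis pipeline:

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 10_000                                   # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
trace = rng.normal(0.0, 1.0, t.size)          # baseline noise, pA

# Synthetic event: a small pre-spike "foot" followed by the main transient
onset = 5000
trace[onset:onset + 200] += 8.0               # foot, ~20 ms
trace[onset + 200:onset + 400] += 60.0        # main spike, ~20 ms

# Light smoothing, then a threshold of 5 x baseline noise SD
kernel = np.ones(25) / 25.0
smooth = np.convolve(trace, kernel, mode="same")
noise_sd = smooth[:4000].std()
above = smooth > 5.0 * noise_sd

rising = np.flatnonzero(np.diff(above.astype(int)) == 1)
print("detected event onsets (s):", t[rising])
```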
Proteomic analysis of hydrogen photoproduction in sulfur-deprived Chlamydomonas cells.
Chen, Mei; Zhao, Le; Sun, Yong-Le; Cui, Su-Xia; Zhang, Li-Fang; Yang, Bin; Wang, Jie; Kuang, Ting-Yun; Huang, Fang
2010-08-06
The green alga Chlamydomonas reinhardtii is a model organism to study H(2) metabolism in photosynthetic eukaryotes. To understand the molecular mechanism of H(2) metabolism, we used 2-DE coupled with MALDI-TOF and MALDI-TOF/TOF-MS to investigate proteomic changes of Chlamydomonas cells that undergo sulfur-depleted H(2) photoproduction process. In this report, we obtained 2-D PAGE soluble protein profiles of Chlamydomonas at three time points representing different phases leading to H(2) production. We found over 105 Coomassie-stained protein spots, corresponding to 82 unique gene products, changed in abundance throughout the process. Major changes included photosynthetic machinery, protein biosynthetic apparatus, molecular chaperones, and 20S proteasomal components. A number of proteins related to sulfate, nitrogen and acetate assimilation, and antioxidative reactions were also changed significantly. Other proteins showing alteration during the sulfur-depleted H(2) photoproduction process were proteins involved in cell wall and flagella metabolisms. In addition, among these differentially expressed proteins, 11 were found to be predicted proteins without functional annotation in the Chlamydomonas genome database. The results of this proteomic analysis provide new insight into molecular basis of H(2) photoproduction in Chlamydomonas under sulfur depletion.
NASA Technical Reports Server (NTRS)
Almroth, B. O.; Brogan, F. A.
1978-01-01
Basic information about the computer code STAGS (Structural Analysis of General Shells) is presented to describe to potential users the scope of the code and the solution procedures that are incorporated. Primarily, STAGS is intended for analysis of shell structures, although it has been extended to more complex shell configurations through the inclusion of springs and beam elements. The formulation is based on a variational approach in combination with local two dimensional power series representations of the displacement components. The computer code includes options for analysis of linear or nonlinear static stress, stability, vibrations, and transient response. Material as well as geometric nonlinearities are included. A few examples of applications of the code are presented for further illustration of its scope.
Md Yusof, Md Yuzaiful; Shaw, Daniel; El-Sherbiny, Yasser M; Dunn, Emma; Rawstron, Andy C; Emery, Paul; Vital, Edward M
2017-01-01
Objective To assess factors associated with primary and secondary non-response to rituximab in systemic lupus erythematosus (SLE) and evaluate management of secondary non-depletion non-response (2NDNR). Methods 125 patients with SLE treated with rituximab over 12 years were studied prospectively. A major clinical response was defined as improvement of all active British Isles Lupus Assessment Group (BILAG)-2004 domains to grade C/better and no A/B flare. Partial responders were defined by one persistent BILAG B. B-cell subsets were measured using highly sensitive flow cytometry. Patients with 2NDNR, defined by infusion reaction and defective depletion, were treated with ocrelizumab or ofatumumab. Results 117 patients had evaluable data. In cycle 1 (C1), 96/117 (82%) achieved BILAG response (major=50%, partial=32%). In multivariable analysis, younger age (OR 0.97, 95% CI 0.94 to 1.00) and B-cell depletion at 6 weeks (OR 3.22, 95% CI 1.24 to 8.33) increased the odds of major response. Complete depletion was predicted by normal complement and lower pre-rituximab plasmablasts and was not associated with increased serious infection post-rituximab. Seventy-seven (with data on 72) C1 responders were retreated on clinical relapse. Of these, 61/72 (85%) responded in cycle 2 (C2). Of the 11 C2 non-responders, nine met 2NDNR criteria (incidence=12%) and tested positive for anti-rituximab antibodies. Lack of concomitant immunosuppressant and higher pre-rituximab plasmablasts predicted 2NDNR. Five were switched to ocrelizumab/ofatumumab, and all depleted and responded. Conclusion Treatment with anti-CD20 agents can be guided by B-cell monitoring and should aim to achieve complete depletion. 2NDNR is associated with anti-rituximab antibodies, and switching to humanised agents restores depletion and response. In SLE, alternative anti-CD20 antibodies may be more consistently effective. PMID:28684557
Brejchová, Jana; Sýkora, Jan; Dlouhá, Kateřina; Roubalová, Lenka; Ostašov, Pavel; Vošahlíková, Miroslava; Hof, Martin; Svoboda, Petr
2011-12-01
Biophysical studies of fluorescence anisotropy of DPH and Laurdan generalized polarization were performed in plasma membranes (PM) isolated from control and cholesterol-depleted HEK293 cells stably expressing pertussis toxin (PTX)-insensitive DOR-Gi1α (Cys351-Ile351) fusion protein. PM isolated from control, PTX-untreated, cells were compared with PM isolated from PTX-treated cells. Results from both types of PM indicated that i) hydrophobic membrane interior was made more accessible to water molecules and more chaotically organized in cholesterol-depleted samples, ii) cholesterol depletion resulted in an overall increase in surface area of membrane, membrane fluidity, and mobility of its constituents. Analysis of DOR-Gi1α coupling in PTX-treated and PTX-untreated cells indicated that cholesterol depletion did not alter the agonist binding site of DOR (Bmax and Kd) but the ability of DOR agonist DADLE to activate G proteins was markedly impaired. In PTX-untreated membranes, EC50 for DADLE-stimulated [35S]GTPγS binding was shifted by one order of magnitude to the right: from 4.3±1.2×10⁻⁹ M to 2.2±1.3×10⁻⁸ M in control and cholesterol-depleted membrane samples, respectively. In PTX-treated membranes, EC50 was shifted from 4.5±1.1×10⁻⁹ M to 2.8±1.4×10⁻⁸ M. In summary, the perturbation of optimum PM organization by cholesterol depletion deteriorates functional coupling of DOR to covalently bound Gi1α as well as endogenously expressed PTX-sensitive G proteins of Gi/Go family while receptor ligand binding site is unchanged. The biophysical state of hydrophobic plasma (cell) membrane interior should be regarded as regulatory factor of DOR-signaling cascade. Copyright © 2011 Elsevier B.V. All rights reserved.
Human calprotectin affects the redox speciation of iron.
Nakashige, Toshiki G; Nolan, Elizabeth M
2017-08-16
We report that the metal-sequestering human host-defense protein calprotectin (CP, S100A8/S100A9 oligomer) affects the redox speciation of iron (Fe) in bacterial growth media and buffered aqueous solution. Under aerobic conditions and in the absence of an exogenous reducing agent, CP-Ser (S100A8(C42S)/S100A9(C3S) oligomer) depletes Fe from three different bacterial growth media preparations over a 48 h timeframe (T = 30 °C). The presence of the reducing agent β-mercaptoethanol accelerates this process and allows CP-Ser to deplete Fe over a ≈1 h timeframe. Fe-depletion assays performed with metal-binding-site variants of CP-Ser show that the hexahistidine (His6) site, which coordinates Fe(II) with high affinity, is required for Fe depletion. An analysis of Fe redox speciation in buffer containing Fe(III) citrate performed under aerobic conditions demonstrates that CP-Ser causes a time-dependent increase in the [Fe(II)]/[Fe(III)] ratio. Taken together, these results indicate that the hexahistidine site of CP stabilizes Fe(II) and thereby shifts the redox equilibrium of Fe to the reduced ferrous state under aerobic conditions. We also report that the presence of bacterial metabolites affects the Fe-depleting activity of CP-Ser. Supplementation of bacterial growth media with an Fe(III)-scavenging siderophore (enterobactin, staphyloferrin B, or desferrioxamine B) attenuates the Fe-depleting activity of CP-Ser. This result indicates that formation of Fe(III)-siderophore complexes blocks CP-mediated reduction of Fe(III) and hence the ability of CP to coordinate Fe(II). In contrast, the presence of pyocyanin (PYO), a redox-cycling phenazine produced by Pseudomonas aeruginosa that reduces Fe(III) to Fe(II), accelerates Fe depletion by CP-Ser under aerobic conditions. These findings indicate that the presence of microbial metabolites that contribute to metal homeostasis at the host/pathogen interface can affect the metal-sequestering function of CP.
NASA Astrophysics Data System (ADS)
Teinilä, K.; Frey, A.; Hillamo, R.; Tülp, H. C.; Weller, R.
2014-10-01
Aerosol chemical and physical properties were measured in 2010 at Neumayer research station, Antarctica. Samples for chemical analysis (ion chromatography) were collected using a Teflon/Nylon filter combination (TNy) sampler and with a multi-stage low-pressure impactor (SDI). Particle number concentration was measured continuously with a Grimm optical particle counter (OPC). Total particle number concentration varied widely throughout the year, and the highest number concentrations for particles larger than 0.3 μm were observed simultaneously with the highest sea salt concentrations. About 50% of the sea salt aerosol mass was found in the submicron size range. Below a particle aerodynamic diameter of 0.2 μm, the contribution of sea salt aerosols was negligible. Further analysis showed that sea salt aerosols had undergone physico-chemical processing, either during transport or during their formation. A high degree of chloride depletion was observed during austral summer, when acidic gases exhibit their characteristic seasonal maximum. Apart from chloride depletion, excess chloride relative to sodium was also detected in one SDI sample, indicating sodium depletion by mirabilite formation on freshly formed sea ice areas. Analysis of selected episodes showed that the concentration of sea salt particles, their modal structure, and their chemical composition are connected with their source areas, their formation mechanisms, and local transport history.
Background information for Van Aken on testing of NESTT product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reynolds, John G.
2016-11-18
Debris from explosives testing in a shot tank that contains 4 weight percent or less of explosive is shown to be non-reactive under the specified testing protocol in the Code of Federal Regulations. This debris can then be regarded as a non-hazardous waste on the basis of reactivity, when collected and packaged in a specified manner. If it is contaminated with radioactive components (e.g., depleted uranium), it can then be disposed of as radioactive waste or mixed waste, as appropriate (note that debris may contain other materials that render it hazardous, such as beryllium). We also discuss potential waste generation issues in contained firing operations that are applicable to the planned new Contained Firing Facility (CFF).
Interplay of Laser-Plasma Interactions and Inertial Fusion Hydrodynamics
Strozzi, D. J.; Bailey, D. S.; Michel, P.; ...
2017-01-12
The effects of laser-plasma interactions (LPI) on the dynamics of inertial confinement fusion hohlraums are investigated in this work via a new approach that self-consistently couples reduced LPI models into radiation-hydrodynamics numerical codes. The interplay between hydrodynamics and LPI—specifically stimulated Raman scatter and crossed-beam energy transfer (CBET)—mostly occurs via momentum and energy deposition into Langmuir and ion acoustic waves. This spatially redistributes energy coupling to the target, which affects the background plasma conditions and thus modifies laser propagation. In conclusion, this model shows reduced CBET and significant laser energy depletion by Langmuir waves, which reduce the discrepancy between modeling and data from hohlraum experiments on wall x-ray emission and capsule implosion shape.
Shift Verification and Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pandya, Tara M.; Evans, Thomas M.; Davidson, Gregory G
2016-09-07
This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.
Code Analysis and Refactoring with Clang Tools, Version 0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelley, Timothy M.
2016-12-23
Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.
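As a brief illustration of the first kind of analysis mentioned (finding where a variable is used), the sketch below walks a translation unit with the libclang Python bindings rather than the C++ tooling the examples are built on; the file name and variable name are placeholders, not part of the distributed example code.

import clang.cindex

def find_variable_uses(filename, var_name):
    """Return (file, line) pairs where var_name is referenced."""
    index = clang.cindex.Index.create()
    tu = index.parse(filename)
    uses = []
    for node in tu.cursor.walk_preorder():
        # DECL_REF_EXPR cursors mark references to previously declared entities.
        if node.kind == clang.cindex.CursorKind.DECL_REF_EXPR and node.spelling == var_name:
            if node.location.file is not None:
                uses.append((node.location.file.name, node.location.line))
    return uses

print(find_variable_uses("example.c", "counter"))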
Hypercube matrix computation task
NASA Technical Reports Server (NTRS)
Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.
1988-01-01
A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized, including both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).
Evaluation of the DRAGON code for VHTR design analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taiwo, T. A.; Kim, T. K.; Nuclear Engineering Division
2006-01-12
This letter report summarizes three activities that were undertaken in FY 2005 to gather information on the DRAGON code and to perform limited evaluations of the code performance when used in the analysis of the Very High Temperature Reactor (VHTR) designs. These activities include: (1) Use of the code to model the fuel elements of the helium-cooled and liquid-salt-cooled VHTR designs. Results were compared to those from another deterministic lattice code (WIMS8) and a Monte Carlo code (MCNP). (2) The preliminary assessment of the nuclear data library currently used with the code and libraries that have been provided by the IAEA WIMS-D4 Library Update Project (WLUP). (3) A DRAGON workshop held to discuss the code capabilities for modeling the VHTR.
NASA Astrophysics Data System (ADS)
Barber, Duncan Henry
During some postulated accidents at nuclear power stations, fuel cooling may be impaired. In such cases, the fuel heats up and the subsequent increased fission-gas release from the fuel to the gap may result in fuel sheath failure. After fuel sheath failure, the barrier between the coolant and the fuel pellets is lost or impaired, and gases and vapours from the fuel-to-sheath gap and other open voids in the fuel pellets can be vented. Gases and steam from the coolant can enter the broken fuel sheath and interact with the fuel pellet surfaces and the fission-product inclusions on the fuel surface (including material at the surface of the fuel matrix). The chemistry of this interaction is an important mechanism to model in order to assess fission-product releases from fuel. Starting in 1995, the computer program SOURCE 2.0 was developed by the Canadian nuclear industry to model fission-product release from fuel during such accidents. SOURCE 2.0 has employed an early thermochemical model of irradiated uranium dioxide fuel developed at the Royal Military College of Canada. To overcome the limitations of computers of that time, the implementation of the RMC model employed lookup tables of pre-calculated equilibrium conditions. In the intervening years, the RMC model has been improved, the power of computers has increased significantly, and thermodynamic subroutine libraries have become available. This thesis is the result of extensive work based on these three factors. A prototype computer program (referred to as SC11) has been developed that uses a thermodynamic subroutine library to calculate thermodynamic equilibria using Gibbs energy minimization. The Gibbs energy minimization requires the system temperature (T) and pressure (P), and the inventory of chemical elements (n) in the system. In order to calculate the inventory of chemical elements in the fuel, the list of nuclides and nuclear isomers modelled in SC11 had to be expanded from the list used by SOURCE 2.0. A benchmark calculation demonstrates improved agreement of the total inventory of the chemical elements included in the RMC fuel model with an ORIGEN-S calculation. ORIGEN-S is the Oak Ridge isotope generation and depletion computer program. The Gibbs energy minimizer requires a chemical database containing coefficients from which the Gibbs energy of pure compounds, gas and liquid mixtures, and solid solutions can be calculated. The RMC model of irradiated uranium dioxide fuel has been converted into the required format. The Gibbs energy minimizer has been incorporated into a new model of fission-product vaporization from the fuel surface. Calculated release fractions using the new code have been compared to results calculated with SOURCE IST 2.0P11 and to results of tests used in the validation of SOURCE 2.0. The new code shows improvements in agreement with experimental releases for a number of nuclides. Of particular significance is the better agreement between experimental and calculated release fractions for 140La. The improved agreement reflects the inclusion in the RMC model of the solubility of lanthanum (III) oxide (La2O3) in the fuel matrix. Calculated lanthanide release fractions from earlier computer programs were a challenge to environmental qualification analysis of equipment for some accident scenarios. The new prototype computer program would alleviate this concern.
Keywords: Nuclear Engineering; Material Science; Thermodynamics; Radioactive Material; Gibbs Energy Minimization; Actinide Generation and Depletion; Fission-Product Generation and Depletion.
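As a minimal sketch of the Gibbs energy minimization step described above (a toy ideal-gas system, not the RMC model, SC11, or SOURCE implementation), the following assumes placeholder standard chemical potentials and a two-element inventory; only the structure of the calculation, minimizing G at fixed T and P subject to element balance, mirrors the abstract.

import numpy as np
from scipy.optimize import minimize

species = ["H2O", "H2", "O2"]
mu0_RT = np.array([-92.0, -17.0, -25.0])   # assumed dimensionless mu_i0/RT at the system temperature
# Element balance matrix A (rows: H, O) and elemental inventory b (mol of atoms),
# here corresponding to 1 mol H2O plus 0.5 mol O2.
A = np.array([[2.0, 2.0, 0.0],
              [1.0, 0.0, 2.0]])
b = np.array([2.0, 2.0])
P = 1.0                                    # pressure relative to the ideal-gas reference state

def gibbs_over_RT(n):
    n = np.maximum(n, 1e-12)               # keep the logarithm arguments positive
    return float(np.sum(n * (mu0_RT + np.log(n / n.sum()) + np.log(P))))

result = minimize(gibbs_over_RT,
                  x0=np.array([0.9, 0.1, 0.55]),
                  bounds=[(1e-12, None)] * len(species),
                  constraints={"type": "eq", "fun": lambda n: A @ n - b},
                  method="SLSQP")
print(dict(zip(species, result.x)))        # equilibrium mole numbers for the toy system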
Green, Nancy
2005-04-01
We developed a Bayesian network coding scheme for annotating biomedical content in layperson-oriented clinical genetics documents. The coding scheme supports the representation of probabilistic and causal relationships among concepts in this domain, at a high enough level of abstraction to capture commonalities among genetic processes and their relationship to health. We are using the coding scheme to annotate a corpus of genetic counseling patient letters as part of the requirements analysis and knowledge acquisition phase of a natural language generation project. This paper describes the coding scheme and presents an evaluation of intercoder reliability for its tag set. In addition to giving examples of use of the coding scheme for analysis of discourse and linguistic features in this genre, we suggest other uses for it in analysis of layperson-oriented text and dialogue in medical communication.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom; Cristian Rabiti; Andrea Alfonsi
2012-10-01
PHISICS is a neutronics code system currently under development at the Idaho National Laboratory (INL). Its goal is to provide state-of-the-art simulation capability to reactor designers. The different modules of PHISICS currently under development are a nodal and semi-structured transport core solver (INSTANT), a depletion module (MRTAU) and a cross section interpolation (MIXER) module. The INSTANT module is the most developed of the modules mentioned above. Basic functionalities are ready to use, but the code is still in continuous development to extend its capabilities. This paper reports on the effort of coupling the nodal kinetics code package PHISICS (INSTANT/MRTAU/MIXER) to the thermal hydraulics system code RELAP5-3D, to enable full core and system modeling. This enables the modeling of coupled (thermal-hydraulics and neutronics) problems with more options for 3D neutron kinetics, compared to the existing diffusion theory neutron kinetics module in RELAP5-3D (NESTLE). In the second part of the paper, an overview of the OECD/NEA MHTGR-350 MW benchmark is given. This benchmark has been approved by the OECD, and is based on the General Atomics 350 MW Modular High Temperature Gas Reactor (MHTGR) design. The benchmark includes coupled neutronics thermal hydraulics exercises that require more capabilities than RELAP5-3D with NESTLE offers. Therefore, the MHTGR benchmark makes extensive use of the new PHISICS/RELAP5-3D coupling capabilities. The paper presents the preliminary results of the three steady state exercises specified in Phase I of the benchmark using PHISICS/RELAP5-3D.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom
2014-04-01
The INL PHISICS code system consists of three modules providing improved core simulation capability: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. Coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been finalized, and as part of the code verification and validation program the exercises defined for Phase I of the OECD/NEA MHTGR 350 MW Benchmark were completed. This paper provides an overview of the MHTGR Benchmark, and presents selected results of the three steady state exercises 1-3 defined for Phase I. For Exercise 1, a stand-alone steady-state neutronics solution for an End of Equilibrium Cycle Modular High Temperature Reactor (MHTGR) was calculated with INSTANT, using the provided geometry, material descriptions, and detailed cross-section libraries. Exercise 2 required the modeling of a stand-alone thermal fluids solution. The RELAP5-3D results of four sub-cases are discussed, consisting of various combinations of coolant bypass flows and material thermophysical properties. Exercise 3 combined the first two exercises in a coupled neutronics and thermal fluids solution, and the coupled code suite PHISICS/RELAP5-3D was used to calculate the results of two sub-cases. The main focus of the paper is a comparison of the traditional RELAP5-3D “ring” model approach vs. a much more detailed model that includes kinetics feedback on the individual block level and thermal feedbacks on a triangular sub-mesh. The higher fidelity of the block model is illustrated with comparison results on the temperature, power density and flux distributions, and the typical under-predictions produced by the ring model approach are highlighted.
Sports Stars: Analyzing the Performance of Astronomers at Visualization-based Discovery
NASA Astrophysics Data System (ADS)
Fluke, C. J.; Parrington, L.; Hegarty, S.; MacMahon, C.; Morgan, S.; Hassan, A. H.; Kilborn, V. A.
2017-05-01
In this data-rich era of astronomy, there is a growing reliance on automated techniques to discover new knowledge. The role of the astronomer may change from being a discoverer to being a confirmer. But what do astronomers actually look at when they distinguish between “sources” and “noise?” What are the differences between novice and expert astronomers when it comes to visual-based discovery? Can we identify elite talent or coach astronomers to maximize their potential for discovery? By looking to the field of sports performance analysis, we consider an established, domain-wide approach, where the expertise of the viewer (i.e., a member of the coaching team) plays a crucial role in identifying and determining the subtle features of gameplay that provide a winning advantage. As an initial case study, we investigate whether the SportsCode performance analysis software can be used to understand and document how an experienced H I astronomer makes discoveries in spectral data cubes. We find that the process of timeline-based coding can be applied to spectral cube data by mapping spectral channels to frames within a movie. SportsCode provides a range of easy-to-use methods for annotation, including feature-based codes and labels, text annotations associated with codes, and image-based drawing. The outputs, including instance movies that are uniquely associated with coded events, provide the basis for a training program or team-based analysis that could be used in unison with discipline-specific analysis software. In this coordinated approach to visualization and analysis, SportsCode can act as a visual notebook, recording the insight and decisions in partnership with established analysis methods. Alternatively, in situ annotation and coding of features would be a valuable addition to existing and future visualization and analysis packages.
Wurm, Philipp; Spindelboeck, Walter; Krause, Robert; Plank, Johannes; Fuchs, Gottfried; Bashir, Mina; Petritsch, Wolfgang; Halwachs, Bettina; Langner, Cord; Högenauer, Christoph
2017-01-01
Objective: Antibiotic therapy is a major risk factor for the development of diarrhea and colitis with varying severity. Often the origin of antibiotic-associated gastrointestinal deterioration remains elusive and no specific infectious agents can be discerned. Patients: We present three cases of intractable high-volume diarrhea associated with combined antibiotic and steroid therapy in critically ill patients not fitting into established disease entities. Cases presented with severe apoptotic enterocolitis resembling acute intestinal graft-versus-host disease. Microbiologic workup excluded known enteropathogens, but microbiota analysis revealed a severely depleted gut microbiota with concomitant opportunistic pathogen overgrowth. Interventions: Fecal microbiota transplantation, performed in one patient, was associated with correction of dysbiosis, rapid clinical improvement, and healing of enterocolitis. Conclusions: Our series represents a severe form of antibiotic-associated colitis in critically ill patients signified by microbiota depletion, and reestablishment of a physiologic gastrointestinal microbiota might be beneficial for this condition. PMID:28333760
König, Simone; Nitzki, Frauke; Uhmann, Anja; Dittmann, Kai; Theiss-Suennemann, Jennifer; Herrmann, Markus; Reichardt, Holger M; Schwendener, Reto; Pukrop, Tobias; Schulz-Schaeffer, Walter; Hahn, Heidi
2014-01-01
Basal cell carcinoma (BCC) belongs to the group of non-melanoma skin tumors and is the most common tumor in the western world. BCC arises due to mutations in the tumor suppressor gene Patched1 (Ptch). Analysis of the conditional Ptch knockout mouse model for BCC reveals that macrophages and dendritic cells (DC) of the skin play an important role in BCC growth restraining processes. This is based on the observation that a clodronate-liposome mediated depletion of these cells in the tumor-bearing skin results in significant BCC enlargement. The depletion of these cells does not modulate Ki67 or K10 expression, but is accompanied by a decrease in collagen-producing cells in the tumor stroma. Together, the data suggest that cutaneous macrophages and DC in the tumor microenvironment exert an antitumor effect on BCC.
Six Questions for the Resource Model of Control (and Some Answers)
Inzlicht, Michael; Berkman, Elliot
2017-01-01
The resource model of self-control casts self-control as a capacity that relies on some limited resource that exhausts with use. The model captured our imagination and brought much-needed attention on an important yet neglected psychological construct. Despite its success, basic issues with the model remain. Here, we ask six questions: (i) Does self-control really wane over time? (ii) Is ego depletion a form of mental fatigue? (iii) What is the resource that is depleted by ego depletion? (iv) How can changes in motivation, perception, and expectations replenish an exhausted resource? (v) Has the revised resource model unwittingly become a model about motivation? (vi) Do self-control exercises increase self-control? By providing some answers to these questions – including conducting a meta-analysis of the self-control training literature – we highlight how the resource model needs to be revised if not supplanted altogether. PMID:28966660
Upregulation of miR-146a by YY1 depletion correlates with delayed progression of prostate cancer
Huang, Yeqing; Tao, Tao; Liu, Chunhui; Guan, Han; Zhang, Guangyuan; Ling, Zhixin; Zhang, Lei; Lu, Kai; Chen, Shuqiu; Xu, Bin; Chen, Ming
2017-01-01
Previously published studies showed that excessive expression of miR-146a influences prostate cancer (PCa) cells in terms of apoptosis, progression, and viability. Although miR-146a acts as a tumor suppressor, current knowledge on the molecular mechanisms that control its expression in PCa is limited. In this study, gene set enrichment analysis (GSEA) showed negatively enriched expression of miR-146a target gene sets and positively enriched expression of gene sets suppressed by the enhancer of zeste homolog 2 (EZH2) after YY1 depletion in PCa cells. The current results demonstrated that the miR-146a levels in PCa tissues with high Gleason scores (>7) are significantly lower than those in PCa tissues with low Gleason scores (≤7), which were initially observed in the clinical specimens. An inverse relationship between YY1 and miR-146a expression was also observed. Experiments indicated decreased cell viability and proliferation and increased apoptosis after YY1 depletion, while inhibiting miR-146a could alleviate these negative effects of YY1 depletion. We also observed an inverse regulation of miR-146a transcription by YY1. On the basis of YY1 depletion, we determined that the expression of miR-146a increased after EZH2 knockdown. We validated that YY1 binds, and interacts with EZH2, at the miR-146a promoter, thereby repressing the transcriptional activity of miR-146a in PCa cells. Our results suggested that YY1 depletion repressed PCa cell viability and proliferation and induced apoptosis at least in a miR-146a-assisted manner. PMID:28101571
Feijão, Tália; Afonso, Olga; Maia, André F; Sunkel, Claudio E
2013-10-01
Kinetochores bind spindle microtubules and also act as signaling centers that monitor this interaction. Defects in kinetochore assembly lead to chromosome missegregation and aneuploidy. The interaction between microtubules and chromosomes involves a conserved super-complex of proteins, known as the KNL1/Mis12/Ndc80 (KMN) network, composed of the KNL1 (Spc105), Mis12, and Ndc80 complexes. Previous studies indicate that all components of the network are required for kinetochore-microtubule attachment and all play relevant functions in chromosome congression, biorientation, and segregation. Here, we report a comparative study addressing the role of the different KMN components using dsRNA and in vivo fluorescence microscopy in Drosophila S2 cells, allowing us to suggest that different KMN network components might perform different roles in chromosome segregation and mitotic checkpoint signaling. Depletion of different components results in mostly lateral kinetochore-microtubule attachments that are relatively stable on depletion of Mis12 or Ndc80 but very unstable after Spc105 depletion. In vivo analysis on depletion of Mis12, Ndc80, and to some extent Spc105, shows that lateral kinetochore-microtubule interactions are still functional, allowing poleward kinetochore movement. We also find that different KMN network components affect differently the localization of spindle assembly checkpoint (SAC) proteins at kinetochores. Depletion of Ndc80 and Spc105 abolishes the mitotic checkpoint, whereas depletion of Mis12 causes a delay in mitotic progression. Taken together, our results suggest that Mis12 and Ndc80 complexes help to properly orient microtubule attachment, whereas Spc105 plays a predominant role in the kinetochore-microtubule attachment as well as in the poleward movement of chromosomes, SAC response, and cell viability. Copyright © 2013 Wiley Periodicals, Inc.
Murnane, Kevin Sean; Perrine, Shane Alan; Finton, Brendan James; Galloway, Matthew Peter; Howell, Leonard Lee; Fantegrossi, William Edward
2011-01-01
Rationale Considerable evidence indicates that amphetamine derivatives can deplete brain monoaminergic neurotransmitters. However, the behavioral and cognitive consequences of neurochemical depletions induced by amphetamines are not well established. Objectives In this study, mice were exposed to dosing regimens of 3,4-methylenedioxymethamphetamine (MDMA), methamphetamine (METH), or para-chloroamphetamine (PCA) known to deplete the monoamine neurotransmitters dopamine and serotonin, and the effects of these dosing regimens on learning and memory were assessed. Methods In the same animals, we determined deficits in learning and memory via passive avoidance (PA) behavior and changes in tissue content of monoamine neurotransmitters and their primary metabolites in the striatum, frontal cortex, cingulate, hippocampus, and amygdala via ex vivo high pressure liquid chromatography. Results Consistent with previous studies, significant reductions in tissue content of dopamine and serotonin were readily apparent. In addition, exposure to METH and PCA impaired PA performance and resulted in significant depletions of dopamine, serotonin, and their metabolites in several brain regions. Multiple linear regression analysis revealed that the tissue concentration of dopamine in the anterior striatum was the strongest predictor of PA performance, with an additional significant contribution by the tissue concentration of the serotonin metabolite 5-hydroxyindoleacetic acid in the cingulate. In contrast to the effects of METH and PCA, exposure to MDMA did not deplete anterior striatal dopamine levels or cingulate levels of 5-hydroxyindoleacetic acid, and it did not impair PA performance. Conclusions These studies demonstrate that certain amphetamines impair PA performance in mice and that these impairments may be attributable to specific neurochemical depletions. PMID:21993877
ELEMENTAL DEPLETIONS IN THE MAGELLANIC CLOUDS AND THE EVOLUTION OF DEPLETIONS WITH METALLICITY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tchernyshyov, Kirill; Meixner, Margaret; Seale, Jonathan
2015-10-01
We present a study of the composition of gas and dust in the Large and Small Magellanic Clouds (LMC and SMC) using UV absorption spectroscopy. We measure P II and Fe II along 84 spatially distributed sightlines toward the MCs using archival Far Ultraviolet Spectroscopic Explorer observations. For 16 of those sightlines, we also measure Si II, Cr II, and Zn II from new Hubble Space Telescope Cosmic Origins Spectrograph observations. We analyze these spectra using a new spectral line analysis technique based on a semi-parametric Voigt profile model. We have combined these measurements with H I and H2 column densities and reference stellar abundances from the literature to derive gas-phase abundances, depletions, and gas-to-dust ratios (GDRs). Of our 84 P and 16 Zn measurements, 80 and 13, respectively, are depleted by more than 0.1 dex, suggesting that P and Zn abundances are not accurate metallicity indicators at and above the metallicity of the SMC. Si, Cr, and Fe are systematically less depleted in the SMC than in the Milky Way (MW) or LMC. The minimum Si depletion in the SMC is consistent with zero. We find GDR ranges of 190–565 in the LMC and 480–2100 in the SMC, which is broadly consistent with GDRs from the literature. These ranges represent actual location to location variation and are evidence of dust destruction and/or growth in the diffuse neutral phase of the interstellar medium. Where they overlap in metallicity, the gas-phase abundances of the MW, LMC, and SMC and damped Lyα systems evolve similarly with metallicity.
Natsch, Andreas; Gfeller, Hans
2008-12-01
A key step in the skin sensitization process is the formation of a covalent adduct between skin sensitizers and endogenous proteins and/or peptides in the skin. Based on this mechanistic understanding, there is a renewed interest in in vitro assays to determine the reactivity of chemicals toward peptides in order to predict their sensitization potential. A standardized peptide reactivity assay yielded a promising predictivity. This published assay is based on high-performance liquid chromatography with ultraviolet detection to quantify peptide depletion after incubation with test chemicals. We had observed that peptide depletion may be due to either adduct formation or peptide oxidation. Here we report a modified assay based on both liquid chromatography-mass spectrometry (LC-MS) analysis and detection of free thiol groups. This approach allows simultaneous determination of (1) peptide depletion, (2) peptide oxidation (dimerization), (3) adduct formation, and (4) thiol reactivity and thus generates a more detailed characterization of the reactivity of a molecule. Highly reactive molecules are further discriminated with a kinetic measure. The assay was validated on 80 chemicals. Peptide depletion could accurately be quantified both with LC-MS detection and depletion of thiol groups. The majority of the moderate/strong/extreme sensitizers formed detectable peptide adducts, but many sensitizers were also able to catalyze peptide oxidation. Whereas adduct formation was only observed for sensitizers, this oxidation reaction was also observed for two nonsensitizing fragrance aldehydes, indicating that peptide depletion might not always be regarded as sufficient evidence for rating a chemical as a sensitizer. Thus, this modified assay gives a more informed view of the peptide reactivity of chemicals to better predict their sensitization potential.
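For illustration only (not the published assay's exact data treatment), peptide depletion from either the LC-MS or the thiol readout is commonly expressed as the fractional loss of the peptide signal relative to a control incubation; the peak-area values below are invented.

def percent_depletion(peak_area_sample, peak_area_control):
    # Fractional loss of peptide relative to the vehicle control, in percent.
    return 100.0 * (1.0 - peak_area_sample / peak_area_control)

print(percent_depletion(peak_area_sample=3.1e5, peak_area_control=7.8e5))  # about 60% depleted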
Contributions of Human Cytochrome P450 Enzymes to Glyburide Metabolism*
Zhou, Lin; Naraharisetti, Suresh B.; Liu, Li; Wang, Honggang; Lin, Yvonne S.; Isoherranen, Nina; Unadkat, Jashvant D.; Hebert, Mary F.; Mao, Qingcheng
2011-01-01
Glyburide (GLB) is a widely used oral sulfonylurea for the treatment of gestational diabetes. Therapeutic use of GLB is often complicated by a substantial inter-individual variability in the pharmacokinetics and pharmacodynamics of the drug in human populations, which might be caused by inter-individual variations in factors such as GLB metabolism. Therefore, there has been a continued interest in identifying human cytochrome P450 (CYP) isoforms that play a major role in the metabolism of GLB. However, contrasting data are available in the present literature in this regard. In the present study, we systematically investigated the contributions of various human CYP isoforms (CYP3A4, CYP3A5, CYP2C8, CYP2C9, and CYP2C19) to in vitro metabolism of GLB. GLB depletion and metabolite formation in human liver microsomes were most significantly inhibited by the CYP3A inhibitor ketoconazole compared with the inhibitors of other CYP isoforms. Furthermore, multiple correlation analysis between GLB depletion and individual CYP activities was performed, demonstrating a significant correlation between GLB depletion and the CYP3A probe activity in 16 individual human liver microsomal preparations, but not between GLB depletion and the CYP2C19, CYP2C8, or CYP2C9 probe activity. By using recombinant supersomes overexpressing individual human CYP isoforms, we found that GLB could be depleted by all the enzymes tested; however, the intrinsic clearance (Vmax/Km) of CYP3A4 for GLB depletion was 4-17 times greater than that of other CYP isoforms. These results confirm that human CYP3A4 is the major enzyme involved in the in vitro metabolism of GLB. PMID:20437462
Behavioral Impulsivity Does Not Predict Naturalistic Alcohol Consumption or Treatment Outcomes
Mullen, Jillian; Mathias, Charles W.; Karns, Tara E.; Liang, Yuanyuan; Hill-Kapturczak, Nathalie; Roache, John D.; Lamb, Richard J.; Dougherty, Donald M.
2016-01-01
Objective The purpose of this study was to determine if behavioral impulsivity under multiple conditions (baseline, after alcohol consumption, or after serotonin depletion) predicted naturalistic alcohol use or treatment outcomes from a moderation-based contingency management intervention. Method The current data analysis pulls information from three phases of a large study: 1) Phase 1 examined baseline and the effects of alcohol use and serotonin depletion on three types of behavioral impulsivity: response initiation (IMT task), response inhibition (GoStop task), and delay discounting (SKIP task); 2) Phase 2 involved 28 days of naturalistic drinking; and 3) Phase 3 involved 3 months of contingency management. During phases 2 and 3, alcohol use was measured objectively using transdermal alcohol monitors. The results of each individual phase have been previously published, showing that at a group level the effects of alcohol consumption on impulsivity depended on the component of impulsivity being measured and the dose of alcohol consumed, that serotonin depletion had no effect on impulsivity, and that a moderation-based contingency management intervention reduced heavy drinking. Results The current analysis, combining data from those who completed all three phases (n = 67), showed that impulsivity measured at baseline, after alcohol consumption, or after serotonin depletion did not predict naturalistic drinking or treatment outcomes from a moderation-based CM treatment. Conclusions Contingency management interventions may prove to be an effective intervention for impulsive individuals; however, normal variations in measured impulsivity do not seem to relate to normal variations in drinking pattern or response to moderation-based contingency management. PMID:27746702
Depletion of juvenile hormone esterase extends larval growth in Bombyx mori.
Zhang, Zhongjie; Liu, Xiaojing; Shiotsuki, Takahiro; Wang, Zhisheng; Xu, Xia; Huang, Yongping; Li, Muwang; Li, Kai; Tan, Anjiang
2017-02-01
Two major hormones, juvenile hormone (JH) and 20-hydroxyecdysone (20E), regulate insect growth and development according to their precisely coordinated titres, which are controlled by both biosynthesis and degradation pathways. Juvenile hormone esterase (JHE) is the primary JH-specific degradation enzyme that plays a key role in regulating JH titers, along with JH epoxide hydrolase (JHEH) and JH diol kinase (JHDK). In the current study, a loss-of-function analysis of JHE in the silkworm, Bombyx mori, was performed by targeted gene disruption using the transgenic CRISPR/Cas9 (clustered regularly interspaced short palindromic repeats/RNA-guided Cas9 nucleases) system. Depletion of B. mori JHE (BmJHE) resulted in the extension of larval stages, especially the penultimate and ultimate larval stages, without deleterious effects to silkworm physiology. The expression of JHEH and JHDK was upregulated in mutant animals, indicating the existence of complementary routes in the JH metabolism pathway in which inactivation of one enzyme will activate other enzymes. RNA-Seq analysis of mutant animals revealed that genes involved in protein processing in the endoplasmic reticulum and in amino acid metabolism were affected by BmJHE depletion. Depletion of JHE and subsequent delayed JH metabolism activated genes in the TOR pathway, which are ultimately responsible for extending larval growth. The transgenic Cas9 system used in the current study provides a promising approach for analysing the actions of JH, especially in nondrosophilid insects. Furthermore, prolonging larval stages produced larger larvae and cocoons, which is greatly beneficial to silk production. Copyright © 2017 Elsevier Ltd. All rights reserved.
Thermodynamic analysis of the advanced zero emission power plant
NASA Astrophysics Data System (ADS)
Kotowicz, Janusz; Job, Marcin
2016-03-01
The paper presents the structure and parameters of the advanced zero emission power plant (AZEP). This concept is based on the replacement of the combustion chamber in a gas turbine by a membrane reactor. The reactor has three basic functions: (i) oxygen separation from the air through the membrane, (ii) combustion of the fuel, and (iii) heat transfer to heat the oxygen-depleted air. In the discussed unit, hot oxygen-depleted air is expanded in a turbine and then feeds a bottoming steam cycle (BSC) through the main heat recovery steam generator (HRSG). Flue gas leaving the membrane reactor feeds the second HRSG. The flue gas consists mainly of CO2 and water vapor; thus, CO2 separation involves only flue gas drying. Results of the thermodynamic analysis of the described power plant are presented.
LARGE SCALE DISASTER ANALYSIS AND MANAGEMENT: SYSTEM LEVEL STUDY ON AN INTEGRATED MODEL
The increasing intensity and scale of human activity across the globe, leading to severe depletion and deterioration of the Earth's natural resources, has meant that sustainability has emerged as a new paradigm of analysis and management. Sustainability, conceptually defined by the...
Mal-Xtract: Hidden Code Extraction using Memory Analysis
NASA Astrophysics Data System (ADS)
Lim, Charles; Syailendra Kotualubun, Yohanes; Suryadi; Ramli, Kalamullah
2017-01-01
Software packers have been used effectively to hide the original code inside a binary executable, making it more difficult for existing signature-based anti-malware software to detect malicious code inside the executable. A new method based on tracking written and rewritten memory sections is introduced to detect the exact end time of the unpacking routine and extract the original code from a packed binary executable using memory analysis in a software-emulated environment. Our experimental results show that at least 97% of the original code could be extracted from various binary executables packed with different software packers. The proposed method also successfully extracted hidden code from recent malware family samples.
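A rough sketch of the written-then-executed heuristic described in the abstract, assuming a hypothetical emulator that invokes callbacks on memory writes and instruction fetches; the hook names and page size are illustrative, not the paper's implementation.

PAGE_SIZE = 0x1000

written_pages = set()
candidate_oep = None                 # likely original entry point once unpacking ends

def on_memory_write(address, size):
    # Record every page the emulated program writes to.
    first, last = address // PAGE_SIZE, (address + size - 1) // PAGE_SIZE
    for page in range(first, last + 1):
        written_pages.add(page)

def on_instruction_fetch(address):
    # Executing from a page the program itself wrote marks the end of the
    # unpacking routine; the surrounding region can then be dumped as the
    # recovered original code.
    global candidate_oep
    if candidate_oep is None and address // PAGE_SIZE in written_pages:
        candidate_oep = address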
Interactive Finite Elements for General Engine Dynamics Analysis
NASA Technical Reports Server (NTRS)
Adams, M. L.; Padovan, J.; Fertis, D. G.
1984-01-01
General nonlinear finite element codes were adapted for the purpose of analyzing the dynamics of gas turbine engines. In particular, this adaptation required the development of a squeeze-film damper element software package and its implementation into a representative current-generation code. The ADINA code was selected because of prior use of it and familiarity with its internal structure and logic. This objective was met, and the results indicate that such use of general purpose codes is a viable alternative to specialized codes for general dynamics analysis of engines.
Impact of Preservation of Subsoil Water Act on Groundwater Depletion: The Case of Punjab, India.
Tripathi, Amarnath; Mishra, Ashok K; Verma, Geetanjali
2016-07-01
Indian states like Punjab and Haryana, epicenters of the Green Revolution, are facing severe groundwater shortages and falling water tables. Recognizing it as a serious concern, the Government of Punjab enacted the Punjab Preservation of Subsoil Water Act in 2009 (or the 2009 act) to slow groundwater depletion. The objective of this study is to assess the impact of this policy on groundwater depletion, using panel data from 1985 to 2011. Results from this study find a robust effect of the 2009 act on reducing groundwater depletion. Our models for pre-monsoon, post-monsoon, and overall periods of analysis find that since implementation of the 2009 act, groundwater tables have improved significantly. Second, our study reveals that higher shares of tube wells per total cropped area and increased population density have led to a significant decline in the groundwater tables. On the other hand, rainfall and the share of area irrigated by surface water have had an augmenting effect on groundwater resources. In the two models, pre-monsoon and post-monsoon, this study shows that seasonality plays a key role in determining the groundwater table in Punjab. Specifically, monsoon rainfall has a very prominent impact on groundwater.
Oettel, M
2004-04-01
We analyze the depletion interaction between two hard colloids in a hard-sphere solvent and pay special attention to the limit of large size ratio between colloids and solvent particles which is governed by the well-known Derjaguin approximation. For separations between the colloids of less than the diameter of the solvent particles (defining the depletion region), the solvent structure between the colloids can be analyzed in terms of an effective two-dimensional gas. Thereby we find that the Derjaguin limit is approached more slowly than previously thought. This analysis is in good agreement with simulation data which are available for a moderate size ratio of 10. Small discrepancies in results from density functional theory (DFT) at this size ratio become amplified for larger size ratios. Therefore we have improved upon previous DFT techniques by imposing test-particle consistency which connects DFT to integral equations. However, the improved results show no convergence towards the Derjaguin limit and thus we conclude that this implementation of DFT together with previous ones which rely on test-particle insertion become unreliable in predicting the force in the depletion region for size ratios larger than 10.
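For reference, the Derjaguin limit discussed above relates the force between two large spheres to the interaction free energy per unit area between parallel flat plates; a standard statement (notation assumed here, not taken from the paper) is

F(h) \approx 2\pi \frac{R_1 R_2}{R_1 + R_2}\, W(h), \qquad W(h) = \int_h^{\infty} f(h')\, dh',

where h is the surface-to-surface separation, f is the force per unit area between flat plates, and for equal colloid radii R the prefactor reduces to \pi R.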
Berger, Markus; van der Ent, Ruud; Eisner, Stephanie; Bach, Vanessa; Finkbeiner, Matthias
2014-04-15
Aiming to enhance the analysis of water consumption and resulting consequences along the supply chain of products, the water accounting and vulnerability evaluation (WAVE) model is introduced. On the accounting level, atmospheric evaporation recycling within drainage basins is considered for the first time, which can reduce water consumption volumes by up to 32%. Rather than predicting impacts, WAVE analyzes the vulnerability of basins to freshwater depletion. Based on local blue water scarcity, the water depletion index (WDI) denotes the risk that water consumption can lead to depletion of freshwater resources. Water scarcity is determined by relating annual water consumption to availability in more than 11,000 basins. Additionally, WDI accounts for the presence of lakes and aquifers which have been neglected in water scarcity assessments so far. By setting WDI to the highest value in (semi)arid basins, absolute freshwater shortage is taken into account in addition to relative scarcity. This avoids mathematical artifacts of previous indicators which turn zero in deserts if consumption is zero. As illustrated in a case study of biofuels, WAVE can help to interpret volumetric water footprint figures and, thus, promotes a sustainable use of global freshwater resources.
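A minimal sketch of the scarcity ratio underlying such an index (assumed field names and units, not the WAVE implementation): annual consumption is related to availability per basin and capped at the maximum value for (semi)arid basins, so that a zero-consumption desert is not scored as risk-free.

def depletion_index(consumption_km3, availability_km3, semi_arid=False):
    # (Semi)arid basins are forced to the maximum to reflect absolute shortage.
    if semi_arid or availability_km3 <= 0.0:
        return 1.0
    return min(consumption_km3 / availability_km3, 1.0)

print(depletion_index(2.5, 40.0))                      # low-risk basin
print(depletion_index(0.0, 0.0, semi_arid=True))       # arid basin scored at the maximum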
DOE Office of Scientific and Technical Information (OSTI.GOV)
Page, R.; Jones, J.R.
1997-07-01
Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation, where the integration of plant dynamic, core follow and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code TALINK (Transient Analysis code LINKage program) used to provide a flexible interface to link the RELAP5 thermal hydraulics code with the PANTHER neutron kinetics and the SIBDYM whole plant dynamic modelling codes used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole plant thermal-hydraulics and neutron kinetics model. In addition the paper discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package and the details of the calculation of a postulated Sizewell 'B' Loss of offsite power fault transient.
Li, Xingang; Lu, Hongming; Fan, Guilian; He, Miao; Sun, Yu; Xu, Kai; Shi, Fengjun
2017-11-01
Osteosarcoma (OS) is one of the most prevalent primary malignant bone tumors in adolescents. HOTAIR is highly expressed in cancer and is associated with epigenetic modifications, especially DNA methylation. However, the regulatory relationship between HOTAIR and DNA methylation, and their biological effects in the pathogenesis of osteosarcoma, remain elusive. Through RNA-sequencing and computational analysis, followed by a variety of experimental validations, we report a novel interplay between HOTAIR, miR-126, and DNA methylation in OS. We found that HOTAIR is highly expressed in OS cells and that knockdown of HOTAIR leads to down-regulation of DNMT1, as well as a decrease in the global DNA methylation level. RNA-sequencing analysis of HOTAIR-regulated genes shows that CDKN2A is significantly repressed by HOTAIR. A series of experiments shows that HOTAIR represses the expression of CDKN2A by inhibiting the promoter activity of CDKN2A through DNA hypermethylation. Further evidence shows that HOTAIR activates the expression of DNMT1 through repressing miR-126, which is a negative regulator of DNMT1. Functionally, HOTAIR depletion increases the sensitivity of OS cells to a DNMT1 inhibitor by regulating the viability and apoptosis of OS cells via the HOTAIR-miR-126-DNMT1-CDKN2A axis. These results not only enrich our understanding of the regulatory relationship between non-coding RNA, DNA methylation, and gene expression, but also provide a novel direction for developing more sophisticated therapeutic strategies for OS patients.
Error control techniques for satellite and space communications
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.
1988-01-01
During the period December 1, 1987 through May 31, 1988, progress was made in the following areas: construction of Multi-Dimensional Bandwidth Efficient Trellis Codes with MPSK modulation; performance analysis of Bandwidth Efficient Trellis Coded Modulation schemes; and performance analysis of Bandwidth Efficient Trellis Codes on Fading Channels.
Modeling of rolling element bearing mechanics. Computer program user's manual
NASA Technical Reports Server (NTRS)
Greenhill, Lyn M.; Merchant, David H.
1994-01-01
This report provides the user's manual for the Rolling Element Bearing Analysis System (REBANS) analysis code which determines the quasistatic response to external loads or displacement of three types of high-speed rolling element bearings: angular contact ball bearings, duplex angular contact ball bearings, and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. It is comprised of two main programs: the Preprocessor for Bearing Analysis (PREBAN) which creates the input files for the main analysis program, and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. This report addresses input instructions for and features of the computer codes. A companion report addresses the theoretical basis for the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.
Sandia Engineering Analysis Code Access System v. 2.0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sjaardema, Gregory D.
The Sandia Engineering Analysis Code Access System (SEACAS) is a suite of preprocessing, post processing, translation, visualization, and utility applications supporting finite element analysis software using the Exodus database file format.
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Bednarcyk, Brett A.; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Walton, Owen
2015-01-01
Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code, and an example problem (taken from the open literature) of a laminated CMC in off-axis loading is shown. FEAMAC/CARES performs stochastic-strength-based damage simulation of the response of a CMC under multiaxial loading using elastic stiffness reduction of the failed elements.
Pang, Junbiao; Qin, Lei; Zhang, Chunjie; Zhang, Weigang; Huang, Qingming; Yin, Baocai
2015-12-01
Local coordinate coding (LCC) is a framework to approximate a Lipschitz smooth function by combining linear functions into a nonlinear one. For locally linear classification, LCC requires a coding scheme that heavily determines the nonlinear approximation ability, posing two main challenges: 1) locality, i.e., faraway anchor points should have smaller influence on the current data; and 2) flexibility, i.e., balancing well between the reconstruction of the current data and locality. In this paper, we address the problem through theoretical analysis of the simplest local coding schemes, i.e., local Gaussian coding and local student coding, and propose local Laplacian coding (LPC) to achieve both locality and flexibility. We apply LPC in locally linear classifiers to solve diverse classification tasks. Performance comparable to or exceeding that of state-of-the-art methods demonstrates the effectiveness of the proposed method.
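A toy sketch of the kind of local coding weights contrasted in the abstract (assumed kernel forms, not the authors' exact scheme): Gaussian versus Laplacian kernels over a set of anchor points, followed by a locally weighted reconstruction of the input.

import numpy as np

def local_coding_weights(x, anchors, sigma=1.0, kernel="laplacian"):
    # Distance of x to every anchor point (rows of `anchors`).
    dists = np.linalg.norm(anchors - x, axis=1)
    if kernel == "gaussian":
        w = np.exp(-dists ** 2 / (2.0 * sigma ** 2))
    else:                               # Laplacian kernel: heavier tails than the Gaussian
        w = np.exp(-dists / sigma)
    return w / w.sum()                  # normalized coding weights

anchors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
x = np.array([0.2, 0.3])
w = local_coding_weights(x, anchors, kernel="laplacian")
print(w, w @ anchors)                   # weights and the locally weighted reconstruction of x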
Superimposed Code Theoretic Analysis of DNA Codes and DNA Computing
2008-01-01
complements of one another and the DNA duplex formed is a Watson-Crick (WC) duplex. However, there are many instances when the formation of non-WC...that the user's requirements for probe selection are met based on the Watson-Crick probe locality within a target. The second type, called...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uchibori, Akihiro; Kurihara, Akikazu; Ohshima, Hiroyuki
A multiphysics analysis system for sodium-water reaction phenomena in a steam generator of sodium-cooled fast reactors was newly developed. The analysis system consists of the mechanistic numerical analysis codes SERAPHIM, TACT, and RELAP5. The SERAPHIM code calculates the multicomponent multiphase flow and sodium-water chemical reaction caused by discharging of pressurized water vapor. Applicability of the SERAPHIM code was confirmed through the analyses of the experiment on water vapor discharging in liquid sodium. The TACT code was developed to calculate heat transfer from the reacting jet to the adjacent tube and to predict the tube failure occurrence. The numerical models integrated into the TACT code were verified through some related experiments. The RELAP5 code evaluates thermal hydraulic behavior of water inside the tube. The original heat transfer correlations were corrected for the tube rapidly heated by the reacting jet. The developed system enables evaluation of the wastage environment and the possibility of the failure propagation.
Issues in Stratospheric Ozone Depletion.
NASA Astrophysics Data System (ADS)
Lloyd, Steven Andrew
Following the announcement of the discovery of the Antarctic ozone hole in 1985 there have arisen a multitude of questions pertaining to the nature and consequences of polar ozone depletion. This thesis addresses several of these specific questions, using both computer models of chemical kinetics and the Earth's radiation field as well as laboratory kinetic experiments. A coupled chemical kinetic-radiative numerical model was developed to assist in the analysis of in situ field measurements of several radical and neutral species in the polar and mid-latitude lower stratosphere. Modeling was used in the analysis of enhanced polar ClO, mid-latitude diurnal variation of ClO, and simultaneous measurements of OH, HO2, H2O and O3. Most importantly, such modeling was instrumental in establishing the link between the observed ClO and BrO concentrations in the Antarctic polar vortex and the observed rate of ozone depletion. The principal medical concern of stratospheric ozone depletion is that ozone loss will lead to the enhancement of ground-level UV-B radiation. Global ozone climatology (40°S to 50°N latitude) was incorporated into a radiation field model to calculate the biologically accumulated dosage (BAD) of UV-B radiation, integrated over days, months, and years. The slope of the annual BAD as a function of latitude was found to correspond to epidemiological data for non-melanoma skin cancers for 30°-50°N. Various ozone loss scenarios were investigated. It was found that a small ozone loss in the tropics can provide as much additional biologically effective UV-B as a much larger ozone loss at higher latitudes. Also, for ozone depletions of > 5%, the BAD of UV-B increases exponentially with decreasing ozone levels. A key player in determining whether polar ozone depletion can propagate into the populated mid-latitudes is chlorine nitrate, ClONO2. As yet this molecule is only indirectly accounted for in computer models and field measurements. Therefore a laboratory prototype of an instrument to measure ClONO2 concentrations in situ was developed, adapting techniques recently developed in this research group to measure ClO concentrations at the part-per-trillion level. The detection scheme involves heating a flowing air sample to almost 500 K, thermally dissociating ClONO2 into ClO and NO2, and measuring the resulting ClO concentration by titrating with NO to produce Cl atoms, which are detected by resonance fluorescence. The calibration of this technique is very sensitive to flow parameters (temperature, pressure, flow velocity, added NO concentration, and homogeneity of flow). The issues developed in this thesis contribute to our understanding of the mechanisms of stratospheric ozone depletion and its potential global impact. It is becoming increasingly apparent that our ability to predict the future course of global ozone depletion is critically dependent on our ability to reproduce in situ and remote measurements with numerical models.
Continuous Monitoring of Melt Composition
NASA Technical Reports Server (NTRS)
Frazer, R. E.; Andrews, T. W.
1984-01-01
Compositions of glasses and alloys analyzed and corrected in real time. Spectral analysis and temperature measurement performed simultaneously on molten material in container, such as open-hearth furnace, crucible or tank of continuous furnace. Speed of analysis makes it possible to quickly measure concentration of volatile elements depleted by prolonged heating.
This study used phylogenetic probes in hybridization analysis to (i) determine in situ microbial community structures in regions of a shallow sand aquifer that were oxygen depleted and fuel contaminated (FC) or aerobic and noncontaminated (NC) and (ii) examine alterations in micro...
NASA Technical Reports Server (NTRS)
Stoll, Frederick
1993-01-01
The NLPAN computer code uses a finite-strip approach to the analysis of thin-walled prismatic composite structures such as stiffened panels. The code can model in-plane axial loading, transverse pressure loading, and constant through-the-thickness thermal loading, and can account for shape imperfections. The NLPAN code represents an attempt to extend the buckling analysis of the VIPASA computer code into the geometrically nonlinear regime. Buckling mode shapes generated using VIPASA are used in NLPAN as global functions for representing displacements in the nonlinear regime. While the NLPAN analysis is approximate in nature, it is computationally economical in comparison with finite-element analysis, and is thus suitable for use in preliminary design and design optimization. A comprehensive description of the theoretical approach of NLPAN is provided. A discussion of some operational considerations for the NLPAN code is included. NLPAN is applied to several test problems in order to demonstrate new program capabilities, and to assess the accuracy of the code in modeling various types of loading and response. User instructions for the NLPAN computer program are provided, including a detailed description of the input requirements and example input files for two stiffened-panel configurations.
Focal adhesion kinase regulates smooth muscle cell recruitment to the developing vasculature
Cheng, Zhaokang; Sundberg-Smith, Liisa J.; Mangiante, Lee E.; Sayers, Rebecca L.; Hakim, Zeenat S.; Musunuri, Srilaxmi; Maguire, Colin T.; Majesky, Mark W.; Zhou, Zhigang; Mack, Christopher P.; Taylor, Joan M.
2011-01-01
Objective The investment of newly formed endothelial cell tubes with differentiated smooth muscle cells (SMC) is critical for appropriate vessel formation, but the underlying mechanisms remain unknown. We previously showed that depletion of focal adhesion kinase (FAK) in the nkx2.5 expression domain led to aberrant outflow tract (OFT) morphogenesis and strove herein to determine the cell types and mechanisms involved. Methods and Results We crossed fakloxp targeted mice with available Cre drivers to deplete FAK in OFT SMC (FAKwnt and FAKnk) or coronary SMC (FAKcSMC). In each case, depletion of FAK led to defective vasculogenesis that was incompatible with post-natal life. Immunohistochemical analysis of the mutant vascular structures revealed that FAK was not required for progenitor cell proliferation, survival, or differentiation into SMC, but was necessary for subsequent SMC recruitment to the developing vasculature. Using a novel FAK-null SMC culture model, we found that depletion of FAK did not influence SMC growth or survival, but blocked directional SMC motility and invasion toward the potent endothelial-derived chemokine, PDGFBB. FAK depletion resulted in unstable lamellipodial protrusions due to defective spatiotemporal activation of the small GTPase Rac1 and lack of Rac1-dependent recruitment of cortactin (an actin-stabilizing protein) to the leading edge. Moreover, FAK-null SMC exhibited a significant reduction in PDGF-stimulated extracellular matrix degradation. Conclusions FAK drives PDGFBB-stimulated SMC chemotaxis/invasion and is essential for SMC to appropriately populate the aorticopulmonary septum and the coronary vascular plexus. PMID:21757658
Manipulating the Mitochondrial Genome To Enhance Cattle Embryo Development
Srirattana, Kanokwan; St. John, Justin C.
2017-01-01
The mixing of mitochondrial DNA (mtDNA) from the donor cell and the recipient oocyte in embryos and offspring derived from somatic cell nuclear transfer (SCNT) compromises genetic integrity and affects embryo development. We set out to generate SCNT embryos that inherited their mtDNA from the recipient oocyte only, as is the case following natural conception. While SCNT blastocysts produced from Holstein (Bos taurus) fibroblasts were depleted of their mtDNA, and oocytes derived from Angus (Bos taurus) cattle possessed oocyte mtDNA only, the coexistence of donor cell and oocyte mtDNA resulted in blastocysts derived from nondepleted cells. Moreover, the use of the reprogramming agent, Trichostatin A (TSA), further improved the development of embryos derived from depleted cells. RNA-seq analysis highlighted 35 differentially expressed genes from the comparison between blastocysts generated from nondepleted cells and blastocysts from depleted cells, both in the presence of TSA. The only differences between these two sets of embryos were the presence of donor cell mtDNA, and a significantly higher mtDNA copy number for embryos derived from nondepleted cells. Furthermore, the use of TSA on embryos derived from depleted cells positively modulated the expression of CLDN8, TMEM38A, and FREM1, which affect embryonic development. In conclusion, SCNT embryos produced by mtDNA depleted donor cells have the same potential to develop to the blastocyst stage without the presumed damaging effect resulting from the mixture of donor and recipient mtDNA. PMID:28500053
Liu, Wei; Li, Shi-Zhu; Li, Zhi; Wang, Yang; Li, Xi-Yin; Zhong, Jian-Xiang; Zhang, Xiao-Juan; Zhang, Jun; Zhou, Li; Gui, Jian-Fang
2015-11-18
Gynogenesis is one of the unisexual reproduction modes in vertebrates, producing all-female individuals with an identical genetic background. In sexually reproducing vertebrates, the roles of primordial germ cells in sexual dimorphism and gonadal differentiation have been studied extensively, and two distinct functional models have been proposed. However, the role of primordial germ cells remains unknown in unisexual animals, and it is also unclear whether the functional models from sexually reproducing animals apply to unisexual animals. To address these questions, we exploited the gynogenetic advantages of polyploid Carassius gibelio to create a completely germ cell-depleted gonad model, using a morpholino-mediated knockdown approach similar to that used in other sexually reproducing fishes. Using this germ cell-depleted gonad model, we performed a comprehensive comparative transcriptome analysis and revealed a complete alteration of sex-biased gene expression. This expression alteration leads to up-regulation of testis-biased genes and down-regulation of ovary-biased genes, and results in sterile all-males with testis-like gonads and secondary sex characteristics in the germ cell-depleted gynogenetic Carassius gibelio. Our results demonstrate that unisexual gynogenetic embryos retain male sex-determination information in the genome, and that complete depletion of primordial germ cells in this all-female fish leads to altered sex-biased gene expression and the occurrence of sterile all-males.
CRITICA: coding region identification tool invoking comparative analysis
NASA Technical Reports Server (NTRS)
Badger, J. H.; Olsen, G. J.; Woese, C. R. (Principal Investigator)
1999-01-01
Gene recognition is essential to understanding existing and future DNA sequence data. CRITICA (Coding Region Identification Tool Invoking Comparative Analysis) is a suite of programs for identifying likely protein-coding sequences in DNA by combining comparative analysis of DNA sequences with more common noncomparative methods. In the comparative component of the analysis, regions of DNA are aligned with related sequences from the DNA databases; if the translation of the aligned sequences has greater amino acid identity than expected for the observed percentage nucleotide identity, this is interpreted as evidence for coding. CRITICA also incorporates noncomparative information derived from the relative frequencies of hexanucleotides in coding frames versus other contexts (i.e., dicodon bias). The dicodon usage information is derived by iterative analysis of the data, such that CRITICA is not dependent on the existence or accuracy of coding sequence annotations in the databases. This independence makes the method particularly well suited for the analysis of novel genomes. CRITICA was tested by analyzing the available Salmonella typhimurium DNA sequences. Its predictions were compared with the DNA sequence annotations and with the predictions of GenMark. CRITICA proved to be more accurate than GenMark, and moreover, many of its predictions that would seem to be errors instead reflect problems in the sequence databases. The source code of CRITICA is freely available by anonymous FTP (rdp.life.uiuc.edu, in /pub/critica) and on the World Wide Web (http://rdpwww.life.uiuc.edu).
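The dicodon-bias component described above is straightforward to prototype. The sketch below is an assumption about the general scoring idea, not code from CRITICA: it scores a candidate reading frame by the log-likelihood ratio of hexanucleotide (dicodon) frequencies in known coding sequence versus background, with pseudocount smoothing.

    import math
    from collections import Counter

    def dicodon_counts(sequences):
        """Count in-frame hexanucleotides (dicodons) in a set of sequences."""
        counts = Counter()
        for seq in sequences:
            for i in range(0, len(seq) - 5, 3):          # step codon by codon
                counts[seq[i:i + 6]] += 1
        return counts

    def dicodon_score(candidate, coding_counts, background_counts, pseudo=1.0):
        """Positive scores favour a protein-coding interpretation of the frame."""
        c_tot = sum(coding_counts.values())
        b_tot = sum(background_counts.values())
        score = 0.0
        for i in range(0, len(candidate) - 5, 3):
            hexamer = candidate[i:i + 6]
            p_cod = (coding_counts[hexamer] + pseudo) / (c_tot + pseudo * 4096)
            p_bkg = (background_counts[hexamer] + pseudo) / (b_tot + pseudo * 4096)
            score += math.log(p_cod / p_bkg)
        return score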
Zhang, Shulin; Li, Fang-Yuan; Bass, Harold N; Pursley, Amber; Schmitt, Eric S; Brown, Blaire L; Brundage, Ellen K; Mardach, Rebecca; Wong, Lee-Jun
2010-01-01
Thymidine kinase 2 (TK2), encoded by the TK2 gene on chromosome 16q22, is one of the deoxyribonucleoside kinases responsible for the maintenance of mitochondrial deoxyribonucleotide pools. Defects in TK2 mainly cause a myopathic form of the mitochondrial DNA depletion syndrome (MDDS). Currently, only point mutations and small insertions and deletions have been reported in the TK2 gene; gross rearrangements of the TK2 gene and possible hepatic involvement in patients with TK2 mutations have not been described. We report a non-consanguineous Jordanian family with three deceased siblings due to mtDNA depletion. Sequence analysis of the father detected a heterozygous c.761T>A (p.I254N) mutation in his TK2 gene; however, point mutations in the mother were not detected. Subsequent gene dosage analysis using oligonucleotide array CGH identified an intragenic approximately 5.8-kb deletion encompassing the 5'UTR to intron 2 of her TK2 gene. Sequence analysis confirmed that the deletion spans c.1-495 to c.283-2899 of the TK2 gene (nucleotide 65,136,256-65,142,086 of chromosome 16). Analysis of liver and muscle specimens from one of the deceased infants in this family revealed compound heterozygosity for the paternal point mutation and the maternal intragenic deletion. In addition, a significant reduction of the mtDNA content in liver and muscle was detected (10% and 20% of age- and tissue-matched controls, respectively). Prenatal diagnosis was performed in the third pregnancy. The fetus was found to carry both the point mutation and the deletion. This child died 6 months after birth due to myopathy. A serum specimen demonstrated elevated liver transaminases in two of the infants from whom results were available. This report expands the mutation spectrum associated with TK2 deficiency. While the myopathic form of MDDS appears to be the main phenotype of TK2 mutations, liver dysfunction may also be a part of the mitochondrial depletion syndrome caused by TK2 gene defects.
NASA Astrophysics Data System (ADS)
Masciopinto, Costantino; Volpe, Angela; Palmiotta, Domenico; Cherubini, Claudia
2010-09-01
A combination of a parallel fracture model with the PHREEQC-2 geochemical model was developed to simulate sequential flow and chemical transport with reactions in fractured media where both laminar and turbulent flows occur. The integration of non-laminar flow resistances in one model produced relevant effects on water flow velocities, thus improving model prediction capabilities on contaminant transport. The proposed conceptual model consists of 3D rock-blocks, separated by horizontal bedding plane fractures with variable apertures. Particle tracking solved the transport equations for conservative compounds and provided input for PHREEQC-2. For each cluster of contaminant pathways, PHREEQC-2 determined the concentration for mass-transfer, sorption/desorption, ion exchange, mineral dissolution/precipitation and biodegradation, under kinetically controlled reactive processes of equilibrated chemical species. Field tests have been performed for the code verification. As an example, the combined model has been applied to a contaminated fractured aquifer of southern Italy in order to simulate the phenol transport. The code correctly fitted the field available data and also predicted a possible rapid depletion of phenols as a result of an increased biodegradation rate induced by a simulated artificial injection of nitrates, upgradient to the sources.
Validation of the new code package APOLLO2.8 for accurate PWR neutronics calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santamarina, A.; Bernard, D.; Blaise, P.
2013-07-01
This paper summarizes the qualification work performed to demonstrate the accuracy of the new APOLLO2.8/SHEM-MOC package based on the JEFF3.1.1 nuclear data file for the prediction of PWR neutronics parameters. This experimental validation is based on PWR mock-up critical experiments performed in the EOLE/MINERVE zero-power reactors and on P.I.E.s on spent fuel assemblies from the French PWRs. The Calculation-Experiment comparison for the main design parameters is presented: reactivity of UOX and MOX lattices, depletion calculation and fuel inventory, reactivity loss with burnup, pin-by-pin power maps, Doppler coefficient, Moderator Temperature Coefficient, Void coefficient, UO2-Gd2O3 poisoning worth, efficiency of Ag-In-Cd and B4C control rods, and reflector saving for both the standard 2-cm baffle and the GEN3 advanced thick SS reflector. From this qualification process, calculation biases and associated uncertainties are derived. The code package APOLLO2.8 is already implemented in the ARCADIA new AREVA calculation chain for core physics and is currently under implementation in the future neutronics package of the French utility Electricite de France. (authors)
Plasma density perturbation caused by probes at low gas pressure
NASA Astrophysics Data System (ADS)
Sternberg, Natalia; Godyak, Valery
2017-09-01
An analysis of plasma parameter perturbations caused by a spherical probe immersed into a spherical plasma is presented for arbitrary collisionality and arbitrary ratios of probe to plasma dimensions. The plasma was modeled by the fluid plasma equations with ion inertia and nonlinear ion friction force that dominate plasma transport at low gas pressures. Significant depletion of the plasma density around the probe surface has been found. The area of plasma depletion coincides with the sensing area of different kinds of magnetic and microwave probes and will therefore lead to errors in data inferred from measurements with such probes.
Verification of a Viscous Computational Aeroacoustics Code using External Verification Analysis
NASA Technical Reports Server (NTRS)
Ingraham, Daniel; Hixon, Ray
2015-01-01
The External Verification Analysis approach to code verification is extended to solve the three-dimensional Navier-Stokes equations with constant properties, and is used to verify a high-order computational aeroacoustics (CAA) code. After a brief review of the relevant literature, the details of the EVA approach are presented and compared to the similar Method of Manufactured Solutions (MMS). Pseudocode representations of EVA's algorithms are included, along with the recurrence relations needed to construct the EVA solution. The code verification results show that EVA was able to convincingly verify a high-order, viscous CAA code without the addition of MMS-style source terms, or any other modifications to the code.
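For context, the standard grid-refinement check used in such verification studies reduces to a one-line formula: with errors measured against an exact (manufactured or EVA-constructed) solution on two grids related by a refinement ratio r, the observed order of accuracy is log(e_coarse/e_fine)/log(r). A minimal sketch, with illustrative numbers rather than results from the study:

    import math

    def observed_order(err_coarse, err_fine, refinement=2.0):
        """Observed order of accuracy from errors on two successively refined grids."""
        return math.log(err_coarse / err_fine) / math.log(refinement)

    print(observed_order(1.6e-3, 1.0e-4))   # -> 4.0 for these sample errors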
Adjoint-Based Implicit Uncertainty Analysis for Figures of Merit in a Laser Inertial Fusion Engine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seifried, J E; Fratoni, M; Kramer, K J
A primary purpose of computational models is to inform design decisions and, in order to make those decisions reliably, the confidence in the results of such models must be estimated. Monte Carlo neutron transport models are common tools for reactor designers. These types of models contain several sources of uncertainty that propagate onto the model predictions. Two uncertainties worthy of note are (1) experimental and evaluation uncertainties of nuclear data that inform all neutron transport models and (2) statistical counting precision, which all results of a Monte Carlo code contain. Adjoint-based implicit uncertainty analyses allow for the consideration of any number of uncertain input quantities and their effects upon the confidence of figures of merit with only a handful of forward and adjoint transport calculations. When considering a rich set of uncertain inputs, adjoint-based methods remain hundreds of times more computationally efficient than direct Monte Carlo methods. The LIFE (Laser Inertial Fusion Energy) engine is a concept being developed at Lawrence Livermore National Laboratory. Various options exist for the LIFE blanket, depending on the mission of the design. The depleted uranium hybrid LIFE blanket design strives to close the fission fuel cycle without enrichment or reprocessing, while simultaneously achieving high discharge burnups with reduced proliferation concerns. Neutron transport results that are central to the operation of the design are tritium production for fusion fuel, fission of fissile isotopes for energy multiplication, and production of fissile isotopes for sustained power. In previous work, explicit cross-sectional uncertainty analyses were performed for reaction rates related to the figures of merit for the depleted uranium hybrid LIFE blanket. Counting precision was also quantified, for both the figures of merit themselves and the cross-sectional uncertainty estimates, to gauge the validity of the analysis. All cross-sectional uncertainties were small (0.1-0.8%), bounded counting uncertainties, and were precise with regard to counting precision. Adjoint/importance distributions were generated for the same reaction rates. The current work leverages those adjoint distributions to transition from explicit sensitivities, in which the neutron flux is constrained, to implicit sensitivities, in which the neutron flux responds to input perturbations. This treatment vastly expands the set of data that contribute to uncertainties, producing larger, more physically accurate uncertainty estimates.
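The propagation step referred to above is commonly written as the first-order "sandwich rule": the relative variance of a figure of merit is S^T C S, where S holds the (adjoint-derived) relative sensitivities and C is the relative covariance matrix of the uncertain inputs. A minimal sketch with illustrative numbers, not LIFE blanket data:

    import numpy as np

    def sandwich_uncertainty(sensitivities, covariance):
        """Relative 1-sigma uncertainty of a figure of merit via S^T C S."""
        s = np.asarray(sensitivities, dtype=float)
        c = np.asarray(covariance, dtype=float)
        return float(np.sqrt(s @ c @ s))

    S = [0.8, -0.3, 0.1]                          # d(ln FOM) / d(ln input)
    C = np.diag([0.004**2, 0.006**2, 0.002**2])   # uncorrelated 0.2-0.6% inputs
    print(f"relative uncertainty = {sandwich_uncertainty(S, C):.4%}")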
A modern study of HD 166734: a massive supergiant system
NASA Astrophysics Data System (ADS)
Mahy, L.; Damerdji, Y.; Gosset, E.; Nitschelm, C.; Eenens, P.; Sana, H.; Klotz, A.
2017-11-01
Aims: HD 166734 is an eccentric eclipsing binary system composed of two supergiant O-type stars, orbiting with a 34.5-day period. In this rare configuration for such stars, the two objects mainly evolve independently, following single-star evolution so far. This system provides a chance to study the individual parameters of two supergiant massive stars and to derive their real masses. Methods: An intensive monitoring campaign was dedicated to HD 166734. We analyzed mid- and high-resolution optical spectra to constrain the orbital parameters of this system. We also studied its light curve for the first time, obtained in the VRI filters. Finally, we disentangled the spectra of the two stars and modeled them with the CMFGEN atmosphere code in order to determine the individual physical parameters. Results: HD 166734 is an O7.5If+O9I(f) binary. We confirm its orbital period but we revise the other orbital parameters. In comparison to what we found in the literature, the system is more eccentric and, now, the hottest and the most luminous component is also the most massive one. The light curve exhibits only one eclipse and its analysis indicates an inclination of 63.0° ± 2.7°. The photometric analysis provides us with a good estimation of the luminosities of the stars, and therefore their exact positions in the Hertzsprung-Russell diagram. The evolutionary and the spectroscopic masses show good agreement with the dynamical masses of 39.5 M⊙ for the primary and 33.5 M⊙ for the secondary, within the uncertainties. The two components are both enriched in helium and in nitrogen and depleted in carbon. In addition, the primary also shows a depletion in oxygen. Their surface abundances are however not different from those derived from single supergiant stars, yielding, for both components, an evolution similar to that of single stars. Based on observations collected at the European Southern Observatory (La Silla, Chile) with FEROS and TAROT and on data collected at the San Pedro Mártir observatory (Mexico). The reduced spectra and the light curves are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/607/A96
Goldberg, Tony L; Bennett, Andrew J; Kityo, Robert; Kuhn, Jens H; Chapman, Colin A
2017-07-13
Bats are natural reservoir hosts of highly virulent pathogens such as Marburg virus, Nipah virus, and SARS coronavirus. However, little is known about the role of bat ectoparasites in transmitting and maintaining such viruses. The intricate relationship between bats and their ectoparasites suggests that ectoparasites might serve as viral vectors, but evidence to date is scant. Bat flies, in particular, are highly specialized obligate hematophagous ectoparasites that incidentally bite humans. Using next-generation sequencing, we discovered a novel ledantevirus (mononegaviral family Rhabdoviridae, genus Ledantevirus) in nycteribiid bat flies infesting pteropodid bats in western Uganda. Mitochondrial DNA analyses revealed that both the bat flies and their bat hosts belong to putative new species. The coding-complete genome of the new virus, named Kanyawara virus (KYAV), is only distantly related to that of its closest known relative, Mount Elgon bat virus, and was found at high titers in bat flies but not in blood or on mucosal surfaces of host bats. Viral genome analysis indicates unusually low CpG dinucleotide depletion in KYAV compared to other ledanteviruses and rhabdovirus groups, with KYAV displaying values similar to rhabdoviruses of arthropods. Our findings highlight the possibility of a yet-to-be-discovered diversity of potentially pathogenic viruses in bat ectoparasites.
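The CpG-depletion comparison above rests on a simple statistic, the observed/expected CpG ratio, where the expectation assumes independent C and G frequencies. A hedged sketch of one common form of that ratio (values well below 1 indicate CpG depletion):

    def cpg_observed_expected(seq):
        """Observed/expected CpG dinucleotide ratio of a nucleotide sequence."""
        seq = seq.upper()
        n = len(seq)
        c, g = seq.count("C"), seq.count("G")
        cpg = sum(1 for i in range(n - 1) if seq[i:i + 2] == "CG")
        expected = c * g / n if n else 0.0
        return cpg / expected if expected else float("nan")

    print(cpg_observed_expected("ATGCGCGTTAACGGGCCC"))   # toy sequence, not KYAV data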
A high-fidelity Monte Carlo evaluation of CANDU-6 safety parameters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Y.; Hartanto, D.
2012-07-01
Important safety parameters such as the fuel temperature coefficient (FTC) and the power coefficient of reactivity (PCR) of the CANDU-6 (CANada Deuterium Uranium) reactor have been evaluated by using a modified MCNPX code. For accurate analysis of the parameters, the DBRC (Doppler Broadening Rejection Correction) scheme was implemented in MCNPX in order to account for the thermal motion of the heavy uranium nucleus in neutron-U scattering reactions. In this work, a standard fuel lattice has been modeled and the fuel is depleted by using MCNPX, and the FTC value is evaluated for several burnup points including the mid-burnup representing a near-equilibrium core. The Doppler effect has been evaluated by using several cross section libraries such as ENDF/B-VI, ENDF/B-VII, JEFF, and JENDL. The PCR value is also evaluated at mid-burnup conditions to characterize the safety features of the equilibrium CANDU-6 reactor. To improve the reliability of the Monte Carlo calculations, a huge number of neutron histories is considered in this work and the standard deviation of the k-inf values is only 0.5~1 pcm. It has been found that the FTC is significantly enhanced by accounting for the Doppler broadening of the scattering resonances and the PCR is clearly improved. (authors)
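The DBRC scheme mentioned above is, at heart, an extra rejection step applied to target velocities drawn from the standard free-gas kernel, so that resonance structure in the 0 K cross section is preserved for heavy nuclides such as U-238. The fragment below is a much-simplified structural sketch of that acceptance logic only; sample_free_gas_velocity() and sigma_0k() are placeholders the user must supply, and the one-dimensional relative speed is a deliberate simplification.

    import random

    def sample_target_velocity_dbrc(v_neutron, sample_free_gas_velocity,
                                    sigma_0k, sigma_0k_max, rng=random.random):
        """Draw a target velocity and re-accept it with probability
        sigma_0K(relative speed) / sigma_0K_max (the DBRC correction)."""
        while True:
            v_target = sample_free_gas_velocity(v_neutron)     # candidate draw
            v_rel = abs(v_neutron - v_target)                   # 1-D surrogate
            if rng() < sigma_0k(v_rel) / sigma_0k_max:          # DBRC acceptance
                return v_target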
Regional-scale, fully coupled modelling of stream aquifer interaction in a tropical catchment
NASA Astrophysics Data System (ADS)
Werner, Adrian D.; Gallagher, Mark R.; Weeks, Scott W.
2006-09-01
The planning and management of water resources in the Pioneer Valley, north-eastern Australia requires a tool for assessing the impact of groundwater and stream abstractions on water supply reliabilities and environmental flows in Sandy Creek (the main surface water system studied). Consequently, a fully coupled stream-aquifer model has been constructed using the code MODHMS, calibrated to near-stream observations of watertable behaviour and multiple components of gauged stream flow. This model has been tested using other methods of estimation, including stream depletion analysis and radon isotope tracer sampling. The coarseness of spatial discretisation, which is required for practical reasons of computational efficiency, limits the model's capacity to simulate small-scale processes (e.g., near-stream groundwater pumping, bank storage effects), and alternative approaches are required to complement the model's range of applicability. Model predictions of groundwater influx to Sandy Creek are compared with baseflow estimates from three different hydrograph separation techniques, which were found to be unable to reflect the dynamics of Sandy Creek stream-aquifer interactions. The model was also used to infer changes in the water balance of the system caused by historical land use change. This led to constraints on the recharge distribution which can be implemented to improve model calibration performance.
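One widely used hydrograph-separation technique of the kind compared against above is the single-parameter recursive digital filter (Lyne-Hollick form). The sketch below is a generic illustration of that filter with a typical literature parameter value, not one calibrated for Sandy Creek.

    def baseflow_filter(streamflow, alpha=0.925):
        """One forward pass of the Lyne-Hollick filter; returns a baseflow series."""
        n = len(streamflow)
        quick = [0.0] * n
        base = list(streamflow)
        for t in range(1, n):
            quick[t] = alpha * quick[t - 1] + 0.5 * (1 + alpha) * (
                streamflow[t] - streamflow[t - 1])
            quick[t] = min(max(quick[t], 0.0), streamflow[t])   # 0 <= quickflow <= Q
            base[t] = streamflow[t] - quick[t]
        return base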
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gleicher, Frederick N.; Williamson, Richard L.; Ortensi, Javier
The MOOSE neutron transport application RATTLESNAKE was coupled to the fuels performance application BISON to provide a higher fidelity tool for fuel performance simulation. This project is motivated by the desire to couple a high fidelity core analysis program (based on the self-adjoint angular flux equations) to a high fidelity fuel performance program, both of which can simulate on unstructured meshes. RATTLESNAKE solves the self-adjoint angular flux transport equation and provides sub-pin level resolution of the multigroup neutron flux with resonance treatment during burnup or a fast transient. BISON solves the coupled thermomechanical equations for the fuel on a sub-millimeter scale. Both applications are able to solve their respective systems on aligned and unaligned unstructured finite element meshes. The power density and local burnup were transferred from RATTLESNAKE to BISON with the MOOSE MultiApp transfer system. Multiple depletion cases were run with one-way data transfer from RATTLESNAKE to BISON. The eigenvalues are shown to agree well with values obtained from the lattice physics code DRAGON. The one-way data transfer of power density is shown to agree with the power density obtained from an internal Lassman-style model in BISON.
Self-Regulatory Capacities Are Depleted in a Domain-Specific Manner
Zhang, Rui; Stock, Ann-Kathrin; Rzepus, Anneka; Beste, Christian
2017-01-01
Performing an act of self-regulation such as making decisions has been suggested to deplete a common limited resource, which impairs all subsequent self-regulatory actions (ego depletion theory). It has however remained unclear whether self-referred decisions truly impair behavioral control even in seemingly unrelated cognitive domains, and which neurophysiological mechanisms are affected by these potential depletion effects. In the current study, we therefore used an inter-individual design to compare two kinds of depletion, namely a self-referred choice-based depletion and a categorization-based switching depletion, to a non-depleted control group. We used a backward inhibition (BI) paradigm to assess the effects of depletion on task switching and associated inhibition processes. It was combined with EEG and source localization techniques to assess both behavioral and neurophysiological depletion effects. The results challenge the ego depletion theory in its current form: Opposing the theory’s prediction of a general limited resource, which should have yielded comparable effects in both depletion groups, or maybe even a larger depletion in the self-referred choice group, there were stronger performance impairments following a task domain-specific depletion (i.e., the switching-based depletion) than following a depletion based on self-referred choices. This suggests at least partly separate and independent resources for various cognitive control processes rather than just one joint resource for all self-regulation activities. The implications are crucial to consider for people making frequent far-reaching decisions e.g., in law or economy. PMID:29033798
Dependency graph for code analysis on emerging architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shashkov, Mikhail Jurievich; Lipnikov, Konstantin
A directed acyclic dependency graph (DAG) is becoming the standard for modern multi-physics codes. The ideal DAG is the true block scheme of a multi-physics code. It is therefore a convenient object for in situ analysis of the cost of computations and of algorithmic bottlenecks related to statistically frequent data motion and dynamical machine state.
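As a small illustration of the kind of in situ bookkeeping such a DAG enables, the sketch below attaches a measured cost to each node of a toy kernel graph and walks it in topological order to find the most expensive (critical) path. The graph and timings are invented for illustration.

    from graphlib import TopologicalSorter

    # node -> set of predecessor nodes (toy multi-physics kernel graph)
    deps = {"hydro": set(), "eos": {"hydro"}, "transport": {"hydro"},
            "output": {"eos", "transport"}}
    cost = {"hydro": 4.0, "eos": 1.5, "transport": 3.0, "output": 0.5}   # seconds

    longest = {}
    for node in TopologicalSorter(deps).static_order():
        longest[node] = cost[node] + max((longest[d] for d in deps[node]), default=0.0)
    print("critical-path cost:", max(longest.values()))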
Automatic Coding of Dialogue Acts in Collaboration Protocols
ERIC Educational Resources Information Center
Erkens, Gijsbert; Janssen, Jeroen
2008-01-01
Although protocol analysis can be an important tool for researchers to investigate the process of collaboration and communication, the use of this method of analysis can be time consuming. Hence, an automatic coding procedure for coding dialogue acts was developed. This procedure helps to determine the communicative function of messages in online…
Convergence acceleration of the Proteus computer code with multigrid methods
NASA Technical Reports Server (NTRS)
Demuren, A. O.; Ibraheem, S. O.
1992-01-01
Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.
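As background for readers unfamiliar with the multigrid concept discussed above, the fragment below sketches a single two-grid correction cycle for a 1-D Poisson model problem (weighted Jacobi smoothing, injection restriction, linear-interpolation prolongation). It is a generic illustration, not an excerpt from the Proteus work.

    import numpy as np

    def jacobi(u, f, h, sweeps=3, omega=0.8):
        """Weighted Jacobi smoothing for u'' = f with Dirichlet boundaries."""
        for _ in range(sweeps):
            u[1:-1] += omega * (0.5 * (u[:-2] + u[2:] - h * h * f[1:-1]) - u[1:-1])
        return u

    def two_grid(u, f, h):
        """One two-grid cycle; assumes an even number of intervals on the fine grid."""
        u = jacobi(u, f, h)                                           # pre-smooth
        r = np.zeros_like(u)
        r[1:-1] = f[1:-1] - (u[:-2] - 2 * u[1:-1] + u[2:]) / (h * h)  # residual
        r_c = r[::2].copy()                                           # restrict
        e_c = jacobi(np.zeros_like(r_c), r_c, 2 * h, sweeps=20)       # coarse solve
        e = np.zeros_like(u)
        e[::2] = e_c                                                  # prolong
        e[1:-1:2] = 0.5 * (e_c[:-1] + e_c[1:])
        return jacobi(u + e, f, h)                                    # post-smooth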
Finite element analysis of inviscid subsonic boattail flow
NASA Technical Reports Server (NTRS)
Chima, R. V.; Gerhart, P. M.
1981-01-01
A finite element code for analysis of inviscid subsonic flows over arbitrary nonlifting planar or axisymmetric bodies is described. The code solves a novel primitive variable formulation of the coupled irrotationality and compressible continuity equations. Results for flow over a cylinder, a sphere, and a NACA 0012 airfoil verify the code. Computed subcritical flows over an axisymmetric boattailed afterbody compare well with finite difference results and experimental data. Iterative coupling with an integral turbulent boundary layer code shows strong viscous effects on the inviscid flow. Improvements in code efficiency and extensions to transonic flows are discussed.
PASCO: Structural panel analysis and sizing code: Users manual - Revised
NASA Technical Reports Server (NTRS)
Anderson, M. S.; Stroud, W. J.; Durling, B. J.; Hennessy, K. W.
1981-01-01
A computer code denoted PASCO is described for analyzing and sizing uniaxially stiffened composite panels. Buckling and vibration analyses are carried out with a linked plate analysis computer code denoted VIPASA, which is included in PASCO. Sizing is based on nonlinear mathematical programming techniques and employs a computer code denoted CONMIN, also included in PASCO. Design requirements considered are initial buckling, material strength, stiffness and vibration frequency. A user's manual for PASCO is presented.
Rankin, Carl Robert; Theodorou, Evangelos; Law, Ivy Ka Man; Rowe, Lorraine; Kokkotou, Efi; Pekow, Joel; Wang, Jiafang; Martin, Martin G; Pothoulakis, Charalabos; Padua, David Miguel
2018-06-28
Inflammatory bowel disease (IBD) is a complex disorder that is associated with significant morbidity. While many recent advances have been made with new diagnostic and therapeutic tools, a deeper understanding of its basic pathophysiology is needed to continue this trend towards improving treatments. By utilizing an unbiased, high-throughput transcriptomic analysis of two well-established mouse models of colitis, we set out to uncover novel coding and non-coding RNAs that are differentially expressed in the setting of colonic inflammation. RNA-seq analysis was performed using colonic tissue from two mouse models of colitis: a dextran sodium sulfate-induced model and a genetically induced model in mice lacking IL-10. We identified 81 coding RNAs that were commonly altered in both experimental models. Of these coding RNAs, 12 of the human orthologs were differentially expressed in a transcriptomic analysis of IBD patients. Interestingly, 5 of the 12 human differentially expressed genes have not been previously identified as IBD-associated genes, including ubiquitin D. Our analysis also identified 15 non-coding RNAs that were differentially expressed in either mouse model. Surprisingly, only three non-coding RNAs were commonly dysregulated in both of these models. The discovery of these new coding and non-coding RNAs expands our transcriptional knowledge of mouse models of IBD and offers additional targets to deepen our understanding of the pathophysiology of IBD.
A CFD/CSD Interaction Methodology for Aircraft Wings
NASA Technical Reports Server (NTRS)
Bhardwaj, Manoj K.
1997-01-01
With advanced subsonic transports and military aircraft operating in the transonic regime, it is becoming important to determine the effects of the coupling between aerodynamic loads and elastic forces. Since aeroelastic effects can contribute significantly to the design of these aircraft, there is a strong need in the aerospace industry to predict these aero-structure interactions computationally. To perform static aeroelastic analysis in the transonic regime, high fidelity computational fluid dynamics (CFD) analysis tools must be used in conjunction with high fidelity computational structural dynamics (CSD) analysis tools due to the nonlinear behavior of the aerodynamics in the transonic regime. There is also a need to be able to use a wide variety of CFD and CSD tools to predict these aeroelastic effects in the transonic regime. Because source codes are not always available, it is necessary to couple the CFD and CSD codes without alteration of the source codes. In this study, an aeroelastic coupling procedure is developed which will perform static aeroelastic analysis using any CFD and CSD code with little code integration. The aeroelastic coupling procedure is demonstrated on an F/A-18 Stabilator using NASTD (an in-house McDonnell Douglas CFD code) and NASTRAN. In addition, the Aeroelastic Research Wing (ARW-2) is used for demonstration of the aeroelastic coupling procedure by using ENSAERO (NASA Ames Research Center CFD code) and a finite element wing-box code (developed as part of this research).
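The coupling procedure described above can be summarized, in structure, as a fixed-point iteration in which the CFD and CSD solvers are called as black boxes and surface loads and deflections are exchanged until the deformed shape converges. The sketch below is an assumed, generic outline of such a loop; run_cfd(), run_csd(), and the two transfer functions are placeholders for the actual codes (e.g., NASTD or ENSAERO and NASTRAN) and interpolation schemes.

    def static_aeroelastic_loop(jig_shape, run_cfd, loads_to_structure,
                                run_csd, deflections_to_aero,
                                relax=0.5, tol=1e-4, max_iter=50):
        """Loosely coupled static aeroelastic iteration with under-relaxation."""
        shape = list(jig_shape)
        for iteration in range(1, max_iter + 1):
            aero_loads = run_cfd(shape)                       # CFD on current shape
            struct_loads = loads_to_structure(aero_loads)     # load transfer
            deflections = run_csd(struct_loads)               # structural solve
            new_shape = deflections_to_aero(jig_shape, deflections)
            change = max(abs(n - s) for n, s in zip(new_shape, shape))
            shape = [s + relax * (n - s) for s, n in zip(shape, new_shape)]
            if change < tol:
                return shape, iteration
        return shape, max_iter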
14 CFR 417.207 - Trajectory analysis.
Code of Federal Regulations, 2013 CFR
2013-01-01
... any stage that has the potential to impact the Earth and does not burn to propellant depletion before a programmed thrust termination. (3) For launch vehicles flown with a flight safety system, a...
14 CFR 417.207 - Trajectory analysis.
Code of Federal Regulations, 2012 CFR
2012-01-01
... any stage that has the potential to impact the Earth and does not burn to propellant depletion before a programmed thrust termination. (3) For launch vehicles flown with a flight safety system, a...
14 CFR 417.207 - Trajectory analysis.
Code of Federal Regulations, 2014 CFR
2014-01-01
... any stage that has the potential to impact the Earth and does not burn to propellant depletion before a programmed thrust termination. (3) For launch vehicles flown with a flight safety system, a...
Mielczarek, M; Frąszczak, M; Giannico, R; Minozzi, G; Williams, John L; Wojdak-Maksymiec, K; Szyda, J
2017-07-01
Thirty-two whole genome DNA sequences of cows were analyzed to evaluate inter-individual variability in the distribution and length of copy number variations (CNV) and to functionally annotate CNV breakpoints. The total number of deletions per individual varied between 9,731 and 15,051, whereas the number of duplications was between 1,694 and 5,187. Most of the deletions (81%) and duplications (86%) were unique to a single cow. No relation between the pattern of variant sharing and a family relationship or disease status was found. The animal-averaged length of deletions was from 5,234 to 9,145 bp and the average length of duplications was between 7,254 and 8,843 bp. Highly significant inter-individual variation in length and number of CNV was detected for both deletions and duplications. The majority of deletion and duplication breakpoints were located in intergenic regions and introns, whereas fewer were identified in noncoding transcripts and splice regions. Only 1.35 and 0.79% of the deletion and duplication breakpoints were observed within coding regions. A gene with the highest number of deletion breakpoints codes for protein kinase cGMP-dependent type I, whereas the T-cell receptor α constant gene had the most duplication breakpoints. The functional annotation of genes with the largest incidence of deletion/duplication breakpoints identified 87/112 Kyoto Encyclopedia of Genes and Genomes pathways, but none of the pathways were significantly enriched or depleted with breakpoints. The analysis of Gene Ontology (GO) terms revealed that a cluster with the highest enrichment score among genes with many deletion breakpoints was represented by GO terms related to ion transport, whereas the GO term cluster mostly enriched among the genes with many duplication breakpoints was related to binding of macromolecules. Furthermore, when considering the number of deletion breakpoints per gene functional category, no significant differences were observed between the "housekeeping" and "strong selection" categories, but genes representing the "low selection pressure" group showed a significantly higher number of breakpoints. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Development of Safety Analysis Code System of Beam Transport and Core for Accelerator Driven System
NASA Astrophysics Data System (ADS)
Aizawa, Naoto; Iwasaki, Tomohiko
2014-06-01
A safety analysis code system for the beam transport and core of an accelerator-driven system (ADS) was developed for the analysis of beam transients such as changes in the shape and position of the incident beam. The code system consists of a beam transport analysis part and a core analysis part. TRACE 3-D is employed in the beam transport analysis part to calculate the shape and incident position of the beam at the target. In the core analysis part, the neutronics, thermal-hydraulics, and cladding failure analyses are performed with the ADS dynamic calculation code ADSE, on the basis of an external source database calculated by PHITS and a cross-section database calculated by SRAC, together with cladding failure analysis programs for thermoelastic and creep behavior. Using the code system, beam transient analyses were performed for the ADS proposed by the Japan Atomic Energy Agency. The results show that the cladding temperature rises rapidly and plastic deformation occurs within several seconds; in addition, the cladding is evaluated to fail by creep within a hundred seconds. These results show that such beam transients can cause cladding failure.
EBT reactor systems analysis and cost code: description and users guide (Version 1)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santoro, R.T.; Uckan, N.A.; Barnes, J.M.
1984-06-01
An ELMO Bumpy Torus (EBT) reactor systems analysis and cost code that incorporates the most recent advances in EBT physics has been written. The code determines a set of reactors that fall within an allowed operating window determined from the coupling of ring and core plasma properties and the self-consistent treatment of the coupled ring-core stability and power balance requirements. The essential elements of the systems analysis and cost code are described, along with the calculational sequences leading to the specification of the reactor options and their associated costs. The input parameters, the constraints imposed upon them, and the operating range over which the code provides valid results are discussed. A sample problem and the interpretation of the results are also presented.
ERIC Educational Resources Information Center
Hau, Goh Bak; Siraj, Saedah; Alias, Norlidah; Rauf, Rose Amnah Abd.; Zakaria, Abd. Razak; Darusalam, Ghazali
2013-01-01
This study provides a content analysis of selected articles in the field of QR codes and their application in educational contexts that were published in journals and proceedings of international conferences and workshops from 2006 to 2011. These articles were cross-analysed by publication year, journal, and research topic. Further analysis was…
Computer codes developed and under development at Lewis
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1992-01-01
The objective of this summary is to provide a brief description of: (1) codes developed or under development at LeRC; and (2) the development status of IPACS with some typical early results. The computer codes that have been developed and/or are under development at LeRC are listed in the accompanying charts. This list includes: (1) the code acronym; (2) select physics descriptors; (3) current enhancements; and (4) present (9/91) code status with respect to its availability and documentation. The computer codes list is grouped by related functions such as: (1) composite mechanics; (2) composite structures; (3) integrated and 3-D analysis; (4) structural tailoring; and (5) probabilistic structural analysis. These codes provide a broad computational simulation infrastructure (technology base-readiness) for assessing the structural integrity/durability/reliability of propulsion systems. These codes serve two other very important functions: they provide an effective means of technology transfer; and they constitute a depository of corporate memory.
Improvements in the MGA Code Provide Flexibility and Better Error Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruhter, W D; Kerr, J
2005-05-26
The Multi-Group Analysis (MGA) code is widely used to determine nondestructively the relative isotopic abundances of plutonium by gamma-ray spectrometry. MGA users have expressed concern about the lack of flexibility and transparency in the code. Users often have to ask the code developers for modifications to the code to accommodate new measurement situations, such as additional peaks being present in the plutonium spectrum or expected peaks being absent. We are testing several new improvements to a prototype, general gamma-ray isotopic analysis tool with the intent of either revising or replacing the MGA code. These improvements will give the user the ability to modify, add, or delete the gamma- and x-ray energies and branching intensities used by the code in determining a more precise gain and in the determination of the relative detection efficiency. We have also fully integrated the determination of the relative isotopic abundances with the determination of the relative detection efficiency to provide a more accurate determination of the errors in the relative isotopic abundances. We provide details in this paper on these improvements and a comparison of results obtained with current versions of the MGA code.
A Computer Program for Flow-Log Analysis of Single Holes (FLASH)
Day-Lewis, F. D.; Johnson, C.D.; Paillet, Frederick L.; Halford, K.J.
2011-01-01
A new computer program, FLASH (Flow-Log Analysis of Single Holes), is presented for the analysis of borehole vertical flow logs. The code is based on an analytical solution for steady-state multilayer radial flow to a borehole. The code includes options for (1) discrete fractures and (2) multilayer aquifers. Given vertical flow profiles collected under both ambient and stressed (pumping or injection) conditions, the user can estimate fracture (or layer) transmissivities and far-field hydraulic heads. FLASH is coded in Microsoft Excel with Visual Basic for Applications routines. The code supports manual and automated model calibration. © 2011, The Author(s). Ground Water © 2011, National Ground Water Association.
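The underlying steady-state interpretation can be written compactly: under each condition (ambient and stressed), the inflow from layer i is modeled as Q_i = C * T_i * (h_i - h_w), with C a radial-flow geometry factor, so the two measured flow profiles and the two borehole water levels give two equations per layer for the transmissivity T_i and far-field head h_i. The sketch below is a hedged paraphrase of that algebra, not code from FLASH; inputs must be in consistent units and the geometry factor C is assumed.

    def layer_properties(q_ambient, q_stressed, hw_ambient, hw_stressed, c=1.0):
        """Per-layer (transmissivity, far-field head) from two flow-log conditions."""
        results = []
        for qa, qs in zip(q_ambient, q_stressed):
            # qa = c*T*(h - hw_ambient) and qs = c*T*(h - hw_stressed)
            t = (qs - qa) / (c * (hw_ambient - hw_stressed))
            h = hw_ambient + qa / (c * t)
            results.append((t, h))
        return results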
Environmental science: Eating ourselves dry
NASA Astrophysics Data System (ADS)
Aldaya, Maite M.
2017-03-01
Do human consumption habits affect groundwater depletion as a result of international food trade? A global analysis indicates that they do, and shows which products and countries have the biggest impact. See Letter p.700
Lee, S. M.; Thatcher, N.; Dougal, M.; Margison, G. P.
1993-01-01
There is increasing experimental evidence to suggest that endogenous expression of O6-alkylguanine-DNA-alkyltransferase (ATase) is a major factor in cellular resistance to certain chemotherapeutic agents including dacarbazine (DTIC). We have recently shown wide interindividual variation in the depletion and subsequent regeneration of ATase in peripheral blood mononuclear cells (PMCs) following DTIC and this has now been extended to ascertain whether or not depletion is related to dosage of DTIC used and repeated treatment cycles of chemotherapy. ATase levels were measured in three groups of 25 patients (pts) up to 24 h after receiving DTIC at 400 mg m-2, 500 mg m-2 or 800 mg m-2. Each group also received fotemustine (100 mg m-2), 4 h after DTIC. The lowest extent of ATase depletion (highest nadir ATase) was seen in patients receiving 400 mg m-2. The mean nadir ATase, expressed as a percentage of pre-treatment ATase, was respectively 56.3%, 26.4% and 23.9% for 400 mg m-2, 500 mg m-2 and 800 mg m-2. The median nadir of ATase activity for pts receiving 800 mg m-2 pts was at 4-6 h and for pts given lower doses it was at 2-3 h. In addition, repeated measures analysis of variance of observations before chemotherapy, then at 2, 3, 4, 6 and 18 h after chemotherapy provides some evidence that ATase was depleted to a lesser extent after cycle 1 than after subsequent cycles (P = 0.025). It also provides evidence that the change in ATase activity over time varied with dose and cycle. The findings can be interpreted on the basis of a dosage-dependent metabolism of DTIC to an agent capable of methylation of DNA and subsequent depletion of PMC ATase: with higher DTIC doses, the extent of ATase depletion may be limited by the pharmacokinetics of DTIC metabolism. PMC ATase was measured in another group of 8 pts at various times after receiving only fotemustine (100 mg m-2) and in contrast to DTIC, no ATase depletion was seen suggesting that insufficient concentrations of fotemustine and/or its metabolites were available to react with DNA to produce a depletion of PMC ATase activity. PMID:8431354
Unfolding DNA condensates produced by DNA-like charged depletants: A force spectroscopy study
NASA Astrophysics Data System (ADS)
Lima, C. H. M.; Rocha, M. S.; Ramos, E. B.
2017-02-01
In this work, we have measured, by means of optical tweezers, forces acting on depletion-induced DNA condensates due to the presence of the DNA-like charged protein bovine serum albumin (BSA). The stretching and unfolding measurements performed on the semi-flexible DNA chain reveal (1) the softening of the uncondensed DNA contour length and (2) a mechanical behavior strikingly different from those previously observed: the force-extension curves of BSA-induced DNA condensates lack the "saw-tooth" pattern and applied external forces as high as ≈80 pN are unable to fully unfold the condensed DNA contour length. This last mechanical experimental finding is in agreement with force-induced "unpacking" detailed Langevin dynamics simulations recently performed by Cortini et al. on model rod-like shaped condensates. Furthermore, a simple thermodynamics analysis of the unfolding process has enabled us to estimate the free energy involved in the DNA condensation: the estimated depletion-induced interactions vary linearly with both the condensed DNA contour length and the BSA concentration, in agreement with the analytical and numerical analysis performed on model DNA condensates. We hope that future additional experiments can decide whether the rod-like morphology is the actual one we are dealing with (e.g. pulling experiments coupled with super-resolution fluorescence microscopy).
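The "simple thermodynamics analysis" mentioned above amounts to integrating the measured force over the pulled-out contour length, W = ∫ F dx, and reading the result as an estimate of the depletion free energy stored in the condensate. A hedged numerical sketch with invented plateau values, not the measured data:

    import numpy as np

    extension_nm = np.linspace(0.0, 500.0, 51)      # pulled-out contour length (nm)
    force_pn = np.full_like(extension_nm, 80.0)     # assumed ~80 pN force plateau
    # trapezoidal integration of F dx gives the mechanical work in pN*nm
    work = float(np.sum(0.5 * (force_pn[1:] + force_pn[:-1]) * np.diff(extension_nm)))
    kT = 4.11                                       # thermal energy in pN*nm near 298 K
    print(f"unfolding work ~ {work:.0f} pN*nm ~ {work / kT:.0f} kT")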
Huang, Chih-Yang; Liou, Show-Yih; Kuo, Wei-Wen; Wu, Hsi-Chin; Chang, Yen-Lin; Chen, Tung-Sheng
2016-12-01
Regular hemodialysis treatment induces an elevation in oxidative stress in patients with end-stage renal failure, resulting in oxidative damage of the most abundant serum protein, albumin. Oxidation of serum albumin causes depletion of albumin reactive thiols, leading to oxidative modification of serum albumin. The aim of this study was to screen the antioxidant capacity of albumins isolated from uremic patients (HD-ALB) or healthy volunteers (N-ALB). From high-performance liquid chromatography spectra, we observed that one uremic solute binds to HD-ALB via the formation of disulfide bonds between HD-ALB and the uremic solute. Furthermore, we found using chemiluminescent analysis that the antioxidant capacities of N-ALB to scavenge reactive oxygen species, including singlet oxygen, hypochlorite and hydrogen peroxide, were higher than those of HD-ALB. Our results suggest that a protein-bound uremic solute binds to albumin via formation of disulfide bonds, resulting in the depletion of albumin reactive thiols. The depletion of albumin reactive thiols leads to a reduced antioxidant capacity of HD-ALB, implying postmodification of albumin. This situation may reduce the antioxidant capacity of albumin and increase oxidative stress, resulting in an increase in complications related to oxidative damage in uremic patients. Copyright © 2016 John Wiley & Sons, Ltd.
Présent, Romain M; Rotureau, Elise; Billard, Patrick; Pagnout, Christophe; Sohm, Bénédicte; Flayac, Justine; Gley, Renaud; Pinheiro, José P; Duval, Jérôme F L
2017-11-08
Genetically engineered microorganisms are alternatives to physicochemical methods for remediation of metal-contaminated aquifers due to their remarkable bioaccumulation capacities. The design of such biosystems would benefit from the elaboration of a sound quantitative connection between performance in terms of metal removal from aqueous solution and dynamics of the multiscale processes leading to metal biouptake. In this work, this elaboration is reported for Escherichia coli cells modified to overexpress intracellular metallothionein (MTc), a strong proteinaceous metal chelator. Depletion kinetics of Cd(ii) from bulk solution following biouptake and intracellular accumulation is addressed as a function of cell volume fraction using electroanalytical probes and ligand exchange-based analyses. It is shown that metal biouptake in the absence and presence of MTc is successfully interpreted on the basis of a formalism recently developed for metal partitioning dynamics at biointerfaces with integration of intracellular metal speciation. The analysis demonstrates how fast sequestration of metals by intracellular MTc bypasses metal excretion (efflux) and enhances the rate of metal depletion to an extent such that complete removal is achieved at sufficiently large cell volume fractions. The magnitude of the stability constant of nanoparticulate metal-MTc complexes, as derived from refined analysis of macroscopic bulk metal depletion data, is further confirmed by independent electrochemical measurement of metal binding by purified MTc extracts.
Genes and Gut Bacteria Involved in Luminal Butyrate Reduction Caused by Diet and Loperamide.
Hwang, Nakwon; Eom, Taekil; Gupta, Sachin K; Jeong, Seong-Yeop; Jeong, Do-Youn; Kim, Yong Sung; Lee, Ji-Hoon; Sadowsky, Michael J; Unno, Tatsuya
2017-11-28
Unbalanced dietary habits and gut dysmotility are causative factors in metabolic and functional gut disorders, including obesity, diabetes, and constipation. Reduction in luminal butyrate synthesis is known to be associated with gut dysbioses, and studies have suggested that restoring butyrate formation in the colon may improve gut health. In contrast, shifts in different types of gut microbiota may inhibit luminal butyrate synthesis, requiring different treatments to restore colonic bacterial butyrate synthesis. We investigated the influence of high-fat diets (HFD) and low-fiber diets (LFD), and loperamide (LPM) administration, on key bacteria and genes involved in reduction of butyrate synthesis in mice. MiSeq-based microbiota analysis and HiSeq-based differential gene analysis indicated that different types of bacteria and genes were involved in butyrate metabolism in each treatment. Dietary modulation depleted butyrate kinase and phosphate butyryltransferase by decreasing members of the Bacteroidales and Parabacteroides. The HFD also depleted genes involved in succinate synthesis by decreasing Lactobacillus. The LFD and LPM treatments depleted genes involved in crotonoyl-CoA synthesis by decreasing Roseburia and Oscillibacter. Taken together, our results suggest that different types of bacteria and genes were involved in gut dysbiosis, and that selected treatments may be needed depending on the cause of gut dysfunction.
Barriers to Early Detection of Breast Cancer Among African American Females Over Age of 55
2005-02-01
used for data analysis. NUDIST, software for qualitative data analysis, will be used for systematic coding. All transcripts, as well as interviewer notes...will be coded in NUDIST. Dr. Smith and Mr. Worts will jointly develop the NUDIST coding system. Each of them will separately code each transcript and...already provided training in NUDIST to Dr. Smith and Mr. Worts. All interviews will be conducted by the Principal Investigator for this study who is
Development and application of structural dynamics analysis capabilities
NASA Technical Reports Server (NTRS)
Heinemann, Klaus W.; Hozaki, Shig
1994-01-01
Extensive research activities were performed in the area of multidisciplinary modeling and simulation of aerospace vehicles that are relevant to NASA Dryden Flight Research Facility. The efforts involved theoretical development, computer coding, and debugging of the STARS code. New solution procedures were developed in such areas as structures, CFD, and graphics, among others. Furthermore, systems-oriented codes were developed for rendering the code truly multidisciplinary and rather automated in nature. Also, work was performed in pre- and post-processing of engineering analysis data.
Simplified diagnostic coding sheet for computerized data storage and analysis in ophthalmology.
Tauber, J; Lahav, M
1987-11-01
A review of currently available diagnostic coding systems revealed that most are either too abbreviated or too detailed. We have compiled a simplified diagnostic coding sheet based on the International Classification of Diseases (ICD-9), which is both complete and easy to use in a general practice. The information is transferred to a computer, which uses the relevant ICD-9 diagnoses as a database; the data can be retrieved later for display of patients' problems or for analysis of clinical data.
Teaching, Morality, and Responsibility: A Structuralist Analysis of a Teachers' Code of Conduct
ERIC Educational Resources Information Center
Shortt, Damien; Hallett, Fiona; Spendlove, David; Hardy, Graham; Barton, Amanda
2012-01-01
In this paper we conduct a Structuralist analysis of the General Teaching Council for England's "Code of Conduct and Practice for Registered Teachers" in order to reveal how teachers are required to fulfil an apparently impossible social role. The GTCE's "Code," we argue, may be seen as an attempt by a government agency to…
New Tool Released for Engine-Airframe Blade-Out Structural Simulations
NASA Technical Reports Server (NTRS)
Lawrence, Charles
2004-01-01
Researchers at the NASA Glenn Research Center have enhanced a general-purpose finite element code, NASTRAN, for engine-airframe structural simulations during steady-state and transient operating conditions. For steady-state simulations, the code can predict critical operating speeds, natural modes of vibration, and forced response (e.g., cabin noise and component fatigue). The code can be used to perform static analysis to predict engine-airframe response and component stresses due to maneuver loads. For transient response, the simulation code can be used to predict response due to blade-out events and subsequent engine shutdown and windmilling conditions. In addition, the code can be used as a pretest analysis tool to predict the results of the blade-out test required for FAA certification of new and derivative aircraft engines. Before the present analysis code was developed, all the major aircraft engine and airframe manufacturers in the United States and overseas were performing similar types of analyses to ensure the structural integrity of engine-airframe systems. Although there were many similarities among the analysis procedures, each manufacturer was developing and maintaining its own structural analysis capabilities independently. This situation led to high software development and maintenance costs, complications with manufacturers exchanging models and results, and limitations in predicting the structural response to the desired degree of accuracy. An industry-NASA team was formed to overcome these problems by developing a common analysis tool that would satisfy all the structural analysis needs of the industry and that would be available and supported by a commercial software vendor so that the team members would be relieved of maintenance and development responsibilities. Input from all the team members was used to ensure that everyone's requirements were satisfied and that the best technology was incorporated into the code. Furthermore, because the code would be distributed by a commercial software vendor, it would be more readily available to engine and airframe manufacturers, as well as to nonaircraft companies that did not previously have access to this capability.
Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor
NASA Astrophysics Data System (ADS)
Mkhabela, Peter Tshepo
The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (based on neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be an HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, HTR transient analysis is more demanding since it requires proper treatment of both slower and much longer transients (with time scales of hours and days) and fast and short transients (with time scales of minutes and seconds). There is limited operational and experimental data available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models, subsequently implemented in robust, efficient, and accurate computational tools, to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study contributed to a greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and the various multi-physics effects that can influence it. The feedback due to the influence of leakage was taken into account by developing and implementing improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate for proper interpolation from cross-section tables. To achieve this, the NEM/THERMIX coupled code system was developed to create a system that is efficient and stable over the duration of transient calculations that last over several tens of hours. Another achievement of the PhD thesis was the development and demonstration of a full-physics, three-dimensional safety analysis methodology for the PBMR to provide reference solutions. Different aspects of the coupled methodology were investigated and an efficient kinetics treatment for the PBMR was developed, which accounts for all feedback phenomena in an efficient manner. The OECD/NEA PBMR-400 coupled code benchmark was used as a test matrix for the proposed investigations. The integrated thermal-hydraulics and neutronics (multi-physics) methods were extended to enable modeling of a wider range of transients pertinent to the PBMR. First, the effect of the spatial mapping schemes (spatial coupling) was studied and quantified for different types of transients, which resulted in implementation of an improved mapping methodology based on user-defined criteria. The second aspect that was studied and optimized was the temporal coupling and meshing schemes between the neutronics and thermal-hydraulics time step selection algorithms. Coupled code convergence was achieved and was supplemented by the application of acceleration methods. Finally, the modeling of all feedback phenomena in PBMRs was investigated and a novel treatment of cross-section dependencies was introduced to improve the representation of cross-section variations. An added benefit was that, in the process of studying and improving the coupled multi-physics methodology, more insight was gained into the physics and dynamics of the PBMR, which will also help to optimize the PBMR design and improve its safety.
One unique contribution of the PhD research is the investigation of the importance of the correct representation of the three-dimensional (3-D) effects in the PBMR analysis. The performed studies demonstrated that explicit 3-D modeling of control rod movement is superior and removes the errors associated with the grey curtain (2-D homogenized) approximation.
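For illustration only: the sketch below shows, under stated assumptions, the kind of operator-split neutronics/thermal-hydraulics coupling with Picard iteration on the temperature feedback that the thesis describes at a high level. The solver functions, feedback coefficients, and tolerances are invented placeholders, not the NEM/THERMIX implementation.

```python
# Minimal sketch (not the NEM/THERMIX implementation) of an operator-split
# neutronics/thermal-hydraulics coupling step with Picard iteration on the
# fuel-temperature feedback; all numbers and function forms are placeholders.

def solve_neutronics(fuel_temp_k):
    """Placeholder: power density that weakens as the fuel heats up
    (negative Doppler-like feedback)."""
    return 2.0e8 / (1.0 + 1.0e-4 * (fuel_temp_k - 900.0))  # W/m^3

def solve_thermal_hydraulics(power_density_w_m3):
    """Placeholder: map power density to an updated fuel temperature."""
    return 800.0 + 5.0e-7 * power_density_w_m3  # K

def coupled_step(fuel_temp_k=1000.0, tol=1e-6, max_iters=50):
    """Picard-iterate the two single-physics solves until the feedback
    variable (fuel temperature) stops changing."""
    for _ in range(max_iters):
        power = solve_neutronics(fuel_temp_k)
        new_temp = solve_thermal_hydraulics(power)
        if abs(new_temp - fuel_temp_k) < tol * max(abs(fuel_temp_k), 1.0):
            return new_temp, power
        fuel_temp_k = new_temp
    raise RuntimeError("coupled step did not converge")

if __name__ == "__main__":
    temp, power = coupled_step()
    print(f"converged fuel temperature: {temp:.2f} K, power: {power:.3e} W/m^3")
```

In the actual coupled system the single-physics solves would be full-core field calculations, and convergence would be checked on the exchanged fields rather than a single scalar.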
Sehrawat, Ankita; Abat, Jasmeet K.; Deswal, Renu
2013-01-01
Although a good number of S-nitrosylated proteins have been identified in the last few years, information on endogenous targets is still limited. Therefore, an attempt was made to decipher NO signaling in cold-treated Brassica juncea seedlings. Treatment of seedlings with the substrate, cofactor and inhibitor of nitric oxide synthase and nitrate reductase (NR) indicated NR-mediated NO biosynthesis in the cold. Analysis of the in vivo thiols showed depletion of low molecular weight thiols and enhancement of available protein thiols, suggesting redox changes. To have a detailed view, S-nitrosylation analysis was done using the biotin switch technique (BST) and avidin-affinity chromatography. Ribulose-1,5-bisphosphate carboxylase/oxygenase (RuBisCO) is S-nitrosylated and is therefore repeatedly identified as a target due to its abundance. It also competes out low-abundance proteins which are important NO signaling components. Therefore, RuBisCO was removed (over 80%) using immunoaffinity purification. Purified S-nitrosylated, RuBisCO-depleted proteins were resolved on a 2-D gel as 110 spots, including 13 new ones, which were absent in the crude S-nitrosoproteome. These were identified by nLC-MS/MS as thioredoxin, fructose bisphosphate aldolase class I, myrosinase, salt-responsive proteins, peptidyl-prolyl cis-trans isomerase and malate dehydrogenase. Cold showed differential S-nitrosylation of 15 spots, enhanced superoxide dismutase activity (via S-nitrosylation) and promoted the detoxification of superoxide radicals. Increased S-nitrosylation of glyceraldehyde-3-phosphate dehydrogenase, sedoheptulose-bisphosphatase, and fructose bisphosphate aldolase indicated regulation of the Calvin cycle by S-nitrosylation. The results showed that RuBisCO depletion improved proteome coverage and provided clues for NO signaling in cold. PMID:24032038
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holmes, Thomas D.; Guilmette, Raymond A.; Cheng, Yung-Sung
2009-03-01
The Capstone Depleted Uranium Aerosol Study was undertaken to obtain aerosol samples resulting from a kinetic-energy cartridge with a large-caliber depleted uranium (DU) penetrator striking an Abrams or Bradley test vehicle. The sampling strategy was designed to (1) optimize the performance of the samplers and maintain their integrity in the extreme environment created during perforation of an armored vehicle by a DU penetrator, (2) collect aerosols as a function of time post-impact, and (3) obtain size-classified samples for analysis of chemical composition, particle morphology, and solubility in lung fluid. This paper describes the experimental setup and sampling methodologies used to achieve these objectives. Custom-designed arrays of sampling heads were secured to the inside of the target in locations approximating the breathing zones of the vehicle commander, loader, gunner, and driver. Each array was designed to support nine filter cassettes and nine cascade impactors mounted with quick-disconnect fittings. Shielding and sampler placement strategies were used to minimize sampler loss caused by the penetrator impact and the resulting fragments of eroded penetrator and perforated armor. A cyclone train was used to collect larger quantities of DU aerosol for chemical composition and solubility. A moving filter sample was used to obtain semicontinuous samples for depleted uranium concentration determination. Control for the air samplers was provided by five remotely located valve control and pressure monitoring units located inside and around the test vehicle. These units were connected to a computer interface chassis and controlled using a customized LabVIEW engineering computer control program. The aerosol sampling arrays and control systems for the Capstone study provided the needed aerosol samples for physicochemical analysis, and the resultant data were used for risk assessment of exposure to DU aerosol.
Liu, Fengliang; Fan, Xiuzhen; Auclair, Sarah; Ferguson, Monique; Sun, Jiaren; Soong, Lynn; Hou, Wei; Redfield, Robert R.; Birx, Deborah L.; Ratto-Kim, Silvia; Robb, Merlin L.; Kim, Jerome H.; Michael, Nelson L.; Hu, Haitao
2016-01-01
Loss of immune control over opportunistic infections can occur at different stages of HIV-1 (HIV) disease, among which mucosal candidiasis caused by the fungal pathogen Candida albicans (C. albicans) is one of the early and common manifestations in HIV-infected human subjects. The underlying immunological basis is not well defined. We have previously shown that compared to cytomegalovirus (CMV)-specific CD4 cells, C. albicans-specific CD4 T cells are highly permissive to HIV in vitro. Here, based on an antiretroviral treatment (ART) naïve HIV infection cohort (RV21), we investigated longitudinally the impact of HIV on C. albicans- and CMV-specific CD4 T-cell immunity in vivo. We found a sequential dysfunction and preferential depletion for C. albicans-specific CD4 T cell response during progressive HIV infection. Compared to Th1 (IFN-γ, MIP-1β) functional subsets, the Th17 functional subsets (IL-17, IL-22) of C. albicans-specific CD4 T cells were more permissive to HIV in vitro and impaired earlier in HIV-infected subjects. Infection history analysis showed that C. albicans-specific CD4 T cells were more susceptible to HIV in vivo, harboring modestly but significantly higher levels of HIV DNA, than CMV-specific CD4 T cells. Longitudinal analysis of HIV-infected individuals with ongoing CD4 depletion demonstrated that C. albicans-specific CD4 T-cell response was preferentially and progressively depleted. Taken together, these data suggest a potential mechanism for earlier loss of immune control over mucosal candidiasis in HIV-infected patients and provide new insights into pathogen-specific immune failure in AIDS pathogenesis. PMID:27280548
National Combustion Code Parallel Performance Enhancements
NASA Technical Reports Server (NTRS)
Quealy, Angela; Benyo, Theresa (Technical Monitor)
2002-01-01
The National Combustion Code (NCC) is being developed by an industry-government team for the design and analysis of combustion systems. The unstructured grid, reacting flow code uses a distributed memory, message passing model for its parallel implementation. The focus of the present effort has been to improve the performance of the NCC code to meet combustor designer requirements for model accuracy and analysis turnaround time. Improving the performance of this code contributes significantly to the overall reduction in time and cost of the combustor design cycle. This report describes recent parallel processing modifications to NCC that have improved the parallel scalability of the code, enabling a two hour turnaround for a 1.3 million element fully reacting combustion simulation on an SGI Origin 2000.
NASA Astrophysics Data System (ADS)
Sikder, Somali; Ghosh, Shila
2018-02-01
This paper presents the construction of unipolar transposed modified Walsh code (TMWC) and analysis of its performance in optical code-division multiple-access (OCDMA) systems. Specifically, the signal-to-noise ratio, bit error rate (BER), cardinality, and spectral efficiency were investigated. The theoretical analysis demonstrated that the wavelength-hopping time-spreading system using TMWC was robust against multiple-access interference and more spectrally efficient than systems using other existing OCDMA codes. In particular, the spectral efficiency was calculated to be 1.0370 when TMWC of weight 3 was employed. The BER and eye pattern for the designed TMWC were also successfully obtained using OptiSystem simulation software. The results indicate that the proposed code design is promising for enhancing network capacity.
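As an illustrative aside, the snippet below generates standard unipolar Walsh-Hadamard codewords via the Sylvester construction; the specific transposition and modification steps that define TMWC are not given in the abstract, so only the baseline Walsh codes they build upon are sketched here.

```python
# Sketch of standard unipolar Walsh-Hadamard code generation (Sylvester
# construction); this is NOT the transposed modified Walsh code of the paper,
# only the conventional Walsh codes it is derived from.
import numpy as np

def hadamard(order):
    """Return a 2^order x 2^order Sylvester-type Hadamard matrix (+1/-1)."""
    H = np.array([[1]])
    for _ in range(order):
        H = np.block([[H, H], [H, -H]])
    return H

def unipolar_walsh_codes(order):
    """Map the bipolar Hadamard rows to unipolar (0/1) Walsh code sequences."""
    return ((hadamard(order) + 1) // 2).astype(int)

codes = unipolar_walsh_codes(3)            # eight codewords of length 8
print(codes)
print("weight of each codeword:", codes.sum(axis=1))
```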
Analysis of Phenix end-of-life natural convection test with the MARS-LMR code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeong, H. Y.; Ha, K. S.; Lee, K. L.
The end-of-life test of Phenix reactor performed by the CEA provided an opportunity to have reliable and valuable test data for the validation and verification of a SFR system analysis code. KAERI joined this international program for the analysis of Phenix end-of-life natural circulation test coordinated by the IAEA from 2008. The main objectives of this study were to evaluate the capability of existing SFR system analysis code MARS-LMR and to identify any limitation of the code. The analysis was performed in three stages: pre-test analysis, blind posttest analysis, and final post-test analysis. In the pre-test analysis, the design conditionsmore » provided by the CEA were used to obtain a prediction of the test. The blind post-test analysis was based on the test conditions measured during the tests but the test results were not provided from the CEA. The final post-test analysis was performed to predict the test results as accurate as possible by improving the previous modeling of the test. Based on the pre-test analysis and blind test analysis, the modeling for heat structures in the hot pool and cold pool, steel structures in the core, heat loss from roof and vessel, and the flow path at core outlet were reinforced in the final analysis. The results of the final post-test analysis could be characterized into three different phases. In the early phase, the MARS-LMR simulated the heat-up process correctly due to the enhanced heat structure modeling. In the mid phase before the opening of SG casing, the code reproduced the decrease of core outlet temperature successfully. Finally, in the later phase the increase of heat removal by the opening of the SG opening was well predicted with the MARS-LMR code. (authors)« less
NASA Astrophysics Data System (ADS)
Powers, Jeffrey J.
2011-12-01
This study focused on creating a new tristructural isotropic (TRISO) coated particle fuel performance model and demonstrating the integration of this model into an existing system of neutronics and heat transfer codes, creating a user-friendly option for including fuel performance analysis within system design optimization and system-level trade-off studies. The end product enables both a deeper understanding and better overall system performance of nuclear energy systems limited or greatly impacted by TRISO fuel performance. A thorium-fueled hybrid fusion-fission Laser Inertial Fusion Energy (LIFE) blanket design was used for illustrating the application of this new capability and demonstrated both the importance of integrating fuel performance calculations into mainstream design studies and the impact that this new integrated analysis had on system-level design decisions. A new TRISO fuel performance model named TRIUNE was developed and verified and validated during this work with a novel methodology established for simulating the actual lifetime of a TRISO particle during repeated passes through a pebble bed. In addition, integrated self-consistent calculations were performed for neutronics depletion analysis, heat transfer calculations, and then fuel performance modeling for a full parametric study that encompassed over 80 different design options that went through all three phases of analysis. Lastly, side studies were performed that included a comparison of thorium and depleted uranium (DU) LIFE blankets as well as some uncertainty quantification work to help guide future experimental work by assessing what material properties in TRISO fuel performance modeling are most in need of improvement. A recommended thorium-fueled hybrid LIFE engine design was identified with an initial fuel load of 20 MT of thorium, 15% TRISO packing within the graphite fuel pebbles, and a 20 cm neutron multiplier layer with beryllium pebbles in flibe molten salt coolant. It operated at a system power level of 2000 MWth, took about 3.5 years to reach full plateau power, and was capable of an End of Plateau burnup of 38.7 %FIMA if considering just the neutronic constraints in the system design; however, fuel performance constraints led to a maximum credible burnup of 12.1 %FIMA due to a combination of internal gas pressure and irradiation effects on the TRISO materials (especially PyC) leading to SiC pressure vessel failures. The optimal neutron spectrum for the thorium-fueled blanket options evaluated seemed to favor a hard spectrum (low but non-zero neutron multiplier thicknesses and high TRISO packing fractions) in terms of neutronic performance but the fuel performance constraints demonstrated that a significantly softer spectrum would be needed to decrease the rate of accumulation of fast neutron fluence in order to improve the maximum credible burnup the system could achieve.
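As a hedged illustration of one failure driver named above (internal gas pressure loading the SiC layer), the sketch below combines an ideal-gas pressure estimate with the thin-wall spherical-shell hoop stress formula sigma = P r / (2 t). All numerical values are invented placeholders and this is not the TRIUNE model.

```python
# Back-of-the-envelope sketch (not the TRIUNE model) of internal gas pressure
# loading the SiC layer of a TRISO particle, treated as a thin spherical shell.
# All numbers below are illustrative assumptions.
import math

R_GAS = 8.314  # J/(mol K)

def fission_gas_pressure(n_gas_mol, free_volume_m3, temperature_k):
    """Ideal-gas estimate of the internal pressure from released fission gas."""
    return n_gas_mol * R_GAS * temperature_k / free_volume_m3

def sic_hoop_stress(pressure_pa, inner_radius_m, thickness_m):
    """Thin-wall spherical shell hoop stress: sigma = P * r / (2 t)."""
    return pressure_pa * inner_radius_m / (2.0 * thickness_m)

p = fission_gas_pressure(n_gas_mol=1.0e-9, free_volume_m3=3.0e-13, temperature_k=1200.0)
sigma = sic_hoop_stress(p, inner_radius_m=400e-6, thickness_m=35e-6)
print(f"gas pressure ~ {p/1e6:.1f} MPa, SiC hoop stress ~ {sigma/1e6:.1f} MPa")
```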
Multi-Species Fluxes for the Parallel Quiet Direct Simulation (QDS) Method
NASA Astrophysics Data System (ADS)
Cave, H. M.; Lim, C.-W.; Jermy, M. C.; Krumdieck, S. P.; Smith, M. R.; Lin, Y.-J.; Wu, J.-S.
2011-05-01
Fluxes of multiple species are implemented in the Quiet Direct Simulation (QDS) scheme for gas flows. Each molecular species streams independently. All species are brought to local equilibrium at the end of each time step. The multi-species scheme is compared to a DSMC simulation on a test case of a Mach 20 flow of a xenon/helium mixture over a forward-facing step. Depletion of the heavier species in the bow shock and the near-wall layer is seen. The multi-species QDS code is then used to model the flow in a pulsed-pressure chemical vapour deposition reactor set up for carbon film deposition. The injected gas is a mixture of methane and hydrogen. The temporal development of the spatial distribution of methane over the substrate is tracked.
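A minimal sketch, under stated assumptions, of the per-cell equilibration step described above (the independent streaming step is omitted): species masses, momenta, and energies are merged into one common velocity and temperature by conservation. The species properties are rough monatomic values chosen for illustration, not the simulation conditions of the paper.

```python
# Illustration of "bring all species to local equilibrium at the end of the
# time step": per-cell species masses, momenta and energies are merged into a
# single common velocity and temperature.  Not the QDS flux scheme itself.
import numpy as np

def equilibrate(masses, momenta, energies, cv_specific):
    """masses [kg], momenta [kg m/s], total energies [J], cv_specific [J/(kg K)]
    are per-species arrays for one cell; returns (velocity, temperature)."""
    m_tot = masses.sum()
    u_common = momenta.sum() / m_tot                       # momentum conservation
    kinetic = 0.5 * m_tot * u_common**2
    thermal = energies.sum() - kinetic                     # energy conservation
    temperature = thermal / (masses * cv_specific).sum()   # shared temperature
    return u_common, temperature

# Example: a heavy species (xenon-like) and a light species (helium-like)
masses = np.array([1.0e-6, 1.0e-7])
velocities = np.array([300.0, 900.0])
temps = np.array([400.0, 1500.0])
cv = np.array([95.0, 3116.0])   # rough monatomic cv values, J/(kg K)
momenta = masses * velocities
energies = 0.5 * masses * velocities**2 + masses * cv * temps
print(equilibrate(masses, momenta, energies, cv))
```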
Cloned Viral Protein Vaccine for Foot-and-Mouth Disease: Responses in Cattle and Swine
NASA Astrophysics Data System (ADS)
Kleid, Dennis G.; Yansura, Daniel; Small, Barbara; Dowbenko, Donald; Moore, Douglas M.; Grubman, Marvin J.; McKercher, Peter D.; Morgan, Donald O.; Robertson, Betty H.; Bachrach, Howard L.
1981-12-01
A DNA sequence coding for the immunogenic capsid protein VP3 of foot-and-mouth disease virus A12, prepared from the virion RNA, was ligated to a plasmid designed to express a chimeric protein from the Escherichia coli tryptophan promoter-operator system. When Escherichia coli transformed with this plasmid was grown in tryptophan-depleted media, approximately 17 percent of the total cellular protein was found to be an insoluble and stable chimeric protein. The purified chimeric protein competed equally on a molar basis with VP3 for specific antibodies to foot-and-mouth disease virus. When inoculated into six cattle and two swine, this protein elicited high levels of neutralizing antibody and protection against challenge with foot-and-mouth disease virus.
Experimental validation of depletion calculations with VESTA 2.1.5 using JEFF-3.2
NASA Astrophysics Data System (ADS)
Haeck, Wim; Ichou, Raphaëlle
2017-09-01
The removal of decay heat is a significant safety concern in nuclear engineering for the operation of a nuclear reactor both in normal and accidental conditions and for intermediate and long-term waste storage facilities. The correct evaluation of the decay heat produced by an irradiated material requires first of all the calculation of the composition of the irradiated material by depletion codes such as VESTA 2.1, currently under development at IRSN in France. A set of PWR assembly decay heat measurements performed by the Swedish Central Interim Storage Facility (CLAB) located in Oskarshamn (Sweden) has been calculated using different nuclear data libraries: ENDF/B-VII.0, JEFF-3.1, JEFF-3.2 and JEFF-3.3T1. Using these nuclear data libraries, VESTA 2.1 calculates the assembly decay heat for almost all cases within 4% of the measured decay heat. On average, the ENDF/B-VII.0 calculated decay heat values appear to give a systematic underestimation of only 0.5%. When using the JEFF-3.1 library, this results in a systematic underestimation of about 2%. By switching to the JEFF-3.2 library, this systematic underestimation is slightly improved (to about 1.5%). The changes made in the JEFF-3.3T1 beta library appear to be overcorrecting, as the systematic underestimation is transformed into a systematic overestimation of about 1.5%.
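For readers who want the bias figures above in computable form, the following sketch evaluates the mean and spread of C/E − 1 for a set of calculated versus measured assembly decay heat values; the four data points are placeholders, not the CLAB measurements.

```python
# Hedged illustration of summarising calculated-to-experimental (C/E) decay
# heat deviations; the numbers are made-up placeholders, not the CLAB data.
import numpy as np

def ce_statistics(calculated_w, measured_w):
    """Return mean and sample standard deviation of C/E - 1 in percent."""
    dev = calculated_w / measured_w - 1.0
    return 100.0 * dev.mean(), 100.0 * dev.std(ddof=1)

measured   = np.array([350.0, 512.0, 404.0, 297.0])      # W, placeholder
calculated = np.array([348.5, 505.0, 398.0, 296.0])      # W, placeholder
mean_bias, spread = ce_statistics(calculated, measured)
print(f"mean bias {mean_bias:+.2f} %, spread {spread:.2f} %")
```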
NASA Astrophysics Data System (ADS)
Stoetter, T.; van Hardenbroek, M.; Rinta, P.; Schilder, J.; Schubert, C. J.; Heiri, O.
2013-12-01
Methane (CH4) is a major greenhouse gas and lakes are an important but poorly studied source of CH4 to the atmosphere. Lipid analysis was used before to identify and quantify CH4 oxidizing bacteria (MOB), giving insight into CH4 oxidation and production in lakes. However, few studies are available that examine how closely the distribution and the carbon isotopic signature (δ13C) of lipids are related to CH4 concentrations and fluxes in different lake ecosystems. In a multi-lake survey we quantified the relationship between lipids, mainly fatty acids (FAs), and CH4 concentrations or fluxes, with the aim of assessing whether FA analysis of lake sediment samples can provide information on past CH4 abundance and production in lakes. The study sites include small lakes in Sweden, Finland, the Netherlands, and Switzerland. Surface sediments collected in the deepest point of the lakes were examined using gas chromatography with flame ionization for determining FA concentrations, gas chromatography mass spectrometry (GC-MS) for identification of individual FAs, and isotope ratio mass spectrometry (IRMS) for determining compound specific δ13C values. Since CH4 is significantly more depleted in 13C than other carbon sources, δ13C is a good tracer for CH4 related processes. The analysis of the acid fraction in the sediments showed that mainly three FAs, identified as C16:1ω7, C16:1ω5 and C18:1ω7, were more depleted in 13C than the others, suggesting that they may originate from MOB. Comparison with literature sources indicated that these FAs are produced by MOB, however, not exclusively. The relative abundance of these depleted FAs showed clear relations to CH4 parameters. For example, increasing abundances were observed with increasing CH4 concentrations in the sediment or with increasing CH4 flux measured at the lake surface. An explanation for these relations would be an increase in MOB biomass with increasing CH4 availability, as they use CH4 as energy and carbon source, which would lead to increasing abundances of MOB produced FAs in the sediment. The presence or absence of oxygen above the sediments seems to have a strong effect on these relationships. In lakes with oxic bottom water, the abundance of depleted FAs shows a stronger rise with increasing CH4 concentrations than in lakes with anoxic bottom waters, suggesting that aerobic CH4 oxidizers are an important source of these depleted FAs. With increasing CH4 concentrations, for example just above the sediment, we find more depleted values in C16:1ω7 and C18:1ω7. This correlation is only strong if we exclude lakes with a strong terrestrial influence. Our preliminary analysis of FAs in surface sediment samples showed clear relations to CH4 parameters measured in the examined lake ecosystems suggesting that it may be possible to use FA analysis of lake sediment records as a proxy for CH4 availability in lakes. However, our results also show that oxygen conditions at the sediment-water interface and organic matter imported from the lake catchment can have a strong effect.
Performance Analysis of Hybrid ARQ Protocols in a Slotted Code Division Multiple-Access Network
1989-08-01
Convolutional Codes. In Proc. Int. Conf. Commun., 21.4.1-21.4.5, 1987. [27] J. Hagenauer. Rate Compatible Punctured Convolutional Codes. In Proc. Int. Conf...achieved by using a low rate (r = 0.5), high constraint length (e.g., 32) punctured convolutional code. Code puncturing provides for a variable rate code...investigated the use of convolutional codes in Type II Hybrid ARQ protocols. The error
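As an illustrative aside on the code puncturing mentioned in this excerpt, the sketch below encodes with a textbook rate-1/2 convolutional code (generators 7,5 octal, constraint length 3, not the high-constraint-length code of the report) and punctures the output to rate 2/3 with a fixed pattern.

```python
# Sketch of code puncturing: a rate-1/2 feedforward convolutional encoder
# (textbook generators 7,5 octal) whose serialized output is punctured so
# that 3 of every 4 coded bits survive, raising the rate from 1/2 to 2/3.
def conv_encode_r12(bits, g1=0b111, g2=0b101, k=3):
    """Rate-1/2 convolutional encoder; returns the coded bit list."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & ((1 << k) - 1)
        out.append(bin(state & g1).count("1") % 2)   # parity of tapped bits
        out.append(bin(state & g2).count("1") % 2)
    return out

def puncture(coded, pattern=(1, 1, 1, 0)):
    """Keep only coded bits where the repeating pattern has a 1."""
    return [c for i, c in enumerate(coded) if pattern[i % len(pattern)]]

msg = [1, 0, 1, 1, 0, 0]
coded = conv_encode_r12(msg)
print(len(msg), len(coded), len(puncture(coded)))   # 6, 12, 9 -> rate 6/9 = 2/3
```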
Development of the Off-line Analysis Code for GODDESS
NASA Astrophysics Data System (ADS)
Garland, Heather; Cizewski, Jolie; Lepailleur, Alex; Walters, David; Pain, Steve; Smith, Karl
2016-09-01
Determining (n, γ) cross sections on unstable nuclei is important for understanding the r-process that is theorized to occur in supernovae and neutron-star mergers. However, (n, γ) reactions are difficult to measure directly because of the short lifetime of the involved neutron-rich nuclei. A possible surrogate for the (n, γ) reaction is the (d,pγ) reaction; the measurement of these reactions in inverse kinematics is part of the scope of GODDESS - Gammasphere ORRUBA (Oak Ridge Rutgers University Barrel Array): Dual Detectors for Experimental Structure Studies. The development of an accurate and efficient off-line analysis code for GODDESS experiments is not only essential, but also provides a unique opportunity to create an analysis code designed specifically for transfer reaction experiments. The off-line analysis code has been developed to produce histograms from the binary data file to determine how to best sort events. Recent developments in the off-line analysis code will be presented as well as details on the energy and position calibrations for the ORRUBA detectors. This work is supported in part by the U.S. Department of Energy and National Science Foundation.
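A minimal sketch, assuming invented channel numbers and calibration constants, of two steps the abstract mentions: applying a linear energy calibration to raw detector channels and filling a histogram for event sorting. This is not the actual GODDESS/ORRUBA analysis code.

```python
# Hedged illustration of a linear energy calibration (E = gain*channel + offset)
# applied to fake ADC data, followed by histogramming for event sorting.
import numpy as np

def calibrate(raw_channels, gain_kev_per_ch, offset_kev):
    """Linear calibration: E [keV] = gain * channel + offset."""
    return gain_kev_per_ch * np.asarray(raw_channels, dtype=float) + offset_kev

raw = np.random.default_rng(0).integers(100, 4000, size=10_000)   # fake ADC data
energies = calibrate(raw, gain_kev_per_ch=1.85, offset_kev=-12.0)
counts, edges = np.histogram(energies, bins=512, range=(0.0, 8000.0))
peak = counts.argmax()
print("most populated bin centre:", 0.5 * (edges[peak] + edges[peak + 1]), "keV")
```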
Color Coding of Circuit Quantities in Introductory Circuit Analysis Instruction
ERIC Educational Resources Information Center
Reisslein, Jana; Johnson, Amy M.; Reisslein, Martin
2015-01-01
Learning the analysis of electrical circuits represented by circuit diagrams is often challenging for novice students. An open research question in electrical circuit analysis instruction is whether color coding of the mathematical symbols (variables) that denote electrical quantities can improve circuit analysis learning. The present study…
Maximum likelihood decoding analysis of accumulate-repeat-accumulate codes
NASA Technical Reports Server (NTRS)
Abbasfar, A.; Divsalar, D.; Yao, K.
2004-01-01
In this paper, the performance of repeat-accumulate codes with maximum likelihood (ML) decoding is analyzed and compared to random codes by very tight bounds. Some simple codes are shown to perform very close to the Shannon limit with maximum likelihood decoding.
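For context, a repeat-accumulate encoder is simple enough to sketch in a few lines: repeat each information bit q times, permute, and pass the result through a 1/(1+D) accumulator (a running XOR). The permutation below is an arbitrary random interleaver chosen for illustration.

```python
# Minimal repeat-accumulate (RA) encoder sketch: repeat, interleave, accumulate.
import numpy as np

def ra_encode(info_bits, q=3, rng=np.random.default_rng(1)):
    repeated = np.repeat(np.asarray(info_bits, dtype=int), q)
    interleaved = repeated[rng.permutation(repeated.size)]   # arbitrary interleaver
    accumulated = np.cumsum(interleaved) % 2                 # running XOR = 1/(1+D)
    return accumulated

codeword = ra_encode([1, 0, 1, 1], q=3)
print(codeword, "rate =", 4 / codeword.size)                 # rate 1/q = 1/3
```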
NASA Astrophysics Data System (ADS)
Tavakkoli, M.; Kharrat, R.; Masihi, M.; Ghazanfari, M. H.; Fadaei, S.
2012-12-01
Thermodynamic modeling is known as a promising tool for phase behavior modeling of asphaltene precipitation under different conditions such as pressure depletion and CO2 injection. In this work, a thermodynamic approach is used for modeling the phase behavior of asphaltene precipitation. The precipitated asphaltene phase is represented by an improved solid model, while the oil and gas phases are modeled with an equation of state. The PR-EOS was used to perform flash calculations. Then, the onset point and the amount of precipitated asphaltene were predicted. A computer code based on an improved solid model has been developed and used for predicting asphaltene precipitation data for one of the Iranian heavy crudes, under pressure depletion and CO2 injection conditions. A significant improvement has been observed in predicting the asphaltene precipitation data under gas injection conditions. Especially for the maximum value of asphaltene precipitation and for the trend of the curve after the peak point, good agreement was observed. For gas injection conditions, comparison of the thermodynamic micellization model and the improved solid model showed that the thermodynamic micellization model cannot predict the maximum of precipitation as well as the improved solid model does. The non-isothermal improved solid model has been used for predicting asphaltene precipitation data under pressure depletion conditions. The pressure depletion tests were done at different levels of temperature and pressure, and the parameters of the non-isothermal model were tuned using three onset pressures at three different temperatures for the considered crude. The results showed that the model is highly sensitive to the solid molar volume and to the interaction coefficient between the asphaltene component and the light hydrocarbon components. Using the non-isothermal improved solid model, the asphaltene phase envelope was developed. It was revealed that at high temperatures, an increase in the temperature results in a lower amount of asphaltene precipitation and also causes the lower and upper boundaries of the asphaltene phase envelope to converge. This work illustrates the successful application of a non-isothermal improved solid model for developing the asphaltene phase envelope of a heavy crude, which can be helpful for monitoring and controlling asphaltene precipitation through the wellbore and surface facilities during heavy oil production.
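As a hedged illustration of one ingredient of solid-model asphaltene calculations, the sketch below evaluates the Poynting-type pressure correction commonly used for the pure asphaltene solid fugacity, f_s(P) = f_s* exp(v_s (P − P*)/(R T)). The reference fugacity, reference pressure, and solid molar volume are placeholders, not the tuned parameters of this work, and the liquid-phase fugacity it would be equated with comes from the PR-EOS flash.

```python
# Hedged sketch of the pure-solid fugacity pressure correction often used in
# solid-model asphaltene calculations; all numerical values are placeholders.
import math

R = 8.314  # J/(mol K)

def solid_fugacity(f_ref_pa, v_solid_m3_per_mol, p_pa, p_ref_pa, t_k):
    """Poynting-type correction: f_s(P) = f_s* exp(v_s (P - P*) / (R T))."""
    return f_ref_pa * math.exp(v_solid_m3_per_mol * (p_pa - p_ref_pa) / (R * t_k))

# Illustrative values: reference state at 30 MPa, solid molar volume 0.9 L/mol
f_s = solid_fugacity(f_ref_pa=1.0e5, v_solid_m3_per_mol=9.0e-4,
                     p_pa=20.0e6, p_ref_pa=30.0e6, t_k=380.0)
print(f"solid fugacity at 20 MPa: {f_s:.3e} Pa")
```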
Exposure to nature counteracts aggression after depletion.
Wang, Yan; She, Yihan; Colarelli, Stephen M; Fang, Yuan; Meng, Hui; Chen, Qiuju; Zhang, Xin; Zhu, Hongwei
2018-01-01
Acts of self-control are more likely to fail after previous exertion of self-control, known as the ego depletion effect. Research has shown that depleted participants behave more aggressively than non-depleted participants, especially after being provoked. Although exposure to nature (e.g., a walk in the park) has been predicted to replenish resources common to executive functioning and self-control, the extent to which exposure to nature may counteract the depletion effect on aggression has yet to be determined. The present study investigated the effects of exposure to nature on aggression following depletion. Aggression was measured by the intensity of noise blasts participants delivered to an ostensible opponent in a competition reaction-time task. As predicted, an interaction occurred between depletion and environmental manipulations for provoked aggression. Specifically, depleted participants behaved more aggressively in response to provocation than non-depleted participants in the urban condition. However, provoked aggression did not differ between depleted and non-depleted participants in the natural condition. Moreover, within the depletion condition, participants in the natural condition had lower levels of provoked aggression than participants in the urban condition. This study suggests that a brief period of nature exposure may restore self-control and help depleted people regain control over aggressive urges. © 2017 Wiley Periodicals, Inc.
Pisanu, Salvatore; Biosa, Grazia; Carcangiu, Laura; Uzzau, Sergio; Pagnozzi, Daniela
2018-08-01
Seven commercial products for human serum depletion/enrichment were tested and compared by shotgun proteomics. The methods were based on four different capturing agents: antibodies (Qproteome Albumin/IgG Depletion kit, ProteoPrep Immunoaffinity Albumin and IgG Depletion Kit, Top 2 Abundant Protein Depletion Spin Columns, and Top 12 Abundant Protein Depletion Spin Columns), specific ligands (Albumin/IgG Removal), a mixture of antibodies and ligands (Albumin and IgG Depletion SpinTrap), and combinatorial peptide ligand libraries (ProteoMiner beads), respectively. All procedures, to a greater or lesser extent, allowed an increase in the number of identified proteins. ProteoMiner beads provided the highest number of proteins; Albumin and IgG Depletion SpinTrap and ProteoPrep Immunoaffinity Albumin and IgG Depletion Kit proved the most efficient in albumin removal; Top 2 and Top 12 Abundant Protein Depletion Spin Columns decreased the overall immunoglobulin levels more than the other procedures, whereas gamma immunoglobulins specifically were mostly removed by Albumin and IgG Depletion SpinTrap, ProteoPrep Immunoaffinity Albumin and IgG Depletion Kit, and Top 2 Abundant Protein Depletion Spin Columns. Albumin/IgG Removal, a resin bound to a mixture of protein A and Cibacron Blue, behaved less efficiently than the other products. Copyright © 2018 Elsevier B.V. All rights reserved.
Response surface method in geotechnical/structural analysis, phase 1
NASA Astrophysics Data System (ADS)
Wong, F. S.
1981-02-01
In the response surface approach, an approximating function is fit to a long-running computer code based on a limited number of code calculations. The approximating function, called the response surface, is then used to replace the code in subsequent repetitive computations required in a statistical analysis. The procedure of response surface development and the feasibility of the method are shown using a sample problem in slope stability which is based on data from centrifuge experiments of model soil slopes and involves five random soil parameters. It is shown that a response surface can be constructed based on as few as four code calculations and that the response surface is computationally extremely efficient compared to the code calculation. Potential applications of this research include probabilistic analysis of dynamic, complex, nonlinear soil/structure systems such as slope stability, liquefaction, and nuclear reactor safety.
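A minimal sketch of the workflow described above, with a two-variable stand-in for the long-running code (the report itself uses five random soil parameters): fit a low-order polynomial response surface to a handful of code runs, then run the Monte Carlo loop on the cheap surrogate instead of the code.

```python
# Response-surface sketch: fit a polynomial surrogate to a few "expensive"
# code runs, then do Monte Carlo on the surrogate.  The stand-in function,
# design points, and distributions are placeholders, not the report's problem.
import numpy as np

def expensive_code(x1, x2):
    """Stand-in for a long-running analysis code (e.g., a factor of safety)."""
    return 1.5 - 0.3 * x1 + 0.2 * x2 + 0.05 * x1 * x2

# A small set of code runs at chosen design points
X = np.array([[-1.0, -1.0], [-1.0, 1.0], [1.0, -1.0], [1.0, 1.0], [0.0, 0.0], [0.5, -0.5]])
y = np.array([expensive_code(a, b) for a, b in X])

# Least-squares fit of the surrogate y ~ b0 + b1*x1 + b2*x2 + b3*x1*x2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# Monte Carlo on the cheap response surface instead of the expensive code
rng = np.random.default_rng(0)
s = rng.normal(0.0, 0.5, size=(100_000, 2))
fs = coeffs[0] + coeffs[1] * s[:, 0] + coeffs[2] * s[:, 1] + coeffs[3] * s[:, 0] * s[:, 1]
print("estimated P(FS < 1):", (fs < 1.0).mean())
```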
Moderate Deviation Analysis for Classical Communication over Quantum Channels
NASA Astrophysics Data System (ADS)
Chubb, Christopher T.; Tan, Vincent Y. F.; Tomamichel, Marco
2017-11-01
We analyse families of codes for classical data transmission over quantum channels that have both a vanishing probability of error and a code rate approaching capacity as the code length increases. To characterise the fundamental tradeoff between decoding error, code rate and code length for such codes we introduce a quantum generalisation of the moderate deviation analysis proposed by Altuğ and Wagner as well as Polyanskiy and Verdú. We derive such a tradeoff for classical-quantum (as well as image-additive) channels in terms of the channel capacity and the channel dispersion, giving further evidence that the latter quantity characterises the necessary backoff from capacity when transmitting finite blocks of classical data. To derive these results we also study asymmetric binary quantum hypothesis testing in the moderate deviations regime. Due to the central importance of the latter task, we expect that our techniques will find further applications in the analysis of other quantum information processing tasks.
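As a hedged paraphrase (not a quotation from the paper), the moderate-deviation tradeoff for a channel with capacity C and dispersion V can be stated as follows, for any sequence a_n → 0 with n a_n² → ∞:

```latex
% Schematic statement of the moderate-deviation regime (paraphrase, not quoted
% from the paper): codes of rate C - a_n exist, and are optimal, with error
% probability decaying as
\frac{1}{n}\log M_n = C - a_n
\quad\Longleftrightarrow\quad
\log \varepsilon_n = -\frac{n\,a_n^{2}}{2V}\,\bigl(1 + o(1)\bigr).
```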
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arndt, S.A.
1997-07-01
The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current generation safety analysis codes are being modified to replace simplified codes that were specifically designed to meet the competing requirement for real-time applications. The next generation of thermo-hydraulic codes will need to have included in their specifications the specific requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers and emergency planners who currently use real-time simulation, and PRA practitioners who will increasingly use real-time simulation for evaluating PRA success criteria in near real-time to validate PRA results for specific configurations and plant system unavailabilities.
NASA Astrophysics Data System (ADS)
Woodson, J.
2017-12-01
Deplete is intended to demonstrate by analogy the harmful effect that greenhouse gases (GHGs) such as CO2 and H2O vapor are causing to the ozone layer. Increasing temperatures from human activities are contributing to the depletion of ozone.
Englert, Chris; Persaud, Brittany N.; Oudejans, Raôul R. D.; Bertrams, Alex
2015-01-01
We tested the assumption that ego depletion would affect the sprint start in a sample of N = 38 athletes without track and field experience in an experiment by applying a mixed between- (depletion vs. non-depletion) within- (T1: before manipulation of ego depletion vs. T2: after manipulation of ego depletion) subjects design. We assumed that ego depletion would increase the possibility for a false start, as regulating the impulse to initiate the sprinting movement too soon before the starting signal requires self-control. In line with our assumption, we found a significant interaction as there was only a significant increase in the number of false starts from T1 to T2 for the depletion group while this was not the case for the non-depletion group. We conclude that ego depletion has a detrimental influence on the sprint start in athletes without track and field experience. PMID:26347678
Colour cyclic code for Brillouin distributed sensors
NASA Astrophysics Data System (ADS)
Le Floch, Sébastien; Sauser, Florian; Llera, Miguel; Rochat, Etienne
2015-09-01
For the first time, a colour cyclic coding (CCC) is theoretically and experimentally demonstrated for Brillouin optical time-domain analysis (BOTDA) distributed sensors. Compared to traditional intensity-modulated cyclic codes, the code presents an additional gain of √2 while keeping the same number of sequences as for a colour coding. A comparison with a standard BOTDA sensor is realized and validates the theoretical coding gain.
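For orientation, the snippet below evaluates the SNR gain usually quoted for intensity-modulated cyclic (simplex) coding, (L+1)/(2√L), together with the additional √2 factor that the abstract attributes to the colour cyclic code; treat this as a hedged reading of the abstract rather than the paper's exact derivation.

```python
# Hedged illustration relating code length to SNR gain for cyclic (simplex)
# coding in Brillouin sensing; the extra sqrt(2) is the colour-cyclic factor
# stated in the abstract, not derived here.
import math

def cyclic_code_gain(code_length):
    """Conventional SNR gain of intensity-modulated cyclic (simplex) coding."""
    return (code_length + 1) / (2.0 * math.sqrt(code_length))

for L in (31, 127, 511):
    g = cyclic_code_gain(L)
    print(f"L={L:4d}: cyclic gain {g:5.2f}, with colour-cyclic factor {g * math.sqrt(2):5.2f}")
```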
NASA Astrophysics Data System (ADS)
Alvarez-Aviles, L.; Simpson, W. R.; Douglas, T. A.; Sturm, M.; Perovich, D. K.
2006-12-01
Frost flowers are believed to be responsible for most of the salt aerosol and possibly the bromine in the gas phase during springtime in Polar Regions. Frost flowers are vapor-deposited ice crystals that form on newly forming sea ice and wick brine from the sea-ice surface, resulting in high salinities. We propose a conceptual model of frost flower growth and chemical fractionation and use chemical analysis to support this model. We also consider how the chemical composition of frost flowers can tell us about the role of frost flowers in bromine activation and aerosol production. Our conceptual model is centered on two important events that occur when sea ice grows and the ice surface temperature gets colder. First, brine on the sea-ice surface is drawn up the frost flower by capillary forces, which explains the high salinity values found. Second, salt hydrates begin to precipitate at certain temperatures. These precipitation reactions modify the chemical composition of the frost flowers and residual brine, and are the main topic of this research. We found variability in, and generally a depletion of, sulfate compared to seawater composition in most of the mature frost flowers. This result is in agreement with the literature, which proposes that the depletion in sulfate occurs because mirabilite (Na2SO4 · 10H2O) precipitates before the brine is wicked. The observation of some slightly sulfate-enhanced samples in addition to depleted samples indicates that the brine/frost flower environment is the location where mirabilite precipitation and separation from residual brine occur. Frost flower bromide enhancement factors are all, within analytical limits, identical to that of sea water, although nearby snow is depleted in bromide. Because of the high salt concentrations in frost flowers, significant bromine activation could occur from frost flowers without being detected by this measurement. However, if all bromide activation occurred on frost flowers, and frost flowers are not depleted in bromide, no snow would be found that was depleted in bromide. Therefore, the observation of snow that is depleted in bromide shows there must be some activation of bromide subsequent to frost flower formation.
Bland, D; Rona, R; Coggon, D; Anderson, J; Greenberg, N; Hull, L; Wessely, S
2007-01-01
Objectives To assess the distribution and risk factors of depleted uranium uptake in military personnel who had taken part in the invasion of Iraq in 2003. Methods Sector field inductively coupled plasma-mass spectrometry (SF-ICP-MS) was used to determine the uranium concentration and 238U/235U isotopic ratio in spot urine samples. The authors collected urine samples from four groups identified a priori as having different potential for exposure to depleted uranium. These groups were: combat personnel (n = 199); non-combat personnel (n = 96); medical personnel (n = 22); and “clean-up” personnel (n = 24) who had been involved in the maintenance, repair or clearance of potentially contaminated vehicles in Iraq. A short questionnaire was used to ascertain individual experience of circumstances in which depleted uranium exposure might have occurred. Results There was no statistically significant difference in the 238U/235U ratio between groups. Mean ratios by group varied from 138.0 (95% CI 137.3 to 138.7) for clean-up personnel to 138.2 (95% CI 138.0 to 138.5) for combat personnel, and were close to the ratio of 137.9 for natural uranium. The two highest individual ratios (146.9 and 147.7) were retested using more accurate, multiple collector inductively coupled plasma-mass spectrometry (MC-ICP-MS) and found to be within measurement of error of that for natural uranium. There were no significant differences in isotope ratio between participants according to self-reported circumstances of potential depleted uranium exposure. Conclusions Based on measurements using a SF-ICP-MS apparatus, this study provides reassurance following concern for potential widespread depleted uranium uptake in the UK military. The rare occurrence of elevated ratios may reflect the limits of accuracy of the SF-ICP-MS apparatus and not a real increase from the natural proportions of the isotopes. Any uptake of depleted uranium among participants in this study sample would be very unlikely to have any implications for health. PMID:17609224
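As an illustrative aside (not a calculation from the paper), the snippet below shows why the 238U/235U atom ratio separates natural uranium (about 0.72 at% 235U, ratio ≈ 137.9) from typical depleted uranium (roughly 0.2 at% 235U), ignoring the minor 234U contribution.

```python
# Illustrative 238U/235U atom ratio calculation; the DU enrichment value is a
# typical figure, not a number taken from the study.
def u_ratio(atom_percent_235):
    """238U/235U atom ratio, ignoring the tiny 234U contribution."""
    return (100.0 - atom_percent_235) / atom_percent_235

print("natural uranium :", round(u_ratio(0.72), 1))   # ~137.9
print("depleted uranium:", round(u_ratio(0.20), 1))   # ~499
```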
Design Analysis of SNS Target Station Biological Shielding Monolith with Proton Power Uprate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bekar, Kursat B.; Ibrahim, Ahmad M.
2017-05-01
This report documents the analysis of the dose rate in the experiment area outside the Spallation Neutron Source (SNS) target station shielding monolith with proton beam energy of 1.3 GeV. The analysis implemented a coupled three dimensional (3D)/two dimensional (2D) approach that used both the Monte Carlo N-Particle Extended (MCNPX) 3D Monte Carlo code and the Discrete Ordinates Transport (DORT) two dimensional deterministic code. The analysis with proton beam energy of 1.3 GeV showed that the dose rate in continuously occupied areas on the lateral surface outside the SNS target station shielding monolith is less than 0.25 mrem/h, which complies with the SNS facility design objective. However, the methods and codes used in this analysis are out of date and unsupported, and the 2D approximation of the target shielding monolith does not accurately represent the geometry. We recommend that this analysis is updated with modern codes and libraries such as ADVANTG or SHIFT. These codes have demonstrated very high efficiency in performing full 3D radiation shielding analyses of similar and even more difficult problems.
Airfoil Vibration Dampers program
NASA Technical Reports Server (NTRS)
Cook, Robert M.
1991-01-01
The Airfoil Vibration Damper program has consisted of an analysis phase and a testing phase. During the analysis phase, a state-of-the-art computer code was developed, which can be used to guide designers in the placement and sizing of friction dampers. The use of this computer code was demonstrated by performing representative analyses on turbine blades from the High Pressure Oxidizer Turbopump (HPOTP) and High Pressure Fuel Turbopump (HPFTP) of the Space Shuttle Main Engine (SSME). The testing phase of the program consisted of performing friction damping tests on two different cantilever beams. Data from these tests provided an empirical check on the accuracy of the computer code developed in the analysis phase. Results of the analysis and testing showed that the computer code can accurately predict the performance of friction dampers. In addition, a valuable set of friction damping data was generated, which can be used to aid in the design of friction dampers, as well as provide benchmark test cases for future code developers.
Oh, Chang Seok; Lee, Soong Deok; Kim, Yi-Suk; Shin, Dong Hoon
2015-01-01
A previous study showed that East Asian mtDNA haplogroups, especially those of Koreans, could be successfully assigned by the coupled use of analyses of coding region SNP markers and control region mutation motifs. In this study, we examined whether the same triple multiplex analysis of coding region SNPs could also be applied to ancient samples from East Asia as a complement to sequence analysis of the mtDNA control region. From the study of Joseon skeleton samples, we found that the mtDNA haplogroup determined by coding region SNP markers falls within the same haplogroup that sequence analysis of the control region assigns. Considering that ancient samples in previous studies have produced a considerable number of errors in control region mtDNA sequencing, coding region SNP analysis can be used as a good complement to conventional haplogroup determination, especially for archaeological human bone samples buried underground over long periods. PMID:26345190
Visual Computing Environment Workshop
NASA Technical Reports Server (NTRS)
Lawrence, Charles (Compiler)
1998-01-01
The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.
A study on the cytotoxicity of carbon-based materials
Saha, Dipendu; Heldt, Caryn L.; Gencoglu, Maria F.; ...
2016-05-25
With the aim of understanding the origin and key contributing factors of carbon-induced cytotoxicity, we have studied five different carbon samples with diverse surface area, pore width, shape and size, conductivity and surface functionality. All the carbon materials were characterized with surface area and pore size distribution, x-ray photoelectron spectroscopy (XPS) and electron microscopic imaging. We performed a cytotoxicity study in Caco-2 cells by colorimetric assay, oxidative stress analysis by reactive oxygen species (ROS) detection, cellular metabolic activity measurement by adenosine triphosphate (ATP) depletion and visualization of cellular internalization by TEM imaging. The carbon materials demonstrated a varying degree of cytotoxicity in contact with Caco-2 cells. The lowest cell survival rate was observed for nanographene, which possessed the smallest size amongst all the carbon samples under study. None of the carbons induced oxidative stress to the cells as indicated by the ROS generation results. Cellular metabolic activity study revealed that the carbon materials caused ATP depletion in cells and nanographene caused the highest depletion. Visual observation by TEM imaging indicated the cellular internalization of nanographene. This study confirmed that size is the key cause of carbon-induced cytotoxicity and it is probably caused by the ATP depletion within the cell.
A novel MALDI–TOF based methodology for genotyping single nucleotide polymorphisms
Blondal, Thorarinn; Waage, Benedikt G.; Smarason, Sigurdur V.; Jonsson, Frosti; Fjalldal, Sigridur B.; Stefansson, Kari; Gulcher, Jeffery; Smith, Albert V.
2003-01-01
A new MALDI–TOF based detection assay was developed for analysis of single nucleotide polymorphisms (SNPs). It is a significant modification of the classic three-step minisequencing method, which includes a polymerase chain reaction (PCR), removal of excess nucleotides and primers, followed by primer extension in the presence of dideoxynucleotides using a modified thermostable DNA polymerase. The key feature of this novel assay is reliance upon deoxynucleotide mixes lacking one of the nucleotides at the polymorphic position. During primer extension in the presence of depleted nucleotide mixes, standard thermostable DNA polymerases dissociate from the template at positions requiring a depleted nucleotide; this principle was harnessed to create a genotyping assay. The assay design requires a primer-extension primer having its 3′-end one nucleotide upstream from the interrogated site. The assay further utilizes the same DNA polymerase in both PCR and the primer extension step. This not only simplifies the assay but also greatly reduces the cost per genotype compared to minisequencing methodology. We demonstrate accurate genotyping using this methodology for two SNPs run in both singleplex and duplex reactions. We term this assay nucleotide depletion genotyping (NUDGE). Nucleotide depletion genotyping could be extended to other genotyping assays based on primer extension such as detection by gel or capillary electrophoresis. PMID:14654708
Theoretical analysis of nBn infrared photodetectors
NASA Astrophysics Data System (ADS)
Ting, David Z.; Soibel, Alexander; Khoshakhlagh, Arezou; Gunapala, Sarath D.
2017-09-01
The depletion and surface leakage dark current suppression properties of unipolar barrier device architectures such as the nBn have been highly beneficial for III-V semiconductor-based infrared detectors. Using a one-dimensional drift-diffusion model, we theoretically examine the effects of contact doping, minority carrier lifetime, and absorber doping on the dark current characteristics of nBn detectors to explore some basic aspects of their operation. We found that in a properly designed nBn detector with highly doped excluding contacts, the minority carriers are extracted to nonequilibrium levels under reverse bias in the same manner as in the high operating temperature (HOT) detector structure. Longer absorber Shockley-Read-Hall (SRH) lifetimes result in lower diffusion and depletion dark currents. Higher absorber doping can also lead to lower diffusion and depletion dark currents, but the benefit should be weighed against the possibility of reduced diffusion length due to shortened SRH lifetime. We also briefly examined nBn structures with unintended minority carrier blocking barriers due to excessive n-doping in the unipolar electron barrier, or due to a positive valence band offset between the barrier and the absorber. Both types of hole blocking structures lead to higher turn-on bias, although barrier n-doping could help suppress depletion dark current.
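To make the lifetime and doping trends above concrete, the sketch below uses textbook estimates (not the paper's drift-diffusion model): diffusion dark current scaling as q n_i²/N_d · √(D/τ) and depletion-region generation current as q n_i W/(2τ). All material numbers are illustrative placeholders.

```python
# Textbook-style dark current estimates showing why longer SRH lifetime and
# higher absorber doping reduce the dark currents; all values are placeholders.
import math

Q = 1.602e-19  # C

def diffusion_current(n_i_cm3, n_d_cm3, diffusivity_cm2_s, tau_s):
    """J_diff = q n_i^2 / N_d * sqrt(D / tau)  [A/cm^2]."""
    return Q * n_i_cm3**2 / n_d_cm3 * math.sqrt(diffusivity_cm2_s / tau_s)

def generation_current(n_i_cm3, depletion_width_cm, tau_s):
    """J_gen = q n_i W / (2 tau)  [A/cm^2]."""
    return Q * n_i_cm3 * depletion_width_cm / (2.0 * tau_s)

for tau in (1e-7, 1e-6, 1e-5):  # longer SRH lifetime -> lower dark currents
    jd = diffusion_current(n_i_cm3=1e13, n_d_cm3=1e16, diffusivity_cm2_s=10.0, tau_s=tau)
    jg = generation_current(n_i_cm3=1e13, depletion_width_cm=1e-4, tau_s=tau)
    print(f"tau={tau:.0e} s  J_diff={jd:.2e} A/cm2  J_gen={jg:.2e} A/cm2")
```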
NASA Astrophysics Data System (ADS)
Mok, Angus; Wilson, C. D.; Golding, J.; Warren, B. E.; Israel, F. P.; Serjeant, S.; Knapen, J. H.; Sánchez-Gallego, J. R.; Barmby, P.; Bendo, G. J.; Rosolowsky, E.; van der Werf, P.
2016-03-01
We present a study of the molecular gas properties in a sample of 98 H I flux-selected spiral galaxies within ~25 Mpc, using the CO J = 3 - 2 line observed with the James Clerk Maxwell Telescope. We use the technique of survival analysis to incorporate galaxies with CO upper limits into our results. Comparing the group and Virgo samples, we find a larger mean H2 mass in the Virgo galaxies, despite their lower mean H I mass. This leads to a significantly higher H2 to H I ratio for Virgo galaxies. Combining our data with complementary Hα star formation rate measurements, we find that Virgo galaxies have longer molecular gas depletion times than group galaxies, due to their higher H2 masses and lower star formation rates. We suggest that the longer depletion times may be a result of heating processes in the cluster environment or differences in the turbulent pressure. From the full sample, we find that the molecular gas depletion time correlates positively with stellar mass, indicative of differences in the star formation process between low- and high-mass galaxies, and negatively with the specific star formation rate.
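As a minimal illustration of the quantities being compared, the sketch below computes the H2-to-H I mass ratio and the molecular gas depletion time under the standard definition t_dep = M_H2 / SFR; the masses and star formation rates are invented, not the survey data.

```python
# Toy comparison of group-like and Virgo-like galaxies (hypothetical numbers only).

def depletion_time_gyr(m_h2_msun, sfr_msun_per_yr):
    """Molecular gas depletion time in Gyr: t_dep = M_H2 / SFR."""
    return m_h2_msun / sfr_msun_per_yr / 1e9

galaxies = {
    "group-like": {"M_H2": 5e8,   "M_HI": 3e9,   "SFR": 0.8},  # Msun, Msun, Msun/yr
    "Virgo-like": {"M_H2": 1.5e9, "M_HI": 1.5e9, "SFR": 0.6},
}

for name, g in galaxies.items():
    ratio = g["M_H2"] / g["M_HI"]
    t_dep = depletion_time_gyr(g["M_H2"], g["SFR"])
    print(f"{name}: H2/HI = {ratio:.2f}, t_dep = {t_dep:.1f} Gyr")
```

Higher H2 mass at lower SFR, as in the Virgo-like case, directly produces the longer depletion times reported.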
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chung, H.M.; Ruther, W.E.; Sanecki, J.E.
1991-08-01
High- and commercial-purity heats of Type 304 stainless steel, obtained from neutron absorber tubes after irradiation to fluence levels of up to 2 × 10²¹ n·cm⁻² (E > 1 MeV) in two boiling water reactors, were examined by Auger electron spectroscopy to characterize irradiation-induced grain-boundary segregation and depletion of alloying and impurity elements. Segregation of Si, P, Ni, and an unidentified element or compound giving rise to an Auger energy peak at 59 eV was observed in the commercial-purity heat. Such segregation was negligible in the high-purity material, except for Ni. No evidence of S segregation was observed in either material. Cr depletion was more pronounced in the high-purity material than in the commercial-purity material. These observations suggest a synergism between the significant level of impurities and Cr depletion in the commercial-purity heat. In the absence of such synergism, Cr depletion appears more pronounced in the high-purity heat. Initial results of constant-extension-rate tests conducted on the two heats in air and in simulated BWR water were correlated with the results of the Auger electron spectroscopy analysis. 15 refs., 10 figs.
Impact of macrophage deficiency and depletion on continuous glucose monitoring in vivo
Klueh, Ulrike; Qiao, Yi; Frailey, Jackman T.; Kreutzer, Donald L.
2014-01-01
Although it is assumed that macrophages (MQ) have a major negative impact on continuous glucose monitoring (CGM), surprisingly there are no data in the literature that directly support or refute a role for MQ or related foreign body giant cells in the bio-fouling of glucose sensors in vivo. We therefore hypothesized that MQ are key in controlling glucose sensor performance and CGM in vivo, and that MQ deficiency or depletion would enhance CGM. To test this hypothesis, we determined the presence and distribution of MQ at the sensor-tissue interface over a 28-day period using the F4/80 antibody and immunohistochemical analysis. We also evaluated the impact of spontaneous MQ deficiency (op/op mice) and induced transgenic MQ depletion (diphtheria toxin receptor (DTR) mice) on sensor function and CGM using our murine CGM system. These studies demonstrated (1) a time-dependent increase in MQ accumulation (F4/80-positive cells) at the sensor-tissue interface, and (2) improved sensor performance (MARD) in MQ-deficient and MQ-depleted C57BL/6 mice compared with normal C57BL/6 mice. These results directly demonstrate the importance of MQ in sensor function and CGM in vivo. PMID:24331705
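The performance metric quoted here, MARD, is the mean absolute relative difference between sensor and reference glucose readings. A minimal sketch with invented readings (not the study's data):

```python
import numpy as np

def mard(sensor, reference):
    """Mean absolute relative difference between sensor and reference glucose, in percent."""
    sensor = np.asarray(sensor, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return 100.0 * np.mean(np.abs(sensor - reference) / reference)

reference = [90, 120, 150, 180, 210]          # mg/dL, reference glucose (hypothetical)
sensor_normal = [75, 105, 170, 200, 235]      # hypothetical sensor readings, normal mice
sensor_depleted = [88, 118, 155, 184, 215]    # hypothetical sensor readings, MQ-depleted mice

print(f"MARD, normal mice:      {mard(sensor_normal, reference):.1f}%")
print(f"MARD, MQ-depleted mice: {mard(sensor_depleted, reference):.1f}%")
```

A lower MARD corresponds to better sensor accuracy, which is the sense in which the depleted and deficient animals showed "improved sensor performance."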
Hampshire, Tobias; Soneji, Shamit; Bacon, Joanna; James, Brian W.; Hinds, Jason; Laing, Ken; Stabler, Richard A.; Marsh, Philip D.; Butcher, Philip D.
2011-01-01
The majority of individuals infected with TB develop a latent infection, in which organisms survive within the body while evading the host immune system. Such persistent bacilli are capable of surviving several months of combination antibiotic treatment. Evidence suggests that stationary-phase bacteria adapt to increase their tolerance to environmental stresses. We have developed a unique in vitro model of dormancy based on the characterization of a single, large-volume fermenter culture of M. tuberculosis as it adapts to stationary phase. Cells are maintained under controlled and defined aerobic conditions (50% dissolved oxygen tension), using probes that measure dissolved oxygen tension, temperature, and pH. Microarray analysis was used in conjunction with viability and nutrient depletion assays to dissect differential gene expression. Following exponential-phase growth, the gradual depletion of glucose/glycerol resulted in a small population of survivors that were characterized for periods in excess of 100 days. Bacilli adapting to nutrient depletion displayed characteristics associated with persistence in vivo, including entry into a non-replicative state and the up-regulation of genes involved in the β-oxidation of fatty acids and in virulence. A reduced population of non-replicating bacilli went on to adapt sufficiently to re-initiate cell division. PMID:15207492
ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.
2016-04-01
Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
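The bookkeeping problem the Automator addresses can be pictured as turning a table of per-assembly data into one input file per assembly. The sketch below is schematic only: the CSV columns and the input template are invented placeholders, not actual ORIGAMI input syntax.

```python
import csv
from pathlib import Path

# Schematic per-assembly input generator. The field names and TEMPLATE text are
# hypothetical placeholders used purely to illustrate the scripting burden; they are
# NOT real ORIGAMI keywords.
TEMPLATE = """% placeholder input for assembly {assembly_id} (not real ORIGAMI keywords)
enrichment = {enrichment}
heavy_metal_mass = {hm_mass_mtu}
power_history = {power_history}
"""

def write_inputs(csv_path, out_dir):
    """Write one input file per CSV row (one row = one assembly)."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            (out / f"{row['assembly_id']}.inp").write_text(TEMPLATE.format(**row))

# write_inputs("assemblies.csv", "origami_inputs")  # expects columns matching TEMPLATE
```

Automating this step, together with format conversion of the results, is what the GUI-based ORIGAMI Automator provides without hand-written scripts.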
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vesselinov, Velimir; O'Malley, Daniel; Lin, Youzuo
2016-07-01
Mads.jl (Model analysis and decision support in Julia) is a code that streamlines the process of using data and models for analysis and decision support. It is based on another open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). Mads.jl can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. It enables a number of data- and model-based analyses, including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. The code can also use a series of alternative adaptive computational techniques for Bayesian sampling, Monte Carlo analysis, and Bayesian Information-Gap Decision Theory. The code is implemented in the Julia programming language and has high-performance (parallel) execution and memory management capabilities. It uses a series of third-party modules developed by others, and code development will also include contributions to the existing third-party Julia modules; these contributions will be important for the efficient implementation of the algorithms used by Mads.jl. The code also uses a series of LANL-developed modules written by Dan O'Malley; these modules will also be part of the Mads.jl release. Mads.jl will be released under the GPL v3 license and distributed as Git repositories at gitlab.com and github.com. The Mads.jl manual and documentation will be posted at madsjulia.lanl.gov.
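As a generic illustration of the model-calibration step such frameworks automate, the sketch below fits a hypothetical forward model to synthetic data with a least-squares solver. It is written in Python for consistency with the other examples in this document and does not use the Mads.jl API; the exponential "transport" model and the data are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def forward_model(params, t):
    """Hypothetical forward model: exponential decay with amplitude and rate parameters."""
    amplitude, rate = params
    return amplitude * np.exp(-rate * t)

# Synthetic observations generated from known parameters plus noise.
t_obs = np.linspace(0.0, 10.0, 20)
rng = np.random.default_rng(0)
observed = forward_model([2.0, 0.3], t_obs) + 0.02 * rng.standard_normal(t_obs.size)

def residuals(params):
    return forward_model(params, t_obs) - observed

fit = least_squares(residuals, x0=[1.0, 1.0])
print("calibrated parameters:", fit.x)   # should recover roughly [2.0, 0.3]
```

Calibration of this kind is typically the entry point for the sensitivity, uncertainty, and decision analyses listed above.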
Development of Web Interfaces for Analysis Codes
NASA Astrophysics Data System (ADS)
Emoto, M.; Watanabe, T.; Funaba, H.; Murakami, S.; Nagayama, Y.; Kawahata, K.
Several codes have been developed for plasma physics analysis. However, most of them are designed to run on supercomputers, so users who typically work on personal computers (PCs) find them difficult to use. To facilitate the widespread use of these codes, a user-friendly interface is required, and the authors propose Web interfaces for this purpose. To demonstrate the usefulness of this approach, the authors developed Web interfaces for two analysis codes. The first is for FIT, developed by Murakami, which is used to analyze NBI heat deposition and related quantities. Because it requires electron density profiles, electron temperatures, and ion temperatures as polynomial expressions, those unfamiliar with the experiments, especially visitors from other institutes, find it difficult to use. The second is for visualizing the lines of force in the LHD (Large Helical Device), developed by Watanabe; it is used to analyze the interference of the lines of force caused by the various structures installed in the vacuum vessel of the LHD. This code runs on PCs but requires that the necessary parameters be edited manually. Using these Web interfaces, users can execute both codes interactively.
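The general pattern, exposing a command-line analysis code through a web endpoint so parameters can be supplied from a browser instead of hand-edited input files, can be sketched as follows. This is an assumption-laden illustration, not the interface described in the paper; "./analysis_code" and its flags are hypothetical.

```python
from flask import Flask, request, jsonify
import subprocess

app = Flask(__name__)

@app.route("/run", methods=["POST"])
def run_analysis():
    # Accept analysis parameters as JSON, e.g. {"density_profile": "1.0,-0.2,0.01"}.
    params = request.get_json() or {}
    args = [f"--{key}={value}" for key, value in params.items()]
    # "./analysis_code" is a placeholder for the wrapped analysis executable.
    result = subprocess.run(["./analysis_code", *args],
                            capture_output=True, text=True, timeout=300)
    return jsonify(stdout=result.stdout, returncode=result.returncode)

if __name__ == "__main__":
    app.run(port=8080)
```

In practice such a wrapper would also validate inputs and queue long-running jobs, but the core idea is simply mapping web requests onto the code's existing parameters.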
The Influence of Chronic Ego Depletion on Goal Adherence: An Experience Sampling Study.
Wang, Ligang; Tao, Ting; Fan, Chunlei; Gao, Wenbin; Wei, Chuguang
2015-01-01
Although ego depletion effects have been widely observed in experiments in which participants perform consecutive self-control tasks, the process of ego depletion remains poorly understood. Using the strength model of self-control, we hypothesized that chronic ego depletion adversely affects goal adherence and that mental effort and motivation are involved in the process of ego depletion. In this study, 203 students reported their daily performance, mental effort, and motivation with respect to goal-directed behavior over a 3-week period. People with high levels of chronic ego depletion were less successful in goal adherence than those with less chronic ego depletion. Although daily effort devoted to goal adherence increased with chronic ego depletion, motivation to adhere to goals was not affected. Participants with high levels of chronic ego depletion showed a stronger positive association between mental effort and performance, but chronic ego depletion did not play a regulatory role in the effect of motivation on performance. Chronic ego depletion increased the likelihood of behavior regulation failure, suggesting that it is difficult for people in an ego-depleted state to adhere to goals. We integrate our results with the findings of previous studies and discuss possible theoretical implications.