Science.gov

Sample records for distributed source term

  1. Geometric discretization of the multidimensional Dirac delta distribution - Application to the Poisson equation with singular source terms

    NASA Astrophysics Data System (ADS)

    Egan, Raphael; Gibou, Frédéric

    2017-10-01

    We present a discretization method for the multidimensional Dirac distribution. We show its applicability in the context of integration problems, and for discretizing Dirac-distributed source terms in Poisson equations with constant or variable diffusion coefficients. The discretization is cell-based and can thus be applied in a straightforward fashion to Quadtree/Octree grids. The method produces second-order accurate results for integration. Superlinear convergence is observed when it is used to model Dirac-distributed source terms in Poisson equations: the observed order of convergence is 2 or slightly smaller. The method is consistent with the discretization of Dirac delta distribution for codimension one surfaces presented in [1,2]. We present Quadtree/Octree construction procedures to preserve convergence and present various numerical examples, including multi-scale problems that are intractable with uniform grids.
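
As a minimal illustration of the idea (in 1-D on a uniform grid, rather than the paper's multidimensional Quadtree/Octree setting), the Dirac source can be discretized as a hat function whose discrete integral is one and fed to a standard Poisson solve; everything below is an illustrative sketch, not the authors' method:

```python
import numpy as np

# Solve u'' = delta(x - a) on (0, 1), u(0) = u(1) = 0, with the delta
# spread as a hat function over the two grid cells that bracket a.
def solve_poisson_point_source(n, a):
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)        # interior nodes
    rhs = np.zeros(n)
    i = int(a / h) - 1                  # interior node just left of a
    w = (a - x[i]) / h                  # linear interpolation weight
    rhs[i] = (1 - w) / h                # discrete delta with unit integral
    rhs[i + 1] = w / h
    # Standard second-order Laplacian: (u[i-1] - 2 u[i] + u[i+1]) / h^2
    A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1)) / h**2
    return x, np.linalg.solve(A, rhs)

def exact(x, a):
    # Green's function of u'' = delta(x - a) with Dirichlet conditions
    return np.where(x <= a, x * (a - 1.0), a * (x - 1.0))

x, u = solve_poisson_point_source(99, a=0.305)
err = np.max(np.abs(u - exact(x, 0.305)))
print(err)   # maximum nodal error; tiny in this 1-D setting
```

In 1-D this cell-weighted delta reproduces the kinked Green's function essentially exactly at the nodes; the paper's contribution is making the analogous cell-based construction accurate and convergent in higher dimensions on adaptive grids.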

  2. Spatial distribution of HTO activity in unsaturated soil depth in the vicinity of long-term release source

    SciTech Connect

    Golubev, A.; Golubeva, V.; Mavrin, S.

    2015-03-15

Previous studies reported a correlation between the HTO activity distribution in the unsaturated soil layer and long-term atmospheric releases of HTO in the vicinity of the Savannah River Site. The Tritium Working Group of the BIOMASS Programme performed a model-model intercomparison study of HTO transport from the atmosphere to unsaturated soil and evaluated the HTO activity distribution in the unsaturated soil layer in the vicinity of permanent atmospheric sources. The Tritium Working Group also reported such a correlation, but concluded that experimental data sets are needed both to confirm it and to validate the corresponding computer models. (authors)

  3. Size distribution, directional source contributions and pollution status of PM from Chengdu, China during a long-term sampling campaign.

    PubMed

    Shi, Guo-Liang; Tian, Ying-Ze; Ma, Tong; Song, Dan-Lin; Zhou, Lai-Dong; Han, Bo; Feng, Yin-Chang; Russell, Armistead G

    2017-06-01

Long-term and synchronous monitoring of PM10 and PM2.5 was conducted in Chengdu, China from 2007 to 2013. The levels, variations, compositions and size distributions were investigated. The sources were quantified by two-way and three-way receptor models (PMF2, ME2-2way and ME2-3way). Consistent results were found: the primary source categories contributed 63.4% (PMF2), 64.8% (ME2-2way) and 66.8% (ME2-3way) to PM10, and contributed 60.9% (PMF2), 65.5% (ME2-2way) and 61.0% (ME2-3way) to PM2.5. Secondary sources contributed 31.8% (PMF2), 32.9% (ME2-2way) and 31.7% (ME2-3way) to PM10, and 35.0% (PMF2), 33.8% (ME2-2way) and 36.0% (ME2-3way) to PM2.5. The size distribution of source categories was better estimated by the ME2-3way method: the three-way model can simultaneously consider chemical species, temporal variability and PM sizes, while a two-way model computes datasets of different sizes independently. A method called source directional apportionment (SDA) was employed to quantify the contributions from various directions for each source category. Crustal dust from east-north-east (ENE) contributed the most to both PM10 (12.7%) and PM2.5 (9.7%) in Chengdu, followed by crustal dust from south-east (SE) for PM10 (9.8%) and secondary nitrate & secondary organic carbon from ENE for PM2.5 (9.6%). Source contributions from different directions are associated with meteorological conditions, source locations and emission patterns during the sampling period. These findings and methods provide useful tools to better understand PM pollution status and to develop effective pollution control strategies.
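
The receptor models cited above (PMF2, ME2-2way, ME2-3way) are specialized implementations, but the core idea, factorizing a non-negative concentration matrix into source contributions and source profiles, can be sketched in a few lines. This toy version uses plain Lee-Seung multiplicative updates on synthetic data and omits the per-entry uncertainty weighting that defines real PMF:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_species, n_factors = 40, 8, 3
G_true = rng.random((n_samples, n_factors))   # source contributions
F_true = rng.random((n_factors, n_species))   # source profiles
X = G_true @ F_true                           # synthetic "measured" data

G = rng.random((n_samples, n_factors)) + 0.1
F = rng.random((n_factors, n_species)) + 0.1
for _ in range(2000):                         # multiplicative updates keep
    G *= (X @ F.T) / (G @ F @ F.T + 1e-12)    # both factors non-negative
    F *= (G.T @ X) / (G.T @ G @ F + 1e-12)

rel_resid = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
print(rel_resid)   # small residual on exact low-rank data
```

The factorization is only identifiable up to scaling and permutation of the factors, which is why receptor-model studies interpret the recovered profiles chemically rather than by index.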

  4. Chernobyl source term estimation

    SciTech Connect

    Gudiksen, P.H.; Harvey, T.F.; Lange, R.

    1990-09-01

The Chernobyl source term available for long-range transport was estimated by integration of radiological measurements with atmospheric dispersion modeling and by reactor core radionuclide inventory estimation in conjunction with WASH-1400 release fractions associated with specific chemical groups. The model simulations revealed that the radioactive cloud became segmented during the first day, with the lower section heading toward Scandinavia and the upper part heading in a southeasterly direction with subsequent transport across Asia to Japan, the North Pacific, and the west coast of North America. By optimizing the agreement between the observed cloud arrival times and duration of peak concentrations measured over Europe, Japan, Kuwait, and the US with the model predicted concentrations, it was possible to derive source term estimates for those radionuclides measured in airborne radioactivity. This was extended to radionuclides that were largely unmeasured in the environment by performing a reactor core radionuclide inventory analysis to obtain release fractions for the various chemical transport groups. These analyses indicated that essentially all of the noble gases, 60% of the radioiodines, 40% of the radiocesium, 10% of the tellurium and about 1% or less of the more refractory elements were released. These estimates are in excellent agreement with those obtained on the basis of worldwide deposition measurements. The Chernobyl source term was several orders of magnitude greater than those associated with the Windscale and TMI reactor accidents. However, the ¹³⁷Cs from the Chernobyl event is about 6% of that released by the US and USSR atmospheric nuclear weapon tests, while the ¹³¹I and ⁹⁰Sr released by the Chernobyl accident were only about 0.1% of that released by the weapon tests. 13 refs., 2 figs., 7 tabs.

  5. Design parameters and source terms: Volume 3, Source terms

    SciTech Connect

    Not Available

    1987-10-01

The Design Parameters and Source Terms Document was prepared at DOE request to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled "Design Parameters and Source Terms for a Two-Phase Repository in Salt," 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC study identifies the data needs for the Environmental Assessment effort for seven possible salt repository sites. 11 refs., 9 tabs.

  6. Long-term measurements of particle number size distributions and the relationships with air mass history and source apportionment in the summer of Beijing

    NASA Astrophysics Data System (ADS)

    Wang, Z. B.; Hu, M.; Wu, Z. J.; Yue, D. L.; He, L. Y.; Huang, X. F.; Liu, X. G.; Wiedensohler, A.

    2013-02-01

A series of long-term and temporary measurements were conducted to study the improvement of air quality in Beijing during the Olympic Games period (8-24 August 2008). To evaluate actions taken to improve the air quality, comparisons of particle number and volume size distributions of August 2008 and 2004-2007 were performed. The total particle number and volume concentrations were 14 000 cm-3 and 37 μm3 cm-3 in August of 2008, respectively. These were reductions of 41% and 35% compared with the mean values of August 2004-2007. A cluster analysis on air mass history and a source apportionment were performed to explore the reasons for the reduction of particle concentrations. Back trajectories were classified into five major clusters. Air masses from the south are always associated with pollution events during the summertime in Beijing. In August 2008, the frequency of air masses arriving from the south was twice as high as the average of the previous years; these southerly air masses did, however, not result in elevated particle volume concentrations in Beijing. This result implied that air mass history was not the key factor explaining the reduced particle number and volume concentrations during the Beijing 2008 Olympic Games. Four factors influencing particle concentrations were found using a positive matrix factorization (PMF) model. They were identified as local and remote traffic emissions, combustion sources and secondary transformation. The reductions of the four sources were calculated to be 47%, 44%, 43% and 30%, respectively. The significant reductions of particle number and volume concentrations may be attributed to the actions taken, focusing on primary emissions, especially those related to the traffic and combustion sources.

  7. Long-term measurements of particle number size distributions and the relationships with air mass history and source apportionment in the summer of Beijing

    NASA Astrophysics Data System (ADS)

    Wang, Z. B.; Hu, M.; Wu, Z. J.; Yue, D. L.; He, L. Y.; Huang, X. F.; Liu, X. G.; Wiedensohler, A.

    2013-10-01

A series of long-term and temporary measurements were conducted to study the improvement of air quality in Beijing during the Olympic Games period (8-24 August 2008). To evaluate actions taken to improve the air quality, comparisons of particle number and volume size distributions of August 2008 and 2004-2007 were performed. The total particle number and volume concentrations were 14 000 cm-3 and 37 μm3 cm-3 in August of 2008, respectively. These were reductions of 41% and 35% compared with the mean values of August 2004-2007. A cluster analysis on air mass history and a source apportionment were performed to explore the reasons for the reduction of particle concentrations. Back trajectories were classified into five major clusters. Air masses from the south are always associated with pollution events during the summertime in Beijing. In August 2008, the frequency of air masses arriving from the south was 1.3 times higher than the average of the previous years, which however did not result in elevated particle volume concentrations in Beijing. Therefore, the reduced particle number and volume concentrations during the 2008 Beijing Olympic Games cannot be explained by meteorological conditions alone. Four factors influencing particle concentrations were found using a positive matrix factorization (PMF) model. They were identified as local and remote traffic emissions, combustion sources and secondary transformation. The reductions of the four sources were calculated to be 47%, 44%, 43% and 30%, respectively. The significant reductions of particle number and volume concentrations may be attributed to the actions taken, focusing on primary emissions, especially those related to the traffic and combustion sources.

  8. Infrared image processing devoted to thermal non-contact characterization-Applications to Non-Destructive Evaluation, Microfluidics and 2D source term distribution for multispectral tomography

    NASA Astrophysics Data System (ADS)

    Batsale, Jean-Christophe; Pradere, Christophe

    2015-11-01

The cost of IR cameras is steadily decreasing. Beyond the preliminary calibration step and the overall instrumentation, infrared image processing is thus one of the key steps for applications across very broad domains. Generally, the IR images come from the transient temperature field related to the emission of a black surface in response to an external or internal heating (active IR thermography). The first applications were devoted to so-called thermal Non-Destructive Evaluation methods, considering a thin sample and 1D transient heat diffusion through the sample (transverse diffusion). With simplified assumptions on the transverse diffusion, in-plane diffusion and transport phenomena can also be considered. A general equation can be applied to balance the heat transfer at the pixel scale, or between groups of pixels, in order to estimate several fields of thermophysical properties (heterogeneous fields of in-plane diffusivity, flow distributions, source terms). There are many possible strategies for processing the large amount of space- and time-distributed data (prior integral transformation of the images, compression, elimination of non-useful areas...), generally based on the need to analyse the space and time derivatives of the temperature field. Several illustrative examples will be presented, related to the Non-Destructive Evaluation of heterogeneous solids, the thermal characterization of chemical reactions in microfluidic channels, and the design of systems for multispectral tomography.

  9. HTGR Mechanistic Source Terms White Paper

    SciTech Connect

    Wayne Moe

    2010-07-01

The primary purposes of this white paper are: (1) to describe the proposed approach for developing event-specific mechanistic source terms for HTGR design and licensing, (2) to describe the technology development programs required to validate the design methods used to predict these mechanistic source terms, and (3) to obtain agreement from the NRC that, subject to appropriate validation through the technology development program, the approach for developing event-specific mechanistic source terms is acceptable.

  10. SOURCE TERMS FOR AVERAGE DOE SNF CANISTERS

    SciTech Connect

    K. L. Goluoglu

    2000-06-09

    The objective of this calculation is to generate source terms for each type of Department of Energy (DOE) spent nuclear fuel (SNF) canister that may be disposed of at the potential repository at Yucca Mountain. The scope of this calculation is limited to generating source terms for average DOE SNF canisters, and is not intended to be used for subsequent calculations requiring bounding source terms. This calculation is to be used in future Performance Assessment calculations, or other shielding or thermal calculations requiring average source terms.

  11. Source term calculations for assessing radiation dose to equipment

    SciTech Connect

    Denning, R.S.; Freeman-Kelly, R.; Cybulskis, P.; Curtis, L.A.

    1989-07-01

This study examines results of analyses performed with the Source Term Code Package to develop updated source terms using NUREG-0956 methods. The updated source terms are to be used to assess the adequacy of current regulatory source terms used as the basis for equipment qualification. Time-dependent locational distributions of radionuclides within a containment following a severe accident have been developed. The Surry reactor has been selected in this study as representative of PWR containment designs. Similarly, the Peach Bottom reactor has been used to examine radionuclide distributions in boiling water reactors. The time-dependent inventory of each key radionuclide is provided in terms of its activity in curies. The data are to be used by Sandia National Laboratories to perform shielding analyses to estimate radiation dose to equipment in each containment design. See NUREG/CR-5175, "Beta and Gamma Dose Calculations for PWR and BWR Containments." 6 refs., 11 tabs.

  12. Calculation of source terms for NUREG-1150

    SciTech Connect

    Breeding, R.J.; Williams, D.C.; Murfin, W.B.; Amos, C.N.; Helton, J.C.

    1987-10-01

    The source terms estimated for NUREG-1150 are generally based on the Source Term Code Package (STCP), but the actual source term calculations used in computing risk are performed by much smaller codes which are specific to each plant. This was done because the method of estimating the uncertainty in risk for NUREG-1150 requires hundreds of source term calculations for each accident sequence. This is clearly impossible with a large, detailed code like the STCP. The small plant-specific codes are based on simple algorithms and utilize adjustable parameters. The values of the parameters appearing in these codes are derived from the available STCP results. To determine the uncertainty in the estimation of the source terms, these parameters were varied as specified by an expert review group. This method was used to account for the uncertainties in the STCP results and the uncertainties in phenomena not considered by the STCP.

  13. SOURCE TERMS FOR HLW GLASS CANISTERS

    SciTech Connect

    J.S. Tang

    2000-08-15

This calculation is prepared by the Monitored Geologic Repository (MGR) Waste Package Design Section. The objective of this calculation is to determine the source terms that include radionuclide inventory, decay heat, and radiation sources due to gamma rays and neutrons for the high-level radioactive waste (HLW) from the West Valley Demonstration Project (WVDP), Savannah River Site (SRS), Hanford Site (HS), and Idaho National Engineering and Environmental Laboratory (INEEL). This calculation also determines the source terms of the canister containing the SRS HLW glass and immobilized plutonium. The scope of this calculation is limited to source terms for a time period out to one million years. The results of this calculation may be used to carry out performance assessment of the potential repository and to evaluate radiation environments surrounding the waste packages (WPs). This calculation was performed in accordance with the Development Plan ''Source Terms for HLW Glass Canisters'' (Ref. 7.24).

  14. Mechanistic facility safety and source term analysis

    SciTech Connect

    PLYS, M.G.

    1999-06-09

A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double contained receiver tanks, and process evaluation including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility rooms, flow path geometry, and heat conductors; user-defined non-ideal vapor and aerosol species; pressure- and density-driven gas flows; aerosol transport and deposition; and structure to accommodate facility-specific source terms. Example applications are presented here.

  15. Dose distributions in regions containing beta sources: Irregularly shaped source distributions in homogeneous media

    SciTech Connect

Werner, B.L.

    1991-11-01

    Methods are introduced by which dose rate distributions due to nonuniform, irregularly shaped distributions of beta emitters can be calculated using dose rate distributions for uniform, spherical source distributions. The dose rate distributions can be written in the MIRD formalism.

  16. Assessing sensitivity of source term estimation

    NASA Astrophysics Data System (ADS)

    Long, Kerrie J.; Haupt, Sue Ellen; Young, George S.

    2010-04-01

    Source term estimation algorithms compute unknown atmospheric transport and dispersion modeling variables from concentration observations made by sensors in the field. Insufficient spatial and temporal resolution in the meteorological data as well as inherent uncertainty in the wind field data make source term estimation and the prediction of subsequent transport and dispersion extremely difficult. This work addresses the question: how many sensors are necessary in order to successfully estimate the source term and meteorological variables required for atmospheric transport and dispersion modeling? The source term estimation system presented here uses a robust optimization technique - a genetic algorithm (GA) - to find the combination of source location, source height, source strength, surface wind direction, surface wind speed, and time of release that produces a concentration field that best matches the sensor observations. The approach is validated using the Gaussian puff as the dispersion model in identical twin numerical experiments. The limits of the system are tested by incorporating additive and multiplicative noise into the synthetic data. The minimum requirements for data quantity and quality are determined by an extensive grid sensitivity analysis. Finally, a metric is developed for quantifying the minimum number of sensors necessary to accurately estimate the source term and to obtain the relevant wind information.
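
A heavily simplified sketch of the approach described above: a forward dispersion model maps candidate source parameters to predicted sensor readings, and a GA searches for the parameters that minimize the mismatch. The plume formula, sensor layout and GA settings below are invented for illustration; the actual system uses a Gaussian puff model and also retrieves wind direction, wind speed, release height and release time:

```python
import numpy as np

rng = np.random.default_rng(1)

def plume(p, sx, sy):
    x0, y0, q = p                        # source location and strength
    dx, dy = sx - x0, sy - y0
    sigma = 0.3 * np.maximum(dx, 1e-6)   # crude crosswind spread (wind +x)
    c = q / sigma * np.exp(-dy**2 / (2 * sigma**2))
    return np.where(dx > 0, c, 0.0)      # nothing upwind of the source

true_p = np.array([2.0, 5.0, 1.5])
sx, sy = [g.ravel() for g in np.meshgrid(np.linspace(3, 9, 4),
                                         np.linspace(2, 8, 4))]
obs = plume(true_p, sx, sy)              # synthetic "sensor" readings

def cost(p):
    return np.sum((plume(p, sx, sy) - obs) ** 2)

lo, hi = np.array([0.0, 0.0, 0.1]), np.array([10.0, 10.0, 5.0])
pop = lo + (hi - lo) * rng.random((60, 3))
for _ in range(150):
    order = np.argsort([cost(p) for p in pop])
    parents = pop[order[:20]]            # elitist truncation selection
    kids = []
    for _ in range(40):
        a, b = parents[rng.integers(0, 20, 2)]
        w = rng.random(3)
        kid = w * a + (1 - w) * b        # blend crossover
        mask = rng.random(3) < 0.3       # per-gene mutation
        kid = kid + mask * rng.normal(0, 0.05, 3) * (hi - lo)
        kids.append(np.clip(kid, lo, hi))
    pop = np.vstack([parents, np.array(kids)])

best = pop[np.argmin([cost(p) for p in pop])]
print(best)   # should approach the true (x0, y0, q)
```

With noiseless synthetic data and enough sensors the search is easy; the paper's point is precisely how this degrades as sensors are removed and noise is added.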

  17. Supernate source term analysis: Revision 1

    SciTech Connect

    Aponte, C.I.

    1994-10-13

The HM Process (modified PUREX) has been used in H-Canyon since 1959 to recover uranium and byproduct neptunium. The PUREX process has been used in the Separations facilities in F- and H-Areas. This report analyzes the inhalation and ingestion radionuclide dose impact of the soluble portion of the HM and PUREX process waste streams. The spent fuel assemblies analyzed are the Mark 16B and Mark 22 for the HM process, and the Mark 31A and Mark 31B for the PUREX process. The results from this analysis are combined with an analysis of the current Safety Analysis Report (SAR) source term to evaluate source terms for HLW supernate. Analysis of fission yield data and SAR source term values demonstrates that a limited number of radionuclides contribute 1% or more to the total dose, and that cesium and plutonium isotopes are the radionuclides with the greatest impact on the supernate source term. This report also analyzes both volatile and evaporative impact, as recommended by DOE guidance. In reality, the only radionuclide volatilized during evaporative conditions is tritium; no evidence of selective volatility occurs during forced evaporation in HLW. The results obtained permit reducing the list of radionuclides to be considered in the development of source terms to support the High Level Waste Safety Analysis Report.

  18. A Bayesian Algorithm for Assessing Uncertainty in Radionuclide Source Terms

    NASA Astrophysics Data System (ADS)

    Robins, Peter

    2015-04-01

    Inferring source term parameters for a radionuclide release is difficult, due to the large uncertainties in forward dispersion modelling as a consequence of imperfect knowledge pertaining to wind vector fields and turbulent diffusion in the Earth's atmosphere. Additional sources of error include the radionuclide measurements obtained from sensors. These measurements may either be subject to random fluctuations or are simple indications that the true, unobserved quantity is below a detection limit. Consequent large reconstruction uncertainties can render a "best" estimate meaningless. A Markov Chain Monte Carlo (MCMC) Bayesian Algorithm is presented that attempts to account for uncertainties in atmospheric transport modelling and radionuclide sensor measurements to quantify uncertainties in radionuclide release source term parameters. Prior probability distributions are created for likely release locations at existing nuclear facilities and seismic events. Likelihood models are constructed using CTBTO adjoint modelling output and probability distributions of sensor response. Samples from the resulting multi-isotope source term parameters posterior probability distribution are generated that can be used to make probabilistic statements about the source term. Examples are given of marginal probability distributions obtained from simulated sensor data. The consequences of errors in numerical weather prediction wind fields are demonstrated with a reconstruction of the Fukushima nuclear reactor accident from International Monitoring System radionuclide particulate sensor data.
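
The essence of the sampling step can be sketched with a one-parameter toy problem: a random-walk Metropolis sampler for a release magnitude under a known linear forward model and Gaussian sensor noise. This is far simpler than the multi-isotope, adjoint-based likelihood described above, and the transport coefficients and noise level are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
kernel = np.array([0.8, 0.5, 0.3, 0.1])   # assumed transport coefficients
q_true, noise = 4.0, 0.2
obs = q_true * kernel + rng.normal(0, noise, kernel.size)

def log_post(q):
    if q < 0:                              # non-negative release (prior)
        return -np.inf
    return -np.sum((obs - q * kernel) ** 2) / (2 * noise**2)

samples, q = [], 1.0
for _ in range(20000):
    prop = q + rng.normal(0, 0.3)          # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(q):
        q = prop                           # accept, else keep current q
    samples.append(q)

post = np.array(samples[5000:])            # discard burn-in
print(post.mean(), post.std())             # posterior mean near q_true
```

The posterior samples support exactly the kind of probabilistic statement the abstract describes, e.g. credible intervals for the release magnitude rather than a single "best" estimate.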

  19. Distribution of noise sources for seismic interferometry

    NASA Astrophysics Data System (ADS)

    Harmon, Nicholas; Rychert, Catherine; Gerstoft, Peter

    2010-12-01

We demonstrate that the distribution of seismic noise sources affects the accuracy of Green's function estimates and therefore isotropic and anisotropic tomographic inversions for both velocity and attenuation. We compare three methods for estimating seismic noise source distributions and quantify the potential error in phase velocity, azimuthal anisotropy and attenuation estimates due to inhomogeneous source distributions. The methods include: (1) least-squares inversion of beamformer output; (2) least-squares inversion of year-long stacked noise correlation functions assuming a 2-D plane wave source density model; and (3) the same inversion assuming a 3-D plane wave source density model. We use vertical component data from the 190 stations of the Southern California Seismic Network and some USArray stations for 2008. The good agreement between the three models suggests the 2-D plane wave model, with the fewest number of unknown parameters, is generally sufficient to describe the noise density function for tomographic inversions. At higher frequencies, 3-D and beamforming models are required to resolve peaks in energy associated with body waves. We illustrate and assess isotropic and azimuthally anisotropic phase velocity and attenuation uncertainties for the noise source distribution in southern California by inverting lossless isotropic synthetic Fourier-transformed noise correlation function predictions from the modelled 2-D source distribution. We find that the variation in phase velocity with azimuth from the inhomogeneous source distribution yields up to 1 per cent apparent peak-to-peak anisotropy. We predict apparent attenuation coefficients from our lossless synthetics on the same order of magnitude as those previously reported for the region from ambient noise. Since noise source distributions are likely inhomogeneous, varying regionally and with time, we recommend that noise correlation studies reporting attenuation and anisotropy incorporate source density information.
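
Why the source distribution matters can be illustrated numerically with a toy geometry (not the paper's data): uncorrelated white-noise sources on a ring are recorded at two receivers, and the stacked cross-correlation peaks near the inter-receiver travel time (10 samples here, with unit wave speed per sample):

```python
import numpy as np

rng = np.random.default_rng(3)
T = 60000                                   # trace length in samples
rA, rB = np.array([-5.0, 0.0]), np.array([5.0, 0.0])
theta = np.deg2rad(np.arange(0, 360, 5))    # homogeneous ring of sources
src = 100.0 * np.column_stack([np.cos(theta), np.sin(theta)])

sA = np.zeros(T)
sB = np.zeros(T)
for s in src:
    n = rng.normal(size=T)                  # independent noise per source
    sA += np.roll(n, int(round(np.linalg.norm(s - rA))))  # integer delay
    sB += np.roll(n, int(round(np.linalg.norm(s - rB))))

# Circular cross-correlation via FFT; C[m] ~ sum_t sA(t+m) * sB(t)
C = np.fft.ifft(np.fft.fft(sA) * np.conj(np.fft.fft(sB))).real
m = int(np.argmax(np.abs(C)))
lag = m if m < T // 2 else m - T
print(lag)                                  # near +/- 10 samples
```

Thinning the ring to a single azimuthal sector biases or destroys this peak, which is the kind of error in Green's function estimates, and hence in velocity and attenuation, that the study quantifies.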

  20. BWR Source Term Generation and Evaluation

    SciTech Connect

    J.C. Ryman

    2003-07-31

This calculation is a revision of a previous calculation (Ref. 7.5) that bears the same title and has the document identifier BBAC00000-01717-0210-00006 REV 01. The purpose of this revision is to remove TBV (to-be-verified) -4110 associated with the output files of the previous version (Ref. 7.30). The purpose of this and the previous calculation is to generate source terms for a representative boiling water reactor (BWR) spent nuclear fuel (SNF) assembly for the first one million years after the SNF is discharged from the reactors. This calculation includes an examination of several ways to represent BWR assemblies and operating conditions in SAS2H in order to quantify the effects these representations may have on source terms. These source terms provide information characterizing the neutron and gamma spectra in particles per second, the decay heat in watts, and radionuclide inventories in curies. Source terms are generated for a range of burnups and enrichments (see Table 2) that are representative of the waste stream and stainless steel (SS) clad assemblies. During this revision, it was determined that the burnups used for the computer runs of the previous revision were actually about 1.7% less than the stated, or nominal, burnups. See Section 6.6 for a discussion of how to account for this effect before using any source terms from this calculation. The source term due to the activation of corrosion products deposited on the surfaces of the assembly from the coolant is also calculated. The results of this calculation support many areas of the Monitored Geologic Repository (MGR), which include thermal evaluation, radiation dose determination, radiological safety analyses, surface and subsurface facility designs, and total system performance assessment. This includes MGR items classified as Quality Level 1, for example, the Uncanistered Spent Nuclear Fuel Disposal Container (Ref. 7.27, page 7). Therefore, this calculation is subject to the requirements of the

  1. SUBSURFACE SHIELDING-SPECIFIC SOURCE TERM EVALUATION

    SciTech Connect

    S. Su

    1999-08-24

The purpose of this work is to provide supporting calculations for determination of the radiation source terms specific to subsurface shielding design and analysis. These calculations are not intended to provide the absolute values of the source terms, which are under the charter of the Waste Package Operations (WPO) Group. Rather, the calculations focus on evaluation of the various combinations of fuel enrichment, burnup and cooling time for a given decay heat output, consistent with the waste package (WP) thermal design basis. The objective is to determine the worst-case combination of the fuel characteristics (enrichment, burnup and cooling time) which would give the maximum radiation fields for subsurface shielding considerations. The calculations are limited to PWR fuel only, since the WP design is currently evolving with thinner walls and a reduced heat load as compared to the viability assessment (VA) reference design. The results for PWR fuel will provide a comparable indication of the trend for BWR fuel, as their characteristics are similar. The source term development for defense high-level waste and other spent nuclear fuel (SNF) is the responsibility of the WPO Group, and therefore, is not included in this work. This work includes the following items responsive to the stated purpose and objective: (1) Determine the possible fuel parameters (initial enrichment, burnup and cooling time), that give the same decay heat value as specified for the waste package thermal design; (2) Obtain the neutron and gamma source terms for the various combinations of the fuel parameters for use in radiation field calculations; and (3) Calculate radiation fields on the surfaces of the waste package and its transporter to quantify the effects of the fuel parameters with the same decay heat value for use in identifying the worst-case combination of the fuel parameters.

  2. Reevaluation of HFIR source term: Supplement 2

    SciTech Connect

    Thomas, W.E.

    1986-11-01

    The HFIR source term has been reevaluated to assess the impact of the increase in core lifetime from 15 to 24 days. Calculations were made to determine the nuclide activities of the iodines, noble gases, and other fission products. The results show that there is no significant change in off-site dose due to the increased fuel cycle for the release scenario postulated in ORNL-3573.

  3. Over-Distribution in Source Memory

    PubMed Central

    Brainerd, C. J.; Reyna, V. F.; Holliday, R. E.; Nakamura, K.

    2012-01-01

Semantic false memories are confounded with a second type of error, over-distribution, in which items are attributed to contradictory episodic states. Over-distribution errors have proved to be more common than false memories when the two are disentangled. We investigated whether over-distribution is prevalent in another classic false memory paradigm: source monitoring. It is. Conventional false memory responses (source misattributions) were predominantly over-distribution errors, but unlike semantic false memory, over-distribution also accounted for more than half of true memory responses (correct source attributions). Experimental control of over-distribution was achieved via a series of manipulations that affected either recollection of contextual details or item memory (concreteness, frequency, list-order, number of presentation contexts, and individual differences in verbatim memory). A theoretical model (conjoint process dissociation) was used to analyze the data; it predicts that (a) over-distribution is directly proportional to item memory but inversely proportional to recollection and (b) item memory is not a necessary precondition for recollection of contextual details. The results were consistent with both predictions. PMID:21942494

  4. Atmospheric distribution and sources of nonmethane hydrocarbons

    NASA Technical Reports Server (NTRS)

    Singh, Hanwant B.; Zimmerman, Patrick B.

    1992-01-01

The paper discusses the atmospheric distribution of natural and man-made nonmethane hydrocarbons (NMHCs), the major species of airborne NMHCs, and their sources and sinks. Particular attention is given to the techniques for measuring atmospheric NMHCs; diurnal and seasonal variations of atmospheric NMHCs and differences between rural, urban, and marine environments; latitudinal and vertical distributions; and available stratospheric NMHC measurements. A formula defining the atmospheric lifetime of an NMHC from its reaction rates with OH and O3 is presented.
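
    The lifetime formula mentioned above (loss to OH and O3) can be sketched numerically as tau = 1 / (k_OH[OH] + k_O3[O3]). The rate constants and oxidant number densities below are illustrative placeholders, not values from the paper:

```python
# Atmospheric lifetime of an NMHC from its pseudo-first-order loss rates:
#   tau = 1 / (k_OH * [OH] + k_O3 * [O3])

def nmhc_lifetime(k_oh, c_oh, k_o3, c_o3):
    """Return lifetime in seconds given rate constants (cm^3 molec^-1 s^-1)
    and oxidant number densities (molec cm^-3)."""
    return 1.0 / (k_oh * c_oh + k_o3 * c_o3)

# Illustrative values: an isoprene-like reactivity with typical daytime OH
tau = nmhc_lifetime(k_oh=1.0e-10, c_oh=1.0e6, k_o3=1.3e-17, c_o3=7.5e11)
print(tau / 3600.0)  # lifetime in hours, here roughly 2.5 h
```

    With these placeholder values, loss to OH dominates; a species with a slower OH rate constant would show a correspondingly longer lifetime.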

  5. Microseism Source Distribution Observed from Ireland

    NASA Astrophysics Data System (ADS)

    Craig, David; Bean, Chris; Donne, Sarah; Le Pape, Florian; Möllhoff, Martin

    2017-04-01

Ocean-generated microseisms (OGM) are recorded globally, with similar spectral features observed everywhere. The generation mechanism for OGM and their subsequent propagation to continental regions have led to their use as a proxy for sea-state characteristics. Many modern seismological methods also make use of OGM signals; for example, the Earth's crust and upper mantle can be imaged using "ambient noise tomography". For many of these methods an understanding of the source distribution is necessary to properly interpret the results. OGM recorded on near-coastal seismometers are known to be related to the local ocean wavefield. However, contributions from more distant sources may also be present. This is significant for studies attempting to use OGM as a proxy for sea-state characteristics such as significant wave height. Ireland has a highly energetic ocean wave climate and is close to one of the major source regions for OGM. This provides an ideal location to study an OGM source region in detail. Here we present the source distribution observed from seismic arrays in Ireland. The region is shown to consist of several individual source areas. These source areas show some frequency dependence and generally occur at or near the continental shelf edge. We also show some preliminary results from an off-shore OBS network to the north-west of Ireland. The OBS network includes instruments on either side of the shelf and should help interpret the array observations.

  6. Directional perception of distributed sound sources.

    PubMed

    Santala, Olli; Pulkki, Ville

    2011-03-01

    The perception of spatially distributed sound sources was investigated by conducting two listening experiments in anechoic conditions with 13 loudspeakers evenly distributed in the frontal horizontal plane emitting incoherent noise signals. In the first experiment, widely distributed sound sources with gaps in their distribution emitted pink noise. The results indicated that the exact loudspeaker distribution could not be perceived accurately and that the width of the distribution was perceived to be narrower than it was in reality. Up to three spatially distributed loudspeakers that were simultaneously emitting sound could be individually perceived. In addition, the number of loudspeakers that were indicated as emitting sound was smaller than the actual number. In the second experiment, a reference with 13 loudspeakers and test cases with fewer loudspeakers were presented and their perceived spatial difference was rated. The effect of the noise bandwidth was of particular interest. Noise with different bandwidths centered around 500 and 4000 Hz was used. The results indicated that when the number of loudspeakers was increased from four to seven, the perceived auditory event was very similar to that perceived with 13 loudspeakers at all bandwidths. The perceived differences were larger in wideband noise than in narrow-band noise. © 2011 Acoustical Society of America

  7. Improved source term estimation using blind outlier detection

    NASA Astrophysics Data System (ADS)

    Martinez-Camara, Marta; Bejar Haro, Benjamin; Vetterli, Martin; Stohl, Andreas

    2014-05-01

    Emissions of substances into the atmosphere are produced in situations such as volcano eruptions, nuclear accidents or pollutant releases. It is necessary to know the source term - how the magnitude of these emissions changes with time - in order to predict the consequences of the emissions, such as high radioactivity levels in a populated area or high concentration of volcanic ash in an aircraft flight corridor. However, in general, we know neither how much material was released in total, nor the relative variation of emission strength with time. Hence, estimating the source term is a crucial task. Estimating the source term generally involves solving an ill-posed linear inverse problem using datasets of sensor measurements. Several so-called inversion methods have been developed for this task. Unfortunately, objective quantitative evaluation of the performance of inversion methods is difficult due to the fact that the ground truth is unknown for practically all the available measurement datasets. In this work we use the European Tracer Experiment (ETEX) - a rare example of an experiment where the ground truth is available - to develop and to test new source estimation algorithms. Knowledge of the ground truth grants us access to the additive error term. We show that the distribution of this error is heavy-tailed, which means that some measurements are outliers. We also show that precisely these outliers severely degrade the performance of traditional inversion methods. Therefore, we develop blind outlier detection algorithms specifically suited to the source estimation problem. Then, we propose new inversion methods that combine traditional regularization techniques with blind outlier detection. Such hybrid methods reduce the error of reconstruction of the source term up to 45% with respect to previously proposed methods.
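
    The degradation described above, where heavy-tailed outliers corrupt a regularized least-squares inversion, can be illustrated with a toy linear source-receptor model. The Huber-style iteratively reweighted scheme below is a generic stand-in for the paper's blind outlier detection, and all dimensions, noise levels, and the regularization weight are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear model y = M x + noise, contaminated with a few large outliers.
n_obs, n_src = 200, 20
M = rng.random((n_obs, n_src))
x_true = np.maximum(rng.normal(1.0, 0.5, n_src), 0.0)
y = M @ x_true + rng.normal(0.0, 0.05, n_obs)
out_idx = rng.choice(n_obs, 10, replace=False)
y[out_idx] += rng.normal(0.0, 5.0, 10)      # heavy-tailed measurement errors

def tikhonov(M, y, w, alpha=1e-2):
    # Weighted Tikhonov solution: minimize ||W^(1/2)(y - Mx)||^2 + alpha ||x||^2
    A = M.T @ (w[:, None] * M) + alpha * np.eye(M.shape[1])
    return np.linalg.solve(A, M.T @ (w * y))

# Plain regularized least squares (all weights equal): degraded by outliers.
x_ls = tikhonov(M, y, np.ones(n_obs))

# Iteratively reweighted least squares with Huber weights: residuals past the
# threshold are down-weighted, acting as a simple blind outlier detector.
w = np.ones(n_obs)
for _ in range(20):
    x_rob = tikhonov(M, y, w)
    r = np.abs(y - M @ x_rob)
    delta = 1.345 * np.median(r) / 0.6745   # threshold from a robust scale
    w = np.where(r <= delta, 1.0, delta / r)

print(np.linalg.norm(x_ls - x_true), np.linalg.norm(x_rob - x_true))
```

    On this synthetic problem the robust reconstruction error is markedly smaller than the plain least-squares error, mirroring the improvement the authors report on ETEX data.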

  8. Design parameters and source terms: Volume 2, Source terms: Revision 0

    SciTech Connect

    Not Available

    1987-10-01

    The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled ''Design Parameters and Source Terms for a Two-Phase Repository Salt,'' 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC Study identifies the data needs for the Environmental Assessment effort for seven possible Salt Repository sites. 2 tabs.

  9. Design parameters and source terms: Volume 2, Source terms: Revision 0

    SciTech Connect

    Not Available

    1987-09-01

    The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report to the level of the Site Characterization Plan---Conceptual Design Report SCP-CDR. The previous study identifies the data needs for the Environmental Assessment effort for seven possible salt repository sites. Volume 2 contains tables of source terms.

  10. Uncertainties in source distribution temperature and correlated colour temperature

    NASA Astrophysics Data System (ADS)

    Gardner, J. L.

    2006-10-01

    Uncertainties in the distribution temperature (DT) and correlated colour temperature are estimated for common sources and typical measurement uncertainties of a spectral transfer from a reference lamp. Uncertainty sensitivity coefficients for both parameters in terms of measured values of spectral irradiance are derived using a generalized matrix inverse. The uncertainty values are compared with differences in the source temperature parameters. DT is calculated using the CIE definition; shifts in DT due to the alternative of a direct fit of Planck's distribution, and to including weights in the process, are compared with the estimated uncertainties.
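
    The "direct fit of Planck's distribution" alternative mentioned above can be sketched as a one-parameter least-squares search over temperature, with the scale factor solved analytically at each trial temperature. The synthetic 2856 K lamp spectrum and the wavelength grid are illustrative assumptions, not data from the paper:

```python
import numpy as np

C2 = 1.4388e-2          # second radiation constant, m*K

def planck(lam, T):
    # Relative Planck spectral radiance (constant prefactors dropped).
    return lam**-5 / (np.exp(C2 / (lam * T)) - 1.0)

lam = np.linspace(400e-9, 750e-9, 36)       # visible band, metres
meas = planck(lam, 2856.0)                  # synthetic "measured" spectrum

def fit_dt(lam, meas, temps):
    # For each trial T the best scale a is analytic; keep T minimizing residual.
    best = None
    for T in temps:
        p = planck(lam, T)
        a = (meas @ p) / (p @ p)            # least-squares scale factor
        err = np.sum((meas - a * p)**2)
        if best is None or err < best[1]:
            best = (T, err)
    return best[0]

T_fit = fit_dt(lam, meas, np.arange(2500.0, 3300.0, 1.0))
print(T_fit)  # recovers 2856 K on this noise-free synthetic spectrum
```

    Adding spectral weights to the residual, as the abstract discusses, would simply multiply each term of `err` by a per-wavelength weight before summing.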

  11. TRIGA MARK-II source term

    SciTech Connect

    Usang, M. D. Hamzah, N. S. Abi, M. J. B. Rawi, M. Z. M. Rawi Abu, M. P.

    2014-02-12

ORIGEN 2.2 is employed to obtain the γ source term and the radioactivity of irradiated TRIGA fuel. The fuel composition is specified in grams for use as input data. Three types of fuel are irradiated in the reactor, each differing from the others in the amount of uranium relative to total weight. Each fuel is irradiated for 365 days in 50-day time steps. We obtain results on the total radioactivity of the fuel, the composition of activated materials, the composition of fission products, and the photon spectrum of the burned fuel. We investigate the differences between results obtained with the BWR and PWR libraries for ORIGEN. Finally, we compare the composition of major nuclides after one year of irradiation for both ORIGEN libraries with results from WIMS. We found only minor disagreements between the yields of the PWR and BWR libraries. In comparison with WIMS, the errors are slightly more pronounced. To overcome these errors, the irradiation power used in ORIGEN could be increased slightly, so that the differences between the yields of ORIGEN and WIMS are reduced. A more permanent solution is to use a different code altogether to simulate burnup, such as DRAGON or ORIGEN-S. The results of this study are essential for the design of radiation shielding from the fuel.

  12. TRIGA MARK-II source term

    NASA Astrophysics Data System (ADS)

    Usang, M. D.; Hamzah, N. S.; J. B., Abi M.; M. Z., M. Rawi; Abu, M. P.

    2014-02-01

ORIGEN 2.2 is employed to obtain the γ source term and the radioactivity of irradiated TRIGA fuel. The fuel composition is specified in grams for use as input data. Three types of fuel are irradiated in the reactor, each differing from the others in the amount of uranium relative to total weight. Each fuel is irradiated for 365 days in 50-day time steps. We obtain results on the total radioactivity of the fuel, the composition of activated materials, the composition of fission products, and the photon spectrum of the burned fuel. We investigate the differences between results obtained with the BWR and PWR libraries for ORIGEN. Finally, we compare the composition of major nuclides after one year of irradiation for both ORIGEN libraries with results from WIMS. We found only minor disagreements between the yields of the PWR and BWR libraries. In comparison with WIMS, the errors are slightly more pronounced. To overcome these errors, the irradiation power used in ORIGEN could be increased slightly, so that the differences between the yields of ORIGEN and WIMS are reduced. A more permanent solution is to use a different code altogether to simulate burnup, such as DRAGON or ORIGEN-S. The results of this study are essential for the design of radiation shielding from the fuel.

  13. Distributed transform coding via source-splitting

    NASA Astrophysics Data System (ADS)

    Yahampath, Pradeepa

    2012-12-01

Transform coding (TC) is one of the best known practical methods for quantizing high-dimensional vectors. In this article, a practical approach to distributed TC of jointly Gaussian vectors is presented. This approach, referred to as source-split distributed transform coding (SP-DTC), can be used to easily implement two-terminal transform codes for any given rate-pair. The main idea is to apply source-splitting using orthogonal transforms, so that only Wyner-Ziv (WZ) quantizers are required for compression of transform coefficients. This approach, however, requires optimizing the bit allocation among dependent sets of WZ quantizers. In order to solve this problem, a low-complexity tree-search algorithm based on analytical models for transform coefficient quantization is developed. A rate-distortion (RD) analysis of SP-DTCs for jointly Gaussian sources is presented, which indicates that these codes can significantly outperform the practical alternative of independent TC of each source whenever there is a strong correlation between the sources. For practical implementation of SP-DTCs, the idea of using conditional entropy constrained (CEC) quantizers followed by Slepian-Wolf coding is explored. Experimental results obtained with SP-DTC designs based on both CEC scalar quantizers and CEC trellis-coded quantizers demonstrate that actual implementations of SP-DTCs can achieve RD performance close to the analytically predicted limits.
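
    The bit-allocation subproblem can be illustrated with the classic greedy high-rate scheme, in which each additional bit quarters a coefficient's distortion (6 dB per bit). This is a much simpler stand-in for the paper's tree-search over dependent Wyner-Ziv quantizers; the variances below are invented:

```python
import numpy as np

def greedy_allocate(variances, total_bits):
    """Greedy bit allocation: give each successive bit to the coefficient
    whose current distortion is largest (so it gains the most)."""
    d = np.asarray(variances, dtype=float).copy()  # current distortions
    bits = np.zeros(len(d), dtype=int)
    for _ in range(total_bits):
        i = int(np.argmax(d))      # coefficient that benefits most
        bits[i] += 1
        d[i] /= 4.0                # each bit quarters distortion (high-rate model)
    return bits

var = [16.0, 4.0, 1.0, 0.25]
print(greedy_allocate(var, 8))     # more bits go to higher-variance coefficients
```

    For dependent WZ quantizers the distortion update would no longer be a simple division by four, which is why the authors resort to a tree search over analytical quantizer models.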

  14. Calculation of external dose from distributed source

    SciTech Connect

    Kocher, D.C.

    1986-01-01

This paper discusses a relatively simple calculational method, called the point kernel method (Fo68), for estimating external dose from distributed sources that emit photon or electron radiations. The principles of the point kernel method are emphasized, rather than the presentation of extensive sets of calculations or tables of numerical results. A few calculations are presented for simple source geometries as illustrations of the method, and references and descriptions are provided for other calculations in the literature. This paper also describes exposure situations for which the point kernel method is not appropriate and other, more complex, methods must be used, but these methods are not discussed in any detail.
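
    A minimal sketch of the point kernel idea for one simple geometry follows: the distributed source is discretized into point sources, each contributing uncollided flux S·exp(-μr)/(4πr²) at the detector. The source strength, attenuation coefficient, and geometry are invented for illustration, and build-up factors are ignored:

```python
import numpy as np

def line_source_flux(length, s_per_m, mu, det, n=1000):
    """Uncollided flux at detector position `det` from a line source on the
    x-axis from 0 to `length` (SI units; build-up ignored)."""
    xs = np.linspace(0.0, length, n)
    dx = length / (n - 1)
    flux = 0.0
    for x in xs:
        r = np.linalg.norm(det - np.array([x, 0.0, 0.0]))
        # point kernel: source element * attenuation / geometric spreading
        flux += s_per_m * dx * np.exp(-mu * r) / (4.0 * np.pi * r**2)
    return flux

# 2 m line source, detector 1 m off the midpoint, weak attenuation (~air)
phi = line_source_flux(2.0, 1.0e6, 0.01, np.array([1.0, 1.0, 0.0]))
print(phi)
```

    The result is necessarily below the flux of an equivalent point source placed at the closest approach distance, since the ends of the line are farther from the detector; more complex geometries only change how the elements are enumerated.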

  15. Open Source Live Distributions for Computer Forensics

    NASA Astrophysics Data System (ADS)

    Giustini, Giancarlo; Andreolini, Mauro; Colajanni, Michele

Current distributions of open source forensic software provide digital investigators with a large set of heterogeneous tools. Their use is not always focused on the target and requires high technical expertise. We present a new GNU/Linux live distribution, named CAINE (Computer Aided INvestigative Environment), that contains a collection of tools wrapped up into a user-friendly environment. The CAINE forensic framework introduces novel important features, aimed at filling the interoperability gap across different forensic tools. Moreover, it provides a homogeneous graphical interface that guides digital investigators during the acquisition and analysis of electronic evidence, and it offers a semi-automatic mechanism for the creation of the final report.

  16. Uncertainty propagation within source term estimation

    NASA Astrophysics Data System (ADS)

    Rodriguez, Luna Marie

We can never quantify the atmospheric state precisely: there will always be an uncertainty associated with measured quantities and model output. This work seeks to understand both the uncertainty introduced by measurements and the uncertainty introduced by approximating the model's nonlinear terms. We seek both to understand these sources of uncertainty and to incorporate that understanding throughout the scientific process for Atmospheric Transport & Dispersion (AT&D) problems. First we will examine how errors in the input wind fields may translate into AT&D model solution errors. We focus on street-level concentration plume errors that occur in building-aware AT&D models for a set of hazardous release scenarios where the release location varies relative to the building locations and city building configurations. Second, we use Source Term Estimation (STE) techniques to examine how estimates of uncertainty in measurements, e.g. the wind direction, can be used to help bound the problem. We use two techniques to examine the STE problem. Using the Genetic Algorithm coupled with an AT&D model Variational (GA-Var) method we perform sensitivity analyses to achieve five goals: (1) establish adequate thresholds to filter out noise in our concentration data without decimating the signal; (2) use a robust statistical method to quantify the uncertainty in our predictions; (3) determine the best cost function for each of the variables we seek to retrieve; (4) given that real-time wind direction data are difficult to come by, determine if the GA-estimated wind direction is representative of the advecting wind; and (5) determine the robustness of the GA when a limited number of sensors are available. To further examine the STE problem we use the Variational Iterative Refinement STE Algorithm (VIRSA). VIRSA is a combined modeling system that includes the Second-order Closure Integrated PUFF model, a hybrid Lagrangian-Eulerian Plume Model (LEPM), and its formal adjoint. While

  17. Sensitivity analysis of distributed volcanic source inversion

    NASA Astrophysics Data System (ADS)

    Cannavo', Flavio; Camacho, Antonio G.; González, Pablo J.; Puglisi, Giuseppe; Fernández, José

    2016-04-01

A recently proposed algorithm (Camacho et al., 2011) claims to rapidly estimate magmatic sources from surface geodetic data without any a priori assumption about source geometry. The algorithm takes advantage of fast calculation with analytical models and adds the capability to model free-shape distributed sources. Assuming homogeneous elastic conditions, the approach can determine general geometrical configurations of pressure and/or density sources and/or sliding structures corresponding to prescribed values of anomalous density, pressure and slip. These source bodies are described as aggregations of elemental point sources for pressure, density and slip, and they fit the whole dataset (keeping some 3D regularity conditions). Although some examples and applications have already been presented to demonstrate the ability of the algorithm to reconstruct a magma pressure source (e.g. Camacho et al., 2011; Cannavò et al., 2015), a systematic analysis of the sensitivity and reliability of the algorithm is still lacking. In this explorative work we present results from a large statistical test designed to evaluate the advantages and limitations of the methodology by assessing its sensitivity to the free and constrained parameters involved in inversions. In particular, besides the source parameters, we focus on the ground deformation network topology and on noise in the measurements. The proposed analysis can be used for a better interpretation of the algorithm results in real-case applications. Camacho, A. G., González, P. J., Fernández, J. & Berrino, G. (2011) Simultaneous inversion of surface deformation and gravity changes by means of extended bodies with a free geometry: Application to deforming calderas. J. Geophys. Res. 116. Cannavò F., Camacho A.G., González P.J., Mattia M., Puglisi G., Fernández J. (2015) Real Time Tracking of Magmatic Intrusions by means of Ground Deformation Modeling during Volcanic Crises, Scientific Reports, 5 (10970) doi:10.1038/srep

  18. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The... to January 10, 1997, who seek to revise the current accident source term used in their design...

  19. Particle size distribution of indoor aerosol sources

    SciTech Connect

    Shah, K.B.

    1990-10-24

As concern about Indoor Air Quality (IAQ) has grown in recent years, it has become necessary to determine the nature of particles produced by different indoor aerosol sources and the typical concentrations that these sources tend to produce. These data are important in predicting the dose of particles to people exposed to these sources, and they will also enable us to take effective mitigation procedures. Further, they will also help in designing appropriate air cleaners. A state-of-the-art technique, the DMPS (Differential Mobility Particle Sizer) system, is used to determine the particle size distributions of a number of sources. This system employs the electrical mobility characteristics of these particles and is very effective in the 0.01-1.0 μm size range. A modified system that can measure particle sizes in the lower size range, down to 3 nm, was also used. Experimental results for various aerosol sources are presented in the ensuing chapters. 37 refs., 20 figs., 2 tabs.

  20. Bayesian Estimation of Prior Variance in Source Term Determination

    NASA Astrophysics Data System (ADS)

    Smidl, Vaclav; Hofman, Radek

    2015-04-01

The problem of determining the source term of an atmospheric release is studied. We assume that the observations y are obtained as a linear combination of the source term x and source-receptor sensitivities, written in matrix notation as y = Mx with source-receptor sensitivity matrix M. Direct estimation of the source term vector x is not possible since the system is often ill-conditioned. The solution is thus found by minimization of a cost function with regularization terms. A typical cost function is: C(x) = (y - Mx)^T R^{-1} (y - Mx) + α x^T D^T D x, (1) where the first term minimizes the error of the measurements with covariance matrix R, and the second term is the regularization with weight α. Various types of regularization arise for different choices of matrix D. For example, Tikhonov regularization arises for D equal to the identity matrix, and smoothing regularization for D in the form of a tri-diagonal matrix (Laplacian operator). Typically, the form of matrix D is assumed to be known, and the weight α is optimized manually by a trial and error procedure. In this contribution, we use the probabilistic formulation of the problem, where the term (α D^T D)^{-1} is interpreted as the covariance matrix of the prior distribution of x. Following the Bayesian approach, we relax the assumption of known α and D and assume that these are unknown and estimated from the data. The general problem is not analytically tractable, and approximate estimation techniques have to be used. We present the Variational Bayes solution of two special cases of the prior covariance matrix. First, the structure of D is assumed to be known and only the weight α is estimated. Application of the Variational Bayes method to this case yields an iterative estimation algorithm. In the first step, the usual optimization problem is solved for an estimate of x. In the next step, the value of α is re-estimated and the procedure returns to the first step. Positivity of the solution is guaranteed
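
    The iterative algorithm described above (solve for x given α, then re-estimate α) can be sketched as follows. The α update used here is a simple noise-to-signal variance ratio, a common evidence-style heuristic rather than the paper's exact Variational Bayes step, and the toy problem dimensions and noise level are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_src = 50, 30
M = rng.random((n_obs, n_src))           # toy source-receptor sensitivities
x_true = rng.random(n_src)
y = M @ x_true + rng.normal(0.0, 0.01, n_obs)

D = np.eye(n_src)                        # Tikhonov choice: D = identity
alpha = 1.0                              # initial regularization weight
for _ in range(50):
    # Step 1: minimize ||y - Mx||^2 + alpha * ||Dx||^2 for the current alpha
    x = np.linalg.solve(M.T @ M + alpha * (D.T @ D), M.T @ y)
    # Step 2: re-estimate alpha as the residual-to-signal variance ratio
    res_var = np.sum((y - M @ x) ** 2) / n_obs
    sig_var = np.sum((D @ x) ** 2) / n_src
    alpha = res_var / sig_var

print(alpha, np.linalg.norm(x - x_true))
```

    On this well-behaved toy problem the weight settles near the true noise-to-signal ratio and x stays close to x_true; a full Variational Bayes treatment would additionally propagate posterior uncertainty in both x and α.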

  1. Experimental quantum key distribution with source flaws

    NASA Astrophysics Data System (ADS)

    Xu, Feihu; Wei, Kejin; Sajeed, Shihan; Kaiser, Sarah; Sun, Shihai; Tang, Zhiyuan; Qian, Li; Makarov, Vadim; Lo, Hoi-Kwong

    2015-09-01

Decoy-state quantum key distribution (QKD) is a standard technique in current quantum cryptographic implementations. Unfortunately, existing experiments have two important drawbacks: the state preparation is assumed to be perfect without errors, and the employed security proofs do not fully consider the finite-key effects for general attacks. These two drawbacks mean that existing experiments are not guaranteed to be secure in practice. Here, we perform an experiment that shows secure QKD with imperfect state preparations over long distances and achieves rigorous finite-key security bounds for decoy-state QKD against coherent attacks in the universally composable framework. We quantify the source flaws experimentally and demonstrate a QKD implementation that is tolerant to channel loss despite the source flaws. Our implementation considers more real-world problems than most previous experiments, and our theory can be applied to general discrete-variable QKD systems. These features constitute a step towards secure QKD with imperfect devices.

  2. Distributed single source coding with side information

    NASA Astrophysics Data System (ADS)

    Vila-Forcen, Jose E.; Koval, Oleksiy; Voloshynovskiy, Sviatoslav V.

    2004-01-01

In this paper we advocate an image compression technique within the distributed source coding framework. The novelty of the proposed approach is twofold: classical image compression is considered from the standpoint of source coding with side information and, contrary to existing scenarios where side information is given explicitly, side information is created based on a deterministic approximation of local image features. We consider an image in the transform domain as a realization of a source with a bounded codebook of symbols, where each symbol represents a particular edge shape. The codebook is image independent and plays the role of an auxiliary source. Due to the partial availability of side information at both encoder and decoder, we treat our problem as a modification of the Berger-Flynn-Gray problem and investigate a possible gain over solutions where side information is either unavailable or available only at the decoder. Finally, we present a practical compression algorithm for passport photo images based on our concept that demonstrates superior performance in the very low bit rate regime.

  3. CONSTRAINING SOURCE REDSHIFT DISTRIBUTIONS WITH GRAVITATIONAL LENSING

    SciTech Connect

    Wittman, D.; Dawson, W. A.

    2012-09-10

We introduce a new method for constraining the redshift distribution of a set of galaxies, using weak gravitational lensing shear. Instead of using observed shears and redshifts to constrain cosmological parameters, we ask how well the shears around clusters can constrain the redshifts, assuming fixed cosmological parameters. This provides a check on photometric redshifts, independent of source spectral energy distribution properties and therefore free of confounding factors such as misidentification of spectral breaks. We find that ~40 massive (σ_v = 1200 km s^-1) cluster lenses are sufficient to determine the fraction of sources in each of six coarse redshift bins to ~11%, given weak (20%) priors on the masses of the highest-redshift lenses, tight (5%) priors on the masses of the lowest-redshift lenses, and only modest (20%-50%) priors on calibration and evolution effects. Additional massive lenses drive down uncertainties as N_lens^(-1/2), but the improvement slows as one is forced to use lenses further down the mass function. Future large surveys contain enough clusters to reach 1% precision in the bin fractions if the tight lens-mass priors can be maintained for large samples of lenses. In practice this will be difficult to achieve, but the method may be valuable as a complement to other more precise methods because it is based on different physics and therefore has different systematic errors.

  4. Design parameters and source terms: Volume 3, Source terms: Revision 0

    SciTech Connect

    Not Available

    1987-09-01

The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report to the level of the Site Characterization Plan - Conceptual Design Report, SCP-CDR. The previous unpublished SCC Study identifies the data needs for the Environmental Assessment effort for seven possible salt repository sites.

  5. Terrestrial sources and distribution of atmospheric sulphur

    PubMed Central

    Lelieveld, J.; Roelofs, G.-J.; Ganzeveld, L.; Feichter, J.; Rodhe, H.

    1997-01-01

The general circulation model ECHAM has been coupled to a chemistry and sulphur cycle model to study the impact of terrestrial, i.e., mostly anthropogenic, sulphur dioxide (SO2) sources on global distributions of sulphur species in the atmosphere. We briefly address currently available source inventories. It appears that global estimates of natural emissions are associated with uncertainties up to a factor of 2, while anthropogenic emissions have uncertainty ranges of about +/- 30 per cent. Further, some recent improvements in the model descriptions of multiphase chemistry and deposition processes are presented. Dry deposition is modelled consistently with meteorological processes and surface properties. The results indicate that surface removal of SO2 is less efficient than previously assumed, and that the SO2 lifetime is thus longer. Coupling of the photochemistry and sulphur chemistry schemes in the model improves the treatment of multiphase processes such as oxidant (hydrogen peroxide) supply in aqueous-phase SO2 oxidation. The results suggest that SO2 oxidation by ozone (O3) in the aqueous phase is more important than indicated in earlier work. However, it appears that we still overestimate atmospheric SO2 concentrations near the surface in the relatively polluted Northern Hemisphere. On the other hand, we somewhat underestimate sulphate levels in these regions, which suggests that additional heterogeneous reaction mechanisms, e.g. on aerosols, enhance SO2 oxidation.

  6. Guide to Sources: Term Paper Strategy.

    ERIC Educational Resources Information Center

    White, Lucinda M.

    This two-page guide suggests an eight-step term paper research strategy for students using the Fogler Library at the University of Maine. The student is first guided to encyclopedias for overview articles with bibliographies, then directed to the card catalog; periodical indexes; and indexes for books, journal articles, and newspaper articles.…

  7. Panchromatic spectral energy distributions of Herschel sources

    NASA Astrophysics Data System (ADS)

    Berta, S.; Lutz, D.; Santini, P.; Wuyts, S.; Rosario, D.; Brisbin, D.; Cooray, A.; Franceschini, A.; Gruppioni, C.; Hatziminaoglou, E.; Hwang, H. S.; Le Floc'h, E.; Magnelli, B.; Nordon, R.; Oliver, S.; Page, M. J.; Popesso, P.; Pozzetti, L.; Pozzi, F.; Riguccini, L.; Rodighiero, G.; Roseboom, I.; Scott, D.; Symeonidis, M.; Valtchanov, I.; Viero, M.; Wang, L.

    2013-03-01

Combining far-infrared Herschel photometry from the PACS Evolutionary Probe (PEP) and Herschel Multi-tiered Extragalactic Survey (HerMES) guaranteed time programs with ancillary datasets in the GOODS-N, GOODS-S, and COSMOS fields, it is possible to sample the 8-500 μm spectral energy distributions (SEDs) of galaxies with at least 7-10 bands. Extending to the UV, optical, and near-infrared, the number of bands increases up to 43. We reproduce the distribution of galaxies in a carefully selected restframe ten-color space, based on this rich dataset, using a superposition of multivariate Gaussian modes. We use this model to classify galaxies and build median SEDs of each class, which are then fitted with a modified version of the magphys code that combines stellar light, emission from dust heated by stars, and a possible warm dust contribution heated by an active galactic nucleus (AGN). The color distribution of galaxies in each of the considered fields can be well described with the combination of 6-9 classes, spanning a large range of far- to near-infrared luminosity ratios, as well as different strengths of the AGN contribution to bolometric luminosities. The defined Gaussian grouping is used to identify rare or odd sources. The zoology of outliers includes Herschel-detected ellipticals, very blue z ~ 1 Ly-break galaxies, quiescent spirals, and torus-dominated AGN with star formation. Out of these groups and outliers, a new template library is assembled, consisting of 32 SEDs describing the intrinsic scatter in the restframe UV-to-submm colors of infrared galaxies. This library is tested against L(IR) estimates with and without Herschel data included, and compared to eight other popular methods often adopted in the literature. When implementing Herschel photometry, these approaches produce L(IR) values consistent with each other within a median absolute deviation of 10-20%, the scatter being dominated more by fine tuning of the codes, rather than by the choice of

  8. State of the hydrologic source term

    SciTech Connect

    Kersting, A.

    1996-12-01

    The Underground Test Area (UGTA) Operable Unit was defined by the U.S. Department of Energy, Nevada Operations Office, to characterize and potentially remediate groundwaters impacted by nuclear testing at the Nevada Test Site (NTS). Between 1955 and 1992, 828 nuclear devices were detonated underground at the NTS (DOE, 1994). Approximately one-third of the nuclear tests were detonated at or below the standing water table; the remainder were located above the water table, in the vadose zone. As a result, the distribution of radionuclides in the subsurface and, in particular, the availability of radionuclides for transport away from individual test cavities are major concerns at the NTS. The approach taken is to carry out field-based studies of both groundwaters and host rocks within the near-field in order to develop a detailed understanding of the present-day concentration and spatial distribution of constituent radionuclides. Understanding the current distribution of contamination within the near-field, and the conditions and processes under which the radionuclides were transported, makes it possible to predict future transport behavior. The results of these studies will be integrated with archival research, experiments and geochemical modeling for complete characterization.

  9. Background and Source Term Identification in Active Neutron Interrogation Methods

    DTIC Science & Technology

    2011-03-24

    theory section, ring detector tallies (f5 - MCNP) provided both neutron and photon fluences [particles/cm2] as functions of their energies. Figure 19... BACKGROUND AND SOURCE TERM IDENTIFICATION IN ACTIVE NEUTRON INTERROGATION METHODS. THESIS Presented to the Faculty, Department of

  10. Stochastic Models for the Distribution of Index Terms.

    ERIC Educational Resources Information Center

    Nelson, Michael J.

    1989-01-01

    Presents a probability model of the occurrence of index terms used to derive discrete distributions which are mixtures of Poisson and negative binomial distributions. These distributions give better fits than the simpler Zipf distribution, have the advantage of being more explanatory, and can incorporate a time parameter if necessary. (25…

  11. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    SciTech Connect

    Pete Lowry

    2012-02-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  14. Estimating source terms for far field dredge plume modelling.

    PubMed

    Becker, Johannes; van Eekelen, Erik; van Wiechen, Joost; de Lange, William; Damsma, Thijs; Smolders, Tijmen; van Koningsveld, Mark

    2015-02-01

    Far field modelling of dredging-induced suspended sediment plumes is important when assessing the environmental aspects of dredging. Realistic estimation of source terms, which define the suspended sediment input for far field dredge plume modelling, is key to any assessment. This paper describes a generic method for source term estimation as it is used in practice in the dredging industry. It is based on soil characteristics and dredge production figures, combined with empirically derived, equipment- and condition-specific 'source term fractions'. A source term fraction relates the suspended fine sediment that is available for dispersion to the amount of fine sediment that is present in the soil and the way it is dredged. The use of source term fractions helps to circumvent modelling of complicated near field processes, at least initially, enabling quick assessments. When further detail is required and extra information is available, the applicability of the source term fractions can/should be evaluated by characterisation monitoring and/or near field modelling. An example of a fictitious yet realistic dredging project demonstrates how two different work methods can trigger two distinctly different types of stress to the environmental system in terms of sediment concentration and duration.
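
The arithmetic behind a source term fraction reduces to a product of production figures, soil properties, and the empirical fraction itself. A minimal sketch, in which every number is an illustrative assumption rather than a value from the paper:

```python
# All numbers are illustrative assumptions, not values from the paper.
production_rate = 2000.0     # m3/h of in-situ sediment dredged
dry_density = 1600.0         # kg of dry solids per m3 in situ
fines_fraction = 0.15        # mass fraction of fines (< 63 um) in the soil
source_term_fraction = 0.03  # equipment/condition-specific spill fraction

# Suspended fine sediment released for far-field dispersion, in kg/h.
source_term = (production_rate * dry_density
               * fines_fraction * source_term_fraction)
print(source_term)  # 14400.0 kg/h
```

This value would then feed the sediment source in the far field hydrodynamic model, sidestepping explicit near field plume simulation.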

  15. Source Term Model for an Array of Vortex Generator Vanes

    NASA Technical Reports Server (NTRS)

    Buning, P. G. (Technical Monitor); Waithe, Kenrick A.

    2003-01-01

    A source term model was developed for numerical simulations of an array of vortex generators. The source term models the side force created by the vortex generator being modeled. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low-profile vortex generator vane, which is only a fraction of the boundary layer thickness, over a flat plate. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator, without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the streamwise vorticity and velocity contours very well when compared with both numerical simulations and experimental data.

  16. Incorporation of Melcor source term predictions into probabilistic risk assessments

    SciTech Connect

    Summers, R.M.; Helton, J.C.; Leigh, C.D.

    1989-01-01

    The MELCOR code has been developed as an advanced computational tool for performing primary source term analyses that will incorporate current phenomenological understanding into probabilistic risk assessments (PRAs). Although MELCOR is reasonably fast running, it is not feasible to perform a MELCOR calculation for each of the thousands of sets of conditions requiring a source term estimate in an integrated PRA. Therefore, the RELTRAC code is being developed to generate secondary source term estimates for use directly in a PRA for the LaSalle nuclear power plant by appropriately manipulating results from calculations by a primary source term code such as MELCOR. This paper describes the MELCOR and RELTRAC models and the manner in which MELCOR calculations are used to provide input to the RELTRAC model. 26 refs., 2 figs., 1 tab.

  17. Revised accident source terms for light-water reactors

    SciTech Connect

    Soffer, L.

    1995-02-01

    This paper presents revised accident source terms for light-water reactors incorporating the severe accident research insights gained in this area over the last 15 years. Current LWR reactor accident source terms used for licensing date from 1962 and are contained in Regulatory Guides 1.3 and 1.4. These specify that 100% of the core inventory of noble gases and 25% of the iodine fission products are assumed to be instantaneously available for release from the containment. The chemical form of the iodine fission products is also assumed to be predominantly elemental iodine. These assumptions have strongly affected present nuclear air cleaning requirements by emphasizing rapid actuation of spray systems and filtration systems optimized to retain elemental iodine. A proposed revision of reactor accident source terms and some implications for nuclear air cleaning requirements was presented at the 22nd DOE/NRC Nuclear Air Cleaning Conference. A draft report was issued by the NRC for comment in July 1992. Extensive comments were received, with the most significant comments involving (a) release fractions for both volatile and non-volatile species in the early in-vessel release phase, (b) gap release fractions of the noble gases, iodine and cesium, and (c) the timing and duration for the release phases. The final source term report is expected to be issued in late 1994. Although the revised source terms are intended primarily for future plants, current nuclear power plants may request use of revised accident source term insights as well in licensing. This paper emphasizes additional information obtained since the 22nd Conference, including studies on fission product removal mechanisms, results obtained from improved severe accident code calculations and resolution of major comments, and their impact upon the revised accident source terms. Revised accident source terms for both BWRs and PWRs are presented.

  18. Corpus domain effects on distributional semantic modeling of medical terms.

    PubMed

    Pakhomov, Serguei V S; Finley, Greg; McEwan, Reed; Wang, Yan; Melton, Genevieve B

    2016-12-01

    Automatically quantifying semantic similarity and relatedness between clinical terms is an important aspect of text mining from electronic health records, which are increasingly recognized as valuable sources of phenotypic information for clinical genomics and bioinformatics research. A key obstacle to development of semantic relatedness measures is the limited availability of large quantities of clinical text to researchers and developers outside of major medical centers. Text from general English and biomedical literature are freely available; however, their validity as a substitute for clinical domain to represent semantics of clinical terms remains to be demonstrated. We constructed neural network representations of clinical terms found in a publicly available benchmark dataset manually labeled for semantic similarity and relatedness. Similarity and relatedness measures computed from text corpora in three domains (Clinical Notes, PubMed Central articles and Wikipedia) were compared using the benchmark as reference. We found that measures computed from full text of biomedical articles in PubMed Central repository (rho = 0.62 for similarity and 0.58 for relatedness) are on par with measures computed from clinical reports (rho = 0.60 for similarity and 0.57 for relatedness). We also evaluated the use of neural network based relatedness measures for query expansion in a clinical document retrieval task and a biomedical term word sense disambiguation task. We found that, with some limitations, biomedical articles may be used in lieu of clinical reports to represent the semantics of clinical terms and that distributional semantic methods are useful for clinical and biomedical natural language processing applications. The software and reference standards used in this study to evaluate semantic similarity and relatedness measures are publicly available as detailed in the article. Contact: pakh0002@umn.edu. Supplementary information: Supplementary data are available at

  19. Source term identification in atmospheric modelling via sparse optimization

    NASA Astrophysics Data System (ADS)

    Adam, Lukas; Branda, Martin; Hamburger, Thomas

    2015-04-01

    Inverse modelling plays an important role in identifying the amount of harmful substances released into the atmosphere during major incidents such as power plant accidents or volcano eruptions. Another possible application of inverse modelling lies in monitoring CO2 emission limits, where only observations at certain places are available and the task is to estimate the total releases at given locations. This gives rise to minimizing the discrepancy between the observations and the model predictions. There are two standard ways of solving such problems. In the first one, this discrepancy is regularized by adding additional terms. Such terms may include Tikhonov regularization, distance from a priori information or a smoothing term. The resulting, usually quadratic, problem is then solved via standard optimization solvers. The second approach assumes that the error term has a (normal) distribution and makes use of Bayesian modelling to identify the source term. Instead of following the above-mentioned approaches, we utilize techniques from the field of compressive sensing. Such techniques look for the sparsest solution (the solution with the smallest number of nonzeros) of a linear system, where a maximal allowed error term may be added to this system. Even though this field is a developed one with many possible solution techniques, most of them do not consider even the simplest constraints which are naturally present in atmospheric modelling. One such example is the nonnegativity of release amounts. We believe that the concept of a sparse solution is natural in both problems of identification of the source location and of the time process of the source release. In the first case, it is usually assumed that there are only few release points and the task is to find them. In the second case, the time window is usually much longer than the duration of the actual release. In both cases, the optimal solution should contain a large amount of zeros, giving rise to the
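
A sparse, nonnegative release estimate of the kind described above can be sketched with an L1-penalized regression: the L1 term favors few nonzero release periods and the positivity constraint enforces nonnegative release amounts. This is not the authors' algorithm; it is a generic compressive-sensing-style stand-in on a synthetic dispersion matrix, with all sizes and parameters chosen for illustration.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n_obs, n_times = 40, 120                   # few observations, long release window
M = rng.standard_normal((n_obs, n_times))  # stand-in dispersion (sensitivity) matrix

x_true = np.zeros(n_times)
x_true[30:33] = [5.0, 8.0, 4.0]            # short release within a long window
y = M @ x_true + 0.01 * rng.standard_normal(n_obs)

# The L1 penalty promotes sparsity; positive=True enforces nonnegative releases.
model = Lasso(alpha=0.01, positive=True, fit_intercept=False, max_iter=50_000)
model.fit(M, y)
x_hat = model.coef_
print(np.count_nonzero(x_hat > 1e-3))      # few active release periods survive
```

A real application would replace `M` with the atmospheric transport model's sensitivities and tune `alpha` against the allowed measurement error.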

  20. Bayesian source term determination with unknown covariance of measurements

    NASA Astrophysics Data System (ADS)

    Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav

    2017-04-01

    Determination of a source term of release of a hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimation of the source term in the conventional linear inverse problem, y = Mx, where the relationship between the vector of observations y is described using the source-receptor-sensitivity (SRS) matrix M and the unknown source term x. Since the system is typically ill-conditioned, the problem is recast as the optimization problem min_x (y - Mx)^T R^-1 (y - Mx) + x^T B^-1 x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization assumes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix of the likelihood, R, is also unknown. We consider two potential choices of the structure of the matrix R. The first is a diagonal matrix and the second is a locally correlated structure using information on the topology of the measuring network. Since the inference of the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated on an application of the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
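
When R and B are treated as known (the non-Bayesian special case of the problem above), the minimiser of (y - Mx)^T R^-1 (y - Mx) + x^T B^-1 x has the closed form x = (M^T R^-1 M + B^-1)^-1 M^T R^-1 y. A minimal numerical sketch on synthetic data; the matrix sizes, covariances, and release profile are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 30, 50
M = rng.standard_normal((m, n))    # stand-in source-receptor-sensitivity matrix
x_true = np.zeros(n)
x_true[10] = 4.0                   # a single active release element
y = M @ x_true + 0.1 * rng.standard_normal(m)

R = 0.01 * np.eye(m)   # measurement-error covariance (known, for this sketch)
B = 10.0 * np.eye(n)   # prior covariance of the source term

# Closed-form minimiser of (y - Mx)^T R^-1 (y - Mx) + x^T B^-1 x.
Rinv = np.linalg.inv(R)
Binv = np.linalg.inv(B)
x_hat = np.linalg.solve(M.T @ Rinv @ M + Binv, M.T @ Rinv @ y)
print(int(np.argmax(np.abs(x_hat))))
```

The variational Bayes method of the abstract goes further by iterating between this kind of update for x and updates of the unknown entries of R and B.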

  1. 14 CFR 23.1310 - Power source capacity and distribution.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    14 Aeronautics and Space, Vol. 1 (2014-01-01): Power source capacity and distribution. § 23.1310, Aeronautics and Space, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... Equipment, General. (a) Each installation whose...

  4. Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2005-01-01

    A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparing with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared to the two-dimensional plate using a steady mass flow boundary condition, which was used to simulate a steady micro jet. The model was also compared to two three-dimensional flat plate cases using a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.

  5. Distributed source coding using chaos-based cryptosystem

    NASA Astrophysics Data System (ADS)

    Zhou, Junwei; Wong, Kwok-Wo; Chen, Jianyong

    2012-12-01

    A distributed source coding scheme is proposed by incorporating a chaos-based cryptosystem in the Slepian-Wolf coding. The punctured codeword generated by the chaos-based cryptosystem results in ambiguity at the decoder side. This ambiguity can be removed by the maximum a posteriori decoding with the help of side information. In this way, encryption and source coding are performed simultaneously. This leads to a simple encoder structure with low implementation complexity. Simulation results show that the encoder complexity is lower than that of existing distributed source coding schemes. Moreover, at small block size, the proposed scheme has a performance comparable to existing distributed source coding schemes.
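
The chaotic keystream idea at the heart of such a cryptosystem can be illustrated with the logistic map: a secret initial condition generates a pseudo-random bit sequence, and the decoder with the same seed can undo the encryption. This is only a toy sketch of the keystream component, not the paper's Slepian-Wolf construction; the seed and map parameter are arbitrary.

```python
def logistic_keystream(x0, r, n):
    """Generate n pseudo-random bits by iterating the logistic map x -> r*x*(1-x)."""
    bits, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

message = [1, 0, 1, 1, 0, 0, 1, 0]
key = logistic_keystream(x0=0.3141, r=3.9999, n=len(message))
cipher = [m ^ k for m, k in zip(message, key)]     # encrypt by XOR with keystream
recovered = [c ^ k for c, k in zip(cipher, key)]   # decrypt with the same seed
print(recovered == message)
```

In the scheme described above, the chaos-derived bits would instead puncture a codeword, and the resulting ambiguity is resolved by maximum a posteriori decoding with side information rather than by sharing the seed.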

  6. Open source portal to distributed image repositories

    NASA Astrophysics Data System (ADS)

    Tao, Wenchao; Ratib, Osman M.; Kho, Hwa; Hsu, Yung-Chao; Wang, Cun; Lee, Cason; McCoy, J. M.

    2004-04-01

    In a large institution's PACS, patient data may often reside in multiple separate systems. While most systems tend to be DICOM compliant, none of them offer the flexibility of seamless integration of multiple DICOM sources through a single access point. We developed a generic portal system with a web-based interactive front-end as well as an application programming interface (API) that allows both web users and client applications to query and retrieve image data from multiple DICOM sources. A set of software tools was developed to allow accessing several DICOM archives through a single point of access. An interactive web-based front-end allows users to search image data seamlessly from the different archives and display the results or route the image data to another DICOM compliant destination. An XML-based API allows other software programs to easily benefit from this portal to query and retrieve image data as well. Various techniques are employed to minimize the performance overhead inherent in DICOM. The system is integrated with a hospital-wide HIPAA-compliant authentication and auditing service that provides centralized management of access to patient medical records. The system is provided under open source free licensing and developed using open-source components (Apache Tomcat for web server, MySQL for database, OJB for object/relational data mapping, etc.). The portal paradigm offers a convenient and effective solution for accessing multiple image data sources in a given healthcare enterprise and can easily be extended to multiple institutions through appropriate security and encryption mechanisms.

  7. A parameter model for dredge plume sediment source terms

    NASA Astrophysics Data System (ADS)

    Decrop, Boudewijn; De Mulder, Tom; Toorman, Erik; Sas, Marc

    2017-01-01

    The presented model allows for fast simulations of the near-field behaviour of overflow dredging plumes. Overflow dredging plumes occur when dredging vessels employ a dropshaft release system to discharge the excess sea water, which is pumped into the trailing suction hopper dredger (TSHD) along with the dredged sediments. The fine sediment fraction in the loaded water-sediment mixture does not fully settle before it reaches the overflow shaft. As a consequence, the released water contains a fine sediment fraction of time-varying concentration. The sediment grain size is in the range of clays, silt and fine sand; the sediment concentration varies roughly between 10 and 200 g/l in most cases, peaking at even higher values for short durations. In order to assess the environmental impact of the increased turbidity caused by this release, plume dispersion predictions are often carried out. These predictions are usually executed with a large-scale model covering a complete coastal zone, bay, or estuary. A source term of fine sediments is implemented in the hydrodynamic model to simulate the fine sediment dispersion. The large-scale model mesh resolution and governing equations, however, do not allow to simulate the near-field plume behaviour in the vicinity of the ship hull and propellers. Moreover, in the near-field, these plumes are under the influence of buoyancy forces and air bubbles. The initial distribution of sediments is therefore unknown and has to be based on crude assumptions at present. The initial (vertical) distribution of the sediment source is indeed of great influence on the final far-field plume dispersion results. In order to study this near-field behaviour, a highly-detailed computational fluid dynamics (CFD) model was developed. This model contains a realistic geometry of a dredging vessel, buoyancy effects, air bubbles and propeller action, and was validated earlier by comparing with field measurements. A CFD model requires significant simulation times

  8. Source term and radiological consequences of the Chernobyl accident

    SciTech Connect

    Mourad, R.; Snell, V.

    1987-01-01

    The objective of this work is to assess the source term and to evaluate the maximum hypothetical individual doses in European countries (including the Soviet Union) from the Chernobyl accident through the analyses of measurements of meteorological data, radiation fields, and airborne and deposited activity in these countries. Applying this information to deduce the source term involves a reversal of the techniques of nuclear accident analysis, which estimate the off-site consequences of postulated accidents. In this study the authors predict the quantities of radionuclides that, if released at Chernobyl and following the calculated trajectories, would explain and unify the observed radiation levels and radionuclide concentrations as measured by European countries and the Soviet Union. The simulation uses the PEAR microcomputer program following the methodology described in Canadian Standards Association standard N288.2. The study was performed before the Soviets published their estimate of the source term and the two results are compared.

  9. Charge-exchange source terms in magnetohydrodynamic plasmas

    NASA Astrophysics Data System (ADS)

    DeStefano, Anthony M.; Heerikhuisen, Jacob

    2017-05-01

    In the modeling of space plasma environments, source terms are often used to couple separate species of particles and/or fluids. There have been many techniques developed over the years to make such coupling more tractable while maintaining maximum physical fidelity. In our current application we use the formalism of the Boltzmann collision integral to compute source terms due to charge-exchange events in the heliosphere. The charge-exchange cross sections often encountered in heliospheric interactions can be fit to laboratory data, but in most cases cannot be directly integrated over analytically. Therefore, researchers often employ various levels of approximation, either semi-analytic or numerical. We explore several assumptions to the charge-exchange source term integrals, namely using Maxwellian velocity spaces for like-mass species and either hard-sphere, power-law, or exact forms of the cross section.
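
For the hard-sphere (constant cross section) case with Maxwellian velocity distributions, the charge-exchange source term integral reduces to a rate coefficient <sigma * v_rel>, which can be estimated numerically. A Monte Carlo sketch under assumed, uncalibrated parameter values (not heliospheric data from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hard-sphere rate coefficient <sigma * v_rel> between two Maxwellian
# populations, estimated by Monte Carlo over sampled relative velocities.
sigma = 2.0e-19                  # m^2, constant (hard-sphere) cross section
vth_p = 50.0e3                   # m/s, proton thermal speed sqrt(2kT/m)
vth_h = 20.0e3                   # m/s, neutral hydrogen thermal speed
u_bulk = np.array([25.0e3, 0.0, 0.0])   # relative bulk flow, m/s

n = 200_000
vp = rng.normal(0.0, vth_p / np.sqrt(2.0), size=(n, 3))
vh = rng.normal(0.0, vth_h / np.sqrt(2.0), size=(n, 3)) + u_bulk
v_rel = np.linalg.norm(vp - vh, axis=1)

rate_coeff = sigma * v_rel.mean()        # m^3/s; multiply by n_p * n_H for a source rate
print(f"{rate_coeff:.3e}")
```

Energy-dependent fits to laboratory cross sections would replace the constant `sigma` inside the average, which is where the semi-analytic or numerical approximations discussed above come in.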

  10. Spallation Neutron Source Accident Terms for Environmental Impact Statement Input

    SciTech Connect

    Devore, J.R.; Harrington, R.M.

    1998-08-01

    This report is about accidents with the potential to release radioactive materials into the environment surrounding the Spallation Neutron Source (SNS). As shown in Chap. 2, the inventories of radioactivity at the SNS are dominated by the target facility. Source terms for a wide range of target facility accidents, from anticipated events to worst-case beyond-design-basis events, are provided in Chaps. 3 and 4. The most important criterion applied to these accident source terms is that they should not underestimate potential release. Therefore, conservative methodology was employed for the release estimates. Although the source terms are very conservative, excessive conservatism has been avoided by basing the releases on physical principles. Since it is envisioned that the SNS facility may eventually (after about 10 years) be expanded and modified to support a 4-MW proton beam operational capability, the source terms estimated in this report are applicable to a 4-MW operating proton beam power unless otherwise specified. This is bounding with regard to the 1-MW facility that will be built and operated initially. See further discussion below in Sect. 1.2.

  11. Flowsheets and source terms for radioactive waste projections

    SciTech Connect

    Forsberg, C.W.

    1985-03-01

    Flowsheets and source terms used to generate radioactive waste projections in the Integrated Data Base (IDB) Program are given. Volumes of each waste type generated per unit product throughput have been determined for the following facilities: uranium mining, UF6 conversion, uranium enrichment, fuel fabrication, boiling-water reactors (BWRs), pressurized-water reactors (PWRs), and fuel reprocessing. Source terms for DOE/defense wastes have been developed. Expected wastes from typical decommissioning operations for each facility type have been determined. All wastes are also characterized by isotopic composition at time of generation and by general chemical composition. 70 references, 21 figures, 53 tables.

  12. Sourcing and Global Distribution of Medical Supplies

    DTIC Science & Technology

    2014-01-01

    and ships it to OCONUS treatment facilities and operational units. Procuring and distributing medical materiel carries a large annual cost to DoD... Chapter 55, Medical and Dental Care, January 7, 2011. U.S. Code, Title 21—Food and Drugs, Chapter 9—Federal Food, Drug, and Cosmetic Act... the United States Army under Contract No. W74V8H-06-C-0001. Preface: Concerned with rising Department of Defense (DoD) costs, the Office of

  13. Common Calibration Source for Monitoring Long-term Ozone Trends

    NASA Technical Reports Server (NTRS)

    Kowalewski, Matthew

    2004-01-01

    Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and the limited lifetimes of satellite monitoring instruments demand that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Mapping Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.

  14. Dose distributions in regions containing beta sources: Uniform spherical source regions in homogeneous media

    SciTech Connect

    Werner, B.L.; Rahman, M.; Salk, W.N. ); Kwok, C.S. )

    1991-11-01

    The energy-averaged transport model for the calculation of dose rate distributions is applied to uniform, spherical source distributions in homogeneous media for radii smaller than the electron range. The model agrees well with Monte Carlo based calculations for source distributions with radii greater than half the continuous slowing down approximation range. The dose rate distributions can be written in the medical internal radiation dose (MIRD) formalism.

  15. The source and distribution of Galactic positrons

    NASA Technical Reports Server (NTRS)

    Purcell, W. R.; Dixon, D. D.; Cheng, L.-X.; Leventhal, M.; Kinzer, R. L.; Kurfess, J. D.; Skibo, J. G.; Smith, D. M.; Tueller, J.

    1997-01-01

    The oriented scintillation spectrometer experiment (OSSE) observations of the Galactic plane and the Galactic center region were combined with observations acquired with other instruments in order to produce a map of the Galactic 511 keV annihilation radiation. Two mapping techniques were applied to the data: the maximum entropy method, and the basis pursuit inversion method. The resulting maps are qualitatively similar and show evidence for a central bulge and a weak galactic disk component. The weak disk is consistent with that expected from positrons produced by the decay of radioactive Al-26 in the interstellar medium. Both maps suggest an enhanced region of emission near l = -4 deg, b = 7 deg, with a flux of approximately 50 percent of that of the bulge. The existence of this emission appears significant, although the location is not well determined. The source of this enhanced emission is presently unknown.

  16. Sources and distributions of dark matter

    SciTech Connect

    Sikivie, P.

    1995-12-31

    In the first section, the author tries to convey a sense of the variety of observational inputs that attest to the existence and the spatial distribution of dark matter in the universe. In the second section, he briefly reviews the four main dark matter candidates, taking note of each candidate's status in the world of particle physics, its production in the early universe, its effect upon large scale structure formation and the means by which it may be detected. Section 3 concerns the energy spectrum of (cold) dark matter particles on Earth as may be observed some day in a direct detection experiment. It is a brief account of work done in collaboration with J. Ipser and, more recently, with I. Tkachev and Y. Wang.

  17. Tetrodotoxin: chemistry, toxicity, source, distribution and detection.

    PubMed

    Bane, Vaishali; Lehane, Mary; Dikshit, Madhurima; O'Riordan, Alan; Furey, Ambrose

    2014-02-21

    Tetrodotoxin (TTX) is a naturally occurring toxin that has been responsible for human intoxications and fatalities. Its usual route of toxicity is via the ingestion of contaminated puffer fish, which are a culinary delicacy, especially in Japan. TTX was believed to be confined to regions of South East Asia, but recent studies have demonstrated that the toxin has spread to regions in the Pacific and the Mediterranean. There is no known antidote to TTX, which is a powerful sodium channel inhibitor. This review aims to collect pertinent information available to date on TTX and its analogues with a special emphasis on the structure, aetiology, distribution, effects and the analytical methods employed for its detection.

  18. Tetrodotoxin: Chemistry, Toxicity, Source, Distribution and Detection

    PubMed Central

    Bane, Vaishali; Lehane, Mary; Dikshit, Madhurima; O’Riordan, Alan; Furey, Ambrose

    2014-01-01

    Tetrodotoxin (TTX) is a naturally occurring toxin that has been responsible for human intoxications and fatalities. Its usual route of toxicity is via the ingestion of contaminated puffer fish, which are a culinary delicacy, especially in Japan. TTX was believed to be confined to regions of South East Asia, but recent studies have demonstrated that the toxin has spread to regions in the Pacific and the Mediterranean. There is no known antidote to TTX, which is a powerful sodium channel inhibitor. This review aims to collect pertinent information available to date on TTX and its analogues with a special emphasis on the structure, aetiology, distribution, effects and the analytical methods employed for its detection. PMID:24566728

  19. Apparent LFE Magnitude-Frequency Distributions and the Tremor Source

    NASA Astrophysics Data System (ADS)

    Rubin, A. M.; Bostock, M. G.

    2015-12-01

    More than a decade after its discovery, we still know disconcertingly little about the kinematics of the tremor source. One could say we are hampered by low signal-to-noise ratio, but often the LFE signal is large and the "noise" is just other LFEs, often nearly co-located. Here we exploit this feature to better characterize the tremor source. A quick examination of LFE catalogs shows, unsurprisingly, that detected magnitudes are large when the background tremor amplitude is large. A simple interpretation is that small LFEs are missed when tremor is loud. An unanswered question is whether, in addition, there is a paucity of small LFEs when tremor is loud. Because we have both the LFE Green's function (from stacks) and some minimum bound on the overall LFE rate (from our catalogs), tremor waveforms provide a consistency check on any assumed magnitude-frequency (M-f) distribution. Beneath southern Vancouver Island, the magnitudes of >10^5 LFEs range from about 1.2 to 2.4 (Bostock et al. 2015). Interpreted in terms of a power-law distribution, the b-value is >5. But missed small events make even this large value only a lower bound. Binning by background tremor amplitude, and assuming a time-invariant M-f distribution, the b-value increases to >7, implying (e.g.) more than 10 million M>1.2 events for every M=2.2 event. Such numbers are inconsistent with the observed modest increase in tremor amplitude with LFE magnitude, as well as with geodetically allowable slips. Similar considerations apply to exponential and log-normal moment-frequency distributions. Our preliminary interpretation is that when LFE magnitudes are large, the same portion of the fault is producing larger LFEs, rather than a greater rate of LFEs pulled from the same distribution. If correct, this distinguishes LFEs from repeating earthquakes, where larger background fault slip rates lead not to larger earthquakes but to more frequent earthquakes of similar magnitude.
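
The headline ratio in the abstract above follows directly from a Gutenberg-Richter power law, log10 N(>M) = a - b*M: a one-unit magnitude difference at b = 7 implies a 10^7-fold difference in event counts. A minimal sketch (the function name is ours; the values are those quoted in the abstract):

```python
def count_ratio(b_value, m_small, m_large):
    """Ratio of events above m_small to events above m_large under a
    power-law (Gutenberg-Richter) magnitude-frequency distribution."""
    return 10 ** (b_value * (m_large - m_small))

# b >= 7 with magnitudes 1.2 and 2.2 gives about 1e7: more than
# ten million M>1.2 events for every M=2.2 event, as stated above.
ratio = count_ratio(7, 1.2, 2.2)
```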

  20. BWR ASSEMBLY SOURCE TERMS FOR WASTE PACKAGE DESIGN

    SciTech Connect

    T.L. Lotz

    1997-02-15

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) to provide boiling water reactor (BWR) assembly radiation source term data for use during Waste Package (WP) design. The BWR assembly radiation source terms are to be used for evaluation of radiolysis effects at the WP surface, and for personnel shielding requirements during assembly or WP handling operations. The objectives of this evaluation are to generate BWR assembly radiation source terms that bound selected groupings of BWR assemblies, with regard to assembly average burnup and cooling time, which comprise the anticipated MGDS BWR commercial spent nuclear fuel (SNF) waste stream. The source term data is to be provided in a form which can easily be utilized in subsequent shielding/radiation dose calculations. Since these calculations may also be used for Total System Performance Assessment (TSPA), with appropriate justification provided by TSPA, or radionuclide release rate analysis, the grams of each element and additional cooling times out to 25 years will also be calculated and the data included in the output files.

  1. Disposal Unit Source Term (DUST) data input guide

    SciTech Connect

    Sullivan, T.M.

    1993-05-01

    Performance assessment of a low-level waste (LLW) disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). The computer code DUST (Disposal Unit Source Term) has been developed to model these processes. This document presents the models used to calculate release from a disposal facility, verification of the model, and instructions on the use of the DUST code. In addition to DUST, two support codes have been written: DUSTIN, a preprocessor that helps the user create DUST input decks, and GRAFXT, a post-processor that plots selected output files on the computer terminal. Use of these codes is also described.
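
As an illustration of the kind of coupled processes the abstract describes (a generic first-order sketch, not the actual DUST formulation), the inventory can be treated as depleted by radioactive decay and wasteform leaching, with the release rate proportional to what remains:

```python
import math

# Hypothetical first-order source-term sketch: inventory Q is depleted by
# radioactive decay (lambda_d) and wasteform leaching (lambda_l), so
# Q(t) = Q0 * exp(-(lambda_d + lambda_l) * t), and the release rate to the
# facility is R(t) = lambda_l * Q(t). All names and rates are illustrative.
def release_rate(q0, lambda_d, lambda_l, t):
    return lambda_l * q0 * math.exp(-(lambda_d + lambda_l) * t)
```

Real codes such as DUST layer container-degradation delays and radionuclide transport on top of a release law of this kind.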

  2. Contamination on LDEF: Sources, distribution, and history

    NASA Technical Reports Server (NTRS)

    Pippin, Gary; Crutcher, Russ

    1993-01-01

    An introduction to contamination effects observed on the Long Duration Exposure Facility (LDEF) is presented. The activities reported are part of Boeing's obligation to the LDEF Materials Special Investigation Group. The contamination films and particles had minimal influence on the thermal performance of the LDEF. Some specific areas did have large changes in optical properties. Films also interfered with recession rate determination by reacting with atomic oxygen or physically shielding underlying material. Generally, contaminant films lessen the measured recession rate relative to 'clean' surfaces. On-orbit generation of particles may be an issue for sensitive optics. Deposition on lenses may lead to artifacts on photographic images or cause sensors to respond inappropriately. Particles in the line of sight of sensors can cause stray light to be scattered into them. Particles also represent a hazard for mechanisms in that they can physically block and/or increase friction or wear on moving surfaces. LDEF carried a rather complex mixture of samples and support hardware into orbit. The experiments were assembled under a variety of conditions and time constraints and stored for up to five years before launch. The structure itself was so large that it could not be baked after the interior was painted with Chemglaze Z-306 polyurethane-based black paint. Any analysis of the effects of molecular and particulate contamination must account for a complex array of sources, wide variation in processes over time, and extreme variation in environment from ground to launch to flight. Surface conditions at certain locations on LDEF were established by outgassing of molecular species from particular materials onto adjacent surfaces, followed by alteration of those species due to exposure to atomic oxygen and/or solar radiation.

  3. IMPACTS OF SOURCE TERM HETEROGENEITIES ON WATER PATHWAY DOSE.

    SciTech Connect

    SULLIVAN, T.; GUSKOV, A.; POSKAS, P.; RUPERTI, N.; HANUSIK, V.; ET AL.

    2004-09-15

    and for which a solution has to be found in terms of long-term disposal. Together with their casing and packaging, they are one form of heterogeneous waste; many other forms of waste with heterogeneous properties exist. They may arise in very small quantities and with very specific characteristics in the case of small producers, or in larger streams with standard characteristics in others. This wide variety of waste induces three main levels of waste heterogeneity: (1) hot spots (e.g. disused sealed sources); (2) large items inside a package (e.g. metal components); and (3) very large items to be disposed of directly in the disposal unit (e.g. irradiated pipes, vessels). Safety assessments generally assume a certain level of waste homogeneity in most of the existing or proposed disposal facilities. There is a need to evaluate the appropriateness of such an assumption and its influence on the results of safety assessment. This need is especially acute in the case of sealed sources. There are many cases where storage conditions are poor or management is improper, leading to radiological accidents, some with significant or detrimental impacts. Disposal in a near-surface disposal facility has been used in the past for some disused sealed sources. This option is currently in use for other sealed sources, or is being studied for the rest. The regulatory framework differs greatly between countries. In some countries, large quantities of disused sealed sources have been disposed of without any restriction; in others their disposal is forbidden by law. In any case, evaluation of the acceptability of disposal of disused sealed sources in a near-surface disposal facility is of utmost importance.

  4. Source term parameterization of unresolved obstacles in wave modelling

    NASA Astrophysics Data System (ADS)

    Mentaschi, Lorenzo; Pérez, Jorge; Besio, Giovanni; Mendez, Fernando; Menendez, Melisa

    2015-04-01

    In the present work we introduce two source terms for the parameterization of energy dissipation due to unresolved obstacles in spectral wave models. The proposed approach differs from the classical one based on spatial propagation schemes because it provides a local representation of phenomena such as unresolved wave energy dissipation. This source term-based approach has the advantage of decoupling the parameterization of unresolved obstacles from the spatial propagation scheme. Furthermore, it opens the way to parameterizations of other unresolved sheltering effects such as rotation and frequency shift of spectral components. Energy dissipation due to unresolved obstacles is modeled locally through a Local Dissipation (LD) source term, in order to reproduce the correct average energy in a low-resolution obstructed cell. Furthermore, a Shadow Effect (SE) source term has been introduced to model the correct energy flux toward downstream cells. The LD-SE source term scheme aims to reproduce, on a low-resolution grid, the average conditions modeled by a high-resolution model able to resolve the obstacles exactly. The LD and SE source terms are expressed as functions of obstructed-cell transparency coefficients relative to different spectral components. An interesting finding is that an overall transparency coefficient α for each cell/spectral component is not enough to model the average conditions adequately. A further coefficient β is needed to take into account the layout of the obstacles inside the cell. This coefficient is given by the average transparency of sections starting from the upstream side of the obstructed cell. The one-dimensional LD and SE source terms are given by ∂F/∂t|_LD = -[(2 - α_l)/β_l] (c_g/Δx) F and ∂F/∂t|_SE = -(1 - β_u/α_u) (c_g/Δx) F, where the "l" and "u" subscripts indicate that the α and β coefficients refer to the local and upstream cells, respectively. Validation of the source terms has been carried
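
The two source terms quoted in the abstract can be transcribed as functions (our reading of the garbled notation, so treat the exact coefficient placement as an assumption):

```python
def ld_source(F, c_g, dx, alpha_l, beta_l):
    """Local Dissipation term, dF/dt|_LD = -((2 - alpha_l)/beta_l) * (c_g/dx) * F."""
    return -((2.0 - alpha_l) / beta_l) * (c_g / dx) * F

def se_source(F, c_g, dx, alpha_u, beta_u):
    """Shadow Effect term, dF/dt|_SE = -(1 - beta_u/alpha_u) * (c_g/dx) * F."""
    return -(1.0 - beta_u / alpha_u) * (c_g / dx) * F
```

Note that the SE term vanishes when the upstream cell's β equals its α, i.e. when the obstacle layout adds nothing beyond the overall transparency.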

  5. Initial investigations of SNS target facility accident source terms

    SciTech Connect

    Harrington, R.M.; Devore, J.R.; Beahm, E.C.; Weber, C.F.; Johnson, J.O.

    1998-07-01

    The Spallation Neutron Source (SNS) is a Department of Energy, accelerator-based neutron source proposed for construction at the Oak Ridge National Laboratory. The project is currently nearing the end of the conceptual design stage. The objective of the target facility is to provide beams of pulsed thermal and sub-thermal neutrons for research purposes. The neutrons are created by the action of highly energetic protons (approximately 1 GeV) on a mercury target. The proton beam power will be 1 MW with planned upgrades to 2 MW and, eventually, to 4 MW. Over the course of facility life, significant inventories of spallation and activation products will build up in the target mercury. Accordingly, the facility is being designed to prevent or minimize potential environmental source terms. The results of calculations of the SNS target mercury radionuclide inventories and the characteristics of the dominant radionuclides are presented. The effect of the activation/spallation product chemical and physical characteristics on dispersability is discussed. Energy sources that could drive potential releases, credible initiating events and facility preventive and mitigative features are described. The source term for the limiting extremely unlikely mercury spill accident scenario is presented. These results support the conclusion that the facility has a low hazard profile with regard to the accidental release of radioactive material.

  6. Distributed joint source-channel coding in wireless sensor networks.

    PubMed

    Zhu, Xuqi; Liu, Yu; Zhang, Lin

    2009-01-01

    Given that sensors in wireless sensor networks are energy-limited and wireless channel conditions are harsh, there is an urgent need for a low-complexity coding method with a high compression ratio and noise-resistant features. This paper reviews the progress made in distributed joint source-channel coding, which can address this issue. The main existing deployments, from theory to practice, of distributed joint source-channel coding over independent channels, multiple access channels and broadcast channels are introduced, respectively. To this end, we also present a practical scheme for compressing multiple correlated sources over independent channels. The simulation results demonstrate the desired efficiency.

  7. PST - a new method for estimating PSA source terms

    SciTech Connect

    1996-12-31

    The Parametric Source Term (PST) code has been developed for estimating radioactivity release fractions. The PST code is a framework of equations based on activity transport between volumes in the release pathway from the core, through the vessel, through the containment, and to the environment. The code is fast-running because it obtains exact solutions to differential equations for activity transport in each volume for each time interval. It has successfully been applied to estimate source terms for the six Pressurized Water Reactors (PWRs) that were selected for initial consideration in the Accident Sequence Precursor (ASP) Level 2 model development effort. This paper describes the PST code and the manner in which it has been applied to estimate radioactivity release fractions for the six PWRs initially considered in the ASP Program.
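
The abstract's claim of "exact solutions to differential equations for activity transport in each volume for each time interval" can be illustrated with the simplest such compartment: constant inflow S and first-order removal k over one interval (a generic sketch under our own assumptions, not the PST code's actual equations):

```python
import math

# For dA/dt = S - k*A over one interval, the exact solution is
# A(t) = S/k + (A0 - S/k) * exp(-k*t); with k == 0 it reduces to A0 + S*t.
def activity_after(a0, source, k, dt):
    if k == 0.0:
        return a0 + source * dt
    eq = source / k  # equilibrium activity for this volume
    return eq + (a0 - eq) * math.exp(-k * dt)
```

Stepping interval by interval with a closed form like this avoids numerical integration entirely, which is one way such a code can be fast-running.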

  8. A nuclear source term analysis for spacecraft power systems

    SciTech Connect

    McCulloch, W.H.

    1998-12-01

    All US space missions involving on board nuclear material must be approved by the Office of the President. To be approved the mission and the hardware systems must undergo evaluations of the associated nuclear health and safety risk. One part of these evaluations is the characterization of the source terms, i.e., the estimate of the amount, physical form, and location of nuclear material, which might be released into the environment in the event of credible accidents. This paper presents a brief overview of the source term analysis by the Interagency Nuclear Safety Review Panel for the NASA Cassini Space Mission launched in October 1997. Included is a description of the Energy Interaction Model, an innovative approach to the analysis of potential releases from high velocity impacts resulting from launch aborts and reentries.

  9. Basic repository source term and data sheet report: Lavender Canyon

    SciTech Connect

    Not Available

    1988-01-01

    This report is one of a series describing studies undertaken in support of the US Department of Energy Civilian Radioactive Waste Management (CRWM) Program. This study contains the derivation of values for environmental source terms and resources consumed for a CRWM repository. Estimates include heavy construction equipment; support equipment; shaft-sinking equipment; transportation equipment; and consumption of fuel, water, electricity, and natural gas. Data are presented for construction and operation at an assumed site in Lavender Canyon, Utah. 3 refs; 6 tabs.

  10. Actinide Source Term Program, position paper. Revision 1

    SciTech Connect

    Novak, C.F.; Papenguth, H.W.; Crafts, C.C.; Dhooge, N.J.

    1994-11-15

    The Actinide Source Term represents the quantity of actinides that could be mobilized within WIPP brines and could migrate with the brines away from the disposal room vicinity. This document presents the various proposed methods for estimating this source term, with a particular focus on defining these methods and evaluating the defensibility of the models for mobile actinide concentrations. The conclusions reached in this document are: the 92 PA "expert panel" model for mobile actinide concentrations is not defensible; and, although it is extremely conservative, the "inventory limits" model is the only existing defensible model for the actinide source term. The model effort in progress, "chemical modeling of mobile actinide concentrations", supported by a laboratory effort that is also in progress, is designed to provide a reasonable description of the system and be scientifically realistic and supplant the "inventory limits" model.

  11. Paradigms and commonalities in atmospheric source term estimation methods

    NASA Astrophysics Data System (ADS)

    Bieringer, Paul E.; Young, George S.; Rodriguez, Luna M.; Annunzio, Andrew J.; Vandenberghe, Francois; Haupt, Sue Ellen

    2017-05-01

    Modeling the downwind hazard area resulting from the unknown release of an atmospheric contaminant requires estimating the source characteristics of a localized source from concentration or dosage observations and using this information to model the subsequent transport and dispersion of the contaminant. This source term estimation (STE) problem is mathematically challenging because airborne material concentration observations and wind data are typically sparse and the turbulent wind field is chaotic. Methods for addressing this problem fall into three general categories: forward modeling, inverse modeling, and nonlinear optimization. Because numerous methods have been developed on various foundations, they often have a disparate nomenclature. This situation poses challenges to those facing a new STE problem, particularly when selecting the best method for the problem at hand. There is, however, much commonality between many of these methods, especially within each category. Here we seek to address the difficulties encountered in selecting an STE method by providing a synthesis of the various methods that highlights commonalities, potential opportunities for component exchange, and lessons learned that can be applied across methods.

  12. Alternative estimate of source distribution in microbial source tracking using posterior probabilities.

    PubMed

    Greenberg, Joshua; Price, Bertram; Ware, Adam

    2010-04-01

    Microbial source tracking (MST) is a procedure used to determine the relative contributions of humans and animals to fecal microbial contamination of surface waters in a given watershed. Studies of MST methodology have focused on optimizing sampling, laboratory, and statistical analysis methods in order to improve the reliability of determining which sources contributed most to surface water fecal contamination. The usual approach for estimating a source distribution of microbial contamination is to classify water sample microbial isolates into discrete source categories and calculate the proportion of these isolates in each source category. The set of proportions is an estimate of the contaminant source distribution. In this paper we propose and compare an alternative method for estimating a source distribution: averaging posterior probabilities of source identity across isolates. We conducted a Monte Carlo simulation covering a wide variety of watershed scenarios to compare the two methods. The results show that averaging source posterior probabilities across isolates leads to more accurate source distribution estimates than the proportions that follow classification.
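
The difference between the two estimators is easy to see on a toy example (all numbers below are invented for illustration): hard classification discards each isolate's uncertainty, while averaging posteriors retains it.

```python
# Each row: one isolate's posterior probability over two source categories,
# e.g. (human, animal). The values are hypothetical.
posteriors = [
    (0.9, 0.1),
    (0.6, 0.4),
    (0.4, 0.6),
    (0.2, 0.8),
]
n = len(posteriors)

# Usual approach: classify each isolate into its most probable category,
# then report the proportion of isolates in each category.
counts = [0, 0]
for p in posteriors:
    counts[p.index(max(p))] += 1
hard_estimate = [c / n for c in counts]  # [0.5, 0.5]

# Proposed alternative: average posterior probabilities across isolates.
soft_estimate = [sum(col) / n for col in zip(*posteriors)]  # about [0.525, 0.475]
```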

  13. The distribution of Infrared point sources in nearby elliptical galaxies

    NASA Astrophysics Data System (ADS)

    Gogoi, Rupjyoti; Misra, Ranjeev; Puthiyaveettil, Shalima

    Infrared point sources in nearby early-type galaxies are often counterparts of sources in other wavebands such as optical and X-rays. In particular, the IR counterpart of an X-ray source may be a globular cluster hosting the X-ray source, or may be associated directly with the binary, providing crucial information regarding its environment. In general, the IR sources would be from globular clusters, and their IR colors would provide insight into their stellar composition. However, many of the IR sources may be background objects, and it is important to identify them or at least quantify the level of background contamination. Archival Spitzer IRAC images provide a unique opportunity to study these sources in nearby ellipticals and, in particular, to estimate the distributions of their IR luminosity, color and distance from the center. We will present the results of such an analysis for three nearby galaxies. We have also estimated the background contamination using several blank fields. Our preliminary results suggest that IR colors can be effectively used to differentiate between background sources and sources in the galaxy, and that the distribution of sources is markedly different for different elliptical galaxies.

  14. Short and long term representation of an unfamiliar tone distribution

    PubMed Central

    Diercks, Charlette; Troje, Nikolaus F.; Cuddy, Lola L.

    2016-01-01

    We report on a study conducted to extend our knowledge about the process of gaining a mental representation of music. Several studies, inspired by research on the statistical learning of language, have investigated statistical learning of sequential rules underlying tone sequences. Given that the mental representation of music correlates with distributional properties of music, we tested whether participants are able to abstract distributional information contained in tone sequences to form a mental representation. For this purpose, we created an unfamiliar music genre defined by an underlying tone distribution, to which 40 participants were exposed. Our stimuli allowed us to differentiate between sensitivity to the distributional properties contained in test stimuli and long term representation of the distributional properties of the music genre overall. Using a probe tone paradigm and a two-alternative forced choice discrimination task, we show that listeners are able to abstract distributional properties of music through mere exposure into a long term representation of music. This lends support to the idea that statistical learning is involved in the process of gaining musical knowledge. PMID:27635355

  15. Distributed Encoding Algorithm for Source Localization in Sensor Networks

    NASA Astrophysics Data System (ADS)

    Kim, YoonHak; Ortega, Antonio

    2010-12-01

    We consider sensor-based distributed source localization applications, where sensors transmit quantized data to a fusion node, which then produces an estimate of the source location. For this application, the goal is to minimize the amount of information that the sensor nodes have to exchange in order to attain a certain source localization accuracy. We propose a distributed encoding algorithm that is applied after quantization and achieves significant rate savings by merging quantization bins. The bin-merging technique exploits the fact that certain combinations of quantization bins at each node cannot occur because the corresponding spatial regions have an empty intersection. We apply the algorithm to a system where an acoustic amplitude sensor model is employed at each node for source localization. Our experiments demonstrate significant rate savings (e.g., over 30% with 5 nodes and 4 bits per node) when our novel bin-merging algorithms are used.
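
The bin-merging idea can be sketched for two nodes (a toy construction with invented data, not the paper's actual algorithm): two bins at node A may share one index whenever no bin at node B is jointly feasible with both, since the fusion node can then disambiguate them from B's index.

```python
# `feasible` holds the (bin_A, bin_B) pairs whose spatial regions intersect;
# all other combinations cannot occur. Values are purely illustrative.
feasible = {(0, 0), (1, 1), (2, 0), (3, 1)}

def partners(bin_a, feasible_pairs):
    """Bins at node B that can co-occur with bin_a at node A."""
    return {b for (a, b) in feasible_pairs if a == bin_a}

def mergeable(a1, a2, feasible_pairs):
    """Two A-bins can share a code if their B-partner sets never overlap."""
    return partners(a1, feasible_pairs).isdisjoint(partners(a2, feasible_pairs))

# Bins 0 and 1 never co-occur with the same B-bin, so they can be merged,
# reducing the number of indices node A must signal.
```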

  16. Distributed Sensing for Quickest Change Detection of Point Radiation Sources

    DTIC Science & Technology

    2017-02-01

    source detection problem is formulated where sensor observations are correlated with non-identical distributions. We first derive a centralized detection algorithm that is asymptotically optimal for vanishing false alarm rate. Then we analyze the performance loss, as measured by the detection latency... correlated through the source location and intensity and therefore these results are not directly applicable. Recently, Qian et al. [7] considered

  17. Continuous-variable quantum key distribution with Gaussian source noise

    SciTech Connect

    Shen Yujie; Peng Xiang; Yang Jian; Guo Hong

    2011-05-15

    Source noise affects the security of continuous-variable quantum key distribution (CV QKD) and is difficult to analyze. We propose a model to characterize Gaussian source noise through introducing a neutral party (Fred) who induces the noise with a general unitary transformation. Without knowing Fred's exact state, we derive the security bounds for both reverse and direct reconciliations and show that the bound for reverse reconciliation is tight.

  18. Methodology for a bounding estimate of activation source-term.

    PubMed

    Culp, Todd

    2013-02-01

    Sandia National Laboratories' Z-Machine is the world's most powerful electrical device, and experiments have been conducted that make it the world's most powerful radiation source. Because Z-Machine is used for research, an assortment of materials can be placed into the machine; these materials can be subjected to a range of nuclear reactions, producing an assortment of activation products. A methodology was developed to provide a systematic approach to evaluate different materials to be introduced into the machine as wire arrays. This methodology is based on experiment specific characteristics, physical characteristics of specific radionuclides, and experience with Z-Machine. This provides a starting point for bounding calculations of radionuclide source-term that can be used for work planning, development of work controls, and evaluating materials for introduction into the machine.

  19. Trace Metal Source Terms in Carbon Sequestration Environments

    SciTech Connect

    Karamalidis, Athanasios K; Torres, Sharon G; Hakala, J Alexandra; Shao, Hongbo; Cantrell, Kirk J; Carroll, Susan

    2012-02-05

    Carbon dioxide sequestration in deep saline and depleted oil geologic formations is feasible and promising, however, possible CO₂ or CO₂-saturated brine leakage to overlying aquifers may pose environmental and health impacts. The purpose of this study was to experimentally define trace metal source terms from the reaction of supercritical CO₂, storage reservoir brines, reservoir and cap rocks. Storage reservoir source terms for trace metals are needed to evaluate the impact of brines leaking into overlying drinking water aquifers. The trace metal release was measured from sandstones, shales, carbonates, evaporites, basalts and cements from the Frio, In Salah, Illinois Basin – Decatur, Lower Tuscaloosa, Weyburn-Midale, Bass Islands and Grand Ronde carbon sequestration geologic formations. Trace metal dissolution is tracked by measuring solution concentrations over time under conditions (e.g. pressures, temperatures, and initial brine compositions) specific to the sequestration projects. Existing metrics for Maximum Contaminant Levels (MCLs) for drinking water as defined by the U.S. Environmental Protection Agency (U.S. EPA) were used to categorize the relative significance of metal concentration changes in storage environments due to the presence of CO₂. Results indicate that Cr and Pb released from sandstone reservoir and shale cap rock exceed the MCLs by an order of magnitude while Cd and Cu were at or below drinking water thresholds. In carbonate reservoirs As exceeds the MCLs by an order of magnitude, while Cd, Cu, and Pb were at or below drinking water standards. Results from this study can be used as a reasonable estimate of the reservoir and caprock source term to further evaluate the impact of leakage on groundwater quality.

  20. Trace Metal Source Terms in Carbon Sequestration Environments

    SciTech Connect

    Karamalidis, Athanasios; Torres, Sharon G.; Hakala, Jacqueline A.; Shao, Hongbo; Cantrell, Kirk J.; Carroll, Susan A.

    2013-01-01

    Carbon dioxide sequestration in deep saline and depleted oil geologic formations is feasible and promising; however, possible CO₂ or CO₂-saturated brine leakage to overlying aquifers may pose environmental and health impacts. The purpose of this study was to experimentally define a range of concentrations that can be used as the trace element source term for reservoirs and leakage pathways in risk simulations. Storage source terms for trace metals are needed to evaluate the impact of brines leaking into overlying drinking water aquifers. The trace metal release was measured from cements and sandstones, shales, carbonates, evaporites, and basalts from the Frio, In Salah, Illinois Basin, Decatur, Lower Tuscaloosa, Weyburn-Midale, Bass Islands, and Grand Ronde carbon sequestration geologic formations. Trace metal dissolution was tracked by measuring solution concentrations over time under conditions (e.g., pressures, temperatures, and initial brine compositions) specific to the sequestration projects. Existing metrics for maximum contaminant levels (MCLs) for drinking water as defined by the U.S. Environmental Protection Agency (U.S. EPA) were used to categorize the relative significance of metal concentration changes in storage environments because of the presence of CO₂. Results indicate that Cr and Pb released from sandstone reservoir and shale cap rocks exceed the MCLs by an order of magnitude, while Cd and Cu were at or below drinking water thresholds. In carbonate reservoirs As exceeds the MCLs by an order of magnitude, while Cd, Cu, and Pb were at or below drinking water standards. Results from this study can be used as a reasonable estimate of the trace element source term for reservoirs and leakage pathways in risk simulations to further evaluate the impact of leakage on groundwater quality.

  1. Trace metal source terms in carbon sequestration environments.

    PubMed

    Karamalidis, Athanasios K; Torres, Sharon G; Hakala, J Alexandra; Shao, Hongbo; Cantrell, Kirk J; Carroll, Susan

    2013-01-02

    Carbon dioxide sequestration in deep saline and depleted oil geologic formations is feasible and promising; however, possible CO₂ or CO₂-saturated brine leakage to overlying aquifers may pose environmental and health impacts. The purpose of this study was to experimentally define a range of concentrations that can be used as the trace element source term for reservoirs and leakage pathways in risk simulations. Storage source terms for trace metals are needed to evaluate the impact of brines leaking into overlying drinking water aquifers. The trace metal release was measured from cements and sandstones, shales, carbonates, evaporites, and basalts from the Frio, In Salah, Illinois Basin, Decatur, Lower Tuscaloosa, Weyburn-Midale, Bass Islands, and Grand Ronde carbon sequestration geologic formations. Trace metal dissolution was tracked by measuring solution concentrations over time under conditions (e.g., pressures, temperatures, and initial brine compositions) specific to the sequestration projects. Existing metrics for maximum contaminant levels (MCLs) for drinking water as defined by the U.S. Environmental Protection Agency (U.S. EPA) were used to categorize the relative significance of metal concentration changes in storage environments because of the presence of CO₂. Results indicate that Cr and Pb released from sandstone reservoir and shale cap rocks exceed the MCLs by an order of magnitude, while Cd and Cu were at or below drinking water thresholds. In carbonate reservoirs As exceeds the MCLs by an order of magnitude, while Cd, Cu, and Pb were at or below drinking water standards. Results from this study can be used as a reasonable estimate of the trace element source term for reservoirs and leakage pathways in risk simulations to further evaluate the impact of leakage on groundwater quality.

  2. Fourth order wave equations with nonlinear strain and source terms

    NASA Astrophysics Data System (ADS)

    Liu, Yacheng; Xu, Runzhang

    2007-07-01

    In this paper we study the initial boundary value problem for fourth order wave equations with nonlinear strain and source terms. First we introduce a family of potential wells and prove the invariance of some sets and vacuum isolating of solutions. Then we obtain a threshold result of global existence and nonexistence. Finally we discuss the global existence of solutions for the problem with the critical initial conditions I(u₀) ≥ 0, E(0) = d. Thus Esquivel-Avila's results are generalized and improved.

  3. Development of alternate methods of determining integrated SMR source terms

    SciTech Connect

    Barry, Kenneth

    2014-06-10

    The Nuclear Energy Institute (NEI) Small Modular Reactor (SMR) Licensing Task Force (TF) has been evaluating licensing issues unique and important to iPWRs, ranking these issues, and developing NEI position papers for submittal to the U.S. Nuclear Regulatory Commission (NRC) during the past three years. Papers have been developed and submitted to the NRC in a range of areas, including the Price-Anderson Act, NRC annual fees, security, modularity, and staffing. In December 2012, NEI completed a draft position paper on SMR source terms and participated in an NRC public meeting presenting a summary of this paper, which was subsequently submitted to the NRC. One important conclusion of the source term paper was the identification of high-importance areas where additional research would have a significant impact on source terms. The highest-ranked research area was iPWR containment aerosol natural deposition. The NRC accepts the use of existing aerosol deposition correlations in Regulatory Guide 1.183, but these were developed for large light water reactor (LWR) containments. Application of these correlations to an iPWR design has resulted in greater than a ten-fold reduction of containment airborne aerosol inventory as compared to large LWRs. Development and experimental justification of containment aerosol natural deposition correlations specifically for the unique iPWR containments is expected to result in a large reduction of design-basis and beyond-design-basis accident source terms, with concomitantly smaller doses to workers and the public. Therefore, NRC acceptance of iPWR containment aerosol natural deposition correlations will directly support the industry's goal of reducing the Emergency Planning Zone (EPZ) for SMRs. Based on the results in this work, it is clear that thermophoresis is relatively unimportant for iPWRs. Gravitational settling is well understood, and may be the dominant process for a dry environment. Diffusiophoresis and enhanced

  4. Characteristics of releases from TREAT source term experiment STEP-3

    SciTech Connect

    Fink, J.K.; Schlenger, B.J.; Baker, L. Jr.; Ritzman, R.L.

    1987-01-01

    Four in-pile experiments designed to characterize the radiological source term associated with postulated severe light water reactor accidents were performed at the Transient Reactor Test Facility. STEP-3 simulated a high-pressure TMLB' pressurized water reactor accident sequence that includes the extended loss of all ac power and leads to the loss of long-term decay heat removal. In STEP-3, four fuel elements from the Belgonucleaire BR3 reactor were subjected to temperatures and pressures approaching those of a TMLB' accident. A description of the experiment and thermal-hydraulic analysis is reported elsewhere. The aerosols released into the flow stream were collected on coupons, settling plates, and wire impactors. Examination of the collected aerosol deposits was performed using scanning electron microscopy, electron microprobe microanalysis, and secondary ion mass spectroscopy (SIMS) to provide information about the chemical composition and morphology of the release. This paper describes the aerosol deposits and elemental composition of the release.

  5. Method for image reconstruction of moving radionuclide source distribution

    DOEpatents

    Stolin, Alexander V.; McKisson, John E.; Lee, Seung Joon; Smith, Mark Frederick

    2012-12-18

    A method for image reconstruction of moving radionuclide distributions. Its particular embodiment is for single photon emission computed tomography (SPECT) imaging of awake animals, though its techniques are general enough to be applied to other moving radionuclide distributions as well. The invention eliminates motion and blurring artifacts for image reconstructions of moving source distributions. This opens new avenues in the area of small animal brain imaging with radiotracers, which can now be performed without the perturbing influences of anesthesia or physical restraint on the biological system.

  6. Depositional controls, distribution, and effectiveness of world's petroleum source rocks

    SciTech Connect

    Klemme, H.D.; Ulmishek, G.F.

    1989-03-01

    Six stratigraphic intervals representing one-third of Phanerozoic time contain source rocks that have provided more than 90% of the world's discovered oil and gas reserves (in barrels of oil equivalent). The six intervals include (1) Silurian (generated 9% of the world's reserves); (2) Upper Devonian-Tournaisian (8% of reserves); (3) Pennsylvanian-Lower Permian (8% of reserves); (4) Upper Jurassic (25% of reserves); (5) middle Cretaceous (29% of reserves); and (6) Oligocene-Miocene (12.5% of reserves). This uneven distribution of source rocks in time has no immediately obvious cyclicity, nor are the intervals exactly repeatable in the commonality of factors that controlled the formation of source rocks. In this study, source rocks of the six intervals have been mapped worldwide together with oil and gas reserves generated by these rocks. Analysis of the maps shows that the main factors affecting deposition of these source rocks and their spatial distribution and effectiveness in generating hydrocarbon reserves are geologic age, global and regional tectonics, paleogeography, climate, and biologic evolution. The effect of each of the factors on geologic setting and quality of source rocks has been analyzed. Compilation of data on maturation time for these source rocks demonstrated that the majority of discovered oil and gas is very young, more than 80% of the world's oil and gas reserves have been generated since Aptian time, and nearly half of the world's hydrocarbons have been generated and trapped since the Oligocene.

  7. Indian Point 2 pilot program: NUREG-1465 source term

    SciTech Connect

    Jackson, C.W.

    1997-12-01

    NUREG-1465, "Accident Source Terms for Light-Water Nuclear Power Plants," provides a postulated fission product source term that is based on current understanding of light water reactor (LWR) accidents and fission product behavior. Reference 1 is applicable to LWR designs and is intended to form the basis for the development of regulatory guidance. Following publication of NUREG-1465 in early 1995, the U.S. Nuclear Regulatory Commission (NRC) approached the nuclear industry for recommendations on the future use of the NUREG for application to existing LWRs. The result was the formation of an industry group through the Nuclear Energy Institute and the selection of several nuclear power plants to serve as pilot plants for the analysis and proposed applications of the NUREG. Con Edison offered Indian Point unit 2 as a pilot plant. Calculations were performed by Westinghouse and submitted to the NRC to utilize the results in support of plant changes. At this time, the NRC review is ongoing.

  8. Carbon-14 Source Terms and Generation in Fusion Power Cores

    NASA Astrophysics Data System (ADS)

    Khripunov, V. I.; Kurbatov, D. K.; Subbotin, M. L.

    2008-12-01

    A consecutive study of the source terms of ¹⁴C, the major contributor to the external costs of fusion, and of its production rate was performed by system and neutron activation analysis. It shows that the specific ¹⁴C activity induced in the low-activation structural materials, coolants, and breeders suggested for future fusion power reactor cores depends significantly on the assumed nitrogen content. The determined range of specific ¹⁴C activity, ~2-20 TBq/GW(e)·a, induced in near-term water-cooled and gas-cooled designs and in advanced liquid lithium and lithium-lead self-cooled fusion power reactors is given in the paper, alongside the values for the natural ¹⁴C background and artificial ¹⁴C sources such as fission power reactors and nuclear tests. It is recommended to minimize the nitrogen content below 0.01 wt.% in the beryllium multipliers and in the structural materials, including SiC/SiC composites. With nitrogen so limited, ¹⁴C generation in fusion power blankets will have a negligible impact on environmental and waste disposal costs.

  9. Chernobyl source term, atmospheric dispersion, and dose estimation

    SciTech Connect

    Gudiksen, P.H.; Harvey, T.F.; Lange, R.

    1988-02-01

    The Chernobyl source term available for long-range transport was estimated by integration of radiological measurements with atmospheric dispersion modeling, and by reactor core radionuclide inventory estimation in conjunction with WASH-1400 release fractions associated with specific chemical groups. These analyses indicated that essentially all of the noble gases, 80% of the radioiodines, 40% of the radiocesium, 10% of the tellurium, and about 1% or less of the more refractory elements were released. Atmospheric dispersion modeling of the radioactive cloud over the Northern Hemisphere revealed that the cloud became segmented during the first day, with the lower section heading toward Scandinavia and the upper part heading in a southeasterly direction with subsequent transport across Asia to Japan, the North Pacific, and the west coast of North America. The inhalation doses due to direct cloud exposure were estimated to exceed 10 mGy near the Chernobyl area, to range between 0.1 and 0.001 mGy within most of Europe, and to be generally less than 0.00001 mGy within the US. The Chernobyl source term was several orders of magnitude greater than those associated with the Windscale and TMI reactor accidents, while the ¹³⁷Cs from the Chernobyl event is about 6% of that released by the US and USSR atmospheric nuclear weapon tests. 9 refs., 3 figs., 6 tabs.

  10. Chernobyl source term, atmospheric dispersion, and dose estimation

    SciTech Connect

    Gudiksen, P.H.; Harvey, T.F.; Lange, R.

    1989-11-01

    The Chernobyl source term available for long-range transport was estimated by integration of radiological measurements with atmospheric dispersion modeling and by reactor core radionuclide inventory estimation in conjunction with WASH-1400 release fractions associated with specific chemical groups. These analyses indicated that essentially all of the noble gases, 60% of the radioiodines, 40% of the radiocesium, 10% of the tellurium, and about 1% or less of the more refractory elements were released. Atmospheric dispersion modeling of the radioactive cloud over the Northern Hemisphere revealed that the cloud became segmented during the first day, with the lower section heading toward Scandinavia and the upper part heading in a southeasterly direction with subsequent transport across Asia to Japan, the North Pacific, and the west coast of North America. The inhalation doses due to direct cloud exposure were estimated to exceed 10 mGy near the Chernobyl area, to range between 0.1 and 0.001 mGy within most of Europe, and to be generally less than 0.00001 mGy within the United States. The Chernobyl source term was several orders of magnitude greater than those associated with the Windscale and TMI reactor accidents. However, the 137Cs from the Chernobyl event is about 6% of that released by the U.S. and U.S.S.R. atmospheric nuclear weapon tests, while the 131I and 90Sr released by the Chernobyl accident was only about 0.1% of that released by the weapon tests.

  11. Distributed Joint Source-Channel Coding in Wireless Sensor Networks

    PubMed Central

    Zhu, Xuqi; Liu, Yu; Zhang, Lin

    2009-01-01

    Given that sensors in wireless sensor networks are energy-limited and channel conditions are often poor, there is an urgent need for a low-complexity coding method with a high compression ratio and resistance to channel noise. This paper reviews the progress made in distributed joint source-channel coding, which can address this issue. The main existing deployments, from theory to practice, of distributed joint source-channel coding over independent channels, multiple access channels, and broadcast channels are introduced, respectively. To this end, we also present a practical scheme for compressing multiple correlated sources over independent channels. The simulation results demonstrate the desired efficiency. PMID:22408560
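
The core idea behind distributed coding of correlated sources can be illustrated with a toy Slepian-Wolf-style sketch (our own illustration, not the scheme proposed in the paper): a sensor compresses its 7-bit reading down to the 3-bit syndrome of a Hamming(7,4) code, and the decoder recovers all 7 bits exactly by exploiting a correlated reading from a second sensor as side information, assuming the two readings differ in at most one bit.

```python
import numpy as np

# Parity-check matrix of the Hamming(7,4) code: column i holds the binary
# representation of i+1, so a weight-1 error pattern maps to its own position.
H = np.array([[(i >> k) & 1 for i in range(1, 8)] for k in range(3)])

def encode(y):
    """Distributed encoder: transmit only the 3-bit syndrome of the 7-bit block."""
    return H @ y % 2

def decode(syndrome, x_side):
    """Joint decoder: recover y from its syndrome using the correlated side
    information x_side (assumed to differ from y in at most one bit)."""
    s = (H @ x_side + syndrome) % 2           # = H (x XOR y) = H e
    pos = int("".join(map(str, s[::-1])), 2)  # nonzero syndrome -> error position
    y = x_side.copy()
    if pos:
        y[pos - 1] ^= 1
    return y

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 7)              # reading at sensor B
x = y.copy()
x[rng.integers(7)] ^= 1                # correlated reading at sensor A

y_hat = decode(encode(y), x)
print((y_hat == y).all())              # True: 7 bits recovered from a 3-bit message
```

The compression (7 bits to 3 bits) comes entirely from the decoder-side correlation, which is the defining feature of distributed source coding.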

  12. Sound source localization using distributed elevated acoustic sensors

    NASA Astrophysics Data System (ADS)

    Di, Xiao; Wagstaff, Ronald A.; Anderson, John D.; Gilbert, Kenneth E.

    2009-05-01

    Detecting and localizing impulsive acoustic sources in the daytime using distributed elevated acoustic sensors with large baseline separations has distinct advantages over small ground-based arrays, for two main reasons: first, during the daytime, because of more direct and less encumbered propagation paths, signal levels are generally higher at altitude than near the ground; second, larger baselines provide improved localization accuracy. Results are reported from a distributed array of acoustic sensors deployed during an experiment near Bourges, France in June 2008. The distributed array consisted of microphones and GPS receivers attached to the tether lines of three widely separated aerostats. The sound sources were various impulsive devices. Results from the measurements are presented and discussed. Sources of localization error (GPS accuracy, propagation calculations, aerostat motion, etc.) are discussed, and possible ways to improve the localization accuracy are suggested.
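
Localization from widely separated microphones of this kind is commonly posed as a time-of-arrival least-squares problem. The sketch below is generic (the sensor positions, sound speed, and source location are illustrative assumptions, not values from the Bourges experiment): it jointly fits the source position and unknown emission time to the measured arrival times.

```python
import numpy as np
from scipy.optimize import least_squares

C = 343.0  # nominal speed of sound, m/s

# Illustrative geometry: elevated microphones on aerostat tether lines (meters).
mics = np.array([[   0.0,   0.0, 150.0],
                 [ 900.0,  80.0, 200.0],
                 [ 450.0, 800.0, 120.0],
                 [ 100.0, 600.0, 180.0]])
src_true = np.array([500.0, 300.0, 0.0])   # impulsive source on the ground
t0_true = 2.0                              # unknown emission time, s

# Noise-free arrival times at each microphone.
arrivals = t0_true + np.linalg.norm(mics - src_true, axis=1) / C

def residuals(p):
    """Predicted minus measured arrival times for candidate (x, y, z, t0)."""
    pos, t0 = p[:3], p[3]
    return t0 + np.linalg.norm(mics - pos, axis=1) / C - arrivals

# Solve for position and emission time jointly, starting from the array centroid.
p0 = np.concatenate([mics.mean(axis=0), [0.0]])
fit = least_squares(residuals, p0)
print(np.round(fit.x[:3], 1))   # recovered source position
```

With large vertical and horizontal baselines the four-unknown problem is well conditioned; adding timing noise to `arrivals` is a direct way to explore the error sources the abstract mentions.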

  13. Production, Distribution, and Applications of Californium-252 Neutron Sources

    SciTech Connect

    Balo, P.A.; Knauer, J.B.; Martin, R.C.

    1999-10-03

    The radioisotope ²⁵²Cf is routinely encapsulated into compact, portable, intense neutron sources with a 2.6-year half-life. A source the size of a person's little finger can emit up to 10¹¹ neutrons/s. Californium-252 is used commercially as a reliable, cost-effective neutron source for prompt gamma neutron activation analysis (PGNAA) of coal, cement, and minerals, as well as for detection and identification of explosives, land mines, and unexploded military ordnance. Other uses are neutron radiography, nuclear waste assays, reactor start-up sources, calibration standards, and cancer therapy. The inherent safety of source encapsulations is demonstrated by 30 years of experience and by U.S. Bureau of Mines tests of source survivability during explosions. The production and distribution center for the U.S. Department of Energy (DOE) Californium Program is the Radiochemical Engineering Development Center (REDC) at Oak Ridge National Laboratory (ORNL). DOE sells ²⁵²Cf to commercial

  14. Bayesian estimation of a source term of radiation release with approximately known nuclide ratios

    NASA Astrophysics Data System (ADS)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek

    2016-04-01

    We are concerned with estimation of a source term in the case of an accidental release from a known location, e.g. a power plant. Usually, the source term of an accidental release of radiation comprises a mixture of nuclides. The gamma dose rate measurements do not provide direct information on the source term composition. However, physical properties of the respective nuclides (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g. from a known reactor inventory. The proposed method is based on a linear inverse model where the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned and further regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to uncertainty in the ratios, the diagonal elements of the covariance matrix are considered unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following the Bayesian approach, we estimate all parameters of the model from the data, so that y, M, and the known ratios are the only inputs of the method. Since inference of the model is intractable, we follow the Variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on a simulated 6-hour power plant release in which 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with a method assuming unknown nuclide ratios is given to demonstrate the usefulness of the proposed approach.
    This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014, Source-Term Determination of Radionuclide Releases.
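
The ill-conditioned inversion of y = Mx with a positivity constraint can be illustrated with a much simpler stand-in for the authors' Variational Bayes algorithm: Tikhonov regularization combined with nonnegative least squares. The SRS matrix, source term, and noise level below are synthetic, chosen only to show the mechanics.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Synthetic stand-ins for the real inputs: an SRS matrix M
# (observations x release intervals) and a true source term x.
n_obs, n_src = 40, 12
M = rng.uniform(0.0, 1.0, size=(n_obs, n_src))
x_true = np.zeros(n_src)
x_true[3:6] = [5.0, 8.0, 2.0]                     # brief release episode
y = M @ x_true + rng.normal(0.0, 0.05, n_obs)     # noisy dose-rate observations

# Tikhonov-regularized nonnegative least squares:
#   min_x ||M x - y||^2 + alpha ||x||^2   subject to  x >= 0,
# solved by augmenting the system and calling NNLS.
alpha = 0.1
M_aug = np.vstack([M, np.sqrt(alpha) * np.eye(n_src)])
y_aug = np.concatenate([y, np.zeros(n_src)])
x_est, _ = nnls(M_aug, y_aug)

print(np.round(x_est, 2))   # estimate concentrated on the true release interval
```

In the paper's full method the single scalar alpha is replaced by a prior covariance built from the approximately known nuclide ratios, with its diagonal estimated from the data; the sketch keeps only the regularize-and-truncate structure.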

  15. Near term climate projections for invasive species distributions

    USGS Publications Warehouse

    Jarnevich, C.S.; Stohlgren, T.J.

    2009-01-01

    Climate change and invasive species pose important conservation issues separately, and should be examined together. We used existing long-term climate datasets for the US to project potential climate change into the future at a finer spatial and temporal resolution than the climate change scenarios generally available. These fine-scale projections, along with new species distribution modeling techniques to forecast the potential extent of invasive species, can provide useful information to aid conservation and invasive species management efforts. We created habitat suitability maps for Pueraria montana (kudzu) under current climatic conditions and potential average conditions up to 30 years in the future. We examined how the potential distribution of this species will be affected by changing climate, and the management implications associated with these changes. Our models indicated that with climate change P. montana may increase its distribution, particularly in the Northeast, and may decrease in other areas. © 2008 Springer Science+Business Media B.V.

  16. Long-Term Stability of the NIST Standard Ultrasonic Source.

    PubMed

    Fick, Steven E

    2008-01-01

    The National Institute of Standards and Technology (NIST) Standard Ultrasonic Source (SUS) is a system comprising a transducer capable of output power levels up to 1 W at multiple frequencies between 1 MHz and 30 MHz, and an electrical impedance-matching network that allows the system to be driven by a conventional 50 Ω rf (radio-frequency) source. It is designed to allow interlaboratory replication of ultrasonic power levels with high accuracy using inexpensive readily available ancillary equipment. The SUS was offered for sale for 14 years (1985 to 1999). Each system was furnished with data for the set of calibration points (combinations of power level and frequency) specified by the customer. Of the systems that had been ordered with some calibration points in common, three were returned more than once to NIST for recalibration. Another system retained at NIST has been recalibrated periodically since 1984. The collective data for these systems comprise 9 calibration points and 102 measurements spanning a 17 year interval ending in 2001, the last year NIST ultrasonic power measurement services were available to the public. These data have been analyzed to compare variations in output power with frequency, power level, and time elapsed since the first calibration. The results verify the claim, made in the instruction sheet furnished with every SUS, that "long-term drift, if any, in the calibration of NIST Standard Sources is insignificant compared to the uncertainties associated with a single measurement of ultrasonic power by any method available at NIST."

  17. Production, distribution and applications of californium-252 neutron sources.

    PubMed

    Martin, R C; Knauer, J B; Balo, P A

    2000-01-01

    The radioisotope ²⁵²Cf is routinely encapsulated into compact, portable, intense neutron sources with a 2.6-yr half-life. A source the size of a person's little finger can emit up to 10¹¹ neutrons s⁻¹. Californium-252 is used commercially as a reliable, cost-effective neutron source for prompt gamma neutron activation analysis (PGNAA) of coal, cement and minerals, as well as for detection and identification of explosives, land mines and unexploded military ordnance. Other uses are neutron radiography, nuclear waste assays, reactor start-up sources, calibration standards and cancer therapy. The inherent safety of source encapsulations is demonstrated by 30 yr of experience and by US Bureau of Mines tests of source survivability during explosions. The production and distribution center for the US Department of Energy (DOE) Californium Program is the Radiochemical Engineering Development Center (REDC) at Oak Ridge National Laboratory (ORNL). DOE sells ²⁵²Cf to commercial reencapsulators domestically and internationally. Sealed ²⁵²Cf sources are also available for loan to agencies and subcontractors of the US government and to universities for educational, research and medical applications. The REDC has established the Californium User Facility (CUF) for Neutron Science to make its large inventory of ²⁵²Cf sources available to researchers for irradiations inside uncontaminated hot cells. Experiments at the CUF include a land mine detection system, neutron damage testing of solid-state detectors, irradiation of human cancer cells for boron neutron capture therapy experiments and irradiation of rice to induce genetic mutations.

  18. Influence of the source distribution on the age distribution of galactic cosmic rays

    NASA Technical Reports Server (NTRS)

    Lerche, I.; Schlickeiser, R.

    1985-01-01

    The age distribution of galactic cosmic rays in the diffusion approximation is calculated. The influence of the scale height of the spatial source distribution on the mean age of particles arriving at the solar system is discussed. The broader the source distribution with respect to the galactic plane, the longer the mean age. This result provides a natural explanation for the shorter mean age of secondary cosmic rays compared to primary cosmic rays necessary for the understanding of the observed secondary/primary ratio.

  19. Preliminary investigation of processes that affect source term identification

    SciTech Connect

    Wickliff, D.S.; Solomon, D.K.; Farrow, N.D.

    1991-09-01

    Solid Waste Storage Area (SWSA) 5 is known to be a significant source of contaminants, especially tritium (³H), to the White Oak Creek (WOC) watershed. For example, Solomon et al. (1991) estimated the total ³H discharge in Melton Branch (most of which originates in SWSA 5) for the 1988 water year to be 1210 Ci. A critical issue for making decisions concerning remedial actions at SWSA 5 is knowing whether the annual contaminant discharge is increasing or decreasing. Because (1) the magnitude of the annual contaminant discharge is highly correlated to the amount of annual precipitation (Solomon et al., 1991) and (2) a significant lag may exist between the time of peak contaminant release from primary sources (i.e., waste trenches) and the time of peak discharge into streams, short-term stream monitoring by itself is not sufficient for predicting future contaminant discharges. In this study we use ³H to examine the link between contaminant release from primary waste sources and contaminant discharge into streams. By understanding and quantifying subsurface transport processes, realistic predictions of future contaminant discharge, along with an evaluation of the effectiveness of remedial action alternatives, will be possible. The objectives of this study are (1) to characterize the subsurface movement of contaminants (primarily ³H) with an emphasis on the effects of matrix diffusion; (2) to determine the relative strength of primary vs secondary sources; and (3) to establish a methodology capable of determining whether the ³H discharge from SWSA 5 to streams is increasing or decreasing.

  20. Effect of distributed heat source on low frequency thermoacoustic instabilities

    NASA Astrophysics Data System (ADS)

    Li, Lei; Yang, Lijun; Sun, Xiaofeng

    2013-06-01

    Thermoacoustic instabilities in the combustors of modern air-breathing engines, which occur as a result of unstable coupling between heat release fluctuations and acoustic perturbations, have become a topic of concern. A three-dimensional thermoacoustic model including a distributed non-uniform heat source and non-uniform flow is developed based on the domain decomposition spectral method. The importance of the distributed heat source for combustion instabilities of longitudinal modes is analyzed with the help of a simplified combustor geometry. The results show that the longitudinal distribution of the heat source has a crucial effect on instabilities. In addition, the effect of a circumferentially non-uniform heat source and non-uniform flow on longitudinal instabilities is also investigated. The influence of circumferential non-uniformity can become significant for the lowest-frequency instabilities; in particular, the oscillation frequency and growth rate are both evidently affected by temperature non-uniformity and time-delay non-uniformity.

  1. Chandra Bulge Latitude Survey (BLS): Overview and Source Distributions

    NASA Astrophysics Data System (ADS)

    Grindlay, Jonathan E.; Hong, J.; Van den Berg, M.; Servillat, M.; Zhao, P.

    2010-03-01

    The Chandra Bulge Latitude Survey (BLS) is a mosaic of some 36 Chandra pointings with ACIS-I covering approximately 3 degrees in latitude and 1 degree in longitude, centered on the Galactic Center. We proposed this survey (Chandra Cycle 7) to complement the Chandra survey of the Galactic Center region conducted by Wang et al. (2002), which covered a similar area but aligned in longitude along the Galactic plane. The combination of these two surveys gives the complete latitude vs. longitude distributions of the sources in the central Bulge of the Galaxy. We have completed the survey and present here the full mosaic image of the BLS to study the latitude distribution, in particular, of the Bulge sources. With integration times of 15 ksec per ACIS-I field, and with the full Chandra band used, the BLS reaches fainter limiting fluxes than the Wang survey. In combination with wide-field optical (CTIO-Mosaic) imaging and follow-up Hydra spectroscopy (see poster by Zhao et al.) and IR imaging (CTIO-ISPI) in JHK (see poster by van den Berg et al.), we have mapped the source distribution and content. In this talk we present the global results and comparisons with source populations in the central Bulge.

  2. Joint distributed source-channel coding for 3D videos

    NASA Astrophysics Data System (ADS)

    Palma, Veronica; Cancellaro, Michela; Neri, Alessandro

    2011-03-01

    This paper presents a distributed joint source-channel 3D video coding system. Our aim is the design of an efficient coding scheme for stereoscopic video communication over noisy channels that preserves the perceived visual quality while guaranteeing low computational complexity. The drawback of using stereo sequences is the increased amount of data to be transmitted. Several methods have been used in the literature for encoding stereoscopic video. A significantly different approach with respect to traditional video coding is Distributed Video Coding (DVC), which introduces a flexible architecture built around low-complexity video encoders. In this paper we propose a novel method for joint source-channel coding in a distributed approach. We choose turbo codes for our application and study the new setting of distributed joint source-channel coding of a video. Turbo codes allow sending the minimum amount of data while guaranteeing near-channel-capacity error-correcting performance. In this contribution, the mathematical framework is fully detailed, and the trade-offs among redundancy, perceived quality, and quality of experience are analyzed with the aid of numerical experiments.

  3. Accident source terms for light-water nuclear power plants using high-burnup or MOX fuel.

    SciTech Connect

    Salay, Michael; Gauntt, Randall O.; Lee, Richard Y.; Powers, Dana Auburn; Leonard, Mark Thomas

    2011-01-01

    Representative accident source terms patterned after the NUREG-1465 Source Term have been developed for high-burnup fuel in BWRs and PWRs and for MOX fuel in a PWR with an ice-condenser containment. These source terms have been derived using nonparametric order statistics to develop distributions for the timing of radionuclide release during four accident phases and for release fractions of nine chemical classes of radionuclides as calculated with the MELCOR 1.8.5 accident analysis computer code. The accident phases are those defined in the NUREG-1465 Source Term: gap release, in-vessel release, ex-vessel release, and late in-vessel release. Important differences between the accident source terms derived here and the NUREG-1465 Source Term are not attributable to either fuel burnup or the use of MOX fuel. Rather, differences among the source terms are due predominantly to improved understanding of the physics of core meltdown accidents. Heat losses from the degrading reactor core prolong the process of in-vessel release of radionuclides. Improved understanding of the chemistries of tellurium and cesium under reactor accident conditions changes the predicted behavior of these radioactive elements relative to what was assumed in the derivation of the NUREG-1465 Source Term. An additional radionuclide chemical class has been defined to account for release of cesium as cesium molybdate, which enhances molybdenum release relative to other metallic fission products.
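    The nonparametric order-statistics approach can be illustrated with the classic Wilks result: the rank of the sorted sample that serves as a distribution-free upper confidence bound on a quantile follows directly from the binomial distribution. A minimal sketch (the function name and defaults are illustrative, not taken from the report):

    ```python
    import math

    def upper_bound_rank(n, p=0.95, conf=0.95):
        """Smallest 1-based rank k such that the k-th order statistic of n
        sorted samples is an upper confidence bound on the p-th quantile,
        i.e. P(Binomial(n, p) <= k - 1) >= conf, with no distributional
        assumption on the underlying release fractions."""
        cum = 0.0
        for k in range(1, n + 1):
            # Probability that exactly k-1 of the n samples fall below x_p.
            cum += math.comb(n, k - 1) * p ** (k - 1) * (1 - p) ** (n - k + 1)
            if cum >= conf:
                return k
        return None  # n is too small for the requested confidence
    ```

    With p = conf = 0.95 this reproduces the familiar requirement of 59 code runs for a 95/95 bound taken at the sample maximum.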

  4. A comparison of world-wide uses of severe reactor accident source terms

    SciTech Connect

    Ang, M.L.; Frid, W.; Kersting, E.J.; Friederichs, H.G.; Lee, R.Y.; Meyer-Heine, A.; Powers, D.A.; Soda, K.; Sweet, D.

    1994-09-01

    The definitions of source terms to reactor containments and source terms to the environment are discussed. A comparison is made between the TID-14844 example source term and the alternative source term described in NUREG-1465. Comparisons of these source terms to the containments and those used in France, Germany, Japan, Sweden, and the United Kingdom are made. Source terms to the environment calculated in NUREG-1500 and WASH-1400 are discussed. Again, these source terms are compared to those now being used in France, Germany, Japan, Sweden, and the United Kingdom. It is concluded that source terms to the containment suggested in NUREG-1465 are not greatly more conservative than those used in other countries. Technical bases for the source terms are similar. The regulatory use of the current understanding of radionuclide behavior varies among countries.

  5. Tank waste source term inventory validation. Volume II. Letter report

    SciTech Connect

    1995-04-01

    This document comprises Volume II of the Letter Report entitled Tank Waste Source Term Inventory Validation. This volume contains Appendix C, Radionuclide Tables, and Appendix D, Chemical Analyte Tables. The sample data for a selection of 11 radionuclides and 24 chemical analytes were extracted from six separate sample data sets, arranged in a tabular format, and plotted on scatter plots for all of the 149 single-shell tanks, the 24 double-shell tanks, and the four aging-waste tanks. The solid and liquid sample data were placed in separate tables and plots. The sample data and plots were compiled from the following data sets: characterization raw sample data, recent core samples, the D. Braun database, the Wastren (Van Vleet) database, and the TRAC and HTCE inventories.

  6. Source-term evaluations from recent core-melt experiments

    SciTech Connect

    Parker, G.W.; Creek, G.E.; Sutton, A.L. Jr.

    1985-01-01

    Predicted consequences of hypothetical severe reactor accidents resulting in core meltdown appear to be too conservatively projected because of the simplistic concepts often assumed for the intricate and highly variable phenomena involved. Recent demonstration work on a modest scale (1 kg) has already revealed significant variations in the mode and temperature of clad failure, in the rates of formation of zirconium alloys, in the nature of the UO2-ZrO2 eutectic mixtures, and in aerosol generation rates. The current series of core-melt demonstration experiments (at the 10-kg scale) seems to confirm that an increase in the size of the meltdown mass will lead to an even further reduction in the amount of vaporized components. Source terms that are based on older release evaluations could be up to an order of magnitude too large. 6 refs., 6 figs., 2 tabs.

  7. Tank waste source term inventory validation. Volume 1. Letter report

    SciTech Connect

    Brevick, C.H.; Gaddis, L.A.; Johnson, E.D.

    1995-04-28

    The sample data for a selection of 11 radionuclides and 24 chemical analytes were extracted from six separate sample data sets, arranged in a tabular format, and plotted on scatter plots for all of the 149 single-shell tanks, the 24 double-shell tanks, and the four aging-waste tanks. The solid and liquid sample data were placed in separate tables and plots. The sample data and plots were compiled from the following data sets: characterization raw sample data, recent core samples, the D. Braun database, the Wastren (Van Vleet) database, and the TRAC and HTCE inventories. This document is Volume I of the Letter Report entitled Tank Waste Source Term Inventory Validation.

  8. Aggregation of a Distributed Source in Morphogen Gradient Formation

    PubMed Central

    Lander, A. D.; Nie, Q.; Vargas, B.; Wan, F. Y. M.

    2007-01-01

    In the development of a biological entity, ligands (such as Decapentaplegic (Dpp) along the anterior–posterior axis of the Drosophila wing imaginal disc) are synthesized at a localized source and transported away from the source for binding with cell surface receptors to form concentration gradients of ligand–receptor complexes for cell signaling. Generally speaking, activities such as diffusion and reversible binding with degradable receptors also take place in the region of ligand production. The effects of such morphogen activities in the region of the localized distributed ligand source on the ligand–receptor concentration gradient in the entire biological entity have been modeled and analyzed as System F in [1]. In this paper, we deduce from System F a related end-source model (System A) in which the effects of the distributed ligand source are replaced by an idealized point stimulus at the border between the (posterior) chamber and the ligand production region that simulates the average effects of the ligand activities in the production zone. This aggregated end-source model is shown to adequately reproduce the significant implications of System F and to contain the corresponding ad hoc point-source model, System R of [2], as a special case. Because of its simpler mathematical structure and the absence of any limitation on the ligand synthesis rate for the existence of steady-state gradients, System A type models are expected to be used widely. An example of such application is the recent study of the inhibiting effects of the formation of nonsignaling ligand–nonreceptor complexes [3]. PMID:17372620

  9. 5.0. Depletion, activation, and spent fuel source terms

    SciTech Connect

    Wieselquist, William A.

    2016-04-01

    SCALE’s general depletion, activation, and spent fuel source terms analysis capabilities are enabled through a family of modules related to the main ORIGEN depletion/irradiation/decay solver. The nuclide tracking in ORIGEN is based on the principle of explicitly modeling all available nuclides and transitions in the current fundamental nuclear data for decay and neutron-induced transmutation, and relies on fundamental cross section and decay data in ENDF/B-VII. Cross section data for materials and reaction processes not available in ENDF/B-VII are obtained from the JEFF-3.0/A special-purpose European activation library containing 774 materials and 23 reaction channels with 12,617 neutron-induced reactions below 20 MeV. Resonance cross section corrections in the resolved and unresolved range are performed using a continuous-energy treatment by data modules in SCALE. All nuclear decay data, fission product yields, and gamma-ray emission data are developed from ENDF/B-VII.1 evaluations. Decay data include all ground and metastable state nuclides with half-lives greater than 1 millisecond. Using these data sources, ORIGEN currently tracks 174 actinides, 1149 fission products, and 974 activation products. The purpose of this chapter is to describe the stand-alone capabilities and underlying methodology of ORIGEN, as opposed to the integrated depletion capability it provides in all coupled neutron transport/depletion sequences in SCALE, which is described in other chapters.

  10. Long-term staff scheduling with regular temporal distribution.

    PubMed

    Carrasco, Rafael C

    2010-11-01

    Although optimal staff scheduling often requires elaborate computational methods, cases that are not highly constrained can be solved efficiently using simpler approaches. This paper describes how a simple procedure, combining random and greedy strategies with heuristics, has been successfully applied in a Spanish hospital to assign guard shifts to the physicians in a department. In this case, the employees prefer that their guard duties be regularly distributed in time. The workload distribution must also satisfy some constraints: in particular, the distribution of duties among the staff must be uniform when a number of tasks and shift types (including some infrequent and aperiodic types, such as those scheduled during long weekends) are considered. Furthermore, the composition of teams should be varied, in the sense that no particular pairing should dominate the assignments. The proposed procedure is able to find suitable solutions when the number of employees available for every task is not small compared to the number required at every shift. The software is distributed under the terms of the GNU General Public License.
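    The random-plus-greedy idea can be sketched in a few lines: give each shift to the eligible employee with the fewest duties so far, breaking ties in favor of the longest rest since the last duty and then at random. This is only an illustration of the strategy under those assumptions, not the hospital's actual procedure:

    ```python
    import random

    def assign_shifts(days, staff, seed=0):
        """Greedy guard-shift assignment: each day's shift goes to the
        employee with the fewest duties so far; ties are broken by the
        longest time since the last duty, then randomly."""
        rng = random.Random(seed)
        count = {s: 0 for s in staff}
        last = {s: float("-inf") for s in staff}
        plan = {}
        for day in days:
            pick = min(staff, key=lambda s: (count[s], last[s], rng.random()))
            plan[day] = pick
            count[pick] += 1
            last[pick] = day
        return plan
    ```

    With this rule the workload stays uniform and duties spread out in time automatically, which matches the regular-temporal-distribution preference described above.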

  11. Quantum key distribution with an unknown and untrusted source

    NASA Astrophysics Data System (ADS)

    Zhao, Yi; Qi, Bing; Lo, Hoi-Kwong

    2009-03-01

    The security of a standard bi-directional ``plug & play'' quantum key distribution (QKD) system has been an open question for a long time, mainly because its source is effectively controlled by an eavesdropper, which means the source is unknown and untrusted. Qualitative discussion of this subject has been given previously. In this paper, we present the first quantitative security analysis of a general class of QKD protocols whose sources are unknown and untrusted. The security of the standard BB84 protocol, the weak+vacuum decoy-state protocol, and the one-decoy decoy-state protocol with unknown and untrusted sources is rigorously proved. We derive rigorous lower bounds on the secure key generation rates of the above three protocols. Our numerical simulation results show that QKD with an untrusted source gives a key generation rate that is close to that with a trusted source. Our work is published in [1]. [4pt] [1] Y. Zhao, B. Qi, and H.-K. Lo, Phys. Rev. A, 77:052327 (2008).

  12. Distributed policy based access to networked heterogeneous ISR data sources

    NASA Astrophysics Data System (ADS)

    Bent, G.; Vyvyan, D.; Wood, David; Zerfos, Petros; Calo, Seraphin

    2010-04-01

    Within a coalition environment, ad hoc Communities of Interest (CoIs) come together, perhaps for only a short time, with different sensors, sensor platforms, data fusion elements, and networks to conduct a task (or set of tasks), with different coalition members taking different roles. In such a coalition, each organization will have its own inherent restrictions on how it will interact with the others. These are usually stated as a set of policies, including security and privacy policies. The capability that we want to enable for a coalition operation is to provide access to information from any coalition partner in conformance with the policies of all. One of the challenges in supporting such ad hoc coalition operations is providing efficient access to distributed sources of data, where the applications requiring the data do not have knowledge of the location of the data within the network. To address this challenge, the International Technology Alliance (ITA) program has been developing the concept of a Dynamic Distributed Federated Database (DDFD), also known as a Gaian Database. This type of database provides a means of accessing data across a network of distributed heterogeneous data sources, where access to the information is controlled by a mixture of local and global policies. We describe how a network of disparate ISR elements can be expressed as a DDFD and how this approach enables sensor and other information sources to be discovered autonomously or semi-autonomously and combined or fused according to formally defined local and global policies.

  13. Fiber optic distributed temperature sensing for fire source localization

    NASA Astrophysics Data System (ADS)

    Sun, Miao; Tang, Yuquan; Yang, Shuang; Sigrist, Markus W.; Li, Jun; Dong, Fengzhong

    2017-08-01

    A method for localizing a fire source based on a distributed temperature sensor system is proposed. Two sections of optical fiber were placed orthogonally to each other as the sensing elements. A tray of alcohol was lit to act as a fire outbreak in a cabinet with an uneven ceiling to simulate a real fire scene. Experiments were carried out to demonstrate the feasibility of the method. Rather large fluctuations and systematic errors in predicting the exact room coordinates of the fire source, caused by the uneven ceiling, were observed. Two mathematical methods (smoothing the recorded temperature curves and finding the temperature peak positions) to improve the prediction accuracy are presented, and the experimental results indicate that the fluctuation ranges and systematic errors are significantly reduced. The proposed scheme is simple and appears reliable enough to locate a fire source in large spaces.
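    The two corrective steps described in the abstract, smoothing each temperature trace and then locating its peak, can be sketched as follows. The segment spacing, window size, and function name are illustrative assumptions, not the authors' code:

    ```python
    import numpy as np

    def locate_fire(trace_x, trace_y, spacing=0.5, window=5):
        """Estimate the fire's (x, y) position from two orthogonal
        distributed-temperature traces (one reading per fiber segment).
        Each trace is smoothed with a moving average to suppress
        fluctuations; the peak position along each fiber then gives
        one room coordinate."""
        kernel = np.ones(window) / window
        sx = np.convolve(trace_x, kernel, mode="same")
        sy = np.convolve(trace_y, kernel, mode="same")
        return np.argmax(sx) * spacing, np.argmax(sy) * spacing
    ```

    Because the two fibers are orthogonal, each trace independently resolves one coordinate, so a single hot spot maps to the intersection of the two peak positions.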

  14. Do forests represent a long-term source of contaminated particulate matter in the Fukushima Prefecture?

    PubMed

    Laceby, J Patrick; Huon, Sylvain; Onda, Yuichi; Vaury, Veronique; Evrard, Olivier

    2016-12-01

    The Fukushima Daiichi Nuclear Power Plant (FDNPP) accident resulted in radiocesium fallout contaminating coastal catchments of the Fukushima Prefecture. As the decontamination effort progresses, the potential downstream migration of radiocesium-contaminated particulate matter from forests, which cover over 65% of the most contaminated region, requires investigation. Carbon and nitrogen elemental concentrations and stable isotope ratios are thus used to model the relative contributions of forest, cultivated and subsoil sources to deposited particulate matter in three contaminated coastal catchments. Samples were taken from the main identified sources: cultivated (n = 28), forest (n = 46), and subsoils (n = 25). Deposited particulate matter (n = 82) was sampled during four fieldwork campaigns from November 2012 to November 2014. A distribution modelling approach quantified relative source contributions with multiple combinations of element parameters (carbon only, nitrogen only, and four parameters) for two particle size fractions (<63 μm and <2 mm). Although there was significant particle size enrichment for the particulate matter parameters, these differences only resulted in a 6% (SD 3%) mean difference in relative source contributions. Further, the three different modelling approaches only resulted in a 4% (SD 3%) difference between relative source contributions. For each particulate matter sample, six models (i.e. <63 μm and <2 mm from the three modelling approaches) were used to incorporate a broader definition of potential uncertainty into model results. Forest sources were modelled to contribute 17% (SD 10%) of particulate matter, indicating that they represent a long-term potential source of radiocesium-contaminated material in fallout-impacted catchments. Subsoils contributed 45% (SD 26%) of particulate matter and cultivated sources contributed 38% (SD 19%). The reservoir of radiocesium in forested landscapes in the Fukushima region represents a

  15. New Source Term Model for the RESRAD-OFFSITE Code Version 3

    SciTech Connect

    Yu, Charley; Gnanapragasam, Emmanuel; Cheng, Jing-Jy; Kamboj, Sunita; Chen, Shih-Yew

    2013-06-01

    This report documents the new source term model developed and implemented in Version 3 of the RESRAD-OFFSITE code. This new source term model includes: (1) "first order release with transport" option, in which the release of the radionuclide is proportional to the inventory in the primary contamination and the user-specified leach rate is the proportionality constant, (2) "equilibrium desorption release" option, in which the user specifies the distribution coefficient which quantifies the partitioning of the radionuclide between the solid and aqueous phases, and (3) "uniform release" option, in which the radionuclides are released from a constant fraction of the initially contaminated material during each time interval and the user specifies the duration over which the radionuclides are released.
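    Under stated simplifying assumptions (no radioactive decay or transport, unit inventory, illustrative parameter names and defaults), the release options described above reduce to simple expressions. This is a sketch of the concepts, not the RESRAD-OFFSITE implementation:

    ```python
    import math

    def release_rate(t, inventory0=1.0, mode="first_order",
                     leach_rate=0.01, duration=100.0):
        """Release rate (inventory units per year) at time t for two of
        the idealized options; decay and transport are omitted."""
        if mode == "first_order":
            # Release proportional to the remaining inventory; the
            # user-specified leach rate is the proportionality constant.
            return leach_rate * inventory0 * math.exp(-leach_rate * t)
        if mode == "uniform":
            # A constant fraction of the initially contaminated material
            # is released during each interval of the release duration.
            return inventory0 / duration if t <= duration else 0.0
        raise ValueError(f"unknown mode: {mode}")

    def dissolved_fraction(kd, porosity=0.4, bulk_density=1.6):
        """Equilibrium-desorption option: fraction of the radionuclide in
        the aqueous phase for a distribution coefficient kd (cm^3/g),
        assuming linear solid-water partitioning."""
        return porosity / (porosity + bulk_density * kd)
    ```

    The first-order option integrates to an exponentially depleting inventory, while the uniform option empties the source linearly over the user-specified duration; the distribution coefficient controls only the solid-aqueous partitioning, not the timing.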

  16. Coarse Grid Modeling of Turbine Film Cooling Flows Using Volumetric Source Terms

    NASA Technical Reports Server (NTRS)

    Heidmann, James D.; Hunter, Scott D.

    2001-01-01

    The recent trend in numerical modeling of turbine film cooling flows has been toward higher fidelity grids and more complex geometries. This trend has been enabled by the rapid increase in computing power available to researchers. However, the turbine design community requires fast turnaround time in its design computations, rendering these comprehensive simulations ineffective in the design cycle. The present study describes a methodology for implementing a volumetric source term distribution in a coarse grid calculation that can model the small-scale and three-dimensional effects present in turbine film cooling flows. This model could be implemented in turbine design codes or in multistage turbomachinery codes such as APNASA, where the computational grid size may be larger than the film hole size. Detailed computations of a single row of 35 deg round holes on a flat plate have been obtained for blowing ratios of 0.5, 0.8, and 1.0, and density ratios of 1.0 and 2.0 using a multiblock grid system to resolve the flows on both sides of the plate as well as inside the hole itself. These detailed flow fields were spatially averaged to generate a field of volumetric source terms for each conservative flow variable. Solutions were also obtained using three coarse grids having streamwise and spanwise grid spacings of 3d, 1d, and d/3. These coarse grid solutions used the integrated hole exit mass, momentum, energy, and turbulence quantities from the detailed solutions as volumetric source terms. It is shown that a uniform source term addition over a distance from the wall on the order of the hole diameter is able to predict adiabatic film effectiveness better than a near-wall source term model, while strictly enforcing correct values of integrated boundary layer quantities.
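    The central idea above, depositing the integrated hole-exit fluxes as uniform volumetric sources over the coarse cells within about one hole diameter of the wall, can be sketched as follows. The variable names and the constant-cp energy term are illustrative assumptions, not the APNASA implementation:

    ```python
    def film_cooling_sources(m_dot, u, v, w, T, cell_volume, n_cells,
                             cp=1004.5):
        """Distribute the integrated hole-exit mass, momentum, and energy
        fluxes uniformly as volumetric source terms over the n_cells
        coarse cells lying within roughly one hole diameter of the wall.
        cp*T stands in for the enthalpy flux (ideal air assumed)."""
        per_volume = 1.0 / (n_cells * cell_volume)  # 1 / total source volume
        return {
            "mass": m_dot * per_volume,             # kg / (m^3 s)
            "x-momentum": m_dot * u * per_volume,   # N / m^3
            "y-momentum": m_dot * v * per_volume,
            "z-momentum": m_dot * w * per_volume,
            "energy": m_dot * cp * T * per_volume,  # W / m^3
        }
    ```

    By construction, integrating each source over the cells it occupies recovers the detailed solution's hole-exit flux, which is how the coarse grid strictly enforces the correct integrated boundary-layer quantities.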

  17. Multiple sparse volumetric priors for distributed EEG source reconstruction.

    PubMed

    Strobbe, Gregor; van Mierlo, Pieter; De Vos, Maarten; Mijović, Bogdan; Hallez, Hans; Van Huffel, Sabine; López, José David; Vandenberghe, Stefaan

    2014-10-15

    We revisit the multiple sparse priors (MSP) algorithm implemented in the statistical parametric mapping software (SPM) for distributed EEG source reconstruction (Friston et al., 2008). In the present implementation, multiple cortical patches are introduced as source priors based on a dipole source space restricted to a cortical surface mesh. In this note, we present a technique to construct volumetric cortical regions to introduce as source priors by restricting the dipole source space to a segmented gray matter layer and using a region-growing approach. This extension makes it possible to reconstruct brain structures beyond the cortical surface and facilitates the use of more realistic volumetric head models with more layers, such as cerebrospinal fluid (CSF), compared to the standard 3-layered scalp-skull-brain head models. We illustrate the technique with ERP data and anatomical MR images in 12 subjects. Based on the segmented gray matter for each subject, cortical regions were created and introduced as source priors for MSP inversion assuming two types of head models: the standard 3-layered scalp-skull-brain model and an extended 4-layered model including CSF. We compared these models with the current implementation by assessing the free energy corresponding to each reconstruction using Bayesian model selection for group studies. Strong evidence was found in favor of the volumetric MSP approach compared to the MSP approach based on cortical patches for both types of head models. Overall, the strongest evidence was found in favor of the volumetric MSP reconstructions based on the extended head models including CSF. These results were verified by comparing the reconstructed activity. The use of volumetric cortical regions as source priors is a useful complement to the present implementation, as it allows more complex head models and volumetric source priors to be introduced in future studies.
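    The volumetric priors are built by growing regions through the segmented gray matter. A minimal sketch of that step (6-connectivity breadth-first growth; the function name and cap on region size are invented for illustration, not the SPM code):

    ```python
    import numpy as np
    from collections import deque

    def grow_region(gray_mask, seed, max_voxels=500):
        """Grow a volumetric region from a seed voxel through a boolean
        gray-matter mask using 6-connected breadth-first search."""
        region = {seed}
        queue = deque([seed])
        offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                   (0, -1, 0), (0, 0, 1), (0, 0, -1)]
        while queue and len(region) < max_voxels:
            x, y, z = queue.popleft()
            for dx, dy, dz in offsets:
                nb = (x + dx, y + dy, z + dz)
                inside = all(0 <= c < s for c, s in zip(nb, gray_mask.shape))
                if inside and gray_mask[nb] and nb not in region:
                    region.add(nb)
                    queue.append(nb)
        return region
    ```

    Each grown region then plays the role that a cortical surface patch plays in the standard MSP prior set, but restricted to gray-matter voxels rather than mesh vertices.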

  18. New passive decoy-state quantum key distribution with thermal distributed parametric down-conversion source

    NASA Astrophysics Data System (ADS)

    Wei, Jie; Zhang, Chun-Hui; Wang, Qin

    2017-02-01

    We present a new scheme for implementing passive quantum key distribution with a thermal distributed parametric down-conversion source. In this scheme, only one decoy-state intensity is employed, yet we can achieve a very precise estimate of the single-photon-pulse contribution by utilizing the built-in decoy states. Moreover, we compare the new scheme with other practical methods, i.e., the standard three-intensity decoy-state BB84 protocol using either weak coherent states or a parametric down-conversion source. Through numerical simulations, we demonstrate that our new scheme can drastically improve both the secure transmission distance and the key generation rate.

  19. Distribution and Sources of Black Carbon in the Arctic

    NASA Astrophysics Data System (ADS)

    Qi, Ling

    scavenging efficiency. In this dissertation, we relate WBF to temperature and ice mass fraction based on long-term observations in mixed-phase clouds. We find that WBF reduces BC scavenging efficiency globally, with a larger decrease at higher latitudes and altitudes (from 8% in the tropics to 76% in the Arctic). WBF slows down and reduces wet deposition of BC and leaves more BC in the atmosphere. Higher atmospheric BC in turn results in larger dry deposition. The resulting total deposition is lower in mid-latitudes (by 12-34%) and higher in the Arctic (2-29%). Globally, including WBF significantly reduces the discrepancy in BCsnow (by 50%), BCair (by 50%), and washout ratios (by a factor of two to four). The remaining discrepancies in these variables suggest that in-cloud removal is likely still excessive over land. In the last part, we identify sources of surface atmospheric BC in the Arctic in springtime, when radiative forcing is largest due to the high insolation and surface albedo. We find a large contribution from Asian anthropogenic sources (40-43%) and open biomass burning emissions from forest fires in South Siberia (29-41%). Outside the Arctic front, BC is strongly enhanced by episodic, direct transport events from Asia and Siberia after 12 days of transport. In contrast, in the Arctic front, a large fraction of the Asian contribution is in the form of 'chronic' pollution on a 1-2 month timescale. As such, it is likely that previous studies using 5- or 10-day trajectory analyses strongly underestimated the contribution from Asia to surface BC in the Arctic. Our results point toward an urgent need for better characterization of flaring emissions of BC (e.g., the emission factors and temporal and spatial distribution), extensive measurements of both the dry deposition of BC over snow and ice and the scavenging efficiency of BC in mixed-phase clouds, particularly over the ocean. More measurements of 14C are needed to better understand sources of BC (fossil fuel combustion versus biomass

  20. Atmospheric PAHs in North China: Spatial distribution and sources.

    PubMed

    Zhang, Yanjun; Lin, Yan; Cai, Jing; Liu, Yue; Hong, Linan; Qin, Momei; Zhao, Yifan; Ma, Jin; Wang, Xuesong; Zhu, Tong; Qiu, Xinghua; Zheng, Mei

    2016-09-15

    Polycyclic aromatic hydrocarbons (PAHs), formed through incomplete combustion, have adverse health effects. To investigate the spatial distribution and sources of PAHs in North China, samples collected by passive sampling at 90 gridded sites from June to September 2011 were analyzed. The average concentration of the sum of fifteen PAHs in North China is 220±14 ng/m(3), with the highest in Shanxi, followed by Shandong and Hebei, and then the Beijing-Tianjin area. Major sources of PAHs were identified for each region of North China: coking processes for Shanxi, biomass burning for Hebei and Shandong, and coal combustion for the Beijing-Tianjin area. An emission inventory is combined with back-trajectory analysis to study the influence of emissions from surrounding areas at receptor sites. The Shanxi and Beijing-Tianjin areas are more influenced by nearby sources, while regional sources have more impact on the Hebei and Shandong areas. Results from this study suggest the areas where local emission should be the major target for control and the areas where both local and regional sources should be considered for PAH abatement in North China. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Volatile Organic Compounds: Characteristics, distribution and sources in urban schools

    NASA Astrophysics Data System (ADS)

    Mishra, Nitika; Bartsch, Jennifer; Ayoko, Godwin A.; Salthammer, Tunga; Morawska, Lidia

    2015-04-01

    Long-term exposure to organic pollutants, both inside and outside school buildings, may affect children's health and influence their learning performance. Since children spend a significant amount of time in school, air quality, especially in classrooms, plays a key role in determining the health risks associated with exposure at schools. Within this context, the present study investigated the ambient concentrations of Volatile Organic Compounds (VOCs) in 25 primary schools in Brisbane with the aim of quantifying the indoor and outdoor VOC concentrations, identifying VOC sources and their contributions, and, based on these, proposing mitigation measures to reduce VOC exposure in schools. One of the most important findings is the occurrence of indoor sources, indicated by an I/O ratio >1 in 19 schools. Principal Component Analysis with Varimax rotation was used to identify common sources of VOCs, and source contributions were calculated using the Absolute Principal Component Scores technique. The results showed that petrol vehicle exhaust contributed 47% of outdoor VOCs, whereas indoors cleaning products had the highest contribution at 41%, followed by air fresheners and art and craft activities. These findings point to the need for a range of basic precautions during the selection, use and storage of cleaning products and materials to reduce the risk from these sources.

  2. Challenges in defining a radiologic and hydrologic source term for underground nuclear test centers, Nevada Test Site, Nye County, Nevada

    SciTech Connect

    Smith, D.K.

    1995-06-01

    The compilation of a radionuclide inventory for long-lived radioactive contaminants residual from nuclear testing provides a partial measure of the radiologic source term at the Nevada Test Site. The radiologic source term also includes potentially mobile short-lived radionuclides excluded from the inventory. The radiologic source term for tritium is known with accuracy and is equivalent to the hydrologic source term within the saturated zone. Definition of the total hydrologic source term for fission and activation products that have high activities for decades following underground testing involves knowledge and assumptions that are presently unavailable. Systematic investigation of the behavior of fission products, activation products, and actinides under saturated or partially saturated conditions is imperative to define a representative total hydrologic source term. This is particularly important given the heterogeneous distribution of radionuclides within testing centers. Data quality objectives that emphasize a combination of measurements and credible estimates of the hydrologic source term are a priority for near-field investigations at the Nevada Test Site.

  3. Extending Marine Species Distribution Maps Using Non-Traditional Sources

    PubMed Central

    Moretzsohn, Fabio; Gibeaut, James

    2015-01-01

    Abstract Background Traditional sources of species occurrence data such as peer-reviewed journal articles and museum-curated collections are included in species databases after rigorous review by species experts and evaluators. The distribution maps created in this process are an important component of species survival evaluations, and are used to adapt, extend and sometimes contract polygons used in the distribution mapping process. New information During an IUCN Red List Gulf of Mexico Fishes Assessment Workshop held at The Harte Research Institute for Gulf of Mexico Studies, a session included an open discussion on the topic of including other sources of species occurrence data. During the last decade, advances in portable electronic devices and applications enable 'citizen scientists' to record images, location and data about species sightings, and submit that data to larger species databases. These applications typically generate point data. Attendees of the workshop expressed an interest in how that data could be incorporated into existing datasets, how best to ascertain the quality and value of that data, and what other alternate data sources are available. This paper addresses those issues, and provides recommendations to ensure quality data use. PMID:25941453

  4. The Impact of Source Distribution on Scalar Transport over Forested Hills

    NASA Astrophysics Data System (ADS)

    Ross, Andrew N.; Harman, Ian N.

    2015-08-01

    Numerical simulations of neutral flow over a two-dimensional, isolated, forested ridge are conducted to study the effects of scalar source distribution on scalar concentrations and fluxes over forested hills. Three different constant-flux sources are considered that span a range of idealized but ecologically important source distributions: a source at the ground, one uniformly distributed through the canopy, and one decaying with depth in the canopy. A fourth source type, where the in-canopy source depends on both the wind speed and the difference in concentration between the canopy and a reference concentration on the leaf, designed to mimic deposition, is also considered. The simulations show that the topographically-induced perturbations to the scalar concentration and fluxes are quantitatively dependent on the source distribution. The net impact is a balance of different processes affecting both advection and turbulent mixing, and can be significant even for moderate topography. Sources that have significant input in the deep canopy or at the ground exhibit larger-magnitude advection and turbulent flux-divergence terms in the canopy. The flows have identical velocity fields and so the differences are entirely due to the different tracer concentration fields resulting from the different source distributions. These in-canopy differences lead to larger spatial variations in above-canopy scalar fluxes for sources near the ground compared to cases where the source is predominantly located near the canopy top. Sensitivity tests show that the most significant impacts are often seen near to or slightly downstream of the flow separation or reattachment points within the canopy flow. The qualitative similarities to previous studies using periodic hills suggest that important processes occurring over isolated and periodic hills are not fundamentally different. The work has important implications for the interpretation of flux measurements over forests.

  5. SOURCE TERM TARGETED THRUST FY 2005 NEW START PROJECTS

    SciTech Connect

    NA

    2005-10-05

    While a significant amount of work has been devoted to developing thermodynamic data describing the sorption of radionuclides to iron oxides and other geomedia, little data exist to describe the interaction of key radionuclides found in high-level radioactive waste with the uranium surfaces expected in corroded spent nuclear fuel (SNF) waste packages. Recent work indicates that actinide adsorption to the U(VI) solids expected in the engineered barrier system may play a key role in the reduction of dissolved concentrations of radionuclides such as Np(V). However, little is known about the mechanism(s) of adsorption, nor are the thermodynamic data available to represent the phenomenon in predictive modeling codes. Unfortunately, this situation makes it difficult to consider actinide adsorption to the U(VI) silicates in either geochemical or performance assessment (PA) predictions. The primary goal in the Source Term Targeted Thrust area is to ''study processes that control radionuclide release from the waste form''. Knowledge of actinide adsorption to U(VI) silicate solids and its parameterization in geochemical models will be an important step towards this goal.

  6. Source terms for plutonium aerosolization from nuclear weapon accidents

    SciTech Connect

    Stephens, D.R.

    1995-07-01

    The source term literature was reviewed to estimate aerosolized and respirable release fractions for accidents involving plutonium in high-explosive (HE) detonation and in fuel fires. For HE detonation, all estimates are based on the total amount of Pu. For fuel fires, all estimates are based on the amount of Pu oxidized. I based my estimates for HE detonation primarily upon the results from the Roller Coaster experiment. For hydrocarbon fuel fire oxidation of plutonium, I based lower bound values on laboratory experiments which represent accident scenarios with very little turbulence and updraft of a fire. Expected values for aerosolization were obtained from the Vixen A field tests, which represent a realistic case for modest turbulence and updraft, and for respirable fractions from some laboratory experiments involving large samples of Pu. Upper bound estimates for credible accidents are based on experiments involving combustion of molten plutonium droplets. In May of 1991 the DOE Pilot Safety Study Program established a group of experts to estimate the fractions of plutonium which would be aerosolized and respirable for certain nuclear weapon accident scenarios.

  7. Verification test calculations for the Source Term Code Package

    SciTech Connect

    Denning, R S; Wooton, R O; Alexander, C A; Curtis, L A; Cybulskis, P; Gieseke, J A; Jordan, H; Lee, K W; Nicolosi, S L

    1986-07-01

    The purpose of this report is to demonstrate the reasonableness of the Source Term Code Package (STCP) results. Hand calculations have been performed spanning a wide variety of phenomena within the context of a single accident sequence, a loss of all ac power with late containment failure, in the Peach Bottom (BWR) plant, and compared with STCP results. The report identifies some of the limitations of the hand calculation effort. The processes involved in a core meltdown accident are complex and coupled. Hand calculations by their nature must deal with gross simplifications of these processes. Their greatest strength is as an indicator that a computer code contains an error, for example that it doesn't satisfy basic conservation laws, rather than in showing the analysis accurately represents reality. Hand calculations are an important element of verification but they do not satisfy the need for code validation. The code validation program for the STCP is a separate effort. In general the hand calculation results show that models used in the STCP codes (e.g., MARCH, TRAP-MELT, VANESA) obey basic conservation laws and produce reasonable results. The degree of agreement and significance of the comparisons differ among the models evaluated. 20 figs., 26 tabs.

  8. Diversity, distribution and sources of bacteria in residential kitchens

    PubMed Central

    Flores, Gilberto E.; Bates, Scott T.; Caporaso, J. Gregory; Lauber, Christian L.; Leff, Jonathan W.; Knight, Rob; Fierer, Noah

    2016-01-01

    Summary Bacteria readily colonize kitchen surfaces, and the exchange of microbes between humans and the kitchen environment can impact human health. However, we have a limited understanding of the overall diversity of these communities, how they differ across surfaces, and sources of bacteria to kitchen surfaces. Here we used high-throughput sequencing of the 16S rRNA gene to explore biogeographical patterns of bacteria across >80 surfaces within the kitchens of each of four households. In total, 34 bacterial and two archaeal phyla were identified, with most sequences belonging to the Actinobacteria, Bacteroidetes, Firmicutes and Proteobacteria. Genera known to contain common food-borne pathogens were low in abundance but broadly distributed throughout the kitchens, with different taxa exhibiting distinct distribution patterns. The most diverse communities were associated with infrequently cleaned surfaces such as fans above stoves, refrigerator/freezer door seals, and floors. In contrast, the least diverse communities were observed in and around sinks, which were dominated by biofilm-forming Gram-negative lineages. Community composition was influenced by conditions on individual surfaces, usage patterns, and dispersal from source environments. Human skin was the primary source of bacteria across all kitchen surfaces, with contributions from food and faucet water dominating in a few specific locations. This study demonstrates that diverse bacterial communities are widely distributed in residential kitchens and that the composition of these communities is often predictable. These results also illustrate the ease with which human- and food-associated bacteria can be transferred in residential settings to kitchen surfaces. PMID:23171378

  9. Streamlined Genome Sequence Compression using Distributed Source Coding

    PubMed Central

    Wang, Shuang; Jiang, Xiaoqian; Chen, Feng; Cui, Lijuan; Cheng, Samuel

    2014-01-01

    We aim at developing a streamlined genome sequence compression algorithm to support alternative miniaturized sequencing devices, which have limited communication, storage, and computation power. Existing techniques that require a heavy client (encoder side) cannot be applied. To tackle this challenge, we carefully examined distributed source coding theory and developed a customized reference-based genome compression protocol to meet the low-complexity need at the client side. Based on the variation between source and reference, our protocol adaptively picks either syndrome coding or hash coding to compress subsequences of changing code length. Our experimental results showed promising performance of the proposed method when compared with the state-of-the-art algorithm (GRS). PMID:25520552
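    As a toy illustration of the adaptive mode decision described above, the per-block choice between syndrome and hash coding might look like the sketch below. The block size, threshold, and decision rule are assumptions for illustration, not the published protocol's actual parameters.

```python
def choose_coding_modes(source, reference, block=16, threshold=0.1):
    """For each block, estimate the variation rate between source and
    reference; pick 'syndrome' coding when variation is low (few parity
    bits suffice) and 'hash' coding otherwise."""
    modes = []
    for i in range(0, len(source), block):
        s, r = source[i:i + block], reference[i:i + block]
        diffs = sum(a != b for a, b in zip(s, r)) + abs(len(s) - len(r))
        rate = diffs / max(len(s), len(r), 1)
        modes.append("syndrome" if rate <= threshold else "hash")
    return modes

ref = "ACGT" * 8                        # 32-base reference
src = "ACGT" * 4 + "T" * 16             # second half diverges from the reference
modes = choose_coding_modes(src, ref)   # ['syndrome', 'hash']
```

    Only the mode-selection logic is sketched here; the syndrome and hash encoders themselves are the substance of the paper's protocol.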

  10. A vortex-source combination, a source, and a vortex with distributed heat supply

    NASA Astrophysics Data System (ADS)

    Kucherov, A. N.

    1983-04-01

    An analysis is made of the effect of distributed heat supply on the gasdynamic characteristics of a vortex-source (vortex-sink) combination, a source (sink), and a vortex. It is shown that in all the cases considered, there is a minimum radius for which the radial component of M is equal to unity. It is also shown that there is a critical intensity of heat release (for a fixed similarity parameter) separating two families of integral curves and that for this critical value a solution exists only under certain conditions.

  11. Theoretical discussion for electron-density distribution in multicusp ion source

    NASA Astrophysics Data System (ADS)

    Zhan, Hualin; Hu, Chundong; Xie, Yahong; Wu, Bin; Wang, Jinfang; Liang, Lizheng; Wei, Jianglong

    2011-03-01

    By introducing ideas from magnetohydrodynamics (MHD) and kinetic theory, useful solutions for the electron-density distribution in the radial direction of a multicusp ion source are obtained. From this perspective, several conclusions are drawn: (1) the electron-density distributions in a specific region of the sheath are the same with or without a magnetic field; (2) the influence of the magnetic field on the electron density obeys an exponential law, which should also take the collision term into account if the magnetic field is strong; (3) the result derived from the Boltzmann equation is qualitatively consistent with the given experimental results.
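    For orientation only: the field-free limit of such an exponential law resembles the textbook Boltzmann relation for electrons in an electrostatic potential. The sketch below illustrates that standard relation with made-up numbers; it is not the derivation in this paper.

```python
import math

def boltzmann_density(n0, phi, t_e):
    """Boltzmann-distributed electron density at potential phi (V) for
    electron temperature t_e (eV): n = n0 * exp(phi / t_e)."""
    return n0 * math.exp(phi / t_e)

n0 = 1.0e16                                # bulk plasma density, m^-3 (illustrative)
n_wall = boltzmann_density(n0, -3.0, 2.0)  # density at a -3 V retarding potential
```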

  12. Theoretical discussion for electron-density distribution in multicusp ion source

    SciTech Connect

    Zhan Hualin; Hu Chundong; Xie Yahong; Wu Bin; Wang Jinfang; Liang Lizheng; Wei Jianglong

    2011-03-21

    By introducing ideas from magnetohydrodynamics (MHD) and kinetic theory, useful solutions for the electron-density distribution in the radial direction of a multicusp ion source are obtained. From this perspective, several conclusions are drawn: (1) the electron-density distributions in a specific region of the sheath are the same with or without a magnetic field; (2) the influence of the magnetic field on the electron density obeys an exponential law, which should also take the collision term into account if the magnetic field is strong; (3) the result derived from the Boltzmann equation is qualitatively consistent with the given experimental results.

  13. Long-term variations of muon flux angular distribution

    NASA Astrophysics Data System (ADS)

    Shutenko, V. V.; Astapov, I. I.; Barbashina, N. S.; Dmitrieva, A. N.; Kokoulin, R. P.; Kompaniets, K. G.; Petrukhin, A. A.; Yashin, I. I.

    2013-02-01

    Intensity of the atmospheric muon flux depends on a number of factors: the energy spectrum of primary cosmic rays (PCR), heliospheric conditions, and the state of the magnetosphere and atmosphere of the Earth. The wide-aperture muon hodoscope URAGAN (Moscow, Russia, 55.7° N, 37.7° E, 173 m a.s.l.) makes it possible to investigate not only variations of the intensity of the muon flux, but also temporal changes of its angular distribution. For the analysis of angular distribution variations, the vector of local anisotropy is used: the sum of individual unit vectors (directions of the reconstructed muon tracks) normalized to the total number of reconstructed tracks. The vector of local anisotropy and its projections show different sensitivities to parameters of the processes of modulation of PCR in the heliosphere and the Earth's magnetosphere, and to the passage of secondary cosmic rays through the terrestrial atmosphere. In this work, results of the analysis of long-term variations of hourly average projections of the vector of local anisotropy obtained from the URAGAN data during the experimental series of 2007-2011 are presented.
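    The local-anisotropy vector defined above (the normalized sum of unit vectors along reconstructed track directions) is straightforward to compute. A minimal sketch with synthetic angles follows; the URAGAN reconstruction pipeline itself is not reproduced.

```python
import numpy as np

def local_anisotropy_vector(thetas, phis):
    """Sum of unit vectors along reconstructed muon track directions,
    normalized to the total number of tracks (angles in radians;
    theta = zenith angle, phi = azimuth)."""
    ux = np.sin(thetas) * np.cos(phis)
    uy = np.sin(thetas) * np.sin(phis)
    uz = np.cos(thetas)
    return np.array([ux.sum(), uy.sum(), uz.sum()]) / len(thetas)

# Synthetic near-vertical tracks: the vector points close to the zenith,
# and its length is at most 1 (exactly 1 only if all tracks coincide).
rng = np.random.default_rng(0)
thetas = rng.uniform(0.0, 0.2, 1000)
phis = rng.uniform(0.0, 2.0 * np.pi, 1000)
a = local_anisotropy_vector(thetas, phis)
```

    The length of the vector measures how collimated the flux is, while its projections carry the directional information analyzed in the paper.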

  14. Local tsunamis and distributed slip at the source

    USGS Publications Warehouse

    Geist, E.L.; Dmowska, R.

    1999-01-01

    Variations in the local tsunami wave field are examined in relation to heterogeneous slip distributions that are characteristic of many shallow subduction zone earthquakes. Assumptions inherent in calculating the coseismic vertical displacement field that defines the initial condition for tsunami propagation are examined. By comparing the seafloor displacement from uniform slip to that from an ideal static crack, we demonstrate that dip-directed slip variations significantly affect the initial cross-sectional wave profile. Because of the hydrodynamic stability of tsunami wave forms, these effects directly impact estimates of maximum runup from the local tsunami. In most cases, an assumption of uniform slip in the dip direction significantly underestimates the maximum amplitude and leading wave steepness of the local tsunami. Whereas dip-directed slip variations affect the initial wave profile, strike-directed slip variations result in wavefront-parallel changes in amplitude that are largely preserved during propagation from the source region toward shore, owing to the effects of refraction. Tests of discretizing slip distributions indicate that small fault surface elements of dimensions similar to the source depth can acceptably approximate the vertical displacement field in comparison to continuous slip distributions. Crack models for tsunamis generated by shallow subduction zone earthquakes indicate that a rupture intersecting the free surface results in approximately twice the average slip. Therefore, the observation of higher slip associated with tsunami earthquakes relative to typical subduction zone earthquakes of the same magnitude suggests that tsunami earthquakes involve rupture of the seafloor, whereas rupture of deeper subduction zone earthquakes may be imbedded and not reach the seafloor.

  15. Testing contamination source identification methods for water distribution networks

    DOE PAGES

    Seth, Arpan; Klise, Katherine A.; Siirola, John D.; ...

    2016-04-01

    In the event of contamination in a water distribution network (WDN), source identification (SI) methods that analyze sensor data can be used to identify the source location(s). Knowledge of the source location and characteristics are important to inform contamination control and cleanup operations. Various SI strategies that have been developed by researchers differ in their underlying assumptions and solution techniques. The following manuscript presents a systematic procedure for testing and evaluating SI methods. The performance of these SI methods is affected by various factors including the size of WDN model, measurement error, modeling error, time and number of contaminant injections, and time and number of measurements. This paper includes test cases that vary these factors and evaluates three SI methods on the basis of accuracy and specificity. The tests are used to review and compare these different SI methods, highlighting their strengths in handling various identification scenarios. These SI methods and a testing framework that includes the test cases and analysis tools presented in this paper have been integrated into EPA’s Water Security Toolkit (WST), a suite of software tools to help researchers and others in the water industry evaluate and plan various response strategies in case of a contamination incident. Lastly, a set of recommendations are made for users to consider when working with different categories of SI methods.

  16. Reservoir, seal, and source rock distribution in Essaouira Rift Basin

    SciTech Connect

    Ait Salem, A. )

    1994-07-01

    The Essaouira onshore basin, situated in western Morocco, is an important hydrocarbon-generating basin. There are seven oil and gas-with-condensate fields; six produce from Jurassic reservoirs and one from a Triassic reservoir. As a segment of the Atlantic passive continental margin, the Essaouira basin was subjected to several post-Hercynian deformation phases, which controlled the distribution, in space and time, of reservoir, seal, and source rock. These phases comprise synsedimentary infilling of major half grabens with continental red beds and evaporites during the rifting phase; emplacement of a thick postrift Jurassic and Cretaceous sedimentary wedge during thermal subsidence; salt movements; and structural deformation related to the Atlas emergence. The widely extending lower Oxfordian shales are the only Jurassic shale beds penetrated and recognized as potential and mature source rocks. However, facies analysis and mapping suggest the presence of untested source rocks in Dogger marine shales and Triassic to Liassic lacustrine shales. Rocks with adequate reservoir characteristics were encountered in Triassic/Liassic fluvial sands, upper Liassic dolomites, and upper Oxfordian sandy dolomites. The seals are provided by Liassic salt for the lower reservoirs and Middle to Upper Jurassic anhydrite for the upper reservoirs. Recent exploration studies demonstrate that many prospective structures remain untested.

  17. Testing contamination source identification methods for water distribution networks

    SciTech Connect

    Seth, Arpan; Klise, Katherine A.; Siirola, John D.; Haxton, Terranna; Laird, Carl D.

    2016-04-01

    In the event of contamination in a water distribution network (WDN), source identification (SI) methods that analyze sensor data can be used to identify the source location(s). Knowledge of the source location and characteristics are important to inform contamination control and cleanup operations. Various SI strategies that have been developed by researchers differ in their underlying assumptions and solution techniques. The following manuscript presents a systematic procedure for testing and evaluating SI methods. The performance of these SI methods is affected by various factors including the size of WDN model, measurement error, modeling error, time and number of contaminant injections, and time and number of measurements. This paper includes test cases that vary these factors and evaluates three SI methods on the basis of accuracy and specificity. The tests are used to review and compare these different SI methods, highlighting their strengths in handling various identification scenarios. These SI methods and a testing framework that includes the test cases and analysis tools presented in this paper have been integrated into EPA’s Water Security Toolkit (WST), a suite of software tools to help researchers and others in the water industry evaluate and plan various response strategies in case of a contamination incident. Lastly, a set of recommendations are made for users to consider when working with different categories of SI methods.
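    A test matrix that varies the factors listed above can be generated as a full factorial design. The factor names and levels below are hypothetical placeholders for illustration, not the actual Water Security Toolkit test cases.

```python
from itertools import product

# Hypothetical factor levels for an SI-method test matrix.
factors = {
    "network_size": ["small", "large"],   # size of the WDN model
    "measurement_error": [0.0, 0.05],     # relative sensor noise
    "num_injections": [1, 2],             # simultaneous contaminant injections
}

# One test case per combination of factor levels (full factorial design).
test_cases = [dict(zip(factors, levels)) for levels in product(*factors.values())]
```

    Each resulting dictionary describes one scenario against which an SI method's accuracy and specificity can be scored.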

  18. Long-term Trend of Solar Coronal Hole Distribution from 1975 to 2014

    NASA Astrophysics Data System (ADS)

    Fujiki, K.; Tokumaru, M.; Hayashi, K.; Satonaka, D.; Hakamada, K.

    2016-08-01

    We developed an automated coronal-hole prediction technique using potential magnetic field extrapolation in the solar corona to construct a database of coronal holes appearing from 1975 February to 2015 July (Carrington rotations 1625 to 2165). Coronal holes are labeled with the location, size, and average magnetic field of each coronal hole on the photosphere and source surface. As a result, we identified 3335 coronal holes and found that their long-term distribution shows a pattern similar to the well-known magnetic butterfly diagram, and that polar/low-latitude coronal holes tended to decrease/increase in the last solar minimum relative to the previous two minima.

  19. [Spatial distribution and pollution source identification of agricultural non-point source pollution in Fujiang watershed].

    PubMed

    Ding, Xiao-Wen; Shen, Zhen-Yao

    2012-11-01

    In order to provide regulatory support for the management and control of non-point source (NPS) pollution in the Fujiang watershed, agricultural NPS pollution was simulated, the spatial distribution characteristics of NPS pollution were analyzed, and the primary pollution sources were identified using an export coefficient model (ECM) and a geographic information system (GIS). The agricultural NPS total nitrogen (TN) loading of the research area was 9.11 x 10(4) t in 2010, and the average loading intensity was 3.10 t x km(-2). Agricultural NPS TN loading was mainly distributed over dry lands, Mianyang city and gentle slope areas; high loading intensity areas were dry lands, Deyang city and gentle slope areas. Agricultural land use, with a contribution rate of 62.12%, was the most important pollution source; fertilizer loss from dry lands, with a contribution rate of 50.49%, was the most prominent. Improving methods of agricultural cultivation, implementing the "farm land returning to woodland" policy, and enhancing the treatment efficiency of domestic sewage and livestock wastewater are effective measures.
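    The export coefficient model named above computes a watershed load as the sum, over pollution sources, of an export coefficient times that source's extent. A minimal sketch with invented coefficients and areas (not the Fujiang watershed values):

```python
# Minimal export coefficient model (ECM): total load = sum over sources
# of (export coefficient x source extent). All numbers are illustrative.
export_coeff = {      # t TN km^-2 a^-1
    "dry_land": 2.5,
    "paddy_field": 1.2,
    "woodland": 0.3,
}
area = {              # km^2
    "dry_land": 120.0,
    "paddy_field": 200.0,
    "woodland": 400.0,
}
tn_load = sum(export_coeff[k] * area[k] for k in export_coeff)
contribution = {k: export_coeff[k] * area[k] / tn_load for k in export_coeff}
```

    The per-source contribution rates computed this way are the quantities the abstract reports (e.g., the share attributable to dry-land fertilizer loss).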

  20. Understanding the electrical behavior of the action potential in terms of elementary electrical sources.

    PubMed

    Rodriguez-Falces, Javier

    2015-03-01

    A concept of major importance in human electrophysiology studies is the process by which activation of an excitable cell results in a rapid rise and fall of the electrical membrane potential, the so-called action potential. Hodgkin and Huxley proposed a model to explain the ionic mechanisms underlying the formation of action potentials. However, this model is unsuitably complex for teaching purposes. In addition, the Hodgkin and Huxley approach describes the shape of the action potential only in terms of ionic currents, i.e., it is unable to explain the electrical significance of the action potential or describe the electrical field arising from this source using basic concepts of electromagnetic theory. The goal of the present report was to propose a new model to describe the electrical behavior of the action potential in terms of elementary electrical sources (in particular, dipoles). The efficacy of this model was tested through a closed-book written exam. The proposed model increased the ability of students to appreciate the distributed character of the action potential and also to recognize that this source spreads out along the fiber as a function of space. In addition, the new approach allowed students to realize that the amplitude and sign of the extracellular electrical potential arising from the action potential are determined by the spatial derivative of this intracellular source. The proposed model, which incorporates intuitive graphical representations, has improved students' understanding of the electrical potentials generated by bioelectrical sources and has heightened their interest in bioelectricity. Copyright © 2015 The American Physiological Society.
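    The dipole picture can be sketched numerically: treat the fiber as a line of elementary axial dipoles whose density follows the spatial derivative of the intracellular potential, then sum their contributions at an electrode. The waveform, kernel, and units below are illustrative assumptions, not the article's teaching model verbatim.

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 2001)      # position along the fiber (mm)
dx = x[1] - x[0]
vm = 100.0 * np.exp(-x**2)              # idealized intracellular potential profile (mV)
dipole_density = -np.gradient(vm, x)    # elementary source strength ~ -dVm/dx

def extracellular_potential(x_e, y_e):
    """Potential at electrode (x_e, y_e): summed contributions of axial
    point dipoles with an unbounded-medium kernel (arbitrary units)."""
    r = np.sqrt((x_e - x) ** 2 + y_e ** 2)
    return float(np.sum(dipole_density * (x_e - x) / r**3) * dx)

phi_under = extracellular_potential(0.0, 1.0)  # electrode over the depolarized zone
phi_away = extracellular_potential(5.0, 1.0)   # electrode away from the source
```

    Consistent with the abstract's point, the amplitude and sign at each electrode are set by the derivative term: in this sketch the potential is negative over the depolarized zone and smaller in magnitude, with opposite sign, away from it.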

  1. Source-rock distribution model of the periadriatic region

    SciTech Connect

    Zappaterra, E. )

    1994-03-01

    The Periadriatic area is a mosaic of geological provinces comprised of spatially and temporally similar tectonic-sedimentary cycles. Tectonic evolution progressed from a Triassic-Early Jurassic (Liassic) continental rifting stage on the northern edge of the African craton, through an Early Jurassic (Middle Liassic)-Late Cretaceous/Eocene oceanic rifting stage and passive margin formation, to a final continental collision and active margin deformation stage in the Late Cretaceous/Eocene to Holocene. Extensive shallow-water carbonate platform deposits covered large parts of the Periadriatic region in the Late Triassic. Platform breakup and development of a platform-to-basin carbonate shelf morphology began in the Late Triassic and extended through the Cretaceous. On the basis of this paleogeographic evolution, the regional geology of the Periadriatic region can be expressed in terms of three main Upper Triassic-Paleogene sedimentary sequences: (A) the platform sequence; (B) the platform-to-basin sequence; and (C) the basin sequence. These sequences developed during the initial rifting and subsequent passive-margin formation tectonic stages. The principal Triassic source basins and most of the surface hydrocarbon indications and economically important oil fields of the Periadriatic region are associated with sequence B areas. No major hydrocarbon accumulations can be directly attributed to the Jurassic-Cretaceous epioceanic and intraplatform source rock sequences. The third episode of source bed deposition characterizes the final active margin deformation stage and is represented by Upper Tertiary organic-rich terrigenous units, mostly gas-prone. These are essentially associated with turbiditic and flysch sequences of foredeep basins and have generated the greater part of the commercial biogenic gases of the Periadriatic region. 82 refs., 11 figs., 2 tabs.

  2. Accuracy-preserving source term quadrature for third-order edge-based discretization

    NASA Astrophysics Data System (ADS)

    Nishikawa, Hiroaki; Liu, Yi

    2017-09-01

    In this paper, we derive a family of source term quadrature formulas for preserving third-order accuracy of the node-centered edge-based discretization for conservation laws with source terms on arbitrary simplex grids. A three-parameter family of source term quadrature formulas is derived, and as a subset, a one-parameter family of economical formulas is identified that does not require second derivatives of the source term. Among the economical formulas, a unique formula is then derived that does not require gradients of the source term at neighbor nodes, thus leading to a significantly smaller discretization stencil for source terms. All the formulas derived in this paper do not require a boundary closure, and therefore can be directly applied at boundary nodes. Numerical results are presented to demonstrate third-order accuracy at interior and boundary nodes for one-dimensional grids and linear triangular/tetrahedral grids over straight and curved geometries.

  3. Long-term optical behavior of 114 extragalactic sources

    NASA Astrophysics Data System (ADS)

    Pica, A. J.; Pollock, J. T.; Smith, A. G.; Leacock, R. J.; Edwards, P. L.; Scott, R. L.

    1980-11-01

    Photographic observations of over 200 quasars and related objects have been obtained at the Rosemary Hill Observatory since 1968. Twenty optically violent variables were reported on by Pollock et al. (1979). This paper presents data for 114 less active sources, 58 of which exhibit optical variations at a confidence level of 95% or greater. Light curves are given for the 26 most active sources. In addition, the overall monitoring program at the Observatory is reviewed, and information on the status of 206 objects is provided.

  4. Homogenization of the Brush Problem with a Source Term in L1

    NASA Astrophysics Data System (ADS)

    Gaudiello, Antonio; Guibé, Olivier; Murat, François

    2017-07-01

    We consider a domain which has the form of a brush in 3D or the form of a comb in 2D, i.e. an open set which is composed of cylindrical vertical teeth distributed over a fixed basis. All the teeth have a similar fixed height; their cross sections can vary from one tooth to another and are not supposed to be smooth; moreover the teeth can be adjacent, i.e. they can share parts of their boundaries. The diameter of every tooth is supposed to be less than or equal to ɛ, and the asymptotic volume fraction of the teeth (as ɛ tends to zero) is supposed to be bounded from below away from zero, but no periodicity is assumed on the distribution of the teeth. In this domain we study the asymptotic behavior (as ɛ tends to zero) of the solution of a second order elliptic equation with a zeroth order term which is bounded from below away from zero, when the homogeneous Neumann boundary condition is satisfied on the whole of the boundary. First, we revisit the problem where the source term belongs to L2. This is a classical problem, but our homogenization result takes place in a geometry which is more general than the ones which have been considered before. Moreover we prove a corrector result which is new. Then, we study the case where the source term belongs to L1. Working in the framework of renormalized solutions and introducing a definition of renormalized solutions for degenerate elliptic equations where only the vertical derivative is involved (such a definition is new), we identify the limit problem and prove a corrector result.
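    In generic notation, the problem described can be written as follows; this is a sketch consistent with the abstract's description (second order elliptic operator, zeroth order term bounded below away from zero, homogeneous Neumann condition), and the paper's exact coefficient assumptions may differ:

```latex
\begin{cases}
-\,\mathrm{div}\bigl(A(x)\,\nabla u_\varepsilon\bigr) + c(x)\,u_\varepsilon = f
  & \text{in } \Omega_\varepsilon,\\
\bigl(A(x)\,\nabla u_\varepsilon\bigr)\cdot \nu = 0
  & \text{on } \partial\Omega_\varepsilon,
\end{cases}
\qquad c(x) \ \ge\ c_0 > 0,
```

    with $f \in L^2(\Omega_\varepsilon)$ in the classical case revisited first, and $f \in L^1(\Omega_\varepsilon)$ in the renormalized-solution setting studied second.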

  5. Homogenization of the Brush Problem with a Source Term in L1

    NASA Astrophysics Data System (ADS)

    Gaudiello, Antonio; Guibé, Olivier; Murat, François

    2017-03-01

    We consider a domain which has the form of a brush in 3D or the form of a comb in 2D, i.e. an open set which is composed of cylindrical vertical teeth distributed over a fixed basis. All the teeth have a similar fixed height; their cross sections can vary from one tooth to another and are not supposed to be smooth; moreover the teeth can be adjacent, i.e. they can share parts of their boundaries. The diameter of every tooth is supposed to be less than or equal to ɛ, and the asymptotic volume fraction of the teeth (as ɛ tends to zero) is supposed to be bounded from below away from zero, but no periodicity is assumed on the distribution of the teeth. In this domain we study the asymptotic behavior (as ɛ tends to zero) of the solution of a second order elliptic equation with a zeroth order term which is bounded from below away from zero, when the homogeneous Neumann boundary condition is satisfied on the whole of the boundary. First, we revisit the problem where the source term belongs to L2. This is a classical problem, but our homogenization result takes place in a geometry which is more general than the ones which have been considered before. Moreover we prove a corrector result which is new. Then, we study the case where the source term belongs to L1. Working in the framework of renormalized solutions and introducing a definition of renormalized solutions for degenerate elliptic equations where only the vertical derivative is involved (such a definition is new), we identify the limit problem and prove a corrector result.

  6. CHALLENGES IN SOURCE TERM MODELING OF DECONTAMINATION AND DECOMMISSIONING WASTES.

    SciTech Connect

    SULLIVAN, T.M.

    2006-08-01

    Development of real-time predictive modeling to identify the dispersion and/or source(s) of airborne weapons of mass destruction, including chemical, biological, radiological, and nuclear material, in urban environments is needed to improve the response to potential releases of these materials by either terrorist or accidental means. These models will also prove useful in defining airborne pollution dispersion in urban environments for pollution management/abatement programs. Predicting gas flow in an urban setting on a scale of less than a few kilometers is a complicated and challenging task due to the irregular flow paths that occur along streets and alleys and around buildings of different sizes and shapes, i.e., ''urban canyons''. In addition, air exchange between the outside and buildings and subway areas further complicates the situation. Transport models that are used to predict the dispersion of WMD/CBRN materials or to back-track the source of a release require high-density data and defensible parameterizations of urban processes. Errors in the data or in any of the parameter inputs or assumptions will lead to misidentification of the airborne spread or of the source release location(s). The need for these models to provide output in real time if they are to be useful for emergency response poses another challenge. To improve the ability of New York City's (NYC's) emergency management teams and first-response personnel to protect the public during releases of hazardous materials, the New York City Urban Dispersion Program (UDP) has been initiated. This is a four-year research program conducted from 2004 through 2007. This paper discusses ground-level and subway perfluorocarbon tracer (PFT) release studies conducted in New York City. The studies released multiple tracers to study ground-level and vertical transport of contaminants, and the paper discusses the results from these tests and how they can be used to improve transport models.

  7. Source term prediction in a multilayer tissue during hyperthermia.

    PubMed

    Baghban, M; Ayani, M B

    2015-08-01

    One of the major challenges in the use of hyperthermia to treat cancer is determining the desired heating power of the external source in such a way that the thermal injury is confined to the unhealthy tissue. In this study, an inverse method based on the sequential method is proposed to estimate the desired heating power as a function of time for a successful hyperthermia treatment. In order to simulate the measured temperature, the direct problem is solved for a multilayer skin tissue to obtain the temperature data at the skin surface. These data are employed in the inverse problem to estimate the heating power of the external source. Two examples are considered to examine the accuracy of the inverse analysis. In addition, the effect of measurement errors is investigated. Results show that the proposed inverse algorithm is able to determine the desired heating power of the external source accurately, even in the presence of measurement errors. However, for noisy data, more temperature measurements are required to achieve reliable results. Copyright © 2015 Elsevier Ltd. All rights reserved.
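
    The sequential idea described in this abstract can be illustrated with a deliberately simplified sketch: a lumped, single-node tissue model in place of the multilayer bioheat equation, with hypothetical parameter values, where each step of the surface-temperature record is inverted for the heating power applied during that step.

```python
import numpy as np

# Hypothetical lumped-parameter sketch of sequential source estimation.
# Forward model: T[k+1] = T[k] + dt * (P[k]/C - h*(T[k] - T_inf)),
# inverted step by step to recover the heating power P[k] from temperatures.
def estimate_power(T_meas, dt, C, h, T_inf):
    P = np.empty(len(T_meas) - 1)
    for k in range(len(P)):
        dTdt = (T_meas[k + 1] - T_meas[k]) / dt   # finite-difference rate
        P[k] = C * (dTdt + h * (T_meas[k] - T_inf))
    return P

# Synthetic check: generate temperatures from a known power history,
# then verify the inverse step recovers it (all parameter values illustrative).
dt, C, h, T_inf = 0.1, 4.0, 0.5, 37.0
P_true = np.full(50, 2.0)
T = [37.0]
for P_k in P_true:
    T.append(T[-1] + dt * (P_k / C - h * (T[-1] - T_inf)))
P_est = estimate_power(np.array(T), dt, C, h, T_inf)
```

    With noisy measurements, the paper's point applies: a per-step inversion like this amplifies noise, so more measurements (or regularization over a sliding window of future steps) are needed for reliable estimates.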

  8. Modeling stochasticity in gene regulation: characterization in the terms of the underlying distribution function.

    PubMed

    Paszek, Pawel

    2007-07-01

    Intrinsic stochasticity plays an essential role in gene regulation because of the small number of involved molecules of DNA, mRNA and protein of a given species. To better understand this phenomenon, small gene regulatory systems are mathematically modeled as systems of coupled chemical reactions, but the existing exact description utilizing a Chapman-Kolmogorov equation or simulation algorithms is limited and inefficient. The present work considers a much more efficient yet accurate modeling approach, which allows analyzing stochasticity in the system in terms of the underlying distribution function. We depart from the analysis of a single gene regulatory module to find that the mRNA and protein variance is decomposable into additive terms resulting from the respective sources of stochasticity. This variance decomposition is asserted by constructing two approximations to the exact stochastic description: first, the continuous approximation, which considers only the stochasticity due to the intermittent gene activity; second, the mixed approximation, which in addition attributes stochasticity to the mRNA transcription/decay process. The considered approximations yield systems of first-order partial differential equations for the underlying distribution function, which can be efficiently solved using the developed numerical methods. Single-cell simulations and numerical two-dimensional mRNA-protein stationary distribution functions are presented to confirm the accuracy of the approximating models.

  9. Environmental radiation safety: source term modification by soil aerosols. Interim report

    SciTech Connect

    Moss, O.R.; Allen, M.D.; Rossignol, E.J.; Cannon, W.C.

    1980-08-01

    The goal of this project is to provide information useful in estimating hazards related to the use of a pure refractory oxide of 238Pu as a power source in some of the space vehicles to be launched during the next few years. Although the sources are designed and built to withstand re-entry into the earth's atmosphere, and to impact with the earth's surface without releasing any plutonium, the possibility that such an event might produce aerosols composed of soil and 238PuO2 cannot be absolutely excluded. This report presents the results of our most recent efforts to measure the degree to which the plutonium aerosol source term might be modified in a terrestrial environment. The five experiments described represent our best effort to use the original experimental design to study the change in the size distribution and concentration of a 238PuO2 aerosol due to coagulation with an aerosol of clay or sandy loam soil.

  10. Plutonium isotopes and 241Am in the atmosphere of Lithuania: A comparison of different source terms

    NASA Astrophysics Data System (ADS)

    Lujanienė, G.; Valiulis, D.; Byčenkienė, S.; Šakalys, J.; Povinec, P. P.

    2012-12-01

    137Cs, 241Am and Pu isotopes collected in aerosol samples during 1994-2011 were analyzed with special emphasis on better understanding the behavior of Pu and Am in the atmosphere. The results from long-term measurements of 240Pu/239Pu atom ratios showed a bimodal frequency distribution with median values of 0.195 and 0.253, indicating two main sources contributing to the Pu activities at the Vilnius sampling station. The low Pu atom ratio of 0.141 could be attributed to weapon-grade plutonium derived from the nuclear weapon test sites. The frequency of air masses arriving from the North-West and North-East correlated with the Pu atom ratio, indicating input from sources located in these regions (the Novaya Zemlya test site, Siberian nuclear plants), while no correlation with the Chernobyl region was observed. Measurements carried out during the Fukushima accident showed a negligible impact of this source, with Pu activities four orders of magnitude lower than those from the Chernobyl accident. The activity concentration of actinides measured in the integrated sample collected in March-April 2011 showed a small contribution of Pu with unusual activity and atom ratios, indicating the presence of spent fuel of a different origin than that of the Chernobyl accident.

  11. 7 CFR 1822.268 - Rates, terms, and source of funds.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... under Public Law 103-354 will be the lower of the interest rates in effect at the time of loan approval... 7 Agriculture 12 2011-01-01 2011-01-01 false Rates, terms, and source of funds. 1822.268 Section... Site Loan Policies, Procedures, and Authorizations § 1822.268 Rates, terms, and source of funds....

  12. 7 CFR 1822.268 - Rates, terms, and source of funds.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... under Public Law 103-354 will be the lower of the interest rates in effect at the time of loan approval... 7 Agriculture 12 2014-01-01 2013-01-01 true Rates, terms, and source of funds. 1822.268 Section... Site Loan Policies, Procedures, and Authorizations § 1822.268 Rates, terms, and source of funds....

  13. Source term estimation during incident response to severe nuclear power plant accidents

    SciTech Connect

    McKenna, T.J.; Glitter, J.G.

    1988-10-01

    This document presents a method of source term estimation that reflects the current understanding of source term behavior and that can be used during an event. The various methods of estimating radionuclide release to the environment (source terms) as a result of an accident at a nuclear power reactor are discussed. The major factors affecting potential radionuclide releases off site (source terms) as a result of nuclear power plant accidents are described. The quantification of these factors based on plant instrumentation is also discussed. A range of accident conditions, from those within the design basis to the most severe accidents possible, is included in the text. A method of gross estimation of accident source terms and their off-site consequences is presented. 39 refs., 48 figs., 19 tabs.

  14. Human Term Placenta as a Source of Hematopoietic Cells

    PubMed Central

    Serikov, Vladimir; Hounshell, Catherine; Larkin, Sandra; Green, William; Ikeda, Hirokazu; Walters, Mark C.

    2012-01-01

    The main barrier to a broader clinical application of umbilical cord blood (UCB) transplantation is its limited cellular content. Thus, the discovery of hematopoietic progenitor cells in murine placental tissue led us to investigate whether the human placenta contains hematopoietic cells and sites of hematopoiesis, and to develop a procedure for processing and storing placental hematopoietic cells for transplantation. Here we show that the human placenta contains large numbers of CD34-expressing hematopoietic cells, with the potential to provide a cellular yield several-fold greater than that of a typical UCB harvest. Cells from fresh or cryopreserved placental tissue generated erythroid and myeloid colonies in culture, and also produced lymphoid cells after transplantation in immunodeficient mice. These results suggest that the human placenta could become an important new source of hematopoietic cells for allogeneic transplantation. PMID:19429852

  15. Open Source assimilation tool for distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Richard, Julien; Giangola-Murzyn, Agathe; Tchiguirinskaia, Ioulia; Schertzer, Daniel

    2013-04-01

    An advanced GIS data assimilation interface is a prerequisite for a distributed hydrological model that is both transportable from catchment to catchment and easily adaptable to the data resolution. This tool handles the cartographic data as well as the linked information data. In the case of the Multi-Hydro-Version2 model (A. Giangola-Murzyn et al. 2012), several types of information are distributed on a regular grid. The grid cell size has to be chosen by the user, and each cell has to be filled with information. To be as realistic as possible, the Multi-Hydro model takes several kinds of data into account, so the assimilation tool (MH-AssimTool) has to be able to import all of this different information. The required flexibility in study area and grid size means that the GIS interface must be both easy to learn and practical. The chosen solution is a main window for geographical visualization, with hierarchical menus coupled with checkboxes. For example, geographical information such as the topography or the land use can be visualized in the main window, while other data, such as the soil conductivity, the geology or the initial moisture, are requested through several pop-up windows. Once the needed information is imported, MH-AssimTool prepares the data automatically. During the conversion of the topography data, if the resolution is insufficient, an interpolation is performed, so that all the converted data end up at a resolution suitable for the modelling. Like Multi-Hydro, MH-AssimTool is open source. It is coded in Visual Basic, coupled with a GIS library, and the interface is built so that it can be used by a non-specialist. We will illustrate the efficiency of the tool with case studies of peri-urban catchments of widely different sizes and characteristics, and we will also explain some parts of the coding of the interface.

  16. 78 FR 41398 - SourceGas Distribution LLC; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-10

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission SourceGas Distribution LLC; Notice of Filing Take notice that on June 27, 2013, SourceGas Distribution LLC (SourceGas) filed a Rate Election and revised Statement of...

  17. 77 FR 28374 - SourceGas Distribution LLC; Notice of Compliance Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-14

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission SourceGas Distribution LLC; Notice of Compliance Filing Take notice that on April 30, 2012, SourceGas Distribution LLC (SourceGas) filed a revised Statement of Operating...

  18. 78 FR 6318 - SourceGas Distribution LLC; Notice of Petition for Rate Approval

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-30

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission SourceGas Distribution LLC; Notice of Petition for Rate Approval Take notice that on January 15, 2013, SourceGas Distribution LLC (SourceGas) filed a rate election pursuant to section 284.123(b)(1) of the Commissions...

  19. Future prospects for ECR ion sources with improved charge state distributions

    SciTech Connect

    Alton, G.D.

    1995-12-31

    Despite the steady advance of ECR ion source technology, present art forms have not yet reached their full potential in terms of charge state and intensity within a particular charge state, in part because of the narrow-bandwidth, single-frequency microwave radiation used to heat the plasma electrons. This article identifies fundamentally important methods that may enhance the performance of ECR ion sources through the use of: (1) a tailored magnetic field configuration (spatial domain) in combination with single-frequency microwave radiation to create a large, uniformly distributed ECR "volume"; or (2) broadband frequency-domain techniques (variable-frequency, broadband, or multiple-discrete-frequency microwave radiation), derived from standard TWT technology, to transform the resonant plasma "surfaces" of traditional ECR ion sources into resonant plasma "volumes". The creation of a large ECR plasma "volume" permits coupling more power into the plasma, resulting in the heating of a much larger electron population to higher energies, thereby producing higher charge state ions and much higher intensities within a particular charge state than is possible in present forms of the source. The ECR ion source concepts described in this article offer exciting opportunities to significantly advance the state of the art of ECR technology and, as a consequence, open new opportunities in fundamental and applied research and for a variety of industrial applications.

  20. Source and distribution of metals in urban soil of Bombay, India, using multivariate statistical techniques

    NASA Astrophysics Data System (ADS)

    Ratha, D. S.; Sahu, B. K.

    1993-11-01

    Simplification of a complex system of geochemical variables obtained from the soils of an industrialized area of Bombay is attempted by means of R-mode factor analysis. Prior to the factor analysis, discriminant analysis was carried out on rock and soil chemical data to establish the anthropogenic contribution of metals in the soil. Trace elements (Cd, Co, Cr, Cu, Fe, Mn, Ni, Pb, and Zn) are expressed in terms of three rotated factors. The factors mostly indicate anthropogenic sources of metals such as atmospheric fallout, emissions from different industrial chimneys, crushing operations in quarries, and sewage sludges. Major elements (Na, Mg, Al, Si, P, K, Ca, Ti, Mn, and Fe) are also expressed in terms of three rotated factors, indicating natural processes such as chemical weathering, the presence of clay minerals, and contributions from sewage sludges and municipal refuse. Summary statistics (mean, standard deviation, skewness, and kurtosis) for the particle size distribution indicated a moderate dominance of fine particles. Mineralogical studies revealed the presence of montmorillonite, kaolinite, and illite types of clay minerals. Thus the present study provides information about the metals entering the soil and about their levels, sources, and distribution in the area.
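
    As an illustration of the R-mode approach described in this abstract (not the authors' actual computation), factor loadings can be extracted from the eigendecomposition of the inter-element correlation matrix. All data below are synthetic: two hidden "sources" mixed into five element concentrations.

```python
import numpy as np

# Illustrative R-mode factor extraction on simulated "geochemical" data:
# two latent sources (e.g. anthropogenic vs. natural) load onto five elements;
# loadings are eigenvectors of the correlation matrix scaled by sqrt(eigenvalue).
rng = np.random.default_rng(0)
n_samples, n_elements = 200, 5
sources = rng.normal(size=(n_samples, 2))      # latent source strengths
mixing = rng.normal(size=(2, n_elements))      # how each source loads on elements
data = sources @ mixing + 0.05 * rng.normal(size=(n_samples, n_elements))

corr = np.corrcoef(data, rowvar=False)         # R-mode: correlate the variables
eigvals, eigvecs = np.linalg.eigh(corr)        # eigh returns ascending order
order = np.argsort(eigvals)[::-1]              # strongest factors first
loadings = eigvecs[:, order] * np.sqrt(eigvals[order])

# With two latent sources, the two leading factors should carry most of the
# total variance (the trace of the correlation matrix, i.e. n_elements).
explained = eigvals[order][:2].sum() / n_elements
```

    In practice a rotation (e.g. varimax) is applied to make the retained factors interpretable as distinct pollution or weathering sources, as in the study above.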

  1. Processes driving short-term temporal dynamics of small mammal distribution in human-disturbed environments.

    PubMed

    Martineau, Julie; Pothier, David; Fortin, Daniel

    2016-07-01

    As the impact of anthropogenic activities intensifies worldwide, an increasing proportion of landscape is converted to early successional stages every year. To understand and anticipate the global effects of the human footprint on wildlife, assessing short-term changes in animal populations in response to disturbance events is becoming increasingly important. We used isodar habitat selection theory to reveal the consequences of timber harvesting on the ecological processes that control the distribution dynamics of a small mammal, the red-backed vole (Myodes gapperi). The abundance of voles was estimated in pairs of cut and uncut forest stands, prior to logging and up to 2 years afterwards. A week after logging, voles did not display any preference between cut and uncut stands, and a non-significant isodar indicated that their distribution was not driven by density-dependent habitat selection. One month after harvesting, however, juvenile abundance increased in cut stands, whereas the highest proportions of reproductive females were observed in uncut stands. This distribution pattern appears to result from interference competition, with juveniles moving into cuts where there was weaker competition with adults. In fact, the emergence of source-sink dynamics between uncut and cut stands, driven by interference competition, could explain why the abundance of red-backed voles became lower in cut (the sink) than uncut (the source) stands 1-2 years after logging. Our study demonstrates that the influences of density-dependent habitat selection and interference competition in shaping animal distribution can vary frequently, and for several months, following anthropogenic disturbance.

  2. Spatial distribution and migration of nonylphenol in groundwater following long-term wastewater irrigation.

    PubMed

    Wang, Shiyu; Wu, Wenyong; Liu, Fei; Yin, Shiyang; Bao, Zhe; Liu, Honglu

    2015-01-01

    Seen as a solution to water shortages, wastewater reuse for crop irrigation does, however, pose a risk owing to the potential release of organic contaminants into soil and water. The frequency of detection (FOD), concentration, and migration of nonylphenol (NP) isomers in reclaimed water (FODRW), surface water (FODSW), and groundwater (FODGW) were investigated in a long-term wastewater irrigation area in Beijing. The FODRW, FODSW and FODGW of any or all of 12 NP isomers were 66.7% to 100%, 76.9% to 100% and 13.3% to 60%, respectively. The mean (±standard deviation) NP concentrations of the reclaimed water, surface water, and groundwater (NPRW, NPSW, and NPGW, respectively) were 469.4±73.4 ng L(-1), 694.6±248.7 ng L(-1) and 244.4±230.8 ng L(-1), respectively. The existence of external pollution sources during water transmission and distribution resulted in NPSW exceeding NPRW. The NP distribution in groundwater was related to the duration and quantity of wastewater irrigation and to the sources of aquifer recharge, and was seen to decrease with increasing aquifer depth. A higher riverside infiltration rate led to higher FODGW values. The migration rates of NP isomers were classified as high, moderate or low. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Chemotaxis Increases the Residence Time Distribution of Bacteria in Granular Media Containing Distributed Contaminant Sources

    NASA Astrophysics Data System (ADS)

    Adadevoh, J.; Triolo, S.; Ramsburg, C. A.; Ford, R.

    2015-12-01

    The use of chemotactic bacteria in bioremediation has the potential to increase access to, and biotransformation of, contaminant mass within the subsurface environment. This laboratory-scale study aimed to understand and quantify the influence of chemotaxis on residence times of pollutant-degrading bacteria within homogeneous treatment zones. Focus was placed on a continuous flow sand-packed column system in which a uniform distribution of naphthalene crystals created distributed sources of dissolved phase contaminant. A 10 mL pulse of Pseudomonas putida G7, which is chemotactic to naphthalene, and Pseudomonas putida G7 Y1, a non-chemotactic mutant strain, were simultaneously introduced into the sand-packed column at equal concentrations. Breakthrough curves obtained for the bacteria from column experiments conducted with and without naphthalene were used to quantify the effect of chemotaxis on transport parameters. In the presence of the chemoattractant, longitudinal dispersivity of PpG7 increased by a factor of 3 and percent recovery decreased from 21% to 12%. The results imply that pore-scale chemotaxis responses are evident at an interstitial fluid velocity of 1.7 m/d, which is within the range of typical groundwater flow. Within the context of bioremediation, chemotaxis may work to enhance bacterial residence times in zones of contamination thereby improving treatment.

  4. Particles emitted from indoor combustion sources: size distribution measurement and chemical analysis.

    PubMed

    Roy, A A; Baxla, S P; Gupta, Tarun; Bandyopadhyaya, R; Tripathi, S N

    2009-08-01

    This study is primarily focused toward measuring the particle size distribution and performing chemical analysis of particulate matter that originates from combustion sources typically found in Indian urban homes. Four such sources were selected: cigarette, incense stick, mosquito coil, and dhoop, the latter being actually a thick form of incense stick. Altogether, seven of the most popular brands available in the Indian market were tested. Particle size distribution in the smoke was measured using a scanning mobility particle sizer, with both the long and nano forms of the differential mobility analyzer (DMA), with readings averaged from four to six runs. The measurable particle size range of the nano DMA was 4.6 nm to 157.8 nm, whereas that of the long DMA was 15.7 nm to 637.8 nm. Therefore, readings obtained from the long and the nano DMA were compared for different brands as well as for different sources. An overlap was seen in the readings in the common range of measurement. The lowest value of peak concentration was seen for one brand of incense stick (0.9 x 10(6) cm(-3)), whereas the highest (7.1 x 10(6) cm(-3)) was seen for the dhoop. Generally, these sources showed a peak between 140 and 170 nm; however, 2 incense stick brands showed peaks at 79 nm and 89 nm. The dhoop showed results much different from the rest of the sources, with a mode at around 240 nm. Chemical analysis in terms of three heavy metals (cadmium, zinc, and lead) was performed using a graphite tube atomizer and a flame-atomic absorption spectrophotometer. Calculations were made to assess the expected cancer and noncancer risks, using published toxicity potentials for these three heavy metals. Our calculations revealed that all the sources showed lead concentrations much below the American Conference of Governmental Industrial Hygienists (ACGIH) threshold limit value (TLV) level. One of the two mosquito coil brands (M(2)) showed cadmium concentrations two times higher than the California Environmental

  5. Strategies for satellite-based monitoring of CO2 from distributed area and point sources

    NASA Astrophysics Data System (ADS)

    Schwandner, Florian M.; Miller, Charles E.; Duren, Riley M.; Natraj, Vijay; Eldering, Annmarie; Gunson, Michael R.; Crisp, David

    2014-05-01

    and sensor provides the full range of temporal sampling needed to characterize distributed area and point source emissions. For instance, point source emission patterns will vary with source strength, wind speed and direction. Because wind speed, direction and other environmental factors change rapidly, short-term variability should be sampled. For detailed target selection and pointing verification, important lessons have already been learned and strategies devised during JAXA's GOSAT mission (Schwandner et al., 2013). The fact that competing spatial and temporal requirements drive satellite remote sensing sampling strategies dictates a systematic, multi-factor consideration of potential solutions. Factors to consider include vista, revisit frequency, integration times, spatial resolution, and spatial coverage. No single satellite-based remote sensing solution can address this problem at all scales. It is therefore of paramount importance for the international community to develop and maintain a constellation of atmospheric CO2 monitoring satellites that complement each other in their temporal and spatial observation capabilities: polar sun-synchronous orbits (fixed local solar time, no diurnal information) with agile pointing allow global sampling of known distributed area and point sources, such as megacities, power plants and volcanoes, with daily to weekly revisits and moderate to high spatial resolution. Extensive targeting of distributed area and point sources comes at the expense of reduced mapping or spatial coverage, and of the important contextual information that comes with large-scale contiguous spatial sampling. Polar sun-synchronous orbits with push-broom swath-mapping but limited pointing agility may allow mapping of individual source plumes and their spatial variability, but will depend on fortuitous environmental conditions during the observing period. These solutions typically have longer times between revisits, limiting their ability to resolve

  6. WATER QUALITY IN SOURCE WATER, TREATMENT, AND DISTRIBUTION SYSTEMS

    EPA Science Inventory

    Most drinking water utilities practice the multiple-barrier concept as the guiding principle for providing safe water. This chapter discusses multiple barriers as they relate to the basic criteria for selecting and protecting source waters, including known and potential sources ...

  8. Source-term reevaluation for US commercial nuclear power reactors: a status report

    SciTech Connect

    Herzenberg, C.L.; Ball, J.R.; Ramaswami, D.

    1984-12-01

    Only results that had been discussed publicly, had been published in the open literature, or were available in preliminary reports as of September 30, 1984, are included here. More than 20 organizations are participating in source-term programs, which have been undertaken to examine severe accident phenomena in light-water power reactors (including the chemical and physical behavior of fission products under accident conditions), update and reevaluate source terms, and resolve differences between predictions and observations of radiation releases and related phenomena. Results from these source-term activities have been documented in over 100 publications to date.

  9. Source term model evaluations for the low-level waste facility performance assessment

    SciTech Connect

    Yim, M.S.; Su, S.I.

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  10. Source terms and attenuation lengths for estimating shielding requirements or dose analyses of proton therapy accelerators.

    PubMed

    Sheu, Rong-Jiun; Lai, Bo-Lun; Lin, Uei-Tyng; Jiang, Shiang-Huei

    2013-08-01

    Proton therapy accelerators in the energy range of 100-300 MeV could potentially produce intense secondary radiation, which must be carefully evaluated and shielded for the purpose of radiation safety in a densely populated hospital. Monte Carlo simulations are generally the most accurate method for accelerator shielding design. However, simplified approaches such as the commonly used point-source line-of-sight model are usually preferable on many practical occasions, especially for scoping shielding design or quick sensitivity studies. This work provides a set of reliable shielding data with reasonable coverage of common target and shielding materials for 100-300 MeV proton accelerators. The shielding data, including source terms and attenuation lengths, were derived from a consistent curve fitting process of a number of depth-dose distributions within the shield, which were systematically calculated by using MCNPX for various beam-target shield configurations. The general characteristics and qualities of this data set are presented. Possible applications in cases of single- and double-layer shielding are considered and demonstrated.
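
    The point-source line-of-sight model referred to in this abstract has a standard functional form: inverse-square falloff with distance, and exponential attenuation along the slant path through the shield. A minimal sketch follows; the numerical values are hypothetical, not fitted parameters from the paper.

```python
import math

# Point-source line-of-sight shielding estimate (illustrative values only).
def dose_rate(H0, lam, r, t, theta_deg=0.0):
    """H0: source term (dose rate normalized to unit distance);
    lam: attenuation length, same units as shield thickness t;
    r: source-to-dose-point distance; t: shield thickness along the normal;
    theta_deg: angle off the shield normal, giving slant path t/cos(theta)."""
    slant = t / math.cos(math.radians(theta_deg))
    return H0 * math.exp(-slant / lam) / r ** 2

# With no shield, the model reduces to plain inverse-square falloff:
h_near = dose_rate(H0=1.0e6, lam=0.5, r=1.0, t=0.0)
h_far = dose_rate(H0=1.0e6, lam=0.5, r=2.0, t=0.0)

# Each extra attenuation length of shielding cuts the dose by a factor of e:
h_1 = dose_rate(H0=1.0e6, lam=0.5, r=4.0, t=1.0)
h_2 = dose_rate(H0=1.0e6, lam=0.5, r=4.0, t=1.5)
```

    This is why tabulated source terms and attenuation lengths, such as those derived in the paper, are sufficient for quick scoping estimates without a full Monte Carlo run.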

  11. Distribution, sources and health risk assessment of mercury in kindergarten dust

    NASA Astrophysics Data System (ADS)

    Sun, Guangyi; Li, Zhonggen; Bi, Xiangyang; Chen, Yupeng; Lu, Shuangfang; Yuan, Xin

    2013-07-01

    Mercury (Hg) contamination in urban areas is a hot issue in environmental research. In this study, the distribution, sources and health risks of Hg in dust from 69 kindergartens in Wuhan, China, were investigated. In comparison with most other cities, the concentrations of total mercury (THg) and methylmercury (MeHg) were significantly elevated, ranging from 0.15 to 10.59 mg kg-1 and from 0.64 to 3.88 μg kg-1, respectively. Among the five different urban areas, the educational area had the highest concentrations of THg and MeHg. GIS mapping was used to identify the hot-spot areas and to assess the potential pollution sources of Hg. The emissions of coal-fired power plants and coking plants were the main sources of THg in the dust, whereas the contributions of municipal solid waste (MSW) landfills and of iron- and steel-smelting related industries were not significant. However, the emission from MSW landfills was considered to be an important source of MeHg in the studied area. The health risk assessment indicated that the Hg contamination of kindergarten dust poses a high adverse health risk to children living in the educational area (hazard index (HI) = 6.89).

  12. Spatial Distribution of Soil Fauna In Long Term No Tillage

    NASA Astrophysics Data System (ADS)

    Corbo, J. Z. F.; Vieira, S. R.; Siqueira, G. M.

    2012-04-01

    The soil is a complex system constituted by living beings and organic and mineral particles, whose components define its physical, chemical and biological properties. Soil fauna plays an important role in the soil and may reflect and interfere with its functionality. These organisms' populations may be influenced by management practices, fertilization, liming and porosity, among other factors. Such changes may reduce the composition and distribution of the soil fauna community. Thus, this study aimed to determine the spatial variability of soil fauna in a consolidated no-tillage system. The experimental area is located at the Instituto Agronômico in Campinas (São Paulo, Brazil). The sampling was conducted in a Rhodic Eutrudox under a no-tillage system, and 302 points distributed over a 3.2-hectare area in a regular grid of 10.00 m x 10.00 m were sampled. The soil fauna was sampled with the pitfall-trap method, and the traps remained in the area for seven days. Data were analyzed using descriptive statistics to determine the main statistical moments (mean, variance, coefficient of variation, standard deviation, skewness and kurtosis). Geostatistical tools were used to determine the spatial variability of the attributes using the experimental semivariogram. For the biodiversity analysis, the Shannon and Pielou indexes and the richness were calculated for each sample. Geostatistics proved to be a great tool for mapping the spatial variability of groups of the soil epigeal fauna. The family Formicidae proved to be the most abundant and dominant in the study area. The parameters of the descriptive statistics showed that all studied attributes had a lognormal frequency distribution for the groups of the epigeal soil fauna. The exponential model was the best suited to the obtained data, both for the groups of epigeal soil fauna (Acari, Araneae, Coleoptera, Formicidae and Coleoptera larvae) and for the other biodiversity indexes. The sampling scheme (10.00 m x 10.00 m) was not sufficient to detect the spatial
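
    The exponential semivariogram model named in this abstract has a standard closed form, gamma(h) = c0 + c1·(1 - exp(-h/a)), with nugget c0, sill c0 + c1, and range parameter a. The sketch below uses hypothetical parameter values, not the study's fitted ones.

```python
import numpy as np

# Exponential semivariogram: nugget c0, partial sill c1 (total sill c0 + c1),
# effective range ~3a. Parameter values here are illustrative only.
def exponential_semivariogram(h, c0, c1, a):
    return c0 + c1 * (1.0 - np.exp(-h / a))

lags = np.array([0.0, 10.0, 20.0, 40.0, 80.0])   # lag distances, m
gamma = exponential_semivariogram(lags, c0=0.1, c1=0.9, a=30.0)
```

    Fitting this curve to the experimental semivariogram of, say, Formicidae counts yields the range beyond which samples are spatially uncorrelated, which is what determines whether a 10 m x 10 m grid can resolve the spatial structure.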

  13. An Investigation of the Influence of Indexing Exhaustivity and Term Distributions on a Document Space.

    ERIC Educational Resources Information Center

    Wolfram, Dietmar; Zhang, Jin

    2002-01-01

    Investigates the influence of index term distributions and indexing exhaustivity on the document space within a visual information retrieval environment called DARE (Distance Angle Retrieval Environment). Discusses results that demonstrate the importance of term distribution and exhaustivity on the density of document spaces and their implications…

  15. Use of open source distribution for a machine tool controller

    NASA Astrophysics Data System (ADS)

    Shackleford, William P.; Proctor, Frederick M.

    2001-02-01

    In recent years a growing number of government and university labs, non-profit organizations and even a few for-profit corporations have found that making their source code public is good for both developers and users. In machine tool control, a growing number of users are demanding that the controllers they buy be `open architecture,' which would allow third parties and end-users at least a limited ability to modify, extend or replace the components of that controller. This paper examines the advantages and dangers of going one step further and providing `open source' controllers, by relating the experiences of users and developers of the Enhanced Machine Controller. We also examine some implications for the development of standards for open-architecture but closed-source controllers. Some of the questions we hope to answer include: How can quality be maintained after the source code has been modified? Can the code be trusted to run on expensive machines and parts, or when the safety of the operator is an issue? Can `open-architecture' but closed-source controllers ever achieve the level of flexibility or extensibility that open-source controllers can?

  16. Source terms: an investigation of uncertainties, magnitudes, and recommendations for research. [PWR; BWR

    SciTech Connect

    Levine, S.; Kaiser, G. D.; Arcieri, W. C.; Firstenberg, H.; Fulford, P. J.; Lam, P. S.; Ritzman, R. L.; Schmidt, E. R.

    1982-03-01

    The purpose of this document is to assess the state of knowledge and expert opinions that exist about fission product source terms from potential nuclear power plant accidents. This is so that recommendations can be made for research and analyses which have the potential to reduce the uncertainties in these estimated source terms and to derive improved methods for predicting their magnitudes. The main reasons for writing this report are to indicate the major uncertainties involved in defining realistic source terms that could arise from severe reactor accidents, to determine which factors would have the most significant impact on public risks and emergency planning, and to suggest research and analyses that could result in the reduction of these uncertainties. Source terms used in the conventional consequence calculations in the licensing process are not explicitly addressed.

  17. Accident source terms for boiling water reactors with high burnup cores.

    SciTech Connect

    Gauntt, Randall O.; Powers, Dana Auburn; Leonard, Mark Thomas

    2007-11-01

    The primary objective of this report is to provide the technical basis for development of recommendations for updates to the NUREG-1465 Source Term for BWRs that will extend its applicability to accidents involving high burnup (HBU) cores. However, a secondary objective is to re-examine the fundamental characteristics of the prescription for fission product release to containment described by NUREG-1465. This secondary objective is motivated by an interest in understanding the extent to which research into the release and behavior of radionuclides under accident conditions has altered best-estimate calculations of the integral response of BWRs to severe core damage sequences and the resulting radiological source terms to containment. This report, therefore, documents specific results of fission product source term analyses that will form the basis for the HBU supplement to NUREG-1465. However, commentary is also provided on observed differences between the composite results of the source term calculations performed here and those reflected in NUREG-1465 itself.

  18. Refined Source Terms in WAVEWATCH III with Wave Breaking and Sea Spray Forecasts

    DTIC Science & Technology

    2015-09-30

    N00014-1010390 LONG-TERM GOALS Several U.S. Federal Agencies operate wind wave prediction models for a variety of mission specific purposes...decade will significantly upgrade the model physics. A major goal is to produce a refined set of source and sink terms for the wind input...accuracy of ocean wave forecasts over a wide dynamic range of wind speeds out to hurricane conditions, contributing a dissipation source function

  19. Correlating Pluto's Albedo Distribution to Long Term Insolation Patterns

    NASA Astrophysics Data System (ADS)

    Earle, Alissa M.; Binzel, Richard P.; Stern, S. Alan; Young, Leslie A.; Buratti, Bonnie J.; Ennico, Kimberly; Grundy, Will M.; Olkin, Catherine B.; Spencer, John R.; Weaver, Hal A.

    2015-11-01

    NASA's New Horizons' reconnaissance of the Pluto system has revealed striking albedo contrasts from polar to equatorial latitudes on Pluto, as well as sharp boundaries for longitudinal variations. These contrasts suggest Pluto undergoes dynamic evolution that drives the redistribution of volatiles. Using the New Horizons results as a template, in this talk we will explore the volatile migration process driven seasonally on Pluto considering multiple timescales. These timescales include the current orbit (248 years) as well as the timescales for obliquity precession (amplitude of 23 degrees over 3 Myrs) and regression of the orbital longitude of perihelion (3.7 Myrs). We will build upon the long-term insolation history model described by Earle and Binzel (2015, Icarus 250, 405-412) with the goal of identifying the most critical timescales that drive the features observed in Pluto’s current post-perihelion epoch. This work was supported by the NASA New Horizons Project.

  20. FUB at TREC 2008 Relevance Feedback Track: Extending Rocchio with Distributional Term Analysis

    DTIC Science & Technology

    2008-11-01

    1 FUB at TREC 2008 Relevance Feedback Track: Extending Rocchio with Distributional Term Analysis Andrea Bernardini, Claudio Carpineto Fondazione Ugo...following. - Test the effectiveness of using a combination of Rocchio and distributional term analysis on a relevance feedback task; so far, this approach has...feedback not only more effective but also more robust than pseudo-relevance feedback? 2. Our approach: combining Rocchio with term-ranking scores The

  1. Fission Product Appearance Rate Coefficients in Design Basis Source Term Determinations - Past and Present

    NASA Astrophysics Data System (ADS)

    Perez, Pedro B.; Hamawi, John N.

    2017-09-01

    Nuclear power plant radiation protection design features are based on radionuclide source terms derived from conservative assumptions that envelope expected operating experience. Two parameters that significantly affect the radionuclide concentrations in the source term are the failed fuel fraction and the effective fission product appearance rate coefficients. The failed fuel fraction may be a regulatory based assumption, such as in the U.S. Appearance rate coefficients are not specified in regulatory requirements, but have been referenced to experimental data that is over 50 years old. No doubt the source terms are conservative, as demonstrated by operating experience that has included failed fuel, but they may be too conservative, leading, for example, to over-designed shielding for normal operations. Design basis source term methodologies for normal operations had not advanced until EPRI published an updated ANSI/ANS 18.1 source term basis document in 2015. Our paper revisits the fission product appearance rate coefficients as applied in the derivation of source terms following the original U.S. NRC NUREG-0017 methodology. New coefficients have been calculated based on recent EPRI results, which demonstrate the conservatism in nuclear power plant shielding design.

  2. Long Term Leaching of Chlorinated Solvents from Source Zones in Low Permeability Settings with Fractures

    NASA Astrophysics Data System (ADS)

    Bjerg, P. L.; Chambon, J.; Troldborg, M.; Binning, P. J.; Broholm, M. M.; Lemming, G.; Damgaard, I.

    2008-12-01

    Groundwater contamination by chlorinated solvents, such as perchloroethylene (PCE), often occurs via leaching from complex sources located in low permeability sediments such as clayey tills overlying aquifers. Clayey tills are mostly fractured, and contamination migrating through the fractures spreads to the low permeability matrix by diffusion. This results in a long term source of contamination due to back-diffusion. Leaching from such sources is further complicated by microbial degradation under anaerobic conditions to sequentially form the daughter products trichloroethylene, cis-dichloroethylene (cis-DCE), vinyl chloride (VC) and ethene. This process can be enhanced by addition of electron donors and/or bioaugmentation and is termed Enhanced Reductive Dechlorination (ERD). This work aims to improve our understanding of the physical, chemical and microbial processes governing source behaviour under natural and enhanced conditions. That understanding is applied to risk assessment, and to determine the relationship and time frames of source clean up and plume response. To meet that aim, field and laboratory observations are coupled to state of the art models incorporating new insights of contaminant behaviour. The long term leaching of chlorinated ethenes from clay aquitards is currently being monitored at a number of Danish sites. The observed data is simulated using a coupled fracture flow and clay matrix diffusion model. Sequential degradation is represented by modified Monod kinetics accounting for competitive inhibition between the chlorinated ethenes. The model is constructed using Comsol Multiphysics, a generic finite- element partial differential equation solver. The model is applied at two well characterised field sites with respect to hydrogeology, fracture network, contaminant distribution and microbial processes (lab and field experiments). At the study sites (Sortebrovej and Vadsbyvej), the source areas are situated in a clayey till with fractures
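
The sequential degradation chain described above (PCE → TCE → cis-DCE → VC → ethene) with modified Monod kinetics and competitive inhibition can be sketched as a small ODE system integrated with explicit Euler steps. The rate constants, half-saturation constants, and biomass value below are illustrative placeholders, not the calibrated parameters of the finite-element models from the study:

```python
# Sketch: sequential reductive dechlorination with competitive inhibition.
# Rate of step i: r_i = k_i * X * C_i / (K_i * (1 + sum_{j != i} C_j / K_j) + C_i)
# Each daughter species gains what its parent loses (molar basis).

species = ["PCE", "TCE", "cDCE", "VC", "ETH"]
k = [1.0, 0.8, 0.5, 0.3]        # illustrative max specific rates (1/d)
K = [10.0, 10.0, 10.0, 10.0]    # illustrative half-saturation constants (uM)
X = 1.0                          # biomass (arbitrary units)
C = [50.0, 0.0, 0.0, 0.0, 0.0]  # initial molar concentrations (uM)

dt, steps = 0.01, 5000           # explicit Euler integration
for _ in range(steps):
    rates = []
    for i in range(4):
        inhib = sum(C[j] / K[j] for j in range(4) if j != i)
        rates.append(k[i] * X * C[i] / (K[i] * (1.0 + inhib) + C[i]))
    for i in range(4):
        C[i] -= rates[i] * dt      # parent consumed
        C[i + 1] += rates[i] * dt  # daughter produced

print({s: round(c, 2) for s, c in zip(species, C)})
```

Because each mole of parent becomes one mole of daughter, total molar mass is conserved, which is a useful sanity check on any implementation of the chain.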

  3. Long Term 2 Second Round Source Water Monitoring and Bin Placement Memo

    EPA Pesticide Factsheets

    The Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) applies to all public water systems served by a surface water source or public water systems served by a ground water source under the direct influence of surface water.

  4. Acetone in the atmosphere: Distribution, sources, and sinks

    NASA Technical Reports Server (NTRS)

    Singh, H. B.; O'Hara, D.; Herlth, D.; Sachse, W.; Blake, D. R.; Bradshaw, J. D.; Kanakidou, M.; Crutzen, P. J.

    1994-01-01

    Acetone (CH3COCH3) was found to be the dominant nonmethane organic species present in the atmosphere sampled primarily over eastern Canada (0-6 km, 35 deg-65 deg N) during ABLE3B (July to August 1990). A concentration range of 357 to 2310 ppt (= 10(exp -12) v/v) with a mean value of 1140 +/- 413 ppt was measured. Under extremely clean conditions, generally involving Arctic flows, lowest (background) mixing ratios of 550 +/- 100 ppt were present in much of the troposphere studied. Correlations between atmospheric mixing ratios of acetone and select species such as C2H2, CO, C3H8, C2Cl4 and isoprene provided important clues to its possible sources and to the causes of its atmospheric variability. Biomass burning as a source of acetone has been identified for the first time. By using atmospheric data and three-dimensional photochemical models, a global acetone source of 40-60 Tg (= 10(exp 12) g)/yr is estimated to be present. Secondary formation from the atmospheric oxidation of precursor hydrocarbons (principally propane, isobutane, and isobutene) provides the single largest source (51%). The remainder is attributable to biomass burning (26%), direct biogenic emissions (21%), and primary anthropogenic emissions (3%). Atmospheric removal of acetone is estimated to be due to photolysis (64%), reaction with OH radicals (24%), and deposition (12%). Model calculations also suggest that acetone photolysis contributed significantly to PAN formation (100-200 ppt) in the middle and upper troposphere of the sampled region and may be important globally. While the source-sink equation appears to be roughly balanced, much more atmospheric and source data, especially from the southern hemisphere, are needed to reliably quantify the atmospheric budget of acetone.

  5. 77 FR 10490 - SourceGas Distribution LLC; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-22

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission SourceGas Distribution LLC; Notice of Filing Take notice that on February 14, 2012, SourceGas Distribution LLC submitted a revised baseline filing of their Statement of...

  6. 16 CFR Table 4 to Part 1512 - Relative Energy Distribution of Sources

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Relative Energy Distribution of Sources 4... SUBSTANCES ACT REGULATIONS REQUIREMENTS FOR BICYCLES Pt. 1512, Table 4 Table 4 to Part 1512—Relative Energy Distribution of Sources Wave length (nanometers) Relative energy 380 9.79 390 12.09 400 14.71 410 17.68 420...

  7. 16 CFR Table 4 to Part 1512 - Relative Energy Distribution of Sources

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Relative Energy Distribution of Sources 4... SUBSTANCES ACT REGULATIONS REQUIREMENTS FOR BICYCLES Pt. 1512, Table 4 Table 4 to Part 1512—Relative Energy Distribution of Sources Wave length (nanometers) Relative energy 380 9.79 390 12.09 400 14.71 410 17.68 420...

  8. 16 CFR Table 4 to Part 1512 - Relative Energy Distribution of Sources

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 2 2011-01-01 2011-01-01 false Relative Energy Distribution of Sources 4... SUBSTANCES ACT REGULATIONS REQUIREMENTS FOR BICYCLES Pt. 1512, Table 4 Table 4 to Part 1512—Relative Energy Distribution of Sources Wave length (nanometers) Relative energy 380 9.79 390 12.09 400 14.71 410 17.68 420...

  9. 16 CFR Table 4 to Part 1512 - Relative Energy Distribution of Sources

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 2 2013-01-01 2013-01-01 false Relative Energy Distribution of Sources 4... SUBSTANCES ACT REGULATIONS REQUIREMENTS FOR BICYCLES Pt. 1512, Table 4 Table 4 to Part 1512—Relative Energy Distribution of Sources Wave length (nanometers) Relative energy 380 9.79 390 12.09 400 14.71 410 17.68 420...

  10. 16 CFR Table 4 to Part 1512 - Relative Energy Distribution of Sources

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 2 2014-01-01 2014-01-01 false Relative Energy Distribution of Sources 4... SUBSTANCES ACT REGULATIONS REQUIREMENTS FOR BICYCLES Pt. 1512, Table 4 Table 4 to Part 1512—Relative Energy Distribution of Sources Wave length (nanometers) Relative energy 380 9.79 390 12.09 400 14.71 410 17.68 420...

  11. 30 CFR 872.12 - Where do moneys distributed from the Fund and other sources go?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 3 2013-07-01 2013-07-01 false Where do moneys distributed from the Fund and other sources go? 872.12 Section 872.12 Mineral Resources OFFICE OF SURFACE MINING RECLAMATION AND... AND INDIAN TRIBES § 872.12 Where do moneys distributed from the Fund and other sources go? (a) Each...

  12. 30 CFR 872.12 - Where do moneys distributed from the Fund and other sources go?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 3 2014-07-01 2014-07-01 false Where do moneys distributed from the Fund and other sources go? 872.12 Section 872.12 Mineral Resources OFFICE OF SURFACE MINING RECLAMATION AND... AND INDIAN TRIBES § 872.12 Where do moneys distributed from the Fund and other sources go? (a) Each...

  13. 30 CFR 872.12 - Where do moneys distributed from the Fund and other sources go?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 3 2010-07-01 2010-07-01 false Where do moneys distributed from the Fund and other sources go? 872.12 Section 872.12 Mineral Resources OFFICE OF SURFACE MINING RECLAMATION AND... AND INDIAN TRIBES § 872.12 Where do moneys distributed from the Fund and other sources go? (a) Each...

  14. Shielding analysis of proton therapy accelerators: a demonstration using Monte Carlo-generated source terms and attenuation lengths.

    PubMed

    Lai, Bo-Lun; Sheu, Rong-Jiun; Lin, Uei-Tyng

    2015-05-01

    Monte Carlo simulations are generally considered the most accurate method for complex accelerator shielding analysis. Simplified models based on the point-source line-of-sight approximation are often preferable in practice because they are intuitive and easy to use. A set of shielding data, including source terms and attenuation lengths for several common targets (iron, graphite, tissue, and copper) and shielding materials (concrete, iron, and lead), was generated by performing Monte Carlo simulations for 100-300 MeV protons. Possible applications and the proper use of the data set were demonstrated through a practical case study, in which a shielding analysis of a typical proton treatment room was conducted. A thorough and consistent comparison between the predictions of our point-source line-of-sight model and those obtained by Monte Carlo simulations for a 360° dose distribution around the room perimeter showed that the data set can yield fairly accurate or conservative estimates for the transmitted doses, except for those near the maze exit. In addition, this study demonstrated that appropriate coupling between the generated source term and empirical formulae for radiation streaming can be used to predict a reasonable dose distribution along the maze. This case study proved the effectiveness and advantage of applying the data set to a quick shielding design and dose evaluation for proton therapy accelerators.
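
The point-source line-of-sight approximation referenced above typically combines a source term, exponential attenuation through the shield, and inverse-square dilution with distance. A minimal sketch under that assumption, with hypothetical parameter values rather than the Monte Carlo-generated data set of the paper:

```python
import math

def transmitted_dose(H0, r, d, lam):
    """Point-source line-of-sight estimate: H = H0 * exp(-d/lam) / r^2.

    H0  : source term (dose at unit distance, angle/energy dependent)
    r   : source-to-dose-point distance (m)
    d   : shield thickness along the line of sight (m)
    lam : attenuation length of the shield material (m)
    """
    return H0 * math.exp(-d / lam) / r**2

# Hypothetical numbers: 3 m of concrete along the line of sight, dose point 5 m away
dose = transmitted_dose(H0=1e-12, r=5.0, d=3.0, lam=0.5)
print(f"{dose:.3e}")
```

The appeal of the model is visible in the formula itself: one lookup (H0, lam) plus two geometric quantities (r, d) replaces a full transport simulation, at the cost of ignoring streaming paths such as the maze noted in the abstract.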

  15. Source term balance in a severe storm in the Southern North Sea

    NASA Astrophysics Data System (ADS)

    van Vledder, Gerbrant Ph.; Hulst, Sander Th. C.; McConochie, Jason D.

    2016-12-01

    This paper presents the results of a wave hindcast of a severe storm in the Southern North Sea to verify recently developed deep and shallow water source terms. The work was carried out in the framework of the ONR funded NOPP project (Tolman et al. 2013) in which deep and shallow water source terms were developed for use in third-generation wave prediction models. These deep water source terms for whitecapping, wind input and nonlinear interactions were developed, implemented and tested primarily in the WAVEWATCH III model, whereas shallow water source terms for depth-limited wave breaking and triad interactions were developed, implemented and tested primarily in the SWAN wave model. So far, the new deep-water source terms for whitecapping have not been fully tested in shallow environments. Similarly, the shallow water source terms have not yet been tested in large intermediate depth areas like the North Sea. As a first step in assessing the performance of these newly developed source terms, the source term balance and the effect of different physical settings on the prediction of wave heights and wave periods in the relatively shallow North Sea were analysed. The December 2013 storm was hindcast with a SWAN model implementation for the North Sea. Spectral wave boundary conditions were obtained from an Atlantic Ocean WAVEWATCH III model implementation and the model was driven by hourly CFSR wind fields. In the southern part of the North Sea, current and water level effects were included. The hindcast was performed with five different settings for whitecapping, viz. three Komen type whitecapping formulations, the saturation-based whitecapping by Van der Westhuysen et al. (2007) and the recently developed ST6 whitecapping as described by Zieger et al. (2015). Results of the wave hindcast were compared with buoy measurements at location K13 collected by the Dutch Ministry of Transport and Public Works. An analysis was made of the source term balance at three locations, the deep

  16. Monitoring Design for Source Identification in Water Distribution Systems

    EPA Science Inventory

    The design of sensor networks for the purpose of monitoring for contaminants in water distribution systems is currently an active area of research. Much of the effort has been directed at the contamination detection problem and the expression of public health protection objective...

  17. The planetary distribution of heat sources and sinks during FGGE

    NASA Technical Reports Server (NTRS)

    Johnson, D. R.; Wei, M. Y.

    1985-01-01

    Heating distributions from analysis of the National Meteorological Center and European Center for Medium Range Weather Forecasts data sets; the methods used and problems involved in the inference of diabatic heating; the relationship between differential heating and energy transport; and recommendations on the inference of heat sources and heat sinks on the planetary scale are discussed.

  19. Bounds on Distributed TDOA-Based Localization of OFDM Sources

    DTIC Science & Technology

    2010-05-01

    One main drawback of using Time Difference of Arrival (TDOA) methods for source...systems, since the CP makes OFDM non-stationary (in fact, cyclo-stationary with period MT). However, similar work on image registration [9] makes an assumption analogous to ours, i.e. that the observed images are sampled at the Nyquist rate or greater. 4. CRLB DERIVATION First, we consider the

  20. Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2004-01-01

    A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.

  1. The distribution and source of boulders on asteroid 4179 Toutatis

    NASA Astrophysics Data System (ADS)

    Jiang, Yun; Ji, Jianghui; Huang, Jiangchuan; Marchi, Simone; Li, Yuan; Ip, Wing-Huen

    2016-01-01

    Boulders are ubiquitous on the surfaces of asteroids, and their spatial and size distributions provide information on the geological evolution and collisional history of their parent bodies. We identify more than 200 boulders on near-Earth asteroid 4179 Toutatis based on images obtained during the Chang'e-2 flyby. The cumulative boulder size frequency distribution (SFD) gives a power-index of -4.4 +/- 0.1, which is clearly steeper than those of boulders on Itokawa and Eros, indicating a much higher degree of fragmentation. Correlation analyses with craters suggest that most boulders cannot solely be produced as products of cratering, but are probably fragments that survived from the parent body of Toutatis and were accreted after its breakup. Similar to Itokawa, Toutatis probably has a rubble-pile structure, but with a different preservation state of boulders.
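
A cumulative size frequency distribution following N(>D) ∝ D^q can be fitted by least squares in log-log space, with each boulder's rank (by descending diameter) standing in for N(>D). A sketch on a synthetic sample, where the -4.4 index is imposed on the generated data for illustration and is not derived from the Toutatis imagery:

```python
import math
import random

def cumulative_sfd_index(diameters):
    """OLS slope of log N(>D) versus log D, using descending rank as N(>D)."""
    d_sorted = sorted(diameters, reverse=True)
    xs = [math.log(d) for d in d_sorted]
    ys = [math.log(rank) for rank in range(1, len(d_sorted) + 1)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic boulder sample drawn from N(>D) ∝ D^-4.4 by inverse-CDF sampling
random.seed(1)
q = -4.4
sample = [10.0 * (1.0 - random.random()) ** (1.0 / q) for _ in range(2000)]
print(round(cumulative_sfd_index(sample), 2))
```

A steeper (more negative) fitted slope means small boulders dominate the population, which is the sense in which the abstract reads the -4.4 index as a higher degree of fragmentation than on Itokawa or Eros.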

  2. Observation-based source terms in the third-generation wave model WAVEWATCH

    NASA Astrophysics Data System (ADS)

    Zieger, Stefan; Babanin, Alexander V.; Erick Rogers, W.; Young, Ian R.

    2015-12-01

    Measurements collected during the AUSWEX field campaign at Lake George (Australia) resulted in new insights into the processes of wind wave interaction and whitecapping dissipation, and consequently new parameterizations of the input and dissipation source terms. The new nonlinear wind input term developed accounts for the dependence of growth on wave steepness, airflow separation, and for negative growth rates under adverse winds. The new dissipation terms feature the inherent breaking term, a cumulative dissipation term and a term due to production of turbulence by waves, which is particularly relevant for decaying seas and for swell. The latter is consistent with the observed decay rate of ocean swell. This paper describes these source terms implemented in WAVEWATCH III® and evaluates their performance against existing source terms in academic duration-limited tests, against buoy measurements for windsea-dominated conditions, under conditions of extreme wind forcing (Hurricane Katrina), and against altimeter data in global hindcasts. Results show agreement by means of growth curves as well as integral and spectral parameters in the simulations and hindcasts.

  3. Occurrence of arsenic contamination in Canada: sources, behavior and distribution.

    PubMed

    Wang, Suiling; Mulligan, Catherine N

    2006-08-01

    Recently there has been increasing concern about arsenic-related problems. Occurrence of arsenic contamination has been reported worldwide. In Canada, the main natural arsenic sources are weathering and erosion of arsenic-containing rocks and soil, while tailings from historic and recent gold mine operations and wood preservative facilities are the principal anthropogenic sources. Across Canada, the 24-h average concentration of arsenic in the atmosphere is generally less than 0.3 microg/m3. Arsenic concentrations in natural uncontaminated soil and sediments range from 4 to 150 mg/kg. In uncontaminated surface and ground waters, the arsenic concentration ranges from 0.001 to 0.005 mg/L. As a result of anthropogenic inputs, elevated arsenic levels, from ten to a thousand times the Interim Maximum Acceptable Concentration (IMAC), have been reported in air, soil and sediment, surface water and groundwater, and biota in several regions. Most arsenic is in toxic inorganic forms. It is critical to recognize that such contamination imposes serious harmful effects on various aquatic and terrestrial organisms, and ultimately on human health. Serious incidences of acute and chronic arsenic poisoning have been reported. Through examination of the available literature, and by screening and selecting existing data, this paper provides an analysis of the currently available information on recognized problem areas, and an overview of current knowledge of the principal hydrogeochemical processes of arsenic transport and transformation. However, a more detailed understanding of local sources of arsenic and mechanisms of arsenic release is required. More extensive studies will be required to build practical guidance on avoiding and reducing arsenic contamination. Bioremediation and hyperaccumulation are emerging innovative technologies for the remediation of arsenic-contaminated sites. Natural attenuation may be utilized as a potential in situ remedial option. Further

  4. High resolution stationary digital breast tomosynthesis using distributed carbon nanotube x-ray source array

    PubMed Central

    Qian, Xin; Tucker, Andrew; Gidcumb, Emily; Shan, Jing; Yang, Guang; Calderon-Colon, Xiomara; Sultana, Shabana; Lu, Jianping; Zhou, Otto; Spronk, Derrek; Sprenger, Frank; Zhang, Yiheng; Kennedy, Don; Farbizio, Tom; Jing, Zhenxue

    2012-01-01

    binning, the projection resolution along the scanning direction increased from 4.0 cycles/mm [at 10% modulation-transfer-function (MTF)] in DBT to 5.1 cycles/mm in s-DBT at a magnification factor of 1.08. The improvement is more pronounced for faster scanning speeds, wider angular coverage, and smaller detector pixel sizes. The scanning speed depends on the detector, the number of views, and the imaging dose. With a 240 ms detector readout time, the s-DBT system scanning time is 6.3 s for a 15-view, 100 mAs scan regardless of the angular coverage. The scanning speed can be reduced to less than 4 s when detectors become faster. Initial phantom studies showed good quality reconstructed images. Conclusions: A prototype s-DBT scanner has been developed and evaluated by retrofitting the Selenia rotating gantry DBT scanner with a spatially distributed CNT x-ray source array. Preliminary results show that it improves system spatial resolution substantially by eliminating image blur due to x-ray focal spot motion. The scanning speed of the s-DBT system is independent of angular coverage and can be increased with faster detectors without image degradation. The accelerated lifetime measurement demonstrated the long term stability of the CNT x-ray source array, with a typical clinical operation lifetime over 3 years. PMID:22482630

  5. High resolution stationary digital breast tomosynthesis using distributed carbon nanotube x-ray source array.

    PubMed

    Qian, Xin; Tucker, Andrew; Gidcumb, Emily; Shan, Jing; Yang, Guang; Calderon-Colon, Xiomara; Sultana, Shabana; Lu, Jianping; Zhou, Otto; Spronk, Derrek; Sprenger, Frank; Zhang, Yiheng; Kennedy, Don; Farbizio, Tom; Jing, Zhenxue

    2012-04-01

resolution along the scanning direction increased from 4.0 cycles/mm [at 10% modulation-transfer-function (MTF)] in DBT to 5.1 cycles/mm in s-DBT at a magnification factor of 1.08. The improvement is more pronounced for faster scanning speeds, wider angular coverage, and smaller detector pixel sizes. The scanning speed depends on the detector, the number of views, and the imaging dose. With 240 ms detector readout time, the s-DBT system scanning time is 6.3 s for a 15-view, 100 mAs scan regardless of the angular coverage. The scanning speed can be reduced to less than 4 s when detectors become faster. Initial phantom studies showed good quality reconstructed images. A prototype s-DBT scanner has been developed and evaluated by retrofitting the Selenia rotating gantry DBT scanner with a spatially distributed CNT x-ray source array. Preliminary results show that it improves system spatial resolution substantially by eliminating image blur due to x-ray focal spot motion. The scanning speed of the s-DBT system is independent of angular coverage and can be increased with faster detectors without image degradation. The accelerated lifetime measurement demonstrated the long-term stability of the CNT x-ray source array, with a typical clinical operation lifetime over 3 years.

  6. Review of radionuclide source terms used for performance-assessment analyses; Yucca Mountain Site Characterization Project

    SciTech Connect

    Barnard, R.W.

    1993-06-01

    Two aspects of the radionuclide source terms used for total-system performance assessment (TSPA) analyses have been reviewed. First, a detailed radionuclide inventory (i.e., one in which the reactor type, decay, and burnup are specified) is compared with the standard source-term inventory used in prior analyses. The latter assumes a fixed ratio of pressurized-water reactor (PWR) to boiling-water reactor (BWR) spent fuel, at specific amounts of burnup and at 10-year decay. TSPA analyses have been used to compare the simplified source term with the detailed one. The TSPA-91 analyses did not show a significant difference between the source terms. Second, the radionuclides used in source terms for TSPA aqueous-transport analyses have been reviewed to select ones that are representative of the entire inventory. It is recommended that two actinide decay chains be included (the 4n+2 ``uranium`` and 4n+3 ``actinium`` decay series), since these include several radionuclides that have potentially important release and dose characteristics. In addition, several fission products are recommended for the same reason. The choice of radionuclides should be influenced by other parameter assumptions, such as the solubility and retardation of the radionuclides.
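The two recommended actinide chains can be identified arithmetically: alpha decay changes the mass number A by 4 and beta decay leaves it unchanged, so every member of a natural decay series shares the same value of A mod 4 (4n+2 for the uranium series, 4n+3 for the actinium series). A minimal sketch of this bookkeeping:

```python
# Classify a nuclide into one of the four natural decay series by its
# mass number A: alpha decay changes A by 4, beta decay leaves A fixed,
# so A mod 4 is invariant along each chain.
SERIES = {0: "4n (thorium)", 1: "4n+1 (neptunium)",
          2: "4n+2 (uranium)", 3: "4n+3 (actinium)"}

def decay_series(mass_number: int) -> str:
    return SERIES[mass_number % 4]
```

For example, U-238 (A = 238) falls in the 4n+2 uranium series and U-235 (A = 235) in the 4n+3 actinium series.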

  7. Utilities for master source code distribution: MAX and Friends

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

MAX is a program for the manipulation of FORTRAN master source code (MSC). This is a technique by which one maintains one and only one master copy of a FORTRAN program under a program development system, which for MAX is assumed to be VAX/VMS. The master copy is not intended to be directly compiled. Instead it must be pre-processed by MAX to produce compilable instances. These instances may correspond to different code versions (for example, double precision versus single precision), different machines (for example, IBM, CDC, Cray) or different operating systems (for example, VAX/VMS versus VAX/UNIX). The advantage of using a master source is more pronounced in complex application programs that are developed and maintained over many years and are to be transported and executed on several computer environments. The version lag problem that plagues many such programs is avoided by this approach. MAX is complemented by several auxiliary programs that perform nonessential functions. The ensemble is collectively known as MAX and Friends. All of these programs, including MAX, are executed as foreign VAX/VMS commands and can easily be hidden in customized VMS command procedures.

  8. Clinical Application of Spatiotemporal Distributed Source Analysis in Presurgical Evaluation of Epilepsy

    PubMed Central

    Tanaka, Naoaki; Stufflebeam, Steven M.

    2014-01-01

Magnetoencephalography (MEG), which acquires neuromagnetic fields in the brain, is a useful diagnostic tool in the presurgical evaluation of epilepsy. Previous studies have shown that MEG affects the planning of intracranial electroencephalography placement and correlates with surgical outcomes when a single dipole model is used. Spatiotemporal source analysis using distributed source models is an advanced method for analyzing MEG, and has recently been introduced for analyzing epileptic spikes. It has advantages over the conventional single dipole analysis for obtaining accurate sources and understanding the propagation of epileptic spikes. In this article, we review the source analysis methods, describe the techniques of distributed source analysis and the interpretation of source distribution maps, and discuss the benefits and feasibility of this method in the evaluation of epilepsy. PMID:24574999

  9. Clinical application of spatiotemporal distributed source analysis in presurgical evaluation of epilepsy.

    PubMed

    Tanaka, Naoaki; Stufflebeam, Steven M

    2014-01-01

Magnetoencephalography (MEG), which acquires neuromagnetic fields in the brain, is a useful diagnostic tool in the presurgical evaluation of epilepsy. Previous studies have shown that MEG affects the planning of intracranial electroencephalography placement and correlates with surgical outcomes when a single dipole model is used. Spatiotemporal source analysis using distributed source models is an advanced method for analyzing MEG, and has recently been introduced for analyzing epileptic spikes. It has advantages over the conventional single dipole analysis for obtaining accurate sources and understanding the propagation of epileptic spikes. In this article, we review the source analysis methods, describe the techniques of distributed source analysis and the interpretation of source distribution maps, and discuss the benefits and feasibility of this method in the evaluation of epilepsy.

  10. Simulation of dose distribution for iridium-192 brachytherapy source type-H01 using MCNPX

    NASA Astrophysics Data System (ADS)

    Purwaningsih, Anik

    2014-09-01

Dosimetric data for a brachytherapy source should be known before it is used for clinical treatment. The Iridium-192 source type H01 manufactured by PRR-BATAN for brachytherapy does not yet have known dosimetric data. The radial dose function and the anisotropic dose distribution are primary characteristics of a brachytherapy source. The dose distribution for the Iridium-192 source type H01 was obtained from the dose calculation formalism recommended in the AAPM TG-43U1 report, using the MCNPX 2.6.0 Monte Carlo simulation code. To assess the effect of the cavity introduced into the Iridium-192 type H01 source by the manufacturing process, calculations were also performed for the same source without the cavity. The calculated radial dose function and anisotropic dose distribution for the Iridium-192 source type H01 were compared with those of other Iridium-192 source models.
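The TG-43U1 formalism cited here factors the dose rate as D-dot(r,theta) = S_K * Lambda * [G_L(r,theta)/G_L(r0,theta0)] * g_L(r) * F(r,theta), with reference point r0 = 1 cm, theta0 = 90 degrees. The dose-rate constant Lambda, radial dose function g_L and anisotropy function F come from tabulated data, so only the purely geometric factor G_L can be sketched self-contained; the active length L below is illustrative, not the H01 value:

```python
import math

def geometry_factor(r: float, theta: float, L: float = 0.35) -> float:
    """TG-43 line-source geometry function G_L(r, theta) in cm^-2.

    r     -- distance from the source centre (cm)
    theta -- polar angle measured from the source long axis (radians)
    L     -- active source length (cm); 0.35 cm here is illustrative
    """
    z = r * math.cos(theta)   # coordinate along the source axis
    y = r * math.sin(theta)   # perpendicular distance from the axis
    if y == 0.0:
        # On the source axis the line-source form is singular;
        # TG-43 uses the limiting value 1 / (r^2 - L^2/4).
        return 1.0 / (r * r - L * L / 4.0)
    # beta: angle subtended by the active length at the point (r, theta)
    beta = math.atan2(L / 2.0 - z, y) + math.atan2(L / 2.0 + z, y)
    return beta / (L * y)
```

Far from the source, G_L approaches the point-source value 1/r^2, which provides a quick sanity check of any implementation.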

  11. Simulation of dose distribution for iridium-192 brachytherapy source type-H01 using MCNPX

    SciTech Connect

    Purwaningsih, Anik

    2014-09-30

Dosimetric data for a brachytherapy source should be known before it is used for clinical treatment. The Iridium-192 source type H01 manufactured by PRR-BATAN for brachytherapy does not yet have known dosimetric data. The radial dose function and the anisotropic dose distribution are primary characteristics of a brachytherapy source. The dose distribution for the Iridium-192 source type H01 was obtained from the dose calculation formalism recommended in the AAPM TG-43U1 report, using the MCNPX 2.6.0 Monte Carlo simulation code. To assess the effect of the cavity introduced into the Iridium-192 type H01 source by the manufacturing process, calculations were also performed for the same source without the cavity. The calculated radial dose function and anisotropic dose distribution for the Iridium-192 source type H01 were compared with those of other Iridium-192 source models.

  12. Distribution and source of the UV absorption in Venus' atmosphere

    NASA Technical Reports Server (NTRS)

    Pollack, J. B.; Toon, O. B.; Whitten, R. C.; Boese, R.; Ragent, B.; Tomasko, M.; Esposito, L.; Travis, L.; Wiedman, D.

    1980-01-01

The model predictions were compared with measurements from the Pioneer Venus probes and orbiter to determine the composition of the UV-absorbing materials. The simulations were carried out with radiative transfer codes which included spacecraft constraints on the aerosol and gas characteristics in the Venus atmosphere; gaseous SO2 (a source of opacity at wavelengths below 0.32 microns) and a second absorber (which dominates above 0.32 microns) were required. The UV contrast variations are due to optical depth changes in the upper haze layer producing brightness variations between equatorial and polar areas, and to differences in the depth over which the second UV absorber is depleted in the highest portion of the main clouds.

  13. Fire Source Localization Based on Distributed Temperature Sensing by a Dual-Line Optical Fiber System.

    PubMed

    Sun, Miao; Tang, Yuquan; Yang, Shuang; Li, Jun; Sigrist, Markus W; Dong, Fengzhong

    2016-06-06

We propose a method for localizing a fire source using an optical fiber distributed temperature sensor system. A section of two parallel optical fibers employed as the sensing element is installed near the ceiling of a closed room in which the fire source is located. By measuring the temperature of hot air flows, the problem of three-dimensional fire source localization is reduced to two dimensions. The source localization method is verified with experiments using burning alcohol as the fire source, and it is demonstrated that the method represents a robust and reliable technique for localizing a fire source, including over long sensing ranges.

  14. Fire Source Localization Based on Distributed Temperature Sensing by a Dual-Line Optical Fiber System

    PubMed Central

    Sun, Miao; Tang, Yuquan; Yang, Shuang; Li, Jun; Sigrist, Markus W.; Dong, Fengzhong

    2016-01-01

We propose a method for localizing a fire source using an optical fiber distributed temperature sensor system. A section of two parallel optical fibers employed as the sensing element is installed near the ceiling of a closed room in which the fire source is located. By measuring the temperature of hot air flows, the problem of three-dimensional fire source localization is reduced to two dimensions. The source localization method is verified with experiments using burning alcohol as the fire source, and it is demonstrated that the method represents a robust and reliable technique for localizing a fire source, including over long sensing ranges. PMID:27275822

  15. Open-Source, Distributed Computational Environment for Virtual Materials Exploration

    DTIC Science & Technology

    2015-01-01

like LAMMPS with big enough systems, and the link between molecular dynamics materials simulations and FEM parameters. Distribution Statement A...offering dynamic runtime extension using plugins, and offering reusable software libraries that expose features already well tested in the main...FEM solvers. The hot start method was coordinated using Python scripts and input files, and would serve as an initial prototype to be implemented in

  16. A study of numerical methods for hyperbolic conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Leveque, R. J.; Yee, H. C.

    1988-01-01

    The proper modeling of nonequilibrium gas dynamics is required in certain regimes of hypersonic flow. For inviscid flow this gives a system of conservation laws coupled with source terms representing the chemistry. Often a wide range of time scales is present in the problem, leading to numerical difficulties as in stiff systems of ordinary differential equations. Stability can be achieved by using implicit methods, but other numerical difficulties are observed. The behavior of typical numerical methods on a simple advection equation with a parameter-dependent source term was studied. Two approaches to incorporate the source term were utilized: MacCormack type predictor-corrector methods with flux limiters, and splitting methods in which the fluid dynamics and chemistry are handled in separate steps. Various comparisons over a wide range of parameter values were made. In the stiff case where the solution contains discontinuities, incorrect numerical propagation speeds are observed with all of the methods considered. This phenomenon is studied and explained.
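The splitting approach described above can be sketched on the scalar model problem used by LeVeque and Yee, u_t + u_x = -mu*u*(u-1)*(u-1/2): each step performs an upwind advection sweep followed by a pointwise backward-Euler solve of the stiff reaction term. The grid and mu values in the test usage are illustrative, not taken from the paper:

```python
def advect_reaction_split(u, dt, dx, mu, steps):
    """Godunov (first-order) splitting for u_t + u_x = -mu*u*(u-1)*(u-0.5).

    Step 1: upwind advection (wave speed +1, requires dt <= dx);
            the left boundary value is held fixed (inflow).
    Step 2: pointwise backward-Euler solve of the stiff reaction term,
            using Newton iteration on the cubic.
    """
    u = list(u)
    for _ in range(steps):
        # upwind advection sweep
        u = [u[0]] + [u[i] - dt / dx * (u[i] - u[i - 1])
                      for i in range(1, len(u))]
        # stiff reaction step, solved implicitly at each grid point
        for i in range(len(u)):
            v = u[i]
            for _ in range(30):  # Newton iteration on the backward-Euler residual
                f = v - u[i] + dt * mu * v * (v - 1.0) * (v - 0.5)
                df = 1.0 + dt * mu * (3.0 * v * v - 3.0 * v + 0.5)
                v -= f / df
            u[i] = v
    return u
```

In the stiff limit (large mu) the advection step's numerical dissipation creates intermediate values that the reaction step snaps to the nearest stable equilibrium, which is the mechanism behind the incorrect propagation speeds noted in the abstract.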

  17. A Source Term for Wave Attenuation by Sea Ice in WAVEWATCH III: IC4

    DTIC Science & Technology

    2017-06-07

Naval Research Laboratory, Stennis Space Center, MS 39529-5004, NRL/MR/7320--17-9726. Approved for public release; distribution is unlimited. A Source...frequency space; M6) and an expanded version of M5 with up to 10 steps. The remainder of this report is structured as follows: a note about the

  18. Uncertainties associated with the definition of a hydrologic source term for the Nevada Test Site

    SciTech Connect

    Smith, D.K.; Esser, B.K.; Thompson, J.L.

    1995-05-01

The U.S. Department of Energy, Nevada Operations Office (DOE/NV) Environmental Restoration Division is seeking to evaluate groundwater contamination resulting from 30 years of underground nuclear testing at the Nevada Test Site (NTS). This evaluation requires knowledge about what radioactive materials are in the groundwater and how they are transported through the underground environment. This information, coupled with models of groundwater flow (flow paths and flow rates), will enable predictions of the arrival of each radionuclide at a selected receptor site. Risk assessment models will then be used to calculate the expected environmental and human doses. The accuracy of our predictions depends on the validity of our hydrologic and risk assessment models and on the quality of the data for radionuclide concentrations in groundwater at each underground nuclear test site. This paper summarizes what we currently know about radioactive material in NTS groundwater and suggests how we can best use our limited knowledge to proceed with initial modeling efforts. The amount of a radionuclide available for transport in groundwater at the site of an underground nuclear test is called the hydrologic source term. The radiologic source term is the total amount of residual radionuclides remaining after an underground nuclear test. The hydrologic source term is smaller than the radiologic source term because some or most of the radionuclide residual cannot be transported by groundwater. The radiologic source term has been determined for each of the underground nuclear tests detonated at the NTS; however, the hydrologic source term has been estimated from measurements at only a few sites.

  19. The Fukushima releases: an inverse modelling approach to assess the source term by using gamma dose rate observations

    NASA Astrophysics Data System (ADS)

    Saunier, Olivier; Mathieu, Anne; Didier, Damien; Tombette, Marilyne; Quélo, Denis; Winiarek, Victor; Bocquet, Marc

    2013-04-01

The Chernobyl nuclear accident, and more recently the Fukushima accident, highlighted that the largest source of error in consequence assessment is the source term estimation, including the time evolution of the release rate and its distribution between radioisotopes. Inverse modelling methods have proved to be efficient for assessing the source term in accidental situations (Gudiksen, 1989; Krysta and Bocquet, 2007; Stohl et al., 2011; Winiarek et al., 2012). These methods combine environmental measurements and atmospheric dispersion models. They have recently been applied to the Fukushima accident. Most existing approaches are designed to use air sampling measurements (Winiarek et al., 2012) and some of them also use deposition measurements (Stohl et al., 2012; Winiarek et al., 2013). During the Fukushima accident, such measurements were far less numerous and not as well distributed within Japan as the dose rate measurements. To efficiently document the evolution of the contamination, gamma dose rate measurements were numerous, well distributed within Japan, and offered a high temporal frequency. However, dose rate data are not as easy to use as air sampling measurements, and until now they have not been used in inverse modelling approaches. Indeed, dose rate data result from all the gamma emitters present in the ground and in the atmosphere in the vicinity of the receptor. They do not allow one to determine the isotopic composition or to distinguish the plume contribution from wet deposition. The presented approach proposes a way to use dose rate measurements in an inverse modelling approach without the need for a priori information on emissions. The method proved to be efficient and reliable when applied to the Fukushima accident. The emissions for the 8 main isotopes Xe-133, Cs-134, Cs-136, Cs-137, Ba-137m, I-131, I-132 and Te-132 have been assessed.
The Daiichi power plant events (such as ventings, explosions…) known to have caused atmospheric releases are well identified in

  20. Efficiency of core light injection from sources in the cladding - Bulk distribution

    NASA Astrophysics Data System (ADS)

    Egalon, Claudio O.; Rogowski, Robert S.

    1992-04-01

    The behavior of the power efficiency of an optical fiber with bulk distribution of sources in its cladding is analyzed. Marcuse's (1988) results for weakly guiding cylindrical fibers with fluorescent sources uniformly distributed in the cladding are confirmed for the bulk distribution case. It is found that power efficiency increases with wavelength and with difference in refractive indices. A new independent variable for the bulk distribution is found, and it is shown that the power efficiency does not always increase with the V number.

  1. Efficiency of core light injection from sources in the cladding - Bulk distribution

    NASA Technical Reports Server (NTRS)

    Egalon, Claudio O.; Rogowski, Robert S.

    1991-01-01

    The behavior of the power efficiency of an optical fiber with bulk distribution of sources in its cladding is analyzed. Marcuse's (1988) results for weakly guiding cylindrical fibers with fluorescent sources uniformly distributed in the cladding are confirmed for the bulk distribution case. It is found that power efficiency increases with wavelength and with difference in refractive indices. A new independent variable for the bulk distribution is found, and it is shown that the power efficiency does not always increase with the V number.

  2. An altitude and distance correction to the source fluence distribution of TGFs

    NASA Astrophysics Data System (ADS)

    Nisi, R. S.; Østgaard, N.; Gjesteland, T.; Collier, A. B.

    2014-10-01

The source fluence distribution of terrestrial gamma ray flashes (TGFs) has been extensively discussed in recent years, but few have considered how the TGF fluence distribution at the source, as estimated from satellite measurements, depends on the distance from the satellite foot point and the assumed production altitude. As the absorption of the TGF photons increases significantly with lower source altitude and larger distance between the source and the observing satellite, these might be important factors. We have addressed the issue by using the tropopause pressure distribution as an approximation of the TGF production altitude distribution and World Wide Lightning Location Network sferic measurements to determine the distance. The study is made possible by the increased number of Ramaty High Energy Solar Spectroscopic Imager (RHESSI) TGFs found in the second catalog of the RHESSI data. One finding is that the TGF/lightning ratio for the tropics probably has an annual variability due to an annual variability in the Brewer-Dobson circulation. The main result is an indication that the altitude distribution and distance should be considered when investigating the source fluence distribution of TGFs, as this leads to a softening of the inferred distribution of source brightness.

  3. An altitude and distance correction to the source fluence distribution of TGFs

    PubMed Central

    Nisi, R S; Østgaard, N; Gjesteland, T; Collier, A B

    2014-01-01

The source fluence distribution of terrestrial gamma ray flashes (TGFs) has been extensively discussed in recent years, but few have considered how the TGF fluence distribution at the source, as estimated from satellite measurements, depends on the distance from the satellite foot point and the assumed production altitude. As the absorption of the TGF photons increases significantly with lower source altitude and larger distance between the source and the observing satellite, these might be important factors. We have addressed the issue by using the tropopause pressure distribution as an approximation of the TGF production altitude distribution and World Wide Lightning Location Network sferic measurements to determine the distance. The study is made possible by the increased number of Ramaty High Energy Solar Spectroscopic Imager (RHESSI) TGFs found in the second catalog of the RHESSI data. One finding is that the TGF/lightning ratio for the tropics probably has an annual variability due to an annual variability in the Brewer-Dobson circulation. The main result is an indication that the altitude distribution and distance should be considered when investigating the source fluence distribution of TGFs, as this leads to a softening of the inferred distribution of source brightness. PMID:26167434

  4. The potential distribution in the Radial Plasma Source

    NASA Astrophysics Data System (ADS)

    Fruchtman, Amnon; Makrinich, Gennady

    2011-10-01

    The Radial Plasma Source (RPS) is based on plasma acceleration by an applied voltage across a magnetic field. Here we report the recent progress in understanding the mechanism of plasma acceleration in the RPS. The RPS has a cylindrical symmetry. The accelerating electric field is radial and the magnetic field is axial. Most of the potential drop between the inner anode and the outer cathode is expected to be located where the magnetic field intensity is large. We employ an emissive probe and a Langmuir probe in order to evaluate the radial dependence of the potential. For inferring the plasma potential from the measured emissive probe potential, we employ our recently developed theory for a cylindrical emissive probe. Using the theory and the probe measurements, we plot the radial profiles in the RPS of the plasma potential as well as of the electron density and temperature. The possible modification of the geometry for propulsion applications will be discussed. Partially supported by the Israel Science Foundation, Grant 864/07.

  5. A study of numerical methods for hyperbolic conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Leveque, R. J.; Yee, H. C.

    1990-01-01

    In the present study of the behavior of typical numerical methods in the case of a model advection equation having a parameter-dependent source term, two approaches to the incorporation of the source terms are used: MacCormack-type predictor-corrector methods with flux limiters, and splitting methods in which the fluid dynamics and chemistry are handled in separate steps. The latter are found to perform slightly better. The model scalar equation is used to show that the incorrectness of the propagation speeds of discontinuities observed in the stiff case is due to the introduction of nonequilibrium values through numerical dissipation in the advection step.

  6. The long-term problems of contaminated land: Sources, impacts and countermeasures

    SciTech Connect

    Baes, C.F. III

    1986-11-01

    This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed ''broken arrows'').

  7. High Order Finite Difference Methods with Subcell Resolution for Advection Equations with Stiff Source Terms

    DTIC Science & Technology

    2011-06-16

introduce this anti-diffusive WENO scheme for Eq. (11). Let xi, i = 1, . . . , N be a uniform (for simplicity) mesh of the computational domain, with mesh...example is the model problem of [23]. Consider Eq. (4) with f(u) = u, the source term given by Eq. (5), and the initial condition: u(x, 0) = { 1, x ≤ 0.3...should be always zero. However, if µ in the source term Eq. (5) is very large, the numerical errors of u in the transition region can result in large

  8. Design parameters and source terms: Volume 1, Design parameters: Revision 0

    SciTech Connect

    Not Available

    1987-10-01

    The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled ''Design Parameters and Source Terms for a Two-Phase Repository in Salt,'' 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC Study identifies the data needs for the Environmental Assessment effort for seven possible Salt Repository sites.

  9. Reconstruction of far-field tsunami amplitude distributions from earthquake sources

    USGS Publications Warehouse

    Geist, Eric L.; Parsons, Thomas E.

    2016-01-01

    The probability distribution of far-field tsunami amplitudes is explained in relation to the distribution of seismic moment at subduction zones. Tsunami amplitude distributions at tide gauge stations follow a similar functional form, well described by a tapered Pareto distribution that is parameterized by a power-law exponent and a corner amplitude. Distribution parameters are first established for eight tide gauge stations in the Pacific, using maximum likelihood estimation. A procedure is then developed to reconstruct the tsunami amplitude distribution that consists of four steps: (1) define the distribution of seismic moment at subduction zones; (2) establish a source-station scaling relation from regression analysis; (3) transform the seismic moment distribution to a tsunami amplitude distribution for each subduction zone; and (4) mix the transformed distribution for all subduction zones to an aggregate tsunami amplitude distribution specific to the tide gauge station. The tsunami amplitude distribution is adequately reconstructed for four tide gauge stations using globally constant seismic moment distribution parameters established in previous studies. In comparisons to empirical tsunami amplitude distributions from maximum likelihood estimation, the reconstructed distributions consistently exhibit higher corner amplitude values, implying that in most cases, the empirical catalogs are too short to include the largest amplitudes. Because the reconstructed distribution is based on a catalog of earthquakes that is much larger than the tsunami catalog, it is less susceptible to the effects of record-breaking events and more indicative of the actual distribution of tsunami amplitudes.
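The tapered Pareto distribution referred to above has complementary CDF S(x) = (x_t/x)^beta * exp((x_t - x)/x_c) for x >= x_t, where beta is the power-law exponent, x_c the corner amplitude, and x_t the observation threshold. A minimal sketch (the parameter values in the test usage are illustrative, not the fitted station values):

```python
import math

def tapered_pareto_sf(x, beta, x_corner, x_t):
    """Survival function P(X > x) of the tapered Pareto distribution:
    power-law decay with exponent beta, rolled off exponentially above
    the corner amplitude x_corner, defined for x >= threshold x_t."""
    if x < x_t:
        return 1.0
    return (x_t / x) ** beta * math.exp((x_t - x) / x_corner)
```

At the threshold S(x_t) = 1, and for x well beyond x_corner the exponential taper suppresses the tail relative to a pure Pareto law, which is why the fitted corner amplitude controls how often the very largest amplitudes occur.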

  10. Reconstruction of Far-Field Tsunami Amplitude Distributions from Earthquake Sources

    NASA Astrophysics Data System (ADS)

    Geist, Eric L.; Parsons, Tom

    2016-12-01

    The probability distribution of far-field tsunami amplitudes is explained in relation to the distribution of seismic moment at subduction zones. Tsunami amplitude distributions at tide gauge stations follow a similar functional form, well described by a tapered Pareto distribution that is parameterized by a power-law exponent and a corner amplitude. Distribution parameters are first established for eight tide gauge stations in the Pacific, using maximum likelihood estimation. A procedure is then developed to reconstruct the tsunami amplitude distribution that consists of four steps: (1) define the distribution of seismic moment at subduction zones; (2) establish a source-station scaling relation from regression analysis; (3) transform the seismic moment distribution to a tsunami amplitude distribution for each subduction zone; and (4) mix the transformed distribution for all subduction zones to an aggregate tsunami amplitude distribution specific to the tide gauge station. The tsunami amplitude distribution is adequately reconstructed for four tide gauge stations using globally constant seismic moment distribution parameters established in previous studies. In comparisons to empirical tsunami amplitude distributions from maximum likelihood estimation, the reconstructed distributions consistently exhibit higher corner amplitude values, implying that in most cases, the empirical catalogs are too short to include the largest amplitudes. Because the reconstructed distribution is based on a catalog of earthquakes that is much larger than the tsunami catalog, it is less susceptible to the effects of record-breaking events and more indicative of the actual distribution of tsunami amplitudes.

  11. Measurements of Infrared and Acoustic Source Distributions in Jet Plumes

    NASA Technical Reports Server (NTRS)

    Agboola, Femi A.; Bridges, James; Saiyed, Naseem

    2004-01-01

    The aim of this investigation was to use the linear phased array (LPA) microphones and infrared (IR) imaging to study the effects of advanced nozzle-mixing techniques on jet noise reduction. Several full-scale engine nozzles were tested at varying power cycles with the linear phased array setup parallel to the jet axis. The array consisted of 16 sparsely distributed microphones. The phased array microphone measurements were taken at a distance of 51.0 ft (15.5 m) from the jet axis, and the results were used to obtain relative overall sound pressure levels from one nozzle design to the other. The IR imaging system was used to acquire real-time dynamic thermal patterns of the exhaust jet from the nozzles tested. The IR camera measured the IR radiation from the nozzle exit to a distance of six fan diameters (X/D(sub FAN) = 6), along the jet plume axis. The images confirmed the expected jet plume mixing intensity, and the phased array results showed the differences in sound pressure level with respect to nozzle configurations. The results show the effects of changes in configurations to the exit nozzles on both the flows mixing patterns and radiant energy dissipation patterns. By comparing the results from these two measurements, a relationship between noise reduction and core/bypass flow mixing is demonstrated.

  12. Decoy-state quantum key distribution with a leaky source

    NASA Astrophysics Data System (ADS)

    Tamaki, Kiyoshi; Curty, Marcos; Lucamarini, Marco

    2016-06-01

    In recent years, there has been a great effort to prove the security of quantum key distribution (QKD) with a minimum number of assumptions. Besides its intrinsic theoretical interest, this would allow for larger tolerance against device imperfections in the actual implementations. However, even in this device-independent scenario, one assumption seems unavoidable, that is, the presence of a protected space devoid of any unwanted information leakage in which the legitimate parties can privately generate, process and store their classical data. In this paper we relax this unrealistic and hardly feasible assumption and introduce a general formalism to tackle the information leakage problem in most of existing QKD systems. More specifically, we prove the security of optical QKD systems using phase and intensity modulators in their transmitters, which leak the setting information in an arbitrary manner. We apply our security proof to cases of practical interest and show key rates similar to those obtained in a perfectly shielded environment. Our work constitutes a fundamental step forward in guaranteeing implementation security of quantum communication systems.

  13. Differential dose contributions on total dose distribution of 125I brachytherapy source

    PubMed Central

    Camgöz, B.; Yeğin, G.; Kumru, M.N.

    2010-01-01

    This work improves on the Monte Carlo simulation approach for the Amersham Model 6711 125I brachytherapy seed source, which is well known from many theoretical and experimental studies. The source, which has a simple geometry, was studied with respect to the criteria of the AAPM TG-43 report. The approach offered by this study involves determining the differential dose contributions that come from virtual partitions of the massive radioactive element of the studied source to the total dose at an analytical calculation point. Some brachytherapy seeds contain multiple radioactive elements, so the dose at any point is the total of the separate doses from each element. For clinical treatments it is important to know the angular and radial dose distributions around a source located in cancerous tissue. The interior geometry of a source affects the characteristics of its dose distribution. Dose information about the inner geometrical structure of a brachytherapy source cannot be acquired by experimental methods because of the physical limits of material and geometry in healthy tissue, so Monte Carlo simulation is the required approach. The EGSnrc Monte Carlo simulation software was used. In the simulation design, the radioactive source was divided into 10 rings, partitioned but not separated from each other. All differential sources were simulated for dose calculation, and the shape of the dose distribution was compared with that of a single, complete source. The anisotropy function was also examined mathematically. PMID:24376927
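
    The partitioning idea can be sketched as a simple superposition: each ring contributes a differential dose that is summed at the calculation point. The toy model below uses a bare inverse-square falloff per ring (illustrative only; the actual study uses full EGSnrc transport under the TG-43 formalism, and the function names and numbers here are hypothetical).

```python
def ring_positions(active_length_mm, n_rings):
    """Centers of n equal ring partitions along the source's active length."""
    dz = active_length_mm / n_rings
    return [-active_length_mm / 2.0 + dz * (i + 0.5) for i in range(n_rings)]

def total_dose(point, z_rings, strength_per_ring):
    """Superpose the differential (point-like, inverse-square) contribution
    of each ring partition at an analysis point (y, z), distances in mm."""
    y, z = point
    return sum(strength_per_ring / (y ** 2 + (z - zr) ** 2) for zr in z_rings)

# Ten equal-activity rings along a 3 mm active length (hypothetical values)
rings = ring_positions(3.0, 10)
```

    Far from the seed the ten partitions behave like a single point source; close to it, the differential contributions reveal how the internal geometry shapes the angular dose distribution.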

  14. GEOCHEMISTRY OF PAHS IN AQUATIC ENVIRONMENTS: A SYNTHESIS OF DISTRIBUTION, SOURCE, PERSISTENCE, PARTITIONING AND BIOAVAILABILITY

    EPA Science Inventory

    On the basis of their distributions, sources, persistence, partitioning and bioavailability, polycyclic aromatic hydrocarbons (PAHs) are a unique class of persistent organic pollutants (POPs) contaminating the aquatic environment. They are of particular interest to geochemists an...

  15. Comparing two micrometeorological techniques for estimating trace gas emissions from distributed sources

    USDA-ARS?s Scientific Manuscript database

    Measuring trace gas emission from distributed sources such as treatment lagoons, treatment wetlands, land spread of manure, and feedlots requires micrometeorological methods. In this study, we tested the accuracy of two relatively new micrometeorological techniques, vertical radial plume mapping (VR...

  17. Using natural archives to track sources and long-term trends of pollution: an introduction

    USGS Publications Warehouse

    Jules Blais,; Rosen, Michael R.; John Smol,

    2015-01-01

    This book explores the myriad ways that environmental archives can be used to study the distribution and long-term trajectories of contaminants. The volume first focuses on reviews that examine the integrity of the historic record, including factors related to hydrology, post-depositional diffusion, and mixing processes. This is followed by a series of chapters dealing with the diverse archives available for long-term studies of environmental pollution.

  18. Long-term variability in bright hard X-ray sources: 5+ years of BATSE data

    NASA Technical Reports Server (NTRS)

    Robinson, C. R.; Harmon, B. A.; McCollough, M. L.; Paciesas, W. S.; Sahi, M.; Scott, D. M.; Wilson, C. A.; Zhang, S. N.; Deal, K. J.

    1997-01-01

    The operation of the Compton Gamma Ray Observatory (CGRO) burst and transient source experiment (BATSE) continues to provide data for inclusion in a database for the analysis of long-term variability in bright, hard X-ray sources. The all-sky capability of BATSE provides up to 30 flux measurements per day for each source. The long baseline and the various rising and setting occultation flux measurements allow searches to be conducted for periodic and quasi-periodic signals with periods between several hours and hundreds of days. Preliminary results from an analysis of the hard X-ray variability in 24 of the brightest BATSE sources are presented. Power density spectra are computed for each source, and profiles are presented of the hard X-ray orbital modulations in some X-ray binaries, together with amplitude modulations and variations in outburst durations and intensities in recurrent X-ray transients.
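
    A standard tool for periodicity searches of this kind, given BATSE's unevenly spaced occultation flux measurements, is the Lomb-Scargle periodogram. The sketch below is a minimal classical implementation (not the BATSE pipeline), normalized by the sample variance.

```python
import numpy as np

def lomb_scargle(t, y, freqs):
    """Classical Lomb-Scargle periodogram for unevenly sampled data.

    t, y  : sample times and flux values (1D arrays)
    freqs : trial frequencies (cycles per unit of t)
    """
    y = y - y.mean()
    power = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        w = 2.0 * np.pi * f
        # Phase offset tau makes the sine and cosine terms orthogonal
        tau = np.arctan2(np.sin(2 * w * t).sum(), np.cos(2 * w * t).sum()) / (2 * w)
        c, s = np.cos(w * (t - tau)), np.sin(w * (t - tau))
        power[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
    return power / y.var()
```

    A strong peak at frequency f flags a candidate period 1/f, e.g. an orbital modulation in an X-ray binary.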

  19. An ontology-based and distributed KDD model for biomedical sources.

    PubMed

    Perez-Rey, David; Anguita, Alberto; Crespo, Jose; Maojo, Victor

    2007-10-11

    Knowledge discovery approaches in modern biomedical research usually require access to heterogeneous and remote data sources in a distributed environment. Traditional KDD models assumed a central repository, lacking mechanisms to access decentralized databases. In such a distributed environment, ontologies can be used in all the KDD phases. We present here a new ontology-based KDD model to improve data preprocessing from heterogeneous sources.

  20. Using Reactive Transport Modeling to Evaluate the Source Term at Yucca Mountain

    SciTech Connect

    Y. Chen

    2001-12-19

    The conventional approach to source-term evaluation for performance assessment of nuclear waste repositories uses speciation-solubility modeling tools and assumes that pure phases of radioelements control their solubility. This assumption may not reflect reality, as most radioelements (except for U) may not form their own pure phases. As a result, solubility limits predicted using the conventional approach are several orders of magnitude higher than the concentrations of radioelements measured in spent fuel dissolution experiments. This paper presents the author's attempt to use a non-conventional approach to evaluate the source term of radionuclide release for Yucca Mountain. Based on the general reactive-transport code AREST-CT, a model for spent fuel dissolution and secondary phase precipitation has been constructed. The model accounts for both equilibrium and kinetic reactions. Its predictions have been compared against laboratory experiments and natural analogues. It is found that, without calibration, the simulated results match laboratory and field observations very well in many aspects. More important is the fact that no contradictions between them have been found. This provides confidence in the predictive power of the model. Based on the concept of Np incorporation into uranyl minerals, the model not only predicts a lower Np source term than that given by conventional Np solubility models, but also produces results consistent with laboratory measurements and observations. Moreover, two hypotheses, whether or not Np enters tertiary uranyl minerals, were tested by comparing model predictions against laboratory observations; the results favor the former. It is concluded that this non-conventional approach to source-term evaluation not only eliminates some of the over-conservatism of the conventional solubility approach, but also gives a realistic representation of the system of interest, which is a prerequisite for truly understanding the long-term

  1. sLORETA allows reliable distributed source reconstruction based on subdural strip and grid recordings.

    PubMed

    Dümpelmann, Matthias; Ball, Tonio; Schulze-Bonhage, Andreas

    2012-05-01

    Source localization based on invasive recordings by subdural strip and grid electrodes is a topic of increasing interest. This simulation study addresses the question of which factors are relevant for reliable source reconstruction based on sLORETA. MRI and electrode positions of a patient undergoing invasive presurgical epilepsy diagnostics were the basis of the sLORETA simulations. A boundary element head model derived from the MRI was used for the simulation of electrical potentials and for source reconstruction. Focal dipolar sources distributed on a regular three-dimensional lattice and spatiotemporally distributed patches served as input for the simulation. In addition to the distance between original and reconstructed source maxima, the activation volume of the reconstruction and the correlation of time courses between the original and reconstructed sources were investigated. Simulations were supplemented by the localization of the patient's spike activity. For noise-free simulated data, sLORETA achieved results with zero localization error. Added noise diminished the percentage of reliable source localizations with a localization error ≤15 mm to 67.8%. Only for source positions close to the electrode contacts did the activation volume correctly represent focal generators. Time courses of original and reconstructed sources were significantly correlated. The case study results showed accurate localization. sLORETA is a distributed source model that can be applied for reliable grid- and strip-based source localization. For distant source positions, overestimation of the extent of the generator has to be taken into account. sLORETA-based source reconstruction has the potential to improve the localization of distributed generators in presurgical epilepsy diagnostics and cognitive neuroscience. Copyright © 2011 Wiley-Liss, Inc.

  2. Sources, distribution, bioavailability, toxicity, and risk assessment of heavy metal(loid)s in complementary medicines.

    PubMed

    Bolan, Shiv; Kunhikrishnan, Anitha; Seshadri, Balaji; Choppala, Girish; Naidu, Ravi; Bolan, Nanthi S; Ok, Yong Sik; Zhang, Ming; Li, Chun-Guang; Li, Feng; Noller, Barry; Kirkham, Mary Beth

    2017-11-01

    The last few decades have seen the rise of alternative medical approaches including the use of herbal supplements, natural products, and traditional medicines, which are collectively known as 'complementary medicines'. However, there are increasing concerns about the safety and health benefits of these medicines. One of the main hazards with the use of complementary medicines is the presence of heavy metal(loid)s such as arsenic (As), cadmium (Cd), lead (Pb), and mercury (Hg). This review deals with the characteristics of complementary medicines in terms of heavy metal(loid) sources, distribution, bioavailability, toxicity, and human risk assessment. The heavy metal(loid)s in these medicines are derived from uptake by medicinal plants, cross-contamination during processing, and therapeutic input of metal(loid)s. This paper discusses the distribution of heavy metal(loid)s in these medicines in terms of their nature, concentration, and speciation. The importance of determining bioavailability for human health risk assessment is emphasized by the need to estimate the daily intake of heavy metal(loid)s in complementary medicines. The review ends with selected case studies of heavy metal(loid) toxicity from complementary medicines, with specific reference to As, Cd, Pb, and Hg. The future research opportunities mentioned in the conclusion of the review will help researchers explore new avenues, methodologies, and approaches to the issue of heavy metal(loid)s in complementary medicines, thereby generating new regulations and proposing a fresh approach toward the safe use of these medicines. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Understanding emergency medical dispatch in terms of distributed cognition: a case study.

    PubMed

    Furniss, Dominic; Blandford, Ann

    Emergency medical dispatch (EMD) is typically a team activity, requiring fluid coordination and communication between team members. Such working situations have often been described in terms of distributed cognition (DC), a framework for understanding team working. DC takes account of factors such as shared representations and artefacts to support reasoning about team working. Although the language of DC has been developed over several years, little attention has been paid to developing a methodology or reusable representation which supports reasoning about an interactive system from a DC perspective. We present a case study in which we developed a method for constructing a DC account of team working in the domain of EMD, focusing on the use of the method for describing an existing EMD work system, identifying sources of weakness in that system, and reasoning about the likely consequences of redesign of the system. The resulting DC descriptions have yielded new insights into the design of EMD work and of tools to support that work within a large EMD centre.

  4. Short-Term Memory Stages in Sign vs. Speech: The Source of the Serial Span Discrepancy

    ERIC Educational Resources Information Center

    Hall, Matthew L.; Bavelier, Daphne

    2011-01-01

    Speakers generally outperform signers when asked to recall a list of unrelated verbal items. This phenomenon is well established, but its source has remained unclear. In this study, we evaluate the relative contribution of the three main processing stages of short-term memory--perception, encoding, and recall--in this effect. The present study…

  5. Risk comparisons based on representative source terms with the NUREG-1150 results

    SciTech Connect

    Mubayi, V.; Davis, R.E.; Hanson, A.L.

    1993-12-01

    Standardized source terms, based on a specified release of fission products during potential accidents at commercial light water nuclear reactors, have been used for a long time for regulatory purposes. The siting of nuclear power plants, for example, which is governed by Part 100 of the Code of Federal Regulations Title 10, has utilized the source term recommended in TID-14844 supplemented by Regulatory Guides 1.3 and 1.4 and the Standard Review Plan. With the introduction of probabilistic risk assessment (PRA) methods, the source terms became characterized not only by the amount of fission products released, but also by the probability of the release. In the Reactor Safety Study, for example, several categories of source terms, characterized by release severity and probability, were developed for both pressurized and boiling water reactors (PWRs and BWRs). These categories were based on an understanding of the likely paths and associated phenomenology of accident progression following core damage to possible failure of the containment and release to the environment.

  6. Parameterization of unresolved obstacles in wave modelling: A source term approach

    NASA Astrophysics Data System (ADS)

    Mentaschi, L.; Pérez, J.; Besio, G.; Mendez, F. J.; Menendez, M.

    2015-12-01

    In the present work we introduce two source terms for the parameterization of energy dissipation due to unresolved obstacles in spectral wave models. The proposed approach differs from the classical one, based on spatial propagation schemes, in that it provides a local representation of phenomena such as unresolved wave energy dissipation. This source term-based approach has the advantage of decoupling the unresolved-obstacle parameterization from the spatial propagation scheme, removing the need to reformulate, reimplement, and revalidate the parameterization for each propagation scheme. Furthermore, it opens the way to parameterizations of other unresolved sheltering effects, such as the rotation and redistribution of wave energy over frequencies. The proposed source terms estimate, respectively, the local energy dissipation and the shadow effect due to unresolved obstacles. The source terms were validated on synthetic case studies, showing their ability to reproduce wave dynamics comparable to those of high-resolution models. The analysis of high-resolution stationary wave simulations may help to better diagnose and study the effects of unresolved obstacles, providing estimates of transparency coefficients for each spectral component and allowing unresolved effects of rotation and redistribution of wave energy over frequencies to be understood and modeled.
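
    The idea of a local dissipation source term can be sketched very simply: energy in a cell containing an unresolved obstacle is removed at a rate set by the group velocity, the cell size, and a transparency coefficient. This is an illustrative toy update, assuming a single transparency coefficient per cell and an explicit time step, not the published parameterization.

```python
def step_with_obstacle(E, cg, alpha, dx, dt):
    """One explicit time step of a local source term mimicking wave-energy
    dissipation by an unresolved obstacle of transparency alpha
    (1 = fully transparent, 0 = fully blocking) in a cell of size dx.

    E  : spectral energy in the cell
    cg : group velocity (m/s)
    """
    S = -cg * (1.0 - alpha) / dx * E   # energy lost per unit time in the cell
    return E + dt * S
```

    Because the obstacle's effect lives entirely in the source term S, the spatial propagation scheme never needs to know about it, which is the decoupling the abstract describes.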

  8. Elevated Natural Source Water Ammonia and Nitrification in the Distribution Systems of Four Water Utilities

    EPA Science Inventory

    Nitrification in drinking water distribution systems is a concern of many drinking water systems. Although chloramination as a source of nitrification (i.e., addition of excess ammonia or breakdown of chloramines) has drawn the most attention, many source waters contain signific...

  10. Correlated Sources in Distributed Networks--Data Transmission, Common Information Characterization and Inferencing

    ERIC Educational Resources Information Center

    Liu, Wei

    2011-01-01

    Correlation is often present among observations in a distributed system. This thesis deals with various design issues when correlated data are observed at distributed terminals, including: communicating correlated sources over interference channels, characterizing the common information among dependent random variables, and testing the presence of…

  11. 26 CFR 1.316-2 - Sources of distribution in general.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... distributions made during the taxable year consist only of money and exceed the earnings and profits of such... corporation is made out of earnings and profits to the extent thereof and from the most recently accumulated earnings and profits. In determining the source of a distribution, consideration should be given first,...

  12. 26 CFR 1.316-2 - Sources of distribution in general.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... distributions made during the taxable year consist only of money and exceed the earnings and profits of such... corporation is made out of earnings and profits to the extent thereof and from the most recently accumulated earnings and profits. In determining the source of a distribution, consideration should be given first,...

  13. 30 CFR 872.12 - Where do moneys distributed from the Fund and other sources go?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 3 2011-07-01 2011-07-01 false Where do moneys distributed from the Fund and... ENFORCEMENT, DEPARTMENT OF THE INTERIOR ABANDONED MINE LAND RECLAMATION MONEYS AVAILABLE TO ELIGIBLE STATES AND INDIAN TRIBES § 872.12 Where do moneys distributed from the Fund and other sources go? (a)...

  14. 30 CFR 872.12 - Where do moneys distributed from the Fund and other sources go?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 3 2012-07-01 2012-07-01 false Where do moneys distributed from the Fund and... ENFORCEMENT, DEPARTMENT OF THE INTERIOR ABANDONED MINE LAND RECLAMATION MONEYS AVAILABLE TO ELIGIBLE STATES AND INDIAN TRIBES § 872.12 Where do moneys distributed from the Fund and other sources go? (a)...

  16. Numerical analysis of atomic density distribution in arc driven negative ion sources

    SciTech Connect

    Yamamoto, T. Shibata, T.; Hatayama, A.; Kashiwagi, M.; Hanada, M.; Sawada, K.

    2014-02-15

    The purpose of this study is to calculate the atomic hydrogen (H0) density distribution in the JAEA 10 ampere negative ion source. A collisional radiative model is developed for the calculation of the H0 density distribution. The non-equilibrium feature of the electron energy distribution function (EEDF), which mainly determines the H0 production rate, is included by substituting the EEDF calculated from a 3D electron transport analysis. In this paper, the H0 production rate, the ionization rate, and the density distribution in the source chamber are calculated. In the region where high-energy electrons exist, H0 production and ionization are enhanced. The calculated H0 density distribution without the effect of H0 transport is relatively small in the upper region. In the next step, this effect should be taken into account to obtain a more realistic H0 distribution.
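
    The collisional radiative calculation hinges on rate coefficients obtained by integrating cross sections over the (generally non-Maxwellian) EEDF, <sigma*v> = integral of sigma(E) v(E) f(E) dE. A minimal sketch, assuming a normalized f(E) on an energy grid and using a hand-rolled trapezoid rule; the constant cross section in the test is purely illustrative.

```python
import numpy as np

def trapz(y, x):
    """Trapezoidal integration on a 1D grid."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def rate_coefficient(energy_eV, sigma_m2, eedf):
    """Rate coefficient <sigma*v> for an arbitrary (e.g. non-Maxwellian)
    EEDF f(E), normalized so that the integral of f(E) dE equals 1.

    energy_eV : energy grid (eV); sigma_m2 : cross section on that grid
    """
    e_charge, m_e = 1.602e-19, 9.109e-31
    v = np.sqrt(2.0 * energy_eV * e_charge / m_e)   # electron speed (m/s)
    return trapz(sigma_m2 * v * eedf, energy_eV)
```

    Substituting an EEDF from a 3D electron transport analysis, rather than assuming a Maxwellian, is exactly what lets the high-energy tail enhance the H0 production and ionization rates.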

  17. Measurement-device-independent quantum key distribution with source state errors and statistical fluctuation

    NASA Astrophysics Data System (ADS)

    Jiang, Cong; Yu, Zong-Wen; Wang, Xiang-Bin

    2017-03-01

    We show how to calculate the secure final key rate in the four-intensity decoy-state measurement-device-independent quantum key distribution protocol with both source errors and statistical fluctuations with a certain failure probability. Our results rely only on the range of a few parameters in the source state. All imperfections in this protocol have been taken into consideration without assuming any specific error patterns of the source.
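
    For context, decoy-state analyses of this kind bound the single-photon contribution and feed it into a GLLP-style asymptotic key rate, R = Q11[1 - h2(e11)] - f Qmu h2(Emu). The sketch below shows only that familiar final step with illustrative parameter names; the paper's four-intensity analysis with source errors and fluctuations is far more involved.

```python
import math

def h2(x):
    """Binary (Shannon) entropy of a bit with probability x."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def key_rate(q11, e11, q_mu, e_mu, f=1.16):
    """GLLP-style key rate: q11/e11 are the estimated single-photon gain
    and phase error rate, q_mu/e_mu the signal gain and QBER, and f the
    error-correction inefficiency."""
    return q11 * (1.0 - h2(e11)) - f * q_mu * h2(e_mu)
```

    The decoy-state machinery exists precisely to bound q11 and e11 from the observed gains and error rates of the different intensity settings.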

  18. Size distribution of acidic sulfate ions in fine ambient particulate matter and assessment of source region effect

    NASA Astrophysics Data System (ADS)

    Hazi, Y.; Heikkinen, M. S. A.; Cohen, B. S.

    Human exposure studies have strongly suggested that the fine fraction of ambient particulate matter (PM) and its associated acidic sulfates are closely correlated with observed adverse health effects. Acidic sulfates are the products of atmospheric sulfur dioxide oxidation and neutralization processes. Few data are available on the amount and size distribution of acidic sulfates within the fine fraction of ambient PM. Knowledge of this distribution will help in understanding their toxic mechanisms in the human respiratory tract. The goals of this research were: (1) to measure the size distribution of hydrogen ion, sulfate, and ammonium within the fine fraction of the ambient aerosol in air masses originating from different source regions; and (2) to examine the effect of the source region and the seasons on the sampled PM composition. Six size fractions within the fine ambient PM were collected using a micro-orifice impactor. Results from 30 sampling sessions demonstrated that higher total concentrations of these three ions were observed during the warm months than during the cold months of the year. Size distribution results show that the largest fractions of hydrogen, sulfate, and ammonium ions occurred in particles with a midpoint diameter of 0.38 μm. Although most of the mass containing hydrogen and sulfate ions was measured in the fraction of particles with a 0.38 μm midpoint diameter, the ultrafine fraction (<0.1 μm) was found to be more acidic. Ambient ion concentrations varied between sampling sessions and seasons, but the overall size distribution profiles are similar. Air mass back trajectories were used to identify the source region of the sampled aerosols. No apparent source region effect was observed in terms of the distribution profile of the ions. However, samples collected from air masses that originated from, or passed over, areas with high sulfur dioxide emissions showed higher concentrations of the different ions.
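
    Impactor data like these are often summarized by a mass median diameter (MMD): the diameter at which the cumulative mass fraction across stages crosses 50%, conventionally interpolated in log-diameter space. A minimal sketch with hypothetical stage values, not the study's data:

```python
import numpy as np

def mass_median_diameter(cut_diams_um, stage_mass):
    """Mass median diameter from cascade-impactor stage data.

    cut_diams_um : the n+1 stage boundary (cut) diameters, ascending (um)
    stage_mass   : the n stage masses (any consistent unit)
    """
    frac = np.asarray(stage_mass, dtype=float)
    frac = frac / frac.sum()
    cum = np.concatenate(([0.0], np.cumsum(frac)))   # cumulative mass fraction
    # Interpolate the 50% crossing in log-diameter space
    return float(np.exp(np.interp(0.5, cum, np.log(cut_diams_um))))
```

    The same cumulative curve, evaluated per ion, is how one reads off statements like "most of the hydrogen and sulfate ion mass sat in the 0.38 μm stage."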

  19. Laboratory experiments designed to provide limits on the radionuclide source term for the NNWSI Project

    SciTech Connect

    Oversby, V.M.; McCright, R.D.

    1984-11-01

    The Nevada Nuclear Waste Storage Investigations Project is investigating the suitability of the tuffaceous rocks at Yucca Mountain Nevada for potential use as a high-level nuclear waste repository. The horizon under investigation lies above the water table, and therefore offers a setting that differs substantially from other potential repository sites. The unsaturated zone environment allows a simple, but effective, waste package design. The source term for radionuclide release from the waste package will be based on laboratory experiments that determine the corrosion rates and mechanisms for the metal container and the dissolution rate of the waste form under expected long term conditions. This paper describes the present status of laboratory results and outlines the approach to be used in combining the data to develop a realistic source term for release of radionuclides from the waste package. 16 refs., 3 figs., 1 tab.

  20. Severe accident source term characteristics for selected Peach Bottom sequences predicted by the MELCOR Code

    SciTech Connect

    Carbajo, J.J.

    1993-09-01

    The purpose of this report is to compare in-containment source terms developed for NUREG-1159, which used the Source Term Code Package (STCP), with those generated by MELCOR to identify significant differences. For this comparison, two short-term depressurized station blackout sequences (with a dry cavity and with a flooded cavity) and a Loss-of-Coolant Accident (LOCA) concurrent with complete loss of the Emergency Core Cooling System (ECCS) were analyzed for the Peach Bottom Atomic Power Station (a BWR-4 with a Mark I containment). The results indicate that for the sequences analyzed, the two codes predict similar total in-containment release fractions for each of the element groups. However, the MELCOR/CORBH Package predicts significantly longer times for vessel failure and reduced energy of the released material for the station blackout sequences (when compared to the STCP results). MELCOR also calculated smaller releases into the environment than STCP for the station blackout sequences.

  1. Updating source term and atmospheric dispersion simulations for the dose reconstruction in Fukushima Daiichi Nuclear Power Station Accident

    NASA Astrophysics Data System (ADS)

    Nagai, Haruyasu; Terada, Hiroaki; Tsuduki, Katsunori; Katata, Genki; Ota, Masakazu; Furuno, Akiko; Akari, Shusaku

    2017-09-01

    In order to assess the radiological dose to the public resulting from the Fukushima Daiichi Nuclear Power Station (FDNPS) accident in Japan, especially for the early phase of the accident when no measured data are available for that purpose, the spatial and temporal distribution of radioactive materials in the environment is reconstructed by computer simulations. In this study, by refining the source term of radioactive materials discharged into the atmosphere and modifying the atmospheric transport, dispersion and deposition model (ATDM), the atmospheric dispersion simulation of radioactive materials is improved. A database of the spatiotemporal distribution of radioactive materials in the air and on the ground surface is then developed from the output of the simulation. This database is used in other studies for dose assessment by coupling it with the behavioral patterns of evacuees from the FDNPS accident. With the improved ATDM simulation, which uses a new meteorological model and a sophisticated deposition scheme, the 137Cs and 131I deposition patterns were reproduced well. To better reproduce the dispersion processes, the source term was further refined by optimizing it against the improved ATDM simulation using new monitoring data.

  2. Detailed dose distribution prediction of Cf-252 brachytherapy source with boron loading dose enhancement.

    PubMed

    Ghassoun, J; Mostacci, D; Molinari, V; Jehouani, A

    2010-02-01

    The purpose of this work is to evaluate the dose rate distribution of a (252)Cf brachytherapy source and to determine the effect of boron loading on that distribution. The study was carried out using Monte Carlo simulation. To validate the Monte Carlo computer code, the dosimetric parameters were determined following the updated TG-43 formalism and compared with current literature data. The validated code was then applied to evaluate the neutron and photon dose distributions and to illustrate the boron loading effect.
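
    The TG-43 validation step mentioned here reduces, in the point-source approximation, to a product of tabulated factors: D(r) = Sk * Lambda * (r0/r)^2 * g(r) * phi_an(r). The sketch below shows that formula with unit-valued placeholder functions; real radial dose and anisotropy data come from consensus tables or, as in this study, from the Monte Carlo results themselves.

```python
def tg43_point_dose_rate(sk, dose_rate_const, r_cm, g_r, phi_an, r0_cm=1.0):
    """TG-43 point-source approximation.

    sk             : air-kerma strength Sk
    dose_rate_const: dose-rate constant Lambda
    r_cm           : distance from the source (cm), r0_cm the 1 cm reference
    g_r, phi_an    : radial dose function g(r) and 1D anisotropy function,
                     both callables of r
    """
    return (sk * dose_rate_const * (r0_cm / r_cm) ** 2
            * g_r(r_cm) * phi_an(r_cm))
```

    With g and phi_an fixed at 1, the dose rate falls off as 1/r^2; the tabulated functions then encode the deviations caused by attenuation, scatter, and source geometry.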

  3. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    SciTech Connect

    Grabaskas, David; Bucknor, Matthew; Jerden, James; Brunett, Acacia J.; Denman, Matthew; Clark, Andrew; Denning, Richard S.

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal-fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify and prioritize any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal of identifying gaps in the current knowledge base. The second path, performed by an independent contractor, involved sensitivity analyses to determine the importance of particular radionuclides and transport phenomena with regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  4. Fukushima Daiichi reactor source term attribution using cesium isotope ratios from contaminated environmental samples

    DOE PAGES

    Snow, Mathew S.; Snyder, Darin C.; Delmore, James E.

    2016-01-18

    Source term attribution of environmental contamination following the Fukushima Daiichi Nuclear Power Plant (FDNPP) disaster is complicated by a large number of possible similar emission source terms (e.g. FDNPP reactor cores 1–3 and spent fuel ponds 1–4). Cesium isotopic analyses can be utilized to discriminate between environmental contamination from different FDNPP source terms and, if samples are sufficiently temporally resolved, potentially provide insights into the extent of reactor core damage at a given time. Rice, soil, mushroom, and soybean samples taken 100–250 km from the FDNPP site were dissolved using microwave digestion. Radiocesium was extracted and purified using two sequential ammonium molybdophosphate-polyacrylonitrile columns, following which 135Cs/137Cs isotope ratios were measured using thermal ionization mass spectrometry (TIMS). Results were compared with data reported previously from locations to the northwest of FDNPP and 30 km to the south of FDNPP. 135Cs/137Cs isotope ratios from samples 100–250 km to the southwest of the FDNPP site show a consistent value of 0.376 ± 0.008. 135Cs/137Cs versus 134Cs/137Cs correlation plots suggest that radiocesium to the southwest is derived from a mixture of FDNPP reactor cores 1, 2, and 3. Conclusions from the cesium isotopic data are in agreement with those derived independently from the event chronology combined with meteorological conditions at the time of the disaster. In conclusion, cesium isotopic analyses provide a powerful tool for source term discrimination of environmental radiocesium contamination at the FDNPP site. For higher precision source term attribution and forensic determination of the FDNPP core conditions based upon cesium, analyses of a larger number of samples from locations to the north and south of the FDNPP site (particularly time-resolved air filter samples) are needed. Published in 2016. This article is a U.S. Government work and is in the public domain.
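
    The mixing argument behind the correlation plots can be made concrete: because the ratio's denominator is 137Cs, a sample's 135Cs/137Cs ratio is the 137Cs-weighted average of the contributing cores' ratios. A minimal sketch (the ratios and shares below are illustrative placeholders, not measured values):

```python
def mixed_cs_ratio(source_ratios, cs137_shares):
    """135Cs/137Cs ratio of a mixture of source terms.

    source_ratios : per-source 135Cs/137Cs ratios
    cs137_shares  : each source's share of the total 137Cs (sums to 1)
    """
    assert abs(sum(cs137_shares) - 1.0) < 1e-9, "shares must sum to 1"
    # Sum(135Cs_i) / Sum(137Cs_i) = Sum(r_i * share_i) when shares are 137Cs-based
    return sum(r * f for r, f in zip(source_ratios, cs137_shares))
```

    Inverting this relation against measured ratios (e.g. the 0.376 value reported to the southwest) is what lets a two- or three-endmember mixture of core signatures be fit to environmental samples.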

  5. Low-level radioactive waste source terms for the 1992 integrated data base

    SciTech Connect

    Loghry, S L; Kibbey, A H; Godbee, H W; Icenhour, A S; DePaoli, S M

    1995-01-01

    This technical manual presents updated generic source terms (i.e., unitized amounts and radionuclide compositions) which have been developed for use in the Integrated Data Base (IDB) Program of the U.S. Department of Energy (DOE). These source terms were used in the IDB annual report, Integrated Data Base for 1992: Spent Fuel and Radioactive Waste Inventories, Projections, and Characteristics, DOE/RW-0006, Rev. 8, October 1992. They are useful as a basis for projecting future amounts (volume and radioactivity) of low-level radioactive waste (LLW) shipped for disposal at commercial burial grounds or sent for storage at DOE solid-waste sites. Commercial fuel cycle LLW categories include boiling-water reactor, pressurized-water reactor, fuel fabrication, and uranium hexafluoride (UF6) conversion. Commercial nonfuel cycle LLW includes institutional/industrial (I/I) waste. The LLW from DOE operations is categorized as uranium/thorium, fission product, induced activity, tritium, alpha, and "other". Fuel cycle commercial LLW source terms are normalized on the basis of net electrical output [MW(e)-year], except for UF6 conversion, which is normalized on the basis of heavy metal requirement [metric tons of initial heavy metal]. The nonfuel cycle commercial LLW source term is normalized on the basis of volume (cubic meters) and radioactivity (curies) for each subclass within the I/I category. The DOE LLW is normalized in a manner similar to that for commercial I/I waste. The revised source terms are based on the best available historical data through 1992.

  6. Fusion of chemical, biological, and meteorological observations for agent source term estimation and hazard refinement

    NASA Astrophysics Data System (ADS)

    Bieringer, Paul E.; Rodriguez, Luna M.; Sykes, Ian; Hurst, Jonathan; Vandenberghe, Francois; Weil, Jeffrey; Bieberbach, George, Jr.; Parker, Steve; Cabell, Ryan

    2011-05-01

    Chemical and biological (CB) agent detection and effective use of these observations in hazard assessment models are key elements of our nation's CB defense program that seeks to ensure that Department of Defense (DoD) operations are minimally affected by a CB attack. Accurate hazard assessments rely heavily on the source term parameters necessary to characterize the release in the transport and dispersion (T&D) simulation. Unfortunately, these source parameters are often not known and based on rudimentary assumptions. In this presentation we describe an algorithm that utilizes variational data assimilation techniques to fuse CB and meteorological observations to characterize agent release source parameters and provide a refined hazard assessment. The underlying algorithm consists of a combination of modeling systems, including the Second order Closure Integrated PUFF model (SCIPUFF), its corresponding Source Term Estimation (STE) model, a hybrid Lagrangian-Eulerian Plume Model (LEPM), its formal adjoint, and the software infrastructure necessary to link them. SCIPUFF and its STE model are used to calculate a "first guess" source estimate. The LEPM and corresponding adjoint are then used to iteratively refine this release source estimate using variational data assimilation techniques. This algorithm has undergone preliminary testing using virtual "single realization" plume release data sets from the Virtual THreat Response Emulation and Analysis Testbed (VTHREAT) and data from the FUSION Field Trials 2007 (FFT07). The end-to-end prototype of this system that has been developed to illustrate its use within the United States (US) Joint Effects Model (JEM) will be demonstrated.
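The core of a source term estimation step like the one described above can be illustrated with a toy linear model (a generic sketch, not the SCIPUFF/LEPM implementation): when the release location is fixed and transport is linear, predicted concentrations scale with the source strength Q, so the variational cost has a closed-form minimizer.

```python
import numpy as np

def estimate_source_strength(unit_plume, observations):
    """Least-squares estimate of a scalar source strength Q.

    For a linear dispersion model with a fixed release location, predicted
    concentrations are c = Q * unit_plume, so minimizing
    J(Q) = ||observations - Q * unit_plume||^2 gives Q in closed form.
    """
    u = np.asarray(unit_plume, dtype=float)
    y = np.asarray(observations, dtype=float)
    return float(u @ y / (u @ u))

# Hypothetical sensor responses to a unit release, and noisy observations
# of a 3 kg/s release (all numbers invented for illustration)
unit_plume = np.array([0.8, 0.5, 0.3, 0.1, 0.05])
rng = np.random.default_rng(0)
observations = 3.0 * unit_plume + rng.normal(0.0, 0.01, unit_plume.size)
q_hat = estimate_source_strength(unit_plume, observations)
```

In the full problem the source location, timing, and size are also unknown, which is why an iterative adjoint-based refinement is needed rather than this one-line solution.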

  7. A Systematic Search for Short-term Variability of EGRET Sources

    NASA Technical Reports Server (NTRS)

    Wallace, P. M.; Griffis, N. J.; Bertsch, D. L.; Hartman, R. C.; Thompson, D. J.; Kniffen, D. A.; Bloom, S. D.

    2000-01-01

    The 3rd EGRET Catalog of High-energy Gamma-ray Sources contains 170 unidentified sources, and there is great interest in the nature of these sources. One means of determining source class is the study of flux variability on time scales of days; pulsars are believed to be stable on these time scales while blazars are known to be highly variable. In addition, previous work has demonstrated that 3EG J0241-6103 and 3EG J1837-0606 are candidates for a new gamma-ray source class. These sources near the Galactic plane display transient behavior but cannot be associated with any known blazars. Although many instances of flaring AGN have been reported, the EGRET database has not been systematically searched for occurrences of short-timescale (approximately 1 day) variability. These considerations have led us to conduct a systematic search for short-term variability in EGRET data, covering all viewing periods through proposal cycle 4. Six 3EG catalog sources are reported here to display variability on short time scales; four of them are unidentified. In addition, three non-catalog variable sources are discussed.

  8. Measurement of anisotropic angular distributions of photon energy spectra for I-125 brachytherapy sources.

    PubMed

    Unno, Yasuhiro; Yunoki, Akira; Kurosawa, Tadahiro; Yamada, Takahiro; Sato, Yasushi; Hino, Yoshio

    2012-09-01

    The angular distribution of photon energy spectra emitted from an I-125 brachytherapy source was measured using a specially designed jig in the range of ±70° in the plane of the long axis of the source. It is important to investigate the angular dependence of photon emissions from these sources for the calibration of the air kerma rate. The results show that the influence of the distributions between 0° and ±8° is small enough to allow a calibration using current primary instruments which have a large entrance window.

  9. Extended Tonks-Langmuir-type model with non-Boltzmann-distributed electrons and cold ion sources

    NASA Astrophysics Data System (ADS)

    Kamran, M.; Kuhn, S.; Tskhakaya, D. D.; Khan, M.; Khan

    2013-04-01

    A general formalism for calculating the potential distribution Φ(z) in the quasineutral region of a new class of plane Tonks-Langmuir (TL)-type bounded-plasma-system (BPS) models differing from the well-known `classical' TL model (Tonks, L. and Langmuir, I. 1929 A general theory of the plasma of an arc. Phys. Rev. 34, 876) by allowing for arbitrary (but still cold) ion sources and arbitrary electron distributions is developed. With individual particles usually undergoing microscopic collision/sink/source (CSS) events, extensive use is made here of the basic kinetic-theory concept of `CSS-free trajectories' (i.e., the characteristics of the kinetic equation). Two types of electron populations, occupying the `type-t' and `type-p' domains of electron phase space, are distinguished. By definition, the type-t and type-p domains are made up of phase points lying on type-t (`trapped') CSS-free trajectories (not intersecting the walls and closing on themselves) and type-p (`passing') ones (starting at one of the walls and ending at the other). This work being the first step, it is assumed that ε ≡ λD/l → 0+ (where λD and l are a typical Debye length and a typical ionization length, respectively) so that the system exhibits a finite quasineutral `plasma' region and two infinitesimally thin `sheath' regions associated with the `sheath-edge singularities' |dΦ/dz|z→±zs → ∞. The potential in the plasma region is required to satisfy a plasma equation (quasineutrality condition) of the form ni{Φ} = ne(Φ), where the electron density ne(Φ) is given and the ion density ni{Φ} is expressed in terms of trajectory integrals of the ion kinetic equation, with the ions produced by electron-impact ionization of cold neutrals. While previous TL-type models were characterized by electrons diffusing under the influence of frequent collisions with the neutral background particles and approximated by Maxwellian (Riemann, K.-U. 2006 Plasma-sheath transition in the

  10. Imaging a spatially confined photoacoustic source defined by a distribution of plasmonic nanoparticles

    NASA Astrophysics Data System (ADS)

    Norton, Stephen J.; Vo-Dinh, Tuan

    2012-05-01

    This paper describes the use of plasmonic nanoparticles in photoacoustic imaging. When acoustic waves are generated by thermoacoustic expansion in the fluid medium surrounding a distribution of these particles and the acoustic signals are recorded over a planar aperture, a bandlimited image of this distribution can be reconstructed. It is shown that the accessible portion of the three-dimensional spatial Fourier transform of the unknown source distribution is a spherical shell in k-space, with the core representing missing low-frequency Fourier components of the source density. When the source arises from an isolated distribution of nanoparticles, the iterative Gerchberg-Papoulis procedure can be applied to recover the low-frequency Fourier components. It is shown that this version of the photoacoustic source reconstruction problem is well suited for the use of this procedure. In this way, the fidelity of the image of the photoacoustic-generated source defined by the particle concentration can be enhanced. The procedure is illustrated using simulated data derived from a hypothetical source distribution.
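The Gerchberg-Papoulis procedure mentioned above alternates between enforcing the measured Fourier data and the known spatial support of the source. A minimal 1-D sketch (invented toy data; the paper works with a 3-D k-space shell) looks like this:

```python
import numpy as np

def gerchberg_papoulis(measured_k, known_mask, support, n_iter=500):
    """Recover missing Fourier components of a compactly supported source.

    measured_k : complex array of Fourier data (valid only where known_mask is True)
    known_mask : bool array, True where Fourier components were measured
    support    : bool array, True inside the known spatial support of the source
    """
    estimate = np.zeros_like(measured_k)
    for _ in range(n_iter):
        estimate[known_mask] = measured_k[known_mask]  # enforce measured k-space data
        img = np.fft.ifftn(estimate)
        img[~support] = 0.0                            # enforce the support constraint
        estimate = np.fft.fftn(img)
    estimate[known_mask] = measured_k[known_mask]
    return np.fft.ifftn(estimate).real

# 1-D toy example: a source confined to a small support, with the
# low-frequency "core" of its spectrum missing (analogous to the k-space shell)
rng = np.random.default_rng(0)
n = 64
support = np.zeros(n, dtype=bool)
support[28:36] = True
truth = np.where(support, rng.random(n) + 0.5, 0.0)
known = np.abs(np.fft.fftfreq(n)) > 0.05               # low frequencies unmeasured
measured = np.fft.fftn(truth) * known
recovered = gerchberg_papoulis(measured, known, support)
bandlimited = np.fft.ifftn(measured).real
```

The support constraint is what makes the missing low-frequency components recoverable, which is why the method suits an isolated nanoparticle distribution.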

  11. 3D iterative full and half scan reconstruction in CT architectures with distributed sources

    NASA Astrophysics Data System (ADS)

    Iatrou, M.; De Man, B.; Beque, D.; Yin, Z.; Khare, K.; Benson, T. M.

    2008-03-01

    In 3 rd generation CT systems projection data, generated by X-rays emitted from a single source and passing through the imaged object, are acquired by a single detector covering the entire field of view (FOV). Novel CT system architectures employing distributed sources [1,2] could extend the axial coverage, while removing cone-beam artifacts and improving spatial resolution and dose. The sources can be distributed in plane and/or in the longitudinal direction. We investigate statistical iterative reconstruction of multi-axial data, acquired with simulated CT systems with multiple sources distributed along the in-plane and longitudinal directions. The current study explores the feasibility of 3D iterative Full and Half Scan reconstruction methods for CT systems with two different architectures. In the first architecture the sources are distributed in the longitudinal direction, and in the second architecture the sources are distributed both longitudinally and trans-axially. We used Penalized Weighted Least Squares Transmission Reconstruction (PWLSTR) and incorporated a projector-backprojector model matching the simulated architectures. The proposed approaches minimize artifacts related to the proposed geometries. The reconstructed images show that the investigated architectures can achieve good image quality for very large coverage without severe cone-beam artifacts.

  12. Search for correlated radio and optical events in long-term studies of extragalactic sources

    NASA Technical Reports Server (NTRS)

    Pomphrey, R. B.; Smith, A. G.; Leacock, R. J.; Olsson, C. N.; Scott, R. L.; Pollock, J. T.; Edwards, P.; Dent, W. A.

    1976-01-01

    For the first time, long-term records of radio and optical fluxes of a large sample of variable extragalactic sources have been assembled and compared, with linear cross-correlation analysis being used to reinforce the visual comparisons. Only in the case of the BL Lac object OJ 287 is the correlation between radio and optical records strong. In the majority of cases there is no evidence of significant correlation, although nine sources show limited or weak evidence of correlation. The results do not support naive extrapolation of the expanding source model. The general absence of strong correlation between the radio and optical regions has important implications for the energetics of events occurring in such sources.
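The linear cross-correlation analysis used to reinforce such visual comparisons can be sketched generically (not the authors' code) as the correlation coefficient between two mean-subtracted, variance-normalized flux records over a range of lags:

```python
import numpy as np

def xcorr_coeffs(a, b, max_lag):
    """Correlation coefficient of two flux records at integer lags.

    A peak at a negative lag means features in `a` lead those in `b`.
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    n = len(a)
    lags = np.arange(-max_lag, max_lag + 1)
    r = []
    for lag in lags:
        if lag >= 0:
            r.append(np.mean(a[lag:] * b[:n - lag]))  # pairs (a[i+lag], b[i])
        else:
            r.append(np.mean(a[:n + lag] * b[-lag:]))
    return lags, np.array(r)

# Synthetic example: an "optical" record that is a delayed copy of the "radio" one
rng = np.random.default_rng(1)
radio = rng.normal(size=300)
optical = np.roll(radio, 5)   # optical events occur 5 samples after radio
lags, r = xcorr_coeffs(radio, optical, max_lag=10)
```

A strong, isolated peak (as found for OJ 287) stands well above the noise floor of the other lags; the "weak or limited" cases show no such dominant peak.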

  13. Estimating usual food intake distributions by using the multiple source method in the EPIC-Potsdam Calibration Study.

    PubMed

    Haubrock, Jennifer; Nöthlings, Ute; Volatier, Jean-Luc; Dekkers, Arnold; Ocké, Marga; Harttig, Ulrich; Illner, Anne-Kathrin; Knüppel, Sven; Andersen, Lene F; Boeing, Heiner

    2011-05-01

    Estimating usual food intake distributions from short-term quantitative measurements is critical when occasionally or rarely eaten food groups are considered. To overcome this challenge by statistical modeling, the Multiple Source Method (MSM) was developed in 2006. The MSM provides usual food intake distributions from individual short-term estimates by combining the probability and the amount of consumption with incorporation of covariates into the modeling part. Habitual consumption frequency information may be used in 2 ways: first, to distinguish true nonconsumers from occasional nonconsumers in short-term measurements and second, as a covariate in the statistical model. The MSM is therefore able to calculate estimates for occasional nonconsumers. External information on the proportion of nonconsumers of a food can also be handled by the MSM. As a proof-of-concept, we applied the MSM to a data set from the European Prospective Investigation into Cancer and Nutrition (EPIC)-Potsdam Calibration Study (2004) comprising 393 participants who completed two 24-h dietary recalls and one FFQ. Usual intake distributions were estimated for 38 food groups with a proportion of nonconsumers > 70% in the 24-h dietary recalls. The intake estimates derived by the MSM corresponded with the observed values such as the group mean. This study shows that the MSM is a useful and applicable statistical technique to estimate usual food intake distributions, if at least 2 repeated measurements per participant are available, even for food groups with a sizeable percentage of nonconsumers.
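The core two-part idea of the MSM, an individual probability of consumption multiplied by the usual amount on consumption days, can be sketched as follows. This is a simplified illustration: the actual MSM additionally models covariates and separates within-person from between-person variance.

```python
import numpy as np

def usual_intake(recalls, never_consumer):
    """Two-part usual intake: P(consumption day) x usual amount on consumption days.

    recalls        : (n_subjects, n_days) consumed amounts from repeated 24-h recalls
                     (0 = no consumption recorded that day)
    never_consumer : bool array from the FFQ, True marks true nonconsumers
                     (as opposed to occasional nonconsumers with all-zero recalls)
    """
    recalls = np.asarray(recalls, dtype=float)
    days_consumed = (recalls > 0).sum(axis=1)
    p_consume = days_consumed / recalls.shape[1]
    # Mean amount on consumption days; occasional nonconsumers (no consumption
    # days observed) fall back to the population mean among consumers
    pop_mean = recalls[recalls > 0].mean() if (recalls > 0).any() else 0.0
    mean_amount = np.where(days_consumed > 0,
                           recalls.sum(axis=1) / np.maximum(days_consumed, 1),
                           pop_mean)
    usual = p_consume * mean_amount
    usual[np.asarray(never_consumer)] = 0.0  # FFQ identifies true nonconsumers
    return usual

# Example: three subjects, two recall days; subject 2 reports never consuming on the FFQ
recalls = np.array([[100.0, 0.0],
                    [0.0, 0.0],
                    [50.0, 50.0]])
estimates = usual_intake(recalls, never_consumer=[False, True, False])
```

Note how the FFQ information resolves the ambiguity of an all-zero recall record, which is exactly the rarely-eaten-foods problem the abstract highlights.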

  14. Impact of routine episodic emissions on the expected frequency distribution of emissions from oil and gas production sources.

    NASA Astrophysics Data System (ADS)

    Smith, N.; Blewitt, D.; Hebert, L. B.

    2015-12-01

    In coordination with oil and gas operators, we developed a high resolution (< 1 min) simulation of temporal variability in well-pad oil and gas emissions over a year. We include routine emissions from condensate tanks, dehydrators, pneumatic devices, fugitive leaks and liquids unloading. We explore the variability in natural gas emissions from these individual well-pad sources, and find that routine short-term episodic emissions such as tank flashing and liquids unloading result in the appearance of a skewed, or 'fat-tail', distribution of emissions from an individual well-pad over time. Additionally, we explore the expected variability in emissions from multiple wells with different raw gas composition, gas/liquids production volumes and control equipment. Differences in well-level composition, production volume and control equipment translate into differences in well-level emissions, leading to a fat-tail distribution of emissions in the absence of operational upsets. Our results have several implications for recent studies focusing on emissions from oil and gas sources. The time scale of emission estimates is important and has policy implications. Fat-tail distributions may not be entirely driven by avoidable mechanical failures, and are expected to occur under routine operational conditions from short-duration emissions (e.g., tank flashing, liquids unloading). An understanding of the expected distribution of emissions for a particular population of wells is necessary to evaluate whether the observed distribution is more skewed than expected. Temporal variability in well-pad emissions makes comparisons to annual average emissions inventories difficult and may complicate the interpretation of long-term ambient fenceline monitoring data. Sophisticated change detection algorithms will be necessary to identify when true operational upsets occur versus routine short-term emissions.
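The qualitative claim, that routine episodic events alone produce a fat-tailed emissions distribution, is easy to reproduce with a toy Monte Carlo. All rates and event parameters below are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(42)
minutes = 60 * 24 * 365                                  # one year at 1-min resolution

# Routine quasi-steady sources (pneumatics, fugitives), in arbitrary rate units
baseline = np.clip(rng.normal(1.0, 0.05, minutes), 0.0, None)

# Routine episodic events: roughly one 10-minute tank-flash episode per day,
# at 20-80x the quasi-steady rate
emissions = baseline.copy()
for start in rng.integers(0, minutes - 10, size=365):
    emissions[start:start + 10] += rng.uniform(20.0, 80.0)

def skewness(x):
    """Sample skewness: third central moment over std cubed."""
    d = x - x.mean()
    return (d ** 3).mean() / (d ** 2).mean() ** 1.5
```

Even though episodic minutes are well under 1% of the year, they dominate the tail: the episodic series is heavily right-skewed while the quasi-steady series is nearly symmetric, with no "operational upset" in the simulation at all.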

  15. Size distribution, mixing state and source apportionment of black carbon aerosol in London during wintertime

    NASA Astrophysics Data System (ADS)

    Liu, D.; Allan, J. D.; Young, D. E.; Coe, H.; Beddows, D.; Fleming, Z. L.; Flynn, M. J.; Gallagher, M. W.; Harrison, R. M.; Lee, J.; Prevot, A. S. H.; Taylor, J. W.; Yin, J.; Williams, P. I.; Zotter, P.

    2014-09-01

    Black carbon aerosols (BC) at a London urban site were characterised in both winter- and summertime 2012 during the Clean Air for London (ClearfLo) project. Positive matrix factorisation (PMF) factors of organic aerosol mass spectra measured by a high-resolution aerosol mass spectrometer (HR-AMS) showed traffic-dominant sources in summer but in winter the influence of additional non-traffic sources became more important, mainly from solid fuel sources (SF). Measurements using a single particle soot photometer (SP2, DMT) showed the traffic-dominant BC exhibited an almost uniform BC core size (Dc) distribution with very thin coating thickness throughout the detectable range of Dc. However, the size distribution of Dc (project average mass median Dc = 149 ± 22 nm in winter, and 120 ± 6 nm in summer) and BC coating thickness varied significantly in winter. A novel methodology was developed to attribute the BC number concentrations and mass abundances from traffic (BCtr) and from SF (BCsf), by using a 2-D histogram of the particle optical properties as a function of BC core size, as measured by the SP2. The BCtr and BCsf showed distinctly different Dc distributions and coating thicknesses, with BCsf displaying larger Dc and larger coating thickness compared to BCtr. BC particles from different sources were also apportioned by applying a multiple linear regression between the total BC mass and each AMS-PMF factor (BC-AMS-PMF method), and also attributed by applying the absorption spectral dependence of carbonaceous aerosols to 7-wavelength Aethalometer measurements (Aethalometer method). Air masses that originated from westerly (W), southeasterly (SE), and easterly (E) sectors showed BCsf fractions that ranged from low to high, and whose mass median Dc values were 137 ± 10 nm, 143 ± 11 nm and 169 ± 29 nm, respectively. The corresponding bulk relative coating thickness of BC (coated particle size/BC core - Dp/Dc) for these same sectors was 1.28 ± 0.07, 1.45 ± 0
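The BC-AMS-PMF attribution step described above amounts to a multiple linear regression of total BC mass onto the PMF factor time series. A generic sketch with synthetic data (the factor names and BC-to-factor ratios are invented, not the campaign's values):

```python
import numpy as np

def apportion_bc(bc_total, factors):
    """Regress total BC mass on AMS-PMF factor time series.

    bc_total : (n_times,) measured BC mass concentration
    factors  : (n_times, n_factors) PMF factor time series (e.g. traffic, solid fuel)
    Returns per-factor coefficients and the time-resolved BC attributed to each factor.
    """
    X = np.asarray(factors, dtype=float)
    y = np.asarray(bc_total, dtype=float)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef, X * coef

# Synthetic example: two factors with assumed BC-to-factor ratios 0.20 and 0.05
rng = np.random.default_rng(3)
X = rng.uniform(0.0, 10.0, size=(500, 2))
y = X @ np.array([0.20, 0.05]) + rng.normal(0.0, 0.01, 500)
coef, contrib = apportion_bc(y, X)
```

The regression assumes each organic-aerosol factor carries a fixed BC-to-OA ratio, which is why it complements rather than replaces the SP2 2-D histogram method.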

  16. Size distribution, mixing state and source apportionments of black carbon aerosols in London during winter time

    NASA Astrophysics Data System (ADS)

    Liu, D.; Allan, J. D.; Young, D. E.; Coe, H.; Beddows, D.; Fleming, Z. L.; Flynn, M. J.; Gallagher, M. W.; Harrison, R. M.; Lee, J.; Prevot, A. S. H.; Taylor, J. W.; Yin, J.; Williams, P. I.; Zotter, P.

    2014-06-01

    Black carbon aerosols (BC) at a London urban site were characterized in both winter and summer time 2012 during the Clean Air for London (ClearfLo) project. Positive matrix factorization (PMF) factors of organic aerosol mass spectra measured by a high resolution aerosol mass spectrometer (HR-AMS) showed traffic-dominant sources in summer but in winter the influence of additional non-traffic sources became more important, mainly from solid fuel sources (SF). Measurements using a single particle soot photometer (SP2, DMT), showed the traffic-dominant BC exhibited an almost uniform BC core size (Dc) distribution with very thin coating thickness throughout the detectable range of Dc. However the size distribution of Dc (project average mass median Dc = 149 ± 22 nm in winter, and 120 ± 6 nm in summer) and BC coating thickness varied significantly in winter. A novel methodology was developed to attribute the BC number concentrations and mass abundances from traffic (BCtr) and from SF (BCsf), by using a 2-D histogram of the particle optical properties as a function of BC core size, as measured by the SP2. The BCtr and BCsf showed distinctly different Dc distributions and coating thicknesses, with BCsf displaying larger Dc and larger coating thickness compared to BCtr. BC particles from different sources were also apportioned by applying a multiple linear regression between the total BC mass and each AMS-PMF factor (BC-AMS-PMF method), and also attributed by applying the absorption spectral dependence of carbonaceous aerosols to 7-wavelength Aethalometer measurements (Aethalometer method). Air masses that originated from westerly (W), southeasterly (SE), or easterly (E) sectors showed BCsf fractions that ranged from low to high, and whose mass median Dc values were 137 ± 10 nm, 143 ± 11 nm, and 169 ± 29 nm respectively. The corresponding bulk relative coating thickness of BC (coated particle size / BC core - Dp / Dc) for these same sectors was 1.28 ± 0.07, 1.45 ± 0

  17. The impact of light source spectral power distribution on sky glow

    NASA Astrophysics Data System (ADS)

    Luginbuhl, Christian B.; Boley, Paul A.; Davis, Donald R.

    2014-05-01

    The effect of light source spectral power distribution on the visual brightness of anthropogenic sky glow is described. Under visual adaptation levels relevant to observing the night sky, namely with dark-adapted (scotopic) vision, blue-rich (“white”) sources produce a dramatically greater sky brightness than yellow-rich sources. High correlated color temperature LEDs and metal halide sources produce a visual brightness up to 8× brighter than low-pressure sodium and 3× brighter than high-pressure sodium when matched lumen-for-lumen and observed nearby. Though the sky brightness arising from blue-rich sources decreases more strongly with distance, the visual sky glow resulting from such sources remains significantly brighter than from yellow sources out to the limits of this study at 300 km.

  18. Trace elements in particulate matter from metropolitan regions of Northern China: Sources, concentrations and size distributions.

    PubMed

    Pan, Yuepeng; Tian, Shili; Li, Xingru; Sun, Ying; Li, Yi; Wentworth, Gregory R; Wang, Yuesi

    2015-12-15

    Public concerns over airborne trace elements (TEs) in metropolitan areas are increasing, but long-term and multi-site observations of size-resolved aerosol TEs in China are still lacking. Here, we identify highly elevated levels of atmospheric TEs in megacities and industrial sites in a Beijing-Tianjin-Hebei urban agglomeration relative to background areas, with the annual mean values of As, Pb, Ni, Cd and Mn exceeding the acceptable limits of the World Health Organization. Despite the spatial variability in concentrations, the size distribution pattern of each trace element was quite similar across the region. Crustal elements of Al and Fe were mainly found in coarse particles (2.1-9 μm), whereas the main fraction of toxic metals, such as Cu, Zn, As, Se, Cd and Pb, was found in submicron particles (<1.1 μm). These toxic metals were enriched by over 100-fold relative to the Earth's crust. The size distributions of Na, Mg, K, Ca, V, Cr, Mn, Ni, Mo and Ba were bimodal, with two peaks at 0.43-0.65 μm and 4.7-5.8 μm. The combination of the size distribution information, principal component analysis and air mass back trajectory model offered a robust technique for distinguishing the main sources for airborne TEs, e.g., soil dust, fossil fuel combustion and industrial emissions, at different sites. In addition, higher elemental concentrations coincided with westerly flow, indicating that polluted soil and fugitive dust were major sources of TEs on the regional scale. However, the contribution of coal burning, iron industry/oil combustion and non-ferrous smelters to atmospheric metal pollution in Northern China should be given more attention. Considering that the concentrations of heavy metals associated with fine particles in the target region were significantly higher than those in other Asian sites, the implementation of strict environmental standards in China is required to reduce the amounts of these hazardous pollutants released into the atmosphere.
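The ">100-fold" enrichment quoted above is computed with the standard crustal enrichment-factor formula, normalizing each element to a crustal reference element such as Al. The abundances below are illustrative round numbers, not the study's measured values:

```python
def enrichment_factor(elem_sample, ref_sample, elem_crust, ref_crust):
    """Crustal enrichment factor: EF = (X/Ref)_aerosol / (X/Ref)_crust.

    EF near 1 indicates a crustal (soil dust) origin; EF >> 1 indicates
    dominant anthropogenic sources. Units cancel within each ratio, so the
    sample and crust concentrations may be in different units.
    """
    return (elem_sample / ref_sample) / (elem_crust / ref_crust)

# Illustrative: Pb and Al in aerosol (ng/m3) vs. approximate crustal
# abundances (mg/kg)
ef_pb = enrichment_factor(elem_sample=100.0, ref_sample=1000.0,
                          elem_crust=17.0, ref_crust=80000.0)
```

An EF of several hundred, as in this example, is the kind of signature that rules out soil dust as the dominant source of a metal.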

  19. Spatial and Temporal Volatile Organic Compound Measurements in New England: Key Insight on Sources and Distributions

    NASA Astrophysics Data System (ADS)

    Sive, B. C.; White, M. L.; Russo, R. S.; Zhou, Y.; Ambrose, J. L.; Haase, K.; Mao, H.; Talbot, R. W.

    2010-12-01

    Volatile organic compounds (VOCs) in the atmosphere act as precursors in the formation of tropospheric ozone and their emissions and oxidation products can contribute to secondary organic aerosol formation and growth. In examining their effects on regional chemistry and pollution events, considerable uncertainties exist in our understanding of the relative contributions from different sources and classes of compounds as well as their transport from other regions. To quantitatively improve VOC emission estimates for New England, such as propane from widespread LPG leakage and biogenic emissions of toluene, we have conducted regional surveys for VOCs in northern New England to map their spatial variability. This has included repeated one-day quarterly sampling trips covering four 250 to 300 mile loops throughout Maine, New Hampshire, Massachusetts and small portions of Vermont, Connecticut, and Rhode Island and more intensive diurnal hourly-sampling at selected locations within the measurement area. Additionally, long-term VOC measurements from the AIRMAP atmospheric monitoring station at Thompson Farm in rural Durham, New Hampshire are utilized to characterize the mixing ratios, seasonal to interannual variability, and sources of VOCs in this region. The combination of the spatial and temporal data sets has provided key insight on the various sources (e.g., fossil fuel combustion, gasoline, LPG, fuel or solvent evaporation, industry, biogenic) and distributions of VOCs throughout New England. Nonmethane hydrocarbon emission rate estimates from the regional sampling campaigns and Thompson Farm ranged from ~109-1010 molecules cm-2 s-1. Moreover, emission rates based on our spatial and temporal measurements are compared with the (2002 and 2005) EPA National Emissions Inventory for the northeastern U.S. Finally, details regarding our analytical techniques and long-term calibration scales will be presented for measurements of C2-C10 nonmethane hydrocarbons, C1-C2 halocarbons

  20. The Integration of Renewable Energy Sources into Electric Power Distribution Systems, Vol. II Utility Case Assessments

    SciTech Connect

    Zaininger, H.W.

    1994-01-01

    Electric utility distribution system impacts associated with the integration of renewable energy sources such as photovoltaics (PV) and wind turbines (WT) are considered in this project. The impacts are expected to vary from site to site according to the following characteristics: the local solar insolation and/or wind characteristics, renewable energy source penetration level, whether battery or other energy storage systems are applied, and local utility distribution design standards and planning practices. Small, distributed renewable energy sources are connected to the utility distribution system like other, similar kW- and MW-scale equipment and loads. Residential applications are expected to be connected to single-phase 120/240-V secondaries. Larger kW-scale applications may be connected to three-phase secondaries, and larger hundred-kW and MW-scale applications, such as MW-scale wind farms or PV plants, may be connected to electric utility primary systems via customer-owned primary and secondary collection systems. In any case, the installation of small, distributed renewable energy sources is expected to have a significant impact on local utility distribution primary and secondary system economics. Small, distributed renewable energy sources installed on utility distribution systems will also produce nonsite-specific utility generation system benefits such as energy and capacity displacement benefits, in addition to the local site-specific distribution system benefits. Although generation system benefits are not site-specific, they are utility-specific, and they vary significantly among utilities in different regions. In addition, transmission system benefits, environmental benefits and other benefits may apply. These benefits also vary significantly among utilities and regions. Seven utility case studies considering PV, WT, and battery storage were conducted to identify a range of potential renewable energy source distribution system applications. The

  1. A Systematic Search for Short-term Variability of EGRET Sources

    NASA Technical Reports Server (NTRS)

    Wallace, P. M.; Bertsch, D. L.; Bloom, S. D.; Griffis, N. J.; Hunter, S. D.; Kniffen, D. A.; Thompson, D. J.

    1999-01-01

    The 3rd EGRET Catalog contains 170 unidentified high-energy (E>100 MeV) gamma-ray sources, and there is great interest in the nature of these sources. One means of determining source class is the study of flux variability on time scales of days; pulsars are believed to be stable on these scales while blazars are known to be highly variable. In addition, previous work has led to the discovery of 2CG 135+01 and GRO J1838-04, candidates for a new high-energy gamma-ray source class. These sources display transient behavior but cannot be associated with any known blazars. These considerations have led us to conduct a systematic search for short-term variability in EGRET data, covering all viewing periods through cycle 4. Three unidentified sources show some evidence of variability on short time scales; the source displaying the most convincing variability, 3EG J2006-2321, is not easily identified as a blazar.

  2. Voltage management of distribution networks with high penetration of distributed photovoltaic generation sources

    NASA Astrophysics Data System (ADS)

    Alyami, Saeed

    Installation of photovoltaic (PV) units could lead to great challenges to the existing electrical systems. Issues such as voltage rise, protection coordination, islanding detection, harmonics, increased or changed short-circuit levels, etc., need to be carefully addressed before we can see a wide adoption of this environmentally friendly technology. Voltage rise or overvoltage issues are of particular importance to be addressed for deploying more PV systems to distribution networks. This dissertation proposes a comprehensive solution to deal with the voltage violations in distribution networks, from controlling PV power outputs and electricity consumption of smart appliances in real time to optimal placement of PVs at the planning stage. The dissertation is composed of three parts: the literature review, the work that has already been done and the future research tasks. An overview on renewable energy generation and its challenges are given in Chapter 1. The overall literature survey, motivation and the scope of study are also outlined in the chapter. Detailed literature reviews are given in the rest of chapters. The overvoltage and undervoltage phenomena in typical distribution networks with integration of PVs are further explained in Chapter 2. Possible approaches for voltage quality control are also discussed in this chapter, followed by the discussion on the importance of the load management for PHEVs and appliances and its benefits to electric utilities and end users. A new real power capping method is presented in Chapter 3 to prevent overvoltage by adaptively setting the power caps for PV inverters in real time. The proposed method can maintain voltage profiles below a pre-set upper limit while maximizing the PV generation and fairly distributing the real power curtailments among all the PV systems in the network. As a result, each of the PV systems in the network has equal opportunity to generate electricity and shares the responsibility of voltage

  3. Evidence for bathymetric control on the distribution of body wave microseism sources from temporary seismic arrays in Africa

    NASA Astrophysics Data System (ADS)

    Euler, Garrett G.; Wiens, Douglas A.; Nyblade, Andrew A.

    2014-06-01

    Microseisms are the background seismic vibrations driven mostly by the interaction of ocean waves with the solid Earth. Locating the sources of microseisms improves our understanding of the range of conditions under which they are generated and has potential applications to seismic tomography and climate research. In this study, we detect persistent source locations of P-wave microseisms at periods of 5-10 s (0.1-0.2 Hz) using broad-band array noise correlation techniques and frequency-slowness analysis. Data include vertical-component records from four temporary seismic arrays in equatorial and southern Africa, comprising a total of 163 broad-band stations deployed over a span of 13 yr (1994-2007). While none of the arrays were deployed contemporaneously, we find that the recorded microseismic P waves originate from common, distant oceanic bathymetric features, with amplitudes that vary seasonally in proportion to extratropical cyclone activity. Our results show that the majority of the persistent microseismic P-wave source locations lie within the 30-60° latitude belts of the Northern and Southern hemispheres, while substantially fewer are found at lower latitudes. Variations in source location with frequency are also observed, indicating that tomographic studies incorporating microseismic body-wave sources will benefit from analysing multiple frequency bands. We show that the distribution of these source regions in the North Atlantic as well as in the Southern Ocean correlates with variations in bathymetry and ocean wave heights, corroborating current theory on double-frequency microseism generation. The stability of the source locations over the 13-yr span of our investigation suggests that the long-term body-wave microseism source distribution is governed by variations in bathymetry and ocean wave heights, while the interaction of ocean waves has a less apparent influence.

  4. Intensity distribution of the x ray source for the AXAF VETA-I mirror test

    NASA Technical Reports Server (NTRS)

    Zhao, Ping; Kellogg, Edwin M.; Schwartz, Daniel A.; Shao, Yibo; Fulton, M. Ann

    1992-01-01

    The X-ray generator for the AXAF VETA-I mirror test is an electron impact X-ray source with various anode materials. The source sizes of different anodes and their intensity distributions were measured with a pinhole camera before the VETA-I test. The pinhole camera consists of a 30 micrometer diameter pinhole for imaging the source and a Microchannel Plate Imaging Detector with 25 micrometer FWHM spatial resolution for detecting and recording the image. The camera has a magnification factor of 8.79, which enables measurement of the detailed spatial structure of the source. The spot size, the intensity distribution, and the flux level of each source were measured with different operating parameters. During the VETA-I test, microscope pictures were taken of each anode used immediately after it was brought out of the source chamber. The source sizes and the intensity distribution structures are clearly shown in the pictures; they agree with the results of the pinhole camera measurements. This paper presents the results of the above measurements. The results show that under operating conditions characteristic of the VETA-I test, all the source sizes have a FWHM of less than 0.45 mm. For a source of this size 528 meters away, the angular size seen by VETA is less than 0.17 arcsec, which is small compared to the on-ground VETA angular resolution (0.5 arcsec required, 0.22 arcsec measured). Even so, the results show the intensity distributions of the sources have complicated structures. These results were crucial for the VETA data analysis and for obtaining the on-ground and predicted in-orbit VETA Point Response Function.
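
    The angular-size figure quoted in the abstract follows from the small-angle approximation; a quick check of the arithmetic (the function name is ours, and only the 0.45 mm and 528 m values come from the abstract):

```python
import math

ARCSEC_PER_RAD = 3600.0 * 180.0 / math.pi   # about 206265

def angular_size_arcsec(extent_m, distance_m):
    """Small-angle approximation for the angle subtended by a source."""
    return extent_m / distance_m * ARCSEC_PER_RAD

# 0.45 mm FWHM source viewed from 528 m, as quoted in the abstract.
theta = angular_size_arcsec(0.45e-3, 528.0)
```

    This gives roughly 0.176 arcsec, consistent with the "less than 0.17 arcsec" quoted for sources just under 0.45 mm, and comfortably below the measured 0.22 arcsec resolution.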

  5. Characterizing short-term stability for Boolean networks over any distribution of transfer functions

    SciTech Connect

    Seshadhri, C.; Smith, Andrew M.; Vorobeychik, Yevgeniy; Mayo, Jackson R.; Armstrong, Robert C.

    2016-07-05

    Here we present a characterization of short-term stability of random Boolean networks under arbitrary distributions of transfer functions. Given any distribution of transfer functions for a random Boolean network, we present a formula that decides whether short-term chaos (damage spreading) will occur. We provide a formal proof for this formula and empirically show that its predictions are accurate. Previous work applies only to special cases of balanced families; it has been observed that such characterizations fail for unbalanced families, yet these families are widespread in real biological networks.

  7. Characterizing short-term stability for Boolean networks over any distribution of transfer functions

    NASA Astrophysics Data System (ADS)

    Seshadhri, C.; Smith, Andrew M.; Vorobeychik, Yevgeniy; Mayo, Jackson R.; Armstrong, Robert C.

    2016-07-01

    We present a characterization of short-term stability of Kauffman's NK (random) Boolean networks under arbitrary distributions of transfer functions. Given such a Boolean network where each transfer function is drawn from the same distribution, we present a formula that determines whether short-term chaos (damage spreading) will happen. Our main technical tool which enables the formal proof of this formula is the Fourier analysis of Boolean functions, which describes such functions as multilinear polynomials over the inputs. Numerical simulations on mixtures of threshold functions and nested canalyzing functions demonstrate the formula's correctness.

  8. Characterizing short-term stability for Boolean networks over any distribution of transfer functions.

    PubMed

    Seshadhri, C; Smith, Andrew M; Vorobeychik, Yevgeniy; Mayo, Jackson R; Armstrong, Robert C

    2016-07-01

    We present a characterization of short-term stability of Kauffman's NK (random) Boolean networks under arbitrary distributions of transfer functions. Given such a Boolean network where each transfer function is drawn from the same distribution, we present a formula that determines whether short-term chaos (damage spreading) will happen. Our main technical tool which enables the formal proof of this formula is the Fourier analysis of Boolean functions, which describes such functions as multilinear polynomials over the inputs. Numerical simulations on mixtures of threshold functions and nested canalyzing functions demonstrate the formula's correctness.
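
    The damage-spreading notion studied in these papers can be illustrated by direct simulation: run the same random Boolean network from a state and a one-bit perturbation of it, and track the Hamming distance. The sketch below uses uniformly random truth tables (the balanced p = 0.5 case); the papers' formula covers arbitrary function distributions, which would slot into `random_nk_network`. All parameter values here are illustrative.

```python
import random

def random_nk_network(n, k, rng):
    """Each node gets k random distinct inputs and a random truth table
    (the balanced case; other function distributions would plug in here)."""
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update: each node reads its inputs as a binary index."""
    new = []
    for node in range(len(state)):
        idx = 0
        for src in inputs[node]:
            idx = (idx << 1) | state[src]
        new.append(tables[node][idx])
    return new

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

rng = random.Random(0)
n, k, steps = 200, 3, 5
inputs, tables = random_nk_network(n, k, rng)
a = [rng.randint(0, 1) for _ in range(n)]
b = a[:]
b[0] ^= 1                               # flip one node: unit "damage"
for _ in range(steps):
    a, b = step(a, inputs, tables), step(b, inputs, tables)
d = hamming(a, b)                       # damage after a few steps
```

    Averaging `d` over many networks and initial states estimates the short-term damage growth that the papers' formula predicts analytically.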

  9. Prediction of short-term distributions of load extremes of offshore wind turbines

    NASA Astrophysics Data System (ADS)

    Wang, Ying-guang

    2016-12-01

    This paper proposes a new methodology to select an optimal threshold level to be used in the peak over threshold (POT) method for the prediction of short-term distributions of load extremes of offshore wind turbines. Such an optimal threshold level is found based on the estimation of the variance-to-mean ratio for the occurrence of peak values, which characterizes the Poisson assumption. A generalized Pareto distribution is then fitted to the extracted peaks over the optimal threshold level and the distribution parameters are estimated by the maximum spacing estimation method. This methodology is applied to estimate the short-term distributions of load extremes of the blade bending moment and the tower base bending moment at the mudline of a monopile-supported 5 MW offshore wind turbine as an example. The accuracy of the POT method using the optimal threshold level is shown to be better, in terms of the distribution fitting, than that of the POT methods using empirical threshold levels. The comparisons among the short-term extreme response values predicted by using the POT method with the optimal threshold levels and with the empirical threshold levels and by using direct simulation results further substantiate the validity of the proposed new methodology.

  10. Relative contribution of DNAPL dissolution and matrix diffusion to the long-term persistence of chlorinated solvent source zones

    NASA Astrophysics Data System (ADS)

    Seyedabbasi, Mir Ahmad; Newell, Charles J.; Adamson, David T.; Sale, Thomas C.

    2012-06-01

    The relative contribution of dense non-aqueous phase liquid (DNAPL) dissolution versus matrix diffusion processes to the longevity of chlorinated source zones was investigated. Matrix diffusion is being increasingly recognized as an important non-DNAPL component of source behavior over time, and understanding the persistence of contaminants that have diffused into lower permeability units can impact remedial decision-making. In this study, a hypothetical DNAPL source zone architecture consisting of several different sized pools and fingers originally developed by Anderson et al. (1992) was adapted to include defined low permeability layers. A coupled dissolution-diffusion model was developed to allow diffusion into these layers while in contact with DNAPL, followed by diffusion out of these same layers after complete DNAPL dissolution. This exercise was performed for releases of equivalent masses (675 kg) of three different compounds, including chlorinated solvents with solubilities ranging from low (tetrachloroethene (PCE)) and moderate (trichloroethene (TCE)) to high (dichloromethane (DCM)). The results of this simple modeling exercise demonstrate that matrix diffusion can be a critical component of source zone longevity, and at many sites may sustain the source (i.e., maintain concentrations above MCLs) for longer than DNAPL dissolution alone. For the hypothetical TCE release, the simulation indicated that dissolution of DNAPL would take approximately 38 years, while back diffusion from low permeability zones could maintain the source for an additional 83 years. This effect was even more dramatic for the higher solubility DCM (97% of longevity due to matrix diffusion), while the lower solubility PCE showed a more equal contribution from DNAPL dissolution vs. matrix diffusion. Several methods were used to describe the resulting source attenuation curves, including a first-order decay model which showed that half-life of
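
    A quick check of the arithmetic implied by the abstract's TCE numbers (38 years of dissolution plus 83 years of back diffusion); the helper name is ours:

```python
def matrix_diffusion_share(dissolution_years, back_diffusion_years):
    """Fraction of total source longevity attributable to back diffusion
    from low-permeability zones."""
    total = dissolution_years + back_diffusion_years
    return back_diffusion_years / total

tce_share = matrix_diffusion_share(38, 83)   # TCE case from the abstract
```

    For TCE the matrix-diffusion share works out to about 69% of the 121-year total, between the roughly equal split reported for PCE and the 97% reported for DCM.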

  11. Effect of tissue inhomogeneities on dose distributions from Cf-252 brachytherapy source.

    PubMed

    Ghassoun, J

    2013-01-01

    The Monte Carlo method was used to determine the effect of tissue inhomogeneities on dose distribution from a Cf-252 brachytherapy source. Neutron and gamma-ray fluences, energy spectra and dose rate distributions were determined in both homogenous and inhomogeneous phantoms. Simulations were performed using the MCNP5 code. Obtained results were compared with experimentally measured values published in literature. Results showed a significant change in neutron dose rate distributions in presence of heterogeneities. However, their effect on gamma rays dose distribution is minimal.

  12. Long-term monitoring of airborne nickel (Ni) pollution in association with some potential source processes in the urban environment.

    PubMed

    Kim, Ki-Hyun; Shon, Zang-Ho; Mauulida, Puteri T; Song, Sang-Keun

    2014-09-01

    The environmental behavior and pollution status of nickel (Ni) were investigated in seven major cities in Korea over a 13-year time span (1998-2010). The mean concentrations of Ni measured during the whole study period fell within the range of 3.71 ng m^-3 (Gwangju: GJ) to 12.6 ng m^-3 (Incheon: IC). Although Ni values showed good comparability on a relatively large spatial scale, values in most cities (6 out of 7) underwent moderate reductions over the study period. To assess the effect of major sources on the long-term distribution of Ni, the relationship between Ni concentrations and potential source processes such as non-road transportation (e.g., ship and aircraft emissions) was examined for cities with port and airport facilities. The potential impact of long-range transport of Asian dust particles in controlling Ni levels was also evaluated. The overall results suggest that Ni levels underwent gradual reductions over the study period irrespective of changes in such localized non-road source activities. The pollution of Ni at all the study sites remained well below the international threshold (Directive 2004/107/EC) value of 20 ng m^-3.

  13. Sources and distribution of late Pleistocene sand, northern Gulf of Mexico Shelf

    SciTech Connect

    Mazzullo, J.M.; Bates, C.; Reutter, D.; Withers, K.

    1985-02-01

    A completed 3-yr study of the sources and consequent distribution of late Pleistocene sand on the northern Gulf shelf clarifies paleogeography and alluvial identification. Techniques used to determine the sources of sand are: the Fourier technique (which differentiated sands from different source terranes on the basis of the shapes of quartz sand grains), mineralogic analysis (which identified the composition of the source terranes that contributed each quartz-shape type), and an evaluation of the source terranes drained by each of the southern US rivers (thereby linking each shape type to a particular river). These data and the mapped distribution of sand deposited on the shelf by each of these rivers during the late Pleistocene lowstand indicate distribution patterns have not been modified by modern shelf currents to any great extent, and thus record the late Pleistocene paleogeography of the shelf. These distributions show, among other things, the locations of the late Pleistocene alluvial valleys of each of the southern US rivers, and identify the sources of shelf-edge deltas off the coasts of Texas and Louisiana that were detected by shallow seismic analysis.

  14. Splitting the Source Term for the Einstein Equation to Classical and Quantum Parts

    NASA Astrophysics Data System (ADS)

    Biró, T. S.; Ván, P.

    2015-11-01

    We consider the special and general relativistic extensions of the action principle behind the Schrödinger equation, distinguishing classical and quantum contributions. Postulating a particular quantum correction to the source term in the classical Einstein equation, we identify the conformal content of the above action and obtain classical gravitation for massive particles, but with a cosmological term representing the off-mass-shell contribution to the energy-momentum tensor. In this scenario the cosmological constant, surprisingly small on the Planck scale, stems from quantum bound states (gravonium) with Bohr radius a, giving Λ = 3/a^2.

  15. The integration of renewable energy sources into electric power distribution systems. Volume 2, Utility case assessments

    SciTech Connect

    Zaininger, H.W.; Ellis, P.R.; Schaefer, J.C.

    1994-06-01

    Electric utility distribution system impacts associated with the integration of renewable energy sources such as photovoltaics (PV) and wind turbines (WT) are considered in this project. The impacts are expected to vary from site to site according to the following characteristics: (1) The local solar insolation and/or wind characteristics; (2) renewable energy source penetration level; (3) whether battery or other energy storage systems are applied; and (4) local utility distribution design standards and planning practices. Small, distributed renewable energy sources are connected to the utility distribution system like other, similar kW- and MW-scale equipment and loads. Residential applications are expected to be connected to single-phase 120/240-V secondaries. Larger kw-scale applications may be connected to three-phase secondaries, and larger hundred-kW and MW-scale applications, such as MW-scale windfarms or PV plants, may be connected to electric utility primary systems via customer-owned primary and secondary collection systems. Small, distributed renewable energy sources installed on utility distribution systems will also produce nonsite-specific utility generation system benefits such as energy and capacity displacement benefits, in addition to the local site-specific distribution system benefits. Although generation system benefits are not site-specific, they are utility-specific, and they vary significantly among utilities in different regions. In addition, transmission system benefits, environmental benefits and other benefits may apply. These benefits also vary significantly among utilities and regions. Seven utility case studies considering PV, WT, and battery storage were conducted to identify a range of potential renewable energy source distribution system applications.

  16. Particulate air pollution in six Asian cities: Spatial and temporal distributions, and associated sources

    NASA Astrophysics Data System (ADS)

    Kim Oanh, N. T.; Upadhyay, N.; Zhuang, Y.-H.; Hao, Z.-P.; Murthy, D. V. S.; Lestari, P.; Villarin, J. T.; Chengchua, K.; Co, H. X.; Dung, N. T.; Lindgren, E. S.

    A monitoring program for particulate matter pollution was designed and implemented in six Asian cities/metropolitan regions including Bandung, Bangkok, Beijing, Chennai, Manila, and Hanoi, within the framework of the Asian regional air pollution research network (AIRPET), coordinated by the Asian Institute of Technology. Methodologies were kept as uniform as possible, with an established QA/QC procedure, to produce reliable and comparable data across the network. The monsoon effects and seasonal changes in the sources/activities require long-term monitoring to understand the nature of air pollution in the cities. During phase 1 (2001-2004) of the AIRPET, around 3000 fine and coarse particulate matter samples were collected from characteristic urban sites, which provide insight into temporal and spatial variations of PM in the cities. In all six cities, the levels of PM10 and PM2.5 were found to be high, especially during the dry season, and frequently exceeded the corresponding 24 h US EPA standards at a number of sites. The average concentrations of PM2.5 and PM10 in the cities ranged, respectively, over 44-168 and 54-262 μg m^-3 in the dry season, and 18-104 and 33-180 μg m^-3 in the wet season. The spatial and temporal distribution of PM in each city, the ratios of PM2.5 to PM10, and the reconstructed mass were presented, providing useful information on possible PM sources in the cities. The findings help to understand the nature of particulate matter air pollution problems in the selected cities/metropolitan regions.

  17. Effect of asymmetry of the radio source distribution on the apparent proper motion kinematic analysis

    NASA Astrophysics Data System (ADS)

    Titov, O.; Malkin, Z.

    2009-11-01

    Context: Information on physical characteristics of astrometric radio sources, such as magnitude and redshift, is of great importance for many astronomical studies. However, the data usually used in radio astrometry are often incomplete and outdated. Aims: Our purpose is to study the optical characteristics of more than 4000 radio sources observed by the astrometric VLBI technique since 1979. We also studied the effect of the asymmetry in the distribution of the reference radio sources on the correlation matrices between vector spherical harmonics of the first and second degrees. Methods: The radio source characteristics were mainly taken from the NASA/IPAC Extragalactic Database (NED). Characteristics of the gravitational lenses were checked with the CfA-Arizona Space Telescope LEns Survey. The SIMBAD and HyperLeda databases were also used to clarify the characteristics of some objects. We also simulated and investigated a list of 4000 radio sources evenly distributed over the celestial sphere. We estimated the correlation matrices between the vector spherical harmonics using both the real and the modelled distributions of the radio sources. Results: A new list OCARS (optical characteristics of astrometric radio sources) of 4261 sources has been compiled. Comparison of our optical characteristics with the official International Earth Rotation and Reference Systems Service (IERS) list showed significant discrepancies for about half of the 667 common sources. Finally, we found that asymmetry in the radio source distribution between hemispheres could cause significant correlation between the vector spherical harmonics, especially in the case of sparse distribution of the sources with high redshift. We also identified radio sources with many years of observation history but no measured redshift. These sources should be urgently observed with large optical telescopes. Conclusions: The list of optical characteristics created in this paper is recommended for use as a

  18. On the application of subcell resolution to conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Chang, Shih-Hung

    1989-01-01

    LeVeque and Yee recently investigated a one-dimensional scalar conservation law with stiff source terms modeling reacting flow problems and discovered that, for the very stiff case, most of the current finite difference methods developed for non-reacting flows produce wrong solutions when there is a propagating discontinuity. A numerical scheme, essentially nonoscillatory/subcell resolution - characteristic direction (ENO/SRCD), is proposed for solving conservation laws with stiff source terms. This scheme is a modification of Harten's ENO scheme with subcell resolution, ENO/SR. The locations of the discontinuities and the characteristic directions are essential in the design. Strang's time-splitting method is used and time evolutions are done by advancing along the characteristics. Numerical experiments using this scheme show excellent results on the model problem of LeVeque and Yee. Comparisons of the results of ENO, ENO/SR, and ENO/SRCD are also presented.
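
    The time-splitting structure can be sketched on the LeVeque-Yee model problem u_t + u_x = -mu*u*(u-1)*(u-1/2). The sketch below uses Strang splitting as in the abstract, but with plain first-order upwind for the transport step rather than the paper's ENO/SRCD scheme, and sub-stepped explicit Euler for the stiff source; all grid and stiffness parameters are illustrative.

```python
def upwind_advect(u, cfl):
    """One explicit upwind step for u_t + u_x = 0 (cfl = dt/dx <= 1), periodic."""
    return [u[i] - cfl * (u[i] - u[i - 1]) for i in range(len(u))]

def react(u, mu, dt, substeps=50):
    """Integrate the stiff source u_t = -mu*u*(u-1)*(u-0.5) with small
    explicit Euler substeps (a stand-in for an exact or implicit solve)."""
    h = dt / substeps
    for _ in range(substeps):
        u = [v - h * mu * v * (v - 1.0) * (v - 0.5) for v in u]
    return u

# Model problem: step initial data, stiffness mu, Strang splitting in time.
nx, cfl, mu = 200, 0.8, 100.0
dx = 1.0 / nx
dt = cfl * dx
u = [1.0 if (i + 0.5) * dx < 0.3 else 0.0 for i in range(nx)]
for _ in range(100):
    u = react(u, mu, 0.5 * dt)   # half source step,
    u = upwind_advect(u, cfl)    # full advection step,
    u = react(u, mu, 0.5 * dt)   # half source step.
```

    With very stiff mu this simple combination reproduces the pathology LeVeque and Yee describe: the source collapses the smeared front onto grid points, so the discontinuity can propagate at the wrong speed even though the solution stays crisp and bounded. Resolving that is precisely what the subcell-resolution and characteristic-direction machinery is for.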

  19. Lattice Boltzmann method for n-dimensional nonlinear hyperbolic conservation laws with the source term.

    PubMed

    Wang, Zhenghua; Shi, Baochang; Xiang, Xiuqiao; Chai, Zhenhua; Lu, Jianhua

    2011-03-01

    It is important for nonlinear hyperbolic conservation laws (NHCL) to have a simulation scheme with high order accuracy, simple computation, and non-oscillatory character. In this paper, a unified and novel lattice Boltzmann model is presented for solving n-dimensional NHCL with the source term. By introducing the high order source term of the explicit lattice Boltzmann method (LBM) and an optimum dimensionless relaxation time adapted to the specific problem, the effects of space and time resolutions on the accuracy and stability of the model are investigated for different problems in one to three dimensions. Both the theoretical analysis and numerical simulation validate that the results of the proposed LBM have second-order accuracy in both space and time, which agree well with the analytical solutions.
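
    The basic ingredients of a lattice Boltzmann scheme with a source term can be shown in one dimension. The sketch below is not the paper's model: it is a minimal D1Q2 BGK scheme with relaxation time tau = 1 for the linear problem u_t + a*u_x = -lam*u, with the source split evenly between the two streaming populations; all names and parameters are our own.

```python
import math

def lbm_advect_decay(u, a, c, lam, dt, steps):
    """D1Q2 lattice BGK with tau = 1 for u_t + a*u_x = -lam*u, periodic domain.
    Two populations stream at +c and -c; splitting the source evenly between
    them makes total mass decay by exactly (1 - lam*dt) per step."""
    n = len(u)
    alpha = a / c                      # stability requires |alpha| <= 1
    for _ in range(steps):
        fp = [0.5 * v * (1.0 + alpha) for v in u]   # equilibrium, streams right
        fm = [0.5 * v * (1.0 - alpha) for v in u]   # equilibrium, streams left
        src = [-lam * v for v in u]
        u = [fp[i - 1] + fm[(i + 1) % n]
             + 0.5 * dt * (src[i - 1] + src[(i + 1) % n]) for i in range(n)]
    return u

n = 100
dx = 1.0 / n
dt = dx                                # lattice speed c = dx/dt = 1
u0 = [math.exp(-100.0 * ((i + 0.5) * dx - 0.5) ** 2) for i in range(n)]
u = lbm_advect_decay(u0, a=0.5, c=1.0, lam=1.0, dt=dt, steps=50)
```

    With tau = 1 the update collapses to a Lax-Friedrichs-type scheme; the paper's contribution lies in the higher-order source treatment and tuned relaxation time that recover second-order accuracy, which this sketch does not attempt.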

  20. Binary Source Microlensing Event OGLE-2016-BLG-0733: Interpretation of a Long-Term Asymmetric Perturbation

    NASA Technical Reports Server (NTRS)

    Jung, Y. K.; Udalski, A.; Yee, J. C.; Sumi, T.; Gould, A.; Han, C.; Albrow, M. D.; Lee, C.-U.; Bennett, D. P.; Suzuki, D.

    2017-01-01

    In the process of analyzing an observed light curve, one often confronts various scenarios that can mimic the planetary signals causing difficulties in the accurate interpretation of the lens system. In this paper, we present the analysis of the microlensing event OGLE-2016-BLG-0733. The light curve of the event shows a long-term asymmetric perturbation that would appear to be due to a planet. From the detailed modeling of the lensing light curve, however, we find that the perturbation originates from the binarity of the source rather than the lens. This result demonstrates that binary sources with roughly equal-luminosity components can mimic long-term perturbations induced by planets with projected separations near the Einstein ring. The result also represents the importance of the consideration of various interpretations in planet-like perturbations and of high-cadence observations for ensuring the unambiguous detection of the planet.

  1. Binary Source Microlensing Event OGLE-2016-BLG-0733: Interpretation of a Long-term Asymmetric Perturbation

    NASA Astrophysics Data System (ADS)

    Jung, Y. K.; Udalski, A.; Yee, J. C.; Sumi, T.; Gould, A.; Han, C.; Albrow, M. D.; Lee, C.-U.; Kim, S.-L.; Chung, S.-J.; Hwang, K.-H.; Ryu, Y.-H.; Shin, I.-G.; Zhu, W.; Cha, S.-M.; Kim, D.-J.; Lee, Y.; Park, B.-G.; Pogge, R. W.; KMTNet Collaboration; Pietrukowicz, P.; Kozłowski, S.; Poleski, R.; Skowron, J.; Mróz, P.; Szymański, M. K.; Soszyński, I.; Pawlak, M.; Ulaczyk, K.; OGLE Collaboration; Abe, F.; Bennett, D. P.; Barry, R.; Bond, I. A.; Asakura, Y.; Bhattacharya, A.; Donachie, M.; Freeman, M.; Fukui, A.; Hirao, Y.; Itow, Y.; Koshimoto, N.; Li, M. C. A.; Ling, C. H.; Masuda, K.; Matsubara, Y.; Muraki, Y.; Nagakane, M.; Oyokawa, H.; Rattenbury, N. J.; Sharan, A.; Sullivan, D. J.; Suzuki, D.; Tristram, P. J.; Yamada, T.; Yamada, T.; Yonehara, A.; MOA Collaboration

    2017-03-01

    In the process of analyzing an observed light curve, one often confronts various scenarios that can mimic the planetary signals causing difficulties in the accurate interpretation of the lens system. In this paper, we present the analysis of the microlensing event OGLE-2016-BLG-0733. The light curve of the event shows a long-term asymmetric perturbation that would appear to be due to a planet. From the detailed modeling of the lensing light curve, however, we find that the perturbation originates from the binarity of the source rather than the lens. This result demonstrates that binary sources with roughly equal-luminosity components can mimic long-term perturbations induced by planets with projected separations near the Einstein ring. The result also represents the importance of the consideration of various interpretations in planet-like perturbations and of high-cadence observations for ensuring the unambiguous detection of the planet.

  2. The application of inverse methods to spatially-distributed acoustic sources

    NASA Astrophysics Data System (ADS)

    Holland, K. R.; Nelson, P. A.

    2013-10-01

    Acoustic inverse methods, based on the output of an array of microphones, can be readily applied to the characterisation of acoustic sources that can be adequately modelled as a number of discrete monopoles. However, there are many situations, particularly in the fields of vibroacoustics and aeroacoustics, where the sources are distributed continuously in space over a finite area (or volume). This paper is concerned with the practical problem of applying inverse methods to such distributed source regions via the process of spatial sampling. The problem is first tackled using computer simulations of the errors associated with the application of spatial sampling to a wide range of source distributions. It is found that the spatial sampling criterion for minimising the errors in the radiated far-field reconstructed from the discretised source distributions is strongly dependent on acoustic wavelength but only weakly dependent on the details of the source field itself. The results of the computer simulations are verified experimentally through the application of the inverse method to the sound field radiated by a ducted fan. The un-baffled fan source with the associated flow field is modelled as a set of equivalent monopole sources positioned on the baffled duct exit, along with a matrix of complementary non-flow Green functions. Successful application of the spatial sampling criterion involves careful frequency-dependent selection of source spacing, and results in the accurate reconstruction of the radiated sound field. Discussions of the conditioning of the Green function matrix which is inverted are included, and it is shown that the spatial sampling criterion may be relaxed if conditioning techniques, such as regularisation, are applied to this matrix prior to inversion.
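
    The regularised inversion mentioned at the end of this abstract can be sketched as Tikhonov regularisation of the Green function matrix. This is a real-valued sketch under our own naming; acoustic Green functions are complex-valued in practice, and the paper's actual conditioning procedure may differ.

```python
def tikhonov(G, p, beta):
    """Minimise ||G q - p||^2 + beta ||q||^2 via the normal equations
    (G^T G + beta I) q = G^T p, solved with Gaussian elimination and
    partial pivoting. G maps source strengths q to microphone pressures p."""
    m, n = len(G), len(G[0])
    A = [[sum(G[k][i] * G[k][j] for k in range(m))
          + (beta if i == j else 0.0) for j in range(n)] for i in range(n)]
    b = [sum(G[k][i] * p[k] for k in range(m)) for i in range(n)]
    for col in range(n):                       # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    q = [0.0] * n
    for i in range(n - 1, -1, -1):             # back substitution
        q[i] = (b[i] - sum(A[i][j] * q[j] for j in range(i + 1, n))) / A[i][i]
    return q
```

    Increasing `beta` trades fidelity to the measured pressures for a smaller, better-conditioned solution, which is why regularisation lets the spatial sampling criterion be relaxed.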

  3. Analytical source term optimization for radioactive releases with approximate knowledge of nuclide ratios

    NASA Astrophysics Data System (ADS)

    Hofman, Radek; Seibert, Petra; Kovalets, Ivan; Andronopoulos, Spyros

    2015-04-01

    We are concerned with source term retrieval in the case of an accident in a nuclear power plant with off-site consequences. The goal is to optimize atmospheric dispersion model inputs using inverse modeling of gamma dose rate measurements (instantaneous or time-integrated). These are the most abundant type of measurements provided by various radiation monitoring networks across Europe and are available continuously in near-real time. Usually, the source term of an accidental release comprises a mixture of nuclides. Unfortunately, gamma dose rate measurements do not provide direct information on the source term composition; however, physical properties of the respective nuclides (deposition properties, decay half-life) can yield some insight. In the method presented, we assume that nuclide ratios are known at least approximately, e.g. from nuclide-specific observations or reactor inventory and assumptions on the accident type. The source term can be in multiple phases, each characterized by constant nuclide ratios. The method is an extension of a well-established source term inversion approach based on the optimization of an objective function (minimization of a cost function). This function has two quadratic terms: the mismatch between model and measurements weighted by an observation error covariance matrix, and the deviation of the solution from a first guess weighted by the first-guess error covariance matrix. For simplicity, both error covariance matrices are approximated as diagonal. Analytical minimization of the cost function leads to a linear system of equations. Possible negative parts of the solution are iteratively removed by means of first-guess error variance reduction. Nuclide ratios enter the problem in the form of additional linear equations, where the deviations from prescribed ratios are weighted by factors; the corresponding error variance allows us to control how strongly we want to impose the prescribed ratios. This introduces some freedom into the
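
    The analytical minimization described here has a standard closed form. The sketch below implements it for diagonal error covariances as in the abstract; the function names, toy numbers, and the omission of the iterative negativity removal are all our own simplifications.

```python
def solve_spd(A, b):
    """Solve A x = b for a symmetric positive definite A (no pivoting needed)."""
    n = len(b)
    A = [row[:] for row in A]
    b = b[:]
    for col in range(n):
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

def analytic_source_estimate(H, r_var, b_var, y, xb):
    """Minimiser of J(x) = (y-Hx)^T R^-1 (y-Hx) + (x-xb)^T B^-1 (x-xb)
    for diagonal R = diag(r_var), B = diag(b_var), via the normal equations
    (H^T R^-1 H + B^-1) x = H^T R^-1 y + B^-1 xb."""
    m, n = len(H), len(H[0])
    A = [[sum(H[k][i] * H[k][j] / r_var[k] for k in range(m))
          + (1.0 / b_var[i] if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    rhs = [sum(H[k][i] * y[k] / r_var[k] for k in range(m)) + xb[i] / b_var[i]
           for i in range(n)]
    return solve_spd(A, rhs)

# Toy problem: 3 dose-rate observations, 2 release-rate unknowns, weak prior.
x = analytic_source_estimate(
    H=[[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]],
    r_var=[1.0, 1.0, 1.0], b_var=[1e6, 1e6],
    y=[1.0, 2.0, 3.0], xb=[0.0, 0.0])
```

    The prescribed nuclide-ratio constraints would be appended as extra weighted rows of `H` and `y`, and any negative components would then be handled by the abstract's iterative first-guess variance reduction, which is omitted here.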

  4. A numerical method to solve the Stokes problem with a punctual force in source term

    NASA Astrophysics Data System (ADS)

    Lacouture, Loïc

    2015-03-01

    The aim of this note is to present a numerical method to solve the Stokes problem in a bounded domain with a Dirac source term, which preserves optimality for any approximation order by the finite-element method. It is based on the knowledge of a fundamental solution to the associated operator over the whole space. This method is motivated by the modeling of the movement of active thin structures in a viscous fluid.

  5. Implementation of New Turbulence Spectra in the Lighthill Analogy Source Terms

    NASA Technical Reports Server (NTRS)

    Woodruff, S. L.; Seiner, J. M.; Hussaini, M. Y.; Erlebacher, G.

    2000-01-01

    The industry-standard MGB approach to predicting the noise generated by a given aerodynamic flow field requires that the turbulence velocity correlation be specified so that the source terms in the Lighthill acoustic analogy may be computed. The velocity correlation traditionally used in MGB computations is inconsistent with a number of basic qualitative properties of turbulent flows. In the present investigation, the effect on noise prediction of using two alternative velocity correlations is examined.

  6. Final report on shipping-cask sabotage source-term investigation

    SciTech Connect

    Schmidt, E W; Walters, M A; Trott, B D; Gieseke, J A

    1982-10-01

    A need existed to estimate the source term resulting from a sabotage attack on a spent nuclear fuel shipping cask. An experimental program sponsored by the US NRC and conducted at Battelle's Columbus Laboratories was designed to meet that need. In the program a precision shaped charge was fired through a subscale model cask loaded with segments of spent PWR fuel rods and the radioactive material released was analyzed. This report describes these experiments and presents their results.

  7. Influence of containment spray systems on the source term behavior of VVER-1000-type reactors

    SciTech Connect

    Sdouz, G. )

    1993-01-01

    In Austria a research program to investigate the source term behavior of VVER-type reactors is still going on. The first two generations of VVER-type reactors were designed for 440-MW(electric) power. The next generation with 1000-MW(electric) power is known as the VVER-1000. These reactors have four loops without isolation valves, horizontal steam generators, and hexagonal fuel assemblies. In addition to the first two generations, this type has a containment structure with spray-type steam suppression. The three spray systems work autonomously with a special power supply for each system. The purpose of the containment spray system is to control the pressure within the containment by cooling and condensing steam from the atmosphere and to remove airborne aerosols. To investigate the source term behavior of VVER-type reactors, Austria acquired the Source Term Code Package (STCP) and started the program investigating a TMLB and an S1B accident sequence. In the next step, a calculation of the TMLB sequence with working spray systems and emergency core coolant (ECC) recirculation was performed. This paper describes the results of the calculation, the comparison with the calculation without spray, and the implications for the accident management of VVER-1000-type reactors.

  8. Conditioning and long-term storage of spent radium sources in Turkey.

    PubMed

    Osmanlioglu, Ahmet Erdal

    2006-06-30

    Conditioning of radium sources is required before long-term interim storage to avoid the release of radioactive material and to limit radiation exposure. In this study, containment of the radium sources was achieved by a high-integrity encapsulation designed to control the radon emanation problem. The capsules were made of Type 316 austenitic stainless steel, 22 mm in diameter and 160 mm in height. The gas pressures caused by encapsulation of different amounts of (226)Ra were determined; the maximum gas pressure was found to be 10 atm for 900 mCi of (226)Ra in one capsule at 20 degrees C. A lead shielding device was designed to limit radiation exposure. A 200 l drum was used as the conditioned waste package for the radium sources and represents a Type A package under the IAEA transport regulations.

  9. A General Model for Preferential and Triadic Choice in Terms of Central F Distribution Functions.

    ERIC Educational Resources Information Center

    Ennis, Daniel M; Johnson, Norman L.

    1994-01-01

    A model for preferential and triadic choice is derived in terms of weighted sums of central F distribution functions. It is a probabilistic generalization of Coombs' (1964) unfolding model from which special cases can be derived easily. This model for binary choice can be easily related to preference ratio judgments. (SLD)

  10. Low-level waste disposal performance assessments - Total source-term analysis

    SciTech Connect

    Wilhite, E.L.

    1995-12-31

    Disposal of low-level radioactive waste at Department of Energy (DOE) facilities is regulated by DOE. DOE Order 5820.2A establishes policies, guidelines, and minimum requirements for managing radioactive waste. Requirements for disposal of low-level waste emplaced after September 1988 include providing reasonable assurance of meeting stated performance objectives by completing a radiological performance assessment. Recently, the Defense Nuclear Facilities Safety Board issued Recommendation 94-2, "Conformance with Safety Standards at Department of Energy Low-Level Nuclear Waste and Disposal Sites." One of the elements of the recommendation is that low-level waste performance assessments do not include the entire source term because low-level waste emplaced prior to September 1988, as well as other DOE sources of radioactivity in the ground, are excluded. DOE has developed and issued guidance for preliminary assessments of the impact of including the total source term in performance assessments. This paper will present issues resulting from the inclusion of all DOE sources of radioactivity in performance assessments of low-level waste disposal facilities.

  11. Using sediment particle size distribution to evaluate sediment sources in the Tobacco Creek Watershed

    NASA Astrophysics Data System (ADS)

    Liu, Cenwei; Lobb, David; Li, Sheng; Owens, Philip; Kuzyk, ZouZou

    2014-05-01

    Lake Winnipeg has recently drawn attention because of its deteriorated water quality, due in part to nutrient and sediment input from agricultural land. Improving water quality in Lake Winnipeg requires knowledge of the sediment sources within this ecosystem. A variety of environmental fingerprinting techniques have been successfully used in the assessment of sediment sources. In this study, we used particle size distribution to evaluate spatial and temporal variations of suspended sediment and potential sediment sources collected in the Tobacco Creek Watershed in Manitoba, Canada. The particle size distribution of suspended sediment can reflect the origin of the sediment and the processes acting during sediment transport, deposition and remobilization within the watershed. The objectives of this study were to quantify visually observed spatial and temporal changes in sediment particles, and to assess the sediment sources using a rapid and cost-effective fingerprinting technique based on particle size distribution. The suspended sediment was collected in sediment traps twice a year, during rainfall and snowmelt periods, from 2009 to 2012. The potential sediment sources included the top soil of cultivated fields, riparian areas and entire profiles from stream banks. Suspended sediment and soil samples were pre-wetted with RO water and sieved through a 600 μm sieve before analysis. The particle size distribution of all samples was determined using a Malvern Mastersizer 2000S laser diffraction instrument with a measurement range of up to 600 μm. Comparison of the results for different fractions of sediment showed a significant difference in the particle size distribution of suspended sediment between snowmelt and rainfall events. An important difference in particle size distribution was also found between the cultivated soil and the forest soil. This difference can be explained by the different land uses, which provided a distinct fingerprint of the sediment. An overall improvement in water quality can be achieved by

  12. Long-Term Probability Distribution of Wind Turbine Planetary Bearing Loads (Poster)

    SciTech Connect

    Jiang, Z.; Xing, Y.; Guo, Y.; Dong, W.; Moan, T.; Gao, Z.

    2013-04-01

    Among the various causes of bearing damage and failure, metal fatigue of the rolling contact surface is the dominant failure mechanism. The fatigue life is associated with the load conditions under which wind turbines operate in the field. Therefore, it is important to understand the long-term distribution of the bearing loads under various environmental conditions. The National Renewable Energy Laboratory's 750-kW Gearbox Reliability Collaborative wind turbine is studied in this work. A decoupled analysis using several computer codes is carried out. The global aero-elastic simulations are performed using HAWC2. The time series of the drivetrain loads and motions from the global dynamic analysis are fed to a drivetrain model in SIMPACK. The time-varying internal pressure distribution along the raceway is obtained analytically. A series of probability distribution functions are then used to fit the long-term statistical distribution at different locations along the raceways. The long-term distribution of the bearing raceway loads is estimated under different environmental conditions. Finally, the bearing fatigue lives are calculated.

  13. Multiple concurrent sources localization based on a two-node distributed acoustic sensor network

    NASA Astrophysics Data System (ADS)

    Xu, Jiaxin; Zhao, Zhao; Chen, Chunzeng; Xu, Zhiyong

    2017-01-01

    In this work, we propose a new approach to localizing multiple concurrent sources using a distributed acoustic sensor network. Only two node-arrays are required in this sensor network, and each node-array consists of only two widely spaced sensors. First, the direction-of-arrivals (DOAs) of multiple sources are estimated at each node-array by utilizing a new pooled angular spectrum proposed in this paper, which suppresses spatial aliasing effectively. Based on minimum variance distortionless response (MVDR) beamforming and the DOA estimates of the sources, the time-frequency spectra containing the corresponding energy distribution features associated with those sources are reconstructed in each node-array. Then, the scale invariant feature transform (SIFT) is employed to solve the DOA association problem. Performance evaluation is conducted with field recordings, and the experimental results demonstrate the effectiveness and feasibility of the proposed method.

  14. Measurements of the distribution of sound source intensities in turbulent jets.

    NASA Technical Reports Server (NTRS)

    Grosche, F.-R.; Jones, J. H.; Wilhold, G. A.

    1973-01-01

    The spatial distribution of sound source intensities in jets is determined from the sound radiated into the acoustic far field by means of a concave mirror-microphone system. The mirror forms an image of the sound sources in a region far enough from the jet so that near field pressure fluctuations can be neglected. The sound intensity in the image is thus closely related to the strength of the actual sound sources in the jet. Results of measurements with jets of Mach numbers 0.7 to 1.9 emanating from circular nozzles and from slot nozzle-flap combinations demonstrate the influence of Mach number and of other parameters upon the sound source distribution.

  15. Stability metrics for multi-source biomedical data based on simplicial projections from probability distribution distances.

    PubMed

    Sáez, Carlos; Robles, Montserrat; García-Gómez, Juan M

    2017-02-01

    Biomedical data may be composed of individuals generated from distinct, meaningful sources. Due to possible contextual biases in the processes that generate data, there may exist an undesirable and unexpected variability among the probability distribution functions (PDFs) of the source subsamples, which, when uncontrolled, may lead to inaccurate or unreproducible research results. Classical statistical methods may have difficulty uncovering such variability when dealing with multi-modal, multi-type, multi-variate data. This work proposes two metrics for the analysis of stability among multiple data sources, robust to the aforementioned conditions, and defined in the context of data quality assessment: a global probabilistic deviation and a source probabilistic outlyingness. The first provides a bounded degree of the global multi-source variability, designed as an estimator equivalent to the notion of normalized standard deviation of PDFs. The second provides a bounded degree of the dissimilarity of each source to a latent central distribution. The metrics are based on the projection of a simplex geometrical structure constructed from the Jensen-Shannon distances among the sources' PDFs. The metrics were evaluated and demonstrated correct behaviour on a simulated benchmark and with real multi-source biomedical data using the UCI Heart Disease data set. Biomedical data quality assessment based on the proposed stability metrics may improve the efficiency and effectiveness of biomedical data exploitation and research.
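    A simplified sketch of the distance computations underlying such metrics (the pairwise Jensen-Shannon distances among discrete source PDFs; this is not the authors' simplex-projection estimator itself):

```python
import numpy as np

def js_distance(p, q):
    """Jensen-Shannon distance (square root of the JS divergence, base 2),
    bounded in [0, 1] for probability vectors on a shared support."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0                     # 0 * log(0) contributes nothing
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

def pairwise_js(sources):
    """Symmetric matrix of JS distances among the source PDFs."""
    n = len(sources)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = js_distance(sources[i], sources[j])
    return D

# Three discrete source distributions over the same support.
srcs = [np.array([0.5, 0.3, 0.2]),
        np.array([0.5, 0.3, 0.2]),   # identical to the first source
        np.array([0.1, 0.2, 0.7])]   # a dissimilar source
D = pairwise_js(srcs)
```

    From such a matrix, an outlying source shows up as a row of uniformly large distances, which is the intuition behind the source outlyingness metric.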

  16. Source coding with escort distributions and Rényi entropy bounds

    NASA Astrophysics Data System (ADS)

    Bercher, J.-F.

    2009-08-01

    We discuss the interest of escort distributions and Rényi entropy in the context of source coding. We first recall a source coding theorem by Campbell relating a generalized measure of length to the Rényi-Tsallis entropy. We show that the associated optimal codes can be obtained using considerations on escort-distributions. We propose a new family of measure of length involving escort-distributions and we show that these generalized lengths are also bounded below by the Rényi entropy. Furthermore, we obtain that the standard Shannon codes lengths are optimum for the new generalized lengths measures, whatever the entropic index. Finally, we show that there exists in this setting an interplay between standard and escort distributions.
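    The two central objects of the abstract can be sketched as follows (base-2 conventions and the example values are illustrative assumptions):

```python
import numpy as np

def escort(p, q):
    """Escort distribution of order q: P_i = p_i^q / sum_j p_j^q."""
    pq = np.asarray(p, dtype=float) ** q
    return pq / pq.sum()

def renyi_entropy(p, alpha):
    """Rényi entropy in bits, H_alpha(p) = log2(sum_i p_i^alpha) / (1 - alpha),
    for alpha != 1 (the alpha -> 1 limit recovers Shannon entropy)."""
    p = np.asarray(p, dtype=float)
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

p = np.array([0.7, 0.2, 0.1])
P2 = escort(p, 2.0)          # order-2 escort: emphasizes likely symbols
H2 = renyi_entropy(p, 2.0)   # collision (order-2 Rényi) entropy of p
```

    For q = 1 the escort distribution returns p itself, and for a uniform source the Rényi entropy equals log2 of the alphabet size for every order, consistent with the bounds discussed in the paper.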

  17. Temperature distribution of air source heat pump barn with different air flow

    NASA Astrophysics Data System (ADS)

    He, X.; Li, J. C.; Zhao, G. Q.

    2016-08-01

    There are two airflow configurations in a tobacco curing barn: air-rising and air-falling. They differ in structural layout and working principle, which affects the temperature field and velocity distribution inside the barn. To compare the temperature and air distribution of the two, and thereby obtain a tobacco barn with more uniform temperature and velocity fields, the air source heat pump tobacco barn was taken as the investigated subject, a relevant mathematical model was established, and the thermodynamics of the two types of curing barn were analysed and compared using Fluent. The results provide reasonable evidence for the chamber arrangement and outlet selection of an air source heat pump tobacco barn.

  18. Distribution and source of (129)I, (239)(,240)Pu, (137)Cs in the environment of Lithuania.

    PubMed

    Ežerinskis, Ž; Hou, X L; Druteikienė, R; Puzas, A; Šapolaitė, J; Gvozdaitė, R; Gudelis, A; Buivydas, Š; Remeikis, V

    2016-01-01

    Fifty-five soil samples collected in the Lithuanian territory in 2011 and 2012 were analyzed for (129)I, (137)Cs and Pu isotopes in order to investigate the level and distribution of artificial radioactivity in Lithuania. The activity and atomic ratios of (238)Pu/(239,240)Pu, (129)I/(127)I and (131)I/(137)Cs were used to identify the origin of these radionuclides. The (238)Pu/(239+240)Pu and (240)Pu/(239)Pu ratios in the soil samples analyzed varied in the ranges of 0.02-0.18 and 0.18-0.24, respectively, suggesting global fallout as the major source of Pu in Lithuania. The values of 10(-9) to 10(-6) for the (129)I/(127)I atomic ratio revealed that the source of (129)I in Lithuania is global fallout in most cases, though several sampling sites show a possible impact of reprocessing releases. The estimated (129)I/(131)I ratio in soil samples from the southern part of Lithuania shows negligible input from the Chernobyl fallout. No correlation of (137)Cs and Pu isotopes with (129)I was observed, indicating their different source terms. The results demonstrate an uneven distribution of these radionuclides in the Lithuanian territory and several sources of contamination, i.e. the Chernobyl accident, reprocessing releases and global fallout.

  19. Topology optimization of magnetic source distributions for diamagnetic and superconducting levitation

    NASA Astrophysics Data System (ADS)

    Kuznetsov, Sergey; Guest, James K.

    2017-09-01

    Topology optimization is used to obtain a magnetic source distribution providing levitation of a diamagnetic body or type I superconductor with maximized thrust force. We show that this technique identifies non-trivial source distributions and may be useful to design devices based on non-contact magnetic suspension and other magnetic devices, such as micro-magneto-mechanical devices, high field magnets etc. Diamagnetic and superconducting suspensions are often used in physical experiments and thus we believe this approach will be interesting to physics community as it may generate non-trivial and often unexpected topologies and may be useful to create new experiments and devices.

  20. Analysis of plasma distribution near the extraction region in surface produced negative ion sources.

    PubMed

    Fukano, A; Hatayama, A

    2014-02-01

    In the study of negative ion sources, it is important to understand the plasma characteristics near the extraction region. A recent experiment in the NIFS-R&D ion source has suggested that a "double ion plasma layer," a region consisting of hydrogen positive and negative ions, exists near the plasma grid (PG). The density distribution of the plasma near the extraction region is studied analytically. It is shown that the density distribution depends on the amount of surface-produced negative ions and that the double ion plasma layer is formed near the PG surface in the case of strong surface production.

  1. Constraints on galactic distributions of gamma-ray burst sources from BATSE observations

    NASA Technical Reports Server (NTRS)

    Hakkila, Jon; Meegan, Charles A.; Pendleton, Geoffrey N.; Fishman, Gerald J.; Wilson, Robert B.; Paciesas, William S.; Brock, Martin N.; Horack, John M.

    1994-01-01

    The paradigm that gamma-ray bursts originate from Galactic sources is studied in detail using the angular and intensity distributions observed by the Burst and Transient Source Experiment (BATSE) on NASA's Compton Gamma Ray Observatory (CGRO). Monte Carlo models of gamma-ray burst spatial distributions and luminosity functions are used to simulate bursts, which are then folded through mathematical models of BATSE selection effects. The observed and computed angular intensity distributions are analyzed using modifications of standard statistical homogeneity and isotropy studies. Analysis of the BATSE angular and intensity distributions greatly constrains the origins and luminosities of burst sources. In particular, it appears that no single population of sources confined to a Galactic disk, halo, or localized spiral arm satisfactorily explains BATSE observations and that effects of the burst luminosity function are secondary when considering such models. One family of models that still satisfies BATSE observations comprises sources located in an extended spherical Galactic corona. Coronal models are limited to small ranges of burst luminosity and core radius, and the allowed parameter space for such models shrinks with each new burst BATSE observes. Multiple-population models of bursts are found to work only if (1) the primary population accounts for the general isotropy and inhomogeneity seen in the BATSE observations and (2) secondary populations either have characteristics similar to the primary population or contain numbers that are small relative to the primary population.

  2. Kappa Distribution Model for Hard X-Ray Coronal Sources of Solar Flares

    NASA Astrophysics Data System (ADS)

    Oka, M.; Ishikawa, S.; Saint-Hilaire, P.; Krucker, S.; Lin, R. P.

    2013-02-01

    Solar flares produce hard X-ray emission, the photon spectrum of which is often represented by a combination of thermal and power-law distributions. However, the estimates of the number and total energy of non-thermal electrons are sensitive to the determination of the power-law cutoff energy. Here, we revisit an "above-the-loop" coronal source observed by RHESSI on 2007 December 31 and show that a kappa distribution model can also be used to fit its spectrum. Because the kappa distribution has a Maxwellian-like core in addition to a high-energy power-law tail, the emission measure and temperature of the instantaneous electrons can be derived without assuming the cutoff energy. Moreover, the non-thermal fractions of electron number/energy densities can be uniquely estimated because they are functions of only the power-law index. With the kappa distribution model, we estimated that the total electron density of the coronal source region was ~2.4 × 10^10 cm^-3. We also estimated without assuming the source volume that a moderate fraction (~20%) of electrons in the source region was non-thermal and carried ~52% of the total electron energy. The temperature was 28 MK, and the power-law index δ of the electron density distribution was -4.3. These results are compared to the conventional power-law models with and without a thermal core component.
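    One common energy form of the kappa distribution can be sketched as below; normalization conventions vary across the literature, and the kT and kappa values here are illustrative placeholders, not the fitted flare parameters:

```python
import numpy as np

def kappa_energy_pdf(E, kT, kappa):
    """Kappa distribution of particle energy E (same units as kT):
    a Maxwellian-like core around E ~ kT with a power-law tail at high E.
    Requires kappa > 1.5; normalized numerically on the uniform grid E."""
    shape = np.sqrt(E) * (1.0 + E / ((kappa - 1.5) * kT)) ** (-(kappa + 1.0))
    return shape / (shape.sum() * (E[1] - E[0]))

E = np.linspace(1e-3, 200.0, 20001)          # uniform energy grid, e.g. keV
f = kappa_energy_pdf(E, kT=2.4, kappa=5.0)   # illustrative parameters
```

    As kappa grows, the tail steepens toward a pure Maxwellian; a smaller kappa puts more electrons in the non-thermal tail, which is what lets the fit dispense with an ad hoc cutoff energy.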

  3. Identifying Synonymy between SNOMED Clinical Terms of Varying Length Using Distributional Analysis of Electronic Health Records

    PubMed Central

    Henriksson, Aron; Conway, Mike; Duneld, Martin; Chapman, Wendy W.

    2013-01-01

    Medical terminologies and ontologies are important tools for natural language processing of health record narratives. To account for the variability of language use, synonyms need to be stored in a semantic resource as textual instantiations of a concept. Developing such resources manually is, however, prohibitively expensive and likely to result in low coverage. To facilitate and expedite the process of lexical resource development, distributional analysis of large corpora provides a powerful data-driven means of (semi-)automatically identifying semantic relations, including synonymy, between terms. In this paper, we demonstrate how distributional analysis of a large corpus of electronic health records – the MIMIC-II database – can be employed to extract synonyms of SNOMED CT preferred terms. A distinctive feature of our method is its ability to identify synonymous relations between terms of varying length. PMID:24551362

  4. Automated source term and wind parameter estimation for atmospheric transport and dispersion applications

    NASA Astrophysics Data System (ADS)

    Bieringer, Paul E.; Rodriguez, Luna M.; Vandenberghe, Francois; Hurst, Jonathan G.; Bieberbach, George; Sykes, Ian; Hannan, John R.; Zaragoza, Jake; Fry, Richard N.

    2015-12-01

    Accurate simulations of the atmospheric transport and dispersion (AT&D) of hazardous airborne materials rely heavily on the source term parameters necessary to characterize the initial release and the meteorological conditions that drive the downwind dispersion. In many cases the source parameters are not known and are consequently based on rudimentary assumptions. This is particularly true of accidental releases and the intentional releases associated with terrorist incidents. When available, meteorological observations are often not representative of the conditions at the location of the release, and the use of these non-representative meteorological conditions can result in significant errors in the hazard assessments downwind of the sensors, even when the other source parameters are accurately characterized. Here, we describe a computationally efficient methodology to characterize both the release source parameters and the low-level winds (e.g., winds near the surface) required to produce a refined downwind hazard. This methodology, known as the Variational Iterative Refinement Source Term Estimation (STE) Algorithm (VIRSA), consists of a combination of modeling systems. These systems include a back-trajectory based source inversion method, a forward Gaussian puff dispersion model, a variational refinement algorithm that uses both a simple forward AT&D model that is a surrogate for the more complex Gaussian puff model, and a formal adjoint of this surrogate model. The back-trajectory based method is used to calculate a "first guess" source estimate based on the available observations of the airborne contaminant plume and atmospheric conditions. The variational refinement algorithm is then used to iteratively refine the first guess STE parameters and meteorological variables. The algorithm has been evaluated across a wide range of scenarios of varying complexity. It has been shown to improve the source parameters for location by several hundred percent (normalized by the
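    The forward Gaussian puff component can be illustrated with a minimal single-puff concentration formula (constant dispersion sigmas for brevity; this is a generic textbook puff with ground reflection, not the VIRSA surrogate model itself):

```python
import numpy as np

def puff_concentration(x, y, z, t, Q, u, sx, sy, sz, h=0.0):
    """Concentration of one instantaneous puff of mass Q released at the
    origin at height h and advected along x by wind speed u. The second
    exponential in z is a mirror term reflecting material at the ground
    (z = 0). Real models grow sx, sy, sz with travel time."""
    norm = Q / ((2.0 * np.pi) ** 1.5 * sx * sy * sz)
    return (norm
            * np.exp(-0.5 * ((x - u * t) / sx) ** 2)
            * np.exp(-0.5 * (y / sy) ** 2)
            * (np.exp(-0.5 * ((z - h) / sz) ** 2)
               + np.exp(-0.5 * ((z + h) / sz) ** 2)))

# Puff center after 100 s of 5 m/s wind is at x = 500 m.
c_center = puff_concentration(500.0, 0.0, 0.0, 100.0, Q=1.0, u=5.0,
                              sx=50.0, sy=50.0, sz=20.0)
c_offaxis = puff_concentration(500.0, 100.0, 0.0, 100.0, Q=1.0, u=5.0,
                               sx=50.0, sy=50.0, sz=20.0)
```

    A source-term estimator of the kind described above runs such a forward model repeatedly, comparing predicted concentrations against sensor readings while adjusting the release parameters.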

  5. Fukushima Daiichi reactor source term attribution using cesium isotope ratios from contaminated environmental samples

    SciTech Connect

    Snow, Mathew S.; Snyder, Darin C.; Delmore, James E.

    2016-01-18

    Source term attribution of environmental contamination following the Fukushima Daiichi Nuclear Power Plant (FDNPP) disaster is complicated by a large number of possible similar emission source terms (e.g. FDNPP reactor cores 1–3 and spent fuel ponds 1–4). Cesium isotopic analyses can be utilized to discriminate between environmental contamination from different FDNPP source terms and, if samples are sufficiently temporally resolved, potentially provide insights into the extent of reactor core damage at a given time. Rice, soil, mushroom, and soybean samples taken 100–250 km from the FDNPP site were dissolved using microwave digestion. Radiocesium was extracted and purified using two sequential ammonium molybdophosphate-polyacrylonitrile columns, following which 135Cs/137Cs isotope ratios were measured using thermal ionization mass spectrometry (TIMS). Results were compared with data reported previously from locations to the northwest of FDNPP and 30 km to the south of FDNPP. 135Cs/137Cs isotope ratios from samples 100–250 km to the southwest of the FDNPP site show a consistent value of 0.376 ± 0.008. 135Cs/137Cs versus 134Cs/137Cs correlation plots suggest that radiocesium to the southwest is derived from a mixture of FDNPP reactor cores 1, 2, and 3. Conclusions from the cesium isotopic data are in agreement with those derived independently based upon the event chronology combined with meteorological conditions at the time of the disaster. In conclusion, cesium isotopic analyses provide a powerful tool for source term discrimination of environmental radiocesium contamination at the FDNPP site. For higher precision source term attribution and forensic determination of the FDNPP core conditions based upon cesium, analyses of a larger number of samples from locations to the north and south of the FDNPP site (particularly time-resolved air filter samples) are needed.
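    The attribution logic can be illustrated with a two-end-member isotope mixing sketch; the end-member ratios below are hypothetical placeholders, not the actual FDNPP core inventories:

```python
def mixing_fraction(r_sample, r_a, r_b):
    """Fraction f of the denominator isotope (here 137Cs) contributed by
    end-member A, assuming linear two-end-member mixing of the isotope
    ratio: r_sample = f * r_a + (1 - f) * r_b. Linear mixing of ratios is
    exact only when f is expressed relative to the denominator isotope."""
    return (r_sample - r_b) / (r_a - r_b)

# Hypothetical 135Cs/137Cs end-member ratios for two reactor cores.
r_core_a, r_core_b = 0.33, 0.41
f_a = mixing_fraction(0.376, r_core_a, r_core_b)  # measured southwest value
```

    A three-core mixture, as suggested by the correlation plots in the abstract, requires a second independent ratio (e.g. 134Cs/137Cs) to close the system of mixing equations.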

  6. Characterization of a Distributed Plasma Ionization Source (DPIS) for Ion Mobility Spectrometry and Mass Spectrometry

    SciTech Connect

    Waltman, Melanie J.; Dwivedi, Prabha; Hill, Herbert; Blanchard, William C.; Ewing, Robert G.

    2008-10-15

    A recently developed atmospheric pressure ionization source, a distributed plasma ionization source (DPIS), was characterized and compared to commonly used atmospheric pressure ionization sources with both mass spectrometry and ion mobility spectrometry. The source consisted of two electrodes of different sizes separated by a thin dielectric. Application of a high RF voltage across the electrodes generated plasma in air yielding both positive and negative ions depending on the polarity of the applied potential. These reactant ions subsequently ionized the analyte vapors. The reactant ions generated were similar to those created in a conventional point-to-plane corona discharge ion source. The positive reactant ions generated by the source were mass identified as being solvated protons of general formula (H2O)nH+ with (H2O)2H+ as the most abundant reactant ion. The negative reactant ions produced were mass identified primarily as CO3-, NO3-, NO2-, O3- and O2- of various relative intensities. The predominant ion and relative ion ratios varied depending upon source construction and supporting gas flow rates. A few compounds including drugs, explosives and environmental pollutants were selected to evaluate the new ionization source. The source was operated continuously for several months and although deterioration was observed visually, the source continued to produce ions at a rate similar to that of the initial conditions. The results indicated that the DPIS may have a longer operating life than a conventional corona discharge.

  7. Characterization of a distributed plasma ionization source (DPIS) for ion mobility spectrometry and mass spectrometry.

    PubMed

    Waltman, Melanie J; Dwivedi, Prabha; Hill, Herbert H; Blanchard, William C; Ewing, Robert G

    2008-10-19

    A recently developed atmospheric pressure ionization source, a distributed plasma ionization source (DPIS), was characterized and compared to commonly used atmospheric pressure ionization sources with both mass spectrometry (MS) and ion mobility spectrometry (IMS). The source consisted of two electrodes of different sizes separated by a thin dielectric. Application of a high RF voltage across the electrodes generated plasma in air yielding both positive and negative ions. These reactant ions subsequently ionized the analyte vapors. The reactant ions generated were similar to those created in a conventional point-to-plane corona discharge ion source. The positive reactant ions generated by the source were mass identified as being solvated protons of general formula (H(2)O)(n)H(+) with (H(2)O)(2)H(+) as the most abundant reactant ion. The negative reactant ions produced were mass identified primarily as CO(3)(-), NO(3)(-), NO(2)(-), O(3)(-) and O(2)(-) of various relative intensities. The predominant ion and relative ion ratios varied depending upon source construction and supporting gas flow rates. A few compounds including drugs, explosives and amines were selected to evaluate the new ionization source. The source was operated continuously for 3 months and although surface deterioration was observed visually, the source continued to produce ions at a rate similar to that of the initial conditions.

  8. Spatio-temporal Distribution of North African Dust Sources: Controlling Mechanisms and Interannual Variability

    NASA Astrophysics Data System (ADS)

    Schepanski, K.; Feuerstein, S.

    2016-12-01

    Mineral dust aerosol emitted from arid and semi-arid areas impacts the weather and climate system by, e.g., altering the atmospheric radiation budget and affecting nutrient cycles, which ultimately changes the carbon cycle. To estimate the effect of dust in the Earth system, detailed knowledge of the spatio-temporal distribution of active dust sources is necessary. Furthermore, the understanding of the natural variability of dust source activity has to be improved for a better representation of dust-related processes in numerical models and climate change projections. We discuss the atmospheric dust life-cycle over North Africa with regard to the mechanisms controlling both dust uplift and transport pathways. Results from a four-year satellite-based study analysing the spatio-temporal distribution of dust source activations, inferred from 15-minute Meteosat Second Generation (MSG) SEVIRI infra-red observations, are linked to atmospheric conditions and dust source characteristics. The predominance of dust sources located in desert valleys illustrates the importance of alluvial sediments for the atmospheric dust life-cycle. With a focus on alluvial dust sources, Landsat and Sentinel-2 data are analysed to identify changes in surface sediments caused by flash floods, which possibly generate fresh layers of sediments that are prone to wind erosion. Classification algorithms applied to the remote sensing data highlight an increase of alluvial sediments downstream of ephemeral channels and in mountain foothill regions subsequent to events of strong precipitation, and a decrease in sediment coverage during long periods without rain and with the occurrence of wind erosion (dust emission). Altogether, the presented and discussed results (1) illustrate the spatio-temporal distribution of dust sources over North Africa, (2) identify the atmospheric mechanisms controlling dust source activation, and (3) investigate alluvial sediments as a dust source. 
In summary, the outcomes contribute to the

  9. Distribution of Short-Term and Lifetime Predicted Risks of Cardiovascular Diseases in Peruvian Adults

    PubMed Central

    Quispe, Renato; Bazo-Alvarez, Juan Carlos; Burroughs Peña, Melissa S; Poterico, Julio A; Gilman, Robert H; Checkley, William; Bernabé-Ortiz, Antonio; Huffman, Mark D; Miranda, J Jaime

    2015-01-01

    Background Short-term risk assessment tools for prediction of cardiovascular disease events are widely recommended in clinical practice and are used largely for single time-point estimations; however, persons with low predicted short-term risk may have higher risks across longer time horizons. Methods and Results We estimated short-term and lifetime cardiovascular disease risk in a pooled population from 2 studies of Peruvian populations. Short-term risk was estimated using the atherosclerotic cardiovascular disease Pooled Cohort Risk Equations. Lifetime risk was evaluated using the algorithm derived from the Framingham Heart Study cohort. Using previously published thresholds, participants were classified into 3 categories: low short-term and low lifetime risk, low short-term and high lifetime risk, and high short-term predicted risk. We also compared the distribution of these risk profiles across educational level, wealth index, and place of residence. We included 2844 participants (50% men, mean age 55.9 years [SD 10.2 years]) in the analysis. Approximately 1 of every 3 participants (34% [95% CI 33 to 36]) had a high short-term estimated cardiovascular disease risk. Among those with a low short-term predicted risk, more than half (54% [95% CI 52 to 56]) had a high lifetime predicted risk. Short-term and lifetime predicted risks were higher for participants with lower versus higher wealth indexes and educational levels and for those living in urban versus rural areas (P<0.01). These results were consistent by sex. Conclusions These findings highlight potential shortcomings of using short-term risk tools for primary prevention strategies because a substantial proportion of Peruvian adults were classified as low short-term risk but high lifetime risk. Vulnerable adults, such as those from low socioeconomic status and those living in urban areas, may need greater attention regarding cardiovascular preventive strategies. PMID:26254303
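The three-category profile described above can be sketched as a simple classifier. The cut-offs below (7.5% ten-year risk as "high short-term", 39% lifetime risk as "high lifetime") are commonly cited thresholds and are assumptions here, not necessarily the exact values the study used.

```python
def classify_risk(short_term_pct, lifetime_pct,
                  short_thresh=7.5, lifetime_thresh=39.0):
    """Assign one of the three risk profiles used in the abstract.

    Thresholds are illustrative assumptions; the study's exact
    values are given in the original paper.
    """
    if short_term_pct >= short_thresh:
        return "high short-term"
    if lifetime_pct >= lifetime_thresh:
        return "low short-term / high lifetime"
    return "low short-term / low lifetime"

# three hypothetical participants
profiles = [classify_risk(s, l) for s, l in [(9.0, 50.0), (3.0, 45.0), (2.0, 20.0)]]
```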

  10. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses; Volume 1, Revision 1

    SciTech Connect

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T.; Helton, J.C.; Murfin, W.B.; Hora, S.C.

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community.
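Point (3) above, Monte Carlo propagation with an efficient (stratified) sampling procedure, can be illustrated with a minimal Latin hypercube sketch. The three-factor "frequency × release × consequence" chain below is a toy stand-in for the NUREG-1150 analysis components, not their actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_vars, rng):
    """One draw per equal-probability stratum for each variable,
    with the strata independently shuffled across variables."""
    u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_vars):
        rng.shuffle(u[:, j])   # shuffle each column in place
    return u

# toy uncertainty propagation: risk = frequency x release fraction x consequence
u = latin_hypercube(200, 3, rng)
freq = 10.0 ** (-5.0 + 2.0 * u[:, 0])    # accident frequency, 1e-5 .. 1e-3 per year
release = 0.01 + 0.99 * u[:, 1]          # source term release fraction
consequence = 10.0 ** (2.0 * u[:, 2])    # consequence per unit release (arbitrary units)
risk = freq * release * consequence
mean_risk = risk.mean()                  # sample estimate of the expected risk
```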

  11. Development of a tool dedicated to the evaluation of the hydrogen source term for technological wastes: assumptions, physical models, and validation

    SciTech Connect

    Lamouroux, C.

    2013-07-01

In radioactive waste packages, hydrogen is generated, on the one hand, from the radiolysis of the wastes (mainly organic materials) and, on the other hand, from the radiolysis of the water content of the cement matrix. In order to assess hydrogen generation, two tools based on operational models have been developed. One is dedicated to the determination of the hydrogen source term arising from the radiolysis of the wastes: the STORAGE tool (Simulation Tool Of Emission Radiolysis Gas); the other deals with the hydrogen source term produced by radiolysis of the cement matrices (the Damar tool). The approach used by the STORAGE tool for assessing the production rate of radiolysis gases is divided into five steps: 1) specification of the package data, in particular the material and radiological inventories defined for a package medium; 2) determination of the radiochemical yields for the different constituents and the associated behavior laws; this determination is made from the PRELOG database, in which radiochemical yields under different irradiation conditions have been compiled; 3) definition of hypotheses concerning the composition and the distribution of the contamination inside the package, to allow assessment of the power absorbed by the constituents; 4) summation of all the contributions; and finally, 5) validation calculations by comparison with a reduced sampling of packages. Comparisons with measured values confirm the conservative character of the methodology and give confidence in the safety margins for the safety analysis report.
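Step 4 above, summing the contributions, amounts to converting, for each constituent, an absorbed dose power into a molecular hydrogen production rate via a radiolytic yield G(H2) in molecules per 100 eV. The sketch below uses illustrative G-values and absorbed powers, not data from the PRELOG database.

```python
EV_PER_J = 6.241509e18        # electron-volts per joule
AVOGADRO = 6.02214076e23      # molecules per mole

def h2_rate_mol_per_s(contributions):
    """Total H2 production rate in mol/s.

    contributions: list of (absorbed_power_W, g_h2_per_100eV) pairs,
    one per waste constituent.
    """
    total = 0.0
    for power_w, g_per_100ev in contributions:
        molecules_per_s = power_w * EV_PER_J * g_per_100ev / 100.0
        total += molecules_per_s / AVOGADRO
    return total

# e.g. two constituents of a drum with assumed absorbed powers and G-values
rate = h2_rate_mol_per_s([(0.5e-3, 0.6), (0.2e-3, 3.2)])
```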

  12. Long-Term Safe Storage and Disposal of Spent Sealed Radioactive Sources in Borehole Type Repositories

    SciTech Connect

    Ojovan, M. I.; Dmitriev, S. A.; Sobolev, I. A.

    2003-02-26

The Russian Federation has leading experience in applying the borehole storage/disposal method for spent sealed radioactive sources (SRS). A new immobilization technology for sources disposed of in underground repositories was mastered by 1986 and has been used in the country since then. This method retains all the advantages of borehole-type repositories, supplementing them with metal encapsulation of the sources. The sources, uniformly allocated in the volume of the underground vessel, are fixed in a metal block, hence ensuring long-term safety. The dissipation of radiogenic heat from the SRS is considerably improved, radiation fields are reduced, and direct contact of the sources with the environment is completely eliminated. The capacity of a typical borehole storage/disposal facility is increased almost 6-fold by applying metal immobilization, which makes the new technology extremely favourable economically. Metal immobilization of SRS is being considered as an option in Belarus and Ukraine, as well as in Bulgaria. Immobilization of sources in metal matrices can be a real solution for the retrieval of SRS from inadequate repositories.

  13. Review of uncertainty sources affecting the long-term predictions of space debris evolutionary models

    NASA Astrophysics Data System (ADS)

    Dolado-Perez, J. C.; Pardini, Carmen; Anselmo, Luciano

    2015-08-01

Since the launch of Sputnik-I in 1957, the amount of space debris in Earth's orbit has increased continuously. Historically, besides abandoned intact objects (spacecraft and orbital stages), the primary sources of space debris in Earth's orbit were (i) accidental and intentional break-ups, which produced long-lasting debris, and (ii) debris released intentionally during the operation of launch vehicle orbital stages and spacecraft. In the future, fragments generated by collisions are expected to become a significant source as well. In this context, and from a purely mathematical point of view, the orbital debris population in Low Earth Orbit (LEO) should be intrinsically unstable, due to the physics of mutual collisions and the relative ineffectiveness of natural sink mechanisms above ~700 km. Therefore, the real question should not be "if", but "when" the exponential growth of the space debris population will start. From a practical point of view, and in order to answer this question, several sophisticated long-term debris evolutionary models have been developed since the end of the 1980s. Unfortunately, the predictions performed with such models, in particular beyond a few decades, are affected by considerable uncertainty. This uncertainty stems from a relatively large number of variables that, being either under the partial control or completely out of the control of modellers, introduce a variability into the long-term simulation of the space debris population which cannot be captured with standard Monte Carlo statistics. The objective of this paper is to present and discuss many of the uncertainty sources affecting the long-term predictions made with evolutionary models, in order to serve as a roadmap for the uncertainty and statistical robustness analysis of the long-term evolution of the space debris population.

  14. Source Distributions of Substorm Ions Observed in the Near-Earth Magnetotail

    NASA Technical Reports Server (NTRS)

    Ashour-Abdalla, M.; El-Alaoui, M.; Peroomian, V.; Walker, R. J.; Raeder, J.; Frank, L. A.; Paterson, W. R.

    1999-01-01

    This study employs Geotail plasma observations and numerical modeling to determine sources of the ions observed in the near-Earth magnetotail near midnight during a substorm. The growth phase has the low-latitude boundary layer as its most important source of ions at Geotail, but during the expansion phase the plasma mantle is dominant. The mantle distribution shows evidence of two distinct entry mechanisms: entry through a high latitude reconnection region resulting in an accelerated component, and entry through open field lines traditionally identified with the mantle source. The two entry mechanisms are separated in time, with the high-latitude reconnection region disappearing prior to substorm onset.

  15. Distribution functions in plasmas generated by a volume source of fission fragments. [in nuclear pumped lasers

    NASA Technical Reports Server (NTRS)

    Deese, J. E.; Hassan, H. A.

    1979-01-01

    The role played by fission fragments and electron distribution functions in nuclear pumped lasers is considered and procedures for their calculations are outlined. The calculations are illustrated for a He-3/Xe mixture where fission is provided by the He-3(n,p)H-3 reaction. Because the dominant ion in the system depends on the Xe fraction, the distribution functions cannot be determined without the simultaneous consideration of a detailed kinetic model. As is the case for wall sources of fission fragments, the resulting plasmas are essentially thermal but the electron distribution functions are non-Maxwellian.

  16. Inverse modeling of the Chernobyl source term using atmospheric concentration and deposition measurements

    NASA Astrophysics Data System (ADS)

    Evangeliou, Nikolaos; Hamburger, Thomas; Cozic, Anne; Balkanski, Yves; Stohl, Andreas

    2017-07-01

This paper describes the results of an inverse modeling study for the determination of the source term of the radionuclides 134Cs, 137Cs and 131I released after the Chernobyl accident. The accident occurred on 26 April 1986 in the former Soviet Union and released about 10^19 Bq of radioactive materials that were transported as far away as the USA and Japan. Thereafter, several attempts to assess the magnitude of the emissions were made that were based on knowledge of the core inventory and the levels of the spent fuel. More recently, when modeling tools were further developed, inverse modeling techniques were applied to the Chernobyl case for source term quantification. However, because radioactivity is a sensitive topic for the public and attracts a lot of attention, high-quality measurements, which are essential for inverse modeling, were not made available, except for a few sparse activity concentration measurements far from the source and far from the main direction of the radioactive fallout. For the first time, we apply Bayesian inversion to the Chernobyl source term using not only activity concentrations but also deposition measurements from the most recent public data set. These observations stem from a data rescue effort that started more than 10 years ago, with the final goal of making the available measurements accessible to anyone interested. Regarding our inverse modeling results, emissions of 134Cs were estimated to be 80 PBq, or 30-50 % higher than what was previously published. Of the released amount of 134Cs, about 70 PBq were deposited all over Europe. Similar to 134Cs, emissions of 137Cs were estimated as 86 PBq, on the same order as previously reported results. Finally, 131I emissions of 1365 PBq were found, which is about 10 % less than the prior total release. The inversion pushes the injection heights of the three radionuclides to higher altitudes (up to about 3 km) than previously assumed (≈ 2.2 km) in order to better match both
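The machinery behind such estimates can be sketched as regularized least squares: given a source-receptor matrix H from an atmospheric transport model, observations y, a prior emission vector x_b, and Gaussian error covariances, the posterior mean emission vector follows from the normal equations. Everything below (dimensions, covariances, data) is synthetic, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_src = 50, 8                     # e.g. 8 emission time/height segments
H = rng.random((n_obs, n_src))           # stand-in source-receptor sensitivities
x_true = rng.uniform(1.0, 5.0, n_src)    # "true" emissions (arbitrary units)
y = H @ x_true + rng.normal(0.0, 0.05, n_obs)   # synthetic noisy observations

x_b = np.full(n_src, 3.0)                # prior (first-guess) emissions
B_inv = np.eye(n_src) / 2.0**2           # inverse prior covariance (sigma = 2)
R_inv = np.eye(n_obs) / 0.05**2          # inverse observation covariance

# Posterior mean: (H^T R^-1 H + B^-1)^-1 (H^T R^-1 y + B^-1 x_b)
A = H.T @ R_inv @ H + B_inv
x_hat = np.linalg.solve(A, H.T @ R_inv @ y + B_inv @ x_b)
```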

  17. User's Manual for the SOURCE1 and SOURCE2 Computer Codes: Models for Evaluating Low-Level Radioactive Waste Disposal Facility Source Terms (Version 2.0)

    SciTech Connect

    Icenhour, A.S.; Tharp, M.L.

    1996-08-01

The SOURCE1 and SOURCE2 computer codes calculate source terms (i.e., radionuclide release rates) for performance assessments of low-level radioactive waste (LLW) disposal facilities. SOURCE1 is used to simulate radionuclide releases from tumulus-type facilities. SOURCE2 is used to simulate releases from silo-, well-, well-in-silo-, and trench-type disposal facilities. The SOURCE codes (a) simulate the degradation of engineered barriers and (b) provide an estimate of the source term for LLW disposal facilities. This manual summarizes the major changes that have been made since the codes were originally developed.

  18. Spurious Behavior of Shock-Capturing Methods: Problems Containing Stiff Source Terms and Discontinuities

    NASA Technical Reports Server (NTRS)

    Yee, Helen M. C.; Kotov, D. V.; Wang, Wei; Shu, Chi-Wang

    2013-01-01

The goal of this paper is to relate the numerical dissipation inherent in high order shock-capturing schemes to the onset of wrong propagation speeds of discontinuities. For pointwise evaluation of the source term, previous studies indicated that the phenomenon of wrong propagation speed of discontinuities is connected with the smearing of the discontinuity caused by the discretization of the advection term. The smearing introduces a nonequilibrium state into the calculation. Thus, as soon as a nonequilibrium value is introduced in this manner, the source term turns on and immediately restores equilibrium, while at the same time shifting the discontinuity to a cell boundary. The present study shows that the degree of wrong propagation speed of discontinuities is highly dependent on the accuracy of the numerical method. The manner in which the smearing of discontinuities is contained by the numerical method and the overall amount of numerical dissipation being employed play major roles. Moreover, employing finite time steps and grid spacings that are below the standard Courant-Friedrichs-Lewy (CFL) limit in shock-capturing methods for compressible Euler and Navier-Stokes equations containing stiff reacting source terms and discontinuities reveals surprising counter-intuitive results. Unlike non-reacting flows, for stiff reactions with discontinuities, employing a time step and grid spacing that are below the CFL limit (based on the homogeneous or non-reacting part of the governing equations) does not guarantee a correct solution of the chosen governing equations.
Instead, depending on the numerical method, time step and grid spacing, the numerical simulation may lead to (a) the correct solution (within the truncation error of the scheme), (b) a divergent solution, (c) a solution with a wrong propagation speed of discontinuities, or (d) other spurious solutions that are solutions of the discretized counterparts but are not solutions of the governing equations.
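The wrong-speed phenomenon is easy to reproduce. The sketch below assumes the classic model problem of LeVeque and Yee, u_t + u_x = -mu*u*(u-1)*(u-1/2) with a step initial condition, using first-order upwind advection and a pointwise source taken in its infinitely stiff limit (projection onto the stable equilibria 0 and 1): for a Courant number below 0.5 the smeared front value never exceeds 0.5, so the numerical discontinuity stays frozen at its initial position instead of moving at the exact speed of 1.

```python
import numpy as np

def advect_react(nx=200, cfl=0.4, t_end=0.3):
    """Upwind advection of a step with a pointwise, infinitely stiff
    source: after each advection step, u snaps to the nearest stable
    equilibrium (0 or 1)."""
    x = np.linspace(0.0, 1.0, nx, endpoint=False)
    u = np.where(x < 0.3, 1.0, 0.0)
    dx = 1.0 / nx
    dt = cfl * dx
    t = 0.0
    while t < t_end:
        u = u - dt / dx * (u - np.roll(u, 1))   # first-order upwind (periodic)
        u = np.where(u > 0.5, 1.0, 0.0)         # stiff source: snap to equilibria
        t += dt
    return x, u

x, u = advect_react()
front = x[np.argmax(u < 0.5)]   # numerical front; the exact front is at 0.3 + 0.3 = 0.6
# the front never moves: it remains frozen at x = 0.3, i.e. a wrong (zero) speed
```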

  19. Impact of the differential fluence distribution of brachytherapy sources on the spectroscopic dose-rate constant

    SciTech Connect

    Malin, Martha J.; Bartol, Laura J.; DeWerd, Larry A. E-mail: ladewerd@wisc.edu

    2015-05-15

Purpose: To investigate why dose-rate constants for 125I and 103Pd seeds computed using the spectroscopic technique, Λ_spec, differ from those computed with standard Monte Carlo (MC) techniques. A potential cause of these discrepancies is the spectroscopic technique's use of approximations of the true fluence distribution leaving the source, φ_full. In particular, the fluence distribution used in the spectroscopic technique, φ_spec, approximates the spatial, angular, and energy distributions of φ_full. This work quantified the extent to which each of these approximations affects the accuracy of Λ_spec. Additionally, this study investigated how the simplified water-only model used in the spectroscopic technique impacts the accuracy of Λ_spec. Methods: Dose-rate constants as described in the AAPM TG-43U1 report, Λ_full, were computed with MC simulations using the full source geometry for each of 14 different 125I and 6 different 103Pd source models. In addition, the spectrum emitted along the perpendicular bisector of each source was simulated in vacuum using the full source model and used to compute Λ_spec. Λ_spec was compared to Λ_full to verify the discrepancy reported by Rodriguez and Rogers. Using MC simulations, a phase space of the fluence leaving the encapsulation of each full source model was created. The spatial and angular distributions of φ_full were extracted from the phase spaces and were qualitatively compared to those used by φ_spec. Additionally, each phase space was modified to reflect one of the approximated distributions (spatial, angular, or energy) used by φ_spec. The dose-rate constant resulting from using approximated distribution i, Λ_approx,i, was computed using the modified phase space and compared to Λ_full. For each source, this process was repeated for each approximation in order to determine which approximations used in

  20. Filtered chemical source term modeling for LES of high Karlovitz number premixed flames

    NASA Astrophysics Data System (ADS)

    Lapointe, Simon; Blanquart, Guillaume

    2015-11-01

    Tabulated chemistry with the transport of a single progress variable is a popular technique for large eddy simulations of premixed turbulent flames. Since the reaction zone thickness is usually smaller than the LES grid size, modeling of the filtered progress variable reaction rate is required. Most models assume that the filtered progress variable reaction rate is a function of the filtered progress variable and its variance where the dependence can be obtained through the probability density function (PDF) of the progress variable. Among the most common approaches, the PDF can be presumed (usually as a β-PDF) or computed using spatially filtered one dimensional laminar flames (FLF). Models for the filtered source term are studied a priori using results from DNS of turbulent n-heptane/air premixed flames at varying Karlovitz numbers. Predictions from the optimal estimator and models based on laminar flames using a β-PDF or a FLF-PDF are compared to the exact filtered source term. For all filter widths and Karlovitz numbers, the optimal estimator yields small errors while β-PDF and FLF-PDF approaches present larger errors. Sources of differences are discussed.
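As a concrete instance of the presumed-PDF route described above, the filtered source term is the laminar rate weighted by a β-PDF whose parameters follow from the filtered progress variable and its subfilter variance. The one-step rate ω(c) = c(1−c) below is a toy stand-in, not a tabulated chemistry source.

```python
import numpy as np

def filtered_rate_beta(c_bar, c_var, omega, n=4000):
    """Filtered reaction rate under a presumed Beta(a, b) PDF with the
    given mean and variance (requires 0 < c_var < c_bar*(1 - c_bar))."""
    g = c_bar * (1.0 - c_bar) / c_var - 1.0
    a, b = c_bar * g, (1.0 - c_bar) * g
    c = np.linspace(1e-6, 1.0 - 1e-6, n)
    pdf = c ** (a - 1.0) * (1.0 - c) ** (b - 1.0)   # unnormalized beta density
    return np.trapz(omega(c) * pdf, c) / np.trapz(pdf, c)

omega = lambda c: c * (1.0 - c)          # toy one-step source term
rate = filtered_rate_beta(0.5, 0.05, omega)
# for this omega the result is analytic: c_bar*(1 - c_bar) - c_var = 0.20
```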

  1. Long-term particle measurements in Finnish Arctic: Part II - Trend analysis and source location identification

    NASA Astrophysics Data System (ADS)

    Laing, James R.; Hopke, Philip K.; Hopke, Eleanor F.; Husain, Liaquat; Dutkiewicz, Vincent A.; Paatero, Jussi; Viisanen, Yrjö.

    2014-05-01

Forty-seven years (1964-2010) of weekly trace metal and major ion concentrations in total suspended particle samples from Kevo, Finland were analyzed for long-term trends and by source identification methods. Significant long-term decreasing trends were detected for most species. The largest decreases over the 47 years were Sb (-3.90% yr⁻¹), Pb (-3.87% yr⁻¹), Mn (-3.45% yr⁻¹), Cd (-3.42% yr⁻¹), and Ca (-3.13% yr⁻¹). As, Pb, and Cd concentrations at Kevo were consistent with the reported time-trends of European emissions inventories. Pb concentrations at Kevo have dramatically decreased (92%) in the past 47 years due to the reduced use of leaded gasoline in automobiles. Back-trajectory analysis suggests that the main source areas of anthropogenic species (V, Cd, Mn, Mo, Sb, Tl, W) were predominantly in Eastern Europe, European Russia, and the Baltics. Markers of stationary fuel combustion (V, Mn, Mo, Sb, Se, and Tl) pointed towards source regions in the Pechora Basin and Ural industrial areas in Russia, and near gas and oil fields in western Kazakhstan.
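Trends quoted in % yr⁻¹, as above, are typically recovered from a log-linear fit: regress the logarithm of concentration on year, and the slope b translates into 100·(exp(b) − 1) percent per year. The sketch below uses synthetic data, not the Kevo record.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1964, 2011)
true_rate = -0.039                     # log-slope, i.e. roughly -3.8% per year
conc = 10.0 * np.exp(true_rate * (years - 1964)) * rng.lognormal(0.0, 0.1, years.size)

# least-squares fit of log(concentration) against elapsed years
slope, intercept = np.polyfit(years - 1964, np.log(conc), 1)
pct_per_year = 100.0 * (np.exp(slope) - 1.0)
```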

  2. Experiments on liquid-metal fast breeder reactor aerosol source terms after severe accidents

    SciTech Connect

    Berthoud, G.; Longest, A.W.; Wright, A.L.; Schutz, W.P.

    1988-05-01

In the extremely unlikely event of a liquid-metal fast breeder reactor core disruptive accident, expanding core material or sodium vapor inside the sodium pool may cause leaks in the vessel head and transport of radioactive material, mostly aerosols, in one large bubble or several smaller bubbles under energetic conditions to the cover gas and through leaks to the inner containment ("instantaneous source term"). Out-of-pile experiments on bubble expansion from a pressurized source inside a liquid (water or sodium) and related phenomena like heat transfer, condensation, entrainment, rise, and aerosol transport were carried out in France and the United States and are continuing in the Federal Republic of Germany. Parameters and results of these experiments are described and discussed, mainly concerning the aerosol problem. It appears that several mechanisms exist for a very efficient removal of particles from the bubble. Retention factors larger than 10,000 were found in most cases. In addition, a short survey is given of French and German experiments on fuel and fission product release from evaporating or burning sodium pools ("delayed source term").

  3. The Multimedia Environmental Pollutant Assessment System (MEPAS)®: Source-term release formulations

    SciTech Connect

    Streile, G.P.; Shields, K.D.; Stroh, J.L.; Bagaasen, L.M.; Whelan, G.; McDonald, J.P.; Droppo, J.G.; Buck, J.W.

    1996-11-01

    This report is one of a series of reports that document the mathematical models in the Multimedia Environmental Pollutant Assessment System (MEPAS). Developed by Pacific Northwest National Laboratory for the US Department of Energy, MEPAS is an integrated impact assessment software implementation of physics-based fate and transport models in air, soil, and water media. Outputs are estimates of exposures and health risk assessments for radioactive and hazardous pollutants. Each of the MEPAS formulation documents covers a major MEPAS component such as source-term, atmospheric, vadose zone/groundwater, surface water, and health exposure/health impact assessment. Other MEPAS documentation reports cover the sensitivity/uncertainty formulations and the database parameter constituent property estimation methods. The pollutant source-term release component is documented in this report. MEPAS simulates the release of contaminants from a source, transport through the air, groundwater, surface water, or overland pathways, and transfer through food chains and exposure pathways to the exposed individual or population. For human health impacts, risks are computed for carcinogens and hazard quotients for noncarcinogens. MEPAS is implemented on a desktop computer with a user-friendly interface that allows the user to define the problem, input the required data, and execute the appropriate models for both deterministic and probabilistic analyses.

  4. Computational determination of absorbed dose distributions from multiple volumetric gamma ray sources

    NASA Astrophysics Data System (ADS)

    Zhou, Chuanyu; Inanc, Feyzi

    2002-05-01

Determination of absorbed dose distributions is very important in brachytherapy procedures. The typical computation involves superposition of absorbed dose distributions from a single seed to compute the combined absorbed dose distribution formed by multiple seeds. This approach does not account for the shadow effect caused by the metallic nature of volumetric radioactive seeds. Since this shadow effect will cause deviations from the targeted dose distribution, it may have important implications for the success of the procedures. We demonstrated the accuracy of our deterministic algorithms for isotropic point sources in the past. We will show that we now have the capability of computing absorbed dose distributions from multiple volumetric seeds and demonstrate that our results are quite close to the results published in the literature.
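The superposition baseline this abstract improves on is simple: per-seed kernels summed over all seeds, with no inter-seed shadowing. A minimal sketch with point sources and a bare inverse-square kernel (illustrative strengths and positions; real brachytherapy dosimetry uses the TG-43 formalism, not this kernel):

```python
import numpy as np

def dose_rate(point, seeds):
    """Superposed dose rate at `point` from point-source seeds.

    seeds: list of (position (3,), strength) pairs; each contributes
    strength / (4*pi*r^2), ignoring attenuation and shadowing.
    """
    p = np.asarray(point, dtype=float)
    total = 0.0
    for pos, strength in seeds:
        r2 = np.sum((p - np.asarray(pos, dtype=float)) ** 2)
        total += strength / (4.0 * np.pi * r2)
    return total

seeds = [((0.0, 0.0, 0.0), 1.0), ((1.0, 0.0, 0.0), 1.0)]
d = dose_rate((0.5, 0.5, 0.0), seeds)   # symmetric point: both seeds contribute equally
```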

  5. Marine litter on Mediterranean shores: Analysis of composition, spatial distribution and sources in north-western Adriatic beaches.

    PubMed

    Munari, Cristina; Corbau, Corinne; Simeoni, Umberto; Mistri, Michele

    2016-03-01

Marine litter is one descriptor in the EU Marine Strategy Framework Directive (MSFD). This study provides the first account of an MSFD indicator (Trends in the amount of litter deposited on coastlines) for the north-western Adriatic. Five beaches were sampled in 2015. Plastic dominated in terms of abundance, followed by paper and other groups. The average density was 0.2 litter items m⁻², but at one beach it rose to 0.57 items m⁻². The major categories were cigarette butts, unrecognizable plastic pieces, bottle caps, and others. The majority of marine litter came from land-based sources: shoreline and recreational activities, smoke-related activities and dumping. Sea-based sources contributed less. The abundance and distribution of litter seemed to be particularly influenced by beach users, reflecting inadequate disposal practices. The solution to these problems involves the implementation and enforcement of local educational and management policies.

  6. Bacterial Composition in a Metropolitan Drinking Water Distribution System Utilizing Different Source Waters

    EPA Science Inventory

    The microbial community structure was investigated from bulk phase water samples of multiple collection sites from two service areas within the Cincinnati drinking water distribution system (DWDS). Each area is associated with a different primary source of water (i.e., groundwat...

  7. GIS Based Distributed Runoff Predictions in Variable Source Area Watersheds Employing the SCS-Curve Number

    NASA Astrophysics Data System (ADS)

    Steenhuis, T. S.; Mendoza, G.; Lyon, S. W.; Gerard Marchant, P.; Walter, M. T.; Schneiderman, E.

    2003-04-01

Because the traditional Soil Conservation Service Curve Number (SCS-CN) approach continues to be ubiquitously used in GIS-based water quality models, new application methods are needed that are consistent with variable source area (VSA) hydrological processes in the landscape. We developed, within an integrated GIS modeling environment, a distributed approach for applying the traditional SCS-CN equation to watersheds where VSA hydrology is a dominant process. Spatial representation of hydrologic processes is important for watershed planning because restricting potentially polluting activities from runoff source areas is fundamental to controlling non-point source pollution. The methodology presented here uses the traditional SCS-CN method to predict runoff volume and the spatial extent of saturated areas, and uses a topographic index to distribute runoff source areas through watersheds. The resulting distributed CN-VSA method was incorporated in an existing GWLF water quality model and applied to sub-watersheds of the Delaware basin in the Catskill Mountains region of New York State. We found that the distributed CN-VSA approach provided a physically-based method that gives realistic results for watersheds with VSA hydrology.
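For reference, the curve-number arithmetic that the distributed CN-VSA method redistributes spatially is the standard SCS-CN runoff equation, with potential retention S derived from the curve number and an initial abstraction Ia = 0.2·S (depths in inches):

```python
def scs_runoff(p_inches, curve_number):
    """Direct runoff depth Q (inches) from storm depth P via the SCS-CN
    relation: S = 1000/CN - 10, Ia = 0.2*S, Q = (P - Ia)^2 / (P - Ia + S)."""
    s = 1000.0 / curve_number - 10.0      # potential maximum retention
    ia = 0.2 * s                          # initial abstraction
    if p_inches <= ia:
        return 0.0                        # all rainfall abstracted, no runoff
    return (p_inches - ia) ** 2 / (p_inches - ia + s)

q = scs_runoff(3.0, 80)   # 3 inches of rain on CN = 80 land gives 1.25 inches of runoff
```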

  8. Light source distribution and scattering phase function influence light transport in diffuse multi-layered media

    NASA Astrophysics Data System (ADS)

    Vaudelle, Fabrice; L'Huillier, Jean-Pierre; Askoura, Mohamed Lamine

    2017-06-01

Red and near-infrared light is often used as a useful diagnostic and imaging probe for highly scattering media such as biological tissues, fruits and vegetables. Part of the diffusively reflected light gives interesting information related to the tissue subsurface, whereas light recorded at further distances may probe deeper into the interrogated turbid tissues. However, modelling diffusive events occurring at short source-detector distances requires consideration of both the distribution of the light sources and the scattering phase functions. In this report, a modified Monte Carlo model is used to compute light transport in curved and multi-layered tissue samples which are covered with a thin and highly diffusing tissue layer. Different light source distributions (ballistic, diffuse or Lambertian) are tested with specific scattering phase functions (modified or unmodified Henyey-Greenstein, Gegenbauer and Mie) to compute the amount of backscattered and transmitted light in apple and human skin structures. Comparisons between simulation results and experiments carried out with a multispectral imaging setup confirm the soundness of the theoretical strategy and may explain the role of the skin in light transport in whole and half-cut apples. Other computational results show that a Lambertian source distribution combined with a Henyey-Greenstein phase function provides a higher photon density in the stratum corneum than in the upper dermis layer. Furthermore, it is also shown that the scattering phase function may affect the shape and the magnitude of the Bidirectional Reflectance Distribution Function (BRDF) exhibited at the skin surface.
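Of the phase functions named above, the unmodified Henyey-Greenstein is the most common; its standard inverse-CDF sampling of the deflection cosine, as used in Monte Carlo light-transport codes, is sketched below (g = 0.9 is a typical forward-peaked value for tissue; the mean sampled cosine equals g).

```python
import numpy as np

def hg_phase(cos_theta, g):
    """Henyey-Greenstein phase function, normalized over the unit sphere."""
    return (1.0 - g**2) / (4.0 * np.pi * (1.0 + g**2 - 2.0 * g * cos_theta) ** 1.5)

def sample_hg_cos(g, u):
    """Map a uniform u in (0, 1) to cos(theta) distributed per HG."""
    if abs(g) < 1e-6:
        return 2.0 * u - 1.0              # isotropic limit
    frac = (1.0 - g**2) / (1.0 - g + 2.0 * g * u)
    return (1.0 + g**2 - frac**2) / (2.0 * g)

rng = np.random.default_rng(0)
g = 0.9                                    # anisotropy factor, forward-peaked
cosines = np.array([sample_hg_cos(g, u) for u in rng.random(20000)])
mean_cos = cosines.mean()                  # should be close to g
```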

  10. Technical considerations related to interim source-term assumptions for emergency planning and equipment qualification. [PWR; BWR

    SciTech Connect

    Niemczyk, S.J.; McDowell-Boyer, L.M.

    1982-09-01

The source terms recommended in the current regulatory guidance for many considerations of light water reactor (LWR) accidents were developed a number of years ago, when understanding of many of the phenomena pertinent to source term estimation was relatively primitive. The purpose of the work presented here was to develop more realistic source term assumptions that could be used for interim regulatory purposes for two specific considerations, namely, equipment qualification and emergency planning. The overall approach taken was to adopt assumptions and models previously proposed for various aspects of source term estimation and to modify those assumptions and models to reflect recently gained insights into, and data describing, the release and transport of radionuclides during and after LWR accidents. To obtain illustrative estimates of the magnitudes of the source terms, the results of previous calculations employing the adopted assumptions and models were utilized and were modified to account for the effects of the recent insights and data.

  11. Characterization and Source Term Assessments of Radioactive Particles from Marshall Islands Using Non-Destructive Analytical Techniques

    SciTech Connect

    Jernstrom, J; Eriksson, M; Simon, R; Tamborini, G; Bildstein, O; Carlos-Marquez, R; Kehl, S R; Betti, M; Hamilton, T

    2005-06-11

    A considerable fraction of the radioactivity entering the environment from different nuclear events is associated with particles. The impact of these events can only be fully assessed where there is some knowledge about the mobility of particle-bound radionuclides entering the environment. The behavior of particulate radionuclides depends on several factors, including the physical, chemical and redox state of the environment, the characteristics of the particles (e.g., the chemical composition, crystallinity and particle size) and the oxidation state of the radionuclides contained in the particles. Six plutonium-containing particles stemming from Runit Island soil (Marshall Islands) were characterized using non-destructive analytical and microanalytical methods. By determining the activity of ²³⁹,²⁴⁰Pu and ²⁴¹Am isotopes from their gamma peaks, structural information related to the Pu matrix was obtained, and the source term was revealed. Composition and elemental distribution in the particles were studied with synchrotron-radiation-based micro X-ray fluorescence (SR-µ-XRF) spectrometry. A scanning electron microscope equipped with an energy-dispersive X-ray detector (SEM-EDX) and a secondary ion mass spectrometer (SIMS) were used to examine particle surfaces. Based on the elemental composition, the particles were divided into two groups: particles with a plain Pu matrix, and particles in which the plutonium is included in a Si/O-rich matrix and more heterogeneously distributed. All of the particles were identified as fragments of initial weapons material. Since the particles contain plutonium with a low ²⁴⁰Pu/²³⁹Pu atomic ratio, ~2-6%, corresponding to weapons-grade plutonium, the source term was identified to be among the safety tests conducted in the history of Runit Island.

  12. Transient Flows and Stratification of an Enclosure Containing Both a Localised and Distributed Source of Buoyancy

    NASA Astrophysics Data System (ADS)

    Partridge, Jamie; Linden, Paul

    2014-11-01

    We examine the transient flow and stratification in a naturally ventilated enclosure containing both a localised and a distributed source of buoyancy. Both sources of buoyancy are located at the base of the enclosure to represent a building where there is a distributed heat flux from the floor, for example from a sun patch, that competes with a localised heat source within the space. The steady conditions of the space are controlled purely by the geometry of the enclosure and the ratio Ψ of the distributed and localised buoyancy fluxes, and are independent of the order in which the buoyancy fluxes are introduced into the space. However, the order in which the sources are introduced, for example by delaying the introduction of the localised source, alters the transients significantly. To investigate this problem, small-scale experiments were conducted and compared to a `perfect-mixing' model of the transients. How the stratification evolves in time, in particular how long it takes to reach steady conditions, is key to understanding what can be expected in real buildings. The transient evolution of the interior stratification is reported here and compared to the theoretical model.

  13. [Soil Heavy Metal Spatial Distribution and Source Analysis Around an Aluminum Plant in Baotou].

    PubMed

    Zhang, Lian-ke; Li, Hai-peng; Huang, Xue-min; Li, Yu-mei; Jiao, Kun-ling; Sun, Peng; Wang, Wei-da

    2016-03-15

    Soil within 500 m of an aluminum plant in Baotou was studied. A total of 64 soil samples were taken from the 0-5 cm, 5-20 cm, 20-40 cm and 40-60 cm layers, and the contents of Cu, Pb, Zn, Cr, Cd, Ni and Mn were measured. Correlation analysis and principal component analysis were used to identify the sources of these heavy metals in the soils. The results suggested that the contents of Cu, Pb, Zn, Cr, Cd, Ni and Mn in the study area were 32.9, 50.35, 69.92, 43.78, 0.54, 554.42 and 36.65 mg · kg⁻¹, respectively. All seven heavy metals tested exceeded the background values of soil in Inner Mongolia. The spatial distribution showed that, horizontally, the heavy metals were obviously enriched in the southwest, while vertically the heavy metal content was highest in the surface soil (0 to 5 cm), decreased with increasing depth, and tended to stabilize at depths beyond 20 cm. Source analysis showed that Cu, Zn, Cr and Mn might be influenced by the aluminum plant and the surrounding industrial activity. Pb and Cd might be mainly related to road transportation. Ni may be affected by agricultural activities and soil parent material together.

  14. Detection prospects for high energy neutrino sources from the anisotropic matter distribution in the local Universe

    NASA Astrophysics Data System (ADS)

    Mertsch, Philipp; Rameez, Mohamed; Tamborra, Irene

    2017-03-01

    Constraints on the number and luminosity of the sources of the cosmic neutrinos detected by IceCube have been set by targeted searches for point sources. We set complementary constraints by using the 2MASS Redshift Survey (2MRS) catalogue, which maps the matter distribution of the local Universe. Assuming that the distribution of the neutrino sources follows that of matter, we look for correlations between "warm" spots on the IceCube skymap and the 2MRS matter distribution. Through Monte Carlo simulations of the expected number of neutrino multiplets and careful modelling of the detector performance (including that of IceCube-Gen2), we demonstrate that sources with local density exceeding 10⁻⁶ Mpc⁻³ and neutrino luminosity Lν ≲ 10⁴² erg s⁻¹ (10⁴¹ erg s⁻¹) will be efficiently revealed by our method using IceCube (IceCube-Gen2). At low luminosities such as will be probed by IceCube-Gen2, the sensitivity of this analysis is superior to requiring statistically significant direct observation of a point source.

  15. Parameterized source term in the diffusion approximation for enhanced near-field modeling of collimated light

    NASA Astrophysics Data System (ADS)

    Jia, Mengyu; Wang, Shuang; Chen, Xueying; Gao, Feng; Zhao, Huijuan

    2016-03-01

    Most analytical methods for describing light propagation in turbid media exhibit low effectiveness in the near-field of a collimated source. Motivated by the Charge Simulation Method in electromagnetic theory as well as established discrete-source-based modeling, we have reported on an improved explicit model, referred to as the "Virtual Source" (VS) diffusion approximation (DA), which inherits the mathematical simplicity of the DA while considerably extending its validity in modeling near-field photon migration in low-albedo media. In this model, the collimated light in the standard DA is approximated as multiple isotropic point sources (VS) distributed along the incident direction. For performance enhancement, a fitting procedure between the calculated and realistic reflectances is adopted in the near-field to optimize the VS parameters (intensities and locations). To be practically applicable, an explicit 2VS-DA model is established based on closed-form derivations of the VS parameters for the typical ranges of the optical parameters. The proposed VS-DA model is validated by comparison with Monte Carlo simulations, and further introduced in the image reconstruction of a Laminar Optical Tomography system.
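    The superposition idea behind the VS-DA model can be sketched with the infinite-medium diffusion Green's function: the fluence of a collimated beam is approximated by summing a few isotropic point sources along the incident direction. The source depths, intensities and optical coefficients below are illustrative placeholders, not the fitted values from the paper.

```python
import math

def point_source_fluence(r, mu_a, mu_s_prime, power=1.0):
    """Diffusion-approximation fluence of an isotropic point source in an
    infinite homogeneous medium."""
    D = 1.0 / (3.0 * (mu_a + mu_s_prime))   # diffusion coefficient [mm]
    mu_eff = math.sqrt(mu_a / D)            # effective attenuation [1/mm]
    return power * math.exp(-mu_eff * r) / (4.0 * math.pi * D * r)

def vs_da_fluence(rho, depth_sources, mu_a=0.01, mu_s_prime=1.0):
    """Superpose isotropic 'virtual sources' placed along the incident
    direction; the (depth, intensity) pairs and optics are assumed values."""
    return sum(point_source_fluence(math.hypot(rho, z), mu_a, mu_s_prime, p)
               for z, p in depth_sources)

# Two virtual sources standing in for a collimated beam (2VS configuration).
two_vs = [(0.5, 0.7), (1.5, 0.3)]
print(f"{vs_da_fluence(2.0, two_vs):.4e}")
```

    In the actual 2VS-DA model the intensities and locations come from closed-form fits to near-field reflectance; here they are simply fixed to show the superposition step.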

  16. Speed Distributions of Merging X-Ray Sources During Chromospheric Evaporation in Solar Flares

    NASA Astrophysics Data System (ADS)

    Ning, Zongjun

    2011-10-01

    We explore the speed distributions of X-ray source motions after the start of chromospheric evaporation in two Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) flares. First, we make CLEAN images at 15 energy bands with a 12 second integration window; then, we outline a flaring loop geometry to cover the looptop and footpoint sources as much as possible. Consistent with the previous steps, we find converging motion of the double footpoint sources along the flaring loop in these two events. This motion is dependent on the energy band and time and is typically seen at 3 - 25 keV, indicating a chromospheric evaporation origin. The speed distributions at various energy bands are measured for the 10 September 2002 flare, which exhibits a separation-to-mergence motion pattern well correlated with the rising-to-decay phases at 50 - 100 keV.

  17. Source distribution of ocean microseisms and implications for time-dependent noise tomography

    NASA Astrophysics Data System (ADS)

    Kedar, Sharon

    2011-09-01

    A qualitative analysis of the ocean microseism source distribution observed in North America during fall and winter months was carried out. I review the theory of the origin of ocean microseisms and show that it can be used in conjunction with wave-wave interaction maps to quantify the source distribution anisotropy. It is demonstrated that microseism generation in the North Atlantic and in the North Pacific Oceans is inherently different. North Atlantic microseisms are generated predominantly in the deep ocean, while North Pacific microseisms are dominated by coastal reflections. In spite of these differences, both result from repeated ocean wave patterns that give rise to an anisotropic noise pattern, which cannot be randomized by time averaging. For time-varying ambient noise imaging, which aims to resolve changes of a fraction of a percent in the crust over short distances, the source anisotropy would introduce a relatively significant error that needs to be accounted for.

  18. Calculation of the neutron source distribution in the VENUS PWR Mockup Experiment

    SciTech Connect

    Williams, M.L.; Morakinyo, P.; Kam, F.B.K.; Leenders, L.; Minsart, G.; Fabry, A.

    1984-01-01

    The VENUS PWR Mockup Experiment is an important component of the Nuclear Regulatory Commission's program goal of benchmarking reactor pressure vessel (RPV) fluence calculations in order to determine the accuracy to which RPV fluence can be computed. Of particular concern in this experiment is the accuracy of the source calculation near the core-baffle interface, which is the important region for contributing to RPV fluence. Results indicate that the calculated neutron source distribution within the VENUS core agrees with the experimental measured values with an average error of less than 3%, except at the baffle corner, where the error is about 6%. Better agreement with the measured fission distribution was obtained with a detailed space-dependent cross-section weighting procedure for thermal cross sections near the core-baffle interface region. The maximum error introduced into the predicted RPV fluence due to source errors should be on the order of 5%.

  19. Distributed source model for the full-wave electromagnetic simulation of nonlinear terahertz generation.

    PubMed

    Fumeaux, Christophe; Lin, Hungyen; Serita, Kazunori; Withayachumnankul, Withawat; Kaufmann, Thomas; Tonouchi, Masayoshi; Abbott, Derek

    2012-07-30

    The process of terahertz generation through optical rectification in a nonlinear crystal is modeled using discretized equivalent current sources. The equivalent terahertz sources are distributed in the active volume and computed based on a separately modeled near-infrared pump beam. This approach can be used to define an appropriate excitation for full-wave electromagnetic numerical simulations of the generated terahertz radiation. This enables predictive modeling of the near-field interactions of the terahertz beam with micro-structured samples, e.g. in a near-field time-resolved microscopy system. The distributed source model is described in detail, and an implementation in a particular full-wave simulation tool is presented. The numerical results are then validated through a series of measurements on square apertures. The general principle can be applied to other nonlinear processes with possible implementation in any full-wave numerical electromagnetic solver.

  20. Operational source term estimation and ensemble prediction for the Grímsvötn 2011 event

    NASA Astrophysics Data System (ADS)

    Maurer, Christian; Arnold, Delia; Klonner, Robert; Wotawa, Gerhard

    2014-05-01

    The ESA-funded international project VAST (Volcanic Ash Strategic Initiative Team) focuses, among other goals, on realistic source term estimation for volcanic eruptions and on estimating the forecast uncertainty in the resulting atmospheric dispersion calculations, which partly derives from the forecast uncertainty in the meteorological input data. SEVIRI earth observation data, from which the total atmospheric column ash content can be estimated, serve as the basis for the source term estimation. In an operational environment, the already available EUMETCAST VOLE product may be used. Further, an a priori source term is needed, which can be coarsely estimated from information on previous eruptions and/or constrained with observations of the eruption column. The link between observations and the a priori source is established by runs of the atmospheric transport model FLEXPART for individual emission periods and a predefined number of vertical levels. By minimizing the differences between observations and model results, the so-called a posteriori source term can be derived for a certain time interval as a function of height. Such a result is shown for a first test case, the eruption of the Grímsvötn volcano on Iceland in May 2011. Once the dispersion calculations are as optimized as possible with regard to the source term, the uncertainty stemming from the forecast uncertainty of the numerical weather prediction model used is still present, adding to the unavoidable model errors. Since it is impossible to perform FLEXPART runs for all 50 members of the Integrated Forecasting System (IFS) of ECMWF due to computational (time and storage) constraints, the number of members is restricted to five (maximum seven) representative runs via cluster analysis. The approach follows Klonner (2012), where it was demonstrated that exclusive consideration of the wind components on a pressure level (e.g. 400 hPa) makes it possible to find clusters and
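    The member-reduction step described above, clustering the ensemble's wind fields and keeping the member nearest each cluster centroid, can be sketched as below. The synthetic "members" (five artificial flow regimes) and the choice of five clusters are illustrative assumptions, not the IFS data or the clustering variant of Klonner (2012).

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for 50 ensemble members: each row flattens the wind
# components on a pressure level (100 grid values per member), drawn from
# five synthetic flow regimes.
members = np.vstack([rng.normal(loc, 0.5, size=(10, 100))
                     for loc in (-6.0, -3.0, 0.0, 3.0, 6.0)])

def kmeans(data, k, iters=50):
    """Plain k-means; returns labels and, per cluster, the index of the
    member closest to the centroid (the 'representative run')."""
    centroids = data[rng.choice(len(data), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new = []
        for j in range(k):
            pts = data[labels == j]
            # keep the old centroid if a cluster happens to empty out
            new.append(pts.mean(axis=0) if len(pts) else centroids[j])
        centroids = np.array(new)
    reps = [int(np.linalg.norm(data - c, axis=1).argmin()) for c in centroids]
    return labels, reps

labels, representative_runs = kmeans(members, k=5)
print(len(representative_runs))
```

    Only the representative runs would then be propagated through FLEXPART, trading ensemble coverage for tractable compute.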

  1. Short-term spatial change in a volcanic tremor source during the 2011 Kirishima eruption

    NASA Astrophysics Data System (ADS)

    Matsumoto, Satoshi; Shimizu, Hiroshi; Matsushima, Takeshi; Uehira, Kenji; Yamashita, Yusuke; Nakamoto, Manami; Miyazaki, Masahiro; Chikura, Hiromi

    2013-04-01

    Volcanic tremors are indicators of magmatic behavior, which is strongly related to volcanic eruptions and activity. Detection of spatial and temporal variations in the source location is important for understanding the mechanism of volcanic eruptions. However, short-term temporal variations within a tremor event have not always been detected by seismic array observations around volcanoes. Here, we show that volcanic tremor sources were activated at both the top (i.e., the crater) and the lower end of the conduit, by analyzing seismograms from a dense seismic array 3 km from the Shinmoedake crater, Kirishima volcano, Japan. We observed changes in the seismic ray direction during a volcanic tremor sequence, and inferred two major sources of the tremor from the slowness vectors of the approaching waves. One was located in a shallow region beneath the Shinmoedake crater. The other was found in a direction N30°W from the array, pointing to a location above a pressure source. The fine spatial and temporal characteristics of volcanic tremors suggest an interaction between deep and shallow conduits.

  2. Long-Term Temporal Trends of Polychlorinated Biphenyls and Their Controlling Sources in China.

    PubMed

    Zhao, Shizhen; Breivik, Knut; Liu, Guorui; Zheng, Minghui; Jones, Kevin C; Sweetman, Andrew J

    2017-03-07

    Polychlorinated biphenyls (PCBs) are industrial organic contaminants identified as persistent, bioaccumulative, toxic (PBT), and subject to long-range transport (LRT) with global scale significance. This study focuses on a reconstruction and prediction for China of long-term emission trends of intentionally and unintentionally produced (UP) ∑7PCBs (UP-PCBs, from the manufacture of steel, cement and sinter iron) and their re-emissions from secondary sources (e.g., soils and vegetation) using a dynamic fate model (BETR-Global). Contemporary emission estimates combined with predictions from the multimedia fate model suggest that primary sources still dominate, although unintentional sources are predicted to become a main contributor from 2035 for PCB-28. Imported e-waste is predicted to play an increasing role until 2020-2030 on a national scale due to the decline of intentionally produced (IP) emissions. Hypothetical emission scenarios suggest that China could become a potential source to neighboring regions with a net output of ~0.4 t year⁻¹ by around 2050. However, future emission scenarios and hence model results will be dictated by the efficiency of control measures.

  3. Analysis of source term modeling for low-level radioactive waste performance assessments

    SciTech Connect

    Icenhour, A.S.

    1995-03-01

    Site-specific radiological performance assessments are required for the disposal of low-level radioactive waste (LLW) at both commercial and US Department of Energy facilities. This work explores source term modeling of LLW disposal facilities using two state-of-the-art computer codes, SOURCE1 and SOURCE2. An overview of the performance assessment methodology is presented, and the basic processes modeled in the SOURCE1 and SOURCE2 codes are described. Comparisons are made between the two advective models for a variety of radionuclides, transport parameters, and waste-disposal technologies. These comparisons show that, in general, the zero-order model predicts undecayed cumulative fractions leached that are slightly greater than or equal to those of the first-order model. For long-lived radionuclides, results from the two models eventually reach the same value. By contrast, for short-lived radionuclides, the zero-order model predicts a slightly higher undecayed cumulative fraction leached than does the first-order model. A new methodology, based on sensitivity and uncertainty analyses, is developed for predicting intruder scenarios. This method is demonstrated for ¹³⁷Cs in a tumulus-type disposal facility. The sensitivity and uncertainty analyses incorporate input-parameter uncertainty into the evaluation of a potential time of intrusion and the remaining radionuclide inventory. Finally, conclusions from this study are presented, and recommendations for continuing work are made.
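    The qualitative zero-order versus first-order comparison reported above can be reproduced with two one-line leach models; the rate constant is an arbitrary illustrative value, not one taken from SOURCE1 or SOURCE2.

```python
import math

def undecayed_cfl_zero_order(k0, t):
    """Zero-order model: constant leach rate until the inventory is exhausted."""
    return min(k0 * t, 1.0)

def undecayed_cfl_first_order(k1, t):
    """First-order model: leach rate proportional to the remaining inventory."""
    return 1.0 - math.exp(-k1 * t)

# With matched rate constants, the zero-order prediction bounds the
# first-order one from above, and both approach 1 at long times.
k = 0.05  # per year, hypothetical
for t in (1.0, 10.0, 100.0):
    z = undecayed_cfl_zero_order(k, t)
    f = undecayed_cfl_first_order(k, t)
    print(t, round(z, 4), round(f, 4))
```

    Since 1 − e^(−kt) ≤ min(kt, 1) for all t ≥ 0, the zero-order undecayed cumulative fraction leached is always greater than or equal to the first-order one, matching the trend the abstract describes.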

  4. Reconstructing source terms from atmospheric concentration measurements: Optimality analysis of an inversion technique

    NASA Astrophysics Data System (ADS)

    Turbelin, Grégory; Singh, Sarvesh Kumar; Issartel, Jean-Pierre

    2014-12-01

    In the event of an accidental or intentional contaminant release in the atmosphere, it is imperative, for managing emergency response, to diagnose the release parameters of the source from measured data. Reconstruction of the source information exploiting measured data is called an inverse problem. To solve such a problem, several techniques are currently being developed. The first part of this paper provides a detailed description of one of them, known as the renormalization method. This technique, proposed by Issartel (2005), has been derived using an approach different from that of standard inversion methods and gives a linear solution to the continuous Source Term Estimation (STE) problem. In the second part of this paper, the discrete counterpart of this method is presented. By using matrix notation, common in data assimilation and suitable for numerical computing, it is shown that the discrete renormalized solution belongs to a family of well-known inverse solutions (minimum weighted norm solutions), which can be computed by using the concept of generalized inverse operator. It is shown that, when the weight matrix satisfies the renormalization condition, this operator satisfies the criteria used in geophysics to define good inverses. Notably, by means of the Model Resolution Matrix (MRM) formalism, we demonstrate that the renormalized solution fulfils optimal properties for the localization of single point sources. Throughout the article, the main concepts are illustrated with data from a wind tunnel experiment conducted at the Environmental Flow Research Centre at the University of Surrey, UK.
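    A minimal numerical illustration of the minimum weighted norm solution discussed above (the discrete renormalized estimate computed via a generalized inverse). The random sensing matrix and diagonal weights are assumed stand-ins for real adjoint-model sensitivities and renormalization weights.

```python
import numpy as np

# Toy discrete STE problem: m = 4 detectors observing an n = 40 gridded
# source through an assumed sensing matrix H, with diagonal weights w.
rng = np.random.default_rng(1)
m, n = 4, 40
H = rng.random((m, n))
w = rng.random(n) + 0.5
W_inv = np.diag(1.0 / w)

s_true = np.zeros(n)
s_true[7] = 2.0                      # a single point source
mu = H @ s_true                      # noise-free measurement vector

# Minimum weighted norm (generalized inverse) estimate:
#   s_hat = W^-1 H^T (H W^-1 H^T)^-1 mu
s_hat = W_inv @ H.T @ np.linalg.solve(H @ W_inv @ H.T, mu)

print(bool(np.allclose(H @ s_hat, mu)))  # True: the data are reproduced exactly
```

    Among all sources consistent with the data, this estimate minimizes the weighted norm sᵀWs; the renormalization condition concerns the choice of W, which here is left arbitrary.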

  5. Modification to ORIGEN2 for generating N Reactor source terms. Volume 1

    SciTech Connect

    Schwarz, R.A.

    1997-04-01

    This report discusses work that has been done to upgrade the ORIGEN2 code cross sections to be compatible with the WIMS computer code data, and the resulting changes in the ORIGEN2 calculations. Details of the changes made to the ORIGEN2 computer code and the Radnuc code are discussed, along with additional work that should be done in the future to upgrade both ORIGEN2 and Radnuc. A detailed historical description of how source terms have been generated for N Reactor fuel stored in the K Basins is also given. The neutron source discussed in this description was generated by the WIMS computer code (Gubbins et al. 1982) because of known shortcomings in the ORIGEN2 (Croff 1980) cross sections. Another document includes a discussion of the ORIGEN2 cross sections.

  6. Source term experiment STEP-3 simulating a PWR severe station blackout

    SciTech Connect

    Simms, R.; Baker, L. Jr.; Ritzman, R.L.

    1987-05-21

    For a severe PWR accident that leads to a loss of feedwater to the steam generators, such as might occur in a station blackout, fission product decay heating will cause a water boiloff. Without effective cooling of the core, steam will begin to oxidize the Zircaloy cladding. The noble gases and volatile fission products, such as Cs and I, that are major contributors to the radiological source term, will be released from the damaged fuel shortly after cladding failure. The accident environment when these volatile fission products escape was simulated in STEP-3 using four fuel elements from the Belgonucleaire BR3 reactor. The primary objective was to examine the releases in samples collected as close to the test zone as possible. In this paper, an analysis of the temperatures and hydrogen generation is compared with the measurements. The analysis is needed to estimate releases and characterize conditions at the source for studies of fission product transport.

  7. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    SciTech Connect

    Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.; Peterson, Joshua L.

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.

  8. Hierarchical Bayesian estimates of distributed MEG sources: theoretical aspects and comparison of variational and MCMC methods.

    PubMed

    Nummenmaa, Aapo; Auranen, Toni; Hämäläinen, Matti S; Jääskeläinen, Iiro P; Lampinen, Jouko; Sams, Mikko; Vehtari, Aki

    2007-04-01

    Magnetoencephalography (MEG) provides millisecond-scale temporal resolution for noninvasive mapping of human brain functions, but the problem of reconstructing the underlying source currents from the extracranial data has no unique solution. Several distributed source estimation methods based on different prior assumptions have been suggested for the resolution of this inverse problem. Recently, a hierarchical Bayesian generalization of the traditional minimum norm estimate (MNE) was proposed, in which the variance of distributed current at each cortical location is considered as a random variable and estimated from the data using the variational Bayesian (VB) framework. Here, we introduce an alternative scheme for performing Bayesian inference in the context of this hierarchical model by using Markov chain Monte Carlo (MCMC) strategies. In principle, the MCMC method is capable of numerically representing the true posterior distribution of the currents whereas the VB approach is inherently approximative. We point out some potential problems related to hyperprior selection in the previous work and study some possible solutions. A hyperprior sensitivity analysis is then performed, and the structure of the posterior distribution as revealed by the MCMC method is investigated. We show that the structure of the true posterior is rather complex with multiple modes corresponding to different possible solutions to the source reconstruction problem. We compare the results from the VB algorithm to those obtained from the MCMC simulation under different hyperparameter settings. The difficulties in using a unimodal variational distribution as a proxy for a truly multimodal distribution are also discussed. Simulated MEG data with realistic sensor and source geometries are used in performing the analyses.
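    The VB-versus-MCMC point above, that a Markov chain can represent a multimodal posterior which a unimodal variational proxy cannot, can be illustrated with a toy random-walk Metropolis sampler on a two-mode target. The target below is an assumed stand-in, not the hierarchical MEG model itself.

```python
import math
import random

def log_post(x):
    """A deliberately bimodal 1-D 'posterior' (equal mixture of two unit
    Gaussians centered at +3 and -3)."""
    return math.log(0.5 * math.exp(-0.5 * (x - 3.0) ** 2) +
                    0.5 * math.exp(-0.5 * (x + 3.0) ** 2))

def metropolis(n_steps, step_sd=2.5, seed=0):
    """Random-walk Metropolis sampler targeting log_post."""
    rng = random.Random(seed)
    x, chain = 0.0, []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step_sd)
        if math.log(rng.random()) < log_post(proposal) - log_post(x):
            x = proposal
        chain.append(x)
    return chain

chain = metropolis(50_000)
left_fraction = sum(1 for x in chain if x < 0.0) / len(chain)
print(round(left_fraction, 2))  # near 0.5: the chain visits both modes
```

    A unimodal variational fit to this target would concentrate on one mode (or bridge them poorly), which is the difficulty with using a unimodal variational distribution as a proxy for a multimodal posterior.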

  9. Inverse modelling-based reconstruction of the Chernobyl source term available for long-range transport

    NASA Astrophysics Data System (ADS)

    Davoine, X.; Bocquet, M.

    2007-03-01

    The reconstruction of the Chernobyl accident source term has been previously carried out using core inventories, but also back and forth confrontations between model simulations and activity concentration or deposited activity measurements. The approach presented in this paper is based on inverse modelling techniques. It relies both on the activity concentration measurements and on the adjoint of a chemistry-transport model. The location of the release is assumed to be known, and one is looking for a source term available for long-range transport that depends both on time and altitude. The method relies on the maximum entropy on the mean principle and exploits source positivity. The inversion results are mainly sensitive to two tuning parameters, a mass scale and the scale of the prior errors in the inversion. To overcome this hardship, we resort to the statistical L-curve method to estimate balanced values for these two parameters. Once this is done, many of the retrieved features of the source are robust within a reasonable range of parameter values. Our results favour the acknowledged three-step scenario, with a strong initial release (26 to 27 April), followed by a weak emission period of four days (28 April-1 May) and again a release, longer but less intense than the initial one (2 May-6 May). The retrieved quantities of iodine-131, caesium-134 and caesium-137 that have been released are in good agreement with the latest reported estimations. Yet, a stronger apportionment of the total released activity is ascribed to the first period and less to the third one. Finer chronological details are obtained, such as a sequence of eruptive episodes in the first two days, likely related to the modulation of the boundary layer diurnal cycle. In addition, the first two-day release surges are found to have effectively reached an altitude up to the top of the domain (5000 m).
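    The L-curve tuning step described above can be sketched on a toy ill-posed inversion: scan the regularization parameter, record residual and solution norms, and pick the corner of the curve in log-log coordinates. The forward operator, noise level and corner criterion below are illustrative assumptions, not those of the Chernobyl reconstruction (which uses a maximum-entropy, positivity-preserving formulation rather than plain Tikhonov regularization).

```python
import numpy as np

# Toy ill-posed problem: a smoothing forward operator G, a smooth true
# source, and slightly noisy data. All values are illustrative.
rng = np.random.default_rng(3)
n = 30
G = np.tril(np.ones((n, n))) / n                      # cumulative-average operator
s_true = np.exp(-0.5 * ((np.arange(n) - 12) / 3.0) ** 2)
d = G @ s_true + rng.normal(0.0, 1e-3, size=n)

# Scan the regularization parameter and record both norms.
lambdas = np.logspace(-6, 1, 40)
res, sol = [], []
for lam in lambdas:
    s = np.linalg.solve(G.T @ G + lam * np.eye(n), G.T @ d)
    res.append(float(np.linalg.norm(G @ s - d)))
    sol.append(float(np.linalg.norm(s)))

# Crude L-curve corner: point of maximum curvature in log-log coordinates.
x, y = np.log(res), np.log(sol)
dx, dy = np.gradient(x), np.gradient(y)
ddx, ddy = np.gradient(dx), np.gradient(dy)
curvature = np.abs(dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5
lam_star = float(lambdas[int(np.argmax(curvature))])
print(lam_star)
```

    The corner balances data misfit against solution size, which is the same trade-off the paper resolves for its mass scale and prior-error scale.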

  11. Source term evaluation during seismic events in the Paducah Gaseous Diffusion Plant

    SciTech Connect

    Kim, S.H.; Chen, N.C.J.; Schmidt, R.W.; Taleyarkhan, R.P.

    1996-12-30

    The 00 buildings are expected to collapse (per guidance from structure evaluation) during a seismic event in which the acceleration level exceeds 0.15g. All roof beams may slip off their supports and collapse. Equipment may slip off supports and fall onto the floor. The cell floor is also expected to collapse due to structural instability and distortion caused by excessive acceleration forces. Following structure collapse, expansion joints in the process piping and joints between the piping and equipment are expected to fail. Preliminary analysis showed that the converters are likely to remain intact. The UF₆ gas released from the break will rapidly react with moisture in the air to produce UO₂F₂ and HF, with an exothermic energy release of ~0.32 MJ/kg of UF₆ reacted. Depending on the degree of mixing between the UF₆ gas, its reaction products, air and freon (R-114), a strong buoyancy force may arise to disperse the UO₂F₂ aerosol particles, which are subject to gravitational settling. Such a chemical reaction will also occur inside the converters. A substantial amount of UF₆ is expected to stagnate at the bottom of the converters. At the interface between this stagnant UF₆ and air, UF₆ gas will diffuse into the air, undergo the chemical reaction with moisture there, and eventually be released through the break. Furthermore, a lubricant oil fire in the building, if it occurs, will enhance the UF₆ release into the atmosphere. The purpose of this study is to evaluate the source term (UO₂F₂ and HF) during such a seismic event. This study takes a multiple-step approach as follows: (1) Source term evaluation at the break due to mixing between UF₆ and air along with thermal buoyancy induced by chemical reaction energy, (2) Evaluation of additional source term from the converters in which a substantial amount of UF₆ vapor remains, and (3) Source term evaluation with lubricant oil
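    The hydrolysis reaction in the abstract, UF6 + 2 H2O → UO2F2 + 4 HF, fixes the product masses per kilogram of UF6 reacted. A short stoichiometric check, using standard atomic weights and taking the ~0.32 MJ/kg figure from the abstract:

```python
# Molar masses in g/mol (standard atomic weights).
M = {"U": 238.03, "F": 19.00, "O": 16.00, "H": 1.008}

m_uf6 = M["U"] + 6 * M["F"]
m_h2o = 2 * M["H"] + M["O"]
m_uo2f2 = M["U"] + 2 * M["O"] + 2 * M["F"]
m_hf = M["H"] + M["F"]

kg_uf6 = 1.0
mol = 1000.0 * kg_uf6 / m_uf6            # moles of UF6 per kg reacted
kg_uo2f2 = mol * m_uo2f2 / 1000.0        # aerosol-forming product
kg_hf = mol * 4 * m_hf / 1000.0
heat_mj = 0.32 * kg_uf6                  # exothermic release, per the abstract

print(round(kg_uo2f2, 3), round(kg_hf, 3), heat_mj)
```

    Roughly 0.88 kg of UO2F2 aerosol and 0.23 kg of HF form per kilogram of UF6 reacted, and the product masses balance the UF6 plus the water consumed.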

  12. Source term analysis for a criticality accident in metal production line glove boxes

    SciTech Connect

    Nguyen, D.H.

    1991-06-01

    A recent development in criticality accident analysis is the deterministic calculation of the transport of fission products and actinides through the barriers of the physical facility. Knowledge of the redistribution of these materials inside the facility will help determine reentry and clean-up procedures. The amount of radioactive material released to the environment is the source term for dispersion calculations. We have used an integrated computer model to determine the release of fission products to the environment from a hypothetical criticality event in a glove box of the metal production line (MPL) at the Lawrence Livermore National Laboratory (LLNL).

  13. An Exact Form of Lilley's Equation with a Velocity Quadrupole/Temperature Dipole Source Term

    NASA Technical Reports Server (NTRS)

    Goldstein, Marvin E.

    2001-01-01

    There have been several attempts to introduce approximations into the exact form of Lilley's equation in order to express the source term as the sum of a quadrupole whose strength is quadratic in the fluctuating velocities and a dipole whose strength is proportional to the temperature fluctuations. The purpose of this note is to show that it is possible to choose the dependent (i.e., the pressure) variable so that this type of result can be derived directly from the Euler equations without introducing any additional approximations.

  14. EXPERIENCES FROM THE SOURCE-TERM ANALYSIS OF A LOW AND INTERMEDIATE LEVEL RADWASTE DISPOSAL FACILITY

    SciTech Connect

    Park,Jin Beak; Park, Joo-Wan; Lee, Eun-Young; Kim, Chang-Lak

    2003-02-27

    Enhancement of a computer code SAGE for evaluation of the Korean concept for a LILW waste disposal facility is discussed. Several features of source term analysis are embedded into SAGE to analyze: (1) effects of degradation mode of an engineered barrier, (2) effects of dispersion phenomena in the unsaturated zone and (3) effects of time dependent sorption coefficient in the unsaturated zone. IAEA's Vault Safety Case (VSC) approach is used to demonstrate the ability of this assessment code. Results of MASCOT are used for comparison purposes. These enhancements of the safety assessment code, SAGE, can contribute to realistic evaluation of the Korean concept of the LILW disposal project in the near future.

  15. ORIGEN-ARP, A Fast and Easy-to-Use Source Term Generation Tool

    SciTech Connect

    Bowman, S.M.; Hermann, O.W.; Leal, L.C.; Parks, C.V.

    1999-10-17

    ORIGEN-ARP is a new SCALE analytical sequence for spent fuel characterization and source term generation that serves as a faster alternative to the SAS2H sequence by using the Automatic Rapid Processing (ARP) methodology for generating problem-dependent ORIGEN-S cross-section libraries. ORIGEN-ARP provides an easy-to-use menu-driven input processor. This new sequence is two orders of magnitude faster than SAS2H while conserving the rigor and accuracy of the SAS2H methodology. ORIGEN-ARP has been validated against pressurized water reactor (PWR) and boiling water reactor (BWR) spent fuel chemical assay data.

  16. SOURCE TERM REMEDIATION & DEMOLITION STRATEGY FOR THE HANFORD K-AREA SPENT FUEL BASINS

    SciTech Connect

    CHRONISTER, G.B.

    2006-03-23

    This paper discusses the technologies applied at Hanford's K-Basins to mitigate risk and reduce the source term in preparing the basins for deactivation and demolition. These project technologies/strategies (in various stages of implementation) are sequential in nature and are the basis for preparing to dispose of the K Basins--two highly contaminated concrete basins at the Hanford Site in southeastern Washington State. A large collection of spent nuclear fuel stored for many years underwater at the K Basins has been removed to stable, dry, safe storage. Remediation activities are underway to prepare the basin structures for de-inventory, decontamination, and disposal.

  17. Basic repository source term and data sheet report: Deaf Smith County

    SciTech Connect

    Not Available

    1987-01-01

    This report is one of a series describing studies undertaken in support of the US Department of Energy Civilian Radioactive Waste Management (CRWM) Program. This study contains the derivation of values for environmental source terms and resources consumed for a CRWM repository. Estimates include heavy construction equipment; support equipment; shaft-sinking equipment; transportation equipment; and consumption of fuel, water, electricity, and natural gas. Data are presented for construction and operation at an assumed site in Deaf Smith County, Texas. 2 refs., 6 tabs.

  18. Design parameters and source terms: Volume 1, Design parameters: Revision 0

    SciTech Connect

    Not Available

    1987-09-01

    The Design Parameters and Source Terms Document was prepared at DOE request to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report to the level of the Site Characterization Plan - Conceptual Design Report, SCP-CDR. The previous unpublished SCC Study identified the data needs for the Environmental Assessment effort for seven possible salt repository sites.

  19. Numerical Dissipation and Wrong Propagation Speed of Discontinuities for Stiff Source Terms

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Kotov, D. V.; Sjoegreen, B.

    2012-01-01

    In compressible turbulent combustion/nonequilibrium flows, the construction of numerical schemes for (a) stable and accurate simulation of turbulence with strong shocks and (b) obtaining the correct propagation speed of discontinuities for stiff reacting terms on coarse grids shares one important ingredient - minimization of numerical dissipation while maintaining numerical stability. Here, "coarse grids" means the standard mesh density required for accurate simulation of typical non-reacting flows. This dual requirement to achieve both numerical stability and accuracy with zero or minimal numerical dissipation is most often conflicting for existing schemes that were designed for non-reacting flows. The goal of this paper is to relate the numerical dissipation inherent in a selected set of high-order shock-capturing schemes to the onset of wrong propagation speeds of discontinuities as a function of the stiffness of the source term and the grid spacing.
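
    The pathology can be reproduced with a toy model (my own minimal sketch, not one of the paper's schemes): first-order upwind advection at unit speed with an infinitely stiff bistable source applied by operator splitting, which in the stiff limit degenerates to projecting each cell onto the nearest stable equilibrium. On a coarse grid the captured front then moves exactly one cell per time step rather than at the physical speed:

```python
import numpy as np

# Wrong-propagation-speed demo: upwind advection (speed a = 1) plus an
# infinitely stiff source treated by splitting as a projection u -> {0, 1}.
def front_index(u):
    return int(np.argmax(u < 0.5))  # first cell to the right of the front

N, cfl, steps = 100, 0.8, 50
dx = 1.0 / N
dt = cfl * dx
u = np.where(np.arange(N) < 25, 1.0, 0.0)  # front initially at cell 25
for _ in range(steps):
    u[1:] -= cfl * (u[1:] - u[:-1])        # upwind advection step
    u = np.where(u > 0.5, 1.0, 0.0)        # stiff source: project to equilibrium
num_front = front_index(u)                 # advances one cell per step -> 75
exact_front = 25 + round(steps * dt / dx)  # a*t/dx = 40 cells -> 65
```

    With CFL = 0.8 the numerical front lands at cell 75 after 50 steps while the exact solution puts it at cell 65: a spurious speed of 1/CFL times the true one, driven purely by numerical smearing feeding the stiff source.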

  20. PHENOstruct: Prediction of human phenotype ontology terms using heterogeneous data sources

    PubMed Central

    Kahanda, Indika; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa

    2015-01-01

    The human phenotype ontology (HPO) was recently developed as a standardized vocabulary for describing the phenotypic abnormalities associated with human diseases. At present, only a small fraction of human protein-coding genes have HPO annotations. However, researchers believe that a large portion of currently unannotated genes are related to disease phenotypes. Therefore, it is important to predict gene-HPO term associations using accurate computational methods. In this work, we demonstrate the performance advantage of the structured SVM approach, which was shown to be highly effective for Gene Ontology term prediction, over several baseline methods. Furthermore, we highlight a collection of informative data sources suitable for the problem of predicting gene-HPO associations, including large-scale literature mining data. PMID:26834980

  1. Distribution, sources, and potential toxicological significance of PAHs in drinking water sources within the Pearl River Delta.

    PubMed

    An, Taicheng; Qiao, Meng; Li, Guiying; Sun, Hongwei; Zeng, Xiangying; Fu, Jiamo

    2011-05-01

    The Pearl River Delta (PRD) region is one of the most densely populated areas in China. The safety of its drinking source water is essential to human health. Polycyclic aromatic hydrocarbons (PAHs) have attracted attention from the scientific community and the general public due to their toxicity and wide distribution in the global environment. In this work, PAH pollution levels in the drinking source water of nine main cities within the PRD were investigated. ∑15 PAHs concentrations during the wet season varied from 32.0 to 754.8 ng L(-1) in the dissolved phase, and from 13.4 to 3017.8 ng L(-1) in the particulate phase. During the dry season, dissolved PAHs ranged from 48.1 to 113.6 ng L(-1), and particulate PAHs from 8.6 to 69.6 ng L(-1). Overall, ∑15 PAHs concentrations were extremely high at the XC and ZHQ stations during the wet seasons of 2008 and 2009. At most sites, PAHs originated from mixed sources. Hazard ratios based on non-cancer and cancer risks were far higher at XC than at the other sites during the wet season, though all remained well below 1. Nevertheless, risks caused by the combined toxicity of the ∑15 PAHs and other organics should be seriously considered. PAH toxic equivalent quantities ranged from 0.508 to 177.077 ng L(-1).
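
    The toxic equivalent quantity reported above is a benzo[a]pyrene-normalized sum. A minimal sketch (the TEF values below are the commonly cited Nisbet-LaGoy factors for a few PAHs and schemes vary; any concentrations passed in are hypothetical, not the paper's data):

```python
# TEQ = sum_i C_i * TEF_i, with TEFs expressing each PAH's toxicity
# relative to benzo[a]pyrene (BaP = 1 by definition).
TEF = {"BaP": 1.0, "DahA": 5.0, "BaA": 0.1, "BbF": 0.1, "Chr": 0.01}

def teq(conc_ng_per_l):
    """BaP-equivalent concentration (ng/L) of a PAH mixture."""
    return sum(c * TEF[p] for p, c in conc_ng_per_l.items())
```
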

  2. GO::TermFinder--open source software for accessing Gene Ontology information and finding significantly enriched Gene Ontology terms associated with a list of genes.

    PubMed

    Boyle, Elizabeth I; Weng, Shuai; Gollub, Jeremy; Jin, Heng; Botstein, David; Cherry, J Michael; Sherlock, Gavin

    2004-12-12

    GO::TermFinder comprises a set of object-oriented Perl modules for accessing Gene Ontology (GO) information and evaluating and visualizing the collective annotation of a list of genes to GO terms. It can be used to draw conclusions from microarray and other biological data, calculating the statistical significance of each annotation. GO::TermFinder can be used on any system on which Perl can be run, either as a command line application, in single or batch mode, or as a web-based CGI script. The full source code and documentation for GO::TermFinder are freely available from http://search.cpan.org/dist/GO-TermFinder/.
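
    The significance calculation at the core of such enrichment tools is a one-sided hypergeometric test (GO::TermFinder also applies multiple-hypothesis correction; this sketch, with names of my choosing, shows only the raw p-value): given n genes of interest, of which k are annotated to a term covering K of the N background genes:

```python
from math import comb

# P(X >= k) for X hypergeometric: the chance of drawing at least k annotated
# genes in a sample of n, from a background of N genes of which K are annotated.
def enrichment_pvalue(k, n, K, N):
    return sum(
        comb(K, i) * comb(N - K, n - i)
        for i in range(k, min(n, K) + 1)
    ) / comb(N, n)
```
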

  3. The integration of renewable energy sources into electric power distribution systems. Volume 1: National assessment

    SciTech Connect

    Barnes, P.R.; Van Dyke, J.W.; Tesche, F.M.; Zaininger, H.W.

    1994-06-01

    Renewable energy technologies such as photovoltaic, solar thermal electricity, and wind turbine power are environmentally beneficial sources of electric power generation. The integration of renewable energy sources into electric power distribution systems can provide additional economic benefits because of a reduction in the losses associated with transmission and distribution lines. Benefits associated with the deferment of transmission and distribution investment may also be possible for cases where there is a high correlation between peak circuit load and renewable energy electric generation, such as photovoltaic systems in the Southwest. Case studies were conducted with actual power distribution system data for seven electric utilities with the participation of those utilities. Integrating renewable energy systems into electric power distribution systems increased the value of the benefits by about 20 to 55% above central station benefits in the national regional assessment. In the case studies presented in Vol. II, the range was larger: from a few percent to nearly 80% for a case where costly investments were deferred. In general, additional savings of at least 10 to 20% can be expected by integrating at the distribution level. Wind energy systems were found to be economical in good wind resource regions, whereas photovoltaic system costs are presently about a factor of 2.5 too high even under the most favorable conditions.

  4. Wearable Sensor Localization Considering Mixed Distributed Sources in Health Monitoring Systems.

    PubMed

    Wan, Liangtian; Han, Guangjie; Wang, Hao; Shu, Lei; Feng, Nanxing; Peng, Bao

    2016-03-12

    In health monitoring systems, the base station (BS) and the wearable sensors communicate with each other to construct a virtual multiple-input multiple-output (VMIMO) system. In real applications, the signal the BS receives is a distributed source because of scattering, reflection, diffraction and refraction in the propagation path. In this paper, a 2D direction-of-arrival (DOA) estimation algorithm for incoherently-distributed (ID) and coherently-distributed (CD) sources is proposed based on multiple VMIMO systems. ID and CD sources are separated through the second-order blind identification (SOBI) algorithm. The traditional algorithm based on estimating signal parameters via the rotational invariance technique (ESPRIT) is valid only for one-dimensional (1D) DOA estimation for the ID source. By constructing the signal subspace, two rotational invariance relationships are constructed. Then, we extend ESPRIT to estimate 2D DOAs for ID sources. For DOA estimation of CD sources, two rotational invariance relationships are constructed based on the application of generalized steering vectors (GSVs). Then, the ESPRIT-based algorithm is used to estimate the eigenvalues of the two rotational invariance matrices, which contain the angular parameters. The expressions of azimuth and elevation for ID and CD sources have closed forms, which means that spectrum peak searching is avoided. Therefore, compared to traditional 2D DOA estimation algorithms, the proposed algorithm has significantly lower computational complexity. The intersection point of two rays, which come from two different directions measured by two uniform rectangular arrays (URAs), can be regarded as the location of the biosensor (wearable sensor). Three BSs adopting the smart antenna (SA) technique cooperate with each other to locate the wearable sensors using the angulation positioning method. Simulation results demonstrate the effectiveness of the proposed algorithm.
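
    The rotational-invariance idea that the paper extends to 2D and to distributed sources can be seen in a minimal 1D ESPRIT sketch (my own illustration: one point source at 30 degrees on an 8-element half-wavelength ULA; the geometry, SNR and snapshot count are assumptions, not the paper's setup):

```python
import numpy as np

# 1D ESPRIT: the signal subspace of the two overlapping subarrays differs by a
# rotation whose eigenvalue phase encodes the direction of arrival.
rng = np.random.default_rng(0)
M, T, theta = 8, 200, np.deg2rad(30.0)
a = np.exp(1j * np.pi * np.arange(M) * np.sin(theta))    # ULA steering vector
s = rng.standard_normal(T) + 1j * rng.standard_normal(T)  # source signal
X = np.outer(a, s) + 0.01 * (rng.standard_normal((M, T))
                             + 1j * rng.standard_normal((M, T)))
R = X @ X.conj().T / T                                    # sample covariance
_, vecs = np.linalg.eigh(R)
Es = vecs[:, -1:]                                         # dominant (signal) subspace
psi = np.linalg.lstsq(Es[:-1], Es[1:], rcond=None)[0]     # rotational invariance
doa_deg = np.rad2deg(np.arcsin(np.angle(np.linalg.eigvals(psi))[0] / np.pi))
```

    No spectrum search is needed: the angle comes in closed form from an eigenvalue phase, which is the source of the low complexity claimed above.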

  5. Wearable Sensor Localization Considering Mixed Distributed Sources in Health Monitoring Systems

    PubMed Central

    Wan, Liangtian; Han, Guangjie; Wang, Hao; Shu, Lei; Feng, Nanxing; Peng, Bao

    2016-01-01

    In health monitoring systems, the base station (BS) and the wearable sensors communicate with each other to construct a virtual multiple-input multiple-output (VMIMO) system. In real applications, the signal the BS receives is a distributed source because of scattering, reflection, diffraction and refraction in the propagation path. In this paper, a 2D direction-of-arrival (DOA) estimation algorithm for incoherently-distributed (ID) and coherently-distributed (CD) sources is proposed based on multiple VMIMO systems. ID and CD sources are separated through the second-order blind identification (SOBI) algorithm. The traditional algorithm based on estimating signal parameters via the rotational invariance technique (ESPRIT) is valid only for one-dimensional (1D) DOA estimation for the ID source. By constructing the signal subspace, two rotational invariance relationships are constructed. Then, we extend ESPRIT to estimate 2D DOAs for ID sources. For DOA estimation of CD sources, two rotational invariance relationships are constructed based on the application of generalized steering vectors (GSVs). Then, the ESPRIT-based algorithm is used to estimate the eigenvalues of the two rotational invariance matrices, which contain the angular parameters. The expressions of azimuth and elevation for ID and CD sources have closed forms, which means that spectrum peak searching is avoided. Therefore, compared to traditional 2D DOA estimation algorithms, the proposed algorithm has significantly lower computational complexity. The intersection point of two rays, which come from two different directions measured by two uniform rectangular arrays (URAs), can be regarded as the location of the biosensor (wearable sensor). Three BSs adopting the smart antenna (SA) technique cooperate with each other to locate the wearable sensors using the angulation positioning method. Simulation results demonstrate the effectiveness of the proposed algorithm. PMID:26985896

  6. Higher-order-in-spin interaction Hamiltonians for binary black holes from source terms of Kerr geometry in approximate ADM coordinates

    SciTech Connect

    Hergt, Steven; Schaefer, Gerhard

    2008-05-15

    The Kerr metric outside the ergosphere is transformed into Arnowitt-Deser-Misner coordinates up to the orders 1/r{sup 4} and a{sup 2}, respectively, in radial coordinate r and reduced angular momentum variable a, starting from the Kerr solution in quasi-isotropic as well as harmonic coordinates. The distributional source terms for the approximate solution are calculated. To leading order in linear momenta, higher-order-in-spin interaction Hamiltonians for black hole binaries are derived.

  7. Interpreting the neutron's electric form factor: Rest frame charge distribution or foldy term?

    SciTech Connect

    Nathan Isgur

    1998-12-01

    The neutron's electric form factor contains vital information on nucleon structure, but its interpretation within many models has been obscured by relativistic effects. The author demonstrates that, to leading order in the relativistic expansion of a constituent quark model, the Foldy term cancels exactly against a contribution to the Dirac form factor F{sub 1} to leave intact the naive interpretation of G{sup n}{sub E} as arising from the neutron's rest frame charge distribution.
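
    For reference, in the standard Sachs convention (notation assumed here, not taken from the abstract; units with h-bar = c = 1):

```latex
G_E^{n}(Q^2) \;=\; F_1^{n}(Q^2) \;-\; \tau\, F_2^{n}(Q^2),
\qquad \tau = \frac{Q^2}{4M_n^2},
```

    so that expanding about Q^2 = 0 splits the squared charge radius as

```latex
\langle r_n^2 \rangle \;=\; -\,6\,\frac{\mathrm{d}G_E^{n}}{\mathrm{d}Q^2}\bigg|_{Q^2=0}
\;=\; \langle r^2 \rangle_{F_1} \;+\; \frac{3\kappa_n}{2M_n^2},
```

    where the last piece is the Foldy term, numerically about -0.126 fm^2 for kappa_n = -1.913. The cancellation described in the abstract is between this term and a matching relativistic contribution to F{sub 1}, leaving G{sup n}{sub E} to reflect the rest-frame charge distribution.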

  8. Long-term changes of tree species composition and distribution in Korean mountain forests

    NASA Astrophysics Data System (ADS)

    Lee, Boknam; Lee, Hoontaek; Cho, Sunhee; Yoon, Jongguk; Park, Jongyoung; Kim, Hyun Seok

    2017-04-01

    Long-term changes in the abundance and distribution of tree species in the temperate forests of South Korea remain poorly understood. We investigated how tree species composition and stand distribution changed across temperate mountainous forests using species composition and DBH data collected over the past 15 years (1998-2012) across 130 permanent 0.1-ha forest plots in the Jiri and Baegun mountains in South Korea. The overall net change of tree communities over this period was positive in terms of stand density, richness, diversity, and evenness. At the species level, the change in relative species composition was led by intermediate and shade-tolerant species, such as Quercus mongolica, Carpinus laxiflora, Quercus serrata, Quercus variabilis, Styrax japonicus, Lindera erythrocarpa, and Pinus densiflora, and was categorized into five species communities, representing gradual increase or decrease, establishment, extinction, or fluctuation of species populations. At the community level, the change in species composition showed consistent and directional patterns of increase in the annual rate of change of mean species traits, including species density, pole growth rate, adult growth rate, and adult stature. Based on additive models, the distribution of species diversity was significantly related to topographic variables including elevation, latitude, longitude, slope, topographic wetness index, and curvature, with elevation the most significant driver, followed by latitude and longitude. The change in the distribution of species diversity, however, was significantly influenced only by latitude and longitude. This is the first study to reveal the long-term dynamics of change in tree species composition and distribution, which are important for broadening our understanding of temperate mountainous forest ecosystems in South Korea.
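
    The richness, diversity, and evenness figures reported by such plot surveys are typically the Shannon index and Pielou's evenness (the abstract does not name its formulas; those are assumed here, and the function name is mine):

```python
from math import log

# Per-plot indices: richness = number of species present,
# Shannon H = -sum p_i ln p_i, Pielou J = H / ln(richness).
def diversity_indices(counts):
    """counts: stems per species on one plot (zeros allowed)."""
    n = sum(counts)
    p = [c / n for c in counts if c > 0]
    richness = len(p)
    shannon_h = -sum(pi * log(pi) for pi in p)
    evenness = shannon_h / log(richness) if richness > 1 else 0.0
    return richness, shannon_h, evenness
```
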

  9. Regulatory Technology Development Plan Sodium Fast Reactor. Mechanistic Source Term Development

    SciTech Connect

    Grabaskas, David S.; Brunett, Acacia Joann; Bucknor, Matthew D.; Sienicki, James J.; Sofu, Tanju

    2015-02-28

    Construction and operation of a nuclear power installation in the U.S. requires licensing by the U.S. Nuclear Regulatory Commission (NRC). A vital part of this licensing process and integrated safety assessment entails the analysis of a source term (or source terms) that represents the release of radionuclides during normal operation and accident sequences. Historically, nuclear plant source term analyses have utilized deterministic, bounding assessments of the radionuclides released to the environment. Significant advancements in technical capabilities and the knowledge state have enabled the development of more realistic analyses such that a mechanistic source term (MST) assessment is now expected to be a requirement of advanced reactor licensing. This report focuses on the state of development of an MST for a sodium fast reactor (SFR), with the intent of aiding in the process of MST definition by qualitatively identifying and characterizing the major sources and transport processes of radionuclides. Due to common design characteristics among current U.S. SFR vendor designs, a metal-fuel, pool-type SFR has been selected as the reference design for this work, with all phenomenological discussions geared toward this specific reactor configuration. This work also aims to identify the key gaps and uncertainties in the current knowledge state that must be addressed for SFR MST development. It is anticipated that this knowledge state assessment can enable the coordination of technology and analysis tool development discussions such that any knowledge gaps may be addressed. Sources of radionuclides considered in this report include releases originating both in-vessel and ex-vessel, including in-core fuel, primary sodium and cover gas cleanup systems, and spent fuel movement and handling. Transport phenomena affecting various release groups are identified and qualitatively discussed, including fuel pin and primary coolant retention, and behavior in the cover gas and

  10. Shared and Distributed Memory Parallel Security Analysis of Large-Scale Source Code and Binary Applications

    SciTech Connect

    Quinlan, D; Barany, G; Panas, T

    2007-08-30

    Many forms of security analysis on large scale applications can be substantially automated but the size and complexity can exceed the time and memory available on conventional desktop computers. Most commercial tools are understandably focused on such conventional desktop resources. This paper presents research work on the parallelization of security analysis of both source code and binaries within our Compass tool, which is implemented using the ROSE source-to-source open compiler infrastructure. We have focused on both shared and distributed memory parallelization of the evaluation of rules implemented as checkers for a wide range of secure programming rules, applicable to desktop machines, networks of workstations and dedicated clusters. While Compass as a tool focuses on source code analysis and reports violations of an extensible set of rules, the binary analysis work uses the exact same infrastructure but is less well developed into an equivalent final tool.

  11. Multi-Sensor Integration to Map Odor Distribution for the Detection of Chemical Sources

    PubMed Central

    Gao, Xiang; Acar, Levent

    2016-01-01

    This paper addresses the problem of mapping odor distribution derived from a chemical source using multi-sensor integration and reasoning system design. Odor localization is the problem of finding the source of an odor or other volatile chemical. Most localization methods require a mobile vehicle to follow an odor plume along its entire path, which is time consuming and may be especially difficult in a cluttered environment. To solve both of the above challenges, this paper proposes a novel algorithm that combines data from odor and anemometer sensors, and combines sensors' data at different positions. Initially, a multi-sensor integration method, together with the path of airflow, was used to map the pattern of odor particle movement. Then, more sensors are introduced at specific regions to determine the probable location of the odor source. Finally, the results of odor source location simulation and a real experiment are presented. PMID:27384568

  12. Multi-Sensor Integration to Map Odor Distribution for the Detection of Chemical Sources.

    PubMed

    Gao, Xiang; Acar, Levent

    2016-07-04

    This paper addresses the problem of mapping odor distribution derived from a chemical source using multi-sensor integration and reasoning system design. Odor localization is the problem of finding the source of an odor or other volatile chemical. Most localization methods require a mobile vehicle to follow an odor plume along its entire path, which is time consuming and may be especially difficult in a cluttered environment. To solve both of the above challenges, this paper proposes a novel algorithm that combines data from odor and anemometer sensors, and combines sensors' data at different positions. Initially, a multi-sensor integration method, together with the path of airflow, was used to map the pattern of odor particle movement. Then, more sensors are introduced at specific regions to determine the probable location of the odor source. Finally, the results of odor source location simulation and a real experiment are presented.

  13. Evaluation of the beta energy spectrum from a distributed uranium mill tailings source

    SciTech Connect

    Reif, R.H.; Martz, D.E.; Carlson, D.S.; Turner, J.B. )

    1993-10-01

    The beta energy spectra from uranium mill tailings, {sup 90}Sr sources with different absorber thicknesses, and a uranium metal slab were measured and compared to select an appropriate beta source for calibrating a personnel dosimeter to measure shallow dose equivalent from exposure to uranium mill tailings. The measured beta energy spectrum from the {sup 90}Sr source, with a 111 mg cm{sup -2} cover thickness, was selected as a possible calibration source for a personnel dosimeter. The dose equivalent rate to the skin at 1 cm from a distributed tailings source of infinite thickness, with a {sup 226}Ra activity of 56 Bq g{sup -1} (1.5 x 10{sup 3} pCi g{sup -1}), was measured to be 0.024 mSv h{sup -1} (2.4 mrem h{sup -1}).
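
    The paired unit values quoted above can be cross-checked directly (conversion factors only; nothing here beyond the numbers in the abstract):

```python
# 1 pCi = 0.037 Bq and 1 mrem = 0.01 mSv, so the abstract's two pairs of
# figures should agree with each other.
PCI_TO_BQ = 0.037
MREM_TO_MSV = 0.01
activity_bq_per_g = 1.5e3 * PCI_TO_BQ   # ~55.5, quoted as 56 Bq/g
dose_msv_per_h = 2.4 * MREM_TO_MSV      # 0.024 mSv/h
```
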

  14. Evaluation of the beta energy spectrum from a distributed uranium mill tailings source.

    PubMed

    Reif, R H; Martz, D E; Carlson, D S; Turner, J B

    1993-10-01

    The beta energy spectra from uranium mill tailings, 90Sr with different absorber thicknesses, and a uranium metal slab were measured and compared to select an appropriate beta source for calibrating a personnel dosimeter to measure shallow dose equivalent when exposed to uranium mill tailings. The measured beta energy spectrum from the 90Sr source, with a 111 mg cm-2 cover thickness, was selected as a possible calibration source for a personnel dosimeter. The dose equivalent rate to the skin at 1 cm from a distributed tailings source of infinite thickness, with a 226Ra activity of 56 Bq g-1 (1.5 x 10(3) pCi g-1), was measured to be 0.024 mSv h-1 (2.4 mrem h-1).

  15. Auroral electron distributions within and close to the Saturn kilometric radiation source region

    NASA Astrophysics Data System (ADS)

    Schippers, P.; Arridge, C. S.; Menietti, J. D.; Gurnett, D. A.; Lamy, L.; Cecconi, B.; Mitchell, D. G.; André, N.; Kurth, W. S.; Grimald, S.; Dougherty, M. K.; Coates, A. J.; Krupp, N.; Young, D. T.

    2011-05-01

    On 17 October 2008, Cassini observed for the first time the electron populations associated with the crossing of a Saturn kilometric radiation source region and its surroundings. These observations allow the high-latitude acceleration processes, the current systems, and the origin of the low-frequency electromagnetic waves to be constrained and quantified. Enhanced fluxes of field-aligned energetic electrons were measured by the Cassini electron plasma spectrometer in conjunction with unusually intense field-aligned current systems identified using the magnetometer instrument. In the region where downward field-aligned currents were measured, electron data show evidence of two types of upward accelerated electron beams: a broadband energetic (1-100 keV) electron population that is observed throughout the region and a narrow-banded (0.1-1 keV) electron population that is observed sporadically. In the regions where the magnetic field signatures showed evidence for upward field-aligned currents, we observe electron loss cone distributions and some evidence of shell-like distributions. Such nonthermal electron populations are commonly known to be a potential free-energy source to drive plasma instabilities. In the downward current region, the low-energy and energetic beams are likely the source of the very low frequency emissions. In the upward current region, the shell distribution is identified as a potential source for Saturn kilometric radiation generation via the cyclotron maser instability.

  16. Statistical measurement of the gamma-ray source-count distribution as a function of energy

    SciTech Connect

    Zechlin, Hannes-S.; Cuoco, Alessandro; Donato, Fiorenza; Fornengo, Nicolao; Regis, Marco

    2016-07-29

    Statistical properties of photon count maps have recently been proven as a new tool to study the composition of the gamma-ray sky with high precision. Here, we employ the 1-point probability distribution function of six years of Fermi-LAT data to measure the source-count distribution dN/dS and the diffuse components of the high-latitude gamma-ray sky as a function of energy. To that aim, we analyze the gamma-ray emission in five adjacent energy bands between 1 and 171 GeV. It is demonstrated that the source-count distribution as a function of flux is compatible with a broken power law up to energies of ~50 GeV. Furthermore, the index below the break is between 1.95 and 2.0. For higher energies, a simple power-law fits the data, with an index of ${2.2}_{-0.3}^{+0.7}$ in the energy band between 50 and 171 GeV. Upper limits on further possible breaks as well as the angular power of unresolved sources are derived. We find that point-source populations probed by this method can explain ${83}_{-13}^{+7}$% (${81}_{-19}^{+52}$%) of the extragalactic gamma-ray background between 1.04 and 1.99 GeV (50 and 171 GeV). Our method has excellent capabilities for constraining the gamma-ray luminosity function and the spectra of unresolved blazars.
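
    The fitted form above is a broken power law in flux. A minimal sketch (A, the break flux Sb and the index above the break are placeholders of mine, not the paper's best-fit values; only the 1.97 below the break echoes the quoted 1.95-2.0 range):

```python
# Broken power-law source-count distribution:
# dN/dS = A * (S/Sb)^(-n1) for S >= Sb, and A * (S/Sb)^(-n2) below the break.
def dnds(S, A=1.0, Sb=1e-8, n1=2.6, n2=1.97):
    return A * (S / Sb) ** (-n1 if S >= Sb else -n2)
```

    Integrating S * dN/dS over the flux range below the detection threshold gives the unresolved contribution to the extragalactic background that the abstract quantifies.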

  17. Statistical Measurement of the Gamma-Ray Source-count Distribution as a Function of Energy

    NASA Astrophysics Data System (ADS)

    Zechlin, Hannes-S.; Cuoco, Alessandro; Donato, Fiorenza; Fornengo, Nicolao; Regis, Marco

    2016-08-01

    Statistical properties of photon count maps have recently been proven as a new tool to study the composition of the gamma-ray sky with high precision. We employ the 1-point probability distribution function of six years of Fermi-LAT data to measure the source-count distribution dN/dS and the diffuse components of the high-latitude gamma-ray sky as a function of energy. To that aim, we analyze the gamma-ray emission in five adjacent energy bands between 1 and 171 GeV. It is demonstrated that the source-count distribution as a function of flux is compatible with a broken power law up to energies of ~50 GeV. The index below the break is between 1.95 and 2.0. For higher energies, a simple power-law fits the data, with an index of ${2.2}_{-0.3}^{+0.7}$ in the energy band between 50 and 171 GeV. Upper limits on further possible breaks as well as the angular power of unresolved sources are derived. We find that point-source populations probed by this method can explain ${83}_{-13}^{+7}$% (${81}_{-19}^{+52}$%) of the extragalactic gamma-ray background between 1.04 and 1.99 GeV (50 and 171 GeV). The method has excellent capabilities for constraining the gamma-ray luminosity function and the spectra of unresolved blazars.

  18. Accident source terms for pressurized water reactors with high-burnup cores calculated using MELCOR 1.8.5.

    SciTech Connect

    Gauntt, Randall O.; Powers, Dana Auburn; Ashbaugh, Scott G.; Leonard, Mark Thomas; Longmire, Pamela

    2010-04-01

    In this study, risk-significant pressurized-water reactor severe accident sequences are examined using MELCOR 1.8.5 to explore the range of fission product releases to the reactor containment building. Advances in the understanding of fission product release and transport behavior and severe accident progression are used to render best-estimate analyses of selected accident sequences. Particular emphasis is placed on estimating the effects of high fuel burnup, in contrast with low burnup, on fission product releases to the containment. Supporting this emphasis, recent data on fission product release from high-burnup (HBU) fuel from the French VERCORS project are used in this study. The results of these analyses are treated as samples from a population of accident sequences in order to employ an approximate order-statistics characterization of the results. These trends and tendencies are then compared to the NUREG-1465 alternative source term prescription used today for regulatory applications. In general, greater differences are observed between the state-of-the-art calculations for either HBU or low-burnup (LBU) fuel and the NUREG-1465 containment release fractions than exist between HBU and LBU release fractions. Current analyses suggest that retention of fission products within the vessel and the reactor coolant system (RCS) is greater than contemplated in the NUREG-1465 prescription and that, overall, release fractions to the containment are therefore lower across the board in the present analyses than suggested in NUREG-1465. The decreased volatility of Cs2MoO4 compared to CsI or CsOH increases the predicted RCS retention of cesium; as a result, cesium and iodine do not follow identical behaviors with respect to distribution among the vessel, RCS, and containment. With respect to the regulatory alternative source term, greater differences are observed between the NUREG-1465 prescription and both the HBU and LBU predictions than exist between the HBU and LBU release fractions.

  19. Distribution and sources of carbon, nitrogen, phosphorus and biogenic silica in the sediments of Chilika lagoon

    NASA Astrophysics Data System (ADS)

    Nazneen, Sadaf; Raju, N. Janardhana

    2017-02-01

    The present study investigated the spatial and vertical distribution of organic carbon (OC), total nitrogen (TN), total phosphorus (TP) and biogenic silica (BSi) in the sedimentary environments of Asia's largest brackish water lagoon. Surface and core sediments were collected from various locations of the Chilika lagoon and were analysed for grain-size distribution and major elements in order to understand their distribution and sources. Sand is the dominant fraction, followed by silt + clay. Primary production within the lagoon, terrestrial input from river discharge and anthropogenic activities in the vicinity of the lagoon control the distribution of OC, TN, TP and BSi in the surface as well as in the core sediments. Low C/N ratios in the surface sediments (3.49-3.41) and cores (4-11.86) suggest that phytoplankton and macroalgae may be major contributors of organic matter (OM) in the lagoon. BSi is mainly associated with the mud fraction. Core C5 from the Balugaon region shows the highest concentration of OC, ranging from 0.58 to 2.34%, especially in the upper 30 cm, due to direct discharge of large amounts of untreated sewage into the lagoon. The study highlights that Chilika is a dynamic ecosystem with a large contribution of OM by autochthonous sources, with some input from anthropogenic sources as well.

  20. Dipole versus distributed EEG source localization for single versus averaged spikes in focal epilepsy.

    PubMed

    Plummer, C; Wagner, M; Fuchs, M; Harvey, A S; Cook, M J

    2010-06-01

    The aim of this study is to characterize and compare dipole and distributed EEG source localization (ESL) of interictal epileptiform discharges (IEDs) in focal epilepsy. Single and averaged scalp IEDs from eight patients, four with benign focal epilepsy of childhood with centrotemporal spikes (BFEC) and four with mesial temporal lobe epilepsy (MTLE), underwent independent component analysis (ICA) from IED onset to peak. The boundary element method forward model was applied to one of four inverse models: two dipolar (moving regularized and rotating nonregularized) and two distributed (standardized low-resolution electromagnetic tomography with rotating cortical sources or with fixed extended sources). Solutions were studied at IED onset, midupswing, and peak; at ESL strength maxima; and at ESL residual deviation minima (best fit). From 11,040 ESL parameter points and 960 ESL maps, best-fit dipole and distributed solutions fell at the IED midupswing in BFEC and MTLE, when the dominant ICA component typically peaked, localizing to the lower Rolandic sulcus in BFEC and to basolateral or anterior temporal cortex in MTLE. Single-to-averaged ESL variability was high in MTLE. Dipole and distributed ESL are complementary; best-fit solutions for both occupy the IED midupswing, not the IED peak. ICA, a "blind" statistical operation, aids clinical interpretation of ESL fit quality. Single-to-averaged IED localization discordance can be high, a problem warranting further scrutiny if ESL is to earn a place in routine epilepsy care.

  1. Analysis of the relationship between landslides size distribution and earthquake source area

    NASA Astrophysics Data System (ADS)

    Valagussa, Andrea; Crosta, Giovanni B.; Frattini, Paolo; Xu, Chong

    2014-05-01

    The spatial distribution of earthquake-induced landslides around the seismogenic source has been analysed to better understand the triggering of landslides in seismic areas and to forecast the maximum distance at which an earthquake of a given magnitude can induce landslides (e.g. Keefer, 1984). However, when applying such approaches to old earthquakes (e.g. the 1929 Buller and 1968 Inangahua earthquakes, New Zealand, Parker, 2013; the 1976 Friuli earthquake, Italy) one should be concerned about the undersampling of smaller landslides, which can be erased by erosion and landscape evolution. For this reason, it is important to characterize carefully not only the relationship of landslide area and number with distance from the source, but also the size distribution of landslides as a function of distance from the source. In this paper, we analyse the 2008 Wenchuan earthquake landslide inventory (Xu et al., 2013). The earthquake triggered more than 197,000 landslides of different types, including rock avalanches, rockfalls, translational and rotational slides, lateral spreads and debris flows. First, we calculated the landslide intensity (number of landslides per unit area) and spatial density (landslide area per unit area) as a function of distance from the source area of the earthquake. Then, we developed magnitude-frequency curves (MFC) for different distances from the source area. Comparing these curves, we can describe the relation between distance and the frequency density of landslides in seismic areas. Keefer D K (1984) Landslides caused by earthquakes. Geological Society of America Bulletin, 95(4), 406-421. Parker R N, (2013) Hillslope memory and spatial and temporal distributions of earthquake-induced landslides, Durham theses, Durham University. Xu, C., Xu, X., Yao, X., & Dai, F. (2013). Three (nearly) complete inventories of landslides triggered by the May 12, 2008 Wenchuan Mw 7.9 earthquake of China and their spatial distribution statistical analysis
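
    The landslide intensity described above (number of landslides per unit area as a function of distance from the source) can be sketched as a simple binning over concentric annuli; the synthetic distance data and ring geometry below are illustrative assumptions, not the Wenchuan inventory.

```python
import numpy as np

def landslide_intensity(distances_km, ring_edges_km):
    """Number of landslides per unit area in concentric distance rings
    around the seismogenic source (a simplified planar approximation).

    distances_km  : distance of each mapped landslide from the source
    ring_edges_km : bin edges defining the rings
    """
    counts, edges = np.histogram(distances_km, bins=ring_edges_km)
    # Area of each annulus: pi * (r_out^2 - r_in^2)
    ring_areas = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)
    return counts / ring_areas  # landslides per km^2

# Synthetic inventory: landslide counts decaying with distance from the source.
rng = np.random.default_rng(0)
d = rng.exponential(scale=20.0, size=5000)          # distances in km
intensity = landslide_intensity(d, np.arange(0, 101, 10))
```

    Repeating the same binning on landslide *areas* instead of counts gives the spatial density, and building a magnitude-frequency curve per ring follows the same pattern with a second histogram over landslide size.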

  2. The Analytical Repository Source-Term (AREST) model: Description and documentation

    SciTech Connect

    Liebetrau, A.M.; Apted, M.J.; Engel, D.W.; Altenhofen, M.K.; Strachan, D.M.; Reid, C.R.; Windisch, C.F.; Erikson, R.L.; Johnson, K.I.

    1987-10-01

    The geologic repository system consists of several components, one of which is the engineered barrier system. The engineered barrier system interfaces with natural barriers that constitute the setting of the repository. A model that simulates the releases from the engineered barrier system into the natural barriers of the geosphere, called a source-term model, is an important component of any model for assessing the overall performance of the geologic repository system. The Analytical Repository Source-Term (AREST) model being developed is one such model. This report describes the current state of development of the AREST model and the code in which the model is implemented. The AREST model consists of three component models and five process models that describe the post-emplacement environment of a waste package. All of these components are combined within a probabilistic framework. The component models are a waste package containment (WPC) model that simulates the corrosion and degradation processes which eventually result in waste package containment failure; a waste package release (WPR) model that calculates the rates of radionuclide release from the failed waste package; and an engineered system release (ESR) model that controls the flow of information among all AREST components and process models and combines release output from the WPR model with failure times from the WPC model to produce estimates of total release. 167 refs., 40 figs., 12 tabs.
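
    The way the ESR model combines WPC failure times with WPR release rates can be illustrated with a minimal probabilistic sketch. The Weibull failure-time distribution and the constant fractional release rate below are hypothetical placeholders, not AREST's actual process models.

```python
import numpy as np

# AREST-style flow (illustrative): sample containment failure times (WPC),
# apply a release model after failure (WPR), and aggregate over the
# population of waste packages (ESR).
rng = np.random.default_rng(42)
n_packages = 1000
failure_times = rng.weibull(2.0, n_packages) * 5000.0   # years (hypothetical)
release_rate = 1e-6  # fraction of inventory released per year after failure

def total_release(t_years):
    """Cumulative fractional release from all packages at time t:
    each package releases at a constant rate once containment fails."""
    exposure = np.clip(t_years - failure_times, 0.0, None)
    return np.sum(exposure * release_rate) / n_packages

frac_10k = total_release(10000.0)
```

    The probabilistic framework the report describes amounts to repeating this aggregation over sampled parameter sets to produce a distribution of total-release estimates.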

  3. Decontamination Techniques and Fixative Coatings Evaluated in the Building 235-F Legacy Source Term Removal Study

    SciTech Connect

    WAYNE, FARRELL

    2005-04-21

    Savannah River Site Building 235-F was being considered for future plutonium storage and stabilization missions, but the Defense Nuclear Facilities Safety Board (DNFSB) noted that large quantities of Plutonium-238 left in cells and gloveboxes from previous operations posed a potential hazard to both the existing and future workforce. This material resulted from the manufacture of Pu-238 heat sources used by the NASA space program to generate electricity for deep space exploration satellites. A multi-disciplinary team was assembled to propose a cost-effective solution to mitigate this legacy source term, which would facilitate future DOE plutonium storage activities in 235-F. One aspect of this study involved an evaluation of commercially available radiological decontamination techniques to remove the legacy Pu-238, and of fixative coatings that could stabilize any residual Pu-238 following decontamination activities. Four chemical methods were identified as most likely to meet decontamination objectives for this project and are discussed in detail. Short- and long-term fixatives are reviewed, with particular attention to the potential radiation damage caused by Pu-238, which has a high specific activity and would be expected to cause significant radiation damage to any coating applied. Encapsulants that were considered to mitigate the legacy Pu-238 are also reviewed.

  4. Methods for Calculating a Simplified Hydrologic Source Term for Frenchman Flat Sensitivity Studies of Radionuclide Transport Away from Underground Nuclear Tests

    SciTech Connect

    Tompson, A; Zavarin, M; Bruton, C J; Pawloski, G A

    2004-01-06

    The purpose of this report is to provide an approach for the development of a simplified unclassified hydrologic source term (HST) for the ten underground nuclear tests conducted in the Frenchman Flat Corrective Action Unit (CAU) at the Nevada Test Site (NTS). It is being prepared in an analytic form for incorporation into a GOLDSIM (Golder Associates, 2000) model of radionuclide release and migration in the Frenchman Flat CAU. This model will be used to explore, in an approximate and probabilistic fashion, sensitivities of the 1,000-year radionuclide contaminant boundary (FFACO, 1996; 2000) to hydrologic and other related parameters. The total inventory (or quantity) of radionuclides associated with each individual test, regardless of its form and distribution, is referred to as the radiologic source term (RST) of that test. The subsequent release of these radionuclides over time into groundwater is referred to as the hydrologic source term (HST) of that test (Tompson, et al., 2002). The basic elements of the simplified hydrologic source term model include: (1) Estimation of the volumes of geologic material physically affected by the tests. (2) Identification, quantification, and distribution of the radionuclides of importance. (3) Development of simplified release and retardation models for these radionuclides in groundwater. The simplifications used in the current HST model are based upon more fundamental analyses that are too complicated for use in a GOLDSIM sensitivity study. These analyses are based upon complex, three-dimensional flow and reactive transport simulations summarized in the original CAMBRIC hydrologic source term model (Tompson et al., 1999), unclassified improvements of this model discussed in Pawloski et al. (2000), as well as more recent studies that are part of an ongoing model of the HST at the CHESHIRE test in Pahute Mesa (Pawloski et al., 2001).
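
    A simplified release model of the kind listed in element (3) can be sketched as a first-order release slowed by a linear retardation factor. The rate constant and retardation value below are illustrative assumptions, not parameters of the Frenchman Flat model.

```python
import numpy as np

def released_fraction(t_years, k_release, retardation=1.0):
    """Hypothetical first-order release model: the fraction of a
    radionuclide inventory released to groundwater by time t, with
    transport slowed by a linear retardation factor R >= 1.
    This is an illustrative sketch, not the GOLDSIM implementation.
    """
    k_eff = k_release / retardation
    return 1.0 - np.exp(-k_eff * np.asarray(t_years, dtype=float))

# Example: 1%/yr intrinsic release rate, retardation factor 10,
# evaluated over the 1,000-year contaminant-boundary horizon.
frac = released_fraction(np.array([10.0, 100.0, 1000.0]),
                         k_release=0.01, retardation=10.0)
```

    In a sensitivity study, the rate constant and retardation factor become sampled inputs, which is what makes such a closed-form release model convenient inside a probabilistic GOLDSIM run.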

  5. Long-term dust aerosol production from natural sources in Iceland.

    PubMed

    Dagsson-Waldhauserova, Pavla; Arnalds, Olafur; Olafsson, Haraldur

    2017-02-01

    Iceland is a volcanic island in the North Atlantic Ocean with a maritime climate. In spite of the moist climate, large areas have limited vegetation cover: >40% of Iceland is classified as subject to considerable to very severe erosion, and 21% of Iceland is volcanic sandy desert. Natural emissions from these sources, driven by strong winds, not only affect regional air quality in Iceland ("Reykjavik haze"), but dust particles are at times also transported >1000 km over the Atlantic and Arctic Oceans. The aim of this paper is to place the Icelandic dust production area into international perspective, present the long-term frequency of dust storm events in northeast Iceland, and estimate dust aerosol concentrations during reported dust events. Meteorological observations with dust presence codes and related visibility were used to identify the frequency of, and the long-term changes in, dust production in northeast Iceland. There were on average 16.4 days annually with reported dust observations at weather stations within the northeastern erosion area, indicating extreme dust plume activity and erosion within the northeastern deserts, even though the area is covered with snow during the major part of winter. During the 2000s the highest occurrence of dust events in six decades was reported. We have measured saltation and Aeolian transport during dust/volcanic ash storms in Iceland, which include some of the most intense wind erosion events ever measured. Icelandic dust affects the ecosystems over much of Iceland and causes regional haze. It is likely to affect the ecosystems of the oceans around Iceland, and it deposits dust that lowers the albedo of the Icelandic glaciers, increasing melt-off due to global warming. The study indicates that Icelandic dust may contribute to Arctic air pollution. Long-term records of meteorological dust observations from Northeast Iceland indicate the frequency of dust events from Icelandic deserts. The research covers a 60-year period.

  6. The influence of cross-order terms in interface mobilities for structure-borne sound source characterization: Frame-like structures

    NASA Astrophysics Data System (ADS)

    Bonhoff, H. A.; Petersson, B. A. T.

    2009-01-01

    The applicability of interface mobilities for structure-borne sound source characterization critically depends on the admissibility of neglecting the cross-order terms. Following on from a previous study [H.A. Bonhoff, B.A.T. Petersson, Journal of Sound and Vibration 311 (2008) 473-484], the influence of the cross-order terms is investigated for frame-like structures under the assumption of a uniform force-order distribution. Considering the complex power, the cross-order terms are significant from intermediate frequencies on upwards. At lower frequencies, the cross-order terms can come into play for cases where the in-phase motion of the structure along the interface is constrained. The frequency characteristics of the influence of cross-order terms for the zero-order source descriptor and coupling function are similar to those of the complex power. For non-zero source descriptor and coupling function orders, the quality of the equal-order approximation mainly depends on the presence of low-order cross-order interface mobilities. By analyzing the symmetry of an interface system, it is possible to predict which cross-order terms are equal to zero. The equal-order approximation manages to capture the main trends and overall characteristics and offers an acceptable estimate for engineering practice.

  7. Occurrence, distribution and risk assessment of polychlorinated biphenyls and polybrominated diphenyl ethers in nine water sources.

    PubMed

    Yang, Yuyi; Xie, Qilai; Liu, Xinyu; Wang, Jun

    2015-05-01

    The water quality of drinking-water sources is a critical issue for human health in South China, which experiences rapid economic development and is the most densely populated region in China. In this study, the pollution of organohalogen compounds in nine important water sources in South China was investigated. Twenty-six organohalogen compounds, including seventeen polychlorinated biphenyls (PCBs) and nine polybrominated diphenyl ethers (PBDEs), were detected using gas chromatographic analysis. The concentrations of total PCBs ranged from 0.93 to 13.07 ng L(-1), with an average value of 7.06 ng L(-1). The total concentrations of the nine PBDE congeners ranged from not detected (nd) to 7.87 ng L(-1), with an average value of 2.59 ng L(-1). The compositions of PCBs and PBDEs indicated that historical use of Aroclors 1248, 1254 and 1260 and of commercial PBDEs may be the main source of organohalogen compounds in these water sources. The nine water sources could be classified into three clusters by a self-organizing map neural network. Low-halogenated PCBs and PBDEs showed similar distributions across the nine water sources. Cancer risks of PCBs and PBDEs via water consumption were all below 10(-6), indicating that the water in the nine sources was safe for human drinking. Copyright © 2015 Elsevier Inc. All rights reserved.
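
    Drinking-water cancer-risk screening of this kind typically follows the standard chronic-daily-intake formulation. The sketch below uses the average total-PCB concentration reported above with otherwise generic exposure parameters and a representative PCB slope factor; all parameter values are assumptions, not the study's inputs.

```python
def cancer_risk_water(c_ng_per_l, slope_factor, intake_l_per_day=2.0,
                      exposure_freq=350, exposure_dur_yr=30,
                      body_weight_kg=60.0, averaging_time_days=70 * 365):
    """Lifetime cancer risk from drinking-water ingestion via the
    standard chronic-daily-intake (CDI) formulation:
        CDI  = C * IR * EF * ED / (BW * AT)
        risk = CDI * slope_factor

    c_ng_per_l   : contaminant concentration in ng/L
    slope_factor : cancer slope factor in (mg/kg/day)^-1
    """
    c_mg_per_l = c_ng_per_l * 1e-6  # ng/L -> mg/L
    cdi = (c_mg_per_l * intake_l_per_day * exposure_freq * exposure_dur_yr
           / (body_weight_kg * averaging_time_days))
    return cdi * slope_factor

# The average total-PCB level above (7.06 ng/L) with a representative
# PCB slope factor of 2 (mg/kg/day)^-1 stays below the 1e-6 threshold.
risk = cancer_risk_water(7.06, slope_factor=2.0)
```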

  8. Application of the open-source mantle convection code ASPECT to long-term tectonic simulations

    NASA Astrophysics Data System (ADS)

    Naliboff, J.; Brune, S.

    2016-12-01

    The open-source mantle convection code ASPECT (Computational Infrastructure for Geodynamics) provides a robust foundation for numerically examining a wide range of geodynamic processes. ASPECT's strength arises from its massive scalability, use of AMR (adaptive mesh refinement), advanced solvers, community support, and modular design, the latter of which permits straightforward modification of discrete components of the code, such as constitutive relationships. Here, we present preliminary work that uses such adaptations to model long-term tectonic problems (lithospheric and upper mantle dynamics) with ASPECT. Crustal- and lithospheric-scale deformation is modeled using a recently implemented visco-plastic constitutive relationship, with viscous flow characterized by diffusion and/or dislocation creep and plastic failure following a Drucker-Prager yield criterion. Distinct compositional layers and their associated material properties are tracked through field- or tracer-based methods, which also allow tracking of time-dependent properties such as accumulated strain. Preliminary 2-D models of long-term (> 10 Myr) continental extension successfully reproduce results of prior studies, with rift basin structure largely dependent on initial lithospheric structure, extension velocity and strain-softening parameterization. Building on these preliminary results, our presentation will focus on ASPECT's viability and performance for modeling long-term tectonics within the context of 2-D vs. 3-D simulations, CPU scaling, distinct solver methods, adaptive mesh refinement and field- vs. tracer-based methods. Our primary goal in addressing these topics is to highlight ASPECT's current functionality and address key areas of future development associated with modeling long-term tectonics. Additionally, we hope to spur discussion regarding a long-term tectonics benchmark that addresses the strong resolution-dependence of simulations using plasticity formulations based on the
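
    The visco-plastic rheology described above caps the creep viscosity with a Drucker-Prager yield stress. A minimal sketch of that cap follows, with illustrative cohesion, friction-angle, and creep-viscosity values; it is not ASPECT's implementation.

```python
import numpy as np

def effective_viscosity(strain_rate_II, creep_viscosity, pressure,
                        cohesion=20e6, friction_angle_deg=30.0):
    """Viscosity capped by a Drucker-Prager yield criterion (2-D form):
    yield stress sigma_y = C*cos(phi) + P*sin(phi). When the creep
    stress 2*eta*e_II would exceed sigma_y, the viscosity is reduced
    so the stress sits on the yield surface. Illustrative sketch only.

    strain_rate_II : second invariant of the strain rate (1/s)
    creep_viscosity: viscosity from diffusion/dislocation creep (Pa s)
    pressure       : pressure (Pa)
    """
    phi = np.radians(friction_angle_deg)
    sigma_y = cohesion * np.cos(phi) + pressure * np.sin(phi)
    yield_viscosity = sigma_y / (2.0 * strain_rate_II)
    return np.minimum(creep_viscosity, yield_viscosity)

# Lithospheric-scale example: slow deformation, strong creep viscosity.
eta = effective_viscosity(strain_rate_II=1e-15, creep_viscosity=1e24,
                          pressure=100e6, cohesion=20e6)
```

    In a plastically failing region the yield branch is active, so the effective viscosity drops well below the creep value, which is the mechanism behind the strain localization these rift models rely on.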

  9. Extension of the distributed point source method for ultrasonic field modeling

    PubMed Central

    Cheng, Jiqi; Lin, Wei; Qin, Yi-Xian

    2011-01-01

    The distributed point source method (DPSM) was recently proposed for ultrasonic field modeling and other applications. This method uses distributed point sources, placed slightly behind the transducer surface, to model the ultrasound field. The acoustic strength of each point source is obtained through matrix inversion, which requires the number of target points on the transducer surface to be equal to the number of point sources. In this work, DPSM was extended and further developed to overcome the limitations of the original method and provide a solid mathematical explanation of the physical principle behind the method. With the extension, the acoustic strength of the point sources was calculated as the solution to a least squares minimization problem instead of using direct matrix inversion. As numerical examples, the ultrasound fields of circular and rectangular transducers were calculated using the extended and original DPSMs and were then systematically compared with results calculated using the theoretical solution and the exact spatial impulse response method. The numerical results showed that the extended method can model ultrasonic fields accurately without the scaling step required by the original method. The extended method has potential applications in ultrasonic field modeling, tissue characterization, nondestructive testing, and ultrasound system optimization. PMID:21269654
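
    The key change described above, solving for the source strengths by least squares rather than inverting a square matrix, can be sketched with a generic overdetermined linear system; the random matrix below is a stand-in for the actual acoustic propagation matrix, not DPSM's Green's functions.

```python
import numpy as np

# Overdetermined system: M target points on the transducer surface,
# N < M point sources behind it. H[i, j] relates source j's strength
# to the field at target i; v is the prescribed surface field.
rng = np.random.default_rng(1)
M, N = 200, 50
H = rng.standard_normal((M, N))   # stand-in for the propagation matrix
v = H @ rng.standard_normal(N)    # a target field consistent with H

# Least-squares solution min ||H a - v||_2 replaces inversion of a
# square H, so the number of target points need not equal the number
# of sources.
a, residuals, rank, _ = np.linalg.lstsq(H, v, rcond=None)
```

    Because the system is solved in the least-squares sense, adding more target points simply over-constrains the fit instead of breaking the square-matrix requirement of the original formulation.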

  10. Tsunami source parameters estimated from slip distribution and their relation to tsunami intensity

    NASA Astrophysics Data System (ADS)

    Bolshakova, Anna; Nosov, Mikhail; Kolesov, Sergey

    2015-04-01

    Estimation of the level of tsunami hazard on the basis of earthquake moment magnitude often fails. The most important reason for this is that tsunamis are related to earthquakes in a complex and ambiguous way. In order to reveal a measure of the tsunamigenic potential of an earthquake that would be better than moment magnitude, we introduce a set of tsunami source parameters that can be calculated from co-seismic ocean-bottom deformation and bathymetry. We consider more than two hundred ocean-bottom earthquakes (1923-2014) for which detailed slip distribution data (Finite Fault Model) are available on the USGS, UCSB, Caltech, and eQuake-RC sites. Making use of the Okada formulae, the vector fields of co-seismic deformation of the ocean bottom are estimated from the slip distribution data. Taking into account bathymetry (GEBCO_08), we determine tsunami source parameters such as the double amplitude of bottom deformation, displaced water volume, potential energy of the initial elevation, etc. The tsunami source parameters are examined as a function of earthquake moment magnitude. The contribution of the horizontal component of ocean bottom deformation to tsunami generation is investigated. We analyse the Soloviev-Imamura tsunami intensity as a function of tsunami source parameters. The possibility of using tsunami source parameters instead of moment magnitude in tsunami warning is discussed. This work was supported by the Russian Foundation for Basic Research, project 14-05-31295.
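
    One of the listed source parameters, the displaced water volume, amounts to integrating the vertical bottom deformation over ocean cells. The sketch below uses a synthetic Gaussian uplift patch and uniform bathymetry (both assumptions), and ignores the horizontal-deformation contribution discussed in the abstract.

```python
import numpy as np

def displaced_volume(dz, depth, cell_area):
    """Displaced water volume: the integral of vertical co-seismic
    bottom deformation dz over ocean cells (depth > 0). Simplified
    sketch; land cells are excluded via the bathymetry mask."""
    wet = depth > 0.0
    return np.sum(dz[wet]) * cell_area

# Synthetic example: a 2 m Gaussian uplift patch on a 1 km grid,
# entirely under 4000 m of water.
x = np.arange(-50.0, 50.0, 1.0)                   # km
X, Y = np.meshgrid(x, x)
dz = 2.0 * np.exp(-(X**2 + Y**2) / (2 * 10.0**2))  # uplift in m
depth = np.full_like(dz, 4000.0)                   # ocean depth in m
V = displaced_volume(dz, depth, cell_area=1.0e6)   # m^3 (1 km^2 cells)
```

    For a Gaussian patch the discrete sum should approach the analytic value 2 * 2*pi*sigma^2 per unit amplitude, which makes this a convenient self-check before applying the same integral to Okada-derived deformation fields.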

  11. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Metal Fuel Radionuclide Release

    SciTech Connect

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-02-01

    The development of an accurate and defensible mechanistic source term will be vital for the future licensing efforts of metal fuel, pool-type sodium fast reactors. To assist in the creation of a comprehensive mechanistic source term, the current effort sought to estimate the release fraction of radionuclides from metal fuel pins to the primary sodium coolant during fuel pin failures at a variety of temperature conditions. These release estimates were based on the findings of an extensive literature search, which reviewed past experimentation and reactor fuel damage accidents. Data sources for each radionuclide of interest were reviewed to establish release fractions, along with possible release dependencies, and the corresponding uncertainty levels. Although the current knowledge base is substantial, and radionuclide release fractions were established for the elements deemed important for the determination of offsite consequences following a reactor accident, gaps were found pertaining to several radionuclides. First, there is uncertainty regarding the transport behavior of several radionuclides (iodine, barium, strontium, tellurium, and europium) during metal fuel irradiation to high burnup levels. The migration of these radionuclides within the fuel matrix and bond sodium region can greatly affect their release during pin failure incidents. Post-irradiation examination of existing high burnup metal fuel can likely resolve this knowledge gap. Second, data regarding the radionuclide release from molten high burnup metal fuel in sodium is sparse, which makes the assessment of radionuclide release from fuel melting accidents at high fuel burnup levels difficult. This gap could be addressed through fuel melting experimentation with samples from the existing high burnup metal fuel inventory.

  12. Source-term development for a contaminant plume for use by multimedia risk assessment models

    SciTech Connect

    Whelan, Gene ); McDonald, John P. ); Taira, Randal Y. ); Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.

    1999-12-01

    Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments for use at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world, Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.

  13. Evaluation of the Industrial Source Complex Short-Term model: dispersion over terrain.

    PubMed

    Abdul-Wahab, Sabah A

    2004-04-01

    Terrain around an air discharge source can have several influences on diffusion, the pattern of plume dispersion, the wind flow, and the turbulence characteristics. The Industrial Source Complex Short-Term (ISCST) model contains simple algorithms to attempt to account for the effects of terrain. The model has the ability to analyze concentrations in any type of terrain by using the terrain options available for running the model. In this study, the ISCST model was adopted to predict the concentration of sulfur dioxide (SO2) in and around the Mina Al-Fahal refinery in Oman. The central purpose of the study was to examine the performance of the ISCST model in predicting SO2 concentrations under two different scenarios: (1) when flat terrain was assumed; and (2) when the terrain descriptions were addressed. The results of these two scenarios were validated against SO2 monitoring data. The comparison showed that the model underestimated the observed concentrations for the two scenarios. However, the predicted concentrations of SO2 in the absence of the terrain scenario were in better agreement with the observations. Furthermore, the predicted SO2 concentrations were found to be lower than the World Health Organization guideline values, with the maximum concentrations found to occur relatively close to the sources of emission.
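
    Model-observation comparisons like the one described here are commonly summarized with the fractional bias. The sketch below uses hypothetical SO2 values chosen to mimic the reported underestimation; it is not the study's data.

```python
import numpy as np

def fractional_bias(observed, predicted):
    """Fractional bias FB = 2*(mean_obs - mean_pred)/(mean_obs + mean_pred).
    FB > 0 indicates underprediction, as reported for both ISCST
    terrain scenarios; FB = 0 is a perfect mean agreement."""
    mo, mp = np.mean(observed), np.mean(predicted)
    return 2.0 * (mo - mp) / (mo + mp)

obs = np.array([30.0, 45.0, 60.0])   # illustrative observed SO2 levels
pred = np.array([20.0, 30.0, 40.0])  # model underestimates, as in the study
fb = fractional_bias(obs, pred)      # positive -> underprediction
```

    Computing FB separately for the flat-terrain and terrain-aware runs gives a single number per scenario with which to rank them against the monitoring data.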

  14. Free-Space Quantum Key Distribution with a High Generation Rate Potassium Titanyl Phosphate Waveguide Photon-Pair Source

    NASA Technical Reports Server (NTRS)

    Wilson, Jeffrey D.; Chaffee, Dalton W.; Wilson, Nathaniel C.; Lekki, John D.; Tokars, Roger P.; Pouch, John J.; Roberts, Tony D.; Battle, Philip; Floyd, Bertram M.; Lind, Alexander J.

    2016-01-01

    A high generation rate photon-pair source using a dual element periodically-poled potassium titanyl phosphate (PP KTP) waveguide is described. The fully integrated photon-pair source consists of a 1064-nanometer pump diode laser, fiber-coupled to a dual element waveguide within which a pair of 1064-nanometer photons are up-converted to a single 532-nanometer photon in the first stage. In the second stage, the 532-nanometer photon is down-converted to an entangled photon-pair at 800 nanometers and 1600 nanometers which are fiber-coupled at the waveguide output. The photon-pair source features a high pair generation rate, a compact power-efficient package, and continuous wave (CW) or pulsed operation. This is a significant step towards the long-term goal of developing sources for high-rate Quantum Key Distribution (QKD) to enable Earth-space secure communications. Characterization and test results are presented. Details and preliminary results of a laboratory free-space QKD experiment with the B92 protocol are also presented.

  15. Free-space quantum key distribution with a high generation rate potassium titanyl phosphate waveguide photon-pair source

    NASA Astrophysics Data System (ADS)

    Wilson, Jeffrey D.; Chaffee, Dalton W.; Wilson, Nathaniel C.; Lekki, John D.; Tokars, Roger P.; Pouch, John J.; Roberts, Tony D.; Battle, Philip R.; Floyd, Bertram; Lind, Alexander J.; Cavin, John D.; Helmick, Spencer R.

    2016-09-01

    A high generation rate photon-pair source using a dual element periodically-poled potassium titanyl phosphate (PP KTP) waveguide is described. The fully integrated photon-pair source consists of a 1064-nm pump diode laser, fiber-coupled to a dual element waveguide within which a pair of 1064-nm photons are up-converted to a single 532-nm photon in the first stage. In the second stage, the 532-nm photon is down-converted to an entangled photon-pair at 800 nm and 1600 nm which are fiber-coupled at the waveguide output. The photon-pair source features a high pair generation rate, a compact power-efficient package, and continuous wave (CW) or pulsed operation. This is a significant step towards the long-term goal of developing sources for high-rate Quantum Key Distribution (QKD) to enable Earth-space secure communications. Characterization and test results are presented. Details and preliminary results of a laboratory free-space QKD experiment with the B92 protocol are also presented.

  16. Calculating method for confinement time and charge distribution of ions in electron cyclotron resonance sources

    SciTech Connect

    Dougar-Jabon, V.D.; Umnov, A.M.; Kutner, V.B.

    1996-03-01

    It is common knowledge that the electrostatic pit in the core plasma of electron cyclotron resonance sources exerts strict control over the generation of ions in high charge states. This work is aimed at finding the dependence of ion lifetime on charge state in the core region and at elaborating a numerical model of ion charge dispersion, not only for the core plasma but for extracted beams as well. The calculated data are in good agreement with the experimental results on charge distributions and current magnitudes for beams extracted from the 14 GHz DECRIS source. © 1996 American Institute of Physics.

  17. Simulating of the measurement-device independent quantum key distribution with phase randomized general sources

    PubMed Central

    Wang, Qin; Wang, Xiang-Bin

    2014-01-01

    We present a model for simulating the measurement-device independent quantum key distribution (MDI-QKD) with phase randomized general sources. It can be used to predict experimental observations of an MDI-QKD with linear channel loss, simulating corresponding values for the gains, the error rates in different bases, and the final key rates. Our model is applicable to MDI-QKDs with an arbitrary probabilistic mixture of different photon states or using any coding scheme. Therefore, it is useful in characterizing and evaluating the performance of the MDI-QKD protocol, making it a valuable tool in studying quantum key distribution. PMID:24728000

  18. A bio-inspired cooperative algorithm for distributed source localization with mobile nodes.

    PubMed

    Khalili, Azam; Rastegarnia, Amir; Islam, Md Kafiul; Yang, Zhi

    2013-01-01

    In this paper we propose an algorithm for distributed optimization in mobile nodes. Compared with many published works, an important consideration here is that the nodes do not know the cost function beforehand. Instead of decision-making based on linear combination of the neighbor estimates, the proposed algorithm relies on information-rich nodes that are iteratively identified. To quickly find these nodes, the algorithm adopts a larger step size during the initial iterations. The proposed algorithm can be used in many different applications, such as distributed odor source localization and mobile robots. Comparative simulation results are presented to support the proposed algorithm.
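    The abstract describes gradient-style updates with a larger step size during early iterations, but departs from plain neighbor averaging. For orientation only, here is a minimal sketch of a baseline diffusion scheme of the kind the paper builds on: nodes take noisy gradient steps (with the larger initial step size mentioned in the abstract) and then average with ring neighbors. The topology, step-size schedule, noise model, and quadratic cost are all illustrative assumptions, not the authors' algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    source = np.array([3.0, -2.0])                  # ground-truth source location
    n_nodes = 10
    est = rng.normal(0.0, 5.0, size=(n_nodes, 2))   # initial estimate at each node

    # assumed ring topology: each node combines with itself and its two neighbours
    neighbors = [[(i - 1) % n_nodes, i, (i + 1) % n_nodes] for i in range(n_nodes)]

    for k in range(200):
        mu = 0.5 if k < 20 else 0.05                # larger step size early on
        # adapt: noisy gradient of the cost ||x - source||^2, unknown to the nodes
        grad = 2.0 * (est - source) + rng.normal(0.0, 0.1, size=est.shape)
        adapted = est - mu * grad
        # combine: average over the neighbourhood
        est = np.array([adapted[nb].mean(axis=0) for nb in neighbors])

    err = np.linalg.norm(est.mean(axis=0) - source)
    ```

    With the quadratic cost, the network estimate contracts toward the source despite the gradient noise.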

  19. Reactive hydro- and chlorocarbons in the troposphere and lower stratosphere: sources, distributions, and chemical impact

    NASA Astrophysics Data System (ADS)

    Scheeren, H. A.

    2003-09-01

    The work presented in this thesis focuses on measurements of chemically reactive C2–C7 non-methane hydrocarbons (NMHC) and C1–C2 chlorocarbons with atmospheric lifetimes of a few hours up to about a year. The group of reactive chlorocarbons includes the most abundant atmospheric species with large natural sources, which are chloromethane (CH3Cl), dichloromethane (CH2Cl2), and trichloromethane (CHCl3), and tetrachloroethylene (C2Cl4) with mainly anthropogenic sources. The NMHC and chlorocarbons are present at relatively low quantities in our atmosphere (10⁻¹²–10⁻⁹ mol mol⁻¹ of air). Nevertheless, they play a key role in atmospheric photochemistry. For example, the oxidation of NMHC plays a dominant role in the formation of ozone in the troposphere, while the photolysis of chlorocarbons contributes to enhanced ozone depletion in the stratosphere. In spite of their important role, however, their global source and sink budgets are still poorly understood. Hence, this study aims at improving our understanding of the sources, distribution, and chemical role of reactive NMHC and chlorocarbons in the troposphere and lower stratosphere. To meet this aim, a comprehensive data set of selected C2–C7 NMHC and chlorocarbons has been analyzed, derived from six aircraft measurement campaigns with two different jet aircraft (the Dutch TUD/NLR Cessna Citation PH-LAB and the German DLR Falcon) conducted between 1995 and 2001 (STREAM 1995, 1997, and 1998; LBA-CLAIRE 1998; INDOEX 1999; MINOS 2001). The NMHC and chlorocarbons were detected by gas chromatography (GC-FID/ECD) in pre-concentrated whole-air samples collected in stainless steel canisters on board the measurement aircraft. The measurement locations include tropical (Maldives/Indian Ocean and Surinam), midlatitude (Western Europe and Canada), and polar regions (Lapland/northern Sweden) between the equator and about 70°N, covering different seasons and pollution levels in the troposphere and lower stratosphere. Of

  20. Long term change in relative contribution of various source regions on the surface ozone over Japan

    NASA Astrophysics Data System (ADS)

    Nagashima, T.; Sudo, K.; Akimoto, H.; Kurokawa, J.; Ohara, T.

    2011-12-01

    Although the concentrations of O3 precursors over Japan have been decreasing in recent decades, long-term monitoring data show that the surface concentration of O3 in Japan has increased from the mid-1980s to the present. Trans-boundary transport of O3 from outside Japan has been pointed out and discussed as the cause of this recent increase. In particular, transport from East Asian countries, whose emissions of O3 precursors have been increasing greatly in recent years with their economic growth, is likely a major cause of the observed increase in O3 over Japan. However, the long-term change in other factors that also influence O3 in Japan, such as domestic emissions or the background O3, should also be evaluated. Here, we performed a long-term (1980-2005) simulation of the source-receptor (S-R) relationship for surface O3 in East Asia by utilizing the tagged tracer method with a global chemical transport model. During this period, emissions of O3 precursors in the model from East Asia, especially from China, more than doubled, while those from North America changed little and those from Europe decreased. The model reproduced a long-term increasing trend in surface O3 over Japan similar to the observations. The long-term changes in the contributions from each source region showed that the largest contributor to the increasing trend of surface O3 in Japan is the increase of O3 created in the planetary boundary layer (PBL) of China, which accounts for 35% of the trend; O3 created in the PBL of the Korean Peninsula and of Japan accounts for 13% and 12%, respectively. The O3 created in the free troposphere of China also increased, accounting for 4% of the trend. Therefore, almost 40% of the recent O3 increase in Japan can be attributed to the increase in O3 created over China.

  1. The Analytical Repository Source-Term (AREST) model: Analysis of spent fuel as a nuclear waste form

    SciTech Connect

    Apted, M.J.; Liebetrau, A.M.; Engel, D.W.

    1989-02-01

    The purpose of this report is to assess the performance of spent fuel as a final waste form. The release of radionuclides from spent nuclear fuel has been simulated for the three repository sites that were nominated for site characterization in accordance with the Nuclear Waste Policy Act of 1982. The simulation is based on waste package designs that were presented in the environmental assessments prepared for each site. Five distinct distributions for containment failure have been considered, and the release of nuclides from the UO2 matrix, gap (including grain boundary), crud/surface layer, and cladding has been calculated with the Analytic Repository Source-Term (AREST) code. Separate scenarios involving incongruent and congruent release from the UO2 matrix have also been examined using the AREST code. Congruent release is defined here as the condition in which the relative mass release rates of a given nuclide and uranium from the UO2 matrix are equal to their mass ratios in the matrix. Incongruent release refers to release of a given nuclide from the UO2 matrix controlled by its own solubility-limiting solid phase. Release of nuclides from other sources within the spent fuel (e.g., cladding, fuel/cladding gap) is evaluated separately from either incongruent or congruent matrix release. 51 refs., 200 figs., 9 tabs.
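    The congruent-release definition quoted above amounts to a one-line scaling relation; a sketch with invented numbers (not AREST outputs):

    ```python
    def congruent_release_rate(nuclide_mass_fraction, uranium_release_rate):
        """Congruent release: a nuclide's release rate scales with its mass
        ratio to uranium in the UO2 matrix (definition from the abstract)."""
        return nuclide_mass_fraction * uranium_release_rate

    # illustrative: a nuclide at 0.1 wt% of the matrix, fractional U release 1e-6/yr
    rate = congruent_release_rate(1e-3, 1e-6)
    ```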

  2. Source contributions to the size and composition distribution of urban particulate air pollution

    NASA Astrophysics Data System (ADS)

    Kleeman, Michael J.; Cass, Glen R.

    A mechanistic air quality model has been constructed which is capable of predicting the contribution of individual emissions source types to the size- and chemical-composition distribution of airborne particles. This model incorporates all of the major aerosol processes relevant to regional air pollution studies including emissions, transport, deposition, gas-to-particle conversion and fog chemistry. In addition, the aerosol is represented as a source-oriented external mixture which is allowed to age in a more realistic fashion than can be accomplished when fresh particle-phase emissions are averaged into the pre-existing atmospheric aerosol size and composition distribution. A source-oriented external mixture is created by differentiating the primary particles emitted from the following source types: catalyst-equipped gasoline engines, non-catalyst-equipped gasoline engines, diesel engines, meat cooking, paved road dust, crustal material from sources other than paved road dust, and sulfur-bearing particles from fuel burning and industrial processes. Discrete primary seed particles from each of these source types are emitted into a simulation of atmospheric transport and chemical reaction. The individual particles evolve over time in the presence of gas-to-particle conversion processes while retaining information on the initial source from which they were emitted. The source- and age-resolved particle mechanics model is applied to the August 1987 SCAQS episode and comparisons are made between model predictions and observations at Claremont, CA. The model explains the origin of the bimodal character of the sub-micron aerosol size distribution. The mode located between 0.2 and 0.3 μm particle diameter is shaped by transformed emissions from diesel engines and meat cooking operations with lesser contributions from gasoline-powered vehicles and other fuel burning. The larger mode located at 0.7-0.8 μm particle diameter is due to fine particle background aerosol that

  3. Quantifying the Combined Effect of Radiation Therapy and Hyperthermia in Terms of Equivalent Dose Distributions

    SciTech Connect

    Kok, H. Petra; Crezee, Johannes; Franken, Nicolaas A.P.; Barendsen, Gerrit W.

    2014-03-01

    Purpose: To develop a method to quantify the therapeutic effect of radiosensitization by hyperthermia; to this end, a numerical method was proposed to convert radiation therapy dose distributions with hyperthermia to equivalent dose distributions without hyperthermia. Methods and Materials: Clinical intensity modulated radiation therapy plans were created for 15 prostate cancer cases. To simulate a clinically relevant heterogeneous temperature distribution, hyperthermia treatment planning was performed for heating with the AMC-8 system. The temperature-dependent parameters α (Gy⁻¹) and β (Gy⁻²) of the linear–quadratic model for prostate cancer were estimated from the literature. No thermal enhancement was assumed for normal tissue. The intensity modulated radiation therapy plans and temperature distributions were exported to our in-house-developed radiation therapy treatment planning system, APlan, and equivalent dose distributions without hyperthermia were calculated voxel by voxel using the linear–quadratic model. Results: The planned average tumor temperatures T90, T50, and T10 in the planning target volume were 40.5°C, 41.6°C, and 42.4°C, respectively. The planned minimum, mean, and maximum radiation therapy doses were 62.9 Gy, 76.0 Gy, and 81.0 Gy, respectively. Adding hyperthermia yielded an equivalent dose distribution with an extended 95% isodose level. The equivalent minimum, mean, and maximum doses reflecting the radiosensitization by hyperthermia were 70.3 Gy, 86.3 Gy, and 93.6 Gy, respectively, for a linear increase of α with temperature. This can be considered similar to a dose escalation with a substantial increase in tumor control probability for high-risk prostate carcinoma. Conclusion: A model to quantify the effect of combined radiation therapy and hyperthermia in terms of equivalent dose distributions was presented. This model is particularly instructive to estimate the potential effects of interaction from different
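    The voxel-by-voxel conversion described above can be sketched with the linear-quadratic model: the equivalent dose without hyperthermia is the positive root of the quadratic that equates the two biological effects. The parameter values below are illustrative assumptions, not the paper's fitted values.

    ```python
    import math

    def equivalent_dose(d, alpha, beta, alpha_T, beta_T):
        """Dose without hyperthermia giving the same LQ effect as dose d
        delivered with hyperthermia-modified parameters (per voxel)."""
        effect = alpha_T * d + beta_T * d * d         # E = alpha_T*d + beta_T*d^2
        # solve alpha*x + beta*x^2 = effect for the positive root x
        return (-alpha + math.sqrt(alpha * alpha + 4.0 * beta * effect)) / (2.0 * beta)

    # assumed prostate-like LQ parameters; heating raises alpha by 20% (illustrative)
    alpha, beta = 0.15, 0.05                          # Gy^-1, Gy^-2
    d_eq = equivalent_dose(2.0, alpha, beta, 1.2 * alpha, beta)
    ```

    With radiosensitization (larger α), the equivalent dose exceeds the physical dose, mirroring the dose-escalation effect reported in the abstract.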

  4. Preliminary investigation of processes that affect source term identification. Environmental Restoration Program

    SciTech Connect

    Wickliff, D.S.; Solomon, D.K.; Farrow, N.D.

    1991-09-01

    Solid Waste Storage Area (SWSA) 5 is known to be a significant source of contaminants, especially tritium (³H), to the White Oak Creek (WOC) watershed. For example, Solomon et al. (1991) estimated the total ³H discharge in Melton Branch (most of which originates in SWSA 5) for the 1988 water year to be 1210 Ci. A critical issue for making decisions concerning remedial actions at SWSA 5 is knowing whether the annual contaminant discharge is increasing or decreasing. Because (1) the magnitude of the annual contaminant discharge is highly correlated to the amount of annual precipitation (Solomon et al., 1991) and (2) a significant lag may exist between the time of peak contaminant release from primary sources (i.e., waste trenches) and the time of peak discharge into streams, short-term stream monitoring by itself is not sufficient for predicting future contaminant discharges. In this study we use ³H to examine the link between contaminant release from primary waste sources and contaminant discharge into streams. By understanding and quantifying subsurface transport processes, realistic predictions of future contaminant discharge, along with an evaluation of the effectiveness of remedial action alternatives, will be possible. The objectives of this study are (1) to characterize the subsurface movement of contaminants (primarily ³H) with an emphasis on the effects of matrix diffusion; (2) to determine the relative strength of primary vs secondary sources; and (3) to establish a methodology capable of determining whether the ³H discharge from SWSA 5 to streams is increasing or decreasing.

  5. Temporal-spatial distribution of non-point source pollution in a drinking water source reservoir watershed based on SWAT

    NASA Astrophysics Data System (ADS)

    Wang, M.; Cheng, W.; Yu, B.-S.; Fang, Y.

    2015-05-01

    The conservation of drinking water source reservoirs bears closely on regional economic development and people's livelihoods. Research on the non-point pollution characteristics of their watersheds is crucial for reservoir security. The Tang Pu Reservoir watershed was selected as the study area. A non-point pollution model of the Tang Pu Reservoir was established based on the SWAT (Soil and Water Assessment Tool) model. The calibrated model was used to analyse the temporal-spatial distribution patterns of total nitrogen (TN) and total phosphorus (TP). The results showed that the losses of TN and TP in the reservoir watershed were related to precipitation in the flood season, and the annual changes showed an "M" shape. The flood-season contribution to the losses of TN and TP was 84.5% and 85.3% in high-flow years, 70.3% and 69.7% in low-flow years, and 62.9% and 63.3% in normal-flow years, respectively. The TN and TP arise mainly from Wangtan town, Gulai town, and Wangyuan town, etc. In addition, the sources of TN and TP showed spatial consistency.

  6. Theoretical and measured electric field distributions within an annular phased array: consideration of source antennas.

    PubMed

    Zhang, Y; Joines, W T; Jirtle, R L; Samulski, T V

    1993-08-01

    The magnitude of E-field patterns generated by an annular array prototype device has been calculated and measured. Two models were used to describe the radiating sources: a simple linear dipole and a stripline antenna model. The stripline model includes detailed geometry of the actual antennas used in the prototype and an estimate of the antenna current based on microstrip transmission line theory. This more detailed model yields better agreement with the measured field patterns, reducing the rms discrepancy by a factor of about 6 (from approximately 23% to 4%) in the central region of interest where the SEM is within 25% of the maximum. We conclude that accurate modeling of source current distributions is important for determining SEM distributions associated with such heating devices.

  7. Spatial distribution and source apportionment of PAHs in surficial sediments of the Yangtze Estuary, China.

    PubMed

    Li, Baohua; Feng, Chenghong; Li, Xue; Chen, Yaxin; Niu, Junfeng; Shen, Zhenyao

    2012-03-01

    The spatial distribution and source apportionment of polycyclic aromatic hydrocarbons (PAHs) in the surface sediments of the Yangtze Estuary, especially the North Branch, were investigated. PAH concentrations increased with descending distance from the inner estuary to the adjacent sea, and varied significantly among estuarine regions. Water currents (e.g., river runoff and ocean currents) greatly affected the distribution pattern. In addition, ambient sewage and traffic also contributed to the PAH pollution in the estuary. In the adjacent sea, PAH values along the −20 m isobath were higher than those along the −10 m isobath due to the "marginal filter" phenomenon formed by different water currents. At most sites, PAHs correlated poorly with sediment grain size but positively with total organic carbon. Qualitative and quantitative analyses showed the PAHs to derive mainly from a mixture of petroleum, biomass, and coal combustion. Copyright © 2011 Elsevier Ltd. All rights reserved.

  8. Passive-scheme analysis for solving the untrusted source problem in quantum key distribution

    SciTech Connect

    Peng Xiang; Xu Bingjie; Guo Hong

    2010-04-15

    As a practical method, the passive scheme is useful to monitor the photon statistics of an untrusted source in a 'Plug and Play' quantum key distribution (QKD) system. In a passive scheme, three kinds of monitor mode can be adopted: average photon number (APN) monitor, photon number analyzer (PNA), and photon number distribution (PND) monitor. In this paper, the security analysis is rigorously given for the APN monitor, while for the PNA, the analysis, including statistical fluctuation and random noise, is addressed with a confidence level. The results show that the PNA can achieve better performance than the APN monitor and can asymptotically approach the theoretical limit of the PND monitor. Also, the passive scheme with the PNA works efficiently when the signal-to-noise ratio (RSN) is not too low and so is highly applicable to solve the untrusted source problem in the QKD system.

  9. In situ image segmentation using the convexity of illumination distribution of the light sources.

    PubMed

    Zhang, Li

    2008-10-01

    When separating objects from a background in an image, we often meet difficulties in obtaining precise output due to unclear object edges, as well as poor or nonuniform illumination. To solve this problem, this paper presents an in situ segmentation method which takes advantage of the illumination distribution of the light sources, rather than analyzing the image pixels themselves. After analyzing the convexity of illumination distribution (CID) of point and linear light sources, the paper makes use of the CID features to find pixels belonging to the background. Some background pixels are then selected as control points to reconstruct the image background by means of B-splines; finally, by subtracting the reconstructed background from the original image, global thresholding can be employed to make the final segmentation. Quantitative evaluation experiments are made to test the performance of the method.
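    As a rough sketch of the reconstruct-and-subtract pipeline described above, with two stand-ins: a low-order polynomial surface in place of the B-spline reconstruction, and border pixels in place of CID-selected control points (both assumptions, on a synthetic image):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    h = w = 64
    y, x = np.mgrid[0:h, 0:w]

    # synthetic scene: smooth non-uniform illumination plus one bright object
    background = 100 + 0.8 * x + 0.5 * y - 0.01 * (x - 32) ** 2
    image = background + rng.normal(0, 1.0, (h, w))
    obj = (x - 40) ** 2 + (y - 20) ** 2 < 36
    image[obj] += 50.0

    # "background" control points: image border pixels, assumed object-free
    mask = np.zeros((h, w), bool)
    mask[[0, -1], :] = True
    mask[:, [0, -1]] = True

    # reconstruct the background with a quadratic surface (B-spline stand-in)
    A = np.column_stack([np.ones(mask.sum()), x[mask], y[mask],
                         x[mask] ** 2, x[mask] * y[mask], y[mask] ** 2])
    coef, *_ = np.linalg.lstsq(A, image[mask], rcond=None)
    recon = (coef[0] + coef[1] * x + coef[2] * y
             + coef[3] * x ** 2 + coef[4] * x * y + coef[5] * y ** 2)

    # subtract the reconstructed background, then apply a global threshold
    segmented = (image - recon) > 25.0
    ```

    Because the nonuniform illumination is removed before thresholding, a single global threshold isolates the object.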

  10. Dataset for Testing Contamination Source Identification Methods for Water Distribution Networks

    EPA Pesticide Factsheets

    This dataset includes the results of a simulation study using the source inversion techniques available in the Water Security Toolkit. The data was created to test the different techniques for accuracy, specificity, false positive rate, and false negative rate. The tests examined different parameters including measurement error, modeling error, injection characteristics, time horizon, network size, and sensor placement. The water distribution system network models that were used in the study are also included in the dataset. This dataset is associated with the following publication: Seth, A., K. Klise, J. Siirola, T. Haxton, and C. Laird. Testing Contamination Source Identification Methods for Water Distribution Networks. Journal of Environmental Division, Proceedings of the American Society of Civil Engineers. ASCE, Reston, VA, USA, 2016.

  11. Dynamical changes of ion current distribution for a Penning discharge source using a Langmuir probe array

    NASA Astrophysics Data System (ADS)

    Li, M.; Xiang, W.; Xiao, K. X.; Chen, L.

    2012-02-01

    A parallel plate electrode and a 9-tip Langmuir probe array located 1 mm behind the extraction exit of a cold cathode Penning ion source are employed to measure the total current and the dynamical changes of the ion current in the 2D profile, respectively. With the ion source operated from a 500 V DC power supply, the parallel plate electrode and the Langmuir probe array are driven by a bias voltage ranging from -200 V to 200 V. The dependence of the total current and of the dynamical changes of the ion current in the 2D profile on the bias voltage is presented. The experimental results show that the ion current distribution is axially symmetric and approximately unimodal.

  12. A Monte Carlo study on dose distribution evaluation of Flexisource 192Ir brachytherapy source

    PubMed Central

    Alizadeh, Majid; Ghorbani, Mahdi; Haghparast, Abbas; Zare, Naser; Ahmadi Moghaddas, Toktam

    2015-01-01

    Aim The aim of this study is to evaluate the dose distribution of the Flexisource 192Ir source. Background Dosimetric evaluation of brachytherapy sources is recommended by Task Group 43 (TG-43) of the American Association of Physicists in Medicine (AAPM). Materials and methods The MCNPX code was used to simulate the Flexisource 192Ir source. The dose rate constant and radial dose function were obtained for water and soft tissue phantoms and compared with previous data on this source. Furthermore, the dose rate along the transverse axis was obtained by simulation of the Flexisource and a point source, and the obtained data were compared with those from the Flexiplan treatment planning system (TPS). Results The values of the dose rate constant obtained for water and soft tissue phantoms were equal to 1.108 and 1.106, respectively. The values of the radial dose function are listed in the form of tabulated data. The values of dose rate (cGy/s) obtained are shown in the form of tabulated data and figures. The maximum difference between TPS and Monte Carlo (MC) dose rate values was 11% in a water phantom at 6.0 cm from the source. Conclusion Based on dosimetric parameter comparisons with values previously published, the accuracy of our simulation of the Flexisource 192Ir was verified. The results for the dose rate constant and radial dose function in water and soft tissue phantoms were the same for the Flexisource and point sources. For the Flexisource 192Ir source, the results of TPS calculations in a water phantom were in agreement with the simulations within the calculation uncertainties. Furthermore, the results from the TPS calculation for the Flexisource and the MC calculation for a point source were practically equal within the calculation uncertainties. PMID:25949224
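    For context, TG-43 dose-rate calculations of the kind verified here follow, in the point-source approximation, Ddot(r) = S_K · Λ · (r0/r)² · g(r) · φ_an. A sketch using the dose rate constant from the abstract (1.108) but otherwise made-up numbers: the g(r) samples, air-kerma strength, and anisotropy factor below are hypothetical, not Flexisource data.

    ```python
    import numpy as np

    # hypothetical radial dose function samples (NOT Flexisource data)
    r_tab = np.array([0.5, 1.0, 2.0, 3.0, 5.0])        # cm
    g_tab = np.array([1.01, 1.00, 0.98, 0.96, 0.90])   # illustrative g(r)

    def dose_rate(r, Sk=40000.0, Lam=1.108, r0=1.0, phi_an=0.98):
        """TG-43 point-source dose rate (cGy/h) at distance r (cm):
        Sk (U) * Lambda (cGy h^-1 U^-1) * inverse-square * g(r) * anisotropy."""
        g = np.interp(r, r_tab, g_tab)
        return Sk * Lam * (r0 / r) ** 2 * g * phi_an

    d1 = dose_rate(1.0)   # at the TG-43 reference distance r0 = 1 cm
    d2 = dose_rate(2.0)   # falls off slightly faster than inverse-square
    ```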

  13. [Case study of red water phenomenon in drinking water distribution systems caused by water source switch].

    PubMed

    Wang, Yang; Zhang, Xiao-jian; Chen, Chao; Pan, An-jun; Xu, Yang; Liao, Ping-an; Zhang, Su-xia; Gu, Jun-nong

    2009-12-01

    A red water phenomenon occurred in some communities of a city in China in the days after a water source switch. The origin of this red water problem and the mechanism of iron release were investigated in this study. The water quality of the local and new water sources was tested, and tap water quality in the affected area was monitored for 3 months after red water occurred. Interior corrosion scales on pipes obtained from the affected area were analyzed by XRD, SEM, and EDS. Corrosion rates of cast iron under the conditions of the two source waters were obtained with an annular reactor. The influence of the different source waters on iron release was studied with pipe-section reactors simulating the distribution system. The results indicated that the large increase in sulfate concentration brought by the water source shift was the cause of the red water problem. The Larson ratio increased from about 0.4 to 1.7-1.9, and the red water problem appeared in the taps of some urban communities just several days after the new water source was applied. The mechanism of iron release was concluded to be that the stable shell of the corrosion scales in the pipes had been disrupted by this high-sulfate source water and could hardly recover spontaneously in the short term. The effect of sulfate on iron release from the old cast iron was more significant than its effect on enhancing iron corrosion. The rate of iron release increased with increasing Larson ratio, and the correlation between them was nonlinear for the old cast iron. The problem persisted for quite a long time even after the supply was switched to a blend containing only a small ratio of the new source and the Larson ratio was reduced to about 0.6.
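    The Larson ratio invoked above is conventionally computed from chloride, sulfate, and bicarbonate concentrations expressed in milliequivalents per litre (carbonate assumed negligible here). A sketch; the two water compositions are invented to mimic the reported shift from about 0.4 to about 1.7, not measured values:

    ```python
    def larson_ratio(cl_mgL, so4_mgL, hco3_mgL):
        """Larson(-Skold) index: (Cl- + SO4^2-) / HCO3- in meq/L
        (carbonate assumed negligible)."""
        cl = cl_mgL / 35.45            # meq/L (monovalent)
        so4 = 2.0 * so4_mgL / 96.06    # meq/L (divalent)
        hco3 = hco3_mgL / 61.02        # meq/L (monovalent)
        return (cl + so4) / hco3

    before = larson_ratio(30.0, 30.0, 250.0)    # illustrative pre-switch water
    after = larson_ratio(30.0, 280.0, 250.0)    # sulfate raised by the switch
    ```

    Raising only the sulfate concentration drives the index across the range associated with aggressive, scale-destabilizing water.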

  14. Empirical tests of Zipf's law mechanism in open source Linux distribution.

    PubMed

    Maillart, T; Sornette, D; Spaeth, S; von Krogh, G

    2008-11-21

    Zipf's power law is a ubiquitous empirical regularity found in many systems, thought to result from proportional growth. Here, we establish empirically the usually assumed ingredients of stochastic growth models that have been previously conjectured to be at the origin of Zipf's law. We use exceptionally detailed data on the evolution of open source software projects in Linux distributions, which offer a remarkable example of a growing complex self-organizing adaptive system, exhibiting Zipf's law over four full decades.
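    A minimal simulation of the conjectured proportional-growth mechanism, using a Simon-style process as a stand-in for the stochastic growth models the paper tests (the entry rate and run length are arbitrary choices, not fitted to Linux data):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    alpha = 0.1          # probability that a step creates a new project
    units = [0]          # one "unit" (e.g. package) per entry, tagged by project id
    n_projects = 1

    for _ in range(20000):
        if rng.random() < alpha:
            units.append(n_projects)    # a new project enters with one unit
            n_projects += 1
        else:
            # proportional growth: drawing a uniform unit selects a project
            # with probability proportional to its current size
            units.append(units[rng.integers(len(units))])

    counts = np.bincount(units)         # size of each project
    ```

    Proportional growth plus steady entry yields the heavy-tailed, Zipf-like size distribution: a few very large projects coexist with a majority of tiny ones.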

  15. Single-Event Correlation Analysis of Quantum Key Distribution with Single-Photon Sources

    NASA Astrophysics Data System (ADS)

    Dong, Shangli; Wang, Xiaobo; Zhang, Guofeng; Xiao, Liantuan; Jia, Suotang

    2010-04-01

    Multiphoton emissions allow efficient eavesdropping strategies that threaten the security of quantum key distribution. In this paper, we theoretically discuss the photon correlations between authorized partners in the case of practical single-photon sources that include a multiphoton background. To investigate the feasibility of intercept-resend attacks, the cross-correlations and the maximum intercept-resend ratio caused by the background signal are determined using single-event correlation analysis based on single-event detection.

  16. Influence of the electron source distribution on field-aligned currents

    NASA Astrophysics Data System (ADS)

    Bruening, K.; Goertz, C. K.

    1985-01-01

    The field-aligned current density above a discrete auroral arc has been deduced from downward electron flux and magnetic field measurements on board the rocket Porcupine flight 4. Both measurements show that, in spite of decreasing peak energies towards the edge of the arc, the field-aligned current density there is about 4 times higher than in the center of the arc. This can be explained by using the single-particle description for an anisotropic electron source distribution.

  17. Distribution and probable source of nitrate in ground water of Paradise Valley, Arizona

    SciTech Connect

    Silver, B.A.; Fielden, J.R.

    1980-01-01

    Two theories have been proposed regarding the source of nitrate in Paradise Valley ground water: one suggests contamination by fertilizers and by treated wastewater effluent, and the other suggests that ammonium chloride, leached from tuffs in the adjacent Superstition Mountains, is oxidized to nitrate and deposited in a braided stream complex. The geology, hydrogeology, and distribution of nitrate in Paradise Valley ground water are described.

  18. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the estimates available at present are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from broad information gathering. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.

  19. Unveiling the Gamma-Ray Source Count Distribution Below the Fermi Detection Limit with Photon Statistics

    DOE PAGES

    Zechlin, Hannes-S.; Cuoco, Alessandro; Donato, Fiorenza; ...

    2016-07-26

    The source-count distribution as a function of flux, dN/dS, is one of the main quantities characterizing gamma-ray source populations. In this paper, we employ statistical properties of the Fermi Large Area Telescope (LAT) photon counts map to measure the composition of the extragalactic gamma-ray sky at high latitudes (|b| ≥ 30°) between 1 and 10 GeV. We present a new method, generalizing the use of standard pixel-count statistics, to decompose the total observed gamma-ray emission into (a) point-source contributions, (b) the Galactic foreground contribution, and (c) a truly diffuse isotropic background contribution. Using the 6-yr Fermi-LAT data set (P7REP), we show that the dN/dS distribution in the regime of so far undetected point sources can be consistently described with a power law with an index between 1.9 and 2.0. We measure dN/dS down to an integral flux of ~2 × 10⁻¹¹ cm⁻² s⁻¹, improving beyond the 3FGL catalog detection limit by about one order of magnitude. The overall dN/dS distribution is consistent with a broken power law, with a break at 2.1 (+1.0/−1.3) × 10⁻⁸ cm⁻² s⁻¹. The power-law index n1 = 3.1 (+0.7/−0.5) for bright sources above the break hardens to n2 = 1.97 ± 0.03 for fainter sources below the break. A possible second break of the dN/dS distribution is constrained to be at fluxes below 6.4 × 10⁻¹¹ cm⁻² s⁻¹ at 95% confidence level. Finally, the high-latitude gamma-ray sky between 1 and 10 GeV is shown to be composed of ~25% point sources, ~69.3% diffuse Galactic foreground emission, and ~6% isotropic diffuse background.
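    The broken power law reported above translates into cumulative source counts N(>S) by direct integration of dN/dS; a sketch using the quoted break and slopes, with an arbitrary normalization A (the paper's normalization is not reproduced here):

    ```python
    # dN/dS = A*(S/S_b)^(-n1) above the break, A*(S/S_b)^(-n2) below it
    S_b, n1, n2, A = 2.1e-8, 3.1, 1.97, 1.0   # break flux (cm^-2 s^-1) and slopes

    def cumulative_counts(S):
        """N(>S): number of sources brighter than S (analytic integral)."""
        # bright end, from max(S, S_b) to infinity; converges because n1 > 1
        lo = max(S, S_b)
        n_bright = A * S_b / (n1 - 1.0) * (lo / S_b) ** (1.0 - n1)
        if S >= S_b:
            return n_bright
        # faint end, from S up to S_b
        n_faint = A * S_b / (n2 - 1.0) * ((S / S_b) ** (1.0 - n2) - 1.0)
        return n_bright + n_faint

    # counts grow steeply toward the faint fluxes probed by the pixel statistics
    ratio = cumulative_counts(2e-11) / cumulative_counts(S_b)
    ```

    The faint-end slope n2 < 2 keeps the total flux from undetected sources finite even as the counts keep rising below the catalog limit.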

  20. A time-dependent volume of distribution term used to describe linear concentration-time profiles.

    PubMed

    Colburn, W A

    1983-08-01

    The present study was conducted to develop, test, and apply a single-exponential time-dependent volume of distribution function to describe the pharmacokinetics of compounds following instantaneous, zero-order, and first-order input. Simulations were used to show the applicability and flexibility of the equations. Experimental data from the literature were fitted using the equations developed in the present study. In addition, the results of these fitted curves were compared to the results of the original fitting procedures to compare and contrast the methods. In the present analysis, the change from an initial volume (V1) to the total volume (VT) is perceived as a simple exponential function. Therefore, a single exponential term to describe elimination and a single exponential term to describe the change from V1 to VT were used to describe the entire blood concentration profile. The results from the present simulations and fitted data indicate that the transition from V1 to VT is a more continuous process than observed with classical methods and is consistent with results obtained from physiologic flow-limited models. This observation suggests that the present curve-fitting technique may be more akin to physiologic reality, in that it depicts the change from the initial volume of distribution to the total volume of distribution as a continuous exponential function which reflects the establishment of an equilibrium.
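
    The abstract's idea of a volume of distribution relaxing exponentially from V1 to VT can be sketched as follows. The specific functional form V(t) = VT − (VT − V1)·exp(−k_v·t), the bolus-dose expression, and all parameter names are assumptions for illustration, not Colburn's published equations:

```python
import math

def concentration_iv_bolus(t, dose, V1, VT, k_v, k_el):
    """Blood concentration after an instantaneous (IV bolus) dose with a
    time-dependent volume of distribution: the volume grows from V1 to VT
    as a single exponential while elimination is a second single
    exponential, so the whole profile needs only two exponential terms.
    """
    V_t = VT - (VT - V1) * math.exp(-k_v * t)
    return dose * math.exp(-k_el * t) / V_t

# At t = 0 the drug occupies only V1; at large t the apparent volume is VT.
c0 = concentration_iv_bolus(0.0, dose=100.0, V1=5.0, VT=40.0, k_v=0.5, k_el=0.1)
print(c0)  # 100/5 = 20.0
```

    The continuous V(t) replaces the abrupt compartment-to-compartment transition of classical two-compartment fits, which is the contrast the abstract draws with "classical methods".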

  1. Short-Term Synaptic Depression Is Topographically Distributed in the Cochlear Nucleus of the Chicken

    PubMed Central

    Oline, Stefan N.

    2014-01-01

    In the auditory system, sounds are processed in parallel frequency-tuned circuits, beginning in the cochlea. Activity of auditory nerve fibers reflects this frequency-specific topographic pattern, known as tonotopy, and imparts frequency tuning onto their postsynaptic target neurons in the cochlear nucleus. In birds, cochlear nucleus magnocellularis (NM) neurons encode the temporal properties of acoustic stimuli by “locking” discharges to a particular phase of the input signal. Physiological specializations exist in gradients corresponding to the tonotopic axis in NM that reflect the characteristic frequency (CF) of their auditory nerve fiber inputs. One feature of NM neurons that has not been investigated across the tonotopic axis is short-term synaptic plasticity. NM offers a rather homogeneous population of neurons with a distinct topographical distribution of synaptic properties that is ideal for the investigation of specialized synaptic plasticity. Here we demonstrate for the first time that short-term synaptic depression (STD) is expressed topographically, where unitary high CF synapses are more robust with repeated stimulation. Correspondingly, high CF synapses drive spiking more reliably than their low CF counterparts. We show that postsynaptic AMPA receptor desensitization does not contribute to the observed difference in STD. Further, rate of recovery from depression, a presynaptic property, does not differ tonotopically. Rather, we show that another presynaptic feature, readily releasable pool (RRP) size, is tonotopically distributed and inversely correlated with vesicle release probability. Mathematical model results demonstrate that these properties of vesicle dynamics are sufficient to explain the observed tonotopic distribution of STD. PMID:24453322
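
    A minimal depletion model illustrates the abstract's finding that a larger readily releasable pool combined with a lower release probability (the high-CF configuration) yields weaker short-term depression. All parameter values below are illustrative, not fitted values from the paper:

```python
import math

def train_responses(n_stim, N0, p, tau_rec, isi):
    """Readily-releasable-pool (RRP) depletion model of short-term
    depression: each stimulus releases a fraction p of the current pool;
    between stimuli the pool recovers exponentially toward N0 with time
    constant tau_rec (same units as the inter-stimulus interval isi).
    """
    N = N0
    responses = []
    for _ in range(n_stim):
        released = p * N
        responses.append(released)
        N -= released
        # exponential recovery during the inter-stimulus interval
        N = N0 - (N0 - N) * math.exp(-isi / tau_rec)
    return responses

# High-CF-like synapse: large pool, low release probability -> mild depression.
# Low-CF-like synapse: small pool, high release probability -> strong depression.
high_cf = train_responses(10, N0=200, p=0.2, tau_rec=50.0, isi=10.0)
low_cf = train_responses(10, N0=60, p=0.6, tau_rec=50.0, isi=10.0)
print(high_cf[-1] / high_cf[0], low_cf[-1] / low_cf[0])
```

    The inverse correlation between RRP size and release probability reported in the abstract maps onto the (N0, p) pairs above: the high-p synapse depletes its small pool quickly, so its steady-state response ratio is lower.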

  2. Short-term synaptic depression is topographically distributed in the cochlear nucleus of the chicken.

    PubMed

    Oline, Stefan N; Burger, R Michael

    2014-01-22

    In the auditory system, sounds are processed in parallel frequency-tuned circuits, beginning in the cochlea. Activity of auditory nerve fibers reflects this frequency-specific topographic pattern, known as tonotopy, and imparts frequency tuning onto their postsynaptic target neurons in the cochlear nucleus. In birds, cochlear nucleus magnocellularis (NM) neurons encode the temporal properties of acoustic stimuli by "locking" discharges to a particular phase of the input signal. Physiological specializations exist in gradients corresponding to the tonotopic axis in NM that reflect the characteristic frequency (CF) of their auditory nerve fiber inputs. One feature of NM neurons that has not been investigated across the tonotopic axis is short-term synaptic plasticity. NM offers a rather homogeneous population of neurons with a distinct topographical distribution of synaptic properties that is ideal for the investigation of specialized synaptic plasticity. Here we demonstrate for the first time that short-term synaptic depression (STD) is expressed topographically, where unitary high CF synapses are more robust with repeated stimulation. Correspondingly, high CF synapses drive spiking more reliably than their low CF counterparts. We show that postsynaptic AMPA receptor desensitization does not contribute to the observed difference in STD. Further, rate of recovery from depression, a presynaptic property, does not differ tonotopically. Rather, we show that another presynaptic feature, readily releasable pool (RRP) size, is tonotopically distributed and inversely correlated with vesicle release probability. Mathematical model results demonstrate that these properties of vesicle dynamics are sufficient to explain the observed tonotopic distribution of STD.

  3. Source contributions to the regional distribution of secondary particulate matter in California

    NASA Astrophysics Data System (ADS)

    Ying, Qi; Kleeman, Michael J.

    Source contributions to PM2.5 nitrate, sulfate and ammonium ion concentrations in California's San Joaquin Valley (SJV) (4-6 January 1996) and South Coast Air Basin (SoCAB) surrounding Los Angeles (23-25 September 1996) were predicted using a three-dimensional source-oriented Eulerian air quality model. The air quality model tracks the formation of PM2.5 nitrate, sulfate and ammonium ion from primary particles and precursor gases emitted from different sources through a mathematical simulation of emission, chemical reaction, gas-to-particle conversion, transport and deposition. The observed PM2.5 nitrate, sulfate and ammonium ion concentrations, and the mass distribution of nitrate, sulfate and ammonium ion as a function of particle size, have been successfully reproduced by the model simulation. Approximately 45-57% of the PM2.5 nitrate and 34-40% of the PM2.5 ammonium ion in the SJV is formed from precursor gaseous species released from sources upwind of the valley. In the SoCAB, approximately 83% of the PM2.5 nitrate and 82% of the PM2.5 ammonium ion is formed from precursor gaseous species released from sources within the air basin. In the SJV, transportation-related sources contribute approximately 24-30% of the PM2.5 nitrate (diesel engines ˜13.5-17.0%, catalyst-equipped gasoline engines ˜10.2-12.8% and non-catalyst-equipped gasoline engines ˜0.3-0.4%). In the SoCAB, transportation-related sources directly contribute approximately 67% of the PM2.5 nitrate (diesel engines 34.6%, non-catalyst-equipped gasoline engines 4.7% and catalyst-equipped gasoline engines 28.1%). PM2.5 ammonium ion concentrations in the SJV were dominated by area (including animal) NH3 sources (16.7-25.3%), soil (7.2-10.9%), fertilizer NH3 sources (11.4-17.3%) and point NH3 sources (14.3-21.7%). In the SoCAB, ammonium ion is mainly associated with animal sources (28.2%) and catalyst-equipped gasoline engines (16.2%). In both regions, the majority of the relatively low PM2.5 sulfate

  4. Evaluation of severe accident risks: Quantification of major input parameters. Experts' determination of source term issues: Volume 2, Revision 1, Part 4

    SciTech Connect

    Harper, F.T.; Breeding, R.J.; Brown, T.D.; Gregory, J.J.; Jow, H.N.; Payne, A.C.; Gorham, E.D.; Amos, C.N.; Helton, J.; Boyd, G.

    1992-06-01

    In support of the Nuclear Regulatory Commission's (NRC's) assessment of the risk from severe accidents at commercial nuclear power plants in the US reported in NUREG-1150, the Severe Accident Risk Reduction Program (SAARP) has completed a revised calculation of the risk to the general public from severe accidents at five nuclear power plants: Surry, Sequoyah, Zion, Peach Bottom and Grand Gulf. The emphasis in this risk analysis was not on determining a point estimate of risk, but on determining the distribution of risk and assessing the uncertainties that account for the breadth of this distribution. Off-site risk is initiated by events both internal and external to the power station. Much of this important input to the logic models was generated by expert panels. This document presents the distributions, and the rationale supporting them, for the questions posed to the Source Term Panel.

  5. Microbial activity and distribution during enhanced contaminant dissolution from a NAPL source zone.

    PubMed

    Amos, Benjamin K; Suchomel, Eric J; Pennell, Kurt D; Löffler, Frank E

    2008-06-01

    Laboratory experiments were conducted to assess microbial reductive dechlorination in one-dimensional sand columns containing a 10 cm long source zone of uniformly distributed residual tetrachloroethene (PCE) nonaqueous phase liquid (NAPL), a 10 cm long transition zone directly down-gradient of the source zone containing some nonuniformly distributed NAPL ganglia, and a 40 cm long plume region down-gradient of the transition zone. The activity and distribution of Sulfurospirillum multivorans, a PCE-to-1,2-cis-dichloroethene (cis-DCE) dechlorinating bacterium, was evaluated in columns containing either a mixed-NAPL (0.25 mol/mol PCE in hexadecane) or pure PCE-NAPL. Significant dechlorination of PCE to cis-DCE was observed in the mixed-NAPL column, resulting in 53% PCE-NAPL mass recovery in the effluent with PCE-NAPL dissolution enhanced by up to 13.6-fold (maximum) and 4.6-fold (cumulative) relative to abiotic dissolution. Quantitative real-time PCR targeting pceA, the PCE reductive dehalogenase gene of S. multivorans, revealed that S. multivorans cells were present in the NAPL source zone, and increased in numbers (i.e., grew) throughout the source and transition zones. In contrast, minimal reductive dechlorination and microbial growth were observed in the column containing pure PCE-NAPL, where aqueous-phase PCE concentrations reached saturation. These results demonstrate that microbial growth within NAPL source zones is possible, provided that contaminant concentrations remain below levels toxic to the dechlorinating organisms, and that microbial growth can result in significant bioenhanced NAPL dissolution.

  6. Polycyclic Aromatic Hydrocarbons in the Dagang Oilfield (China): Distribution, Sources, and Risk Assessment

    PubMed Central

    Jiao, Haihua; Rui, Xiaoping; Wu, Shanghua; Bai, Zhihui; Zhuang, Xuliang; Huang, Zhanbin

    2015-01-01

    The levels of 16 polycyclic aromatic hydrocarbons (PAHs) were investigated in 27 upper-layer (0–25 cm) soil samples collected from the Dagang Oilfield (China) in April 2013 to estimate their distribution, possible sources, and potential risks posed. The total concentrations of PAHs (∑PAHs) varied between 103.6 µg·kg−1 and 5872 µg·kg−1, with a mean concentration of 919.8 µg·kg−1; increased concentrations were noted along a gradient from arable desert soil (mean 343.5 µg·kg−1), to oil well areas (mean of 627.3 µg·kg−1), to urban and residential zones (mean of 1856 µg·kg−1). Diagnostic ratios showed diverse sources of PAHs, including petroleum, liquid fossil fuels, and biomass combustion sources. Combustion sources were most significant for PAHs in arable desert soils and residential zones, while petroleum sources were a significant source of PAHs in oilfield areas. Based on their carcinogenicity, PAHs were classified as carcinogenic (B) or not classified/non-carcinogenic (NB). The total concentrations of carcinogenic PAHs (∑BPAHs) varied from 13.3 µg·kg−1 to 4397 µg·kg−1 across all samples, with a mean concentration of 594.4 µg·kg−1. The results suggest that oilfield soil is subject to a certain level of ecological and environmental risk. PMID:26016436

  7. Potential breeding distributions of U.S. birds predicted with both short-term variability and long-term average climate data.

    PubMed

    Bateman, Brooke L; Pidgeon, Anna M; Radeloff, Volker C; Flather, Curtis H; VanDerWal, Jeremy; Akçakaya, H Resit; Thogmartin, Wayne E; Albright, Thomas P; Vavrus, Stephen J; Heglund, Patricia J

    2016-12-01

    Climate conditions, such as temperature or precipitation, averaged over several decades strongly affect species distributions, as evidenced by experimental results and a plethora of models demonstrating statistical relations between species occurrences and long-term climate averages. However, long-term averages can conceal climate changes that have occurred in recent decades and may not capture actual species occurrence well because the distributions of species, especially at the edges of their range, are typically dynamic and may respond strongly to short-term climate variability. Our goal here was to test whether bird occurrence can be predicted by covariates based on either short-term climate variability or long-term climate averages. We parameterized species distribution models (SDMs) based on either short-term variability or long-term average climate covariates for 320 bird species in the conterminous USA and tested whether any life-history trait-based guilds were particularly sensitive to short-term conditions. Models including short-term climate variability performed well based on their cross-validated area-under-the-curve (AUC) score (0.85), as did models based on long-term climate averages (0.84). Similarly, both models performed well compared to independent presence/absence data from the North American Breeding Bird Survey (independent AUC of 0.89 and 0.90, respectively). However, models based on short-term variability covariates more accurately classified true absences for most species (73% of true absences classified within the lowest quarter of environmental suitability vs. 68%). In addition, they have the advantage that they can reveal the dynamic relationship between species and their environment because they capture the spatial fluctuations of species' potential breeding distributions. With this information, we can identify which species and guilds are sensitive to climate variability, identify sites of high conservation value where climate
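
    The AUC scores used above to compare the two covariate sets can be computed without any modeling library via the Mann-Whitney interpretation of AUC: the probability that a randomly chosen presence receives a higher suitability score than a randomly chosen absence. A minimal sketch with toy scores (the numbers are invented, not from the study):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (presence, absence) pairs in which the presence
    scores higher (ties count half)."""
    n_pairs = len(scores_pos) * len(scores_neg)
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in scores_pos for n in scores_neg
    )
    return wins / n_pairs

# Suitability scores at known presences vs. known absences (toy numbers).
print(auc([0.9, 0.8, 0.7, 0.4], [0.6, 0.3, 0.2, 0.1]))  # 0.9375
```

    An AUC of 0.85, as reported for the short-term-variability models, means a random presence outranks a random absence 85% of the time.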

  8. Potential breeding distributions of U.S. birds predicted with both short-term variability and long-term average climate data

    Treesearch

    Brooke L. Bateman; Anna M. Pidgeon; Volker C. Radeloff; Curtis H. Flather; Jeremy VanDerWal; H. Resit Akcakaya; Wayne E. Thogmartin; Thomas P. Albright; Stephen J. Vavrus; Patricia J. Heglund

    2016-01-01

    Climate conditions, such as temperature or precipitation, averaged over several decades strongly affect species distributions, as evidenced by experimental results and a plethora of models demonstrating statistical relations between species occurrences and long-term climate averages. However, long-term averages can conceal climate changes that have occurred in...

  9. Inverse Analysis of Heat Conduction in Hollow Cylinders with Asymmetric Source Distributions

    NASA Astrophysics Data System (ADS)

    Lambrakos, Samuel G.; Michopoulos, John G.; Jones, Harry N.; Boyer, Craig N.

    2008-10-01

    This paper presents an application of inverse analysis for determining both the temperature field histories and the corresponding heat source distributions in hollow cylinders. The primary goal, however, is the development of an inversion infrastructure in a manner that allows taking advantage of all aspects related to its utility, including sensitivity analysis. The conditions generating heat sources are those resulting from intense pulsed-current electrical contact experiments. Under these conditions, intense heat currents are generated due to the Joule conversion of the electric conduction currents. Asymmetry of the heat source is induced by localized melting due to arc-enhanced electric conduction. Experimentally acquired temperature histories and melting domain boundary data are utilized to set up an inverse model of the heat conduction problem. This permits not only the construction of an estimate of the temperature field histories throughout the computational domain but also an evaluation of the effective thermal diffusivity of the material involved.

  10. Heralded single-photon sources for quantum-key-distribution applications

    NASA Astrophysics Data System (ADS)

    Schiavon, Matteo; Vallone, Giuseppe; Ticozzi, Francesco; Villoresi, Paolo

    2016-01-01

    Single-photon sources (SPSs) are a fundamental building block for optical implementations of quantum information protocols. Among SPSs, multiple-crystal heralded single-photon sources seem to give the best compromise between a high pair production rate and few multiple-photon events. In this work, we study their performance in a practical quantum-key-distribution experiment by evaluating the achievable key rates. The analysis focuses on the two different schemes, symmetric and asymmetric, proposed for the practical implementation of heralded single-photon sources, with attention to the performance of their constituent elements. The analysis is based on the protocol proposed by Bennett and Brassard in 1984 and on its improvement exploiting the decoy-state technique. Finally, a simple way of exploiting the postselection mechanism for a passive, one-decoy-state scheme is evaluated.

  11. Balancing continuous-variable quantum key distribution with source-tunable linear optics cloning machine

    NASA Astrophysics Data System (ADS)

    Guo, Ying; Lv, Geli; Zeng, Guihua

    2015-11-01

    We show that the tolerable excess noise can be dynamically balanced in source preparation while inserting a tunable linear optics cloning machine (LOCM) for balancing the secret key rate and the maximal transmission distance of continuous-variable quantum key distribution (CVQKD). The intensities of source noise are sensitive to the tunable LOCM and can be stabilized to the suitable values to eliminate the impact of channel noise and defeat the potential attacks even in the case of the degenerated linear optics amplifier (LOA). The LOCM-additional noise can be elegantly employed by the reference partner of reconciliation to regulate the secret key rate and the transmission distance. Simulation results show that there is a considerable improvement in the secret key rate of the LOCM-based CVQKD while providing a tunable LOCM for source preparation with the specified parameters in suitable ranges.

  12. Radiation Therapy Photon Beams Dose Conformation According to Dose Distribution Around Intracavitary-Applied Brachytherapy Sources

    SciTech Connect

    Jurkovic, Slaven; Zauhar, Gordana; Faj, Dario; Smilovic Radojcic, Deni; Svabic, Manda

    2010-04-01

    Intracavitary application of brachytherapy sources followed by external beam radiation is essential for the local treatment of carcinoma of the cervix. Due to the very high doses delivered by the brachytherapy sources to the central portion of the target volume, this part of the target volume must be shielded during irradiation by photon beams. Several shielding techniques are available, from rectangular blocks and standard cervix wedges to more precise, customized step wedge filters. Because the calculation of a step wedge filter's shape was usually based on an effective attenuation coefficient, an approach that accounts for scattered radiation in a more precise way is suggested. The method was verified under simulated clinical conditions using film dosimetry. Measured data for various compensators were compared to the numerically determined sum of the dose distribution around the brachytherapy sources and that of the compensated beam. Improvements in the total dose distribution using our method are demonstrated. Agreement between calculations and measurements was within 3%. The sensitivity of the method to source displacement during treatment has also been investigated.

  13. Performance metrics and variance partitioning reveal sources of uncertainty in species distribution models

    USGS Publications Warehouse

    Watling, James I.; Brandt, Laura A.; Bucklin, David N.; Fujisaki, Ikuko; Mazzotti, Frank J.; Romanach, Stephanie; Speroterra, Carolina

    2015-01-01

    Species distribution models (SDMs) are widely used in basic and applied ecology, making it important to understand sources and magnitudes of uncertainty in SDM performance and predictions. We analyzed SDM performance and partitioned variance among prediction maps for 15 rare vertebrate species in the southeastern USA using all possible combinations of seven potential sources of uncertainty in SDMs: algorithms, climate datasets, model domain, species presences, variable collinearity, CO2 emissions scenarios, and general circulation models. The choice of modeling algorithm was the greatest source of uncertainty in SDM performance and prediction maps, with some additional variation in performance associated with the comprehensiveness of the species presences used for modeling. Other sources of uncertainty that have received attention in the SDM literature such as variable collinearity and model domain contributed little to differences in SDM performance or predictions in this study. Predictions from different algorithms tended to be more variable at northern range margins for species with more northern distributions, which may complicate conservation planning at the leading edge of species' geographic ranges. The clear message emerging from this work is that researchers should use multiple algorithms for modeling rather than relying on predictions from a single algorithm, invest resources in compiling a comprehensive set of species presences, and explicitly evaluate uncertainty in SDM predictions at leading range margins.

  14. Source apportionment of ambient fine particle size distribution using positive matrix factorization in Erfurt, Germany

    PubMed Central

    Yue, Wei; Stölzel, Matthias; Cyrys, Josef; Pitz, Mike; Heinrich, Joachim; Kreyling, Wolfgang G.; Wichmann, H.-Erich; Peters, Annette; Wang, Sheng; Hopke, Philip K.

    2008-01-01

    Particle size distribution data collected between September 1997 and August 2001 in Erfurt, Germany were used to investigate the sources of ambient particulate matter by positive matrix factorization (PMF). A total of 29,313 hourly averaged particle size distribution measurements covering the size range of 0.01 to 3.0 μm were included in the analysis. The particle number concentrations (cm−3) for the 9 channels in the ultrafine range, and mass concentrations (ng m−3) for the 41 size bins in the accumulation mode and for particles up to 3 μm in aerodynamic diameter, were used in the PMF. The analysis was performed separately for each season. Additional analyses were performed, including calculations of the correlations of factor contributions with gaseous pollutants (O3, NO, NO2, CO and SO2) and particle composition data (sulfate, organic carbon and elemental carbon), estimating the contributions of each factor to the total number and mass concentration, identifying the directional locations of the sources using the conditional probability function, and examining the diurnal patterns of factor scores. These results were used to assist in the interpretation of the factors. Five factors representing particles from airborne soil, ultrafine particles from local traffic, secondary aerosols from local fuel combustion, particles from remote traffic sources, and secondary aerosols from multiple sources were identified in all seasons. PMID:18433834
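
    PMF factors the (time × size-bin) data matrix into nonnegative factor contributions and factor profiles. Real PMF additionally weights each residual by its measurement uncertainty, so the plain multiplicative-update NMF below, run on synthetic data, is only a simplified stand-in for the method used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 3 hidden "sources" mixing into 20 size channels over 100 hours.
true_profiles = rng.random((3, 20))
true_contrib = rng.random((100, 3))
X = true_contrib @ true_profiles

def nmf(X, k, n_iter=500, eps=1e-9):
    """Plain multiplicative-update nonnegative matrix factorization:
    X ~= W @ H with W, H >= 0 (a simplified, unweighted analogue of PMF)."""
    n, m = X.shape
    W = rng.random((n, k)) + eps   # factor contributions (time x factor)
    H = rng.random((k, m)) + eps   # factor profiles (factor x size bin)
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

W, H = nmf(X, k=3)
rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
print(rel_err)  # small residual: a 3-factor model reconstructs the rank-3 data
```

    Interpreting the recovered rows of H as source profiles (soil, traffic, combustion, ...) is the step the study supports with gas-phase correlations, wind-direction statistics, and diurnal patterns.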

  15. Regional Sources of Nitrous Oxide over the United States: Seasonal Variation and Spatial Distribution

    SciTech Connect

    Miller, S. M.; Kort, E. A.; Hirsch, A. I.; Dlugokencky, E. J.; Andrews, A. E.; Xu, X.; Tian, H.; Nehrkorn, T.; Eluszkiewicz, J.; Michalak, A. M.; Wofsy, S. C.

    2012-01-01

    This paper presents top-down constraints on the magnitude, spatial distribution, and seasonality of nitrous oxide (N2O) emissions over the central United States. We analyze data from tall towers in 2004 and 2008 using a high resolution Lagrangian particle dispersion model paired with both geostatistical and Bayesian inversions. Our results indicate peak N2O emissions in June with a strong seasonal cycle. The spatial distribution of sources closely mirrors data on fertilizer application with particularly large N2O sources over the US Corn Belt. Existing inventories for N2O predict emissions that differ substantially from the inverse model results in both seasonal cycle and magnitude. We estimate a total annual N2O budget over the central US of 0.9-1.2 TgN/yr and an extrapolated budget for the entire US and Canada of 2.1-2.6 TgN/yr. By this estimate, the US and Canada account for 12-15% of the total global N2O source or 32-39% of the global anthropogenic source as reported by the Intergovernmental Panel on Climate Change in 2007.

  16. Reactor-building-basement radionuclide and source distribution studies. Volume 3

    SciTech Connect

    Cox, T.E.; Horan, J.T.; Worku, G.

    1983-06-01

    The Three Mile Island Unit 2 (TMI-2) Reactor Building basement has been sampled several times since August 1979. This report compiles the analytical results and sample history for the liquid and solid samples obtained to date. In addition, basement radiation levels were also obtained using thermoluminescent dosimeters (TLDs). The data obtained will provide information to support ongoing mass balance and source term studies and will aid in characterizing the 282-ft elevation for decontamination planning and dose reduction.

  17. Spatial distribution of the source-receptor relationship of sulfur in Northeast Asia

    NASA Astrophysics Data System (ADS)

    Kajino, M.; Ueda, H.; Sato, K.; Sakurai, T.

    2011-07-01

    The spatial distribution of the source-receptor relationship (SRR) of sulfur over Northeast Asia was examined using a chemical transport model (RAQM) off-line coupled with a meteorological model (MM5). The simulation was conducted for the entire year of 2002. The results were evaluated using monitoring data for six remote stations of the Acid Deposition Monitoring Network in East Asia (EANET). The modeled SO2 and O3 concentrations agreed well with the observations quantitatively. The modeled aerosol and wet deposition fluxes of SO42- were underestimated by 30% and 50%, respectively. The domain was divided into 5 source-receptor regions: (I) North China; (II) Central China; (III) South China; (IV) South Korea; and (V) Japan. The sulfur deposition in each receptor region amounted to about 50-75% of the emissions from the same region. The largest contribution to the deposition in each region originated from the same region, accounting for 53-84%. The second largest contribution was due to Region II, supplying 14-43%. The spatial distributions of the SRRs revealed that subregional values varied by about a factor of two around regional averages due to nonuniformity of the deposition fields. Examining the spatial distributions of the deposition fields was important for identifying subregional areas where the deposition was highest within a receptor region. The horizontal distribution changed substantially with season.
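
    The source-receptor bookkeeping described here reduces to normalizing a deposition matrix so that each receptor's row sums to one, giving the fractional contribution of every source region. The matrix below is purely illustrative (hypothetical numbers, not the paper's results):

```python
import numpy as np

# Hypothetical annual sulfur deposition (arbitrary units):
# rows = receptor regions I..V, columns = source regions I..V.
regions = ["I", "II", "III", "IV", "V"]
D = np.array([
    [60.,  20.,  2.,  1.,  1.],
    [10., 150.,  8.,  2.,  1.],
    [ 2.,  30., 70.,  1.,  1.],
    [ 3.,  25.,  2., 40.,  2.],
    [ 2.,  20.,  2.,  5., 45.],
])

# Fractional contribution of each source to each receptor's total deposition.
frac = D / D.sum(axis=1, keepdims=True)
for i, r in enumerate(regions):
    print(f"receptor {r}: domestic {frac[i, i]:.0%}, from region II {frac[i, 1]:.0%}")
```

    The diagonal of `frac` is the "domestic" share the abstract reports (53-84% in the study); the column for Region II gives its share at every other receptor.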

  18. Spatial distribution of the source-receptor relationship of sulfur in Northeast Asia

    NASA Astrophysics Data System (ADS)

    Kajino, M.; Ueda, H.; Sato, K.; Sakurai, T.

    2010-12-01

    The spatial distribution of the source-receptor relationship (SRR) of sulfur over Northeast Asia was examined using an off-line coupled meteorological/chemical transport model (MM5/RAQM). The simulation was conducted for the entire year of 2002. The results were evaluated using monitoring data for six remote stations of the Acid Deposition Monitoring Network in East Asia (EANET). The modeled SO2 and O3 concentrations agreed well with the observations quantitatively. The modeled aerosol and wet deposition fluxes of SO42- were underestimated by 30% and 50%, respectively, whereas the modeled precipitation was overestimated by 1.6 to 1.9 times. The domain was divided into 5 source-receptor regions: I, North China; II, Central China; III, South China; IV, South Korea; and V, Japan. The sulfur deposition in each receptor region amounted to about 50-75% of the emissions from the same region. The largest contribution to the deposition in each region was of domestic origin, accounting for 53-84%. Outside Region II itself, the second largest contribution was due to Region II, supplying 14-43%. The spatial distributions of the SRRs revealed that subregional values varied by about a factor of two around regional averages due to nonuniformity of the deposition fields. Examining the spatial distributions of the deposition fields was important for identifying subregional areas where the deposition was highest within a receptor region. The horizontal distribution changed substantially with season.

  19. Analysis of electron energy distribution function in the Linac4 H⁻ source

    SciTech Connect

    Mochizuki, S.; Nishida, K.; Hatayama, A.; Mattei, S.; Lettry, J.

    2016-02-15

    To understand the Electron Energy Distribution Function (EEDF) in the Radio Frequency Inductively Coupled Plasmas (RF-ICPs) in hydrogen negative ion sources, a detailed analysis of the EEDFs using numerical simulation and a theoretical approach based on the Boltzmann equation has been performed. It is shown that the EEDF of RF-ICPs consists of two parts: a low-energy part that obeys a Maxwellian distribution, and a high-energy part that deviates from a Maxwellian distribution. These simulation results have been confirmed to be reasonable by the analytical approach. The results suggest that it is possible to enhance the dissociation of molecules and the resultant H⁻ negative ion production by reducing the gas pressure.
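
    The two-part EEDF described here can be sketched as a mixture of a cold Maxwellian bulk and a small hot tail. For simplicity the tail is modeled below as a second Maxwellian, although the abstract stresses that the real tail is non-Maxwellian; the temperatures and the 5% tail fraction are illustrative assumptions:

```python
import numpy as np

def maxwellian_eedf(E, Te):
    """Maxwellian EEDF, f(E) ~ sqrt(E) * exp(-E/Te), normalized to unit
    area on the given energy grid.  E and Te are in eV."""
    f = np.sqrt(E) * np.exp(-E / Te)
    return f / np.trapz(f, E)

# Cold bulk (2 eV) plus a small hot tail (15 eV): the tail population is
# what drives high-threshold processes such as molecular dissociation.
E = np.linspace(0.01, 100.0, 2000)
eedf = 0.95 * maxwellian_eedf(E, Te=2.0) + 0.05 * maxwellian_eedf(E, Te=15.0)
print(np.trapz(eedf, E))  # ~1.0: the mixture stays normalized
```

    Even a few percent of hot electrons raises the distribution at tens of eV by many orders of magnitude over the cold bulk alone, which is why the shape of the tail matters for H⁻ production.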

  20. Analysis of electron energy distribution function in the Linac4 H- source

    NASA Astrophysics Data System (ADS)

    Mochizuki, S.; Mattei, S.; Nishida, K.; Hatayama, A.; Lettry, J.

    2016-02-01

    To understand the Electron Energy Distribution Function (EEDF) in the Radio Frequency Inductively Coupled Plasmas (RF-ICPs) in hydrogen negative ion sources, a detailed analysis of the EEDFs using numerical simulation and a theoretical approach based on the Boltzmann equation has been performed. It is shown that the EEDF of RF-ICPs consists of two parts: a low-energy part that obeys a Maxwellian distribution, and a high-energy part that deviates from a Maxwellian distribution. These simulation results have been confirmed to be reasonable by the analytical approach. The results suggest that it is possible to enhance the dissociation of molecules and the resultant H- negative ion production by reducing the gas pressure.

  1. The Temporal and Spatial Distribution Characteristics of Heating Season and Source Tracing in Beijing

    NASA Astrophysics Data System (ADS)

    Gong, Huili; Zhao, Wenhui; Li, Xiaojuan; Zhao, Wenji

    2013-01-01

    Inhalable particulate matter (IPM) is one of the principal pollutants in Beijing. Sand weather in the spring and winter seasons is partly due to regional airflow, but in most cases it results from local (autochthonous) pollution, especially during the winter heating season. In this paper, the temporal and spatial distribution of IPM and the relationship between IPM and its influencing factors were studied by combining remote sensing (RS) techniques with ground-based monitoring. Changes in the underlying surface, obtained from high-resolution remote sensing images of different periods, were analyzed; the content of particles of different diameters was collected by ground observation instruments and their chemical composition analyzed; and the relationship between the distribution of IPM and the underlying surface was studied using GIS spatial analysis. The results indicate that the pollution distribution of IPM is closely related to the underlying surface, man-made pollution sources, population density, and meteorological factors.

  2. Analysis of electron energy distribution function in the Linac4 H⁻ source.

    PubMed

    Mochizuki, S; Mattei, S; Nishida, K; Hatayama, A; Lettry, J

    2016-02-01

To understand the Electron Energy Distribution Function (EEDF) in the Radio Frequency Inductively Coupled Plasmas (RF-ICPs) in hydrogen negative ion sources, a detailed analysis of the EEDFs using numerical simulation and a theoretical approach based on the Boltzmann equation has been performed. It is shown that the EEDF of RF-ICPs consists of two parts: a low energy part that obeys a Maxwellian distribution, and a high energy part that deviates from the Maxwellian distribution. These simulation results have been confirmed to be reasonable by the analytical approach. The results suggest that it is possible to enhance the dissociation of molecules, and the resultant H(-) negative ion production, by reducing the gas pressure.

  3. From Source to City: Particulate Matter Concentration and Size Distribution Data from an Icelandic Dust Storm

    NASA Astrophysics Data System (ADS)

    Thorsteinsson, T.; Mockford, T.; Bullard, J. E.

    2015-12-01

Dust storms are the source of particulate matter in 20%-25% of the cases in which the PM10 health limit is exceeded in Reykjavik, which occurred approximately 20 times a year in 2005-2010. Some of the most active dust storm source areas in Iceland, contributing to the particulate matter load in Reykjavik, are on the south coast, with more than 20 dust storm days per year (in 2002-2011). Measurements of particulate matter concentration and size distribution were recorded at Markarfljot in May and June 2015. Markarfljot is a glacial river fed by Eyjafjallajokull and Myrdalsjokull, and its downstream sandur areas have been shown to be significant dust sources. Particulate matter concentration during dust storms was recorded on the sandur area using a TSI DustTrak DRX Aerosol Monitor 8533, and particle size data were recorded using a TSI Optical Particle Sizer 3330 (OPS). Wind speed was measured using cup anemometers at five heights. Particle sizes measured at the source area indicate extremely fine dust production, with PM1 concentrations reaching over 5000 μg/m3 and accounting for most of the mass. This is potentially due to sand particles chipping during saltation instead of breaking uniformly. Dust events occurring during easterly winds were captured by two permanent PM10 aerosol monitoring stations in Reykjavik (140 km west of Markarfljot), suggesting the regional nature of these events. OPS measurements from Reykjavik also provide an interesting comparison of particle size distributions from source to city. Dust storms contribute to the particulate matter pollution in Reykjavik, and their small particle size, at least from this source area, might be a serious health concern.

  4. Integrating multiple data sources in species distribution modeling: a framework for data fusion.

    PubMed

    Pacifici, Krishna; Reich, Brian J; Miller, David A W; Gardner, Beth; Stauffer, Glenn; Singh, Susheela; McKerrow, Alexa; Collazo, Jaime A

    2017-03-01

The last decade has seen a dramatic increase in the use of species distribution models (SDMs) to characterize patterns of species' occurrence and abundance. Efforts to parameterize SDMs often create a tension between the quality and quantity of data available to fit models. Estimation methods that integrate both standardized and non-standardized data types offer a potential solution to the tradeoff between data quality and quantity. Recently several authors have developed approaches for jointly modeling two sources of data (one of high quality and one of lesser quality). We extend their work by allowing for explicit spatial autocorrelation in occurrence and detection error using a Multivariate Conditional Autoregressive (MVCAR) model, and we develop three models that share information in a less direct manner, resulting in more robust performance when the auxiliary data are of lesser quality. We describe these three new approaches ("Shared," "Correlation," "Covariates") for combining data sources and show their use in a case study of the Brown-headed Nuthatch in the Southeastern U.S. and through simulations. All three of the approaches that used the second data source improved out-of-sample predictions relative to a single data source ("Single"). When information in the second data source is of high quality, the Shared model performs the best, but the Correlation and Covariates models also perform well. When the information in the second data source is of lesser quality, the Correlation and Covariates models performed better, suggesting they are robust alternatives when little is known about auxiliary data collected opportunistically or through citizen scientists. Methods that allow for both data types to be used will maximize the useful information available for estimating species distributions.
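The core idea of combining a small high-quality survey with a larger opportunistic data set can be illustrated with a much simpler toy than the MVCAR models described in the abstract: two binomial data sources sharing one occupancy probability p, where the opportunistic source observes p scaled by an unknown detection rate d. All counts and the grid-search estimator below are invented for illustration; they are not from the paper.

```python
import math

# Toy "shared parameter" data fusion: source 1 is a standardized survey
# (detection assumed ~1), source 2 is opportunistic with unknown detection d,
# so its apparent occupancy is p * d. We maximize the joint binomial
# log-likelihood over (p, d) on a coarse grid.

def binom_loglik(k, n, q):
    """Binomial log-likelihood (up to a constant) of k successes in n trials."""
    q = min(max(q, 1e-9), 1 - 1e-9)
    return k * math.log(q) + (n - k) * math.log(1 - q)

k1, n1 = 30, 100   # high-quality survey: 30 detections at 100 sites
k2, n2 = 90, 600   # opportunistic data: 90 apparent detections at 600 sites

best = None
for pi in range(1, 100):
    for di in range(1, 100):
        p, d = pi / 100, di / 100
        ll = binom_loglik(k1, n1, p) + binom_loglik(k2, n2, p * d)
        if best is None or ll > best[0]:
            best = (ll, p, d)

print(best[1], best[2])  # → 0.3 0.5
```

With these counts the joint maximum sits at p = 0.30 (driven by the survey) and d = 0.50 (so that p·d matches the opportunistic rate of 0.15), showing how the weaker source is absorbed without biasing the occupancy estimate.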

  5. Resolution of USQ regarding source term in the 232-Z waste incinerator building

    SciTech Connect

    Westsik, G.

    1995-12-31

The 232-Z waste incinerator at the Hanford plutonium finishing facility was used to incinerate plutonium-bearing combustible materials generated during normal plant operations. Nondestructive analysis performed after the incinerator ceased operations indicated high plutonium loading in exhaust ductwork near the incinerator glove box, while the incinerator was found to have only low quantities. Measurements following a campaign to remove some of the ductwork resulted in a markedly higher assay value for the incinerator glove box itself. Subsequent assays confirmed the most recent results and pointed to a potential further underestimation of the holdup, in part due to attenuation by fire brick, which could not be seen but was thought to be present. Resolution of the raised concerns entailed forming a task team to perform further assay based on gamma and neutron NDA methods. This paper is a discussion of the unreviewed safety question regarding the source term in this area.

  6. Microbial characterization for the Source-Term Waste Test Program (STTP) at Los Alamos

    SciTech Connect

    Leonard, P.A.; Strietelmeier, B.A.; Pansoy-Hjelvik, M.E.; Villarreal, R.

    1999-04-01

The effects of microbial activity on the performance of the proposed underground nuclear waste repository, the Waste Isolation Pilot Plant (WIPP) at Carlsbad, New Mexico, are being studied at Los Alamos National Laboratory (LANL) as part of an ex situ large-scale experiment. Actual actinide-containing waste is being used to predict the effect of potential brine inundation in the repository in the distant future. The study conditions are meant to simulate what might exist should the underground repository be flooded hundreds of years after closure as a result of inadvertent drilling into brine pockets below the repository. The Department of Energy (DOE) selected LANL to conduct the Actinide Source-Term Waste Test Program (STTP) to confirm the predictive capability of computer models being developed at Sandia National Laboratories.

  7. The Annular Core Research Reactor (ACRR) postulated limiting event initial and building source terms

    SciTech Connect

    Restrepo, L F

    1992-08-01

As part of the update of the Safety Analysis Report (SAR) for the Annular Core Research Reactor (ACRR), operational limiting events under the category of inadvertent withdrawal of an experiment while at power or during a power pulse were determined to be the most limiting event(s) for this reactor. This report provides a summary of the assumptions, modeling, and results in the evaluation of: reactivity and thermal-hydraulics analysis to determine the amount of fuel melt or fuel damage ratios; the reactor inventories following the limiting event; a literature review of post-NUREG-0772 release fraction experimental results on severe fuel damage; decontamination factors due to in-pool transport; and in-building transport modeling and building source term analysis.

  8. ACT: a program for calculation of the changes in radiological source terms with time

    SciTech Connect

    Woolfolk, S.W.

    1985-08-12

The program ACT calculates the source term activity from a set of initial activities as a function of discrete time steps. This calculation accounts for ingrowth of daughter products. ACT also calculates "Probable Release", which is the activity at a given time multiplied by both the fraction released and the probability of the release. The "Probable Release" assumes not only that the fraction released is a single step function of time, but also that the probability of release is zero for a limited period and can thereafter be described by the "Wisconsin Regression" function using time as the independent variable. Finally, the program calculates the time-integrated sum of the "Probable Release" for each isotope. This program is intended to support analysis of releases from radioactive waste disposal sites such as those required by 40 CFR 191.
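The kind of bookkeeping ACT performs can be sketched for a single two-member decay chain: the Bateman solution gives parent and daughter activities at any time, and a "probable release" is the activity weighted by a release fraction and a release probability. The decay constants and release parameters below are illustrative placeholders, not values from the report.

```python
import math

def chain_activities(a_parent0, lam_p, lam_d, t):
    """Parent and daughter activities at time t, starting from a pure
    parent at t = 0 (Bateman solution, daughter initially absent)."""
    a_p = a_parent0 * math.exp(-lam_p * t)
    # Daughter activity grown in from parent decay:
    # A_d(t) = A_p(0) * lam_d/(lam_d - lam_p) * (e^(-lam_p t) - e^(-lam_d t))
    a_d = a_parent0 * lam_d / (lam_d - lam_p) * (
        math.exp(-lam_p * t) - math.exp(-lam_d * t))
    return a_p, a_d

def probable_release(activity, frac_released, p_release):
    """Activity weighted by the release fraction and release probability."""
    return activity * frac_released * p_release

lam_p = math.log(2) / 30.0   # hypothetical parent half-life: 30 y
lam_d = math.log(2) / 5.0    # hypothetical daughter half-life: 5 y
a_p, a_d = chain_activities(1000.0, lam_p, lam_d, t=10.0)
print(a_p, a_d, probable_release(a_p, 1e-4, 0.01))
```

At t = 10 y the parent has decayed to about 794 (of 1000) activity units while roughly 652 units of daughter activity have grown in; time-stepping this over a grid and summing gives the time-integrated probable release the abstract describes.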

  9. Update to the NARAC NNPP Non-Reactor Source Term Products

    SciTech Connect

    Vogt, P

    2009-06-29

Recent updates to NARAC plots for NNPP require a modification to your iClient database. The steps you need to take are described below. Implementation of the non-reactor source terms in February 2009 included four plots, the traditional three instantaneous plots and a new Gamma Dose Rate plot: (1) Particulate Air Concentration, (2) Total Ground Deposition, (3) Whole Body Inhalation Dose Rate (CEDE Rate), and (4) Gamma Dose Rate. These plots were all initially implemented as instantaneous output, generated 30 minutes after the release time. Recently, Bettis and NAVSEA have requested that the Whole Body CEDE Rate plot be changed to an integrated dose valid at two hours. This is consistent with the change converting the Thyroid Dose Rate plot to a 2-hour Integrated Thyroid Dose for the Reactor and Criticality accidents.

  10. Identification of an unknown source term in a vibrating cantilevered beam from final overdetermination

    NASA Astrophysics Data System (ADS)

    Hasanov, Alemdar

    2009-11-01

Inverse problems of determining the unknown source term F(x, t) in the cantilevered beam equation u_tt = (EI(x)u_xx)_xx + F(x, t) from the measured data μ(x) := u(x, T) or ν(x) := u_t(x, T) at the final time t = T are considered. Within the weak solution approach, explicit formulae for the Fréchet gradients of the cost functionals J_1(F) = ||u(x, T; F) - μ(x)||_0^2 and J_2(F) = ||u_t(x, T; F) - ν(x)||_0^2 are derived via the solutions of the corresponding adjoint (backward beam) problems. The Lipschitz continuity of the gradients is proved. Based on these results, a gradient-type monotone iteration process is constructed. Uniqueness and ill-conditionedness of the considered inverse problems are analyzed.
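The reconstruction loop in the abstract reduces to a generic skeleton: iterate F_(k+1) = F_k - α J'(F_k) on a quadratic cost. The sketch below uses a plain matrix A as a stand-in for the linear map from source term to final-time data, so the adjoint is simply A^T; in the paper the gradient instead comes from solving a backward beam problem. The matrix, data, and step size are all invented for illustration.

```python
import numpy as np

# Gradient-type iteration for J(F) = ||A F - mu||^2, the generic structure
# of the monotone iteration described in the abstract.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5)) / 5.0      # stand-in forward map
f_true = np.array([1.0, -0.5, 0.3, 0.0, 2.0])
mu = A @ f_true                             # synthetic final-time data

f = np.zeros(5)
# Step size chosen for monotone decrease: alpha <= 1/L with L = 2*sigma_max^2.
alpha = 0.5 / np.linalg.norm(A, 2) ** 2
for _ in range(2000):
    grad = 2.0 * A.T @ (A @ f - mu)         # gradient via the adjoint A^T
    f = f - alpha * grad

print(np.round(f, 3))                       # converges to f_true
```

With noise-free data and a well-conditioned A the iterate recovers the true source; the ill-conditionedness the paper analyzes shows up when A has small singular values, in which case the iteration must be stopped early as a regularizer.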

  11. Optimal long-term design, rehabilitation and upgrading of water distribution networks

    NASA Astrophysics Data System (ADS)

    Tanyimboh, Tiku; Kalungi, Paul

    2008-07-01

    Given a limited budget, the choice of the best water distribution network upgrading strategy is a complex optimization problem. A model for the optimal long-term design and upgrading of new and existing water distribution networks is presented. A key strength of the methodology is the use of maximum entropy flows, which reduces the size of the problem and enables the application of linear programming for pipe size optimization. It also ensures the reliability level is high. The capital and maintenance costs and hydraulic performance are considered simultaneously for a predefined design horizon. The timing of upgrading over the entire planning horizon is obtained by dynamic programming. The deterioration over time of the structural integrity and hydraulic capacity of every pipe are explicitly considered. The upgrading options considered include pipe paralleling and replacement. The effectiveness of the model is demonstrated using the water supply network of Wobulenzi town in Uganda.
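The timing decision at the heart of the model above can be shown in a toy form: one pipe whose maintenance cost rises as it deteriorates, an upgrade that resets it to a cheap flat cost, and a discount rate. Searching all upgrade years in the horizon is the one-decision case of the dynamic program. All costs and deterioration rates below are invented for illustration.

```python
HORIZON = 20        # years in the planning horizon
RATE = 0.05         # annual discount rate
UPGRADE_COST = 20.0

def maintenance(year, upgrade_year):
    """Old pipe deteriorates over time; an upgraded pipe is cheap and flat."""
    if upgrade_year is not None and year >= upgrade_year:
        return 2.0
    return 1.0 + 0.5 * year

def total_cost(upgrade_year):
    """Present value of maintenance plus (optionally) one upgrade."""
    cost = 0.0
    for y in range(HORIZON):
        cost += maintenance(y, upgrade_year) / (1 + RATE) ** y
        if upgrade_year == y:
            cost += UPGRADE_COST / (1 + RATE) ** y
    return cost

options = [None] + list(range(HORIZON))     # None = never upgrade
best = min(options, key=total_cost)
print(best)                                 # → 4
```

The optimum is interior (year 4 here): upgrading too early wastes capital while the old pipe is still cheap, and too late accumulates deterioration cost, which is exactly the trade-off the full model resolves network-wide with dynamic programming.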

  12. 76 FR 77223 - SourceGas Distribution LLC; Notice of Petition for Rate Approval and Revised Statement of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-12

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission SourceGas Distribution LLC; Notice of Petition for Rate Approval and Revised Statement of Operating Conditions Take notice that on December 1, 2011, SourceGas Distribution LLC...

  13. A simple method for estimating potential source term bypass fractions from confinement structures

    SciTech Connect

    Kalinich, D.A.; Paddleford, D.F.

    1997-07-01

Confinement structures house many of the operating processes at the Savannah River Site (SRS). Under normal operating conditions, a confinement structure in conjunction with its associated ventilation systems prevents the release of radiological material to the environment. However, under potential accident conditions, the performance of the ventilation systems and the integrity of the structure may be challenged. In order to calculate the radiological consequences associated with a potential accident (e.g. fires, explosions, spills, etc.), it is necessary to determine the fraction of the source term initially generated by the accident that escapes from the confinement structure to the environment. While it would be desirable to estimate the potential bypass fraction using sophisticated control-volume/flow path computer codes (e.g. CONTAIN, MELCOR, etc.) in order to take as much credit as possible for the mitigative effects of the confinement structure, there are many instances where using such codes is not tractable due to limits on the level of effort allotted to perform the analysis. Moreover, the current review environment, with its emphasis on deterministic/bounding rather than probabilistic/best-estimate analysis, discourages using analytical techniques that require the consideration of a large number of parameters. Discussed herein is a simplified control-volume/flow path approach for calculating the source term bypass fraction that is amenable to solution in a spreadsheet or with a commercial mathematical solver (e.g. MathCad or Mathematica). It considers the effects of wind and fire pressure gradients on the structure, ventilation system operation, and Halon discharges. Simple models are used to characterize the engineered and non-engineered flow paths. By making judicious choices for the limited set of problem parameters, the results from this approach can be defended as bounding and conservative.

  14. Geochemistry of dissolved trace elements and heavy metals in the Dan River Drainage (China): distribution, sources, and water quality assessment.

    PubMed

    Meng, Qingpeng; Zhang, Jing; Zhang, Zhaoyu; Wu, Tairan

    2016-04-01

Dissolved trace elements and heavy metals in the Dan River drainage basin, which is the drinking water source area for the South-to-North Water Transfer Project (China), affect large numbers of people and should therefore be carefully monitored. To investigate the distribution, sources, and quality of river water, this study, integrating catchment geology and multivariate statistical techniques, was carried out on 99 river water samples collected across the Dan River drainage in 2013. The distribution of trace metal concentrations in the Dan River drainage was similar to that in the Danjiangkou Reservoir, indicating that the reservoir is significantly affected by the Dan River drainage. Moreover, our results suggested that As, Sb, Cd, Mn, and Ni were the major pollutants. We revealed extremely high concentrations of As and Sb in the Laoguan River, Cd in the Qingyou River, Mn, Ni, and Cd in the Yinhua River, As and Sb in the Laojun River, and Sb in the Dan River. According to the water quality index, water in the Dan River drainage was suitable for drinking; however, an exposure risk assessment model suggests that As and Sb in the Laojun and Laoguan rivers could pose a high risk to humans in terms of adverse health and potential non-carcinogenic effects.

  15. The aerosol at Barrow, Alaska: long-term trends and source locations

    NASA Astrophysics Data System (ADS)

    Polissar, A. V.; Hopke, P. K.; Paatero, P.; Kaufmann, Y. J.; Hall, D. K.; Bodhaine, B. A.; Dutton, E. G.; Harris, J. M.

Aerosol data consisting of condensation nuclei (CN) counts, black carbon (BC) mass, aerosol light scattering (SC), and aerosol optical depth (AOD) measured at Barrow, Alaska from 1977 to 1994 have been analyzed by three-way positive matrix factorization (PMF3) by pooling all of the different data into one large three-way array. The PMF3 analysis identified four factors that indicate four different combinations of aerosol sources active throughout the year in Alaska. Two of the factors (F1, F2) represent Arctic haze. The first Arctic haze factor F1 is dominant in January-February while the second factor F2 is dominant in March-April. They appear to be material that is generally ascribed to long-range transported anthropogenic particles. A lower ratio of condensation nuclei to scattering coefficient loadings is obtained for F2, indicating larger particles. Factor F3 is related to condensation nuclei. It has an annual cycle with two maxima, March and July-August, indicating some involvement of marine biogenic sources. The fourth factor F4 represents the contribution to the stratospheric aerosol from the eruptions of El Chichon and Mt. Pinatubo. No significant long-term trend for F1 was detected, while F2 shows a negative trend over the period from 1982 to 1994 but not over the whole measurement period. A positive trend of F3 over the whole period has been observed. This trend may be related to increased biogenic sulfur production caused by reductions in the sea-ice cover in the Arctic and/or an air temperature increase in the vicinity of Barrow. Potential source contribution function (PSCF) analysis showed that in winter and spring during 1989 to 1993, regions in Eurasia and North America were the sources of particles measured at Barrow. In contrast, large areas in the North Pacific Ocean and the Arctic Ocean contributed to the observed high concentrations of CN in the summer season. Three-way positive matrix factorization was an effective method to extract

  16. A simplified radionuclide source term for total-system performance assessment; Yucca Mountain Site Characterization Project

    SciTech Connect

    Wilson, M.L.

    1991-11-01

A parametric model for releases of radionuclides from spent-nuclear-fuel containers in a waste repository is presented. The model is appropriate for use in preliminary total-system performance assessments of the potential repository site at Yucca Mountain, Nevada; for this reason it is simpler than the models used for detailed studies of waste-package performance. Terms are included for releases from the spent fuel pellets, from the pellet/cladding gap and the grain boundaries within the fuel pellets, from the cladding of the fuel rods, and from the radioactive fuel-assembly parts. Multiple barriers are considered, including the waste container, the fuel-rod cladding, the thermal "dry-out", and the waste form itself. The basic formulas for release from a single fuel rod or container are extended to formulas for expected releases for the whole repository by using analytic expressions for probability distributions of some important parameters. 39 refs., 4 figs., 4 tabs.
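The parametric structure described above can be sketched for a single container and nuclide: the inventory splits into a quick-release gap/grain-boundary fraction and a slow matrix-alteration fraction, both gated by the barrier states (container and cladding failure). The functional form and every parameter value below are hypothetical placeholders, not the report's actual formulas.

```python
def expected_release(inventory, f_gap, p_container_failed, p_clad_failed,
                     matrix_alteration_frac):
    """Expected release from one container at a given time.

    f_gap: fraction of inventory in the pellet/cladding gap and grain
           boundaries (released quickly once both barriers fail).
    matrix_alteration_frac: fraction of the remaining pellet inventory
           mobilized by slow matrix alteration.
    The two failure probabilities gate the release through the barriers.
    """
    gap = inventory * f_gap
    matrix = inventory * (1 - f_gap) * matrix_alteration_frac
    return p_container_failed * p_clad_failed * (gap + matrix)

# Hypothetical numbers: 2% gap inventory, 10% of containers failed,
# 50% of rods with failed cladding, 0.1% matrix alteration.
r = expected_release(inventory=1.0, f_gap=0.02,
                     p_container_failed=0.1, p_clad_failed=0.5,
                     matrix_alteration_frac=1e-3)
print(r)  # → 0.001049
```

Summing this expression over nuclides and integrating the failure probabilities over their assumed distributions gives the repository-wide expected release, which is the extension the abstract refers to.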

  17. Spatial distributions of Southern Ocean mesozooplankton communities have been resilient to long-term surface warming.

    PubMed

    Tarling, Geraint A; Ward, Peter; Thorpe, Sally E

    2017-08-29

    The biogeographic response of oceanic planktonic communities to climatic change has a large influence on the future stability of marine food webs and the functioning of global biogeochemical cycles. Temperature plays a pivotal role in determining the distribution of these communities and ocean warming has the potential to cause major distributional shifts, particularly in polar regions where the thermal envelope is narrow. We considered the impact of long-term ocean warming on the spatial distribution of Southern Ocean mesozooplankton communities through examining plankton abundance in relation to sea surface temperature between two distinct periods, separated by around 60 years. Analyses considered 16 dominant mesozooplankton taxa (in terms of biomass and abundance) in the southwest Atlantic sector of the Southern Ocean, from net samples and in situ temperature records collected during the Discovery Investigations (1926-1938) and contemporary campaigns (1996-2013). Sea surface temperature was found to have increased significantly by 0.74°C between the two eras. The corresponding sea surface temperature at which community abundance peaked was also significantly higher in contemporary times, by 0.98°C. Spatial projections indicated that the geographical location of community peak abundance had remained the same between the two eras despite the poleward advance of sea surface isotherms. If the community had remained within the same thermal envelope as in the 1920s-1930s, community peak abundance would be 500 km further south in the contemporary era. Studies in the northern hemisphere have found that dominant taxa, such as calanoid copepods, have conserved their thermal niches and tracked surface isotherms polewards. The fact that this has not occurred in the Southern Ocean suggests that other selective pressures, particularly food availability and the properties of underlying water masses, place greater constraints on spatial distributions in this region. It

  18. Improvement of capabilities of the Distributed Electrochemistry Modeling Tool for investigating SOFC long term performance

    SciTech Connect

    Gonzalez Galdamez, Rinaldo A.; Recknagle, Kurtis P.

    2012-04-30

This report provides an overview of the work performed on Solid Oxide Fuel Cell (SOFC) modeling during the 2012 Winter/Spring Science Undergraduate Laboratory Internship at Pacific Northwest National Laboratory (PNNL). A brief introduction to the concept, operating basics, and applications of fuel cells is given for the general audience. Further details are given regarding the modifications and improvements of the Distributed Electrochemistry (DEC) modeling tool developed by PNNL engineers to model SOFC long-term performance. As part of this analysis, a literature review of anode degradation mechanisms is presented, and future plans for implementing these mechanisms in the DEC modeling tool are proposed.

  19. Spatial distribution and source identification of wet deposition at remote EANET sites in Japan

    NASA Astrophysics Data System (ADS)

    Seto, Sinya; Sato, Manabu; Tatano, Tsutomu; Kusakari, Takashi; Hara, Hiroshi

Wet deposition of major ions was discussed from the viewpoint of its potential sources for six remote EANET sites in Japan (Rishiri, Happo, Oki, Ogasawara, Yusuhara, and Hedo) having sufficiently high data completeness during 2000-2004. The annual deposition for each site ranged from 12.1 to 46.6 meq m^-2 yr^-1 for nss-SO4^2- and from 5.0 to 21.9 meq m^-2 yr^-1 for NO3^-. The ranges of annual deposition of the two ions for these sites were lower than those for urban and rural sites in the Japanese Acid Deposition Survey by the Ministry of the Environment, Japan, and higher than those for global remote marine sites. Factor analysis was performed on log-transformed daily wet deposition of major ions for each site. The two factors obtained were interpreted as (1) an acid and soil source (or an acid source for some sites), and (2) a sea-salt source, for all the sites. This indicates that wet deposition of ions over the remote areas in Japan has a similar structure in terms of types of sources. Factor scores of the acid and soil source were relatively high during Kosa (Asian dust) events in spring in western Japan. Back-trajectories for high-deposition episodes of the acid and soil source (or acid source) for the remote sites showed that episodic air masses frequently came from the northeastern area of the Asian continent in spring and winter, and from central China in summer and autumn. This indicates a large contribution of continental emissions to wet deposition of ions over the remote areas in Japan.