Science.gov

Sample records for distributed source term

  1. Energy decay for viscoelastic plates with distributed delay and source term

    NASA Astrophysics Data System (ADS)

    Mustafa, Muhammad I.; Kafini, Mohammad

    2016-06-01

    In this paper we consider a viscoelastic plate equation with distributed delay and source term. Under suitable conditions on the delay and source term, we establish an explicit and general decay rate result without imposing restrictive assumptions on the behavior of the relaxation function at infinity. Our result allows a wider class of relaxation functions and improves earlier results in the literature.

  2. DUSTMS-D: DISPOSAL UNIT SOURCE TERM - MULTIPLE SPECIES - DISTRIBUTED FAILURE DATA INPUT GUIDE.

    SciTech Connect

    SULLIVAN, T.M.

    2006-01-01

    Performance assessment of a low-level waste (LLW) disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). Many of these physical processes are influenced by the design of the disposal facility (e.g., how the engineered barriers control infiltration of water). The complexity of the problem and the absence of appropriate data prevent development of an entirely mechanistic representation of radionuclide release from a disposal facility. Typically, a number of assumptions, based on knowledge of the disposal system, are used to simplify the problem. This has been done and the resulting models have been incorporated into the computer code DUST-MS (Disposal Unit Source Term-Multiple Species). The DUST-MS computer code is designed to model water flow, container degradation, release of contaminants from the wasteform to the contacting solution and transport through the subsurface media. Water flow through the facility over time is modeled using tabular input. Container degradation models include three types of failure rates: (a) instantaneous (all containers in a control volume fail at once), (b) uniformly distributed failures (containers fail at a linear rate between a specified starting and ending time), and (c) gaussian failure rates (containers fail at a rate determined by a mean failure time, standard deviation and gaussian distribution). Wasteform release models include four release mechanisms: (a) rinse with partitioning (inventory is released instantly upon container failure subject to equilibrium partitioning (sorption) with
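
    The three container failure models above lend themselves to a short numerical illustration. The sketch below is not DUST-MS code; the function name, parameterization, and example times are invented, and it only computes the cumulative fraction of containers in a control volume that have failed by time t under each of the three models.

      # Hedged sketch (not DUST-MS): cumulative failed fraction under the three models.
      import math

      def failed_fraction(t, model, t_fail=None, t_start=None, t_end=None,
                          t_mean=None, sigma=None):
          """Fraction of containers failed by time t (hypothetical parameterization)."""
          if model == "instantaneous":      # all containers fail at t_fail
              return 1.0 if t >= t_fail else 0.0
          if model == "uniform":            # linear failure rate between t_start and t_end
              if t <= t_start:
                  return 0.0
              if t >= t_end:
                  return 1.0
              return (t - t_start) / (t_end - t_start)
          if model == "gaussian":           # normal CDF with mean t_mean, std sigma
              return 0.5 * (1.0 + math.erf((t - t_mean) / (sigma * math.sqrt(2.0))))
          raise ValueError("unknown failure model")

      # Example: fraction failed 300 years after closure under each model (made-up times)
      print(failed_fraction(300.0, "instantaneous", t_fail=500.0))
      print(failed_fraction(300.0, "uniform", t_start=100.0, t_end=600.0))
      print(failed_fraction(300.0, "gaussian", t_mean=400.0, sigma=100.0))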

  3. Spatial distribution of HTO activity in unsaturated soil depth in the vicinity of long-term release source

    SciTech Connect

    Golubev, A.; Golubeva, V.; Mavrin, S.

    2015-03-15

    Previous studies reported a correlation between the HTO activity distribution in the unsaturated soil layer and long-term atmospheric releases of HTO in the vicinity of the Savannah River Site. The Tritium Working Group of the BIOMASS Programme has performed a model-model intercomparison study of HTO transport from the atmosphere to unsaturated soil and has evaluated the HTO activity distribution in the unsaturated soil layer in the vicinity of permanent atmospheric sources. The Tritium Working Group also reported such a correlation; however, it concluded that experimental data sets are needed to confirm this conclusion and to validate the appropriate computer models. (authors)

  4. Chernobyl source term estimation

    SciTech Connect

    Gudiksen, P.H.; Harvey, T.F.; Lange, R.

    1990-09-01

    The Chernobyl source term available for long-range transport was estimated by integration of radiological measurements with atmospheric dispersion modeling and by reactor core radionuclide inventory estimation in conjunction with WASH-1400 release fractions associated with specific chemical groups. The model simulations revealed that the radioactive cloud became segmented during the first day, with the lower section heading toward Scandinavia and the upper part heading in a southeasterly direction with subsequent transport across Asia to Japan, the North Pacific, and the west coast of North America. By optimizing the agreement between the observed cloud arrival times and duration of peak concentrations measured over Europe, Japan, Kuwait, and the US with the model-predicted concentrations, it was possible to derive source term estimates for those radionuclides measured in airborne radioactivity. This was extended to radionuclides that were largely unmeasured in the environment by performing a reactor core radionuclide inventory analysis to obtain release fractions for the various chemical transport groups. These analyses indicated that essentially all of the noble gases, 60% of the radioiodines, 40% of the radiocesium, 10% of the tellurium and about 1% or less of the more refractory elements were released. These estimates are in excellent agreement with those obtained on the basis of worldwide deposition measurements. The Chernobyl source term was several orders of magnitude greater than those associated with the Windscale and TMI reactor accidents. However, the ¹³⁷Cs from the Chernobyl event is about 6% of that released by the US and USSR atmospheric nuclear weapon tests, while the ¹³¹I and ⁹⁰Sr released by the Chernobyl accident was only about 0.1% of that released by the weapon tests. 13 refs., 2 figs., 7 tabs.

  5. Approximate factorization with source terms

    NASA Technical Reports Server (NTRS)

    Shih, T. I.-P.; Chyu, W. J.

    1991-01-01

    A comparative evaluation is made of three methodologies to determine which offers the smallest approximate factorization error. While two of these methods are found to lead to more efficient algorithms in cases where the factors that do not contain source terms can be diagonalized, the third method generates the lowest approximate factorization error. This method may be preferred when the norms of the source terms are large and transient solutions are of interest.

  6. Long-term measurements of particle number size distributions and the relationships with air mass history and source apportionment in the summer of Beijing

    NASA Astrophysics Data System (ADS)

    Wang, Z. B.; Hu, M.; Wu, Z. J.; Yue, D. L.; He, L. Y.; Huang, X. F.; Liu, X. G.; Wiedensohler, A.

    2013-10-01

    A series of long-term and temporary measurements were conducted to study the improvement of air quality in Beijing during the Olympic Games period (8-24 August 2008). To evaluate the actions taken to improve the air quality, comparisons of particle number and volume size distributions of August 2008 and 2004-2007 were performed. The total particle number and volume concentrations were 14 000 cm-3 and 37 μm3 cm-3 in August 2008, respectively. These were reductions of 41% and 35% compared with the mean values of August 2004-2007. A cluster analysis of air mass history and a source apportionment were performed to explore the reasons for the reduction of particle concentrations. Back trajectories were classified into five major clusters. Air masses from the south are always associated with pollution events during the summertime in Beijing. In August 2008, the frequency of air masses arriving from the south was 1.3 times higher compared to the average of the previous years, which however did not result in elevated particle volume concentrations in Beijing. Therefore, the reduced particle number and volume concentrations during the 2008 Beijing Olympic Games cannot be explained by meteorological conditions alone. Four factors influencing particle concentrations were found using a positive matrix factorization (PMF) model. They were identified as local and remote traffic emissions, combustion sources, and secondary transformation. The reductions of the four sources were calculated to be 47%, 44%, 43% and 30%, respectively. The significant reductions of particle number and volume concentrations may be attributed to the actions taken, which focused on primary emissions, especially those related to the traffic and combustion sources.
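
    As a rough illustration of the source apportionment step, the sketch below factorizes a synthetic time-by-size-bin matrix of particle number concentrations with plain non-negative matrix factorization (scikit-learn's NMF). This is only an analogue of the positive matrix factorization (PMF) model used in the study, which additionally weights the data by measurement uncertainties; all dimensions and data here are invented.

      # Illustrative analogue only: scikit-learn NMF stands in for the PMF model
      # on a synthetic time-by-size-bin matrix of particle number concentrations.
      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(0)
      n_times, n_bins, n_factors = 200, 40, 4          # hypothetical dimensions

      # Synthetic data: 4 "source" profiles mixed with random time-varying strengths
      profiles = rng.random((n_factors, n_bins))
      strengths = rng.random((n_times, n_factors))
      X = strengths @ profiles + 0.01 * rng.random((n_times, n_bins))

      model = NMF(n_components=n_factors, init="nndsvda", max_iter=500, random_state=0)
      G = model.fit_transform(X)        # factor contributions over time
      F = model.components_             # factor profiles over size bins

      # Relative contribution of each factor to the total number concentration
      contrib = (G * F.sum(axis=1)).sum(axis=0)
      print(contrib / contrib.sum())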

  7. Long-term measurements of particle number size distributions and the relationships with air mass history and source apportionment in the summer of Beijing

    NASA Astrophysics Data System (ADS)

    Wang, Z. B.; Hu, M.; Wu, Z. J.; Yue, D. L.; He, L. Y.; Huang, X. F.; Liu, X. G.; Wiedensohler, A.

    2013-02-01

    A series of long-term and temporary measurements were conducted to study the improvement of air quality in Beijing during the Olympic Games period (8-24 August 2008). To evaluate the actions taken to improve the air quality, comparisons of particle number and volume size distributions of August 2008 and 2004-2007 were performed. The total particle number and volume concentrations were 14 000 cm-3 and 37 μm3 cm-3 in August 2008, respectively. These were reductions of 41% and 35% compared with the mean values of August 2004-2007. A cluster analysis of air mass history and a source apportionment were performed to explore the reasons for the reduction of particle concentrations. Back trajectories were classified into five major clusters. Air masses from the south are always associated with pollution events during the summertime in Beijing. In August 2008, the frequency of air masses arriving from the south was twice as high as the average of the previous years; these southerly air masses did, however, not result in elevated particle volume concentrations in Beijing. This result implied that the air mass history was not the key factor explaining the reduced particle number and volume concentrations during the Beijing 2008 Olympic Games. Four factors influencing particle concentrations were found using a positive matrix factorization (PMF) model. They were identified as local and remote traffic emissions, combustion sources, and secondary transformation. The reductions of the four sources were calculated to be 47%, 44%, 43% and 30%, respectively. The significant reductions of particle number and volume concentrations may be attributed to the actions taken, which focused on primary emissions, especially those related to the traffic and combustion sources.

  8. Infrared image processing devoted to thermal non-contact characterization-Applications to Non-Destructive Evaluation, Microfluidics and 2D source term distribution for multispectral tomography

    NASA Astrophysics Data System (ADS)

    Batsale, Jean-Christophe; Pradere, Christophe

    2015-11-01

    The cost of IR cameras is steadily decreasing. Beyond the preliminary calibration step and the overall instrumentation, infrared image processing is therefore one of the key steps for applications in very broad domains. Generally the IR images come from the transient temperature field related to the emission of a black surface in response to an external or internal heating (active IR thermography). The first applications were devoted to the so-called thermal Non-Destructive Evaluation methods by considering a thin sample and 1D transient heat diffusion through the sample (transverse diffusion). With simplified assumptions related to the transverse diffusion, the in-plane diffusion and transport phenomena can also be considered. A general equation can be applied in order to balance the heat transfer at the pixel scale or between groups of pixels in order to estimate several fields of thermophysical properties (heterogeneous fields of in-plane diffusivity, flow distributions, source terms). There are many possible strategies for processing the large amount of space- and time-distributed data (prior integral transformation of the images, compression, elimination of non-useful areas...), generally based on the need to analyse the space and time derivatives of the temperature field. Several illustrative examples related to the Non-Destructive Evaluation of heterogeneous solids, the thermal characterization of chemical reactions in microfluidic channels and the design of systems for multispectral tomography will be presented.
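
    As one hedged illustration of the pixel-scale balance idea, for purely in-plane diffusion the temperature field obeys dT/dt = a (d²T/dx² + d²T/dy²), so an apparent diffusivity can be estimated from finite differences of the image sequence. The sketch below generates a synthetic 1D example and recovers the diffusivity per pixel by least squares over time; it is not the authors' processing chain, and all parameter values are assumed.

      # Hedged illustration: estimate an apparent in-plane diffusivity per pixel
      # from dT/dt and the spatial Laplacian of a synthetic image sequence.
      import numpy as np

      a_true, dx, dt = 1.0e-6, 1.0e-3, 0.05    # diffusivity [m^2/s], pixel pitch [m], frame time [s]
      nx, nt = 64, 200

      # Synthetic 1D in-plane example: Gaussian hot spot relaxing by diffusion
      x = (np.arange(nx) - nx / 2) * dx
      T = np.zeros((nt, nx))
      T[0] = np.exp(-(x / (5 * dx)) ** 2)
      for k in range(nt - 1):
          lap = (np.roll(T[k], 1) - 2 * T[k] + np.roll(T[k], -1)) / dx**2
          T[k + 1] = T[k] + a_true * dt * lap          # explicit diffusion step (stable here)

      # Least-squares fit of dT/dt = a * Laplacian(T), pixel by pixel
      dTdt = (T[1:] - T[:-1]) / dt
      lap = (np.roll(T[:-1], 1, axis=1) - 2 * T[:-1] + np.roll(T[:-1], -1, axis=1)) / dx**2
      num = np.sum(dTdt * lap, axis=0)
      den = np.sum(lap * lap, axis=0)
      a_est = num / den
      print("median estimated diffusivity [m^2/s]:", np.median(a_est))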

  9. Long-Term Distribution and Transport of Nitrate and Ammonium Within a Groundwater Sewage Plume, Cape Cod, MA, After Removal of the Contaminant Source.

    NASA Astrophysics Data System (ADS)

    Repert, D. A.; Smith, R. L.

    2002-12-01

    Disposal of treated sewage for 60 yrs. onto infiltration beds at a site on Cape Cod, MA produced a groundwater contaminant plume >6 km long. The plume was characterized by an anoxic ammonium-containing core, surrounded by an oxic-suboxic outer zone within the sand and gravel aquifer. In Dec. 1995 the sewage treatment facility ceased operation. A long-term study to characterize the distribution of sewage plume constituents was conducted along a 500 m-long transect (source to 3 yrs. groundwater travel distance). Prior to sewage-disposal cessation, total inorganic N within 30 m vertical profiles decreased from 6.6 moles N/m2 (92% NO3-, 8% NH4+) at the point of discharge to 3.3 moles N/m2 (77% NO3-, 23% NH4+) at the furthest point along the transect. Post-cessation nitrate concentrations increased within the first 6 mo. and then gradually decreased. The nitrate decrease was accompanied by an initial nitrite increase, an indication that denitrification was reducing nitrate after the oxygenated sewage discharge was discontinued. There was also an apparent increase in ammonium concentration in the first 6 mo. after cessation. Previous laboratory experiments on pre-cessation cores showed that nitrification was important in converting sorbed ammonium to nitrate under the sewage beds. However, with the removal of the oxygenated sewage source, nitrification ceased, allowing ammonium to initially increase. This increase was correlated with dissolved organic carbon concentrations within the groundwater. Ammonium concentrations decreased dramatically after a year, but subsequently increased in the core of the plume to pre-cessation levels through mineralization of organic N. Recent laboratory core experiments and extractions show that there is a large pool of sorbed organic carbon, although dissolved organic carbon concentrations have been consistently less than 3 mg/L for 6 yrs. Seven yrs. after cessation of the sewage disposal, there is still a significant amount (0.6 moles N

  10. Phase 1 immobilized low-activity waste operational source term

    SciTech Connect

    Burbank, D.A.

    1998-03-06

    This report presents an engineering analysis of the Phase 1 privatization feeds to establish an operational source term for storage and disposal of immobilized low-activity waste packages at the Hanford Site. The source term information is needed to establish a preliminary estimate of the numbers of remote-handled and contact-handled waste packages. A discussion of the uncertainties and their impact on the source term and waste package distribution is also presented. It should be noted that this study is concerned with operational impacts only. Source terms used for accident scenarios would differ due to alpha and beta radiation which were not significant in this study.

  11. HTGR Mechanistic Source Terms White Paper

    SciTech Connect

    Wayne Moe

    2010-07-01

    The primary purposes of this white paper are: (1) to describe the proposed approach for developing event-specific mechanistic source terms for HTGR design and licensing, (2) to describe the technology development programs required to validate the design methods used to predict these mechanistic source terms, and (3) to obtain agreement from the NRC that, subject to appropriate validation through the technology development program, the approach for developing event-specific mechanistic source terms is acceptable.

  12. Source term calculations for assessing radiation dose to equipment

    SciTech Connect

    Denning, R.S.; Freeman-Kelly, R.; Cybulskis, P.; Curtis, L.A.

    1989-07-01

    This study examines results of analyses performed with the Source Term Code Package to develop updated source terms using NUREG-0956 methods. The updated source terms are to be used to assess the adequacy of current regulatory source terms used as the basis for equipment qualification. Time-dependent locational distributions of radionuclides within a containment following a severe accident have been developed. The Surry reactor has been selected in this study as representative of PWR containment designs. Similarly, the Peach Bottom reactor has been used to examine radionuclide distributions in boiling water reactors. The time-dependent inventory of each key radionuclide is provided in terms of its activity in curies. The data are to be used by Sandia National Laboratories to perform shielding analyses to estimate radiation dose to equipment in each containment design. See NUREG/CR-5175, "Beta and Gamma Dose Calculations for PWR and BWR Containments." 6 refs., 11 tabs.

  13. Calculation of source terms for NUREG-1150

    SciTech Connect

    Breeding, R.J.; Williams, D.C.; Murfin, W.B.; Amos, C.N.; Helton, J.C.

    1987-10-01

    The source terms estimated for NUREG-1150 are generally based on the Source Term Code Package (STCP), but the actual source term calculations used in computing risk are performed by much smaller codes which are specific to each plant. This was done because the method of estimating the uncertainty in risk for NUREG-1150 requires hundreds of source term calculations for each accident sequence. This is clearly impossible with a large, detailed code like the STCP. The small plant-specific codes are based on simple algorithms and utilize adjustable parameters. The values of the parameters appearing in these codes are derived from the available STCP results. To determine the uncertainty in the estimation of the source terms, these parameters were varied as specified by an expert review group. This method was used to account for the uncertainties in the STCP results and the uncertainties in phenomena not considered by the STCP.

  14. SOURCE TERMS FOR HLW GLASS CANISTERS

    SciTech Connect

    J.S. Tang

    2000-08-15

    This calculation is prepared by the Monitored Geologic Repository (MGR) Waste Package Design Section. The objective of this calculation is to determine the source terms that include radionuclide inventory, decay heat, and radiation sources due to gamma rays and neutrons for the high-level radioactive waste (HLW) from the West Valley Demonstration Project (WVDP), Savannah River Site (SRS), Hanford Site (HS), and Idaho National Engineering and Environmental Laboratory (INEEL). This calculation also determines the source terms of the canister containing the SRS HLW glass and immobilized plutonium. The scope of this calculation is limited to source terms for a time period out to one million years. The results of this calculation may be used to carry out performance assessment of the potential repository and to evaluate radiation environments surrounding the waste packages (WPs). This calculation was performed in accordance with the Development Plan "Source Terms for HLW Glass Canisters" (Ref. 7.24).

  15. Mechanistic facility safety and source term analysis

    SciTech Connect

    PLYS, M.G.

    1999-06-09

    A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double contained receiver tanks, and process evaluation including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility rooms, flow path geometry, and heat conductors, user-defined non-ideal vapor and aerosol species, pressure- and density-driven gas flows, aerosol transport and deposition, and structure to accommodate facility-specific source terms. Example applications are presented here.

  16. Assessing sensitivity of source term estimation

    NASA Astrophysics Data System (ADS)

    Long, Kerrie J.; Haupt, Sue Ellen; Young, George S.

    2010-04-01

    Source term estimation algorithms compute unknown atmospheric transport and dispersion modeling variables from concentration observations made by sensors in the field. Insufficient spatial and temporal resolution in the meteorological data as well as inherent uncertainty in the wind field data make source term estimation and the prediction of subsequent transport and dispersion extremely difficult. This work addresses the question: how many sensors are necessary in order to successfully estimate the source term and meteorological variables required for atmospheric transport and dispersion modeling? The source term estimation system presented here uses a robust optimization technique - a genetic algorithm (GA) - to find the combination of source location, source height, source strength, surface wind direction, surface wind speed, and time of release that produces a concentration field that best matches the sensor observations. The approach is validated using the Gaussian puff as the dispersion model in identical twin numerical experiments. The limits of the system are tested by incorporating additive and multiplicative noise into the synthetic data. The minimum requirements for data quantity and quality are determined by an extensive grid sensitivity analysis. Finally, a metric is developed for quantifying the minimum number of sensors necessary to accurately estimate the source term and to obtain the relevant wind information.
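
    A minimal sketch of the idea follows; it is not the authors' system. A genetic algorithm searches a reduced set of parameters (source location, strength, and wind direction) so that a crude steady Gaussian-plume-like forward model, standing in for the Gaussian puff model used in the paper, matches synthetic sensor concentrations. The dispersion formula, parameter ranges, and GA settings are all invented for illustration.

      # Hedged sketch: GA-based source term estimation against synthetic sensor data.
      import numpy as np

      rng = np.random.default_rng(1)
      sensors = rng.uniform(200, 2000, size=(30, 2))            # sensor (x, y) positions [m]

      def plume(params, xy):
          """Very simplified steady plume: params = (x0, y0, q, wind_dir_deg)."""
          x0, y0, q, wd = params
          theta = np.radians(wd)
          dx, dy = xy[:, 0] - x0, xy[:, 1] - y0
          downwind = np.clip(dx * np.cos(theta) + dy * np.sin(theta), 1.0, None)
          crosswind = -dx * np.sin(theta) + dy * np.cos(theta)
          sigma = 0.1 * downwind                                 # crude dispersion growth
          return q / (4 * np.pi * sigma**2) * np.exp(-0.5 * (crosswind / sigma) ** 2)

      truth = np.array([500.0, 800.0, 50.0, 30.0])
      obs = plume(truth, sensors) * (1 + 0.05 * rng.standard_normal(len(sensors)))

      low = np.array([0.0, 0.0, 1.0, 0.0])                      # search bounds
      high = np.array([2000.0, 2000.0, 200.0, 360.0])

      def fitness(p):
          return -np.sum((plume(p, sensors) - obs) ** 2)         # negative misfit

      pop = rng.uniform(low, high, size=(60, 4))
      for gen in range(200):
          scores = np.array([fitness(p) for p in pop])
          parents = pop[np.argsort(scores)[-20:]]                # keep the best 20
          children = []
          for _ in range(40):
              a, b = parents[rng.integers(20, size=2)]
              child = np.where(rng.random(4) < 0.5, a, b)        # uniform crossover
              child = child + rng.normal(0, 0.02, 4) * (high - low)   # Gaussian mutation
              children.append(np.clip(child, low, high))
          pop = np.vstack([parents, children])

      best = pop[np.argmax([fitness(p) for p in pop])]
      print("estimated (x0, y0, q, wind_dir):", best)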

  17. Dose distributions in regions containing beta sources: Irregularly shaped source distributions in homogeneous media

    SciTech Connect

    Werner, B.L. )

    1991-11-01

    Methods are introduced by which dose rate distributions due to nonuniform, irregularly shaped distributions of beta emitters can be calculated using dose rate distributions for uniform, spherical source distributions. The dose rate distributions can be written in the MIRD formalism.

  18. Subsurface Shielding Source Term Specification Calculation

    SciTech Connect

    S.Su

    2001-04-12

    The purpose of this calculation is to establish appropriate and defensible waste-package radiation source terms for use in repository subsurface shielding design. This calculation supports the shielding design for the waste emplacement and retrieval system, and subsurface facility system. The objective is to identify the limiting waste package and specify its associated source terms including source strengths and energy spectra. Consistent with the Technical Work Plan for Subsurface Design Section FY 01 Work Activities (CRWMS M&O 2001, p. 15), the scope of work includes the following: (1) Review source terms generated by the Waste Package Department (WPD) for various waste forms and waste package types, and compile them for shielding-specific applications. (2) Determine acceptable waste package specific source terms for use in subsurface shielding design, using a reasonable and defensible methodology that is not unduly conservative. This calculation is associated with the engineering and design activity for the waste emplacement and retrieval system, and subsurface facility system. The technical work plan for this calculation is provided in CRWMS M&O 2001. Development and performance of this calculation conforms to the procedure, AP-3.12Q, Calculations.

  19. BWR Source Term Generation and Evaluation

    SciTech Connect

    J.C. Ryman

    2003-07-31

    This calculation is a revision of a previous calculation (Ref. 7.5) that bears the same title and has the document identifier BBAC00000-01717-0210-00006 REV 01. The purpose of this revision is to remove TBV (to-be-verified)-4110 associated with the output files of the previous version (Ref. 7.30). The purpose of this and the previous calculation is to generate source terms for a representative boiling water reactor (BWR) spent nuclear fuel (SNF) assembly for the first one million years after the SNF is discharged from the reactors. This calculation includes an examination of several ways to represent BWR assemblies and operating conditions in SAS2H in order to quantify the effects these representations may have on source terms. These source terms provide information characterizing the neutron and gamma spectra in particles per second, the decay heat in watts, and radionuclide inventories in curies. Source terms are generated for a range of burnups and enrichments (see Table 2) that are representative of the waste stream and stainless steel (SS) clad assemblies. During this revision, it was determined that the burnups used for the computer runs of the previous revision were actually about 1.7% less than the stated, or nominal, burnups. See Section 6.6 for a discussion of how to account for this effect before using any source terms from this calculation. The source term due to the activation of corrosion products deposited on the surfaces of the assembly from the coolant is also calculated. The results of this calculation support many areas of the Monitored Geologic Repository (MGR), which include thermal evaluation, radiation dose determination, radiological safety analyses, surface and subsurface facility designs, and total system performance assessment. This includes MGR items classified as Quality Level 1, for example, the Uncanistered Spent Nuclear Fuel Disposal Container (Ref. 7.27, page 7). Therefore, this calculation is subject to the requirements of the

  20. Hazardous constituent source term. Revision 2

    SciTech Connect

    Not Available

    1994-11-17

    The Department of Energy (DOE) has several facilities that either generate and/or store transuranic (TRU) waste from weapons program research and production. Much of this waste also contains hazardous waste constituents as regulated under Subtitle C of the Resource Conservation and Recovery Act (RCRA). Toxicity characteristic metals in the waste principally include lead, occurring in leaded rubber gloves and shielding. Other RCRA metals may occur as contaminants in pyrochemical salt, soil, debris, and sludge and solidified liquids, as well as in equipment resulting from decontamination and decommissioning activities. Volatile organic compounds (VOCs) contaminate many waste forms as a residue adsorbed on surfaces or occur in sludge and solidified liquids. Due to the presence of these hazardous constituents, applicable disposal regulations include land disposal restrictions established by the Hazardous and Solid Waste Amendments (HSWA). The DOE plans to dispose of TRU-mixed waste from the weapons program in the Waste Isolation Pilot Plant (WIPP) by demonstrating no-migration of hazardous constituents. This paper documents the current technical basis for methodologies proposed to develop a post-closure RCRA hazardous constituent source term. For the purposes of demonstrating no-migration, the hazardous constituent source term is defined as the quantities of hazardous constituents that are available for transport after repository closure. Development of the source term is only one of several activities that will be involved in the no-migration demonstration. The demonstration will also include uncertainty and sensitivity analyses of contaminant transport.

  1. Distributed Power Sources for Mars Colonization

    NASA Astrophysics Data System (ADS)

    Miley, George H.; Shaban, Yasser

    2003-01-01

    One of the fundamental needs for Mars colonization is an abundant source of energy. The total energy system will probably use a mixture of sources based on solar energy, fuel cells, and nuclear energy. Here we concentrate on the possibility of developing a distributed system employing several unique new types of nuclear energy sources, specifically small fusion devices using inertial electrostatic confinement and portable ``battery type'' proton reaction cells.

  2. Spiral arms as cosmic ray source distributions

    NASA Astrophysics Data System (ADS)

    Werner, M.; Kissmann, R.; Strong, A. W.; Reimer, O.

    2015-04-01

    The Milky Way is a spiral galaxy with (or without) a bar-like central structure. There is evidence that the distribution of suspected cosmic ray sources, such as supernova remnants, is associated with the spiral arm structure of galaxies. It is not yet clearly understood what effect such a cosmic ray source distribution has on the particle transport in our Galaxy. We investigate and measure how the propagation of Galactic cosmic rays is affected by a cosmic ray source distribution associated with spiral arm structures. We use the PICARD code to perform high-resolution 3D simulations of electrons and protons in galactic propagation scenarios that include four-arm and two-arm logarithmic spiral cosmic ray source distributions with and without a central bar structure, as well as the spiral arm configuration of the NE2001 model for the distribution of free electrons in the Milky Way. Results of these simulations are compared to an axisymmetric radial source distribution. Also, effects on the cosmic ray flux and spectra due to different positions of the Earth relative to the spiral structure are studied. We find that high energy electrons are strongly confined to their sources and the obtained spectra largely depend on the Earth's position relative to the spiral arms. Similar findings have been obtained for low energy protons and electrons, albeit at a smaller magnitude. We find that even fractional contributions of a spiral arm component to the total cosmic ray source distribution influence the spectra at the Earth. This is apparent when compared to an axisymmetric radial source distribution as well as with respect to the Earth's position relative to the spiral arm structure. We demonstrate that the presence of a Galactic bar manifests itself as an overall excess of low energy electrons at the Earth. Using a spiral arm geometry as a cosmic ray source distribution offers a genuinely new quality of modeling and is used to explain features in cosmic ray spectra at the Earth.

  3. STACE: Source Term Analyses for Containment Evaluations of transport casks

    SciTech Connect

    Seager, K. D.; Gianoulakis, S. E.; Barrett, P. R.; Rashid, Y. R.; Reardon, P. C.

    1992-01-01

    Following the guidance of ANSI N14.5, the STACE methodology provides a technically defensible means for estimating maximum permissible leakage rates. These containment criteria attempt to reflect the true radiological hazard by performing a detailed examination of the spent fuel, CRUD, and residual contamination contributions to the releasable source term. The evaluation of the spent fuel contribution to the source term has been modeled fairly accurately using the STACE methodology. The structural model predicts the cask drop load history, the mechanical response of the fuel assembly, and the probability of cladding breach. These data are then used to predict the amount of fission gas, volatile species, and fuel fines that are releasable from the cask. There are some areas where data are sparse or lacking (e.g., the quantity and size distribution of fuel rod breaches) in which experimental validation is planned. The CRUD spallation fraction is the major area where no quantitative data has been found; therefore, this also requires experimental validation. In the interim, STACE conservatively assumes a 100% spallation fraction for computing the releasable activity. The source term methodology also conservatively assumes that there is 1 Ci of residual contamination available for release in the transport cask. However, residual contamination is still by far the smallest contributor to the source term activity.

  4. Source term evaluation for combustion modeling

    NASA Technical Reports Server (NTRS)

    Sussman, Myles A.

    1993-01-01

    A modification is developed for application to the source terms used in combustion modeling. The modification accounts for the error of the finite difference scheme in regions where chain-branching chemical reactions produce exponential growth of species densities. The modification is first applied to a one-dimensional scalar model problem. It is then generalized to multiple chemical species, and used in quasi-one-dimensional computations of shock-induced combustion in a channel. Grid refinement studies demonstrate the improved accuracy of the method using this modification. The algorithm is applied in two spatial dimensions and used in simulations of steady and unsteady shock-induced combustion. Comparisons with ballistic range experiments give confidence in the numerical technique and the 9-species hydrogen-air chemistry model.

  5. Improved source term estimation using blind outlier detection

    NASA Astrophysics Data System (ADS)

    Martinez-Camara, Marta; Bejar Haro, Benjamin; Vetterli, Martin; Stohl, Andreas

    2014-05-01

    Emissions of substances into the atmosphere are produced in situations such as volcano eruptions, nuclear accidents or pollutant releases. It is necessary to know the source term - how the magnitude of these emissions changes with time - in order to predict the consequences of the emissions, such as high radioactivity levels in a populated area or a high concentration of volcanic ash in an aircraft flight corridor. However, in general, we know neither how much material was released in total, nor the relative variation of emission strength with time. Hence, estimating the source term is a crucial task. Estimating the source term generally involves solving an ill-posed linear inverse problem using datasets of sensor measurements. Several so-called inversion methods have been developed for this task. Unfortunately, objective quantitative evaluation of the performance of inversion methods is difficult due to the fact that the ground truth is unknown for practically all the available measurement datasets. In this work we use the European Tracer Experiment (ETEX) - a rare example of an experiment where the ground truth is available - to develop and to test new source estimation algorithms. Knowledge of the ground truth grants us access to the additive error term. We show that the distribution of this error is heavy-tailed, which means that some measurements are outliers. We also show that precisely these outliers severely degrade the performance of traditional inversion methods. Therefore, we develop blind outlier detection algorithms specifically suited to the source estimation problem. Then, we propose new inversion methods that combine traditional regularization techniques with blind outlier detection. Such hybrid methods reduce the error of reconstruction of the source term by up to 45% with respect to previously proposed methods.
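
    The sketch below illustrates the general point with a generic robust scheme, not the authors' blind outlier detection algorithm: a Tikhonov-regularized solution of y = Mx is computed repeatedly while measurements with large residuals are down-weighted, which limits the damage done by heavy-tailed errors. The data, matrix, and weighting rule are all synthetic assumptions.

      # Generic sketch (not the authors' method): regularized inversion of y = Mx
      # with iterative down-weighting of outlier measurements.
      import numpy as np

      rng = np.random.default_rng(2)
      n_obs, n_src = 120, 40
      M = rng.random((n_obs, n_src))                   # source-receptor sensitivities (synthetic)
      x_true = np.maximum(0, rng.normal(1.0, 0.5, n_src))
      y = M @ x_true + 0.05 * rng.standard_normal(n_obs)
      y[rng.choice(n_obs, 8, replace=False)] += 10.0   # gross outliers

      alpha = 1.0
      w = np.ones(n_obs)                               # per-measurement weights
      for _ in range(10):
          W = np.diag(w)
          x_hat = np.linalg.solve(M.T @ W @ M + alpha * np.eye(n_src), M.T @ W @ y)
          r = y - M @ x_hat
          s = 1.4826 * np.median(np.abs(r - np.median(r)))        # robust residual scale (MAD)
          w = np.where(np.abs(r) < 3 * s, 1.0, 3 * s / np.maximum(np.abs(r), 1e-12))  # Huber-like weights

      print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))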

  6. TRIGA MARK-II source term

    NASA Astrophysics Data System (ADS)

    Usang, M. D.; Hamzah, N. S.; J. B., Abi M.; M. Z., M. Rawi; Abu, M. P.

    2014-02-01

    ORIGEN 2.2 is employed to obtain data on the γ source term and the radioactivity of irradiated TRIGA fuel. The fuel composition is specified in grams for use as input data. Three types of fuel are irradiated in the reactor, each differing from the others in the amount of uranium relative to the total weight. Each fuel is irradiated for 365 days with a 50-day time step. We obtain results on the total radioactivity of the fuel, the composition of activated materials, the composition of fission products and the photon spectrum of the burned fuel. We investigate the differences in results between the BWR and PWR libraries for ORIGEN. Finally, we compare the composition of major nuclides after 1 year of irradiation for both ORIGEN libraries with results from WIMS. We found only minor disagreements between the yields of the PWR and BWR libraries. In comparison with WIMS, the errors are somewhat more pronounced. To overcome these errors, the irradiation power used in ORIGEN could be increased slightly, so that the differences between the ORIGEN and WIMS yields are reduced. A more permanent solution is to use a different code altogether to simulate burnup, such as DRAGON or ORIGEN-S. The results of this study are essential for the design of radiation shielding from the fuel.

  7. TRIGA MARK-II source term

    SciTech Connect

    Usang, M. D.; Hamzah, N. S.; Abi, M. J. B.; Rawi, M. Z. M.; Abu, M. P.

    2014-02-12

    ORIGEN 2.2 is employed to obtain data on the γ source term and the radioactivity of irradiated TRIGA fuel. The fuel composition is specified in grams for use as input data. Three types of fuel are irradiated in the reactor, each differing from the others in the amount of uranium relative to the total weight. Each fuel is irradiated for 365 days with a 50-day time step. We obtain results on the total radioactivity of the fuel, the composition of activated materials, the composition of fission products and the photon spectrum of the burned fuel. We investigate the differences in results between the BWR and PWR libraries for ORIGEN. Finally, we compare the composition of major nuclides after 1 year of irradiation for both ORIGEN libraries with results from WIMS. We found only minor disagreements between the yields of the PWR and BWR libraries. In comparison with WIMS, the errors are somewhat more pronounced. To overcome these errors, the irradiation power used in ORIGEN could be increased slightly, so that the differences between the ORIGEN and WIMS yields are reduced. A more permanent solution is to use a different code altogether to simulate burnup, such as DRAGON or ORIGEN-S. The results of this study are essential for the design of radiation shielding from the fuel.

  8. Over-Distribution in Source Memory

    PubMed Central

    Brainerd, C. J.; Reyna, V. F.; Holliday, R. E.; Nakamura, K.

    2012-01-01

    Semantic false memories are confounded with a second type of error, over-distribution, in which items are attributed to contradictory episodic states. Over-distribution errors have proved to be more common than false memories when the two are disentangled. We investigated whether over-distribution is prevalent in another classic false memory paradigm: source monitoring. It is. Conventional false memory responses (source misattributions) were predominantly over-distribution errors, but unlike semantic false memory, over-distribution also accounted for more than half of true memory responses (correct source attributions). Experimental control of over-distribution was achieved via a series of manipulations that affected either recollection of contextual details or item memory (concreteness, frequency, list-order, number of presentation contexts, and individual differences in verbatim memory). A theoretical model (conjoint process dissociation) was used to analyze the data; it predicts that (a) over-distribution is directly proportional to item memory but inversely proportional to recollection and (b) item memory is not a necessary precondition for recollection of contextual details. The results were consistent with both predictions. PMID:21942494

  9. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The... to January 10, 1997, who seek to revise the current accident source term used in their design...

  10. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The... to January 10, 1997, who seek to revise the current accident source term used in their design...

  11. Bayesian Estimation of Prior Variance in Source Term Determination

    NASA Astrophysics Data System (ADS)

    Smidl, Vaclav; Hofman, Radek

    2015-04-01

    The problem of determining the source term of an atmospheric release is studied. We assume that the observations y are obtained as a linear combination of the source term, x, and the source-receptor sensitivities, which can be written in matrix notation as y = Mx with source-receptor sensitivity matrix M. Direct estimation of the source term vector x is not possible since the system is often ill-conditioned. The solution is thus found by minimization of a cost function with regularization terms. A typical cost function is: C(x) = (y - Mx)^T R^(-1) (y - Mx) + α x^T D^T D x, (1) where the first term minimizes the error of the measurements with covariance matrix R, and the second term is the regularization with weight α. Various types of regularization arise for different choices of the matrix D. For example, Tikhonov regularization arises for D in the form of the identity matrix, and smoothing regularization for D in the form of a tri-diagonal matrix (Laplacian operator). Typically, the form of the matrix D is assumed to be known, and the weight α is optimized manually by a trial and error procedure. In this contribution, we use the probabilistic formulation of the problem, where the term (α D^T D)^(-1) is interpreted as the covariance matrix of the prior distribution of x. Following the Bayesian approach, we relax the assumption of known α and D and assume that these are unknown and estimated from the data. The general problem is not analytically tractable and approximate estimation techniques have to be used. We present a Variational Bayesian solution of two special cases of the prior covariance matrix. First, the structure of D is assumed to be known and only the weight α is estimated. Application of the Variational Bayes method to this case yields an iterative estimation algorithm. In the first step, the usual optimization problem is solved with the current estimate of α. In the next step, the value of α is re-estimated and the procedure returns to the first step. Positivity of the solution is guaranteed
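
    A rough sketch of this alternating scheme is given below. It is not the authors' Variational Bayes derivation: for a fixed α the Tikhonov problem from Eq. (1) is solved, then α is re-estimated from the current solution with a crude point estimate, and the two steps are iterated. The data, the choice D = I, and the α update are illustrative assumptions only.

      # Hedged sketch: alternate between solving Eq. (1) for fixed alpha and
      # re-estimating alpha from the current solution (crude stand-in update).
      import numpy as np

      rng = np.random.default_rng(3)
      n_obs, n_src = 80, 50
      M = rng.random((n_obs, n_src))                       # source-receptor sensitivities (synthetic)
      x_true = np.maximum(0, rng.normal(1.0, 0.5, n_src))
      R = 0.01 * np.eye(n_obs)                             # measurement error covariance
      y = M @ x_true + rng.multivariate_normal(np.zeros(n_obs), R)
      D = np.eye(n_src)                                    # Tikhonov choice of D

      Rinv = np.linalg.inv(R)
      alpha = 1.0
      for _ in range(20):
          # minimize (y - Mx)^T R^-1 (y - Mx) + alpha x^T D^T D x for fixed alpha
          A = M.T @ Rinv @ M + alpha * D.T @ D
          x_hat = np.linalg.solve(A, M.T @ Rinv @ y)
          x_hat = np.maximum(x_hat, 0.0)                   # enforce positivity crudely
          # crude re-estimate of alpha (stand-in for the Bayesian update)
          alpha = n_src / (x_hat @ D.T @ D @ x_hat + 1e-12)

      print("alpha:", alpha)
      print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))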

  12. Long-term source monitoring with BATSE

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Harmon, B. A.; Finger, M. H.; Fishman, G. J.; Meegan, C. A.; Paciesas, W. S.

    1992-01-01

    The uncollimated Burst and Transient Source Experiment (BATSE) large area detectors (LADs) are well suited to nearly continuous monitoring of the stronger hard x-ray sources, and to time series analysis for pulsars. An overview of the analysis techniques presently being applied to the data is given, including representative observations of the Crab Nebula and Crab pulsar, and summaries of the sources detected to date. Results of a search for variability in the Crab Pulsar pulse profile are presented.

  13. Distributed transform coding via source-splitting

    NASA Astrophysics Data System (ADS)

    Yahampath, Pradeepa

    2012-12-01

    Transform coding (TC) is one of the best known practical methods for quantizing high-dimensional vectors. In this article, a practical approach to distributed TC of jointly Gaussian vectors is presented. This approach, referred to as source-split distributed transform coding (SP-DTC), can be used to easily implement two terminal transform codes for any given rate-pair. The main idea is to apply source-splitting using orthogonal-transforms, so that only Wyner-Ziv (WZ) quantizers are required for compression of transform coefficients. This approach however requires optimizing the bit allocation among dependent sets of WZ quantizers. In order to solve this problem, a low-complexity tree-search algorithm based on analytical models for transform coefficient quantization is developed. A rate-distortion (RD) analysis of SP-DTCs for jointly Gaussian sources is presented, which indicates that these codes can significantly outperform the practical alternative of independent TC of each source, whenever there is a strong correlation between the sources. For practical implementation of SP-DTCs, the idea of using conditional entropy constrained (CEC) quantizers followed by Slepian-Wolf coding is explored. Experimental results obtained with SP-DTC designs based on both CEC scalar quantizers and CEC trellis-coded quantizers demonstrate that actual implementations of SP-DTCs can achieve RD performance close to the analytically predicted limits.

  14. Calculation of external dose from distributed source

    SciTech Connect

    Kocher, D.C.

    1986-01-01

    This paper discusses a relatively simple calculational method, called the point kernel method (Fo68), for estimating external dose from distributed sources that emit photon or electron radiations. The principles of the point kernel method are emphasized, rather than the presentation of extensive sets of calculations or tables of numerical results. A few calculations are presented for simple source geometries as illustrations of the method, and references and descriptions are provided for other calculations in the literature. This paper also describes exposure situations for which the point kernel method is not appropriate and other, more complex, methods must be used, but these methods are not discussed in any detail.
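
    A minimal numerical sketch of the point kernel idea for photons follows; it is not taken from the paper. A slab source is discretized into point sources and the uncollided flux at a detector is summed as exp(-μr)/(4πr²) contributions. Buildup factors, realistic attenuation coefficients, and flux-to-dose conversion are all omitted, and every constant is an assumption.

      # Hedged sketch of the point kernel method: uncollided photon flux from a
      # distributed slab source discretized into point kernels.
      import numpy as np

      mu = 0.01          # attenuation coefficient [1/cm], assumed
      S_v = 1.0e3        # volumetric source strength [photons/s/cm^3], assumed

      # Discretize a 100 cm x 100 cm x 10 cm slab source into point kernels
      xs = np.linspace(-50, 50, 25)
      ys = np.linspace(-50, 50, 25)
      zs = np.linspace(0, 10, 5)
      dV = (xs[1] - xs[0]) * (ys[1] - ys[0]) * (zs[1] - zs[0])
      X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")

      detector = np.array([0.0, 0.0, 110.0])     # detector 1 m above the slab surface
      r = np.sqrt((X - detector[0])**2 + (Y - detector[1])**2 + (Z - detector[2])**2)

      # Sum of point kernel contributions: S*dV * exp(-mu*r) / (4*pi*r^2)
      flux = np.sum(S_v * dV * np.exp(-mu * r) / (4 * np.pi * r**2))
      print("uncollided flux at detector [photons/s/cm^2]:", flux)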

  15. Open Source Live Distributions for Computer Forensics

    NASA Astrophysics Data System (ADS)

    Giustini, Giancarlo; Andreolini, Mauro; Colajanni, Michele

    Current distributions of open source forensic software provide digital investigators with a large set of heterogeneous tools. Their use is not always focused on the target and requires high technical expertise. We present a new GNU/Linux live distribution, named CAINE (Computer Aided INvestigative Environment) that contains a collection of tools wrapped up into a user friendly environment. The CAINE forensic framework introduces novel important features, aimed at filling the interoperability gap across different forensic tools. Moreover, it provides a homogeneous graphical interface that drives digital investigators during the acquisition and analysis of electronic evidence, and it offers a semi-automatic mechanism for the creation of the final report.

  16. Photocounting distributions for exponentially decaying sources.

    PubMed

    Teich, M C; Card, H C

    1979-05-01

    Exact photocounting distributions are obtained for a pulse of light whose intensity is exponentially decaying in time, when the underlying photon statistics are Poisson. It is assumed that the starting time for the sampling interval (which is of arbitrary duration) is uniformly distributed. The probability of registering n counts in the fixed time T is given in terms of the incomplete gamma function for n ≥ 1 and in terms of the exponential integral for n = 0. Simple closed-form expressions are obtained for the count mean and variance. The results are expected to be of interest in certain studies involving spontaneous emission, radiation damage in solids, and nuclear counting. They will also be useful in neurobiology and psychophysics, since habituation and sensitization processes may sometimes be characterized by the same stochastic model. PMID:19687829
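
    The setup can be cross-checked numerically without the paper's closed forms: average the Poisson count distribution over a uniformly distributed start time, with intensity I(t) = I0 exp(-t/τ). The sketch below does this by quadrature; all parameter values are assumed, and the probabilities should simply sum to one over n.

      # Numerical cross-check of the described setup (not the paper's closed form).
      import numpy as np
      from scipy import integrate
      from math import factorial

      I0, tau, T, T0 = 5.0, 1.0, 2.0, 3.0    # peak rate, decay time, count window, start-time range (assumed)

      def p_n(n):
          # mean count if sampling starts at time s: integral of I(t) over [s, s+T]
          def integrand(s):
              m = I0 * tau * (np.exp(-s / tau) - np.exp(-(s + T) / tau))
              return np.exp(-m) * m**n / factorial(n)          # Poisson probability of n counts
          val, _ = integrate.quad(integrand, 0.0, T0)
          return val / T0                                       # average over uniform start time in [0, T0]

      probs = [p_n(n) for n in range(30)]
      print("sum of probabilities:", sum(probs))                # should be close to 1
      print("mean count:", sum(n * p for n, p in enumerate(probs)))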

  17. Sensitivity analysis of distributed volcanic source inversion

    NASA Astrophysics Data System (ADS)

    Cannavo', Flavio; Camacho, Antonio G.; González, Pablo J.; Puglisi, Giuseppe; Fernández, José

    2016-04-01

    A recently proposed algorithm (Camacho et al., 2011) claims to rapidly estimate magmatic sources from surface geodetic data without any a priori assumption about source geometry. The algorithm takes advantage of the fast calculation of analytical models and adds the capability to model free-shape distributed sources. Assuming homogeneous elastic conditions, the approach can determine general geometrical configurations of pressure and/or density sources and/or sliding structures corresponding to prescribed values of anomalous density, pressure and slip. These source bodies are described as aggregations of elemental point sources for pressure, density and slip, and they fit the whole data set (keeping some 3D regularity conditions). Although some examples and applications have already been presented to demonstrate the ability of the algorithm to reconstruct a magma pressure source (e.g. Camacho et al., 2011; Cannavò et al., 2015), a systematic analysis of the sensitivity and reliability of the algorithm is still lacking. In this explorative work we present results from a large statistical test designed to evaluate the advantages and limitations of the methodology by assessing its sensitivity to the free and constrained parameters involved in inversions. In particular, besides the source parameters, we focused on the ground deformation network topology and on noise in the measurements. The proposed analysis can be used for a better interpretation of the algorithm results in real-case applications. Camacho, A. G., González, P. J., Fernández, J. & Berrino, G. (2011) Simultaneous inversion of surface deformation and gravity changes by means of extended bodies with a free geometry: Application to deforming calderas. J. Geophys. Res. 116. Cannavò, F., Camacho, A. G., González, P. J., Mattia, M., Puglisi, G., Fernández, J. (2015) Real Time Tracking of Magmatic Intrusions by means of Ground Deformation Modeling during Volcanic Crises, Scientific Reports, 5 (10970) doi:10.1038/srep

  18. Particle size distribution of indoor aerosol sources

    SciTech Connect

    Shah, K.B.

    1990-10-24

    As concern about Indoor Air Quality (IAQ) has grown in recent years, it has become necessary to determine the nature of particles produced by different indoor aerosol sources and the typical concentrations that these sources tend to produce. These data are important in predicting the dose of particles to people exposed to these sources, and they will also enable effective mitigation procedures to be taken. Further, they will also help in designing appropriate air cleaners. A state-of-the-art technique, the DMPS (Differential Mobility Particle Sizer) system, is used to determine the particle size distributions of a number of sources. This system employs the electrical mobility characteristics of these particles and is very effective in the 0.01-1.0 μm size range. A modified system that can measure particle sizes in the lower size range down to 3 nm was also used. Experimental results for various aerosol sources are presented in the ensuing chapters. 37 refs., 20 figs., 2 tabs.

  19. Quantum key distribution with entangled photon sources

    NASA Astrophysics Data System (ADS)

    Ma, Xiongfeng; Fung, Chi-Hang Fred; Lo, Hoi-Kwong

    2007-07-01

    A parametric down-conversion (PDC) source can be used as either a triggered single-photon source or an entangled-photon source in quantum key distribution (QKD). The triggering PDC QKD has already been studied in the literature. On the other hand, a model and a post-processing protocol for the entanglement PDC QKD are still missing. We fill in this important gap by proposing such a model and a post-processing protocol for the entanglement PDC QKD. Although the PDC model is proposed to study the entanglement-based QKD, we emphasize that our generic model may also be useful for other non-QKD experiments involving a PDC source. Since an entangled PDC source is a basis-independent source, we apply Koashi and Preskill’s security analysis to the entanglement PDC QKD. We also investigate the entanglement PDC QKD with two-way classical communications. We find that the recurrence scheme increases the key rate and the Gottesman-Lo protocol helps tolerate higher channel losses. By simulating a recent 144-km open-air PDC experiment, we compare three implementations: entanglement PDC QKD, triggering PDC QKD, and coherent-state QKD. The simulation result suggests that the entanglement PDC QKD can tolerate higher channel losses than the coherent-state QKD. The coherent-state QKD with decoy states is able to achieve the highest key rate in the low- and medium-loss regions. By applying the Gottesman-Lo two-way post-processing protocol, the entanglement PDC QKD can tolerate up to 70 dB combined channel losses (35 dB for each channel) provided that the PDC source is placed in between Alice and Bob. After considering statistical fluctuations, the PDC setup can tolerate up to 53 dB channel losses.

  20. State of the hydrologic source term

    SciTech Connect

    Kersting, A.

    1996-12-01

    The Underground Test Area (UGTA) Operable Unit was defined by the U.S. Department of Energy, Nevada Operations Office to characterize and potentially remediate groundwaters impacted by nuclear testing at the Nevada Test Site (NTS). Between 1955 and 1992, 828 nuclear devices were detonated underground at the NTS (DOE, 1994). Approximately one third of the nuclear tests were detonated at or below the standing water table and the remainder were located above the water table in the vadose zone. As a result, the distribution of radionuclides in the subsurface and, in particular, the availability of radionuclides for transport away from individual test cavities are major concerns at the NTS. The approach taken is to carry out field-based studies of both groundwaters and host rocks within the near-field in order to develop a detailed understanding of the present-day concentration and spatial distribution of constituent radionuclides. Understanding the current distribution of contamination within the near-field and the conditions under which, and processes by which, the radionuclides were transported makes it possible to predict future transport behavior. The results of these studies will be integrated with archival research, experiments and geochemical modeling for complete characterization.

  1. Experimental quantum key distribution with source flaws

    NASA Astrophysics Data System (ADS)

    Xu, Feihu; Wei, Kejin; Sajeed, Shihan; Kaiser, Sarah; Sun, Shihai; Tang, Zhiyuan; Qian, Li; Makarov, Vadim; Lo, Hoi-Kwong

    2015-09-01

    Decoy-state quantum key distribution (QKD) is a standard technique in current quantum cryptographic implementations. Unfortunately, existing experiments have two important drawbacks: the state preparation is assumed to be perfect without errors and the employed security proofs do not fully consider the finite-key effects for general attacks. These two drawbacks mean that existing experiments are not guaranteed to be secure in practice. Here, we perform an experiment that shows secure QKD with imperfect state preparations over long distances and achieves rigorous finite-key security bounds for decoy-state QKD against coherent attacks in the universally composable framework. We quantify the source flaws experimentally and demonstrate a QKD implementation that is tolerant to channel loss despite the source flaws. Our implementation considers more real-world problems than most previous experiments, and our theory can be applied to general discrete-variable QKD systems. These features constitute a step towards secure QKD with imperfect devices.

  2. High power distributed x-ray source

    NASA Astrophysics Data System (ADS)

    Frutschy, Kris; Neculaes, Bogdan; Inzinna, Lou; Caiafa, Antonio; Reynolds, Joe; Zou, Yun; Zhang, Xi; Gunturi, Satish; Cao, Yang; Waters, Bill; Wagner, Dave; De Man, Bruno; McDevitt, Dan; Roffers, Rick; Lounsberry, Brian; Pelc, Norbert J.

    2010-04-01

    This paper summarizes the development of a distributed x-ray source with up to 60kW demonstrated instantaneous power. Component integration and test results are shown for the dispenser cathode electron gun, fast switching controls, high voltage stand-off insulator, brazed anode, and vacuum system. The current multisource prototype has been operated for over 100 hours without failure, and additional testing is needed to discover the limiting component. Example focal spot measurements and x-ray radiographs are included. Lastly, future development opportunities are highlighted.

  3. Stochastic Models for the Distribution of Index Terms.

    ERIC Educational Resources Information Center

    Nelson, Michael J.

    1989-01-01

    Presents a probability model of the occurrence of index terms used to derive discrete distributions which are mixtures of Poisson and negative binomial distributions. These distributions give better fits than the simpler Zipf distribution, have the advantage of being more explanatory, and can incorporate a time parameter if necessary. (25…
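
    As an illustration of why such mixtures help, the sketch below fits a negative binomial (a gamma mixture of Poissons) to synthetic per-document term counts by the method of moments and compares the fitted probabilities with observed frequencies; a plain Poisson would force the variance to equal the mean. The data and parameters are invented.

      # Illustrative sketch: method-of-moments negative binomial fit to synthetic term counts.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      counts = stats.nbinom.rvs(n=2.0, p=0.3, size=5000, random_state=rng)   # synthetic data

      mean, var = counts.mean(), counts.var()
      # Method of moments for NB with mean = n(1-p)/p and variance = n(1-p)/p^2
      p_hat = mean / var
      n_hat = mean * p_hat / (1.0 - p_hat)
      print(f"mean={mean:.2f} var={var:.2f} (a Poisson would force var == mean)")
      print(f"fitted negative binomial: n={n_hat:.2f}, p={p_hat:.2f}")

      # Compare observed and fitted frequencies for small counts
      for k in range(5):
          obs = np.mean(counts == k)
          fit = stats.nbinom.pmf(k, n_hat, p_hat)
          print(k, round(obs, 3), round(fit, 3))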

  4. Optimum target source term estimation for high energy electron accelerators

    NASA Astrophysics Data System (ADS)

    Nayak, M. K.; Sahu, T. K.; Nair, Haridas G.; Nandedkar, R. V.; Bandyopadhyay, Tapas; Tripathi, R. M.; Hannurkar, P. R.

    2016-05-01

    The optimum target for bremsstrahlung emission is defined as the thickness of the target material that produces the maximum bremsstrahlung yield when electrons interact with the target. The bremsstrahlung dose rate per unit electron beam power at a distance of 1 m from the target material gives the optimum target source term. In the present work, simulations were performed for three electron energies, 450, 1000, and 2500 MeV, using the EGSnrc Monte Carlo code to determine the optimum thickness. An empirical relation for the optimum target thickness as a function of electron energy and atomic number of the target material was derived from the results. Using the simulated optimum target thickness, experiments were conducted to determine the optimum target source term. For the experimental determination, the two available electron energies from the booster synchrotron of the Indus facility, 450 MeV and 550 MeV, were used. The optimum target source terms for these two energies were also simulated. The experimental and simulated source terms were found to be in very good agreement, within ±3%. Based on this agreement at 450 MeV and 550 MeV, the same simulation methodology was used to simulate the optimum target source term up to 2500 MeV. The paper describes the simulations and experiments carried out on the optimum target bremsstrahlung source term and the results obtained.
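
    The optimum-thickness search itself is simple once a yield curve is available: scan the dose rate per unit beam power over target thickness and take the maximum. In the sketch below the yield function is a made-up placeholder (the paper's curves come from EGSnrc simulations and measurements), so only the search pattern is illustrated.

        # Sketch: locate the optimum target thickness on a yield-vs-thickness curve.
        # The curve is a hypothetical placeholder, not EGSnrc output.
        import numpy as np

        def dose_rate_per_unit_power(thickness_mm):
            """Placeholder bremsstrahlung source term vs. thickness: rises as more
            electrons radiate, then falls as self-absorption takes over."""
            return thickness_mm * np.exp(-thickness_mm / 4.0)

        thickness = np.linspace(0.1, 20.0, 400)            # mm
        yield_curve = dose_rate_per_unit_power(thickness)
        i_opt = int(np.argmax(yield_curve))
        print(f"optimum thickness ~ {thickness[i_opt]:.1f} mm "
              f"(source term ~ {yield_curve[i_opt]:.2f}, arbitrary units)")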

  5. CONSTRAINING SOURCE REDSHIFT DISTRIBUTIONS WITH GRAVITATIONAL LENSING

    SciTech Connect

    Wittman, D.; Dawson, W. A.

    2012-09-10

    We introduce a new method for constraining the redshift distribution of a set of galaxies, using weak gravitational lensing shear. Instead of using observed shears and redshifts to constrain cosmological parameters, we ask how well the shears around clusters can constrain the redshifts, assuming fixed cosmological parameters. This provides a check on photometric redshifts, independent of source spectral energy distribution properties and therefore free of confounding factors such as misidentification of spectral breaks. We find that ~40 massive (σ_v = 1200 km s^-1) cluster lenses are sufficient to determine the fraction of sources in each of six coarse redshift bins to ~11%, given weak (20%) priors on the masses of the highest-redshift lenses, tight (5%) priors on the masses of the lowest-redshift lenses, and only modest (20%-50%) priors on calibration and evolution effects. Additional massive lenses drive down uncertainties as N_lens^(-1/2), but the improvement slows as one is forced to use lenses further down the mass function. Future large surveys contain enough clusters to reach 1% precision in the bin fractions if the tight lens-mass priors can be maintained for large samples of lenses. In practice this will be difficult to achieve, but the method may be valuable as a complement to other more precise methods because it is based on different physics and therefore has different systematic errors.

  6. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    SciTech Connect

    Pete Lowry

    2012-01-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  7. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    SciTech Connect

    Pete Lowry

    2012-10-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  8. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    SciTech Connect

    Pete Lowry

    2012-02-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  9. Panchromatic spectral energy distributions of Herschel sources

    NASA Astrophysics Data System (ADS)

    Berta, S.; Lutz, D.; Santini, P.; Wuyts, S.; Rosario, D.; Brisbin, D.; Cooray, A.; Franceschini, A.; Gruppioni, C.; Hatziminaoglou, E.; Hwang, H. S.; Le Floc'h, E.; Magnelli, B.; Nordon, R.; Oliver, S.; Page, M. J.; Popesso, P.; Pozzetti, L.; Pozzi, F.; Riguccini, L.; Rodighiero, G.; Roseboom, I.; Scott, D.; Symeonidis, M.; Valtchanov, I.; Viero, M.; Wang, L.

    2013-03-01

    Combining far-infrared Herschel photometry from the PACS Evolutionary Probe (PEP) and Herschel Multi-tiered Extragalactic Survey (HerMES) guaranteed time programs with ancillary datasets in the GOODS-N, GOODS-S, and COSMOS fields, it is possible to sample the 8-500 μm spectral energy distributions (SEDs) of galaxies with at least 7-10 bands. Extending to the UV, optical, and near-infrared, the number of bands increases up to 43. We reproduce the distribution of galaxies in a carefully selected rest-frame ten-color space, based on this rich data set, using a superposition of multivariate Gaussian modes. We use this model to classify galaxies and build median SEDs of each class, which are then fitted with a modified version of the magphys code that combines stellar light, emission from dust heated by stars, and a possible warm dust contribution heated by an active galactic nucleus (AGN). The color distribution of galaxies in each of the considered fields can be well described with the combination of 6-9 classes, spanning a large range of far- to near-infrared luminosity ratios, as well as different strengths of the AGN contribution to bolometric luminosities. The defined Gaussian grouping is used to identify rare or odd sources. The zoology of outliers includes Herschel-detected ellipticals, very blue z ~ 1 Ly-break galaxies, quiescent spirals, and torus-dominated AGN with star formation. Out of these groups and outliers, a new template library is assembled, consisting of 32 SEDs describing the intrinsic scatter in the rest-frame UV-to-submm colors of infrared galaxies. This library is tested against L(IR) estimates with and without Herschel data included, and compared to eight other popular methods often adopted in the literature. When implementing Herschel photometry, these approaches produce L(IR) values consistent with each other within a median absolute deviation of 10-20%, the scatter being dominated more by fine tuning of the codes, rather than by the choice of
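
    The color-space grouping described above can be mimicked with an off-the-shelf Gaussian mixture model. In the sketch below the "ten-color" data are random stand-ins, the number of components is fixed at seven, and outliers are flagged with a simple log-likelihood cut; none of these choices come from the paper.

        # Sketch: multivariate Gaussian mixture over a color space, with outlier flagging.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        colors = rng.normal(size=(2000, 10))      # placeholder rest-frame colors

        gmm = GaussianMixture(n_components=7, covariance_type="full", random_state=1)
        labels = gmm.fit_predict(colors)          # class assignment per galaxy
        logdens = gmm.score_samples(colors)       # per-object log-likelihood

        outliers = np.flatnonzero(logdens < np.percentile(logdens, 1))  # rare/odd sources
        classes = np.unique(labels)
        median_seds = {k: np.median(colors[labels == k], axis=0) for k in classes}
        print(f"{len(classes)} classes, {len(outliers)} outliers flagged")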

  10. Evaluation of source-term data for plutonium aerosolization

    SciTech Connect

    Haschke, J.M.

    1992-07-01

    Relevant data are reviewed and evaluated in an effort to define the time dependence and maximum value of the source term for plutonium aerosolization during a fuel fire. The rate of plutonium oxidation at high temperatures is a major determinant of the time dependence. Analysis of temperature-time data for oxidation of plutonium shows that the rate is constant (0.2 g PuO₂/cm² of metal surface per min) and independent of temperature above 500°C. Total mass and particle distributions are derived for oxide products formed by reactions of plutonium metal and hydride. The mass distributions for products of all metal-gas reactions are remarkably similar with approximately 0.07 mass% of the oxide particles having geometric diameters ≤ 10 µm. In comparison, 25 mass% of the oxide formed by the PuH₂ + O₂ reaction is in this range. Experimental values of mass fractions released during oxidation are evaluated and factors that alter the release fraction are discussed.
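
    Taking only the two figures quoted above (a constant oxidation rate of 0.2 g PuO₂ per cm² of metal surface per minute above 500°C, and roughly 0.07 mass% of the metal-oxidation product in particles of 10 µm or less), a back-of-the-envelope estimate of the aerosolizable mass looks like the sketch below; the exposed area and fire duration are arbitrary example values.

        # Back-of-the-envelope estimate using the abstract's figures (illustrative only).
        RATE_G_PER_CM2_MIN = 0.2      # g PuO2 per cm2 of metal surface per minute (>500 C)
        RESPIRABLE_FRACTION = 0.0007  # ~0.07 mass% of oxide in particles <= 10 micrometers

        area_cm2 = 100.0              # assumed exposed metal surface (example value)
        duration_min = 30.0           # assumed fire duration (example value)

        oxide_mass_g = RATE_G_PER_CM2_MIN * area_cm2 * duration_min
        aerosolizable_g = oxide_mass_g * RESPIRABLE_FRACTION
        print(f"oxide formed: {oxide_mass_g:.0f} g; "
              f"~{aerosolizable_g:.2f} g in particles <= 10 micrometers")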

  11. Source Term Model for an Array of Vortex Generator Vanes

    NASA Technical Reports Server (NTRS)

    Buning, P. G. (Technical Monitor); Waithe, Kenrick A.

    2003-01-01

    A source term model was developed for numerical simulations of an array of vortex generators. The source term models the side force created by the vortex generator being modeled. The model is obtained by introducing into the momentum and energy equations a side force that adjusts its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low-profile vortex generator vane, which is only a fraction of the boundary layer thickness, over a flat plate. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator, without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the streamwise vorticity and velocity contours very well when compared with both numerical simulations and experimental data.

  12. Revised accident source terms for light-water reactors

    SciTech Connect

    Soffer, L.

    1995-02-01

    This paper presents revised accident source terms for light-water reactors incorporating the severe accident research insights gained in this area over the last 15 years. Current LWR accident source terms used for licensing date from 1962 and are contained in Regulatory Guides 1.3 and 1.4. These specify that 100% of the core inventory of noble gases and 25% of the iodine fission products are assumed to be instantaneously available for release from the containment. The chemical form of the iodine fission products is also assumed to be predominantly elemental iodine. These assumptions have strongly affected present nuclear air cleaning requirements by emphasizing rapid actuation of spray systems and filtration systems optimized to retain elemental iodine. A proposed revision of reactor accident source terms and some implications for nuclear air cleaning requirements was presented at the 22nd DOE/NRC Nuclear Air Cleaning Conference. A draft report was issued by the NRC for comment in July 1992. Extensive comments were received, with the most significant comments involving (a) release fractions for both volatile and non-volatile species in the early in-vessel release phase, (b) gap release fractions of the noble gases, iodine, and cesium, and (c) the timing and duration of the release phases. The final source term report is expected to be issued in late 1994. Although the revised source terms are intended primarily for future plants, current nuclear power plants may request use of revised accident source term insights in licensing as well. This paper emphasizes additional information obtained since the 22nd Conference, including studies on fission product removal mechanisms, results obtained from improved severe accident code calculations, and resolution of major comments, and their impact upon the revised accident source terms. Revised accident source terms for both BWRs and PWRs are presented.
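
    The 1962-vintage assumptions quoted above amount to a one-line calculation per nuclide: multiply the core inventory by the assumed release fraction (1.0 for noble gases, 0.25 for iodine). The inventory numbers in the sketch below are placeholders for illustration, not values from the paper.

        # Sketch: apply the Regulatory Guide 1.3/1.4 release fractions cited above.
        # Core inventories are illustrative placeholders (units: Bq).
        core_inventory = {"Xe-133": 6.0e18, "Kr-85": 2.0e16, "I-131": 3.0e18}
        release_fraction = {"Xe-133": 1.00, "Kr-85": 1.00, "I-131": 0.25}

        available = {nuc: act * release_fraction[nuc] for nuc, act in core_inventory.items()}
        for nuc, act in available.items():
            print(f"{nuc}: {act:.2e} Bq assumed available for release from containment")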

  13. Source term identification in atmospheric modelling via sparse optimization

    NASA Astrophysics Data System (ADS)

    Adam, Lukas; Branda, Martin; Hamburger, Thomas

    2015-04-01

    Inverse modelling plays an important role in identifying the amount of harmful substances released into the atmosphere during major incidents such as power plant accidents or volcano eruptions. Another possible application of inverse modelling lies in monitoring CO2 emission limits, where only observations at certain places are available and the task is to estimate the total releases at given locations. This gives rise to minimizing the discrepancy between the observations and the model predictions. There are two standard ways of solving such problems. In the first one, this discrepancy is regularized by adding additional terms. Such terms may include Tikhonov regularization, distance from a priori information, or a smoothing term. The resulting, usually quadratic, problem is then solved via standard optimization solvers. The second approach assumes that the error term has a (normal) distribution and makes use of Bayesian modelling to identify the source term. Instead of following the above-mentioned approaches, we utilize techniques from the field of compressive sensing. Such techniques look for the sparsest solution (the solution with the smallest number of nonzeros) of a linear system, where a maximal allowed error term may be added to this system. Even though this field is well developed, with many possible solution techniques, most of them do not consider even the simplest constraints which are naturally present in atmospheric modelling. One such example is the nonnegativity of release amounts. We believe that the concept of a sparse solution is natural in both the problem of identifying the source location and that of identifying the time profile of the source release. In the first case, it is usually assumed that there are only a few release points and the task is to find them. In the second case, the time window is usually much longer than the duration of the actual release. In both cases, the optimal solution should contain a large amount of zeros, giving rise to the
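
    A minimal version of the sparse, nonnegative formulation sketched above is an L1-penalized nonnegative least-squares fit. The snippet below uses a generic Lasso solver with a positivity constraint on a toy source-receptor system; the sensitivity matrix, true release profile, and penalty weight are all invented for illustration.

        # Sketch: sparse, nonnegative source-term estimate from receptor observations.
        # Model: y = M x + noise, with M a toy source-receptor sensitivity matrix.
        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(2)
        n_obs, n_times = 50, 200                        # observations vs. candidate release times
        M = np.abs(rng.normal(size=(n_obs, n_times)))   # toy dispersion sensitivities

        x_true = np.zeros(n_times)
        x_true[[40, 41, 42]] = [5.0, 8.0, 3.0]          # short release in a long time window
        y = M @ x_true + 0.1 * rng.normal(size=n_obs)

        # The L1 penalty favors few nonzero releases; positive=True enforces nonnegativity.
        model = Lasso(alpha=0.05, positive=True, fit_intercept=False, max_iter=50_000)
        model.fit(M, y)
        print("estimated nonzero release times:", np.flatnonzero(model.coef_ > 1e-6))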

  14. An Evaluation of Short-Term Distributed Online Learning Events

    ERIC Educational Resources Information Center

    Barker, Bradley; Brooks, David

    2005-01-01

    The purpose of this study was to evaluate the effectiveness of short-term distributed online training events using an adapted version of the compressed evaluation form developed by Wisher and Curnow (1998). Evaluating online distributed training events provides insight into course effectiveness, the contribution of prior knowledge to learning, and…

  15. Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2005-01-01

    A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparing with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared to the two-dimensional plate using a steady mass flow boundary condition, which was used to simulate a steady micro jet. The model was also compared to two three-dimensional flat plate cases using a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
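
    The core idea of the source-term approach, adding the jet's mass flow and momentum to the governing equations in the cells where the jet sits instead of gridding the orifice, can be shown with a schematic finite-volume update. The cell volume, jet properties, and time step below are arbitrary stand-ins; the actual OVERFLOW model is considerably more involved.

        # Schematic: inject jet mass flow and momentum as source terms in selected cells.
        # All numbers are illustrative; this is not the OVERFLOW implementation.
        import numpy as np

        n_cells = 100
        rho = np.full(n_cells, 1.2)          # density field (kg/m^3)
        rho_u = np.zeros(n_cells)            # x-momentum density (kg/(m^2 s))
        cell_volume = 1.0e-6                 # m^3 (assumed)
        dt = 1.0e-5                          # s (assumed)

        jet_cells = [50]                     # cells containing the micro jet
        mdot = 1.0e-5                        # jet mass flow per cell (kg/s), assumed
        u_jet = 50.0                         # jet exit velocity (m/s), assumed

        for _ in range(1000):
            # ... the usual flux-based update of rho and rho_u would go here ...
            for c in jet_cells:
                rho[c] += dt * mdot / cell_volume            # mass source
                rho_u[c] += dt * mdot * u_jet / cell_volume  # momentum source

        print(f"cell 50: density {rho[50]:.3f} kg/m^3, momentum {rho_u[50]:.2f} kg/(m^2 s)")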

  16. Selection of models to calculate the LLW source term

    SciTech Connect

    Sullivan, T.M.

    1991-10-01

    Performance assessment of a LLW disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). In turn, many of these physical processes are influenced by the design of the disposal facility (e.g., infiltration of water). The complexity of the problem and the absence of appropriate data prevent development of an entirely mechanistic representation of radionuclide release from a disposal facility. Typically, a number of assumptions, based on knowledge of the disposal system, are used to simplify the problem. This document provides a brief overview of disposal practices and reviews existing source term models as background for selecting appropriate models for estimating the source term. The selection rationale and the mathematical details of the models are presented. Finally, guidance is presented for combining the inventory data with appropriate mechanisms describing release from the disposal facility. 44 refs., 6 figs., 1 tab.

  17. Effect of convective term on temperature distribution in biological tissue

    NASA Astrophysics Data System (ADS)

    Kengne, Emmanuel; Saydé, Michel; Lakhssassi, Ahmed

    2013-08-01

    We introduce a phase imprint into the order parameter describing the influence of blood flow on the temperature distribution in the tissue described by the one-dimensional Pennes equation and then engineer the imprinted phase suitably to generate a modified Pennes equation with a gradient term (known in the theory of biological systems as convective term) which is associated with the heat convected by the flowing blood. Using the derived model, we analytically investigate temperature distribution in biological tissues subject to two different spatial heating methods. The applicability of our results is illustrated by one of typical bio-heat transfer problems which is often encountered in therapeutic treatment, cancer hyperthermia, laser surgery, thermal injury evaluation, etc. Analyzing the effect of the convective term on temperature distribution, we found that an optimum heating of a biological system can be obtained through regulating the convective term.
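
    A minimal numerical analogue of the modified equation is a one-dimensional Pennes model with an added first-order (convective) term. The explicit finite-difference sketch below uses rough, assumed tissue and blood parameters and a fixed heated-surface boundary, purely to show where the gradient term enters; it is not the paper's analytical solution.

        # Sketch: 1D Pennes bioheat equation with an added convective (gradient) term.
        # Parameter values and the sign convention of the convective term are assumptions.
        import numpy as np

        nx, dx, dt = 201, 0.5e-3, 0.01           # grid points, spacing (m), time step (s)
        k, rho, c = 0.5, 1050.0, 3600.0          # tissue conductivity, density, heat capacity
        wb, rho_b, c_b = 0.5e-3, 1060.0, 3860.0  # blood perfusion (1/s), density, heat capacity
        v = 1.0e-4                               # effective convective velocity (m/s), assumed
        T_a, T_core, T_surf = 37.0, 37.0, 45.0   # arterial, core, heated-surface temps (C)

        T = np.full(nx, T_core)
        T[0] = T_surf                            # spatial heating applied at the surface

        for _ in range(20_000):                  # 200 s of simulated time
            Tn = T.copy()
            diff = k * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2]) / dx**2
            conv = -rho_b * c_b * v * (Tn[2:] - Tn[:-2]) / (2 * dx)  # convective term
            perf = wb * rho_b * c_b * (T_a - Tn[1:-1])               # perfusion term
            T[1:-1] = Tn[1:-1] + dt * (diff + conv + perf) / (rho * c)
            T[0], T[-1] = T_surf, T_core

        print(f"temperature 5 mm below the surface after 200 s: {T[10]:.2f} C")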

  18. Source term and radiological consequences of the Chernobyl accident

    SciTech Connect

    Mourad, R.; Snell, V.

    1987-01-01

    The objective of this work is to assess the source term and to evaluate the maximum hypothetical individual doses in European countries (including the Soviet Union) from the Chernobyl accident through the analyses of measurements of meteorological data, radiation fields, and airborne and deposited activity in these countries. Applying this information to deduce the source term involves a reversal of the techniques of nuclear accident analysis, which estimate the off-site consequences of postulated accidents. In this study the authors predict the quantities of radionuclides that, if released at Chernobyl and following the calculated trajectories, would explain and unify the observed radiation levels and radionuclide concentrations as measured by European countries and the Soviet Union. The simulation uses the PEAR microcomputer program following the methodology described in Canadian Standards Association standard N288.2. The study was performed before the Soviets published their estimate of the source term and the two results are compared.

  19. Problem solving as intelligent retrieval from distributed knowledge sources

    NASA Technical Reports Server (NTRS)

    Chen, Zhengxin

    1987-01-01

    Distributed computing in intelligent systems is investigated from a different perspective. Taking the view that problem solving can be regarded as intelligent knowledge retrieval, the use of distributed knowledge sources in intelligent systems is proposed.

  20. Flowsheets and source terms for radioactive waste projections

    SciTech Connect

    Forsberg, C.W.

    1985-03-01

    Flowsheets and source terms used to generate radioactive waste projections in the Integrated Data Base (IDB) Program are given. Volumes of each waste type generated per unit product throughput have been determined for the following facilities: uranium mining, UF6 conversion, uranium enrichment, fuel fabrication, boiling-water reactors (BWRs), pressurized-water reactors (PWRs), and fuel reprocessing. Source terms for DOE/defense wastes have been developed. Expected wastes from typical decommissioning operations for each facility type have been determined. All wastes are also characterized by isotopic composition at time of generation and by general chemical composition. 70 references, 21 figures, 53 tables.

  1. Spallation Neutron Source Accident Terms for Environmental Impact Statement Input

    SciTech Connect

    Devore, J.R.; Harrington, R.M.

    1998-08-01

    This report is about accidents with the potential to release radioactive materials into the environment surrounding the Spallation Neutron Source (SNS). As shown in Chap. 2, the inventories of radioactivity at the SNS are dominated by the target facility. Source terms for a wide range of target facility accidents, from anticipated events to worst-case beyond-design-basis events, are provided in Chaps. 3 and 4. The most important criterion applied to these accident source terms is that they should not underestimate potential release. Therefore, conservative methodology was employed for the release estimates. Although the source terms are very conservative, excessive conservatism has been avoided by basing the releases on physical principles. Since it is envisioned that the SNS facility may eventually (after about 10 years) be expanded and modified to support a 4-MW proton beam operational capability, the source terms estimated in this report are applicable to a 4-MW operating proton beam power unless otherwise specified. This is bounding with regard to the 1-MW facility that will be built and operated initially. See further discussion below in Sect. 1.2.

  2. Common Calibration Source for Monitoring Long-term Ozone Trends

    NASA Technical Reports Server (NTRS)

    Kowalewski, Matthew

    2004-01-01

    Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and the limited lifetimes of satellite monitoring instruments demand that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Monitoring Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), Scanning Imaging Absorption Spectrometer for Atmospheric Chartography (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.

  3. Long-Term Stability of Radio Sources in VLBI Analysis

    NASA Technical Reports Server (NTRS)

    Engelhardt, Gerald; Thorandt, Volkmar

    2010-01-01

    Positional stability of radio sources is an important requirement for modeling only one source position over the complete span of VLBI data, presently more than 20 years. The stability of radio sources can be verified by analyzing time series of radio source coordinates. One approach is a statistical test for normal distribution of the residuals to the weighted mean for each radio source component of the time series. Systematic phenomena in the time series can thus be detected. Nevertheless, an inspection of rate estimates and weighted root-mean-square (WRMS) variations about the mean is also necessary. On the basis of the time series computed by the BKG group in the frame of the ICRF2 working group, 226 stable radio sources with an axis stability of 10 μas could be identified. They include 100 ICRF2 axes-defining sources, which are determined independently of the method applied in the ICRF2 working group. A further 29 stable radio sources with a source structure index of less than 3.0 can also be used in addition to the 295 ICRF2 defining sources.
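
    The per-source stability check outlined above, a weighted mean, the WRMS about that mean, a rate estimate, and a test of the residuals for normality, is compact enough to sketch directly. The coordinate time series, formal errors, and epoch spacing below are simulated for illustration; real analyses use session-wise VLBI position estimates.

        # Sketch: stability statistics for one radio-source coordinate time series.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        t = np.linspace(1990.0, 2010.0, 120)               # epochs (years)
        sigma = rng.uniform(0.05, 0.3, t.size)             # formal errors (mas), simulated
        pos = rng.normal(0.0, sigma)                       # coordinate offsets (mas), simulated

        w = 1.0 / sigma**2
        wmean = np.sum(w * pos) / np.sum(w)                # weighted mean
        resid = pos - wmean
        wrms = np.sqrt(np.sum(w * resid**2) / np.sum(w))   # WRMS about the weighted mean

        rate, _ = np.polyfit(t, pos, 1, w=1.0 / sigma)     # linear drift (mas/yr)
        _, p_norm = stats.normaltest(resid)                # normality test of residuals

        print(f"WRMS = {wrms:.3f} mas, rate = {rate:.4f} mas/yr, normality p = {p_norm:.2f}")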

  4. Source Term Code Package: a user's guide (Mod 1)

    SciTech Connect

    Gieseke, J.A.; Cybulskis, P.; Jordan, H.; Lee, K.W.; Schumacher, P.M.; Curtis, L.A.; Wooton, R.O.; Quayle, S.F.; Kogan, V.

    1986-07-01

    As part of a major reassessment of the release of radioactive materials to the environment (source terms) in severe reactor accidents, a group of state-of-the-art computer codes was utilized to perform extensive analyses. A major product of this source term reassessment effort was a demonstrated methodology for analyzing specific accident situations to provide source term predictions. The computer codes forming this methodology have been upgraded and modified for release and further use. This system of codes has been named the Source Term Code Package (STCP) and is the subject of this user's guide. The guide is intended to provide an understanding of the STCP structure and to facilitate STCP use. The STCP was prepared for operation on a CDC system but is written in FORTRAN-77 to permit transportability. In the current version (Mod 1) of the STCP, the various calculational elements fall into four major categories represented by the codes MARCH3, TRAP-MELT3, VANESA, and NAUA/SPARC/ICEDF. The MARCH3 code is a combination of the MARCH2, CORSOR-M, and CORCON-Mod 2 codes. The TRAP-MELT3 code is a combination of the TRAP-MELT2.0 and MERGE codes.

  5. Disposal Unit Source Term (DUST) data input guide

    SciTech Connect

    Sullivan, T.M.

    1993-05-01

    Performance assessment of a low-level waste (LLW) disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). The computer code DUST (Disposal Unit Source Term) has been developed to model these processes. This document presents the models used to calculate release from a disposal facility, verification of the model, and instructions on the use of the DUST code. In addition to DUST, a preprocessor, DUSTIN, which helps the code user create input decks for DUST and a post-processor, GRAFXT, which takes selected output files and plots them on the computer terminal have been written. Use of these codes is also described.

  6. BWR ASSEMBLY SOURCE TERMS FOR WASTE PACKAGE DESIGN

    SciTech Connect

    T.L. Lotz

    1997-02-15

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) to provide boiling water reactor (BWR) assembly radiation source term data for use during Waste Package (WP) design. The BWR assembly radiation source terms are to be used for evaluation of radiolysis effects at the WP surface, and for personnel shielding requirements during assembly or WP handling operations. The objectives of this evaluation are to generate BWR assembly radiation source terms that bound selected groupings of BWR assemblies, with regard to assembly average burnup and cooling time, which comprise the anticipated MGDS BWR commercial spent nuclear fuel (SNF) waste stream. The source term data is to be provided in a form which can easily be utilized in subsequent shielding/radiation dose calculations. Since these calculations may also be used for Total System Performance Assessment (TSPA), with appropriate justification provided by TSPA, or radionuclide release rate analysis, the grams of each element and additional cooling times out to 25 years will also be calculated and the data included in the output files.

  7. 14 CFR 23.1310 - Power source capacity and distribution.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Power source capacity and distribution. 23.1310 Section 23.1310 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... Equipment General § 23.1310 Power source capacity and distribution. (a) Each installation whose...

  8. 14 CFR 23.1310 - Power source capacity and distribution.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Power source capacity and distribution. 23.1310 Section 23.1310 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... Equipment General § 23.1310 Power source capacity and distribution. (a) Each installation whose...

  9. Energy efficient wireless sensor networks using asymmetric distributed source coding

    NASA Astrophysics Data System (ADS)

    Rao, Abhishek; Kulkarni, Murlidhar

    2013-01-01

    Wireless Sensor Networks (WSNs) are networks of sensor nodes deployed over a geographical area to perform a specific task. WSNs pose many design challenges; energy conservation is one such design issue. In the literature, a wide range of solutions addressing this issue have been proposed. WSNs are generally densely deployed, so nodes in close proximity are likely to sense the same data. Transmission of such non-aggregated data may lead to inefficient energy management. Hence, data fusion has to be performed at the nodes to combine the redundant information into a single data unit. Distributed source coding is an efficient approach for achieving this task. In this paper an attempt has been made at modeling such a system. Various energy efficient codes were considered for the analysis, and system performance was evaluated in terms of energy efficiency.

  10. Pressure distribution in unsteady sink and source flows.

    PubMed

    Voropayev, S I

    2015-05-01

    Basic flow generated in a viscous incompressible fluid by a "point" sink (source) of mass is revised. In practice, such flow can be modeled by sucking (pushing) fluid from a thin tube with a small porous sphere at one end. Intuitively, by sucking (pushing) fluid, one creates low (high) pressure near the origin and a positive (negative) radial pressure gradient drives the fluid to (from) the origin. A simple analysis, however, shows that the pressure distribution for both steady flows is the same. Then a question arises: How does the fluid "know" in what direction to flow? To explain this "paradox" an unsteady flow is considered and the pressure terms responsible for the flow direction are derived. PMID:26066255
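
    The abstract's point can be made concrete with the standard argument (a sketch, not quoted from the paper). For a point source or sink of volume flux Q(t) (Q > 0 for a source, Q < 0 for a sink), the flow is purely radial, the viscous terms vanish identically, and the radial momentum balance integrates to an explicit pressure distribution:

        % Radial velocity of a point source/sink of volume flux Q(t)
        u_r(r,t) = \frac{Q(t)}{4\pi r^2}

        % Radial momentum balance (viscous terms vanish for this irrotational radial flow)
        \frac{\partial u_r}{\partial t} + u_r\,\frac{\partial u_r}{\partial r}
            = -\frac{1}{\rho}\,\frac{\partial p}{\partial r}

        % Integrating from r to infinity
        \frac{p(r,t) - p_\infty}{\rho}
            = \frac{\dot{Q}(t)}{4\pi r} - \frac{Q^2(t)}{32\pi^2 r^4}

    The steady Q^2 term is the same for a sink and a source of equal strength, so the steady pressure field alone cannot encode the flow direction; the directional information sits entirely in the unsteady \dot{Q} term, consistent with the resolution described above.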

  11. Open source portal to distributed image repositories

    NASA Astrophysics Data System (ADS)

    Tao, Wenchao; Ratib, Osman M.; Kho, Hwa; Hsu, Yung-Chao; Wang, Cun; Lee, Cason; McCoy, J. M.

    2004-04-01

    In a large institution's PACS, patient data may often reside in multiple separate systems. While most systems tend to be DICOM compliant, none of them offer the flexibility of seamless integration of multiple DICOM sources through a single access point. We developed a generic portal system with a web-based interactive front-end as well as an application programming interface (API) that allows both web users and client applications to query and retrieve image data from multiple DICOM sources. A set of software tools was developed to allow access to several DICOM archives through a single point of access. An interactive web-based front-end allows users to search image data seamlessly across the different archives and to display the results or route the image data to another DICOM compliant destination. An XML-based API allows other software programs to easily benefit from this portal to query and retrieve image data as well. Various techniques are employed to minimize the performance overhead inherent in DICOM. The system is integrated with a hospital-wide HIPAA-compliant authentication and auditing service that provides centralized management of access to patient medical records. The system is provided under an open source free license and was developed using open-source components (Apache Tomcat for the web server, MySQL for the database, OJB for object/relational data mapping, etc.). The portal paradigm offers a convenient and effective solution for accessing multiple image data sources in a given healthcare enterprise and can easily be extended to multiple institutions through appropriate security and encryption mechanisms.

  12. IMPACTS OF SOURCE TERM HETEROGENEITIES ON WATER PATHWAY DOSE.

    SciTech Connect

    SULLIVAN, T.; GUSKOV, A.; POSKAS, P.; RUPERTI, N.; HANUSIK, V.; ET AL.

    2004-09-15

    and for which a solution has to be found in terms of long-term disposal. Together with their casing and packaging, they are one form of heterogeneous waste; many other forms of waste with heterogeneous properties exist. They may arise in very small quantities and with very specific characteristics in the case of small producers, or in larger streams with standard characteristics in others. This wide variety of waste gives rise to three main levels of waste heterogeneity: (1) hot spots (e.g. disused sealed sources); (2) large items inside a package (e.g. metal components); and (3) very large items to be disposed of directly in the disposal unit (e.g. irradiated pipes, vessels). Safety assessments generally assume a certain level of waste homogeneity in most of the existing or proposed disposal facilities. There is a need to evaluate the appropriateness of such an assumption and its influence on the results of safety assessment. This need is especially acute in the case of sealed sources. There are many cases where storage conditions are poor or management is improper, leading to radiological accidents, some with significant or detrimental impacts. Disposal in a near surface disposal facility has been used in the past for some disused sealed sources. This option is currently in use for other sealed sources, or is being studied for the rest of them. The regulatory framework differs greatly between countries. In some countries, large quantities of disused sealed sources have been disposed of without any restriction; in others their disposal is forbidden by law. In any case, evaluation of the acceptability of disposal of disused sealed sources in a near surface disposal facility is of utmost importance.

  13. Short and long term representation of an unfamiliar tone distribution

    PubMed Central

    Diercks, Charlette; Troje, Nikolaus F.; Cuddy, Lola L.

    2016-01-01

    We report on a study conducted to extend our knowledge about the process of gaining a mental representation of music. Several studies, inspired by research on the statistical learning of language, have investigated statistical learning of sequential rules underlying tone sequences. Given that the mental representation of music correlates with distributional properties of music, we tested whether participants are able to abstract distributional information contained in tone sequences to form a mental representation. For this purpose, we created an unfamiliar music genre defined by an underlying tone distribution, to which 40 participants were exposed. Our stimuli allowed us to differentiate between sensitivity to the distributional properties contained in test stimuli and long term representation of the distributional properties of the music genre overall. Using a probe tone paradigm and a two-alternative forced choice discrimination task, we show that listeners are able to abstract distributional properties of music through mere exposure into a long term representation of music. This lends support to the idea that statistical learning is involved in the process of gaining musical knowledge.

  14. Phonatory sound sources in terms of Lagrangian Coherent Structures

    NASA Astrophysics Data System (ADS)

    McPhail, Michael; Krane, Michael

    2015-11-01

    Lagrangian Coherent Structures (LCS) are used to identify sound sources in phonation. Currently, it is difficult to causally relate changes in airflow topology caused by voice disorders to changes in voiced sound production. LCS reveals a flow's topology by decomposing the flow into regions of distinct dynamics. The aeroacoustic sources can then be written in terms of the motion of the boundaries of these distinct regions. Breaking the flow down into constituent parts shows how each distinct region contributes to sound production. This approach provides a framework to connect changes in anatomy from a voice disorder to measurable changes in the resulting sound. The approach is presented for simulations of some canonical cases of vortex sound generation, and a two-dimensional simulation of phonation. Acknowledgment: NIH grant 2R01DC005642.

  15. A nuclear source term analysis for spacecraft power systems

    SciTech Connect

    McCulloch, W.H.

    1998-12-01

    All US space missions involving on board nuclear material must be approved by the Office of the President. To be approved the mission and the hardware systems must undergo evaluations of the associated nuclear health and safety risk. One part of these evaluations is the characterization of the source terms, i.e., the estimate of the amount, physical form, and location of nuclear material, which might be released into the environment in the event of credible accidents. This paper presents a brief overview of the source term analysis by the Interagency Nuclear Safety Review Panel for the NASA Cassini Space Mission launched in October 1997. Included is a description of the Energy Interaction Model, an innovative approach to the analysis of potential releases from high velocity impacts resulting from launch aborts and reentries.

  16. An approach to distribution short-term load forecasting

    SciTech Connect

    Stratton, R.C.; Gaustad, K.L.

    1995-03-01

    This paper reports on the developments and findings of the Distribution Short-Term Load Forecaster (DSTLF) research activity. The objective of this research is to develop a distribution short-term load forecasting technology consisting of a forecasting method, a development methodology, the theories necessary to support the required technical components, and the hardware and software tools required to perform the forecast. The DSTLF consists of four major components: the monitored endpoint load forecaster (MELF), the nonmonitored endpoint load forecaster (NELF), the topological integration forecaster (TIF), and a dynamic tuner. These components interact to provide short-term forecasts at various points in the distribution system, e.g., feeder, line section, and endpoint. This paper discusses the DSTLF methodology and the MELF component. MELF, based on artificial neural network technology, predicts distribution endpoint loads an hour, a day, and a week in advance. Predictions are developed using time, calendar, historical load, and weather data. The overall DSTLF architecture and a prototype MELF module for retail endpoints have been developed. Future work will focus on refining and extending MELF and developing NELF and TIF capabilities.
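
    A toy version of the monitored-endpoint forecaster, a neural network mapping calendar, lagged-load, and weather features to the next-hour load, can be assembled from standard tools. Everything in the sketch below (synthetic data, feature choice, network size) is illustrative and is not a description of MELF itself.

        # Sketch: hour-ahead endpoint load forecast from calendar, lagged-load, and weather features.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(4)
        hours = np.arange(24 * 365)
        temp = 10 + 15 * np.sin(2 * np.pi * hours / (24 * 365)) + rng.normal(0, 2, hours.size)
        load = 50 + 20 * np.sin(2 * np.pi * (hours % 24) / 24) + 0.5 * temp + rng.normal(0, 3, hours.size)

        # Features: hour of day, day of week, temperature, and the load 1 h and 24 h earlier.
        X = np.column_stack([
            hours[24:] % 24, (hours[24:] // 24) % 7, temp[24:], load[23:-1], load[:-24],
        ])
        y = load[24:]

        model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
        model.fit(X[:-168], y[:-168])                     # hold out the final week
        mae = np.abs(model.predict(X[-168:]) - y[-168:]).mean()
        print(f"hour-ahead MAE over the held-out week: {mae:.2f} (synthetic units)")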

  17. Actinide Source Term Program, position paper. Revision 1

    SciTech Connect

    Novak, C.F.; Papenguth, H.W.; Crafts, C.C.; Dhooge, N.J.

    1994-11-15

    The Actinide Source Term represents the quantity of actinides that could be mobilized within WIPP brines and could migrate with the brines away from the disposal room vicinity. This document presents the various proposed methods for estimating this source term, with a particular focus on defining these methods and evaluating the defensibility of the models for mobile actinide concentrations. The conclusions reached in this document are: the 92 PA "expert panel" model for mobile actinide concentrations is not defensible; and, although it is extremely conservative, the "inventory limits" model is the only existing defensible model for the actinide source term. The model effort in progress, "chemical modeling of mobile actinide concentrations", supported by a laboratory effort that is also in progress, is designed to provide a reasonable description of the system and be scientifically realistic and supplant the "inventory limits" model.

  18. Dose distributions in regions containing beta sources: Uniform spherical source regions in homogeneous media

    SciTech Connect

    Werner, B.L.; Rahman, M.; Salk, W.N.; Kwok, C.S.

    1991-11-01

    The energy-averaged transport model for the calculation of dose rate distributions is applied to uniform, spherical source distributions in homogeneous media for radii smaller than the electron range. The model agrees well with Monte Carlo based calculations for source distributions with radii greater than half the continuous slowing down approximation range. The dose rate distributions can be written in the medical internal radiation dose (MIRD) formalism.

  19. The source and distribution of Galactic positrons

    NASA Technical Reports Server (NTRS)

    Purcell, W. R.; Dixon, D. D.; Cheng, L.-X.; Leventhal, M.; Kinzer, R. L.; Kurfess, J. D.; Skibo, J. G.; Smith, D. M.; Tueller, J.

    1997-01-01

    The oriented scintillation spectrometer experiment (OSSE) observations of the Galactic plane and the Galactic center region were combined with observations acquired with other instruments in order to produce a map of the Galactic 511 keV annihilation radiation. Two mapping techniques were applied to the data: the maximum entropy method, and the basis pursuit inversion method. The resulting maps are qualitatively similar and show evidence for a central bulge and a weak galactic disk component. The weak disk is consistent with that expected from positrons produced by the decay of radioactive Al-26 in the interstellar medium. Both maps suggest an enhanced region of emission near l = -4 deg, b = 7 deg, with a flux of approximately 50 percent of that of the bulge. The existence of this emission appears significant, although the location is not well determined. The source of this enhanced emission is presently unknown.

  20. Sources and distributions of dark matter

    SciTech Connect

    Sikivie, P.

    1995-12-31

    In the first section, the author tries to convey a sense of the variety of observational inputs that tell about the existence and the spatial distribution of dark matter in the universe. In the second section, he briefly reviews the four main dark matter candidates, taking note of each candidate's status in the world of particle physics, its production in the early universe, its effect upon large scale structure formation and the means by which it may be detected. Section 3 concerns the energy spectrum of (cold) dark matter particles on earth as may be observed some day in a direct detection experiment. It is a brief account of work done in collaboration with J. Ipser and, more recently, with I. Tkachev and Y. Wang.

  1. Tetrodotoxin: chemistry, toxicity, source, distribution and detection.

    PubMed

    Bane, Vaishali; Lehane, Mary; Dikshit, Madhurima; O'Riordan, Alan; Furey, Ambrose

    2014-02-01

    Tetrodotoxin (TTX) is a naturally occurring toxin that has been responsible for human intoxications and fatalities. Its usual route of toxicity is via the ingestion of contaminated puffer fish which are a culinary delicacy, especially in Japan. TTX was believed to be confined to regions of South East Asia, but recent studies have demonstrated that the toxin has spread to regions in the Pacific and the Mediterranean. There is no known antidote to TTX which is a powerful sodium channel inhibitor. This review aims to collect pertinent information available to date on TTX and its analogues with a special emphasis on the structure, aetiology, distribution, effects and the analytical methods employed for its detection. PMID:24566728

  2. Tetrodotoxin: Chemistry, Toxicity, Source, Distribution and Detection

    PubMed Central

    Bane, Vaishali; Lehane, Mary; Dikshit, Madhurima; O’Riordan, Alan; Furey, Ambrose

    2014-01-01

    Tetrodotoxin (TTX) is a naturally occurring toxin that has been responsible for human intoxications and fatalities. Its usual route of toxicity is via the ingestion of contaminated puffer fish which are a culinary delicacy, especially in Japan. TTX was believed to be confined to regions of South East Asia, but recent studies have demonstrated that the toxin has spread to regions in the Pacific and the Mediterranean. There is no known antidote to TTX which is a powerful sodium channel inhibitor. This review aims to collect pertinent information available to date on TTX and its analogues with a special emphasis on the structure, aetiology, distribution, effects and the analytical methods employed for its detection. PMID:24566728

  3. Contamination on LDEF: Sources, distribution, and history

    NASA Technical Reports Server (NTRS)

    Pippin, Gary; Crutcher, Russ

    1993-01-01

    An introduction to contamination effects observed on the Long Duration Exposure Facility (LDEF) is presented. The activities reported are part of Boeing's obligation to the LDEF Materials Special Investigation Group. The contamination films and particles had minimal influence on the thermal performance of the LDEF. Some specific areas did have large changes in optical properties. Films also interfered with recession rate determination by reacting with the oxygen or physically shielding underlying material. Generally, contaminant films lessen the measured recession rate relative to 'clean' surfaces. On orbit generation of particles may be an issue for sensitive optics. Deposition on lenses may lead to artifacts on photographic images or cause sensors to respond inappropriately. Particles in the line of sight of sensors can cause stray light to be scattered into sensors. Particles also represent a hazard for mechanisms in that they can physically block and/or increase friction or wear on moving surfaces. LDEF carried a rather complex mixture of samples and support hardware into orbit. The experiments were assembled under a variety of conditions and time constraints and stored for up to five years before launch. The structure itself was so large that it could not be baked after the interior was painted with chemglaze Z-306 polyurethane based black paint. Any analysis of the effects of molecular and particulate contamination must account for a complex array of sources, wide variation in processes over time, and extreme variation in environment from ground to launch to flight. Surface conditions at certain locations on LDEF were established by outgassing of molecular species from particular materials onto adjacent surfaces, followed by alteration of those species due to exposure to atomic oxygen and/or solar radiation.

  4. Separating More Sources Than Sensors Using Time-Frequency Distributions

    NASA Astrophysics Data System (ADS)

    Linh-Trung, Nguyen; Belouchrani, Adel; Abed-Meraim, Karim; Boashash, Boualem

    2005-12-01

    We examine the problem of blind separation of nonstationary sources in the underdetermined case, where there are more sources than sensors. Since time-frequency (TF) signal processing provides effective tools for dealing with nonstationary signals, we propose a new separation method that is based on time-frequency distributions (TFDs). The underlying assumption is that the original sources are disjoint in the time-frequency (TF) domain. The successful method recovers the sources by performing the following four main procedures. First, the spatial time-frequency distribution (STFD) matrices are computed from the observed mixtures. Next, the auto-source TF points are separated from cross-source TF points thanks to the special structure of these mixture STFD matrices. Then, the vectors that correspond to the selected auto-source points are clustered into different classes according to the spatial directions which differ among different sources; each class, now containing the auto-source points of only one source, gives an estimation of the TFD of this source. Finally, the source waveforms are recovered from their TFD estimates using TF synthesis. Simulated experiments indicate the success of the proposed algorithm in different scenarios. We also contribute with two other modified versions of the algorithm to better deal with auto-source point selection.

  5. Adaptive Source Coding Schemes for Geometrically Distributed Integer Alphabets

    NASA Technical Reports Server (NTRS)

    Cheung, K-M.; Smyth, P.

    1993-01-01

    Revisit the Gallager and van Voorhis optimal source coding scheme for geometrically distributed non-negative integer alphabets and show that the various subcodes in the popular Rice algorithm can be derived from the Gallager and van Voorhis code.
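
    For context, a plain Rice subcode (one member of the family discussed above) encodes a non-negative integer n with parameter k as a unary quotient followed by k binary remainder bits. The snippet below is a generic illustration of that subcode, not the adaptive scheme from the memorandum.

        # Sketch: Rice coding of non-negative integers (near-optimal for geometric sources).
        def rice_encode(n: int, k: int) -> str:
            """Encode n >= 0 with Rice parameter k: unary quotient, then k-bit remainder."""
            q, r = n >> k, n & ((1 << k) - 1)
            return "1" * q + "0" + (format(r, f"0{k}b") if k else "")

        def rice_decode(bits: str, k: int) -> int:
            """Decode a single Rice codeword back to the integer it represents."""
            q = bits.index("0")                  # leading 1s give the unary quotient
            r = int(bits[q + 1:q + 1 + k], 2) if k else 0
            return (q << k) + r

        for n in [0, 1, 5, 17]:
            cw = rice_encode(n, k=2)
            assert rice_decode(cw, k=2) == n
            print(f"{n:2d} -> {cw}")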

  6. The costs of long-term care: distribution and responsibility.

    PubMed

    Wallack, S S; Cohen, M A

    1988-01-01

    Long-term care costs will result in financial hardship for millions of elderly Americans and their families. The growing number of elderly people has focused public attention on the catastrophic problem of coverage for long-term care. Social insurance is unlikely to emerge as a solution in the USA. One reason is that the expected total cost is viewed as an unmanageable burden by both Federal and State governments. To others, it is the uncertainty surrounding the projected costs. This paper reports on the results of a double-decrement life-table analysis, based on a national survey of the elderly taken in early 1977 and one year later, that estimated the distribution and total lifetime nursing-home costs of the elderly. Combining the probability of nursing-home entry and length of stay, a 65-year-old faces a 43% chance of entering a nursing home and spending about $11,000 (1980 dollars). The distribution of lifetime costs is however very skewed, with 13% of the elderly consuming 90% of the resources. Thus, while the costs of nursing-home care can be catastrophic for an individual, spread across a group they are not unmanageable. Given the distribution of income and assets among the elderly, a sizeable proportion could readily afford the necessary premiums of different emerging insurance and delivery programmes. Alternative private and public models of long-term care must be evaluated in terms of the goals of a finance and delivery system for long-term care. PMID:3129256

  7. Near term climate projections for invasive species distributions

    USGS Publications Warehouse

    Jarnevich, C.S.; Stohlgren, T.J.

    2009-01-01

    Climate change and invasive species pose important conservation issues separately, and should be examined together. We used existing long term climate datasets for the US to project potential climate change into the future at a finer spatial and temporal resolution than the climate change scenarios generally available. These fine scale projections, along with new species distribution modeling techniques to forecast the potential extent of invasive species, can provide useful information to aid conservation and invasive species management efforts. We created habitat suitability maps for Pueraria montana (kudzu) under current climatic conditions and potential average conditions up to 30 years in the future. We examined how the potential distribution of this species will be affected by changing climate, and the management implications associated with these changes. Our models indicated that P. montana may increase its distribution particularly in the Northeast with climate change and may decrease in other areas. © 2008 Springer Science+Business Media B.V.

  8. 14 CFR 25.1310 - Power source capacity and distribution.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... functioning normally. (2) Essential loads, after failure of any one prime mover, power converter, or energy... source of power is required, after any failure or malfunction in any one power supply system... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Power source capacity and distribution....

  9. 14 CFR 23.1310 - Power source capacity and distribution.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Power source capacity and distribution. 23.1310 Section 23.1310 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: NORMAL, UTILITY, ACROBATIC, AND COMMUTER CATEGORY AIRPLANES Equipment General § 23.1310 Power source...

  10. Trace metal source terms in carbon sequestration environments.

    PubMed

    Karamalidis, Athanasios K; Torres, Sharon G; Hakala, J Alexandra; Shao, Hongbo; Cantrell, Kirk J; Carroll, Susan

    2013-01-01

    Carbon dioxide sequestration in deep saline and depleted oil geologic formations is feasible and promising; however, possible CO(2) or CO(2)-saturated brine leakage to overlying aquifers may pose environmental and health impacts. The purpose of this study was to experimentally define a range of concentrations that can be used as the trace element source term for reservoirs and leakage pathways in risk simulations. Storage source terms for trace metals are needed to evaluate the impact of brines leaking into overlying drinking water aquifers. The trace metal release was measured from cements and sandstones, shales, carbonates, evaporites, and basalts from the Frio, In Salah, Illinois Basin, Decatur, Lower Tuscaloosa, Weyburn-Midale, Bass Islands, and Grand Ronde carbon sequestration geologic formations. Trace metal dissolution was tracked by measuring solution concentrations over time under conditions (e.g., pressures, temperatures, and initial brine compositions) specific to the sequestration projects. Existing metrics for maximum contaminant levels (MCLs) for drinking water as defined by the U.S. Environmental Protection Agency (U.S. EPA) were used to categorize the relative significance of metal concentration changes in storage environments because of the presence of CO(2). Results indicate that Cr and Pb released from sandstone reservoir and shale cap rocks exceed the MCLs by an order of magnitude, while Cd and Cu were at or below drinking water thresholds. In carbonate reservoirs As exceeds the MCLs by an order of magnitude, while Cd, Cu, and Pb were at or below drinking water standards. Results from this study can be used as a reasonable estimate of the trace element source term for reservoirs and leakage pathways in risk simulations to further evaluate the impact of leakage on groundwater quality. PMID:23215015
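
    A simple screening step implied by the methodology above is to compare each measured brine concentration against its drinking-water limit. The sketch below uses nominal U.S. EPA MCL/action-level values and made-up measured concentrations purely as an illustration; the limits should be checked against current regulations before any real use.

        # Sketch: flag trace-metal concentrations that exceed drinking-water limits.
        # Limit values (mg/L) are nominal U.S. EPA MCLs/action levels; verify before use.
        MCL_MG_L = {"As": 0.010, "Cd": 0.005, "Cr": 0.100, "Cu": 1.300, "Pb": 0.015}

        # Hypothetical measured concentrations from a batch experiment (mg/L).
        measured = {"As": 0.002, "Cd": 0.004, "Cr": 0.950, "Cu": 0.300, "Pb": 0.160}

        for metal, conc in measured.items():
            ratio = conc / MCL_MG_L[metal]
            status = "EXCEEDS limit" if ratio > 1 else "below limit"
            print(f"{metal}: {conc:.3f} mg/L ({ratio:.1f}x) -> {status}")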

  11. Trace Metal Source Terms in Carbon Sequestration Environments

    SciTech Connect

    Karamalidis, Athanasios K; Torres, Sharon G; Hakala, J Alexandra; Shao, Hongbo; Cantrell, Kirk J; Carroll, Susan

    2012-02-05

    Carbon dioxide sequestration in deep saline and depleted oil geologic formations is feasible and promising, however, possible CO₂ or CO₂-saturated brine leakage to overlying aquifers may pose environmental and health impacts. The purpose of this study was to experimentally define trace metal source terms from the reaction of supercritical CO₂, storage reservoir brines, reservoir and cap rocks. Storage reservoir source terms for trace metals are needed to evaluate the impact of brines leaking into overlying drinking water aquifers. The trace metal release was measured from sandstones, shales, carbonates, evaporites, basalts and cements from the Frio, In Salah, Illinois Basin – Decatur, Lower Tuscaloosa, Weyburn-Midale, Bass Islands and Grand Ronde carbon sequestration geologic formations. Trace metal dissolution is tracked by measuring solution concentrations over time under conditions (e.g. pressures, temperatures, and initial brine compositions) specific to the sequestration projects. Existing metrics for Maximum Contaminant Levels (MCLs) for drinking water as defined by the U.S. Environmental Protection Agency (U.S. EPA) were used to categorize the relative significance of metal concentration changes in storage environments due to the presence of CO₂. Results indicate that Cr and Pb released from sandstone reservoir and shale cap rock exceed the MCLs by an order of magnitude while Cd and Cu were at or below drinking water thresholds. In carbonate reservoirs As exceeds the MCLs by an order of magnitude, while Cd, Cu, and Pb were at or below drinking water standards. Results from this study can be used as a reasonable estimate of the reservoir and caprock source term to further evaluate the impact of leakage on groundwater quality.

  12. Trace Metal Source Terms in Carbon Sequestration Environments

    SciTech Connect

    Karamalidis, Athanasios; Torres, Sharon G.; Hakala, Jacqueline A.; Shao, Hongbo; Cantrell, Kirk J.; Carroll, Susan A.

    2013-01-01

    Carbon dioxide sequestration in deep saline and depleted oil geologic formations is feasible and promising; however, possible CO2 or CO2-saturated brine leakage to overlying aquifers may pose environmental and health impacts. The purpose of this study was to experimentally define a range of concentrations that can be used as the trace element source term for reservoirs and leakage pathways in risk simulations. Storage source terms for trace metals are needed to evaluate the impact of brines leaking into overlying drinking water aquifers. The trace metal release was measured from cements and sandstones, shales, carbonates, evaporites, and basalts from the Frio, In Salah, Illinois Basin, Decatur, Lower Tuscaloosa, Weyburn-Midale, Bass Islands, and Grand Ronde carbon sequestration geologic formations. Trace metal dissolution was tracked by measuring solution concentrations over time under conditions (e.g., pressures, temperatures, and initial brine compositions) specific to the sequestration projects. Existing metrics for maximum contaminant levels (MCLs) for drinking water as defined by the U.S. Environmental Protection Agency (U.S. EPA) were used to categorize the relative significance of metal concentration changes in storage environments because of the presence of CO2. Results indicate that Cr and Pb released from sandstone reservoir and shale cap rocks exceed the MCLs by an order of magnitude, while Cd and Cu were at or below drinking water thresholds. In carbonate reservoirs As exceeds the MCLs by an order of magnitude, while Cd, Cu, and Pb were at or below drinking water standards. Results from this study can be used as a reasonable estimate of the trace element source term for reservoirs and leakage pathways in risk simulations to further evaluate the impact of leakage on groundwater quality.

  13. Long-term cycles in cosmic X-ray sources

    NASA Technical Reports Server (NTRS)

    Priedhorsky, W. C.; Holt, S. S.

    1987-01-01

    Data on long-term cycles in galactic X-ray sources are reviewed, and classes of variations are identified including precessional activity, recurrent outbursts in Population II sources, and Be/neutron star flare cycles. Cycles of 30-300 days have been found in LMC X-4, Her X-1, SS433, and Cyg X-1 which represent cyclic variations in both the inner and outer parts of the accretion disk. Quasi-periodic cycles with periods ranging from 1/2 to 2 years have been noted in several low-mass X-ray binaries. It is suggested that periodic outbursts in the Be/neutron star systems may result from variable mass transfer in a wide eccentric orbit.

  14. Development of alternate methods of determining integrated SMR source terms

    SciTech Connect

    Barry, Kenneth

    2014-06-10

    The Nuclear Energy Institute (NEI) Small Modular Reactor (SMR) Licensing Task Force (TF) has been evaluating licensing issues unique and important to iPWRs, ranking these issues, and developing NEI position papers for submittal to the U.S. Nuclear Regulatory Commission (NRC) during the past three years. Papers have been developed and submitted to the NRC in a range of areas including: Price-Anderson Act, NRC annual fees, security, modularity, and staffing. In December, 2012, NEI completed a draft position paper on SMR source terms and participated in an NRC public meeting presenting a summary of this paper, which was subsequently submitted to the NRC. One important conclusion of the source term paper was the evaluation and selection of high importance areas where additional research would have a significant impact on source terms. The highest ranked research area was iPWR containment aerosol natural deposition. The NRC accepts the use of existing aerosol deposition correlations in Regulatory Guide 1.183, but these were developed for large light water reactor (LWR) containments. Application of these correlations to an iPWR design has resulted in greater than a ten-fold reduction of containment airborne aerosol inventory as compared to large LWRs. Development and experimental justification of containment aerosol natural deposition correlations specifically for the unique iPWR containments is expected to result in a large reduction of design basis and beyond-design-basis accident source terms with concomitantly smaller dose to workers and the public. Therefore, NRC acceptance of iPWR containment aerosol natural deposition correlations will directly support the industry’s goal of reducing the Emergency Planning Zone (EPZ) for SMRs. Based on the results in this work, it is clear that thermophoresis is relatively unimportant for iPWRs. Gravitational settling is well understood, and may be the dominant process for a dry environment. Diffusiophoresis and enhanced

  15. Long-Term Stability of the NIST Standard Ultrasonic Source.

    PubMed

    Fick, Steven E

    2008-01-01

    The National Institute of Standards and Technology (NIST) Standard Ultrasonic Source (SUS) is a system comprising a transducer capable of output power levels up to 1 W at multiple frequencies between 1 MHz and 30 MHz, and an electrical impedance-matching network that allows the system to be driven by a conventional 50 Ω rf (radio-frequency) source. It is designed to allow interlaboratory replication of ultrasonic power levels with high accuracy using inexpensive readily available ancillary equipment. The SUS was offered for sale for 14 years (1985 to 1999). Each system was furnished with data for the set of calibration points (combinations of power level and frequency) specified by the customer. Of the systems that had been ordered with some calibration points in common, three were returned more than once to NIST for recalibration. Another system retained at NIST has been recalibrated periodically since 1984. The collective data for these systems comprise 9 calibration points and 102 measurements spanning a 17 year interval ending in 2001, the last year NIST ultrasonic power measurement services were available to the public. These data have been analyzed to compare variations in output power with frequency, power level, and time elapsed since the first calibration. The results verify the claim, made in the instruction sheet furnished with every SUS, that "long-term drift, if any, in the calibration of NIST Standard Sources is insignificant compared to the uncertainties associated with a single measurement of ultrasonic power by any method available at NIST." PMID:27096127

  16. Accident source terms for light-water nuclear power plants using high-burnup or MOX fuel.

    SciTech Connect

    Salay, Michael; Gauntt, Randall O.; Lee, Richard Y.; Powers, Dana Auburn; Leonard, Mark Thomas

    2011-01-01

    Representative accident source terms patterned after the NUREG-1465 Source Term have been developed for high burnup fuel in BWRs and PWRs and for MOX fuel in a PWR with an ice-condenser containment. These source terms have been derived using nonparametric order statistics to develop distributions for the timing of radionuclide release during four accident phases and for release fractions of nine chemical classes of radionuclides as calculated with the MELCOR 1.8.5 accident analysis computer code. The accident phases are those defined in the NUREG-1465 Source Term - gap release, in-vessel release, ex-vessel release, and late in-vessel release. Important differences among the accident source terms derived here and the NUREG-1465 Source Term are not attributable to either fuel burnup or use of MOX fuel. Rather, differences among the source terms are due predominantly to improved understanding of the physics of core meltdown accidents. Heat losses from the degrading reactor core prolong the process of in-vessel release of radionuclides. Improved understanding of the chemistries of tellurium and cesium under reactor accidents changes the predicted behavior characteristics of these radioactive elements relative to what was assumed in the derivation of the NUREG-1465 Source Term. An additional radionuclide chemical class has been defined to account for release of cesium as cesium molybdate which enhances molybdenum release relative to other metallic fission products.
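
    The nonparametric order-statistics approach mentioned above can be illustrated with a small, hedged sketch: given a set of hypothetical release-fraction samples (synthetic here, not MELCOR output), the classical Wilks argument takes the largest of 59 samples as a one-sided 95%/95% tolerance bound.

```python
# Sketch of using nonparametric order statistics to bound a release fraction,
# in the spirit of the approach described above. The sample values are
# synthetic stand-ins, not MELCOR results.
import numpy as np

rng = np.random.default_rng(1)
release_fraction = rng.beta(2, 8, size=59)  # 59 hypothetical code calculations

# Classical 95/95 argument: with 59 samples, the largest order statistic is a
# one-sided 95% confidence bound on the 95th percentile (since 0.95**59 < 0.05).
release_fraction.sort()
bound_95_95 = release_fraction[-1]
median = np.median(release_fraction)
print(f"median release fraction: {median:.3f}, 95/95 bound: {bound_95_95:.3f}")
```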

  17. A comparison of world-wide uses of severe reactor accident source terms

    SciTech Connect

    Ang, M.L.; Frid, W.; Kersting, E.J.; Friederichs, H.G.; Lee, R.Y.; Meyer-Heine, A.; Powers, D.A.; Soda, K.; Sweet, D.

    1994-09-01

    The definitions of source terms to reactor containments and source terms to the environment are discussed. A comparison is made between the TID-14844 example source term and the alternative source term described in NUREG-1465. Comparisons of these source terms to the containments and those used in France, Germany, Japan, Sweden, and the United Kingdom are made. Source terms to the environment calculated in NUREG-1500 and WASH-1400 are discussed. Again, these source terms are compared to those now being used in France, Germany, Japan, Sweden, and the United Kingdom. It is concluded that source terms to the containment suggested in NUREG-1465 are not greatly more conservative than those used in other countries. Technical bases for the source terms are similar. The regulatory use of the current understanding of radionuclide behavior varies among countries.

  18. Tank waste source term inventory validation. Volume 1. Letter report

    SciTech Connect

    Brevick, C.H.; Gaddis, L.A.; Johnson, E.D.

    1995-04-28

    The sample data for selection of 11 radionuclides and 24 chemical analytes were extracted from six separate sample data sets, were arranged in a tabular format and were plotted on scatter plots for all of the 149 single-shell tanks, the 24 double-shell tanks and the four aging waste tanks. The solid and liquid sample data were placed in separate tables and plots. The sample data and plots were compiled from the following data sets: characterization raw sample data, recent core samples, D. Braun data base, Wastren (Van Vleet) data base, TRAC and HTCE inventories. This document is Volume I of the Letter Report entitled Tank Waste Source Term Inventory Validation.

  19. Estimating Source Terms for Diverse Spent Nuclear Fuel Types

    SciTech Connect

    Brett Carlsen; Layne Pincock

    2004-11-01

    The U.S. Department of Energy (DOE) National Spent Nuclear Fuel Program is responsible for developing a defensible methodology for determining the radionuclide inventory for the DOE spent nuclear fuel (SNF) to be dispositioned at the proposed Monitored Geologic Repository at the Yucca Mountain Site. SNF owned by DOE includes diverse fuels from various experimental, research, and production reactors. These fuels currently reside at several DOE sites, universities, and foreign research reactor sites. Safe storage, transportation, and ultimate disposal of these fuels will require radiological source terms as inputs to safety analyses that support design and licensing of the necessary equipment and facilities. This paper summarizes the methodology developed for estimating radionuclide inventories associated with DOE-owned SNF. The results will support development of design and administrative controls to manage radiological risks and may later be used to demonstrate conformance with repository acceptance criteria.

  20. Computational determination of absorbed dose distributions from gamma ray sources

    NASA Astrophysics Data System (ADS)

    Zhou, Chuanyu; Inanc, Feyzi

    2001-04-01

    A biomedical procedure known as brachytherapy involves the insertion of many radioactive seeds into a diseased gland in order to eliminate diseased tissue. For such implementations, the spatial distribution of absorbed dose is very important. A simulation tool has been developed to determine the spatial distribution of absorbed dose in heterogeneous environments where the gamma ray source consists of many small internal radiation emitters. The computation is based on the integral transport method and is performed in a parallel fashion. Preliminary results involving 137Cs and 125I sources surrounded by water and comparison of the results to the experimental and computational data available in the literature are presented.
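
    The cited work uses an integral transport method; as a much simpler, hedged illustration of summing absorbed-dose contributions from many small gamma sources, the sketch below evaluates a toy point-kernel (inverse square with exponential attenuation) for a hypothetical seed arrangement in water. The attenuation coefficient and geometry are illustrative assumptions only.

```python
# Illustrative point-kernel sketch: summing the contribution of many small
# gamma sources (e.g., brachytherapy seeds) at points in water. The
# inverse-square-plus-attenuation kernel and the coefficient below are
# simplifications; the cited work uses an integral transport method.
import numpy as np

mu = 0.086           # approximate linear attenuation of water near 662 keV, 1/cm (illustrative)
seed_positions = np.array([[0.0, 0.0, 0.0],
                           [1.0, 0.0, 0.0],
                           [0.0, 1.0, 0.0]])   # cm, hypothetical seed layout
seed_strength = 1.0  # arbitrary units per seed

def relative_dose(point):
    """Relative dose at a point from all seeds (arbitrary units)."""
    r = np.linalg.norm(seed_positions - point, axis=1)
    r = np.maximum(r, 0.05)                    # avoid the singularity at a seed
    return np.sum(seed_strength * np.exp(-mu * r) / (4.0 * np.pi * r**2))

for x in (0.5, 1.0, 2.0, 4.0):
    print(f"x = {x:3.1f} cm : relative dose = {relative_dose(np.array([x, 0.5, 0.0])):.4f}")
```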

  1. The brightness and spatial distributions of terrestrial radio sources

    NASA Astrophysics Data System (ADS)

    Offringa, A. R.; de Bruyn, A. G.; Zaroubi, S.; Koopmans, L. V. E.; Wijnholds, S. J.; Abdalla, F. B.; Brouw, W. N.; Ciardi, B.; Iliev, I. T.; Harker, G. J. A.; Mellema, G.; Bernardi, G.; Zarka, P.; Ghosh, A.; Alexov, A.; Anderson, J.; Asgekar, A.; Avruch, I. M.; Beck, R.; Bell, M. E.; Bell, M. R.; Bentum, M. J.; Best, P.; Bîrzan, L.; Breitling, F.; Broderick, J.; Brüggen, M.; Butcher, H. R.; de Gasperin, F.; de Geus, E.; de Vos, M.; Duscha, S.; Eislöffel, J.; Fallows, R. A.; Ferrari, C.; Frieswijk, W.; Garrett, M. A.; Grießmeier, J.; Hassall, T. E.; Horneffer, A.; Iacobelli, M.; Juette, E.; Karastergiou, A.; Klijn, W.; Kondratiev, V. I.; Kuniyoshi, M.; Kuper, G.; van Leeuwen, J.; Loose, M.; Maat, P.; Macario, G.; Mann, G.; McKean, J. P.; Meulman, H.; Norden, M. J.; Orru, E.; Paas, H.; Pandey-Pommier, M.; Pizzo, R.; Polatidis, A. G.; Rafferty, D.; Reich, W.; van Nieuwpoort, R.; Röttgering, H.; Scaife, A. M. M.; Sluman, J.; Smirnov, O.; Sobey, C.; Tagger, M.; Tang, Y.; Tasse, C.; Veen, S. ter; Toribio, C.; Vermeulen, R.; Vocks, C.; van Weeren, R. J.; Wise, M. W.; Wucknitz, O.

    2013-10-01

    Faint undetected sources of radio-frequency interference (RFI) might become visible in long radio observations when they are consistently present over time. Thereby, they might obstruct the detection of the weak astronomical signals of interest. This issue is especially important for Epoch of Reionization (EoR) projects that try to detect the faint redshifted H I signals from the time of the earliest structures in the Universe. We explore the RFI situation at 30-163 MHz by studying brightness histograms of visibility data observed with Low-Frequency Array (LOFAR), similar to radio-source-count analyses that are used in cosmology. An empirical RFI distribution model is derived that allows the simulation of RFI in radio observations. The brightness histograms show an RFI distribution that follows a power-law distribution with an estimated exponent around -1.5. With several assumptions, this can be explained with a uniform distribution of terrestrial radio sources whose radiation follows existing propagation models. Extrapolation of the power law implies that the current LOFAR EoR observations should be severely RFI limited if the strength of RFI sources remains strong after time integration. This is in contrast with actual observations, which almost reach the thermal noise and are thought not to be limited by RFI. Therefore, we conclude that it is unlikely that there are undetected RFI sources that will become visible in long observations. Consequently, there is no indication that RFI will prevent an EoR detection with LOFAR.
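
    A hedged sketch of the kind of power-law characterization described above: draw synthetic brightness samples from a Pareto-type distribution with index -1.5 and recover the exponent with a maximum-likelihood (Hill) estimator. The data are simulated, not LOFAR visibilities.

```python
# Sketch: maximum-likelihood estimate of a power-law exponent from brightness
# samples, analogous in spirit to the source-count style analysis above.
import numpy as np

rng = np.random.default_rng(2)
alpha_true = 1.5          # assumed differential power-law index, p(S) ~ S**-alpha
s_min = 1.0
u = rng.uniform(size=10000)
samples = s_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))  # inverse-CDF sampling

# Hill / MLE estimator for a Pareto-type tail above s_min.
alpha_hat = 1.0 + len(samples) / np.sum(np.log(samples / s_min))
print(f"estimated exponent: -{alpha_hat:.2f} (true: -{alpha_true})")
```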

  2. Continuous-variable quantum key distribution with Gaussian source noise

    SciTech Connect

    Shen Yujie; Peng Xiang; Yang Jian; Guo Hong

    2011-05-15

    Source noise affects the security of continuous-variable quantum key distribution (CV QKD) and is difficult to analyze. We propose a model to characterize Gaussian source noise through introducing a neutral party (Fred) who induces the noise with a general unitary transformation. Without knowing Fred's exact state, we derive the security bounds for both reverse and direct reconciliations and show that the bound for reverse reconciliation is tight.

  3. Electric Transport Traction Power Supply System With Distributed Energy Sources

    NASA Astrophysics Data System (ADS)

    Abramov, E. Y.; Schurov, N. I.; Rozhkova, M. V.

    2016-04-01

    The paper addresses the problem of leveling the daily load curve of traction substations (TSS) for urban electric transport. The circuit of a traction power supply system (TPSS) with distributed autonomous energy sources (AES) based on photovoltaic (PV) and energy storage (ES) units is presented. A power-flow distribution algorithm for leveling the daily traction load curve is also introduced, and an implemented experimental model of the power supply system is described.

  4. Distributed source x-ray tube technology for tomosynthesis imaging

    PubMed Central

    Sprenger, F.; Calderon-Colon, X.; Cheng, Y.; Englestad, K.; Lu, J.; Maltz, J.; Paidi, A.; Qian, X.; Spronk, D.; Sultana, S.; Yang, G.; Zhou, O.

    2011-01-01

    Tomosynthesis imaging requires projection images from different viewing angles. Conventional systems use a moving x-ray source to acquire the individual projections. Using a stationary distributed x-ray source with a number of sources that equals the number of required projections, this can be achieved without any mechanical motion. Advantages are a potentially faster image acquisition speed, higher spatial and temporal resolution and simple system design. We present distributed x-ray sources based on carbon nanotube (CNT) field emission cathodes. The field emission cathodes deliver the electrons required for x-ray production. CNT emitters feature a stable emission at high current density, a cold emission, excellent temporal control of the emitted electrons and good configurability. We discuss the use of stationary sources for two applications: (i) a linear tube for stationary digital breast tomosynthesis (sDBT), and (ii) a square tube for on-board tomosynthesis image-guided radiation therapy (IGRT). Results from high energy distributed sources up to 160 kVp are also presented. PMID:21785671

  5. Method for image reconstruction of moving radionuclide source distribution

    DOEpatents

    Stolin, Alexander V.; McKisson, John E.; Lee, Seung Joon; Smith, Mark Frederick

    2012-12-18

    A method for image reconstruction of moving radionuclide distributions. Its particular embodiment is for single photon emission computed tomography (SPECT) imaging of awake animals, though its techniques are general enough to be applied to other moving radionuclide distributions as well. The invention eliminates motion and blurring artifacts for image reconstructions of moving source distributions. This opens new avenues in the area of small animal brain imaging with radiotracers, which can now be performed without the perturbing influences of anesthesia or physical restraint on the biological system.

  6. Coarse Grid Modeling of Turbine Film Cooling Flows Using Volumetric Source Terms

    NASA Technical Reports Server (NTRS)

    Heidmann, James D.; Hunter, Scott D.

    2001-01-01

    The recent trend in numerical modeling of turbine film cooling flows has been toward higher fidelity grids and more complex geometries. This trend has been enabled by the rapid increase in computing power available to researchers. However, the turbine design community requires fast turnaround time in its design computations, rendering these comprehensive simulations ineffective in the design cycle. The present study describes a methodology for implementing a volumetric source term distribution in a coarse grid calculation that can model the small-scale and three-dimensional effects present in turbine film cooling flows. This model could be implemented in turbine design codes or in multistage turbomachinery codes such as APNASA, where the computational grid size may be larger than the film hole size. Detailed computations of a single row of 35 deg round holes on a flat plate have been obtained for blowing ratios of 0.5, 0.8, and 1.0, and density ratios of 1.0 and 2.0 using a multiblock grid system to resolve the flows on both sides of the plate as well as inside the hole itself. These detailed flow fields were spatially averaged to generate a field of volumetric source terms for each conservative flow variable. Solutions were also obtained using three coarse grids having streamwise and spanwise grid spacings of 3d, 1d, and d/3. These coarse grid solutions used the integrated hole exit mass, momentum, energy, and turbulence quantities from the detailed solutions as volumetric source terms. It is shown that a uniform source term addition over a distance from the wall on the order of the hole diameter is able to predict adiabatic film effectiveness better than a near-wall source term model, while strictly enforcing correct values of integrated boundary layer quantities.
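
    The core idea above is to spatially average detailed hole-exit quantities into coarse-grid volumetric source terms. The following hedged sketch shows only that averaging step for a hypothetical fine-grid injection flux; the field, cell sizes, and source-layer volume are placeholders, not values from the study.

```python
# Sketch of the coarse-grid source-term idea: spatially average a fine-grid
# coolant-injection field into coarse cells and convert it to a per-volume
# source. The fields and grid sizes are hypothetical.
import numpy as np

fine = np.random.default_rng(3).random((120, 90))   # fine-grid mass flux, kg/(m^2 s)
fine_cell_area = 1.0e-6                             # m^2 per fine cell (assumed)
block = 30                                          # fine cells per coarse cell edge

nyc, nxc = fine.shape[0] // block, fine.shape[1] // block
coarse_mass_source = (
    fine[: nyc * block, : nxc * block]
    .reshape(nyc, block, nxc, block)
    .sum(axis=(1, 3)) * fine_cell_area              # kg/s injected per coarse cell
)

coarse_cell_volume = 2.7e-5                         # m^3, assumed source-layer volume
volumetric_source = coarse_mass_source / coarse_cell_volume  # kg/(m^3 s)
print(volumetric_source.shape, volumetric_source.mean())
```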

  7. Wave spectra partitioning and long term statistical distribution

    NASA Astrophysics Data System (ADS)

    Portilla-Yandún, Jesús; Cavaleri, Luigi; Van Vledder, Gerbrant Ph.

    2015-12-01

    A new method is presented for a physically based statistical description of wind wave climatology. The method applies spectral partitioning to identify individual wave systems (partitions) in time series of 2D-wave spectra, followed by computing the probability of occurrence of their (peak) position in frequency-direction space. This distribution can be considered as a spectral density function to which another round of partitioning is applied to obtain spectral domains, each representing a typical wave system or population in a statistical sense. This two-step partitioning procedure allows identifying aggregate wave systems without the need to discuss specific characteristics as wind sea and swell systems. We suggest that each of these aggregate wave systems (populations) is linked to a specific generation pattern opening the way to dedicated analyses. Each population (of partitions) can be subjected to further analyses to add dimension carrying information based on integrated wave parameters of each partition, such as significant wave height, wave age, mean wave period and direction, among others. The new method is illustrated by analysing model spectra from a numerical wave prediction model and measured spectra from a directional wave buoy located in the Southern North Sea. It is shown that these two sources of information yield consistent results. Examples are given of computing the statistical distribution of significant wave height, spectral energy distribution and the spatial variation of wind wave characteristics along a north-south transect in the North Sea. Wind or wave age information can be included as an extra attribute of the members of a population to label them as wind sea or swell systems. Finally, suggestions are given for further applications of this new method.
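
    A hedged sketch of the second partitioning step described above: the (frequency, direction) peak positions of individual wave partitions (synthetic here) are accumulated into an occurrence density on the spectral plane, from which aggregate wave populations can be delineated.

```python
# Sketch: accumulate partition peak positions into an occurrence density in
# frequency-direction space. The peak positions are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical peaks: a wind-sea population (~0.15 Hz, ~270 deg) and a swell
# population (~0.07 Hz, ~330 deg).
freq = np.concatenate([rng.normal(0.15, 0.02, 800), rng.normal(0.07, 0.01, 400)])
direc = np.concatenate([rng.normal(270, 20, 800), rng.normal(330, 10, 400)]) % 360

occurrence, f_edges, d_edges = np.histogram2d(
    freq, direc, bins=[np.linspace(0.03, 0.3, 28), np.arange(0, 361, 15)]
)
occurrence /= occurrence.sum()      # probability of occurrence per spectral bin
peak_bin = np.unravel_index(occurrence.argmax(), occurrence.shape)
print("most populated bin:",
      f"{f_edges[peak_bin[0]]:.2f}-{f_edges[peak_bin[0]+1]:.2f} Hz,",
      f"{d_edges[peak_bin[1]]:.0f}-{d_edges[peak_bin[1]+1]:.0f} deg")
```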

  8. Production, Distribution, and Applications of Californium-252 Neutron Sources

    SciTech Connect

    Balo, P.A.; Knauer, J.B.; Martin, R.C.

    1999-10-03

    The radioisotope {sup 252}Cf is routinely encapsulated into compact, portable, intense neutron sources with a 2.6-year half-life. A source the size of a person's little finger can emit up to 10{sup 11} neutrons/s. Californium-252 is used commercially as a reliable, cost-effective neutron source for prompt gamma neutron activation analysis (PGNAA) of coal, cement, and minerals, as well as for detection and identification of explosives, land mines, and unexploded military ordnance. Other uses are neutron radiography, nuclear waste assays, reactor start-up sources, calibration standards, and cancer therapy. The inherent safety of source encapsulations is demonstrated by 30 years of experience and by U.S. Bureau of Mines tests of source survivability during explosions. The production and distribution center for the U.S. Department of Energy (DOE) Californium Program is the Radiochemical Engineering Development Center (REDC) at Oak Ridge National Laboratory (ORNL). DOE sells {sup 252}Cf to commercial reencapsulators domestically and internationally.

  9. Security of quantum key distribution with light sources that are not independently and identically distributed

    NASA Astrophysics Data System (ADS)

    Nagamatsu, Yuichi; Mizutani, Akihiro; Ikuta, Rikizo; Yamamoto, Takashi; Imoto, Nobuyuki; Tamaki, Kiyoshi

    2016-04-01

    Although quantum key distribution (QKD) is theoretically secure, there is a gap between the theory and practice. In fact, real-life QKD may not be secure because component devices in QKD systems may deviate from the theoretical models assumed in security proofs. To solve this problem, it is necessary to construct the security proof under realistic assumptions on the source and measurement unit. In this paper, we prove the security of a QKD protocol under practical assumptions on the source that accommodate fluctuation of the phase and intensity modulations. As long as our assumptions hold, it does not matter at all how the phase and intensity distribute or whether or not their distributions over different pulses are independently and identically distributed. Our work shows that practical sources can be safely employed in QKD experiments.

  10. Production, distribution and applications of californium-252 neutron sources.

    PubMed

    Martin, R C; Knauer, J B; Balo, P A

    2000-01-01

    The radioisotope 252Cf is routinely encapsulated into compact, portable, intense neutron sources with a 2.6-yr half-life. A source the size of a person's little finger can emit up to 10(11) neutrons s(-1). Californium-252 is used commercially as a reliable, cost-effective neutron source for prompt gamma neutron activation analysis (PGNAA) of coal, cement and minerals, as well as for detection and identification of explosives, land mines and unexploded military ordnance. Other uses are neutron radiography, nuclear waste assays, reactor start-up sources, calibration standards and cancer therapy. The inherent safety of source encapsulations is demonstrated by 30 yr of experience and by US Bureau of Mines tests of source survivability during explosions. The production and distribution center for the US Department of Energy (DOE) Californium Program is the Radiochemical Engineering Development Center (REDC) at Oak Ridge National Laboratory (ORNL). DOE sells 252Cf to commercial reencapsulators domestically and internationally. Sealed 252Cf sources are also available for loan to agencies and subcontractors of the US government and to universities for educational, research and medical applications. The REDC has established the Californium User Facility (CUF) for Neutron Science to make its large inventory of 252Cf sources available to researchers for irradiations inside uncontaminated hot cells. Experiments at the CUF include a land mine detection system, neutron damage testing of solid-state detectors, irradiation of human cancer cells for boron neutron capture therapy experiments and irradiation of rice to induce genetic mutations. PMID:11003521

  11. Understanding the electrical behavior of the action potential in terms of elementary electrical sources.

    PubMed

    Rodriguez-Falces, Javier

    2015-03-01

    A concept of major importance in human electrophysiology studies is the process by which activation of an excitable cell results in a rapid rise and fall of the electrical membrane potential, the so-called action potential. Hodgkin and Huxley proposed a model to explain the ionic mechanisms underlying the formation of action potentials. However, this model is unsuitably complex for teaching purposes. In addition, the Hodgkin and Huxley approach describes the shape of the action potential only in terms of ionic currents, i.e., it is unable to explain the electrical significance of the action potential or describe the electrical field arising from this source using basic concepts of electromagnetic theory. The goal of the present report was to propose a new model to describe the electrical behavior of the action potential in terms of elementary electrical sources (in particular, dipoles). The efficacy of this model was tested through a closed-book written exam. The proposed model increased the ability of students to appreciate the distributed character of the action potential and also to recognize that this source spreads out along the fiber as a function of space. In addition, the new approach allowed students to realize that the amplitude and sign of the extracellular electrical potential arising from the action potential are determined by the spatial derivative of this intracellular source. The proposed model, which incorporates intuitive graphical representations, has improved students' understanding of the electrical potentials generated by bioelectrical sources and has heightened their interest in bioelectricity. PMID:25727465
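
    As a hedged numerical illustration of the dipole picture described above (the waveform and scaling are arbitrary, not taken from the report), the sketch below differentiates an idealized intracellular action potential profile along the fiber to obtain the source density whose sign and amplitude govern the extracellular potential.

```python
# Sketch of the dipole picture: treat the spatial derivative of an idealized
# intracellular action potential as the source density that sets the sign and
# amplitude of the extracellular potential. Waveform and scaling are illustrative.
import numpy as np

x = np.linspace(-10, 10, 2001)                 # distance along the fibre, mm
v_intra = 100.0 * np.exp(-x**2) * (1 - x**2)   # toy intracellular AP profile, mV

dipole_density = np.gradient(v_intra, x)       # mV/mm, proportional to source strength
print(f"peak source density magnitude: {np.abs(dipole_density).max():.1f} mV/mm "
      f"at x = {x[np.abs(dipole_density).argmax()]:.2f} mm")
```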

  12. SOURCE TERM TARGETED THRUST FY 2005 NEW START PROJECTS

    SciTech Connect

    NA

    2005-10-05

    While a significant amount of work has been devoted to developing thermodynamic data describing the sorption of radionuclides to iron oxides and other geomedia, little data exist to describe the interaction of key radionuclides found in high-level radioactive waste with the uranium surfaces expected in corroded spent nuclear fuel (SNF) waste packages. Recent work indicates that actinide adsorption to the U(VI) solids expected in the engineered barrier system may play a key role in the reduction of dissolved concentrations of radionuclides such as Np(V). However, little is known about the mechanism(s) of adsorption, nor are the thermodynamic data available to represent the phenomenon in predictive modeling codes. Unfortunately, this situation makes it difficult to consider actinide adsorption to the U(VI) silicates in either geochemical or performance assessment (PA) predictions. The primary goal in the Source Term Targeted Thrust area is to ''study processes that control radionuclide release from the waste form''. Knowledge of actinide adsorption to U(VI) silicate solids and its parameterization in geochemical models will be an important step towards this goal.

  13. Verification test calculations for the Source Term Code Package

    SciTech Connect

    Denning, R S; Wooton, R O; Alexander, C A; Curtis, L A; Cybulskis, P; Gieseke, J A; Jordan, H; Lee, K W; Nicolosi, S L

    1986-07-01

    The purpose of this report is to demonstrate the reasonableness of the Source Term Code Package (STCP) results. Hand calculations have been performed spanning a wide variety of phenomena within the context of a single accident sequence, a loss of all ac power with late containment failure, in the Peach Bottom (BWR) plant, and compared with STCP results. The report identifies some of the limitations of the hand calculation effort. The processes involved in a core meltdown accident are complex and coupled. Hand calculations by their nature must deal with gross simplifications of these processes. Their greatest strength is as an indicator that a computer code contains an error, for example that it doesn't satisfy basic conservation laws, rather than in showing the analysis accurately represents reality. Hand calculations are an important element of verification but they do not satisfy the need for code validation. The code validation program for the STCP is a separate effort. In general the hand calculation results show that models used in the STCP codes (e.g., MARCH, TRAP-MELT, VANESA) obey basic conservation laws and produce reasonable results. The degree of agreement and significance of the comparisons differ among the models evaluated. 20 figs., 26 tabs.

  14. Challenges in defining a radiologic and hydrologic source term for underground nuclear test centers, Nevada Test Site, Nye County, Nevada

    SciTech Connect

    Smith, D.K.

    1995-06-01

    The compilation of a radionuclide inventory for long-lived radioactive contaminants residual from nuclear testing provides a partial measure of the radiologic source term at the Nevada Test Site. The radiologic source term also includes potentially mobile short-lived radionuclides excluded from the inventory. The radiologic source term for tritium is known with accuracy and is equivalent to the hydrologic source term within the saturated zone. Definition of the total hydrologic source term for fission and activation products that have high activities for decades following underground testing involves knowledge and assumptions which are presently unavailable. Systematic investigation of the behavior of fission products, activation products and actinides under saturated or partially saturated conditions is imperative to define a representative total hydrologic source term. This is particularly important given the heterogeneous distribution of radionuclides within testing centers. Data quality objectives which emphasize a combination of measurements and credible estimates of the hydrologic source term are a priority for near-field investigations at the Nevada Test Site.

  15. Oil source bed distribution in upper Tertiary of Gulf Coast

    SciTech Connect

    Dow, W.G.

    1985-02-01

    Effective oil source beds have not been reported in Miocene and younger Gulf Coast sediments and the organic matter present is invariably immature and oxidized. Crude oil composition, however, indicates origin from mature source beds containing reduced kerogen. Oil distribution suggests extensive vertical migration through fracture systems from localized sources in deeply buried, geopressured shales. A model is proposed in which oil source beds were deposited in intraslope basins that formed behind salt ridges. The combination of silled basin topography, rapid sedimentation, and enhanced oxygen-minimum zones during global warmups resulted in periodic anoxic environments and preservation of oil-generating organic matter. Anoxia was most widespread during the middle Miocene and Pliocene transgressions and rare during regressive cycles when anoxia occurred primarily in hypersaline conditions such as exist today in the Orca basin.

  16. Secure quantum key distribution with an uncharacterized source.

    PubMed

    Koashi, Masato; Preskill, John

    2003-02-01

    We prove the security of the Bennett-Brassard (BB84) quantum key distribution protocol for an arbitrary source whose averaged states are basis independent, a condition that is automatically satisfied if the source is suitably designed. The proof is based on the observation that, to an adversary, the key extraction process is equivalent to a measurement in the sigma(x) basis performed on a pure sigma(z)-basis eigenstate. The dependence of the achievable key length on the bit error rate is the same as that established by Shor and Preskill [Phys. Rev. Lett. 85, 441 (2000)].

  17. Space distribution of extragalactic sources - Cosmology versus evolution

    NASA Technical Reports Server (NTRS)

    Cavaliere, A.; Maccacaro, T.

    1990-01-01

    Alternative cosmologies have been recurrently invoked to explain in terms of global spacetime structure the apparent large increase, with increasing redshift, in the average luminosity of active galactic nuclei. These models interestingly seek to avoid the complexities of the canonical interpretation in terms of intrinsic population evolutions in a Friedmann universe. However, a problem of consistency for these cosmologies is pointed out, since they have to include also other classes of extragalactic sources, such as clusters of galaxies and BL Lac objects, for which there is preliminary evidence of a different behavior.

  18. Review: Particle number size distributions from seven major sources and implications for source apportionment studies

    NASA Astrophysics Data System (ADS)

    Vu, Tuan V.; Delgado-Saborit, Juana Maria; Harrison, Roy M.

    2015-12-01

    The particle number size distribution (PNSD) of airborne particles not only provides us with information about sources and atmospheric processing of particles, but also plays an important role in determining regional lung dose. As a result, urban particles and their size distributions have received much attention with a rapid increase of publications in recent years. The object of this review is to synthesise and analyse existing knowledge on particles in urban environments with a focus on their number concentration and size distribution. This study briefly reviews the characterization of PNSD from seven major sources of urban particles including traffic emissions, industrial emissions, biomass burning, cooking, transported aerosol, marine aerosol and nucleation. It then discusses atmospheric physical processes such as coagulation or condensation which have a strong influence on PNSD. Finally, the implications of PNSD datasets for source modelling are briefly discussed. Based on this review, it is concluded that the concentrations, modal structures and temporal patterns of urban particles are strongly influenced by traffic emissions, which are identified as the main source of particle number in urban environments. Information derived from particle number size distributions is beginning to play an important role in source apportionment studies.

  19. Spatiotemporal distributions of tsunami sources and discovered periodicities

    NASA Astrophysics Data System (ADS)

    Levin, B. W.; Sasorova, E. V.

    2014-09-01

    Both spatial and spatiotemporal distributions of the sources of tsunamigenic earthquakes of tectonic origin over the last 112 years have been analyzed. This analysis has been made using tsunami databases published by the Institute of Computational Mathematics and Mathematical Geophysics (Siberian Branch, Russian Academy of Sciences) and the National Aeronautics and Space Administration (United States), as well as earthquake catalogs published by the National Earthquake Information Center (United States). It has been found that the pronounced activation of seismic processes and an increase in the total energy of tsunamigenic earthquakes were observed at the beginning of both the 20th (1905-1920) and 21st (2004-2011) centuries. Studying the spatiotemporal periodicity of such events on the basis of an analysis of the two-dimensional distributions of the sources of tectonic tsunamis has made it possible to determine localized latitudinal zones with a total lack of such events (90°-75° N, 45°-90° S, and 35°-25° N) and regions with a periodic occurrence of tsunamis mainly within the middle (65°-35° N and 25°-40° S) and subequatorial (15° N-20° S) latitudes of the Northern and Southern hemispheres. The objective of this work is to analyze the spatiotemporal distributions of sources of tsunamigenic earthquakes and the effect of the periodic occurrence of such events on the basis of data taken from global tsunami catalogs.

  20. Distributed policy based access to networked heterogeneous ISR data sources

    NASA Astrophysics Data System (ADS)

    Bent, G.; Vyvyan, D.; Wood, David; Zerfos, Petros; Calo, Seraphin

    2010-04-01

    Within a coalition environment, ad hoc Communities of Interest (CoI's) come together, perhaps for only a short time, with different sensors, sensor platforms, data fusion elements, and networks to conduct a task (or set of tasks) with different coalition members taking different roles. In such a coalition, each organization will have its own inherent restrictions on how it will interact with the others. These are usually stated as a set of policies, including security and privacy policies. The capability that we want to enable for a coalition operation is to provide access to information from any coalition partner in conformance with the policies of all. One of the challenges in supporting such ad-hoc coalition operations is that of providing efficient access to distributed sources of data, where the applications requiring the data do not have knowledge of the location of the data within the network. To address this challenge the International Technology Alliance (ITA) program has been developing the concept of a Dynamic Distributed Federated Database (DDFD), also known as a Gaian Database. This type of database provides a means for accessing data across a network of distributed heterogeneous data sources where access to the information is controlled by a mixture of local and global policies. We describe how a network of disparate ISR elements can be expressed as a DDFD and how this approach enables sensor and other information sources to be discovered autonomously or semi-autonomously and then combined or fused under formally defined local and global policies.

  1. Robust video transmission with distributed source coded auxiliary channel.

    PubMed

    Wang, Jiajun; Majumdar, Abhik; Ramchandran, Kannan

    2009-12-01

    We propose a novel solution to the problem of robust, low-latency video transmission over lossy channels. Predictive video codecs, such as MPEG and H.26x, are very susceptible to prediction mismatch between encoder and decoder or "drift" when there are packet losses. These mismatches lead to a significant degradation in the decoded quality. To address this problem, we propose an auxiliary codec system that sends additional information alongside an MPEG or H.26x compressed video stream to correct for errors in decoded frames and mitigate drift. The proposed system is based on the principles of distributed source coding and uses the (possibly erroneous) MPEG/H.26x decoder reconstruction as side information at the auxiliary decoder. The distributed source coding framework depends upon knowing the statistical dependency (or correlation) between the source and the side information. We propose a recursive algorithm to analytically track the correlation between the original source frame and the erroneous MPEG/H.26x decoded frame. Finally, we propose a rate-distortion optimization scheme to allocate the rate used by the auxiliary encoder among the encoding blocks within a video frame. We implement the proposed system and present extensive simulation results that demonstrate significant gains in performance both visually and objectively (on the order of 2 dB in PSNR over forward error correction based solutions and 1.5 dB in PSNR over intrarefresh based solutions for typical scenarios) under tight latency constraints. PMID:19703801

  2. Long-term Trend of Solar Coronal Hole Distribution from 1975 to 2014

    NASA Astrophysics Data System (ADS)

    Fujiki, K.; Tokumaru, M.; Hayashi, K.; Satonaka, D.; Hakamada, K.

    2016-08-01

    We developed an automated prediction technique for coronal holes using potential magnetic field extrapolation in the solar corona to construct a database of coronal holes appearing from 1975 February to 2015 July (Carrington rotations from 1625 to 2165). Coronal holes are labeled with the location, size, and average magnetic field of each coronal hole on the photosphere and source surface. As a result, we identified 3335 coronal holes and found that the long-term distribution of coronal holes shows a pattern similar to the well-known magnetic butterfly diagram, and that polar/low-latitude coronal holes tend to decrease/increase in the last solar minimum relative to the previous two minima.

  3. Atmospheric PAHs in North China: Spatial distribution and sources.

    PubMed

    Zhang, Yanjun; Lin, Yan; Cai, Jing; Liu, Yue; Hong, Linan; Qin, Momei; Zhao, Yifan; Ma, Jin; Wang, Xuesong; Zhu, Tong; Qiu, Xinghua; Zheng, Mei

    2016-09-15

    Polycyclic aromatic hydrocarbons (PAHs), formed through incomplete combustion processes, have adverse health effects. To investigate the spatial distribution and sources of PAHs in North China, PAHs collected by passive sampling at 90 gridded sites from June to September 2011 were analyzed. The average concentration of the sum of fifteen PAHs in North China is 220±14 ng/m(3), with the highest levels in Shanxi, followed by Shandong and Hebei, and then the Beijing-Tianjin area. Major sources of PAHs are identified for each region of North China: coking processes for Shanxi, biomass burning for Hebei and Shandong, and coal combustion for the Beijing-Tianjin area. An emission inventory is combined with back trajectory analysis to study the influence of emissions from surrounding areas at receptor sites. The Shanxi and Beijing-Tianjin areas are more influenced by nearby sources, while regional sources have more impact on the Hebei and Shandong areas. Results from this study identify the areas where local emissions should be the major target for control and the areas where both local and regional sources should be considered for PAH abatement in North China. PMID:27241206

  4. Volatile Organic Compounds: Characteristics, distribution and sources in urban schools

    NASA Astrophysics Data System (ADS)

    Mishra, Nitika; Bartsch, Jennifer; Ayoko, Godwin A.; Salthammer, Tunga; Morawska, Lidia

    2015-04-01

    Long-term exposure to organic pollutants, both inside and outside school buildings, may affect children's health and influence their learning performance. Since children spend a significant amount of time in school, air quality, especially in classrooms, plays a key role in determining the health risks associated with exposure at schools. Within this context, the present study investigated the ambient concentrations of Volatile Organic Compounds (VOCs) in 25 primary schools in Brisbane with the aim of quantifying the indoor and outdoor VOC concentrations, identifying VOC sources and their contributions, and, based on these, proposing mitigation measures to reduce VOC exposure in schools. One of the most important findings is the occurrence of indoor sources, indicated by an I/O ratio >1 in 19 schools. Principal Component Analysis with Varimax rotation was used to identify common sources of VOCs, and source contributions were calculated using the Absolute Principal Component Scores technique. The results showed that petrol vehicle exhaust contributed 47% of outdoor VOCs, whereas indoors cleaning products had the highest contribution (41%), followed by air fresheners and art and craft activities. These findings point to the need for a range of basic precautions during the selection, use and storage of cleaning products and materials to reduce the risk from these sources.
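
    A hedged sketch of the factor-analysis step described above, using synthetic concentration data. scikit-learn's FactorAnalysis with varimax rotation stands in here for PCA-with-varimax, and the regression of total concentration on factor scores is a simplified stand-in for the full Absolute Principal Component Scores procedure (which additionally rescales scores against a zero-concentration reference).

```python
# Sketch of varimax-rotated factor identification of pollutant sources, with a
# regression of total concentration on factor scores as a rough contribution
# measure. Data are synthetic; this is not the full APCS method.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
n_samples, n_species = 200, 10
source_profiles = rng.random((3, n_species))          # 3 hypothetical sources
source_strengths = rng.gamma(2.0, 1.0, (n_samples, 3))
conc = source_strengths @ source_profiles + 0.05 * rng.random((n_samples, n_species))

scores = FactorAnalysis(n_components=3, rotation="varimax").fit_transform(
    StandardScaler().fit_transform(conc)
)
reg = LinearRegression().fit(scores, conc.sum(axis=1))
print("factor regression coefficients (relative source weights):",
      np.round(reg.coef_, 2))
```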

  5. CMP reflection imaging via interferometry of distributed subsurface sources

    NASA Astrophysics Data System (ADS)

    Kim, D.; Brown, L. D.; Quiros, D. A.

    2015-12-01

    The theoretical foundations of recovering body wave energy via seismic interferometry are well established. However, in practice, such recovery remains problematic. Here, synthetic seismograms computed for subsurface sources are used to evaluate the geometrical combinations of realistic ambient source and receiver distributions that result in useful recovery of virtual body waves. This study illustrates how surface receiver arrays that span a limited distribution of subsurface sources can be processed to produce virtual shot gathers that yield CMP gathers which can be effectively stacked with traditional normal moveout corrections. To verify the feasibility of the approach in practice, seismic recordings of 50 aftershocks of the magnitude 5.8 Virginia earthquake that occurred in August 2011 have been processed using seismic interferometry to produce seismic reflection images of the crustal structure above and beneath the aftershock cluster. Although monotonic noise proved to be problematic by significantly reducing the number of usable recordings, the edited dataset resulted in stacked seismic sections characterized by coherent reflections that resemble those seen on a nearby conventional reflection survey. In particular, "virtual" reflections at travel times of 3 to 4 seconds suggest reflectors at approximately 7 to 12 km depth that would seem to correspond to imbricate thrust structures formed during the Appalachian orogeny. The approach described here represents a promising new means of body wave imaging of 3D structure that can be applied to a wide array of geologic and energy problems. Unlike other imaging techniques using natural sources, this technique does not require precise source locations or times. It can thus exploit aftershocks too small for conventional analyses. This method can be applied to any type of microseismic cloud, whether tectonic, volcanic or man-made.

  6. Mapping the source distribution of microseisms using noise covariogram envelopes

    NASA Astrophysics Data System (ADS)

    Sadeghisorkhani, Hamzeh; Gudmundsson, Ólafur; Roberts, Roland; Tryggvason, Ari

    2016-06-01

    We introduce a method for mapping the noise-source distribution of microseisms which uses information from the full length of covariograms (cross-correlations). We derive a forward calculation based on the plane-wave assumption in 2-D, to formulate an iterative, linearized inversion of covariogram envelopes in the time domain. The forward calculation involves bandpass filtering of the covariograms. The inversion exploits the well-known feature of noise cross-correlation, that is, an anomaly in the noise field that is oblique to the interstation direction appears as cross-correlation amplitude at a smaller time lag than the in-line, surface wave arrival. Therefore, the inversion extracts more information from the covariograms than that contained at the expected surface wave arrival, and this allows us to work with few stations to find the propagation directions of incoming energy. The inversion is naturally applied to data that retain physical units that are not amplitude normalized in any way. By dividing a network into groups of stations, we can constrain the source location by triangulation. We demonstrate results of the method with synthetic data and one year (2012) of data from the Swedish National Seismic Network and also look at the seasonal variation of source distribution around Scandinavia. After preprocessing and cross-correlation, the stations are divided into five groups of 9-12 stations. We invert the envelopes of each group in eight period ranges between 2 and 25 s. Results show that the noise sources at short periods (less than 12 s) lie predominantly in the North Atlantic Ocean and the Barents Sea, and at longer periods the energy appears to have a broader distribution. The strongly anisotropic source distribution in this area is estimated to cause significant biases of velocity measurements compared to the level of heterogeneity in the region. The amplitude of the primary microseisms varies little over the year, but secondary microseisms are much
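
    A hedged sketch of the covariogram envelopes that the inversion above operates on: cross-correlate synthetic noise records from two stations sharing a delayed common signal, then take the analytic-signal envelope. The sampling rate, delay, and noise levels are arbitrary assumptions.

```python
# Sketch: build a noise covariogram (cross-correlation) for one station pair
# and extract its envelope via the analytic signal. Signals are synthetic.
import numpy as np
from scipy.signal import correlate, hilbert

rng = np.random.default_rng(6)
fs, n = 10.0, 20000                       # 10 Hz sampling, ~half an hour of "noise"
common = rng.standard_normal(n)
lag_samples = 25                          # 2.5 s propagation delay between stations
sta_a = common + 0.5 * rng.standard_normal(n)
sta_b = np.roll(common, lag_samples) + 0.5 * rng.standard_normal(n)

covariogram = correlate(sta_a, sta_b, mode="full") / n
lags = np.arange(-(n - 1), n) / fs
envelope = np.abs(hilbert(covariogram))   # analytic-signal envelope

print(f"envelope peak at lag {lags[envelope.argmax()]:.1f} s "
      f"(expected about {-lag_samples / fs:.1f} s)")
```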

  7. Mapping the source distribution of microseisms using noise covariogram envelopes

    NASA Astrophysics Data System (ADS)

    Sadeghisorkhani, Hamzeh; Gudmundsson, Ólafur; Roberts, Roland; Tryggvason, Ari

    2016-03-01

    We introduce a method for mapping the noise-source distribution of microseisms which uses information from the full length of covariograms (cross-correlations). We derive a forward calculation based on the plane-wave assumption in 2D, to formulate an iterative, linearized inversion of covariogram envelopes in the time domain. The forward calculation involves bandpass filtering of the covariograms. The inversion exploits the well-known feature of noise cross-correlation, i.e., that an anomaly in the noise field that is oblique to the inter-station direction appears as cross-correlation amplitude at a smaller time lag than the in-line, surface-wave arrival. Therefore, the inversion extracts more information from the covariograms than that contained at the expected surface-wave arrival, and this allows us to work with few stations to find the propagation directions of incoming energy. The inversion is naturally applied to data that retain physical units, i.e., that are not amplitude normalized in any way. By dividing a network into groups of stations, we can constrain the source location by triangulation. We demonstrate results of the method with synthetic data and one year (2012) of data from the Swedish National Seismic Network (SNSN) and also look at the seasonal variation of source distribution around Scandinavia. After preprocessing and cross-correlation, the stations are divided into 5 groups of 9 to 12 stations. We invert the envelopes of each group in 8 period ranges between 2 to 25 sec. Results show that the noise sources at short periods (less than 12 sec) lie predominantly in the North Atlantic Ocean and the Barents Sea, and at longer periods the energy appears to have a broader distribution. The strongly anisotropic source distribution in this area is estimated to cause significant biases of velocity measurements compared to the level of heterogeneity in the region. The amplitude of the primary microseisms varies little over the year, but secondary

  8. The Impact of Source Distribution on Scalar Transport over Forested Hills

    NASA Astrophysics Data System (ADS)

    Ross, Andrew N.; Harman, Ian N.

    2015-08-01

    Numerical simulations of neutral flow over a two-dimensional, isolated, forested ridge are conducted to study the effects of scalar source distribution on scalar concentrations and fluxes over forested hills. Three different constant-flux sources are considered that span a range of idealized but ecologically important source distributions: a source at the ground, one uniformly distributed through the canopy, and one decaying with depth in the canopy. A fourth source type, where the in-canopy source depends on both the wind speed and the difference in concentration between the canopy and a reference concentration on the leaf, designed to mimic deposition, is also considered. The simulations show that the topographically-induced perturbations to the scalar concentration and fluxes are quantitatively dependent on the source distribution. The net impact is a balance of different processes affecting both advection and turbulent mixing, and can be significant even for moderate topography. Sources that have significant input in the deep canopy or at the ground exhibit a larger magnitude advection and turbulent flux-divergence terms in the canopy. The flows have identical velocity fields and so the differences are entirely due to the different tracer concentration fields resulting from the different source distributions. These in-canopy differences lead to larger spatial variations in above-canopy scalar fluxes for sources near the ground compared to cases where the source is predominantly located near the canopy top. Sensitivity tests show that the most significant impacts are often seen near to or slightly downstream of the flow separation or reattachment points within the canopy flow. The qualitative similarities to previous studies using periodic hills suggest that important processes occurring over isolated and periodic hills are not fundamentally different. The work has important implications for the interpretation of flux measurements over forests, even in

  9. Extending Marine Species Distribution Maps Using Non-Traditional Sources

    PubMed Central

    Moretzsohn, Fabio; Gibeaut, James

    2015-01-01

    Background: Traditional sources of species occurrence data such as peer-reviewed journal articles and museum-curated collections are included in species databases after rigorous review by species experts and evaluators. The distribution maps created in this process are an important component of species survival evaluations, and are used to adapt, extend and sometimes contract polygons used in the distribution mapping process. New information: During an IUCN Red List Gulf of Mexico Fishes Assessment Workshop held at The Harte Research Institute for Gulf of Mexico Studies, a session included an open discussion on the topic of including other sources of species occurrence data. During the last decade, advances in portable electronic devices and applications have enabled 'citizen scientists' to record images, locations and data about species sightings, and to submit those data to larger species databases. These applications typically generate point data. Attendees of the workshop expressed an interest in how those data could be incorporated into existing datasets, how best to ascertain the quality and value of the data, and what other alternate data sources are available. This paper addresses those issues, and provides recommendations to ensure quality data use. PMID:25941453

  10. 7 CFR 1822.268 - Rates, terms, and source of funds.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 12 2010-01-01 2010-01-01 false Rates, terms, and source of funds. 1822.268 Section... Site Loan Policies, Procedures, and Authorizations § 1822.268 Rates, terms, and source of funds. (a... security interest of the Government. (c) Source of funds. Loans under this subpart will be made as...

  11. 7 CFR 1822.268 - Rates, terms, and source of funds.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 12 2011-01-01 2011-01-01 false Rates, terms, and source of funds. 1822.268 Section... Site Loan Policies, Procedures, and Authorizations § 1822.268 Rates, terms, and source of funds. (a... security interest of the Government. (c) Source of funds. Loans under this subpart will be made as...

  12. 7 CFR 1822.268 - Rates, terms, and source of funds.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 12 2012-01-01 2012-01-01 false Rates, terms, and source of funds. 1822.268 Section... Site Loan Policies, Procedures, and Authorizations § 1822.268 Rates, terms, and source of funds. (a... security interest of the Government. (c) Source of funds. Loans under this subpart will be made as...

  13. 7 CFR 1822.268 - Rates, terms, and source of funds.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 12 2014-01-01 2013-01-01 true Rates, terms, and source of funds. 1822.268 Section... Site Loan Policies, Procedures, and Authorizations § 1822.268 Rates, terms, and source of funds. (a... security interest of the Government. (c) Source of funds. Loans under this subpart will be made as...

  14. 7 CFR 1822.268 - Rates, terms, and source of funds.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 12 2013-01-01 2013-01-01 false Rates, terms, and source of funds. 1822.268 Section... Site Loan Policies, Procedures, and Authorizations § 1822.268 Rates, terms, and source of funds. (a... security interest of the Government. (c) Source of funds. Loans under this subpart will be made as...

  15. Diversity, distribution and sources of bacteria in residential kitchens.

    PubMed

    Flores, Gilberto E; Bates, Scott T; Caporaso, J Gregory; Lauber, Christian L; Leff, Jonathan W; Knight, Rob; Fierer, Noah

    2013-02-01

    Bacteria readily colonize kitchen surfaces, and the exchange of microbes between humans and the kitchen environment can impact human health. However, we have a limited understanding of the overall diversity of these communities, how they differ across surfaces, and the sources of the bacteria found on kitchen surfaces. Here we used high-throughput sequencing of the 16S rRNA gene to explore biogeographical patterns of bacteria across > 80 surfaces within the kitchens of each of four households. In total, 34 bacterial and two archaeal phyla were identified, with most sequences belonging to the Actinobacteria, Bacteroidetes, Firmicutes and Proteobacteria. Genera known to contain common food-borne pathogens were low in abundance but broadly distributed throughout the kitchens, with different taxa exhibiting distinct distribution patterns. The most diverse communities were associated with infrequently cleaned surfaces such as fans above stoves, refrigerator/freezer door seals and floors. In contrast, the least diverse communities were observed in and around sinks, which were dominated by biofilm-forming Gram-negative lineages. Community composition was influenced by conditions on individual surfaces, usage patterns and dispersal from source environments. Human skin was the primary source of bacteria across all kitchen surfaces, with contributions from food and faucet water dominating in a few specific locations. This study demonstrates that diverse bacterial communities are widely distributed in residential kitchens and that the composition of these communities is often predictable. These results also illustrate the ease with which human- and food-associated bacteria can be transferred in residential settings to kitchen surfaces. PMID:23171378

  16. Streamlined Genome Sequence Compression using Distributed Source Coding

    PubMed Central

    Wang, Shuang; Jiang, Xiaoqian; Chen, Feng; Cui, Lijuan; Cheng, Samuel

    2014-01-01

    We aim at developing a streamlined genome sequence compression algorithm to support alternative miniaturized sequencing devices, which have limited communication, storage, and computation power. Existing techniques that require a heavyweight client (encoder side) cannot be applied. To tackle this challenge, we carefully examined distributed source coding theory and developed a customized reference-based genome compression protocol to meet the low-complexity need at the client side. Based on the variation between source and reference, our protocol adaptively picks either syndrome coding or hash coding, compressing subsequences with varying code length. Our experimental results showed promising performance of the proposed method when compared with the state-of-the-art algorithm (GRS). PMID:25520552
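
    The adaptive choice between syndrome coding and hash coding can be illustrated with a toy sketch like the one below. It is not the authors' protocol: the parity-check matrix, the hash length, the mismatch threshold and the way the mismatch rate is obtained (here it is simply passed in) are all illustrative assumptions.

    ```python
    import hashlib
    import numpy as np

    def syndrome_encode(bits, H):
        """Syndrome of a binary block w.r.t. parity-check matrix H over GF(2).
        In a Slepian-Wolf setting the decoder recovers the block from this
        syndrome plus the correlated reference subsequence."""
        return (H @ bits) % 2

    def hash_encode(bits):
        """Short digest of the subsequence; the decoder checks a candidate
        reconstruction (e.g., the reference itself) against it."""
        return hashlib.sha1(bits.tobytes()).digest()[:4]

    def encode_subsequence(bits, est_mismatch_rate, H, threshold=0.05):
        """Toy adaptive selector: syndrome coding when the estimated variation
        between source and reference is high, hash coding when it is low."""
        if est_mismatch_rate > threshold:
            return "syndrome", syndrome_encode(bits, H)
        return "hash", hash_encode(bits)

    # Illustrative use with random binary data.
    rng = np.random.default_rng(0)
    n, m = 64, 16
    H = rng.integers(0, 2, size=(m, n))            # toy parity-check matrix
    block = rng.integers(0, 2, size=n)
    print(encode_subsequence(block, est_mismatch_rate=0.10, H=H)[0])   # "syndrome"
    print(encode_subsequence(block, est_mismatch_rate=0.01, H=H)[0])   # "hash"
    ```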

  17. Sources and distribution of hexabromocyclododecanes (HBCDs) in Japanese river sediment.

    PubMed

    Managaki, Satoshi; Enomoto, Iku; Masunaga, Shigeki

    2012-03-01

    The distribution of hexabromocyclododecane (HBCD) in the sediment of three Japanese rivers with different characteristics (i.e., population and potential source in the catchment) was investigated and compared with the results estimated using a multimedia fate model (ChemCAN). High concentrations of HBCD in sediments in the range of 134-2060 ng g(-1) were found in a river receiving textile wastewater. This contrasted with the much lower concentrations (0.8-4.8 ng g(-1)) observed for an urban river (with a surrounding population of 1.8 million). The medians of observed HBCD concentrations in each river were close to those estimated based on the assumed input source (e.g., 1810 ng g(-1) for the observed median concentration, and 1436 ng g(-1) for the estimation, in the Kuzuryu River). These results demonstrated the importance of considering source contributions of HBCD, including both industrial and consumer sources, to aquatic environments, for reliable risk management. PMID:22286550

  18. Considering sources and detectors distributions for quantitative photoacoustic tomography

    PubMed Central

    Song, Ningning; Deumié, Carole; Da Silva, Anabela

    2014-01-01

    Photoacoustic tomography (PAT) is a hybrid imaging modality that takes advantage of the high optical contrast brought by optical imaging and the high spatial resolution brought by ultrasound imaging. However, quantification in photoacoustic imaging is challenging. A multiple-optical-illumination approach has been shown to uncouple diffusion and absorption effects. In this paper, this protocol is adopted and synthetic photoacoustic data, corrupted with noise, were generated. The influence of the distribution of optical sources and transducers on the reconstruction of the absorption and diffusion coefficient maps is studied. Specific situations with limited view angles were examined. The results show that multiple illuminations with a wide field improve the reconstructions. PMID:25426322

  19. Size distributions, sources and source areas of water-soluble organic carbon in urban background air

    NASA Astrophysics Data System (ADS)

    Timonen, H.; Saarikoski, S.; Tolonen-Kivimäki, O.; Aurela, M.; Saarnio, K.; Petäjä, T.; Aalto, P. P.; Kulmala, M.; Pakkanen, T.; Hillamo, R.

    2008-04-01

    This paper presents the results of a one-year-long measurement period of the size distributions of water-soluble organic carbon (WSOC), inorganic ions and gravimetric mass of particulate matter. Measurements were done at an urban background station (SMEAR III) by using a micro-orifice uniform deposit impactor (MOUDI). The site is located in the northern European boreal region in Helsinki, Finland. The WSOC size distribution measurements were complemented by the chemical analysis of inorganic ions, organic carbon (OC) and monosaccharide anhydrides from the filter samples. During the measurements, gravimetric mass in the MOUDI collections varied between 3.4 and 55.0 μg m-3 and the WSOC concentration was between 0.3 and 7.4 μg m-3. On average, water-soluble particulate organic matter (WSPOM, WSOC multiplied by 1.6) comprised 25±7.7% and 7.5±3.4% of the aerosol PM1 mass and the PM1-10 mass, respectively. Inorganic ions contributed 33±12% and 28±19% of the analyzed PM1 and PM1-10 aerosol mass. Five different aerosol categories corresponding to different sources or source areas were identified (long-range transport aerosols, biomass burning aerosols from wild land fires and from small-scale wood combustion, aerosols originating from marine areas and from the clean arctic areas). Clear differences in WSOC concentrations and size distributions originating from different sources or source areas were observed, although there are also many other factors which might affect the results. For example, the local conditions and sources of volatile organic compounds (VOCs) and aerosols, as well as various transformation processes, are likely to have an impact on the measured aerosol composition. Using the source categories, it was identified that, especially in summer, the oxidation products of biogenic VOCs had a clear effect on WSOC concentrations.

  20. Long-term optical behavior of 114 extragalactic sources

    NASA Astrophysics Data System (ADS)

    Pica, A. J.; Pollock, J. T.; Smith, A. G.; Leacock, R. J.; Edwards, P. L.; Scott, R. L.

    1980-11-01

    Photographic observations of over 200 quasars and related objects have been obtained at the Rosemary Hill Observatory since 1968. Twenty that are optically violent variables were reported on by Pollock et al. (1979). This paper presents data for 114 less active sources, 58 of which exhibit optical variations at a confidence level of 95% or greater. Light curves are given for the 26 most active sources. In addition, the overall monitoring program at the Observatory is reviewed, and information on the status of 206 objects is provided.

  1. Local tsunamis and distributed slip at the source

    USGS Publications Warehouse

    Geist, E.L.; Dmowska, R.

    1999-01-01

    Variations in the local tsunami wave field are examined in relation to heterogeneous slip distributions that are characteristic of many shallow subduction zone earthquakes. Assumptions inherent in calculating the coseismic vertical displacement field that defines the initial condition for tsunami propagation are examined. By comparing the seafloor displacement from uniform slip to that from an ideal static crack, we demonstrate that dip-directed slip variations significantly affect the initial cross-sectional wave profile. Because of the hydrodynamic stability of tsunami wave forms, these effects directly impact estimates of maximum runup from the local tsunami. In most cases, an assumption of uniform slip in the dip direction significantly underestimates the maximum amplitude and leading wave steepness of the local tsunami. Whereas dip-directed slip variations affect the initial wave profile, strike-directed slip variations result in wavefront-parallel changes in amplitude that are largely preserved during propagation from the source region toward shore, owing to the effects of refraction. Tests of discretizing slip distributions indicate that small fault surface elements of dimensions similar to the source depth can acceptably approximate the vertical displacement field in comparison to continuous slip distributions. Crack models for tsunamis generated by shallow subduction zone earthquakes indicate that a rupture intersecting the free surface results in approximately twice the average slip. Therefore, the observation of higher slip associated with tsunami earthquakes relative to typical subduction zone earthquakes of the same magnitude suggests that tsunami earthquakes involve rupture of the seafloor, whereas rupture of deeper subduction zone earthquakes may be imbedded and not reach the seafloor.

  2. Reservoir, seal, and source rock distribution in Essaouira Rift Basin

    SciTech Connect

    Ait Salem, A. )

    1994-07-01

    The Essaouira onshore basin is an important hydrocarbon-generating basin situated in western Morocco. There are seven oil and gas-with-condensate fields; six are from Jurassic reservoirs and one from a Triassic reservoir. As a segment of the Atlantic passive continental margin, the Essaouira basin was subjected to several post-Hercynian basin deformation phases, which controlled the distribution, in space and time, of reservoir, seal, and source rocks. These basin deformations include synsedimentary infilling of major half grabens with continental red beds and evaporites associated with the rifting phase, emplacement of a thick postrifting Jurassic and Cretaceous sedimentary wedge during thermal subsidence, salt movements, and structural deformations related to the emergence of the Atlas. The widely extending lower Oxfordian shales are the only Jurassic shale beds penetrated and recognized as potential and mature source rocks. However, facies analysis and mapping suggested the presence of untested source rocks in Dogger marine shales and Triassic to Liassic lacustrine shales. Rocks with adequate reservoir characteristics were encountered in Triassic/Liassic fluvial sands, upper Liassic dolomites, and upper Oxfordian sandy dolomites. The seals are provided by Liassic salt for the lower reservoirs and Middle to Upper Jurassic anhydrite for the upper reservoirs. Recent exploration studies demonstrate that many prospective structures remain untested. 82 refs., 11 figs., 2 tabs.

  3. Strategies for satellite-based monitoring of CO2 from distributed area and point sources

    NASA Astrophysics Data System (ADS)

    Schwandner, Florian M.; Miller, Charles E.; Duren, Riley M.; Natraj, Vijay; Eldering, Annmarie; Gunson, Michael R.; Crisp, David

    2014-05-01

    and sensor provides the full range of temporal sampling needed to characterize distributed area and point source emissions. For instance, point source emission patterns will vary with source strength, wind speed and direction. Because wind speed, direction and other environmental factors change rapidly, short term variabilities should be sampled. For detailed target selection and pointing verification, important lessons have already been learned and strategies devised during JAXA's GOSAT mission (Schwandner et al, 2013). The fact that competing spatial and temporal requirements drive satellite remote sensing sampling strategies dictates a systematic, multi-factor consideration of potential solutions. Factors to consider include vista, revisit frequency, integration times, spatial resolution, and spatial coverage. No single satellite-based remote sensing solution can address this problem for all scales. It is therefore of paramount importance for the international community to develop and maintain a constellation of atmospheric CO2 monitoring satellites that complement each other in their temporal and spatial observation capabilities: Polar sun-synchronous orbits (fixed local solar time, no diurnal information) with agile pointing allow global sampling of known distributed area and point sources like megacities, power plants and volcanoes with daily to weekly temporal revisits and moderate to high spatial resolution. Extensive targeting of distributed area and point sources comes at the expense of reduced mapping or spatial coverage, and the important contextual information that comes with large-scale contiguous spatial sampling. Polar sun-synchronous orbits with push-broom swath-mapping but limited pointing agility may allow mapping of individual source plumes and their spatial variability, but will depend on fortuitous environmental conditions during the observing period. These solutions typically have longer times between revisits, limiting their ability to resolve

  4. CHALLENGES IN SOURCE TERM MODELING OF DECONTAMINATION AND DECOMMISSIONING WASTES.

    SciTech Connect

    SULLIVAN, T.M.

    2006-08-01

    Development of real-time predictive modeling to identify the dispersion and/or source(s) of airborne weapons of mass destruction including chemical, biological, radiological, and nuclear material in urban environments is needed to improve response to potential releases of these materials via either terrorist or accidental means. These models will also prove useful in defining airborne pollution dispersion in urban environments for pollution management/abatement programs. Predicting gas flow in an urban setting on a scale of less than a few kilometers is a complicated and challenging task due to the irregular flow paths that occur along streets and alleys and around buildings of different sizes and shapes, i.e., ''urban canyons''. In addition, air exchange between the outside and buildings and subway areas further complicate the situation. Transport models that are used to predict dispersion of WMD/CBRN materials or to back track the source of the release require high-density data and need defensible parameterizations of urban processes. Errors in the data or any of the parameter inputs or assumptions will lead to misidentification of the airborne spread or source release location(s). The need for these models to provide output in a real-time fashion if they are to be useful for emergency response provides another challenge. To improve the ability of New York City's (NYC's) emergency management teams and first response personnel to protect the public during releases of hazardous materials, the New York City Urban Dispersion Program (UDP) has been initiated. This is a four year research program being conducted from 2004 through 2007. This paper will discuss ground level and subway Perfluorocarbon tracer (PFT) release studies conducted in New York City. The studies released multiple tracers to study ground level and vertical transport of contaminants. This paper will discuss the results from these tests and how these results can be used for improving transport models

  5. Modeling the voice source in terms of spectral slopes.

    PubMed

    Garellek, Marc; Samlan, Robin; Gerratt, Bruce R; Kreiman, Jody

    2016-03-01

    A psychoacoustic model of the voice source spectrum is proposed. The model is characterized by four spectral slope parameters: the difference in amplitude between the first two harmonics (H1-H2), the second and fourth harmonics (H2-H4), the fourth harmonic and the harmonic nearest 2 kHz in frequency (H4-2 kHz), and the harmonic nearest 2 kHz and that nearest 5 kHz (2 kHz-5 kHz). As a step toward model validation, experiments were conducted to establish the acoustic and perceptual independence of these parameters. In experiment 1, the model was fit to a large number of voice sources. Results showed that parameters are predictable from one another, but that these relationships are due to overall spectral roll-off. Two additional experiments addressed the perceptual independence of the source parameters. Listener sensitivity to H1-H2, H2-H4, and H4-2 kHz did not change as a function of the slope of an adjacent component, suggesting that sensitivity to these components is robust. Listener sensitivity to changes in spectral slope from 2 kHz to 5 kHz depended on complex interactions between spectral slope, spectral noise levels, and H4-2 kHz. It is concluded that the four parameters represent non-redundant acoustic and perceptual aspects of voice quality. PMID:27036277
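
    For concreteness, the four slope parameters named above can be computed from harmonic amplitudes in dB roughly as sketched below. The nearest-harmonic picking and the synthetic -6 dB/octave spectrum are illustrative assumptions, not the authors' measurement procedure.

    ```python
    import numpy as np

    def harmonic_amplitude_db(freqs, mag_db, target_hz):
        """Amplitude (dB) of the spectral component nearest to target_hz."""
        return mag_db[np.argmin(np.abs(freqs - target_hz))]

    def spectral_slope_parameters(freqs, mag_db, f0):
        """The four slope parameters of the source-spectrum model: H1-H2, H2-H4,
        H4-2 kHz and 2 kHz-5 kHz, from harmonic amplitudes in dB."""
        h1 = harmonic_amplitude_db(freqs, mag_db, 1 * f0)
        h2 = harmonic_amplitude_db(freqs, mag_db, 2 * f0)
        h4 = harmonic_amplitude_db(freqs, mag_db, 4 * f0)
        h2k = harmonic_amplitude_db(freqs, mag_db, f0 * round(2000.0 / f0))  # harmonic nearest 2 kHz
        h5k = harmonic_amplitude_db(freqs, mag_db, f0 * round(5000.0 / f0))  # harmonic nearest 5 kHz
        return {"H1-H2": h1 - h2, "H2-H4": h2 - h4,
                "H4-2kHz": h4 - h2k, "2kHz-5kHz": h2k - h5k}

    # Illustrative use: a synthetic harmonic spectrum falling ~6 dB/octave, f0 = 200 Hz.
    f0 = 200.0
    harmonics = np.arange(1, 26) * f0
    levels_db = -20.0 * np.log10(np.arange(1, 26))
    print(spectral_slope_parameters(harmonics, levels_db, f0))
    ```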

  6. Environmental radiation safety: source term modification by soil aerosols. Interim report

    SciTech Connect

    Moss, O.R.; Allen, M.D.; Rossignol, E.J.; Cannon, W.C.

    1980-08-01

    The goal of this project is to provide information useful in estimating hazards related to the use of a pure refractory oxide of 238Pu as a power source in some of the space vehicles to be launched during the next few years. Although the sources are designed and built to withstand re-entry into the earth's atmosphere, and to impact with the earth's surface without releasing any plutonium, the possibility that such an event might produce aerosols composed of soil and 238PuO2 cannot be absolutely excluded. This report presents the results of our most recent efforts to measure the degree to which the plutonium aerosol source term might be modified in a terrestrial environment. The five experiments described represent our best effort to use the original experimental design to study the change in the size distribution and concentration of a 238PuO2 aerosol due to coagulation with an aerosol of clay or sandy loam soil.

  7. Plutonium isotopes and 241Am in the atmosphere of Lithuania: A comparison of different source terms

    NASA Astrophysics Data System (ADS)

    Lujanienė, G.; Valiulis, D.; Byčenkienė, S.; Šakalys, J.; Povinec, P. P.

    2012-12-01

    137Cs, 241Am and Pu isotopes collected in aerosol samples during 1994-2011 were analyzed with special emphasis on better understanding of Pu and Am behavior in the atmosphere. The results from long-term measurements of 240Pu/239Pu atom ratios showed a bimodal frequency distribution with median values of 0.195 and 0.253, indicating two main sources contributing to the Pu activities at the Vilnius sampling station. The low Pu atom ratio of 0.141 could be attributed to the weapon-grade plutonium derived from the nuclear weapon test sites. The frequency of air masses arriving from the North-West and North-East correlated with the Pu atom ratio indicating the input from the sources located in these regions (the Novaya Zemlya test site, Siberian nuclear plants), while no correlation with the Chernobyl region was observed. Measurements carried out during the Fukushima accident showed a negligible impact of this source with Pu activities by four orders of magnitude lower as compared to the Chernobyl accident. The activity concentration of actinides measured in the integrated sample collected in March-April, 2011 showed a small contribution of Pu with unusual activity and atom ratios indicating the presence of the spent fuel of different origin than that of the Chernobyl accident.

  8. Processes driving short-term temporal dynamics of small mammal distribution in human-disturbed environments.

    PubMed

    Martineau, Julie; Pothier, David; Fortin, Daniel

    2016-07-01

    As the impact of anthropogenic activities intensifies worldwide, an increasing proportion of landscape is converted to early successional stages every year. To understand and anticipate the global effects of the human footprint on wildlife, assessing short-term changes in animal populations in response to disturbance events is becoming increasingly important. We used isodar habitat selection theory to reveal the consequences of timber harvesting on the ecological processes that control the distribution dynamics of a small mammal, the red-backed vole (Myodes gapperi). The abundance of voles was estimated in pairs of cut and uncut forest stands, prior to logging and up to 2 years afterwards. A week after logging, voles did not display any preference between cut and uncut stands, and a non-significant isodar indicated that their distribution was not driven by density-dependent habitat selection. One month after harvesting, however, juvenile abundance increased in cut stands, whereas the highest proportions of reproductive females were observed in uncut stands. This distribution pattern appears to result from interference competition, with juveniles moving into cuts where there was weaker competition with adults. In fact, the emergence of source-sink dynamics between uncut and cut stands, driven by interference competition, could explain why the abundance of red-backed voles became lower in cut (the sink) than uncut (the source) stands 1-2 years after logging. Our study demonstrates that the influences of density-dependent habitat selection and interference competition in shaping animal distribution can vary frequently, and for several months, following anthropogenic disturbance. PMID:27003700
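
    The isodar analysis mentioned above is, at its core, a regression of paired density estimates from the two habitats. A minimal, hypothetical sketch follows; the vole counts, the ordinary least-squares fit and the interpretation are illustrative assumptions, not the authors' data or statistical procedure.

    ```python
    import numpy as np
    from scipy import stats

    def isodar_regression(density_habitat_a, density_habitat_b):
        """Illustrative isodar analysis: regress paired abundance estimates from
        one habitat (e.g., cut stands) against those from the other (uncut
        stands).  A significant regression is interpreted as evidence of
        density-dependent habitat selection; a non-significant one (as reported
        a week after logging) suggests the distribution is not driven by
        density-dependent habitat selection."""
        res = stats.linregress(density_habitat_b, density_habitat_a)
        return {"slope": res.slope, "intercept": res.intercept,
                "r": res.rvalue, "p_value": res.pvalue}

    # Hypothetical paired vole counts (animals per grid) in cut vs uncut stands.
    uncut = np.array([4.0, 7.0, 9.0, 12.0, 15.0, 18.0])
    cut = np.array([5.0, 6.0, 10.0, 11.0, 16.0, 17.0])
    print(isodar_regression(cut, uncut))
    ```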

  9. Reference-frame-independent quantum key distribution with source flaws

    NASA Astrophysics Data System (ADS)

    Wang, Can; Sun, Shi-Hai; Ma, Xiang-Chun; Tang, Guang-Zhao; Liang, Lin-Mei

    2015-10-01

    Compared with traditional protocols of quantum key distribution (QKD), the reference-frame-independent (RFI) QKD protocol has generally proved to be very useful and practical, since its experimental implementation is simplified by not requiring the alignment of a reference frame. In most RFI-QKD systems, the encoding states are assumed to be perfect, which, however, is not the case in practical realizations. In this paper, we consider the security of RFI QKD with source flaws based on the loss-tolerant method proposed by Tamaki et al. [Phys. Rev. A 90, 052314 (2014), 10.1103/PhysRevA.90.052314]. As the six-state protocol can be realized with four states, we show that the RFI-QKD protocol can also be performed with only four encoding states instead of the six encoding states of its standard version. Furthermore, the numerical simulation results show that source flaws in the key-generation basis (Z basis) reduce the key rate but are loss tolerant, while those in the X and Y bases have almost no effect, the key rate remaining almost the same even when they are very large. Hence, our method and results will have important significance for practical experiments, especially for earth-to-satellite or chip-to-chip quantum communications.

  10. Size distributions, sources and source areas of water-soluble organic carbon in urban background air

    NASA Astrophysics Data System (ADS)

    Timonen, H.; Saarikoski, S.; Tolonen-Kivimäki, O.; Aurela, M.; Saarnio, K.; Petäjä, T.; Aalto, P. P.; Kulmala, M.; Pakkanen, T.; Hillamo, R.

    2008-09-01

    This paper presents the results of a one-year-long measurement period of the size distributions of water-soluble organic carbon (WSOC), inorganic ions and gravimetric mass of particulate matter. Measurements were done at an urban background station (SMEAR III) by using a micro-orifice uniform deposit impactor (MOUDI). The site is located in the northern European boreal region in Helsinki, Finland. The WSOC size distribution measurements were complemented by the chemical analysis of inorganic ions, organic carbon (OC) and monosaccharide anhydrides from the filter samples (particle aerodynamic diameter smaller than 1 μm, PM1). Gravimetric mass concentration varied during the MOUDI samplings between 3.4 and 55.0 μg m-3 and the WSOC concentrations were between 0.3 and 7.4 μg m-3. On average, water-soluble particulate organic matter (WSPOM, WSOC multiplied by 1.6 to convert the analyzed carbon mass to organic matter mass) comprised 25±7.7% and 7.5±3.4% of the aerosol PM1 mass and the PM1-10 mass, respectively. Inorganic ions contributed 33±12% and 28±19% of the analyzed PM1 and PM1-10 aerosol mass. Five different aerosol categories corresponding to different sources or source areas were identified (long-range transport aerosols, biomass burning aerosols from wild land fires and from small-scale wood combustion, aerosols originating from marine areas and from the clean arctic areas). Categories were identified mainly using the levoglucosan concentration level for wood combustion and air mass backward trajectories for the other groups. Clear differences in WSOC concentrations and size distributions originating from different sources or source areas were observed, although there are also many other factors which might affect the results. For example, the local conditions and sources of volatile organic compounds (VOCs) and aerosols, as well as various transformation processes, are likely to have an impact on the measured aerosol composition. Using the source categories, it was identified that

  11. Source-Manipulating Wavelength-Dependent Continuous-Variable Quantum Key Distribution with Heterodyne Detectors

    NASA Astrophysics Data System (ADS)

    Lv, Geli; Huang, Dazu; Guo, Ying

    2016-05-01

    The intensities of the signal and the local oscillator (LO) can be manipulated at the source, together with the wavelength-dependent modulation, to increase the performance of continuous-variable quantum key distribution in terms of the secret key rate and maximal transmission distance. The source-based additional noises can be tuned and stabilized to suitable values to eliminate the effect of LO fluctuations and to defeat potential attacks in imperfect quantum channels. It is proved that the secret key rate over imperfect channels can be manipulated at the source through the intensities of the signal and LO at different wavelengths, which affect the optimal signal-to-noise ratio of the heterodyne detectors through the detection efficiency and the additional electronic noise. Simulation results show that a good balance can be achieved between the secret key rate and the maximum transmission distance.

  12. Source-rock distribution model of the periadriatic region

    SciTech Connect

    Zappaterra, E. )

    1994-03-01

    The Periadriatic area is a mosaic of geological provinces comprised of spatially and temporally similar tectonic-sedimentary cycles. Tectonic evolution progressed from a Triassic-Early Jurassic (Liassic) continental rifting stage on the northern edge of the African craton, through an Early Jurassic (Middle Liassic)-Late Cretaceous/Eocene oceanic rifting stage and passive margin formation, to a final continental collision and active margin deformation stage in the Late Cretaceous/Eocene to Holocene. Extensive shallow-water carbonate platform deposits covered large parts of the Periadriatic region in the Late Triassic. Platform breakup and development of a platform-to-basin carbonate shelf morphology began in the Late Triassic and extended through the Cretaceous. On the basis of this paleogeographic evolution, the regional geology of the Periadriatic region can be expressed in terms of three main Upper Triassic-Paleogene sedimentary sequences: (A), the platform sequence; (B), the platform to basin sequence; and (C), the basin sequence. These sequences developed during the initial rifting and subsequent passive-margin formation tectonic stages. The principal Triassic source basins and most of the surface hydrocarbon indications and economically important oil fields of the Periadriatic region are associated with sequence B areas. No major hydrocarbon accumulations can be directly attributed to the Jurassic-Cretaceous epioceanic and intraplatform source rock sequences. The third episode of source bed deposition characterizes the final active margin deformation stage and is represented by Upper Tertiary organic-rich terrigenous units, mostly gas-prone. These are essentially associated with turbiditic and flysch sequences of foredeep basins and have generated the greater part of the commercial biogenic gases of the Periadriatic region. 82 refs., 11 figs., 2 tabs.

  13. Spatial distribution and migration of nonylphenol in groundwater following long-term wastewater irrigation.

    PubMed

    Wang, Shiyu; Wu, Wenyong; Liu, Fei; Yin, Shiyang; Bao, Zhe; Liu, Honglu

    2015-01-01

    Seen as a solution to water shortages, wastewater reuse for crop irrigation does, however, pose a risk owing to the potential release of organic contaminants into soil and water. The frequency of detection (FOD), concentration, and migration of nonylphenol (NP) isomers in reclaimed water (FODRW), surface water (FODSW), and groundwater (FODGW) were investigated in a long-term wastewater irrigation area in Beijing. The FODRW, FODSW and FODGW of any or all of 12 NP isomers were 66.7% to 100%, 76.9% to 100% and 13.3% to 60%, respectively. The mean (±standard deviation) NP concentrations of the reclaimed water, surface water, and groundwater (NPRW, NPSW, NPGW, respectively) were 469.4±73.4 ng L(-1), 694.6±248.7 ng L(-1) and 244.4±230.8 ng L(-1), respectively. The existence of external pollution sources during water transmission and distribution resulted in NPSW exceeding NPRW. NP distribution in groundwater was related to the duration and quantity of wastewater irrigation and to the sources of aquifer recharge, and was seen to decrease with increasing aquifer depth. A higher riverside infiltration rate nearby led to higher FODGW values. The migration rate of NP isomers was classified as high, moderate or low. PMID:25886245

  14. Spatial distribution and migration of nonylphenol in groundwater following long-term wastewater irrigation

    NASA Astrophysics Data System (ADS)

    Wang, Shiyu; Wu, Wenyong; Liu, Fei; Yin, Shiyang; Bao, Zhe; Liu, Honglu

    2015-06-01

    Seen as a solution to water shortages, wastewater reuse for crop irrigation does, however, pose a risk owing to the potential release of organic contaminants into soil and water. The frequency of detection (FOD), concentration, and migration of nonylphenol (NP) isomers in reclaimed water (FODRW), surface water (FODSW), and groundwater (FODGW) were investigated in a long-term wastewater irrigation area in Beijing. The FODRW, FODSW and FODGW of any or all of 12 NP isomers were 66.7% to 100%, 76.9% to 100% and 13.3% to 60%, respectively. The mean (± standard deviation) NP concentrations of the reclaimed water, surface water, and groundwater (NPRW, NPSW, NPGW, respectively) were 469.4 ± 73.4 ng L-1, 694.6 ± 248.7 ng L-1 and 244.4 ± 230.8 ng L-1, respectively. The existence of external pollution sources during water transmission and distribution resulted in NPSW exceeding NPRW. NP distribution in groundwater was related to the duration and quantity of wastewater irrigation and to the sources of aquifer recharge, and was seen to decrease with increasing aquifer depth. A higher riverside infiltration rate nearby led to higher FODGW values. The migration rate of NP isomers was classified as high, moderate or low.

  15. On the numerical solution of hyperbolic equations with singular source terms

    NASA Astrophysics Data System (ADS)

    Turk, Irfan; Ashyraliyev, Maksat

    2014-08-01

    A numerical study for hyperbolic equations having singular source terms is presented. Singular in the sense that within the spatial domain the source is defined by a Dirac delta function. Solutions of such problems will have discontinuities which forms an obstacle for standard numerical methods. In this paper, a fifth order flux implicit WENO method with non-uniform meshes is studied for approximate solutions of hyperbolic equations having singular source terms. Numerical examples are provided.
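
    To make the setting concrete, the sketch below solves a toy instance of such a problem, the linear advection equation u_t + a u_x = delta(x - x0), using a hat-function regularization of the Dirac delta and a first-order upwind scheme. It is only an illustration of handling a singular source term on a grid; it is deliberately much simpler than the fifth-order flux implicit WENO method with non-uniform meshes studied in the paper, and all parameter values are assumptions.

    ```python
    import numpy as np

    def discrete_delta(x, x0, dx):
        """Hat-function regularization of the Dirac delta on a grid of spacing
        dx (integrates to one)."""
        return np.maximum(0.0, 1.0 - np.abs(x - x0) / dx) / dx

    def advect_with_point_source(a=1.0, x0=0.5, L=2.0, nx=400, t_end=1.0):
        """Solve u_t + a u_x = delta(x - x0), u(x, 0) = 0, with first-order
        upwind in space and forward Euler in time (CFL = 0.9)."""
        dx = L / nx
        x = (np.arange(nx) + 0.5) * dx
        dt = 0.9 * dx / abs(a)
        u = np.zeros(nx)
        src = discrete_delta(x, x0, dx)
        t = 0.0
        while t < t_end:
            step = min(dt, t_end - t)
            u[1:] = u[1:] - a * step / dx * (u[1:] - u[:-1])  # upwind for a > 0
            u += step * src                                   # singular source term
            t += step
        return x, u

    x, u = advect_with_point_source()
    # Expect a step-like profile of height ~1/a downstream of x0, smeared by
    # numerical diffusion; the discontinuity at x0 is what challenges standard schemes.
    ```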

  16. Spatial Distribution of Soil Fauna In Long Term No Tillage

    NASA Astrophysics Data System (ADS)

    Corbo, J. Z. F.; Vieira, S. R.; Siqueira, G. M.

    2012-04-01

    The soil is a complex system constituted by living organisms and organic and mineral particles, whose components define its physical, chemical and biological properties. Soil fauna plays an important role in the soil and may both reflect and affect its functionality. These organisms' populations may be influenced by management practices, fertilization, liming and porosity, among other factors. Thus, this study aimed to determine the spatial variability of soil fauna in a consolidated no-tillage system. The experimental area is located at Instituto Agronômico in Campinas (São Paulo, Brazil). Sampling was conducted in a Rhodic Eutrudox under a no-tillage system; 302 points distributed over a 3.2 hectare area on a regular 10.00 m x 10.00 m grid were sampled. The soil fauna was sampled with the pitfall-trap method, and traps remained in the area for seven days. Data were analyzed using descriptive statistics to determine the main statistical moments (mean, variance, coefficient of variation, standard deviation, skewness and kurtosis). Geostatistical tools were used to determine the spatial variability of the attributes using the experimental semivariogram. For the biodiversity analysis, Shannon and Pielou indexes and richness were calculated for each sample. Geostatistics has proven to be a great tool for mapping the spatial variability of groups of the epigeal soil fauna. The family Formicidae proved to be the most abundant and dominant in the study area. The descriptive statistics showed that all studied attributes of the epigeal soil fauna groups followed a lognormal frequency distribution. The exponential model was the best suited to the data, both for the epigeal soil fauna groups (Acari, Araneae, Coleoptera, Formicidae and Coleoptera larvae) and for the biodiversity indexes. The sampling scheme (10.00 m x 10.00 m) was not sufficient to detect the spatial
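
    The diversity indexes and the experimental semivariogram used in this kind of analysis are standard quantities; a minimal sketch of both is given below. The pitfall-trap counts and grid are hypothetical, and the semivariogram estimator shown is the classical unweighted one rather than any specific software implementation.

    ```python
    import numpy as np

    def shannon_pielou(counts):
        """Shannon diversity H' = -sum p_i ln p_i and Pielou evenness
        J = H'/ln S for one pitfall-trap sample (counts per taxonomic group)."""
        counts = np.asarray(counts, dtype=float)
        counts = counts[counts > 0]
        p = counts / counts.sum()
        H = -np.sum(p * np.log(p))
        J = H / np.log(len(counts)) if len(counts) > 1 else 0.0
        return H, J

    def experimental_semivariogram(coords, values, lags, tol):
        """Classical experimental semivariogram
        gamma(h) = 1/(2 N(h)) * sum (z_i - z_j)^2 over pairs separated by ~h."""
        coords, values = np.asarray(coords, float), np.asarray(values, float)
        dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        gamma = []
        for h in lags:
            i, j = np.where((np.abs(dists - h) <= tol) & (dists > 0))
            keep = i < j                              # count each pair once
            diffs = values[i[keep]] - values[j[keep]]
            gamma.append(0.5 * np.mean(diffs ** 2) if diffs.size else np.nan)
        return np.array(gamma)

    # Hypothetical 10 m x 10 m grid of Formicidae counts.
    rng = np.random.default_rng(1)
    xy = np.array([(i * 10.0, j * 10.0) for i in range(6) for j in range(6)])
    z = rng.poisson(8, size=len(xy)).astype(float)
    print(shannon_pielou([120, 30, 12, 5, 2]))
    print(experimental_semivariogram(xy, z, lags=[10, 20, 30], tol=5))
    ```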

  17. 78 FR 41398 - SourceGas Distribution LLC; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-10

    ... Energy Regulatory Commission SourceGas Distribution LLC; Notice of Filing Take notice that on June 27, 2013, SourceGas Distribution LLC (SourceGas) filed a Rate Election and revised Statement of Operating... and 284.224). SourceGas proposes to revise its fuel reimbursement quantity percentage to reflect...

  18. Correlating Pluto's Albedo Distribution to Long Term Insolation Patterns

    NASA Astrophysics Data System (ADS)

    Earle, Alissa M.; Binzel, Richard P.; Stern, S. Alan; Young, Leslie A.; Buratti, Bonnie J.; Ennico, Kimberly; Grundy, Will M.; Olkin, Catherine B.; Spencer, John R.; Weaver, Hal A.

    2015-11-01

    NASA's New Horizons' reconnaissance of the Pluto system has revealed striking albedo contrasts from polar to equatorial latitudes on Pluto, as well as sharp boundaries for longitudinal variations. These contrasts suggest Pluto undergoes dynamic evolution that drives the redistribution of volatiles. Using the New Horizons results as a template, in this talk we will explore the volatile migration process driven seasonally on Pluto considering multiple timescales. These timescales include the current orbit (248 years) as well as the timescales for obliquity precession (amplitude of 23 degrees over 3 Myrs) and regression of the orbital longitude of perihelion (3.7 Myrs). We will build upon the long-term insolation history model described by Earle and Binzel (2015, Icarus 250, 405-412) with the goal of identifying the most critical timescales that drive the features observed in Pluto’s current post-perihelion epoch. This work was supported by the NASA New Horizons Project.

  19. Source-term reevaluation for US commercial nuclear power reactors: a status report

    SciTech Connect

    Herzenberg, C.L.; Ball, J.R.; Ramaswami, D.

    1984-12-01

    Only results that had been discussed publicly, had been published in the open literature, or were available in preliminary reports as of September 30, 1984, are included here. More than 20 organizations are participating in source-term programs, which have been undertaken to examine severe accident phenomena in light-water power reactors (including the chemical and physical behavior of fission products under accident conditions), update and reevaluate source terms, and resolve differences between predictions and observations of radiation releases and related phenomena. Results from these source-term activities have been documented in over 100 publications to date.

  20. Source term model evaluations for the low-level waste facility performance assessment

    SciTech Connect

    Yim, M.S.; Su, S.I.

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  1. Distribution of astronomical sources in the second Equatorial Infrared Catalogue

    NASA Technical Reports Server (NTRS)

    Nagy, T. A.; Sweeney, L. H.; Lesh, J. R.; Mead, J. M.; Maran, S. P.; Heinsheimer, T. F.; Yates, F. F.

    1979-01-01

    Measurements of infrared (2.7-micron) source positions and flux densities have been derived based on an additional 60.6 hours of satellite observations beyond those considered in the preparation of the Equatorial Infrared Catalogue No. 1 (EIC-1). These data have been processed together with the EIC-1 data to produce EIC-2. The new catalog differs from EIC-1 as follows: there are 1278 sources; there is a larger percentage of unidentified sources; there are increased numbers of sources identified with Two-Micron Sky Survey sources, AFGL sources, AGK3 stars and SAO stars.

  2. Source terms: an investigation of uncertainties, magnitudes, and recommendations for research. [PWR; BWR

    SciTech Connect

    Levine, S.; Kaiser, G. D.; Arcieri, W. C.; Firstenberg, H.; Fulford, P. J.; Lam, P. S.; Ritzman, R. L.; Schmidt, E. R.

    1982-03-01

    The purpose of this document is to assess the state of knowledge and expert opinions that exist about fission product source terms from potential nuclear power plant accidents. This is so that recommendations can be made for research and analyses which have the potential to reduce the uncertainties in these estimated source terms and to derive improved methods for predicting their magnitudes. The main reasons for writing this report are to indicate the major uncertainties involved in defining realistic source terms that could arise from severe reactor accidents, to determine which factors would have the most significant impact on public risks and emergency planning, and to suggest research and analyses that could result in the reduction of these uncertainties. Source terms used in the conventional consequence calculations in the licensing process are not explicitly addressed.

  3. Accident source terms for boiling water reactors with high burnup cores.

    SciTech Connect

    Gauntt, Randall O.; Powers, Dana Auburn; Leonard, Mark Thomas

    2007-11-01

    The primary objective of this report is to provide the technical basis for development of recommendations for updates to the NUREG-1465 Source Term for BWRs that will extend its applicability to accidents involving high burnup (HBU) cores. However, a secondary objective is to re-examine the fundamental characteristics of the prescription for fission product release to containment described by NUREG-1465. This secondary objective is motivated by an interest in understanding the extent to which research into the release and behaviors of radionuclides under accident conditions has altered best-estimate calculations of the integral response of BWRs to severe core damage sequences and the resulting radiological source terms to containment. This report, therefore, documents specific results of fission product source term analyses that will form the basis for the HBU supplement to NUREG-1465. However, commentary is also provided on observed differences between the composite results of the source term calculations performed here and those reflected in NUREG-1465 itself.

  4. Complex cell geometry and sources distribution model for Monte Carlo single cell dosimetry with iodine 125 radioimmunotherapy

    NASA Astrophysics Data System (ADS)

    Arnaud, F. X.; Paillas, S.; Pouget, J. P.; Incerti, S.; Bardiès, M.; Bordage, M. C.

    2016-01-01

    In cellular dosimetry, common assumptions consider concentric spheres for the nucleus and cell and a uniform radionuclide distribution. These approximations do not reflect reality, especially in the case of radioimmunotherapy with Auger emitters, where very short-range electrons induce highly localised energy deposition. A realistic cellular dosimetric model was generated to account for the real geometry and activity distribution, for non-internalizing and internalizing antibodies (mAbs) labelled with the Auger emitter I-125. The impact of geometry was studied by comparing the real geometry obtained from confocal microscopy for both cell and nucleus with volume-equivalent concentric spheres. Non-uniform and uniform source distributions were considered for each mAb distribution. Comparisons in terms of mean deposited energy per decay, energy deposition spectra and energy-volume histograms were calculated using Geant4. We conclude that realistic models are needed, especially when energy deposition is highly non-homogeneous due to the source distribution.

  5. Transverse distribution of beam current oscillations of a 14 GHz electron cyclotron resonance ion source.

    PubMed

    Tarvainen, O; Toivanen, V; Komppula, J; Kalvas, T; Koivisto, H

    2014-02-01

    The temporal stability of oxygen ion beams has been studied with the 14 GHz A-ECR at JYFL (University of Jyvaskyla, Department of Physics). A sector Faraday cup was employed to measure the distribution of the beam current oscillations across the beam profile. The spatial and temporal characteristics of two different oscillation "modes" often observed with the JYFL 14 GHz ECRIS are discussed. It was observed that the low frequency oscillations below 200 Hz are distributed almost uniformly. In the high frequency oscillation "mode," with frequencies >300 Hz, the core of the beam, carrying most of the current, oscillates with a smaller amplitude than the peripheral parts of the beam. The results help to explain differences observed between the two oscillation modes in terms of the transport efficiency through the JYFL K-130 cyclotron. The dependence of the oscillation pattern on ion source parameters is a strong indication that the mechanisms driving the fluctuations are plasma effects. PMID:24593488

  6. Spatial distribution and source apportionment of PCBs in sediments around İzmit industrial complexes, Turkey.

    PubMed

    Gedik, Kadir; Demircioğlu, Filiz; Imamoğlu, Ipek

    2010-11-01

    The spatial distribution, degree of pollution and major sources of PCBs were evaluated in surficial sediments within the heavily urbanized and industrialized İzmit Bay and its main freshwater inputs. ΣPCB concentrations range from 2.90 to 85.4 ng g(-1) in marine sediments and from ND to 47.7 ng g(-1) in freshwater sediments. Results suggest that high concentrations of ΣPCBs were localized around a chlor-alkali plant and an industry that handles bulk liquid, dry and drummed chemicals, and petroleum products in the Bay. Using a chemical mass balance receptor model (CMB), major sources of PCBs in the region were investigated. The CMB model identified Aroclor 1254 and 1260 to be the major PCB sources in marine sediments and the less chlorinated Aroclor 1248 and 1242 as the major PCB sources in freshwater sediments. The potential sources for the PCBs were briefly discussed in terms of their use in various industrial applications. PMID:20889182
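
    In its simplest form, a chemical mass balance receptor model expresses the measured congener profile as a non-negative combination of source (here, Aroclor) profiles and solves for the contributions by least squares. The sketch below shows that basic idea only; the congener profiles and the unweighted non-negative least-squares fit are illustrative assumptions, not the study's data or the full effective-variance CMB formulation.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def cmb_source_apportionment(sediment_profile, source_profiles):
        """Find non-negative source contributions s minimizing ||F s - c||,
        where the columns of F are the source congener profiles and c is the
        measured sediment congener profile."""
        F = np.column_stack(list(source_profiles.values()))
        s, residual = nnls(F, np.asarray(sediment_profile, dtype=float))
        total = s.sum()
        shares = {name: s[k] / total for k, name in enumerate(source_profiles)}
        return shares, residual

    # Hypothetical 4-congener profiles (mass fractions) for two Aroclor mixtures.
    sources = {"Aroclor 1254": np.array([0.10, 0.30, 0.40, 0.20]),
               "Aroclor 1260": np.array([0.05, 0.15, 0.35, 0.45])}
    sediment = 0.7 * sources["Aroclor 1254"] + 0.3 * sources["Aroclor 1260"]
    print(cmb_source_apportionment(sediment, sources))   # recovers ~70% / ~30%
    ```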

  7. 78 FR 56685 - SourceGas Distribution LLC; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-13

    ... Energy Regulatory Commission SourceGas Distribution LLC; Notice of Application Take notice that on August 27, 2013, SourceGas Distribution LLC (SourceGas), 600 12th Street, Suite 300, Golden, Colorado 80401, filed in Docket No. CP13-540-000 an application pursuant to section 7(f) of the Natural Gas Act...

  8. 78 FR 6318 - SourceGas Distribution LLC; Notice of Petition for Rate Approval

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-30

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission SourceGas Distribution LLC; Notice of Petition for Rate Approval Take notice that on January 15, 2013, SourceGas Distribution LLC (SourceGas) filed a rate election pursuant...

  9. 77 FR 28374 - SourceGas Distribution LLC; Notice of Compliance Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-14

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission SourceGas Distribution LLC; Notice of Compliance Filing Take notice that on April 30, 2012, SourceGas Distribution LLC (SourceGas) filed a revised Statement of Operating...

  10. A second order operator splitting method for Allen-Cahn type equations with nonlinear source terms

    NASA Astrophysics Data System (ADS)

    Lee, Hyun Geun; Lee, June-Yub

    2015-08-01

    Allen-Cahn (AC) type equations with nonlinear source terms have been applied to a wide range of problems, for example, the vector-valued AC equation for phase separation and the phase-field equation for dendritic crystal growth. In contrast to the well developed first and second order methods for the AC equation, not many second order methods are suggested for the AC type equations with nonlinear source terms due to the difficulties in dealing with the nonlinear source term numerically. In this paper, we propose a simple and stable second order operator splitting method. A core idea of the method is to decompose the original equation into three subequations with the free-energy evolution term, the heat evolution term, and a nonlinear source term, respectively. It is important to combine these three subequations in proper order to achieve the second order accuracy and stability. We propose a method with a half-time free-energy evolution solver, a half-time heat evolution solver, a full-time midpoint solver for the nonlinear source term, and a half-time heat evolution solver followed by a final half-time free-energy evolution solver. We numerically demonstrate the second order accuracy of the new numerical method through the simulations of the phase separation and the dendritic crystal growth.
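
    A minimal 1-D sketch of the splitting described above follows: a half-step exact solve of the free-energy term, a half-step spectral heat solve, a full-step midpoint rule for the nonlinear source, then the heat and free-energy half-steps in reverse order. The domain, parameters, polynomial free energy F(u) = (u^2 - 1)^2/4 and the linear test source are illustrative assumptions, not the configurations studied in the paper.

    ```python
    import numpy as np

    def solve_ac_with_source(u0, source, eps=0.05, dt=1e-3, steps=200, L=2 * np.pi):
        """Operator splitting for an Allen-Cahn type equation
            u_t = (u - u^3)/eps^2 + u_xx + s(u)
        on a periodic 1-D domain: half-step free energy (exact), half-step heat
        (spectral), full-step midpoint rule for the nonlinear source s(u),
        half-step heat, half-step free energy."""
        n = u0.size
        k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
        heat_half = np.exp(-(k ** 2) * dt / 2)          # exact heat half-step in Fourier space
        decay = np.exp(-2 * (dt / 2) / eps ** 2)        # used by the free-energy half-step

        def free_energy_half(u):
            # exact solution of u_t = (u - u^3)/eps^2 over dt/2
            return u / np.sqrt(decay + u ** 2 * (1 - decay))

        def heat_half_step(u):
            return np.real(np.fft.ifft(heat_half * np.fft.fft(u)))

        u = u0.copy()
        for _ in range(steps):
            u = free_energy_half(u)
            u = heat_half_step(u)
            u = u + dt * source(u + 0.5 * dt * source(u))   # midpoint rule for s(u)
            u = heat_half_step(u)
            u = free_energy_half(u)
        return u

    # Illustrative run: a periodic two-interface profile with a weak linear source.
    x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
    u0 = np.tanh((np.pi / 2 - np.abs(x - np.pi)) / 0.1)
    u_final = solve_ac_with_source(u0, source=lambda u: 0.1 * (1 - u))
    ```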

  11. Long Term Leaching of Chlorinated Solvents from Source Zones in Low Permeability Settings with Fractures

    NASA Astrophysics Data System (ADS)

    Bjerg, P. L.; Chambon, J.; Troldborg, M.; Binning, P. J.; Broholm, M. M.; Lemming, G.; Damgaard, I.

    2008-12-01

    Groundwater contamination by chlorinated solvents, such as perchloroethylene (PCE), often occurs via leaching from complex sources located in low permeability sediments such as clayey tills overlying aquifers. Clayey tills are mostly fractured, and contamination migrating through the fractures spreads to the low permeability matrix by diffusion. This results in a long term source of contamination due to back-diffusion. Leaching from such sources is further complicated by microbial degradation under anaerobic conditions to sequentially form the daughter products trichloroethylene, cis-dichloroethylene (cis-DCE), vinyl chloride (VC) and ethene. This process can be enhanced by addition of electron donors and/or bioaugmentation and is termed Enhanced Reductive Dechlorination (ERD). This work aims to improve our understanding of the physical, chemical and microbial processes governing source behaviour under natural and enhanced conditions. That understanding is applied to risk assessment, and to determine the relationship and time frames of source clean up and plume response. To meet that aim, field and laboratory observations are coupled to state of the art models incorporating new insights of contaminant behaviour. The long term leaching of chlorinated ethenes from clay aquitards is currently being monitored at a number of Danish sites. The observed data is simulated using a coupled fracture flow and clay matrix diffusion model. Sequential degradation is represented by modified Monod kinetics accounting for competitive inhibition between the chlorinated ethenes. The model is constructed using Comsol Multiphysics, a generic finite- element partial differential equation solver. The model is applied at two well characterised field sites with respect to hydrogeology, fracture network, contaminant distribution and microbial processes (lab and field experiments). At the study sites (Sortebrovej and Vadsbyvej), the source areas are situated in a clayey till with fractures

  12. Autonomous distributed temperature sensing for long-term heated applications in remote areas

    NASA Astrophysics Data System (ADS)

    Kurth, A.-M.; Dawes, N.; Selker, J.; Schirmer, M.

    2012-10-01

    Distributed Temperature Sensing (DTS) is a fiber-optical method enabling simultaneous temperature measurements over long distances. Electrical resistance heating of the metallic components of the fiber-optic cable provides information on the thermal characteristics of the cable's environment, providing valuable insight into processes occurring in the surrounding medium, such as groundwater-surface water interactions, dam stability or soil moisture. Until now, heated applications required direct handling of the DTS instrument by a researcher, rendering long-term investigations in remote areas impractical due to the often difficult and time-consuming access to the field site. Remote-control and automation of the DTS instrument and heating processes, however, resolve the issue with difficult access. The data can also be remotely accessed and stored on a central database. The power supply can be grid-independent, although significant infrastructure investment is required here due to high power consumption during heated applications. Solar energy must be sufficient even in worst case scenarios, e.g. during long periods of intense cloud cover, to prevent system failure due to energy shortage. In combination with storage batteries and a low heating frequency, e.g. once per day or once per week (depending on the season and the solar radiation on site), issues of high power consumption may be resolved. Safety regulations dictate adequate shielding and ground-fault protection, to safeguard animals and humans from electricity and laser sources. In this paper the autonomous DTS system is presented to allow research with heated applications of DTS in remote areas for long-term investigations of temperature distributions in the environment.

  13. Autonomous distributed temperature sensing for long-term heated applications in remote areas

    NASA Astrophysics Data System (ADS)

    Kurth, A.-M.; Dawes, N.; Selker, J.; Schirmer, M.

    2013-02-01

    Distributed temperature sensing (DTS) is a fiber-optical method enabling simultaneous temperature measurements over long distances. Electrical resistance heating of the metallic components of the fiber-optic cable provides information on the thermal characteristics of the cable's environment, providing valuable insight into processes occurring in the surrounding medium, such as groundwater-surface water interactions, dam stability or soil moisture. Until now, heated applications required direct handling of the DTS instrument by a researcher, rendering long-term investigations in remote areas impractical due to the often difficult and time-consuming access to the field site. Remote control and automation of the DTS instrument and heating processes, however, resolve the issue with difficult access. The data can also be remotely accessed and stored on a central database. The power supply can be grid independent, although significant infrastructure investment is required here due to high power consumption during heated applications. Solar energy must be sufficient even in worst case scenarios, e.g. during long periods of intense cloud cover, to prevent system failure due to energy shortage. In combination with storage batteries and a low heating frequency, e.g. once per day or once per week (depending on the season and the solar radiation on site), issues of high power consumption may be resolved. Safety regulations dictate adequate shielding and ground-fault protection, to safeguard animals and humans from electricity and laser sources. In this paper the autonomous DTS system is presented to allow research with heated applications of DTS in remote areas for long-term investigations of temperature distributions in the environment.

  14. Shielding analysis of proton therapy accelerators: a demonstration using Monte Carlo-generated source terms and attenuation lengths.

    PubMed

    Lai, Bo-Lun; Sheu, Rong-Jiun; Lin, Uei-Tyng

    2015-05-01

    Monte Carlo simulations are generally considered the most accurate method for complex accelerator shielding analysis. Simplified models based on the point-source line-of-sight approximation are often preferable in practice because they are intuitive and easy to use. A set of shielding data, including source terms and attenuation lengths for several common targets (iron, graphite, tissue, and copper) and shielding materials (concrete, iron, and lead), was generated by performing Monte Carlo simulations for 100-300 MeV protons. Possible applications and a proper use of the data set were demonstrated through a practical case study, in which shielding analysis on a typical proton treatment room was conducted. A thorough and consistent comparison between the predictions of our point-source line-of-sight model and those obtained by Monte Carlo simulations for a 360° dose distribution around the room perimeter showed that the data set can yield fairly accurate or conservative estimates for the transmitted doses, except for those near the maze exit. In addition, this study demonstrated that appropriate coupling between the generated source term and empirical formulae for radiation streaming can be used to predict a reasonable dose distribution along the maze. This case study proved the effectiveness and advantage of applying the data set to a quick shielding design and dose evaluation for proton therapy accelerators. PMID:25811254
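
    A minimal Python sketch of the point-source line-of-sight estimate described above: the source term is attenuated exponentially through the slant shield thickness and divided by the square of the distance. The functional form is the standard one for this class of model; the numerical values below are placeholders, not data from the paper.

```python
import math

def transmitted_dose_rate(source_term, r, d, attenuation_length):
    """Point-source line-of-sight estimate of the dose behind a shield.

    source_term        -- unshielded source term H0 at the emission angle of interest
    r                  -- distance from the target to the dose point (m)
    d                  -- slant thickness of shielding traversed (m)
    attenuation_length -- effective attenuation length of the shield material (m)
    """
    return source_term * math.exp(-d / attenuation_length) / r**2

# Hypothetical numbers, for illustration only (not values from the paper):
H0 = 2.0e-15  # source term at 90 degrees, Sv*m^2 per incident proton
print(transmitted_dose_rate(H0, r=5.0, d=2.0, attenuation_length=0.5))
```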

  15. Chemotaxis Increases the Residence Time Distribution of Bacteria in Granular Media Containing Distributed Contaminant Sources

    NASA Astrophysics Data System (ADS)

    Adadevoh, J.; Triolo, S.; Ramsburg, C. A.; Ford, R.

    2015-12-01

    The use of chemotactic bacteria in bioremediation has the potential to increase access to, and biotransformation of, contaminant mass within the subsurface environment. This laboratory-scale study aimed to understand and quantify the influence of chemotaxis on residence times of pollutant-degrading bacteria within homogeneous treatment zones. Focus was placed on a continuous flow sand-packed column system in which a uniform distribution of naphthalene crystals created distributed sources of dissolved phase contaminant. A 10 mL pulse of Pseudomonas putida G7, which is chemotactic to naphthalene, and Pseudomonas putida G7 Y1, a non-chemotactic mutant strain, was simultaneously introduced into the sand-packed column at equal concentrations. Breakthrough curves obtained for the bacteria from column experiments conducted with and without naphthalene were used to quantify the effect of chemotaxis on transport parameters. In the presence of the chemoattractant, longitudinal dispersivity of PpG7 increased by a factor of 3 and percent recovery decreased from 21% to 12%. The results imply that pore-scale chemotaxis responses are evident at an interstitial fluid velocity of 1.7 m/d, which is within the range of typical groundwater flow. Within the context of bioremediation, chemotaxis may work to enhance bacterial residence times in zones of contamination, thereby improving treatment.
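
    For readers unfamiliar with how transport parameters are extracted from breakthrough curves, the sketch below shows a common method-of-moments estimate of mean velocity and longitudinal dispersivity from a pulse breakthrough curve. It is an assumed, generic procedure for illustration, not necessarily the fitting approach used in this study, and the synthetic data are arbitrary.

```python
import numpy as np

def moments_from_btc(t, c, column_length):
    """Method-of-moments estimates of velocity and longitudinal dispersivity
    from a pulse breakthrough curve (concentration c sampled at times t)."""
    m0 = np.trapz(c, t)                             # zeroth temporal moment
    t_mean = np.trapz(t * c, t) / m0                # mean residence time
    var = np.trapz((t - t_mean) ** 2 * c, t) / m0   # variance of residence time
    v = column_length / t_mean                      # mean pore-water velocity
    alpha = column_length * var / (2.0 * t_mean ** 2)  # longitudinal dispersivity
    return v, alpha

# Illustrative synthetic data (not the experimental curves from this study):
t = np.linspace(0.1, 48.0, 400)                     # hours
c = np.exp(-(t - 14.0) ** 2 / (2.0 * 3.0 ** 2))     # arbitrary Gaussian-like pulse
print(moments_from_btc(t, c, column_length=0.15))   # column length in metres
```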

  16. Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2004-01-01

    A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.

  17. WATER QUALITY IN SOURCE WATER, TREATMENT, AND DISTRIBUTION SYSTEMS

    EPA Science Inventory

    Most drinking water utilities practice the multiple-barrier concept as the guiding principle for providing safe water. This chapter discusses multiple barriers as they relate to the basic criteria for selecting and protecting source waters, including known and potential sources ...

  18. Distribution, sources and health risk assessment of mercury in kindergarten dust

    NASA Astrophysics Data System (ADS)

    Sun, Guangyi; Li, Zhonggen; Bi, Xiangyang; Chen, Yupeng; Lu, Shuangfang; Yuan, Xin

    2013-07-01

    Mercury (Hg) contamination in urban areas is a pressing issue in environmental research. In this study, the distribution, sources and health risk of Hg in dust from 69 kindergartens in Wuhan, China, were investigated. In comparison with most other cities, the concentrations of total mercury (THg) and methylmercury (MeHg) were significantly elevated, ranging from 0.15 to 10.59 mg kg-1 and from 0.64 to 3.88 μg kg-1, respectively. Among the five different urban areas, the educational area had the highest concentrations of THg and MeHg. GIS mapping was used to identify hot-spot areas and assess the potential pollution sources of Hg. The emissions of coal-power plants and coking plants were the main sources of THg in the dust, whereas the contributions of municipal solid waste (MSW) landfills and iron and steel smelting related industries were not significant. However, the emission of MSW landfills was considered to be an important source of MeHg in the studied area. The health risk assessment indicated that Hg contamination of the kindergarten dust poses a high adverse health risk to children living in the educational area (Hazard index (HI) = 6.89).

  19. Observation-based source terms in the third-generation wave model WAVEWATCH

    NASA Astrophysics Data System (ADS)

    Zieger, Stefan; Babanin, Alexander V.; Erick Rogers, W.; Young, Ian R.

    2015-12-01

    Measurements collected during the AUSWEX field campaign, at Lake George (Australia), resulted in new insights into the processes of wind wave interaction and whitecapping dissipation, and consequently new parameterizations of the input and dissipation source terms. The new nonlinear wind input term developed accounts for dependence of the growth on wave steepness, airflow separation, and for negative growth rate under adverse winds. The new dissipation terms feature the inherent breaking term, a cumulative dissipation term and a term due to production of turbulence by waves, which is particularly relevant for decaying seas and for swell. The latter is consistent with the observed decay rate of ocean swell. This paper describes these source terms implemented in WAVEWATCH III® and evaluates the performance against existing source terms in academic duration-limited tests, against buoy measurements for windsea-dominated conditions, under conditions of extreme wind forcing (Hurricane Katrina), and against altimeter data in global hindcasts. Results show agreement by means of growth curves as well as integral and spectral parameters in the simulations and hindcast.

  20. Use of open source distribution for a machine tool controller

    NASA Astrophysics Data System (ADS)

    Shackleford, William P.; Proctor, Frederick M.

    2001-02-01

    In recent years a growing number of government and university labs, non-profit organizations and even a few for-profit corporations have found that making their source code public is good for both developers and users. In machine tool control, a growing number of users are demanding that the controllers they buy be 'open architecture', which would allow third parties and end-users at least limited ability to modify, extend or replace the components of that controller. This paper examines the advantages and dangers of going one step further and providing 'open source' controllers by relating the experiences of users and developers of the Enhanced Machine Controller. We also examine some implications for the development of standards for open-architecture but closed-source controllers. Some of the questions we hope to answer include: How can the quality be maintained after the source code has been modified? Can the code be trusted to run on expensive machines and parts, or when the safety of the operator is an issue? Can 'open-architecture' but closed-source controllers ever achieve the level of flexibility or extensibility that open-source controllers can?

  1. A study of numerical methods for hyperbolic conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Leveque, R. J.; Yee, H. C.

    1988-01-01

    The proper modeling of nonequilibrium gas dynamics is required in certain regimes of hypersonic flow. For inviscid flow this gives a system of conservation laws coupled with source terms representing the chemistry. Often a wide range of time scales is present in the problem, leading to numerical difficulties as in stiff systems of ordinary differential equations. Stability can be achieved by using implicit methods, but other numerical difficulties are observed. The behavior of typical numerical methods on a simple advection equation with a parameter-dependent source term was studied. Two approaches to incorporate the source term were utilized: MacCormack type predictor-corrector methods with flux limiters, and splitting methods in which the fluid dynamics and chemistry are handled in separate steps. Various comparisons over a wide range of parameter values were made. In the stiff case where the solution contains discontinuities, incorrect numerical propagation speeds are observed with all of the methods considered. This phenomenon is studied and explained.
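
    A minimal Python sketch of the splitting approach discussed above: each time step advances the advection part with a first-order upwind scheme and then updates the stiff source part with backward Euler solved by Newton iteration. The cubic source term and all parameter values are illustrative choices for a model problem of this kind, not a reproduction of the paper's test cases.

```python
import numpy as np

def step_splitting(u, a, dt, dx, mu):
    """One split step for u_t + a u_x = -mu*u*(u-1)*(u-0.5):
    upwind advection followed by an implicit (backward Euler) source update."""
    # advection sub-step (first-order upwind, a > 0, periodic boundary)
    u = u - a * dt / dx * (u - np.roll(u, 1))
    # stiff source sub-step: solve w + dt*mu*w*(w-1)*(w-0.5) = u by Newton iteration
    w = u.copy()
    for _ in range(20):
        f = w + dt * mu * w * (w - 1.0) * (w - 0.5) - u
        fp = 1.0 + dt * mu * (3.0 * w**2 - 3.0 * w + 0.5)
        w = w - f / fp
    return w

# Illustrative setup: a step profile advected to the right with a stiff reaction term
nx, a, mu = 200, 1.0, 1000.0
x = np.linspace(0.0, 1.0, nx, endpoint=False)
dx = x[1] - x[0]
dt = 0.5 * dx / a                       # CFL number 0.5
u = np.where(x < 0.3, 1.0, 0.0)
for _ in range(100):
    u = step_splitting(u, a, dt, dx, mu)
```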

  2. Uncertainties associated with the definition of a hydrologic source term for the Nevada Test Site

    SciTech Connect

    Smith, D.K.; Esser, B.K.; Thompson, J.L.

    1995-05-01

    The U.S. Department of Energy, Nevada Operations Office (DOE/NV) Environmental Restoration Division is seeking to evaluate groundwater contamination resulting from 30 years of underground nuclear testing at the Nevada Test Site (NTS). This evaluation requires knowledge about what radioactive materials are in the groundwater and how they are transported through the underground environment. This information coupled with models of groundwater flow (flow paths and flow rates) will enable predictions of the arrival of each radionuclide at a selected receptor site. Risk assessment models will then be used to calculate the expected environmental and human doses. The accuracy of our predictions depends on the validity of our hydrologic and risk assessment models and on the quality of the data for radionuclide concentrations in ground water at each underground nuclear test site. This paper summarizes what we currently know about radioactive material in NTS groundwater and suggests how we can best use our limited knowledge to proceed with initial modeling efforts. The amount of a radionuclide available for transport in groundwater at the site of an underground nuclear test is called the hydrologic source term. The radiologic source term is the total amount of residual radionuclides remaining after an underground nuclear test. The hydrologic source term is smaller than the radiologic source term because some or most of the radionuclide residual cannot be transported by groundwater. The radiologic source term has been determined for each of the underground nuclear tests fired at the NTS; however, the hydrologic source term has been estimated from measurements at only a few sites.

  3. Acetone in the atmosphere: Distribution, sources, and sinks

    NASA Technical Reports Server (NTRS)

    Singh, H. B.; O'Hara, D.; Herlth, D.; Sachse, W.; Blake, D. R.; Bradshaw, J. D.; Kanakidou, M.; Crutzen, P. J.

    1994-01-01

    Acetone (CH3COCH3) was found to be the dominant nonmethane organic species present in the atmosphere sampled primarily over eastern Canada (0-6 km, 35 deg-65 deg N) during ABLE3B (July to August 1990). A concentration range of 357 to 2310 ppt (= 10^-12 v/v) with a mean value of 1140 +/- 413 ppt was measured. Under extremely clean conditions, generally involving Arctic flows, lowest (background) mixing ratios of 550 +/- 100 ppt were present in much of the troposphere studied. Correlations between atmospheric mixing ratios of acetone and select species such as C2H2, CO, C3H8, C2Cl4 and isoprene provided important clues to its possible sources and to the causes of its atmospheric variability. Biomass burning as a source of acetone has been identified for the first time. By using atmospheric data and three-dimensional photochemical models, a global acetone source of 40-60 Tg (= 10^12 g)/yr is estimated to be present. Secondary formation from the atmospheric oxidation of precursor hydrocarbons (principally propane, isobutane, and isobutene) provides the single largest source (51%). The remainder is attributable to biomass burning (26%), direct biogenic emissions (21%), and primary anthropogenic emissions (3%). Atmospheric removal of acetone is estimated to be due to photolysis (64%), reaction with OH radicals (24%), and deposition (12%). Model calculations also suggest that acetone photolysis contributed significantly to PAN formation (100-200 ppt) in the middle and upper troposphere of the sampled region and may be important globally. While the source-sink equation appears to be roughly balanced, much more atmospheric and source data, especially from the southern hemisphere, are needed to reliably quantify the atmospheric budget of acetone.

  4. 77 FR 10490 - SourceGas Distribution LLC; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-22

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission SourceGas Distribution LLC; Notice of Filing Take notice that on February 14, 2012, SourceGas Distribution LLC submitted a revised baseline filing of their Statement of...

  5. 16 CFR Table 4 to Part 1512 - Relative Energy Distribution of Sources

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Relative Energy Distribution of Sources 4... SUBSTANCES ACT REGULATIONS REQUIREMENTS FOR BICYCLES Pt. 1512, Table 4 Table 4 to Part 1512—Relative Energy Distribution of Sources Wave length (nanometers) Relative energy 380 9.79 390 12.09 400 14.71 410 17.68 420...

  6. 16 CFR Table 4 to Part 1512 - Relative Energy Distribution of Sources

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Relative Energy Distribution of Sources 4... SUBSTANCES ACT REGULATIONS REQUIREMENTS FOR BICYCLES Pt. 1512, Table 4 Table 4 to Part 1512—Relative Energy Distribution of Sources Wave length (nanometers) Relative energy 380 9.79 390 12.09 400 14.71 410 17.68 420...

  7. 16 CFR Table 4 to Part 1512 - Relative Energy Distribution of Sources

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 2 2013-01-01 2013-01-01 false Relative Energy Distribution of Sources 4... SUBSTANCES ACT REGULATIONS REQUIREMENTS FOR BICYCLES Pt. 1512, Table 4 Table 4 to Part 1512—Relative Energy Distribution of Sources Wave length (nanometers) Relative energy 380 9.79 390 12.09 400 14.71 410 17.68 420...

  8. 16 CFR Table 4 to Part 1512 - Relative Energy Distribution of Sources

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 2 2011-01-01 2011-01-01 false Relative Energy Distribution of Sources 4... SUBSTANCES ACT REGULATIONS REQUIREMENTS FOR BICYCLES Pt. 1512, Table 4 Table 4 to Part 1512—Relative Energy Distribution of Sources Wave length (nanometers) Relative energy 380 9.79 390 12.09 400 14.71 410 17.68 420...

  9. 16 CFR Table 4 to Part 1512 - Relative Energy Distribution of Sources

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 2 2014-01-01 2014-01-01 false Relative Energy Distribution of Sources 4... SUBSTANCES ACT REGULATIONS REQUIREMENTS FOR BICYCLES Pt. 1512, Table 4 Table 4 to Part 1512—Relative Energy Distribution of Sources Wave length (nanometers) Relative energy 380 9.79 390 12.09 400 14.71 410 17.68 420...

  10. A study of numerical methods for hyperbolic conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Leveque, R. J.; Yee, H. C.

    1990-01-01

    In the present study of the behavior of typical numerical methods in the case of a model advection equation having a parameter-dependent source term, two approaches to the incorporation of the source terms are used: MacCormack-type predictor-corrector methods with flux limiters, and splitting methods in which the fluid dynamics and chemistry are handled in separate steps. The latter are found to perform slightly better. The model scalar equation is used to show that the incorrectness of the propagation speeds of discontinuities observed in the stiff case is due to the introduction of nonequilibrium values through numerical dissipation in the advection step.

  11. The long-term problems of contaminated land: Sources, impacts and countermeasures

    SciTech Connect

    Baes, C.F. III

    1986-11-01

    This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed "broken arrows").

  12. Source-term characterisation and solid speciation of plutonium at the Semipalatinsk NTS, Kazakhstan.

    PubMed

    Nápoles, H Jiménez; León Vintró, L; Mitchell, P I; Omarova, A; Burkitbayev, M; Priest, N D; Artemyev, O; Lukashenko, S

    2004-01-01

    New data on the concentrations of key fission/activation products and transuranium nuclides in samples of soil and water from the Semipalatinsk Nuclear Test Site are presented and interpreted. Sampling was carried out at Ground Zero, Lake Balapan, the Tel'kem craters and reference locations within the test site boundary well removed from localised sources. Radionuclide ratios have been used to characterise the source term(s) at each of these sites. The geochemical partitioning of plutonium has also been examined and it is shown that the bulk of the plutonium contamination at most of the sites examined is in a highly refractory, non-labile form. PMID:15177366

  13. The Fukushima releases: an inverse modelling approach to assess the source term by using gamma dose rate observations

    NASA Astrophysics Data System (ADS)

    Saunier, Olivier; Mathieu, Anne; Didier, Damien; Tombette, Marilyne; Quélo, Denis; Winiarek, Victor; Bocquet, Marc

    2013-04-01

    The Chernobyl nuclear accident and, more recently, the Fukushima accident highlighted that the largest source of error in consequence assessment is the source term estimation, including the time evolution of the release rate and its distribution between radioisotopes. Inverse modelling methods have proved efficient for assessing the source term in accident situations (Gudiksen, 1989; Krysta and Bocquet, 2007; Stohl et al., 2011; Winiarek et al., 2012). These methods combine environmental measurements and atmospheric dispersion models. They have recently been applied to the Fukushima accident. Most existing approaches are designed to use air sampling measurements (Winiarek et al., 2012) and some of them also use deposition measurements (Stohl et al., 2012; Winiarek et al., 2013). During the Fukushima accident, such measurements were far less numerous and not as well distributed within Japan as the dose rate measurements. To efficiently document the evolution of the contamination, gamma dose rate measurements were numerous, well distributed within Japan, and offered a high temporal frequency. However, dose rate data are not as easy to use as air sampling measurements and until now have not been used in inverse modelling approaches. Indeed, dose rate data result from all the gamma emitters present in the ground and in the atmosphere in the vicinity of the receptor. They do not allow one to determine the isotopic composition or to distinguish the plume contribution from wet deposition. The presented approach proposes a way to use dose rate measurements in an inverse modelling approach without the need for a priori information on emissions. The method proved to be efficient and reliable when applied to the Fukushima accident. The emissions for the 8 main isotopes Xe-133, Cs-134, Cs-136, Cs-137, Ba-137m, I-131, I-132 and Te-132 have been assessed. The Daiichi power plant events (such as ventings, explosions…) known to have caused atmospheric releases are well identified in
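
    A minimal sketch of the generic inversion step that such source-term estimation methods share: given a source-receptor matrix H (each column being the modelled response of the observations to a unit release in one time interval) and a vector of observations y, a non-negative release history q is recovered by least squares. The matrix, data, and the plain NNLS solver here are placeholders for illustration; the paper's actual treatment of dose rate data, isotopic composition, and regularization is not reproduced.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)

# Hypothetical source-receptor matrix: 120 observations responding to 24 hourly
# release intervals (in practice each column comes from a dispersion-model run).
H = rng.random((120, 24))

# Synthetic "true" release history with two release episodes, and noisy observations.
q_true = np.zeros(24)
q_true[5] = 4.0
q_true[14] = 2.5
y = H @ q_true + 0.05 * rng.standard_normal(120)

# Non-negative least squares recovers a release history consistent with the observations;
# how sharply the episodes are resolved depends on how informative H is.
q_est, residual = nnls(H, y)
print(np.round(q_est, 2))
```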

  14. Using natural archives to track sources and long-term trends of pollution: an introduction

    USGS Publications Warehouse

    Jules Blais; Rosen, Michael R.; John Smol

    2015-01-01

    This book explores the myriad ways that environmental archives can be used to study the distribution and long-term trajectories of contaminants. The volume first focuses on reviews that examine the integrity of the historic record, including factors related to hydrology, post-depositional diffusion, and mixing processes. This is followed by a series of chapters dealing with the diverse archives available for long-term studies of environmental pollution.

  15. An Alternative Treatment of Trace Chemical Constituents in Calculated Chemical Source Terms for Hanford Tank Farms Safety Analyses

    SciTech Connect

    Huckaby, James L.

    2006-09-26

    Hanford Site high-level radioactive waste tank accident analyses require chemical waste toxicity source terms to assess potential accident consequences. Recent reviews of the current methodology used to generate source terms, and the need to periodically update the source terms, have brought scrutiny to the manner in which trace waste constituents are included in the source terms. This report examines the importance of trace constituents to the chemical waste source terms, which are calculated as sums of fractions (SOFs), and recommends three changes to the manner in which trace constituents are included in the calculation of SOFs.
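
    The sum-of-fractions quantity referred to above is simply the sum of each constituent's concentration divided by its toxicological limit; a sketch is shown below with hypothetical constituents and limits (illustrative values only, not Hanford data).

```python
def sum_of_fractions(concentrations, limits):
    """Chemical toxicity sum of fractions: SOF = sum_i C_i / L_i.

    concentrations -- dict of analyte -> concentration in the waste (e.g. mg/L)
    limits         -- dict of analyte -> toxicological limit in the same units
    An SOF >= 1 indicates the mixture as a whole exceeds the composite limit.
    """
    return sum(c / limits[name] for name, c in concentrations.items())

# Hypothetical trace and bulk constituents (illustrative values only):
conc  = {"NaOH": 1.2e4, "NaNO2": 8.0e3, "Pb": 2.0}
limit = {"NaOH": 2.0e4, "NaNO2": 1.5e4, "Pb": 50.0}
print(sum_of_fractions(conc, limit))
```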

  16. Monitoring Design for Source Identification in Water Distribution Systems

    EPA Science Inventory

    The design of sensor networks for the purpose of monitoring for contaminants in water distribution systems is currently an active area of research. Much of the effort has been directed at the contamination detection problem and the expression of public health protection objective...

  17. The planetary distribution of heat sources and sinks during FGGE

    NASA Technical Reports Server (NTRS)

    Johnson, D. R.; Wei, M. Y.

    1985-01-01

    Heating distributions from analysis of the National Meteorological Center and European Center for Medium Range Weather Forecasts data sets; methods used and problems involved in the inference of diabatic heating; the relationship between differential heating and energy transport; and recommendations on the inference of heat sources and heat sinks on the planetary scale are discussed.

  18. Acoustic Source Localization via Distributed Sensor Networks using Tera-scale Optical-Core Devices

    SciTech Connect

    Imam, Neena; Barhen, Jacob; Wardlaw, Michael

    2008-01-01

    For real-time acoustic source localization applications, one of the primary challenges is the considerable growth in computational complexity associated with the emergence of ever larger, active or passive, distributed sensor networks. The complexity of the calculations needed to achieve accurate source localization increases dramatically with the size of sensor arrays, resulting in substantial growth of computational requirements that cannot be met with standard hardware. One option to meet this challenge builds upon the emergence of digital optical-core devices. The objective of this work was to explore the implementation of key building block algorithms used in underwater source localization on an optical-core digital processing platform recently introduced by Lenslet Inc. They investigate key concepts of threat-detection algorithms such as Time Difference Of Arrival (TDOA) estimation via sensor data correlation in the time domain with the purpose of implementation on the optical-core processor. They illustrate their results with the aid of numerical simulation and actual optical hardware runs. The major accomplishments of this research, in terms of computational speedup and numerical accuracy achieved via the deployment of optical processing technology, should be of substantial interest to the acoustic signal processing community.
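
    A short Python sketch of the TDOA building block mentioned above: the delay between two sensor channels is estimated from the peak of their time-domain cross-correlation. This is the generic algorithm, not the optical-core implementation discussed in the paper; the test signal and sampling rate are arbitrary.

```python
import numpy as np

def tdoa_estimate(x1, x2, fs):
    """Estimate the time difference of arrival between two sensor channels from
    the peak of their time-domain cross-correlation.
    A negative result means the signal reached sensor 1 (x1) first."""
    corr = np.correlate(x1 - x1.mean(), x2 - x2.mean(), mode="full")
    lag = np.argmax(corr) - (len(x2) - 1)
    return lag / fs

# Illustrative test: the same transient arrives 25 samples later on sensor 2.
fs, n, delay = 10_000.0, 2048, 25
rng = np.random.default_rng(0)
s = rng.standard_normal(n)
x1 = s
x2 = np.roll(s, delay) + 0.1 * rng.standard_normal(n)
print(tdoa_estimate(x1, x2, fs))   # approximately -25/fs: sensor 1 leads
```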

  19. Using Reactive Transport Modeling to Evaluate the Source Term at Yucca Mountain

    SciTech Connect

    Y. Chen

    2001-12-19

    The conventional approach of source-term evaluation for performance assessment of nuclear waste repositories uses speciation-solubility modeling tools and assumes pure phases of radioelements control their solubility. This assumption may not reflect reality, as most radioelements (except for U) may not form their own pure phases. As a result, solubility limits predicted using the conventional approach are several orders of magnitude higher than the concentrations of radioelements measured in spent fuel dissolution experiments. This paper presents the author's attempt to use a non-conventional approach to evaluate the source term of radionuclide release for Yucca Mountain. Based on the general reactive-transport code AREST-CT, a model for spent fuel dissolution and secondary phase precipitation has been constructed. The model accounts for both equilibrium and kinetic reactions. Its predictions have been compared against laboratory experiments and natural analogues. It is found that without calibrations, the simulated results match laboratory and field observations very well in many aspects. More important is the fact that no contradictions between them have been found. This provides confidence in the predictive power of the model. Based on the concept of Np incorporated into uranyl minerals, the model not only predicts a lower Np source-term than that given by conventional Np solubility models, but also produces results which are consistent with laboratory measurements and observations. Moreover, two hypotheses, whether Np enters tertiary uranyl minerals or not, have been tested by comparing model predictions against laboratory observations; the results favor the former. It is concluded that this non-conventional approach of source term evaluation not only eliminates over-conservatism in the conventional solubility approach to some extent, but also gives a realistic representation of the system of interest, which is a prerequisite for truly understanding the long-term

  20. Long-term variability in bright hard X-ray sources: 5+ years of BATSE data

    NASA Technical Reports Server (NTRS)

    Robinson, C. R.; Harmon, B. A.; McCollough, M. L.; Paciesas, W. S.; Sahi, M.; Scott, D. M.; Wilson, C. A.; Zhang, S. N.; Deal, K. J.

    1997-01-01

    The operation of the Compton Gamma Ray Observatory (CGRO)/Burst and Transient Source Experiment (BATSE) continues to provide data for inclusion into a data base for the analysis of long-term variability in bright, hard X-ray sources. The all-sky capability of BATSE provides up to 30 flux measurements/day for each source. The long baseline and the various rising and setting occultation flux measurements allow searches for periodic and quasi-periodic signals with periods between several hours and hundreds of days to be conducted. The preliminary results from an analysis of the hard X-ray variability in 24 of the brightest BATSE sources are presented. Power density spectra are computed for each source and profiles are presented of the hard X-ray orbital modulations in some X-ray binaries, together with amplitude modulations and variations in outburst durations and intensities in recurrent X-ray transients.

  1. The distribution and source of boulders on asteroid 4179 Toutatis

    NASA Astrophysics Data System (ADS)

    Jiang, Yun; Ji, Jianghui; Huang, Jiangchuan; Marchi, Simone; Li, Yuan; Ip, Wing-Huen

    2016-01-01

    Boulders are ubiquitous on the surfaces of asteroids and their spatial and size distributions provide information on the geological evolution and collisional history of parent bodies. We identify more than 200 boulders on near-Earth asteroid 4179 Toutatis based on images obtained by the Chang'e-2 flyby. The cumulative boulder size frequency distribution (SFD) gives a power-index of -4.4 +/- 0.1, which is clearly steeper than those of boulders on Itokawa and Eros, indicating a much higher degree of fragmentation. Correlation analyses with craters suggest that most boulders cannot solely be produced as products of cratering, but are probably survived fragments from the parent body of Toutatis, accreted after its breakup. Similar to Itokawa, Toutatis probably has a rubble-pile structure, but shows a different preservation state of boulders.
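
    The cumulative SFD power-law index quoted above can be estimated with a simple log-log fit; a sketch follows, using synthetic power-law diameters rather than the Toutatis boulder catalogue. (A maximum-likelihood estimator is often preferred for real data; the least-squares fit here is only illustrative.)

```python
import numpy as np

def cumulative_sfd_index(diameters):
    """Least-squares estimate of the power-law index q of a cumulative
    size-frequency distribution N(>D) ~ D**q (q is negative)."""
    d = np.sort(np.asarray(diameters, dtype=float))
    n_cum = np.arange(len(d), 0, -1)          # N(>=D) for each sorted diameter
    slope, intercept = np.polyfit(np.log10(d), np.log10(n_cum), 1)
    return slope

# Illustrative synthetic boulder diameters (metres), not the Toutatis data:
rng = np.random.default_rng(1)
q_true, d_min = -4.4, 10.0
u = rng.uniform(1e-6, 1.0, 200)
diams = d_min * u ** (1.0 / q_true)           # inverse-CDF sampling of a power law
print(cumulative_sfd_index(diams))            # should come out near -4.4
```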

  2. Optimal source codes for geometrically distributed integer alphabets

    NASA Technical Reports Server (NTRS)

    Gallager, R. G.; Van Voorhis, D. C.

    1975-01-01

    An approach is shown for using the Huffman algorithm indirectly to prove the optimality of a code for an infinite alphabet if an estimate concerning the nature of the code can be made. Attention is given to nonnegative integers with a geometric probability assignment. The particular distribution considered arises in run-length coding and in encoding protocol information in data networks. Questions of redundancy of the optimal code are also investigated.
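
    The optimal prefix codes identified in this line of work for geometrically distributed nonnegative integers are the Golomb codes: a unary-coded quotient followed by a truncated-binary remainder. The sketch below implements a standard Golomb encoder; matching the parameter m to the geometric distribution's parameter is left to the caller.

```python
def golomb_encode(n, m):
    """Golomb code word for a nonnegative integer n with parameter m:
    a unary-coded quotient followed by a truncated-binary remainder."""
    q, r = divmod(n, m)
    code = "1" * q + "0"                 # unary part: q ones terminated by a zero
    b = (m - 1).bit_length()             # ceil(log2 m); 0 when m == 1
    threshold = (1 << b) - m             # number of remainders that get the short form
    if m > 1:
        if r < threshold:
            code += format(r, f"0{b - 1}b")
        else:
            code += format(r + threshold, f"0{b}b")
    return code

# Example with m = 4 (a Rice code): small integers, which are the most probable
# under a geometric distribution, receive the shortest code words.
for n in range(8):
    print(n, golomb_encode(n, 4))
```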

  3. Experimental Investigation and 3D Finite Element Prediction of Temperature Distribution during Travelling Heat Sourced from Oxyacetylene Flame

    NASA Astrophysics Data System (ADS)

    Umar Alkali, Adam; Lenggo Ginta, Turnad; Majdi Abdul-Rani, Ahmad

    2015-04-01

    This paper presents a 3D transient finite element model of the workpiece temperature field produced during travelling heating sourced from an oxyacetylene flame. The proposed model was given in terms of a preheat-only test applicable during thermally enhanced machining using the oxyacetylene flame as a heat source. The FEA model as well as the experimental test investigated the surface temperature distribution on 316L stainless steel at scanning speeds of 100 mm/min, 125 mm/min, 160 mm/min, 200 mm/min and 250 mm/min. The parametric properties of the heat source maintained constant are: lead distance Ld = 10 mm, focus height Fh = 7.5 mm, oxygen gas pressure Poxy = 15 psi and acetylene gas pressure Pacty = 25 psi. An experimental validation of the temperature field induced on type 316L stainless steel reveals that temperatures increase when the travelling speed decreases.

  4. Characterizing short-term stability for Boolean networks over any distribution of transfer functions

    NASA Astrophysics Data System (ADS)

    Seshadhri, C.; Smith, Andrew M.; Vorobeychik, Yevgeniy; Mayo, Jackson R.; Armstrong, Robert C.

    2016-07-01

    We present a characterization of short-term stability of Kauffman's NK (random) Boolean networks under arbitrary distributions of transfer functions. Given such a Boolean network where each transfer function is drawn from the same distribution, we present a formula that determines whether short-term chaos (damage spreading) will happen. Our main technical tool which enables the formal proof of this formula is the Fourier analysis of Boolean functions, which describes such functions as multilinear polynomials over the inputs. Numerical simulations on mixtures of threshold functions and nested canalyzing functions demonstrate the formula's correctness.

  5. Characterizing short-term stability for Boolean networks over any distribution of transfer functions.

    PubMed

    Seshadhri, C; Smith, Andrew M; Vorobeychik, Yevgeniy; Mayo, Jackson R; Armstrong, Robert C

    2016-07-01

    We present a characterization of short-term stability of Kauffman's NK (random) Boolean networks under arbitrary distributions of transfer functions. Given such a Boolean network where each transfer function is drawn from the same distribution, we present a formula that determines whether short-term chaos (damage spreading) will happen. Our main technical tool which enables the formal proof of this formula is the Fourier analysis of Boolean functions, which describes such functions as multilinear polynomials over the inputs. Numerical simulations on mixtures of threshold functions and nested canalyzing functions demonstrate the formula's correctness. PMID:27575142
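
    The stability criterion in this setting is commonly expressed through the expected average sensitivity of the transfer-function distribution: damage is expected to spread when that expectation exceeds 1. The sketch below estimates the quantity numerically for functions sampled from a user-supplied distribution (here an assumed family of random ±1-weight threshold functions); it illustrates the criterion but does not reproduce the paper's Fourier-analytic formula.

```python
import itertools
import random

def average_sensitivity(truth_table, k):
    """Average sensitivity of a Boolean function f: {0,1}^k -> {0,1}, i.e. the
    expected number of inputs whose flip changes the output, for a uniform input."""
    total = 0
    for bits in itertools.product((0, 1), repeat=k):
        x = list(bits)
        fx = truth_table[tuple(x)]
        for i in range(k):
            x[i] ^= 1
            total += fx != truth_table[tuple(x)]
            x[i] ^= 1
    return total / 2 ** k

def sampled_mean_sensitivity(sample_fn, k, n_samples=2000, seed=0):
    """Monte Carlo estimate of the expected average sensitivity over a
    distribution of transfer functions; values above 1 suggest damage spreading."""
    rng = random.Random(seed)
    return sum(average_sensitivity(sample_fn(rng, k), k) for _ in range(n_samples)) / n_samples

def random_threshold_function(rng, k):
    """Example distribution (an assumption for illustration): random +/-1-weight
    threshold functions on k inputs."""
    w = [rng.choice((-1, 1)) for _ in range(k)]
    return {bits: int(sum(wi * b for wi, b in zip(w, bits)) >= 0.5)
            for bits in itertools.product((0, 1), repeat=k)}

print(sampled_mean_sensitivity(random_threshold_function, k=3))
```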

  6. ACT-ARA: Code System for the Calculation of Changes in Radiological Source Terms with Time

    Energy Science and Technology Software Center (ESTSC)

    1988-02-01

    The program calculates the source term activity as a function of time for parent isotopes as well as daughters. Also, at each time, the "probable release" is produced. Finally, the program determines the time integrated probable release for each isotope over the time period of interest.
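
    For a two-member decay chain, the parent and daughter activities referred to above follow the Bateman equations; a sketch is given below. This is the textbook solution for illustration, not ACT-ARA's implementation, and it omits the code's "probable release" weighting and time integration.

```python
import math

def parent_daughter_activity(a0_parent, half_life_parent, half_life_daughter, t):
    """Activities of a parent radionuclide and its daughter at time t, assuming
    no daughter is present at t = 0 (two-member Bateman chain).
    All half-lives and t must be in the same time unit."""
    l1 = math.log(2) / half_life_parent
    l2 = math.log(2) / half_life_daughter
    a_parent = a0_parent * math.exp(-l1 * t)
    # daughter activity A2 = lambda2 * N2 with N2 from the Bateman solution
    a_daughter = a0_parent * l2 / (l2 - l1) * (math.exp(-l1 * t) - math.exp(-l2 * t))
    return a_parent, a_daughter

# Illustrative pair: Sr-90 (28.8 y) feeding Y-90 (64 h), initial activity 1.0 Ci
print(parent_daughter_activity(1.0, 28.8, 64.0 / 8760.0, t=0.05))
```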

  7. Fission product source term research at Oak Ridge National Laboratory. [PWR; BWR

    SciTech Connect

    Malinauskas, A.P.

    1985-01-01

    The purpose of this work is to describe some of the research being performed at ORNL in support of the effort to describe, as realistically as possible, fission product source terms for nuclear reactor accidents. In order to make this presentation manageable, only those studies directly concerned with fission product behavior, as opposed to thermal hydraulics, accident sequence progression, etc., will be discussed.

  8. Enhancement of the source term algorithm for emergency response at the Savannah River Site

    SciTech Connect

    Simpkins, A.A.; O'Kula, K.R.; Taylor, R.P.; Kearnaghan, G.P.

    1992-12-31

    The purpose of this work is to use the results of the Savannah River Site K-Reactor Probabilistic Safety Assessment to determine the accident sequences and source terms for beyond design basis accidents. Additionally, the methodology necessary to allow the Reactor Accident Program to incorporate this information is to be discussed.

  9. Enhancement of the source term algorithm for emergency response at the Savannah River Site

    SciTech Connect

    Simpkins, A.A.; O'Kula, K.R.; Taylor, R.P.; Kearnaghan, G.P.

    1992-01-01

    The purpose of this work is to use the results of the Savannah River Site K-Reactor Probabilistic Safety Assessment to determine the accident sequences and source terms for beyond design basis accidents. Additionally, the methodology necessary to allow the Reactor Accident Program to incorporate this information is to be discussed.

  10. Short-Term Memory Stages in Sign vs. Speech: The Source of the Serial Span Discrepancy

    ERIC Educational Resources Information Center

    Hall, Matthew L.; Bavelier, Daphne

    2011-01-01

    Speakers generally outperform signers when asked to recall a list of unrelated verbal items. This phenomenon is well established, but its source has remained unclear. In this study, we evaluate the relative contribution of the three main processing stages of short-term memory--perception, encoding, and recall--to this effect. The present study…

  11. Sensitivities to source-term parameters of emergency planning zone boundaries for waste management facilities

    SciTech Connect

    Mueller, C.J.

    1995-07-01

    This paper reviews the key parameters comprising airborne radiological and chemical release source terms, discusses the ranges over which values of these parameters occur for plausible but severe waste management facility accidents, and relates the concomitant sensitivities of emergency planning zone boundaries predicated on calculated distances to early severe health effects.

  12. High resolution stationary digital breast tomosynthesis using distributed carbon nanotube x-ray source array

    PubMed Central

    Qian, Xin; Tucker, Andrew; Gidcumb, Emily; Shan, Jing; Yang, Guang; Calderon-Colon, Xiomara; Sultana, Shabana; Lu, Jianping; Zhou, Otto; Spronk, Derrek; Sprenger, Frank; Zhang, Yiheng; Kennedy, Don; Farbizio, Tom; Jing, Zhenxue

    2012-01-01

    binning, the projection resolution along the scanning direction increased from 4.0 cycles/mm [at 10% modulation-transfer-function (MTF)] in DBT to 5.1 cycles/mm in s-DBT at a magnification factor of 1.08. The improvement is more pronounced for faster scanning speeds, wider angular coverage, and smaller detector pixel sizes. The scanning speed depends on the detector, the number of views, and the imaging dose. With 240 ms detector readout time, the s-DBT system scanning time is 6.3 s for a 15-view, 100 mAs scan regardless of the angular coverage. The scanning speed can be reduced to less than 4 s when detectors become faster. Initial phantom studies showed good quality reconstructed images. Conclusions: A prototype s-DBT scanner has been developed and evaluated by retrofitting the Selenia rotating gantry DBT scanner with a spatially distributed CNT x-ray source array. Preliminary results show that it improves system spatial resolution substantially by eliminating image blur due to x-ray focal spot motion. The scanner speed of the s-DBT system is independent of angular coverage and can be increased with a faster detector without image degradation. The accelerated lifetime measurement demonstrated the long-term stability of the CNT x-ray source array, with a typical clinical operation lifetime over 3 years. PMID:22482630

  13. High resolution stationary digital breast tomosynthesis using distributed carbon nanotube x-ray source array

    SciTech Connect

    Qian Xin; Tucker, Andrew; Gidcumb, Emily; Shan Jing; Yang Guang; Calderon-Colon, Xiomara; Sultana, Shabana; Lu Jianping; Zhou, Otto; Spronk, Derrek; Sprenger, Frank; Zhang Yiheng; Kennedy, Don; Farbizio, Tom; Jing Zhenxue

    2012-04-15

    , the projection resolution along the scanning direction increased from 4.0 cycles/mm [at 10% modulation-transfer-function (MTF)] in DBT to 5.1 cycles/mm in s-DBT at a magnification factor of 1.08. The improvement is more pronounced for faster scanning speeds, wider angular coverage, and smaller detector pixel sizes. The scanning speed depends on the detector, the number of views, and the imaging dose. With 240 ms detector readout time, the s-DBT system scanning time is 6.3 s for a 15-view, 100 mAs scan regardless of the angular coverage. The scanning speed can be reduced to less than 4 s when detectors become faster. Initial phantom studies showed good quality reconstructed images. Conclusions: A prototype s-DBT scanner has been developed and evaluated by retrofitting the Selenia rotating gantry DBT scanner with a spatially distributed CNT x-ray source array. Preliminary results show that it improves system spatial resolution substantially by eliminating image blur due to x-ray focal spot motion. The scanner speed of the s-DBT system is independent of angular coverage and can be increased with a faster detector without image degradation. The accelerated lifetime measurement demonstrated the long-term stability of the CNT x-ray source array, with a typical clinical operation lifetime over 3 years.

  14. Occurrence of arsenic contamination in Canada: sources, behavior and distribution.

    PubMed

    Wang, Suiling; Mulligan, Catherine N

    2006-08-01

    Recently there have been increasing concerns about arsenic-related problems. Occurrence of arsenic contamination has been reported worldwide. In Canada, the main natural arsenic sources are weathering and erosion of arsenic-containing rocks and soil, while tailings from historic and recent gold mine operations and wood preservative facilities are the principal anthropogenic sources. Across Canada, the 24-h average concentration of arsenic in the atmosphere is generally less than 0.3 microg/m3. Arsenic concentrations in natural uncontaminated soil and sediments range from 4 to 150 mg/kg. In uncontaminated surface and ground waters, the arsenic concentration ranges from 0.001 to 0.005 mg/L. As a result of anthropogenic inputs, elevated arsenic levels, above ten to a thousand times the Interim Maximum Acceptable Concentration (IMAC), have been reported in air, soil and sediment, surface water and groundwater, and biota in several regions. Most arsenic is in toxic inorganic forms. It is critical to recognize that such contamination imposes serious harmful effects on various aquatic and terrestrial organisms and, ultimately, on human health. Serious incidences of acute and chronic arsenic poisoning have been revealed. Through examination of the available literature and screening and selection of existing data, this paper provides an analysis of the currently available information on recognized problem areas, and an overview of current knowledge of the principal hydrogeochemical processes of arsenic transport and transformation. However, a more detailed understanding of local sources of arsenic and mechanisms of arsenic release is required. More extensive studies will be required for building practical guidance on avoiding and reducing arsenic contamination. Bioremediation and hyperaccumulation are emerging innovative technologies for the remediation of arsenic contaminated sites. Natural attenuation may be utilized as a potential in situ remedial option. Further

  15. Severe accident source term characteristics for selected Peach Bottom sequences predicted by the MELCOR Code

    SciTech Connect

    Carbajo, J.J.

    1993-09-01

    The purpose of this report is to compare in-containment source terms developed for NUREG-1159, which used the Source Term Code Package (STCP), with those generated by MELCOR to identify significant differences. For this comparison, two short-term depressurized station blackout sequences (with a dry cavity and with a flooded cavity) and a Loss-of-Coolant Accident (LOCA) concurrent with complete loss of the Emergency Core Cooling System (ECCS) were analyzed for the Peach Bottom Atomic Power Station (a BWR-4 with a Mark I containment). The results indicate that for the sequences analyzed, the two codes predict similar total in-containment release fractions for each of the element groups. However, the MELCOR/CORBH Package predicts significantly longer times for vessel failure and reduced energy of the released material for the station blackout sequences (when compared to the STCP results). MELCOR also calculated smaller releases into the environment than STCP for the station blackout sequences.

  16. Solid angle subtended by a cylindrical detector at a point source in terms of elliptic integrals

    NASA Astrophysics Data System (ADS)

    Prata, M. J.

    2003-07-01

    The solid angle subtended by a right circular cylinder at a point source located at an arbitrary position generally consists of a sum of two terms: that defined by the cylindrical surface (Ω_cyl) and the other by either of the end circles (Ω_circ). We derive an expression for Ω_cyl in terms of elliptic integrals of the first and third kinds and give similar expressions for Ω_circ using integrals of the first and second kinds. These latter can be used alternatively to an expression also in terms of elliptic integrals, due to Philip A. Macklin and included as a footnote in Masket (Rev. Sci. Instrum. 28 (3) (1957) 191). The solid angle subtended by the whole cylinder when the source is located at an arbitrary location can then be calculated using elliptic integrals.
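
    The paper's closed-form elliptic-integral expressions are not reproduced here, but the end-circle contribution Ω_circ can be cross-checked by direct numerical integration of the solid-angle element h dA / ρ³ over the disk, as sketched below; the on-axis case provides a textbook sanity check.

```python
import numpy as np

def disk_solid_angle(R, a, h, n=1000):
    """Solid angle subtended by a disk of radius R at a point a perpendicular
    distance h from the disk plane whose projection lies a distance a from the
    disk centre, by midpoint-rule integration of h dA / rho^3 over the disk."""
    r = (np.arange(n) + 0.5) * R / n                  # radial midpoints
    phi = (np.arange(n) + 0.5) * 2.0 * np.pi / n      # angular midpoints
    rr, pp = np.meshgrid(r, phi, indexing="ij")
    rho2 = rr**2 + a**2 - 2.0 * a * rr * np.cos(pp) + h**2
    integrand = h * rr / rho2**1.5
    return integrand.sum() * (R / n) * (2.0 * np.pi / n)

# On-axis sanity check against the textbook result 2*pi*(1 - h/sqrt(h^2 + R^2)):
R, h = 1.0, 0.5
print(disk_solid_angle(R, a=0.0, h=h))
print(2.0 * np.pi * (1.0 - h / np.hypot(h, R)))
```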

  17. Laboratory experiments designed to provide limits on the radionuclide source term for the NNWSI Project

    SciTech Connect

    Oversby, V.M.; McCright, R.D.

    1984-11-01

    The Nevada Nuclear Waste Storage Investigations Project is investigating the suitability of the tuffaceous rocks at Yucca Mountain Nevada for potential use as a high-level nuclear waste repository. The horizon under investigation lies above the water table, and therefore offers a setting that differs substantially from other potential repository sites. The unsaturated zone environment allows a simple, but effective, waste package design. The source term for radionuclide release from the waste package will be based on laboratory experiments that determine the corrosion rates and mechanisms for the metal container and the dissolution rate of the waste form under expected long term conditions. This paper describes the present status of laboratory results and outlines the approach to be used in combining the data to develop a realistic source term for release of radionuclides from the waste package. 16 refs., 3 figs., 1 tab.

  18. Fukushima Daiichi reactor source term attribution using cesium isotope ratios from contaminated environmental samples

    DOE PAGES Beta

    Snow, Mathew S.; Snyder, Darin C.; Delmore, James E.

    2016-01-18

    Source term attribution of environmental contamination following the Fukushima Daiichi Nuclear Power Plant (FDNPP) disaster is complicated by a large number of possible similar emission source terms (e.g. FDNPP reactor cores 1–3 and spent fuel ponds 1–4). Cesium isotopic analyses can be utilized to discriminate between environmental contamination from different FDNPP source terms and, if samples are sufficiently temporally resolved, potentially provide insights into the extent of reactor core damage at a given time. Rice, soil, mushroom, and soybean samples taken 100–250 km from the FDNPP site were dissolved using microwave digestion. Radiocesium was extracted and purified using two sequential ammonium molybdophosphate-polyacrylonitrile columns, following which 135Cs/137Cs isotope ratios were measured using thermal ionization mass spectrometry (TIMS). Results were compared with data reported previously from locations to the northwest of FDNPP and 30 km to the south of FDNPP. 135Cs/137Cs isotope ratios from samples 100–250 km to the southwest of the FDNPP site show a consistent value of 0.376 ± 0.008. 135Cs/137Cs versus 134Cs/137Cs correlation plots suggest that radiocesium to the southwest is derived from a mixture of FDNPP reactor cores 1, 2, and 3. Conclusions from the cesium isotopic data are in agreement with those derived independently based upon the event chronology combined with meteorological conditions at the time of the disaster. In conclusion, cesium isotopic analyses provide a powerful tool for source term discrimination of environmental radiocesium contamination at the FDNPP site. For higher precision source term attribution and forensic determination of the FDNPP core conditions based upon cesium, analyses of a larger number of samples from locations to the north and south of the FDNPP site (particularly time-resolved air filter samples) are needed. Published in 2016. This article is a U.S. Government work and is in the public domain
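
    In the two-end-member case, the isotopic source discrimination described above reduces to a linear mixing relation when the 135Cs/137Cs ratio is combined with a 137Cs-based mixing fraction. The sketch below illustrates that relation; the end-member ratios are hypothetical placeholders, not the measured core signatures, and only the mixture value 0.376 is taken from the abstract.

```python
def mixture_ratio(r_a, r_b, f_a):
    """135Cs/137Cs ratio of a mixture of radiocesium from two sources, where f_a
    is the fraction of the 137Cs atoms contributed by source A. Ratios mix
    linearly when the mixing fraction is expressed on a 137Cs (denominator) basis."""
    return f_a * r_a + (1.0 - f_a) * r_b

def mixing_fraction(r_mix, r_a, r_b):
    """Invert the two-end-member mixing relation for the 137Cs fraction from source A."""
    return (r_mix - r_b) / (r_a - r_b)

# Hypothetical end-member ratios for two reactor cores (NOT measured values):
r_core_x, r_core_y = 0.36, 0.40
print(mixing_fraction(0.376, r_core_x, r_core_y))   # fraction of 137Cs from core X
```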

  19. Low-level radioactive waste source terms for the 1992 integrated data base

    SciTech Connect

    Loghry, S L; Kibbey, A H; Godbee, H W; Icenhour, A S; DePaoli, S M

    1995-01-01

    This technical manual presents updated generic source terms (i.e., unitized amounts and radionuclide compositions) which have been developed for use in the Integrated Data Base (IDB) Program of the U.S. Department of Energy (DOE). These source terms were used in the IDB annual report, Integrated Data Base for 1992: Spent Fuel and Radioactive Waste Inventories, Projections, and Characteristics, DOE/RW-0006, Rev. 8, October 1992. They are useful as a basis for projecting future amounts (volume and radioactivity) of low-level radioactive waste (LLW) shipped for disposal at commercial burial grounds or sent for storage at DOE solid-waste sites. Commercial fuel cycle LLW categories include boiling-water reactor, pressurized-water reactor, fuel fabrication, and uranium hexafluoride (UF6) conversion. Commercial nonfuel cycle LLW includes institutional/industrial (I/I) waste. The LLW from DOE operations is categorized as uranium/thorium, fission product, induced activity, tritium, alpha, and "other". Fuel cycle commercial LLW source terms are normalized on the basis of net electrical output [MW(e)-year], except for UF6 conversion, which is normalized on the basis of heavy metal requirement [metric tons of initial heavy metal]. The nonfuel cycle commercial LLW source term is normalized on the basis of volume (cubic meters) and radioactivity (curies) for each subclass within the I/I category. The DOE LLW is normalized in a manner similar to that for commercial I/I waste. The revised source terms are based on the best available historical data through 1992.

  20. Utilities for master source code distribution: MAX and Friends

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

    MAX is a program for the manipulation of FORTRAN master source code (MSC). This is a technique by which one maintains one and only one master copy of a FORTRAN program under a program developing system, which for MAX is assumed to be VAX/VMS. The master copy is not intended to be directly compiled. Instead it must be pre-processed by MAX to produce compilable instances. These instances may correspond to different code versions (for example, double precision versus single precision), different machines (for example, IBM, CDC, Cray) or different operating systems (i.e., VAX/VMS versus VAX/UNIX). The advantage of using a master source is more pronounced in complex application programs that are developed and maintained over many years and are to be transported and executed on several computer environments. The version lag problem that plagues many such programs is avoided by this approach. MAX is complemented by several auxiliary programs that perform nonessential functions. The ensemble is collectively known as MAX and Friends. All of these programs, including MAX, are executed as foreign VAX/VMS commands and can easily be hidden in customized VMS command procedures.

  1. A Systematic Search for Short-term Variability of EGRET Sources

    NASA Technical Reports Server (NTRS)

    Wallace, P. M.; Griffis, N. J.; Bertsch, D. L.; Hartman, R. C.; Thompson, D. J.; Kniffen, D. A.; Bloom, S. D.

    2000-01-01

    The 3rd EGRET Catalog of High-energy Gamma-ray Sources contains 170 unidentified sources, and there is great interest in the nature of these sources. One means of determining source class is the study of flux variability on time scales of days; pulsars are believed to be stable on these time scales while blazars are known to be highly variable. In addition, previous work has demonstrated that 3EG J0241-6103 and 3EG J1837-0606 are candidates for a new gamma-ray source class. These sources near the Galactic plane display transient behavior but cannot be associated with any known blazars. Although many instances of flaring AGN have been reported, the EGRET database has not been systematically searched for occurrences of short-timescale (approximately 1 day) variability. These considerations have led us to conduct a systematic search for short-term variability in EGRET data, covering all viewing periods through proposal cycle 4. Six 3EG catalog sources are reported here to display variability on short time scales; four of them are unidentified. In addition, three non-catalog variable sources are discussed.

  2. Simulation of dose distribution for iridium-192 brachytherapy source type-H01 using MCNPX

    SciTech Connect

    Purwaningsih, Anik

    2014-09-30

    Dosimetric data for a brachytherapy source should be known before it is used for clinical treatment. The Iridium-192 source type H01 manufactured by PRR-BATAN for brachytherapy does not yet have established dosimetric data. The radial dose function and the anisotropic dose distribution are among the primary dosimetric parameters for a brachytherapy source. The dose distribution for the Iridium-192 source type H01 was obtained from the dose calculation formalism recommended in the AAPM TG-43U1 report using the MCNPX 2.6.0 Monte Carlo simulation code. To assess the effect of the cavity in the Iridium-192 type H01 source caused by the manufacturing process, calculations were also performed for an Iridium-192 type H01 source without the cavity. The calculated radial dose function and anisotropic dose distribution for the Iridium-192 source type H01 were compared with those of another Iridium-192 source model.

  3. Simulation of dose distribution for iridium-192 brachytherapy source type-H01 using MCNPX

    NASA Astrophysics Data System (ADS)

    Purwaningsih, Anik

    2014-09-01

    Dosimetric data for a brachytherapy source should be known before it is used for clinical treatment. The Iridium-192 source type H01 manufactured by PRR-BATAN for brachytherapy does not yet have established dosimetric data. The radial dose function and the anisotropic dose distribution are among the primary dosimetric parameters for a brachytherapy source. The dose distribution for the Iridium-192 source type H01 was obtained from the dose calculation formalism recommended in the AAPM TG-43U1 report using the MCNPX 2.6.0 Monte Carlo simulation code. To assess the effect of the cavity in the Iridium-192 type H01 source caused by the manufacturing process, calculations were also performed for an Iridium-192 type H01 source without the cavity. The calculated radial dose function and anisotropic dose distribution for the Iridium-192 source type H01 were compared with those of another Iridium-192 source model.
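
    For reference, the AAPM TG-43U1 dose-calculation formalism mentioned above combines an air-kerma strength, a dose-rate constant, a geometry function, a radial dose function, and an anisotropy function. The sketch below implements the line-source form of that formalism with placeholder radial-dose and anisotropy data; the numbers are illustrative and are not the H01 source parameters.

```python
import numpy as np

def g_line(r, theta, L):
    """TG-43U1 line-source geometry function G_L(r, theta) = beta / (L r sin theta),
    valid away from the source long axis (theta not equal to 0 or pi)."""
    y = r * np.sin(theta)
    beta = np.arctan((r * np.cos(theta) + L / 2.0) / y) - \
           np.arctan((r * np.cos(theta) - L / 2.0) / y)
    return beta / (L * y)

def dose_rate(r, theta, S_k, Lam, L, g_r, F_rt, r0=1.0, theta0=np.pi / 2.0):
    """TG-43U1 dose rate: D(r,theta) = S_k * Lam * G_L(r,theta)/G_L(r0,theta0) * g(r) * F(r,theta).
    g_r and F_rt are callables for the radial dose function and anisotropy function."""
    return (S_k * Lam * g_line(r, theta, L) / g_line(r0, theta0, L)
            * g_r(r) * F_rt(r, theta))

# Purely illustrative placeholder data and parameters (not the H01 source data):
g_r = lambda r: np.interp(r, [0.5, 1.0, 2.0, 5.0], [1.04, 1.00, 0.93, 0.70])
F_rt = lambda r, th: 1.0 - 0.2 * abs(np.cos(th))        # crude anisotropy placeholder
print(dose_rate(r=2.0, theta=np.pi / 3.0, S_k=10.0, Lam=1.11, L=0.35, g_r=g_r, F_rt=F_rt))
```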

  4. Algorithms and analytical solutions for rapidly approximating long-term dispersion from line and area sources

    NASA Astrophysics Data System (ADS)

    Barrett, Steven R. H.; Britter, Rex E.

    Predicting long-term mean pollutant concentrations in the vicinity of airports, roads and other industrial sources is frequently of concern in regulatory and public health contexts. Many emissions are represented geometrically as ground-level line or area sources. Well-developed modelling tools such as AERMOD and ADMS are able to model dispersion from finite (i.e. non-point) sources with considerable accuracy, drawing upon an up-to-date understanding of boundary layer behaviour. Due to mathematical difficulties associated with line and area sources, computationally expensive numerical integration schemes have been developed. For example, some models decompose area sources into a large number of line sources orthogonal to the mean wind direction, for which an analytical (Gaussian) solution exists. Models also employ a time-series approach, which involves computing mean pollutant concentrations for every hour over one or more years of meteorological data. This can give rise to computer runtimes of several days for assessment of a site. While this may be acceptable for assessment of a single industrial complex, airport, etc., this level of computational cost precludes national or international policy assessments at the level of detail available with dispersion modelling. In this paper, we extend previous work [S.R.H. Barrett, R.E. Britter, 2008. Development of algorithms and approximations for rapid operational air quality modelling. Atmospheric Environment 42 (2008) 8105-8111] to line and area sources. We introduce approximations which allow for the development of new analytical solutions for long-term mean dispersion from line and area sources, based on hypergeometric functions. We describe how these solutions can be parameterized from a single point source run from an existing advanced dispersion model, thereby accounting for all processes modelled in the more costly algorithms. The parameterization method combined with the analytical solutions for long-term mean
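
    A minimal sketch of the brute-force baseline that the analytical solutions above are meant to replace: a crosswind line source evaluated by summing ground-level, ground-reflected Gaussian point-source plumes. The dispersion-coefficient power laws, emission rate, and geometry are generic placeholders, not the parameterizations of AERMOD or ADMS.

```python
import numpy as np

def point_source_conc(q, x, y, z, u, a=0.08, b=0.06):
    """Ground-level, ground-reflected Gaussian plume concentration for a point
    source of strength q (g/s) with wind speed u (m/s) along +x.
    sigma_y and sigma_z use generic power-law placeholders, not a specific scheme."""
    if x <= 0.0:
        return 0.0
    sy = a * x ** 0.9
    sz = b * x ** 0.8
    return (q / (np.pi * sy * sz * u)
            * np.exp(-y**2 / (2.0 * sy**2)) * np.exp(-z**2 / (2.0 * sz**2)))

def line_source_conc(q_per_m, x0, y0, y1, receptor, u, n=200):
    """Approximate a crosswind line source (from (x0, y0) to (x0, y1)) by n point
    sources and sum their contributions at the receptor (x, y, z)."""
    ys = np.linspace(y0, y1, n)
    dq = q_per_m * (y1 - y0) / n
    xr, yr, zr = receptor
    return sum(point_source_conc(dq, xr - x0, yr - y, zr, u) for y in ys)

print(line_source_conc(q_per_m=0.01, x0=0.0, y0=-500.0, y1=500.0,
                       receptor=(200.0, 0.0, 1.5), u=4.0))
```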

  5. Making Learning Memorable: Distributed Practice and Long-Term Retention by Special Needs Students.

    ERIC Educational Resources Information Center

    Crawford, S. A. S.; Baine, D.

    1992-01-01

    This paper considers reasons why distributed practice is relatively little used as a method for increasing long-term retention with special needs students and proposes an instructional strategy in which intervals between practice are scheduled according to a student's mastery of the material. (DB)

  6. A General Model for Preferential and Triadic Choice in Terms of Central F Distribution Functions.

    ERIC Educational Resources Information Center

    Ennis, Daniel M; Johnson, Norman L.

    1994-01-01

    A model for preferential and triadic choice is derived in terms of weighted sums of central F distribution functions. It is a probabilistic generalization of Coombs' (1964) unfolding model from which special cases can be derived easily. This model for binary choice can be easily related to preference ratio judgments. (SLD)

  7. Long-Term Probability Distribution of Wind Turbine Planetary Bearing Loads (Poster)

    SciTech Connect

    Jiang, Z.; Xing, Y.; Guo, Y.; Dong, W.; Moan, T.; Gao, Z.

    2013-04-01

    Among the various causes of bearing damage and failure, metal fatigue of the rolling contact surface is the dominant failure mechanism. The fatigue life is associated with the load conditions under which wind turbines operate in the field. Therefore, it is important to understand the long-term distribution of the bearing loads under various environmental conditions. The National Renewable Energy Laboratory's 750-kW Gearbox Reliability Collaborative wind turbine is studied in this work. A decoupled analysis using several computer codes is carried out. The global aero-elastic simulations are performed using HAWC2. The time series of the drivetrain loads and motions from the global dynamic analysis are fed to a drivetrain model in SIMPACK. The time-varying internal pressure distribution along the raceway is obtained analytically. A series of probability distribution functions are then used to fit the long-term statistical distribution at different locations along raceways. The long-term distribution of the bearing raceway loads is estimated under different environmental conditions. Finally, the bearing fatigue lives are calculated.
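
    As a hedged illustration of fitting parametric distributions to simulated raceway loads, the sketch below fits a two-parameter Weibull to synthetic per-condition load maxima with SciPy; the data, distribution choice and units are assumptions, not outputs of the HAWC2/SIMPACK analysis.

        import numpy as np
        from scipy import stats

        # Fit a two-parameter Weibull to synthetic per-condition maxima of a
        # bearing raceway load channel (one value per environmental condition).
        rng = np.random.default_rng(0)
        load_maxima = rng.weibull(2.2, size=500) * 45.0   # synthetic loads, kN

        shape, loc, scale = stats.weibull_min.fit(load_maxima, floc=0.0)
        print(f"Weibull shape={shape:.2f}, scale={scale:.1f} kN")

        # long-term exceedance probability of a 100-kN load under the fitted model
        print(stats.weibull_min.sf(100.0, shape, loc=loc, scale=scale))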

  8. Search for correlated radio and optical events in long-term studies of extragalactic sources

    NASA Technical Reports Server (NTRS)

    Pomphrey, R. B.; Smith, A. G.; Leacock, R. J.; Olsson, C. N.; Scott, R. L.; Pollock, J. T.; Edwards, P.; Dent, W. A.

    1976-01-01

    For the first time, long-term records of radio and optical fluxes of a large sample of variable extragalactic sources have been assembled and compared, with linear cross-correlation analysis being used to reinforce the visual comparisons. Only in the case of the BL Lac object OJ 287 is the correlation between radio and optical records strong. In the majority of cases there is no evidence of significant correlation, although nine sources show limited or weak evidence of correlation. The results do not support naive extrapolation of the expanding source model. The general absence of strong correlation between the radio and optical regions has important implications for the energetics of events occurring in such sources.
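
    A minimal sketch of the linear cross-correlation analysis mentioned above follows; it assumes the radio and optical flux records have already been interpolated onto a common, evenly spaced time grid, which is a simplification of how real monitoring data would be handled.

        import numpy as np

        def cross_correlation(a, b, max_lag):
            """Normalized linear cross-correlation of two evenly sampled flux records."""
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            n = len(a)
            lags = np.arange(-max_lag, max_lag + 1)
            cc = [np.mean(a[max(0, -k):n - max(0, k)] * b[max(0, k):n - max(0, -k)])
                  for k in lags]
            return lags, np.array(cc)

        # toy example: the "optical" record lags the "radio" record by 5 samples
        rng = np.random.default_rng(1)
        radio = rng.normal(size=200)
        optical = np.roll(radio, 5) + rng.normal(scale=0.5, size=200)
        lags, cc = cross_correlation(radio, optical, max_lag=20)
        print(lags[np.argmax(cc)])   # correlation peak near +5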

  9. Fire Source Localization Based on Distributed Temperature Sensing by a Dual-Line Optical Fiber System.

    PubMed

    Sun, Miao; Tang, Yuquan; Yang, Shuang; Li, Jun; Sigrist, Markus W; Dong, Fengzhong

    2016-01-01

    We propose a method for localizing a fire source using an optical fiber distributed temperature sensor system. A section of two parallel optical fibers employed as the sensing element is installed near the ceiling of a closed room in which the fire source is located. By measuring the temperature of hot air flows, the problem of three-dimensional fire source localization is transformed to two dimensions. The source localization method is verified with experiments using burning alcohol as the fire source, and it is demonstrated that the method represents a robust and reliable technique for localizing a fire source even over long sensing ranges. PMID:27275822

  10. Fire Source Localization Based on Distributed Temperature Sensing by a Dual-Line Optical Fiber System

    PubMed Central

    Sun, Miao; Tang, Yuquan; Yang, Shuang; Li, Jun; Sigrist, Markus W.; Dong, Fengzhong

    2016-01-01

    We propose a method for localizing a fire source using an optical fiber distributed temperature sensor system. A section of two parallel optical fibers employed as the sensing element is installed near the ceiling of a closed room in which the fire source is located. By measuring the temperature of hot air flows, the problem of three-dimensional fire source localization is transformed to two dimensions. The source localization method is verified with experiments using burning alcohol as the fire source, and it is demonstrated that the method represents a robust and reliable technique for localizing a fire source even over long sensing ranges. PMID:27275822
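
    The sketch below illustrates one possible way to carry out the two-dimensional localization described above: the along-fiber coordinate is taken from the hot-spot positions on the two parallel fibers, and the cross-fiber coordinate from the ratio of their peak temperature rises. The ratio-based cross-fiber estimate is an assumption made for this sketch, not the localization model used in the paper.

        import numpy as np

        def locate_fire(x, T1, T2, d, ambient=20.0):
            """Estimate (x, y) of the source from temperature profiles T1(x), T2(x)
            measured along two parallel fibers at y = 0 and y = d (all in metres)."""
            i1, i2 = np.argmax(T1), np.argmax(T2)
            x_src = 0.5 * (x[i1] + x[i2])      # along-fiber position of the hot spot
            r1 = T1[i1] - ambient              # peak temperature rise, fiber 1
            r2 = T2[i2] - ambient              # peak temperature rise, fiber 2
            y_src = d * r2 / (r1 + r2)         # closer to the hotter fiber (assumed model)
            return x_src, y_src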

  11. Accident source terms for Light-Water Nuclear Power Plants. Final report

    SciTech Connect

    Soffer, L.; Burson, S.B.; Ferrell, C.M.; Lee, R.Y.; Ridgely, J.N.

    1995-02-01

    In 1962 the US Atomic Energy Commission published TID-14844, "Calculation of Distance Factors for Power and Test Reactors", which specified a release of fission products from the core to the reactor containment for a postulated accident involving "substantial meltdown of the core". This "source term", the basis for the NRC's Regulatory Guides 1.3 and 1.4, has been used to determine compliance with the NRC's reactor site criteria, 10 CFR Part 100, and to evaluate other important plant performance requirements. During the past 30 years substantial additional information on fission product releases has been developed based on significant severe accident research. This document utilizes this research by providing more realistic estimates of the "source term" release into containment, in terms of timing, nuclide types, quantities and chemical form, given a severe core-melt accident. This revised "source term" is to be applied to the design of future light water reactors (LWRs). Current LWR licensees may voluntarily propose applications based upon it.

  12. Identifying Synonymy between SNOMED Clinical Terms of Varying Length Using Distributional Analysis of Electronic Health Records

    PubMed Central

    Henriksson, Aron; Conway, Mike; Duneld, Martin; Chapman, Wendy W.

    2013-01-01

    Medical terminologies and ontologies are important tools for natural language processing of health record narratives. To account for the variability of language use, synonyms need to be stored in a semantic resource as textual instantiations of a concept. Developing such resources manually is, however, prohibitively expensive and likely to result in low coverage. To facilitate and expedite the process of lexical resource development, distributional analysis of large corpora provides a powerful data-driven means of (semi-)automatically identifying semantic relations, including synonymy, between terms. In this paper, we demonstrate how distributional analysis of a large corpus of electronic health records – the MIMIC-II database – can be employed to extract synonyms of SNOMED CT preferred terms. A distinctive feature of our method is its ability to identify synonymous relations between terms of varying length. PMID:24551362
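
    A toy sketch of the distributional approach described above follows: each term is represented by a bag-of-words context vector collected from the sentences that mention it, and candidate synonym pairs are scored by cosine similarity. The corpus and terms are invented placeholders, not MIMIC-II content, and real systems would use far richer context models.

        import numpy as np
        from collections import Counter

        # Toy distributional model: a term's context vector counts the other words
        # in sentences mentioning it; synonym candidates are ranked by cosine.
        corpus = [
            "patient denies chest pain or dyspnea",
            "no shortness of breath or chest pain reported",
            "dyspnea on exertion improved with rest",
            "shortness of breath at rest overnight",
        ]

        def context_vector(term, docs, vocab):
            counts = Counter()
            for doc in docs:
                if term in doc:
                    counts.update(w for w in doc.split() if w not in term.split())
            return np.array([counts[w] for w in vocab], dtype=float)

        vocab = sorted({w for doc in corpus for w in doc.split()})
        v1 = context_vector("dyspnea", corpus, vocab)
        v2 = context_vector("shortness of breath", corpus, vocab)  # multi-word term
        cos = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
        print(f"cosine(dyspnea, shortness of breath) = {cos:.2f}")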

  13. Reconstruction of Far-Field Tsunami Amplitude Distributions from Earthquake Sources

    NASA Astrophysics Data System (ADS)

    Geist, Eric L.; Parsons, Tom

    2016-04-01

    The probability distribution of far-field tsunami amplitudes is explained in relation to the distribution of seismic moment at subduction zones. Tsunami amplitude distributions at tide gauge stations follow a similar functional form, well described by a tapered Pareto distribution that is parameterized by a power-law exponent and a corner amplitude. Distribution parameters are first established for eight tide gauge stations in the Pacific, using maximum likelihood estimation. A procedure is then developed to reconstruct the tsunami amplitude distribution that consists of four steps: (1) define the distribution of seismic moment at subduction zones; (2) establish a source-station scaling relation from regression analysis; (3) transform the seismic moment distribution to a tsunami amplitude distribution for each subduction zone; and (4) mix the transformed distribution for all subduction zones to an aggregate tsunami amplitude distribution specific to the tide gauge station. The tsunami amplitude distribution is adequately reconstructed for four tide gauge stations using globally constant seismic moment distribution parameters established in previous studies. In comparisons to empirical tsunami amplitude distributions from maximum likelihood estimation, the reconstructed distributions consistently exhibit higher corner amplitude values, implying that in most cases, the empirical catalogs are too short to include the largest amplitudes. Because the reconstructed distribution is based on a catalog of earthquakes that is much larger than the tsunami catalog, it is less susceptible to the effects of record-breaking events and more indicative of the actual distribution of tsunami amplitudes.

  14. Reconstruction of far-field tsunami amplitude distributions from earthquake sources

    USGS Publications Warehouse

    Geist, Eric L.; Parsons, Thomas E.

    2016-01-01

    The probability distribution of far-field tsunami amplitudes is explained in relation to the distribution of seismic moment at subduction zones. Tsunami amplitude distributions at tide gauge stations follow a similar functional form, well described by a tapered Pareto distribution that is parameterized by a power-law exponent and a corner amplitude. Distribution parameters are first established for eight tide gauge stations in the Pacific, using maximum likelihood estimation. A procedure is then developed to reconstruct the tsunami amplitude distribution that consists of four steps: (1) define the distribution of seismic moment at subduction zones; (2) establish a source-station scaling relation from regression analysis; (3) transform the seismic moment distribution to a tsunami amplitude distribution for each subduction zone; and (4) mix the transformed distribution for all subduction zones to an aggregate tsunami amplitude distribution specific to the tide gauge station. The tsunami amplitude distribution is adequately reconstructed for four tide gauge stations using globally constant seismic moment distribution parameters established in previous studies. In comparisons to empirical tsunami amplitude distributions from maximum likelihood estimation, the reconstructed distributions consistently exhibit higher corner amplitude values, implying that in most cases, the empirical catalogs are too short to include the largest amplitudes. Because the reconstructed distribution is based on a catalog of earthquakes that is much larger than the tsunami catalog, it is less susceptible to the effects of record-breaking events and more indicative of the actual distribution of tsunami amplitudes.
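
    As a hedged sketch of step (4) above, the code below evaluates a tapered Pareto survival function for each subduction zone and mixes the zones with rate-proportional weights. The functional form follows the tapered Pareto named in the abstract; the parameter values and weights are illustrative, not the fitted station values.

        import numpy as np

        # Tapered Pareto survival function:
        #   S(a) = (a_t / a)**beta * exp((a_t - a) / a_c),  for a >= a_t,
        # where beta is the power-law exponent and a_c the corner amplitude.
        def tapered_pareto_sf(a, beta, a_c, a_t=0.01):
            a = np.asarray(a, dtype=float)
            return (a_t / a) ** beta * np.exp((a_t - a) / a_c)

        # Aggregate station distribution: mix per-zone survival functions with
        # weights proportional to each zone's event rate (illustrative values).
        def mixed_sf(a, params, weights):
            w = np.asarray(weights, dtype=float)
            w = w / w.sum()
            return sum(wi * tapered_pareto_sf(a, b, c) for wi, (b, c) in zip(w, params))

        print(mixed_sf(0.5, params=[(1.0, 0.8), (0.8, 1.5)], weights=[2.0, 1.0]))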

  15. Distribution of Short-Term and Lifetime Predicted Risks of Cardiovascular Diseases in Peruvian Adults

    PubMed Central

    Quispe, Renato; Bazo-Alvarez, Juan Carlos; Burroughs Peña, Melissa S; Poterico, Julio A; Gilman, Robert H; Checkley, William; Bernabé-Ortiz, Antonio; Huffman, Mark D; Miranda, J Jaime

    2015-01-01

    Background Short-term risk assessment tools for prediction of cardiovascular disease events are widely recommended in clinical practice and are used largely for single time-point estimations; however, persons with low predicted short-term risk may have higher risks across longer time horizons. Methods and Results We estimated short-term and lifetime cardiovascular disease risk in a pooled population from 2 studies of Peruvian populations. Short-term risk was estimated using the atherosclerotic cardiovascular disease Pooled Cohort Risk Equations. Lifetime risk was evaluated using the algorithm derived from the Framingham Heart Study cohort. Using previously published thresholds, participants were classified into 3 categories: low short-term and low lifetime risk, low short-term and high lifetime risk, and high short-term predicted risk. We also compared the distribution of these risk profiles across educational level, wealth index, and place of residence. We included 2844 participants (50% men, mean age 55.9 years [SD 10.2 years]) in the analysis. Approximately 1 of every 3 participants (34% [95% CI 33 to 36]) had a high short-term estimated cardiovascular disease risk. Among those with a low short-term predicted risk, more than half (54% [95% CI 52 to 56]) had a high lifetime predicted risk. Short-term and lifetime predicted risks were higher for participants with lower versus higher wealth indexes and educational levels and for those living in urban versus rural areas (P<0.01). These results were consistent by sex. Conclusions These findings highlight potential shortcomings of using short-term risk tools for primary prevention strategies because a substantial proportion of Peruvian adults were classified as low short-term risk but high lifetime risk. Vulnerable adults, such as those from low socioeconomic status and those living in urban areas, may need greater attention regarding cardiovascular preventive strategies. PMID:26254303
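
    The three-way classification described above can be written as a simple rule; the cut-points below (a 10-year ASCVD risk of 7.5% and a lifetime risk of 39%) are assumed for illustration and should be checked against the thresholds actually used in the paper.

        # Assumed thresholds for illustration only.
        def risk_profile(short_term, lifetime, short_cut=0.075, life_cut=0.39):
            if short_term >= short_cut:
                return "high short-term risk"
            if lifetime >= life_cut:
                return "low short-term / high lifetime risk"
            return "low short-term / low lifetime risk"

        print(risk_profile(0.04, 0.46))   # -> low short-term / high lifetime risk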

  16. An altitude and distance correction to the source fluence distribution of TGFs

    NASA Astrophysics Data System (ADS)

    Nisi, R. S.; Østgaard, N.; Gjesteland, T.; Collier, A. B.

    2014-10-01

    The source fluence distribution of terrestrial gamma ray flashes (TGFs) has been extensively discussed in recent years, but few have considered how the TGF fluence distribution at the source, as estimated from satellite measurements, depends on the distance from the satellite foot point and the assumed production altitude. As the absorption of the TGF photons increases significantly with lower source altitude and larger distance between the source and the observing satellite, these might be important factors. We have addressed the issue by using the tropopause pressure distribution as an approximation of the TGF production altitude distribution and World Wide Lightning Location Network spheric measurements to determine the distance. The study is made possible by the increased number of Ramaty High Energy Solar Spectroscopic Imager (RHESSI) TGFs found in the second catalog of the RHESSI data. One finding is that the TGF/lightning ratio for the tropics probably has an annual variability due to an annual variability in the Brewer-Dobson circulation. The main result is an indication that the altitude distribution and distance should be considered when investigating the source fluence distribution of TGFs, as this leads to a softening of the inferred distribution of source brightness.

  17. An altitude and distance correction to the source fluence distribution of TGFs

    PubMed Central

    Nisi, R S; Østgaard, N; Gjesteland, T; Collier, A B

    2014-01-01

    The source fluence distribution of terrestrial gamma ray flashes (TGFs) has been extensively discussed in recent years, but few have considered how the TGF fluence distribution at the source, as estimated from satellite measurements, depends on the distance from the satellite foot point and the assumed production altitude. As the absorption of the TGF photons increases significantly with lower source altitude and larger distance between the source and the observing satellite, these might be important factors. We have addressed the issue by using the tropopause pressure distribution as an approximation of the TGF production altitude distribution and World Wide Lightning Location Network spheric measurements to determine the distance. The study is made possible by the increased number of Ramaty High Energy Solar Spectroscopic Imager (RHESSI) TGFs found in the second catalog of the RHESSI data. One finding is that the TGF/lightning ratio for the tropics probably has an annual variability due to an annual variability in the Brewer-Dobson circulation. The main result is an indication that the altitude distribution and distance should be considered when investigating the source fluence distribution of TGFs, as this leads to a softening of the inferred distribution of source brightness. PMID:26167434

  18. Optimal source distribution for binaural synthesis over loudspeakers

    NASA Astrophysics Data System (ADS)

    Takeuchi, Takashi; Nelson, Philip A.

    2002-12-01

    When binaural sound signals are presented with loudspeakers, the system inversion involved gives rise to a number of problems such as a loss of dynamic range and a lack of robustness to small errors and room reflections. The amplification required by the system inversion results in loss of dynamic range. The control performance of such a system deteriorates severely due to small errors resulting from, e.g., misalignment of the system and individual differences in the head related transfer functions at certain frequencies. The required large sound radiation results in severe reflection which also reduces the control performance. A method of overcoming these fundamental problems is proposed in this paper. A conceptual monopole transducer is introduced whose position varies continuously as frequency varies. This gives a minimum processing requirement of the binaural signals for the control to be achieved and all the above problems either disappear or are minimized. The inverse filters have flat amplitude response and the reproduced sound is not colored even outside the relatively large ``sweet area.'' A number of practical solutions are suggested for the realization of such optimally distributed transducers. One of them is a discretization that enables the use of conventional transducer units.

  19. Decoy-state quantum key distribution with a leaky source

    NASA Astrophysics Data System (ADS)

    Tamaki, Kiyoshi; Curty, Marcos; Lucamarini, Marco

    2016-06-01

    In recent years, there has been a great effort to prove the security of quantum key distribution (QKD) with a minimum number of assumptions. Besides its intrinsic theoretical interest, this would allow for larger tolerance against device imperfections in the actual implementations. However, even in this device-independent scenario, one assumption seems unavoidable, that is, the presence of a protected space devoid of any unwanted information leakage in which the legitimate parties can privately generate, process and store their classical data. In this paper we relax this unrealistic and hardly feasible assumption and introduce a general formalism to tackle the information leakage problem in most of existing QKD systems. More specifically, we prove the security of optical QKD systems using phase and intensity modulators in their transmitters, which leak the setting information in an arbitrary manner. We apply our security proof to cases of practical interest and show key rates similar to those obtained in a perfectly shielded environment. Our work constitutes a fundamental step forward in guaranteeing implementation security of quantum communication systems.

  20. Measurements of Infrared and Acoustic Source Distributions in Jet Plumes

    NASA Technical Reports Server (NTRS)

    Agboola, Femi A.; Bridges, James; Saiyed, Naseem

    2004-01-01

    The aim of this investigation was to use linear phased array (LPA) microphones and infrared (IR) imaging to study the effects of advanced nozzle-mixing techniques on jet noise reduction. Several full-scale engine nozzles were tested at varying power cycles with the linear phased array setup parallel to the jet axis. The array consisted of 16 sparsely distributed microphones. The phased array microphone measurements were taken at a distance of 51.0 ft (15.5 m) from the jet axis, and the results were used to obtain relative overall sound pressure levels from one nozzle design to the other. The IR imaging system was used to acquire real-time dynamic thermal patterns of the exhaust jet from the nozzles tested. The IR camera measured the IR radiation from the nozzle exit to a distance of six fan diameters (X/D(sub FAN) = 6), along the jet plume axis. The images confirmed the expected jet plume mixing intensity, and the phased array results showed the differences in sound pressure level with respect to nozzle configurations. The results show the effects of changes in exit nozzle configuration on both the flow mixing patterns and the radiant energy dissipation patterns. By comparing the results from these two measurements, a relationship between noise reduction and core/bypass flow mixing is demonstrated.
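
    A minimal frequency-domain delay-and-sum beamforming sketch is given below to illustrate how a linear array can map relative noise source strength along the jet axis; the array span is an assumed value, the standoff matches the 15.5 m distance quoted above, and the processing is a generic textbook formulation rather than the study's actual algorithm.

        import numpy as np

        c = 343.0                                  # speed of sound, m/s
        mics = np.linspace(0.0, 7.5, 16)           # 16 mics along a line (m), assumed span
        standoff = 15.5                            # array-to-jet-axis distance (m)

        def steering_vector(scan_x, f):
            """Steering vector toward a point on the jet axis at axial position scan_x."""
            r = np.hypot(mics - scan_x, standoff)  # mic-to-scan-point distances
            return np.exp(-2j * np.pi * f * r / c) / r

        def beamform_map(spectra, f, scan_grid):
            """spectra: complex FFT bin at frequency f for each microphone, shape (16,)."""
            power = []
            for x in scan_grid:
                w = steering_vector(x, f)
                power.append(np.abs(np.vdot(w, spectra)) ** 2 / np.vdot(w, w).real)
            return np.array(power)                 # relative source strength vs. axial position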

  1. A Systematic Search for Short-term Variability of EGRET Sources

    NASA Technical Reports Server (NTRS)

    Wallace, P. M.; Bertsch, D. L.; Bloom, S. D.; Griffis, N. J.; Hunter, S. D.; Kniffen, D. A.; Thompson, D. J.

    1999-01-01

    The 3rd EGRET Catalog contains 170 unidentified high-energy (E>100 MeV) gamma-ray sources, and there is great interest in the nature of these sources. One means of determining source class is the study of flux variability on time scales of days; pulsars are believed to be stable on these scales while blazars are known to be highly variable. In addition, previous work has led to the discovery of 2CG 135+01 and GRO J1838-04, candidates for a new high-energy gamma-ray source class. These sources display transient behavior but cannot be associated with any known blazars. These considerations have led us to conduct a systematic search for short-term variability in EGRET data, covering all viewing periods through cycle 4. Three unidentified sources show some evidence of variability on short time scales; the source displaying the most convincing variability, 3EG J2006-2321, is not easily identified as a blazar.
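
    A hedged sketch of a simple variability test is shown below: daily fluxes with Gaussian errors are compared against the weighted-mean constant-flux hypothesis with a chi-square statistic. This is a common simplification; the published search may use a different, likelihood-based statistic.

        import numpy as np
        from scipy import stats

        def variability_pvalue(flux, flux_err):
            """Chi-square test of daily fluxes against a constant (weighted-mean) flux."""
            flux, flux_err = np.asarray(flux, float), np.asarray(flux_err, float)
            w = 1.0 / flux_err ** 2
            mean = np.sum(w * flux) / np.sum(w)
            chi2 = np.sum(((flux - mean) / flux_err) ** 2)
            return stats.chi2.sf(chi2, df=len(flux) - 1)   # small p-value => variable

        print(variability_pvalue([1.2, 0.8, 2.6, 0.9], [0.3, 0.3, 0.4, 0.3]))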

  2. Analytic solutions of the time-dependent quasilinear diffusion equation with source and loss terms

    SciTech Connect

    Hassan, M.H.A.; Hamza, E.A.

    1993-08-01

    A simplified one-dimensional quasilinear diffusion equation describing the time evolution of collisionless ions in the presence of ion-cyclotron-resonance heating, sources, and losses is solved analytically for all harmonics of the ion cyclotron frequency. Simple time-dependent distribution functions which are initially Maxwellian and vanish at high energies are obtained and calculated numerically for the first four harmonics of resonance heating. It is found that the strongest ion tail of the resulting anisotropic distribution function is driven by heating at the second harmonic followed by heating at the fundamental frequency.
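
    For orientation, the sketch below integrates a generic equation of the type described above, df/dt = d/dv( D(v) df/dv ) + S(v) - f/tau, with an explicit finite-difference scheme. The diffusion coefficient, source and loss terms are placeholders, not the ion-cyclotron-resonance-heating forms solved analytically in the paper.

        import numpy as np

        # Explicit finite differences for df/dt = d/dv( D(v) df/dv ) + S(v) - f/tau.
        nv, dv, dt, tau = 100, 0.05, 5e-5, 5.0
        v = np.arange(nv) * dv
        D = 0.5 * (1.0 + v ** 2)              # placeholder velocity-space diffusion coefficient
        S = np.exp(-v ** 2)                   # placeholder particle source
        f = np.exp(-v ** 2 / 2.0)             # initially Maxwellian-like distribution

        D_half = 0.5 * (D[1:] + D[:-1])       # D at cell interfaces
        for _ in range(2000):
            flux = D_half * np.diff(f) / dv   # D * df/dv at interfaces
            div = np.zeros_like(f)
            div[1:-1] = np.diff(flux) / dv
            f += dt * (div + S - f / tau)
            f[-1] = 0.0                       # distribution vanishes at high energy

        print(f[:5])                          # low-velocity part of the evolved distribution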

  3. Differential dose contributions on total dose distribution of 125I brachytherapy source

    PubMed Central

    Camgöz, B.; Yeğin, G.; Kumru, M.N.

    2010-01-01

    This work provides an improvement of the Monte Carlo simulation approach for the Amersham Model 6711 125I brachytherapy seed source, which is well known from many theoretical and experimental studies. The source, which has a simple geometry, was studied with respect to the criteria of the AAPM TG-43 report. The approach offered by this study involves determining the differential dose contributions that come from virtual partitions of the massive radioactive element of the studied source to the total dose at an analytical calculation point. Some brachytherapy seeds contain multiple radioactive elements, so the dose at any point is the sum of the separate doses from each element. For clinical treatments, it is important to know accurately the angular and radial dose distributions around a source located in cancerous tissue. The interior geometry of a source affects the characteristics of its dose distribution. Dose information on the inner geometrical structure of a brachytherapy source cannot be acquired by experimental methods because of the physical limits of material and geometry in healthy tissue, so Monte Carlo simulation is the required approach for this study. The EGSnrc Monte Carlo simulation software was used. In the design of the simulation, the radioactive source was divided into 10 rings, partitioned but not separate from each other. All differential sources were simulated for the dose calculation, and the shape of the resulting dose distribution was compared with the distribution of a single complete source. In this work the anisotropy function was also examined mathematically. PMID:24376927
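
    The superposition idea behind the differential-contribution analysis can be sketched as below: the dose at a point is an activity-weighted sum over the ring sub-sources. A bare inverse-square kernel stands in for the per-ring Monte Carlo result, and the ring positions and weights are illustrative only.

        import numpy as np

        # Activity-weighted superposition of 10 ring sub-sources along the seed axis.
        ring_z = np.linspace(-0.15, 0.15, 10)    # ring centres along the seed axis (cm), assumed
        weights = np.full(10, 0.1)               # equal activity fraction per ring

        def total_dose(point, kernel=lambda r: 1.0 / r ** 2):
            """Relative dose at (x, y, z) as the weighted sum of per-ring contributions."""
            x, y, z = point
            r = np.sqrt(x ** 2 + y ** 2 + (z - ring_z) ** 2)   # distances to ring centres
            return float(np.sum(weights * kernel(r)))

        print(total_dose((1.0, 0.0, 0.0)))       # relative dose 1 cm from the seed centre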

  4. Comparing two micrometeorological techniques for estimating trace gas emissions from distributed sources

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Measuring trace gas emission from distributed sources such as treatment lagoons, treatment wetlands, land spread of manure, and feedlots requires micrometeorological methods. In this study, we tested the accuracy of two relatively new micrometeorological techniques, vertical radial plume mapping (VR...

  5. 26 CFR 1.316-2 - Sources of distribution in general.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... earnings and profits. In determining the source of a distribution, consideration should be given first, to.... (b) If the earnings and profits of the taxable year (computed as of the close of the year...

  6. 26 CFR 1.316-2 - Sources of distribution in general.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... earnings and profits. In determining the source of a distribution, consideration should be given first, to.... (b) If the earnings and profits of the taxable year (computed as of the close of the year...

  7. 26 CFR 1.316-2 - Sources of distribution in general.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... earnings and profits. In determining the source of a distribution, consideration should be given first, to.... (b) If the earnings and profits of the taxable year (computed as of the close of the year...

  8. 26 CFR 1.316-2 - Sources of distribution in general.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... and profits. In determining the source of a distribution, consideration should be given first, to the... earnings and profits of the taxable year (computed as of the close of the year without diminution by...

  9. 26 CFR 1.316-2 - Sources of distribution in general.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... and profits. In determining the source of a distribution, consideration should be given first, to the... earnings and profits of the taxable year (computed as of the close of the year without diminution by...

  10. GEOCHEMISTRY OF PAHS IN AQUATIC ENVIRONMENTS: A SYNTHESIS OF DISTRIBUTION, SOURCE, PERSISTENCE, PARTITIONING AND BIOAVAILABILITY

    EPA Science Inventory

    On the basis of their distributions, sources, persistence, partitioning and bioavailability, polycyclic aromatic hydrocarbons (PAHs) are a unique class of persistent organic pollutants (POPs) contaminating the aquatic environment. They are of particular interest to geochemists an...