Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P
2016-10-01
An open issue still under investigation by several international entities working in the safety and security fields for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity in severe accident scenarios. Source term estimation is a key safety issue to be addressed in future reactor safety assessments, and the estimates available at present are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from a broad information-gathering exercise. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.
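The screening step named in the abstract above can be sketched with a minimal one-at-a-time (elementary-effects) procedure. The `dust_release` model below is entirely hypothetical, invented for illustration; only the screening technique itself reflects what the abstract describes.

```python
import math

def dust_release(params):
    """Hypothetical dust mobilization model (illustrative only):
    release grows with flow velocity and temperature and falls
    with particle diameter."""
    v, T, d = params["v"], params["T"], params["d"]
    return 0.1 * v**1.5 * math.exp(-2000.0 / T) / d

def elementary_effects(model, nominal, delta=0.05):
    """One-at-a-time screening: perturb each parameter by a relative
    +delta and report the normalized (elasticity-like) effect."""
    base = model(nominal)
    effects = {}
    for name in nominal:
        perturbed = dict(nominal)
        perturbed[name] = nominal[name] * (1.0 + delta)
        effects[name] = (model(perturbed) - base) / (base * delta)
    return effects

nominal = {"v": 10.0, "T": 800.0, "d": 1e-6}
print(elementary_effects(dust_release, nominal))
```

Parameters with small normalized effects would be frozen at nominal values before the more expensive sensitivity and uncertainty analyses.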
Radiological analysis of plutonium glass batches with natural/enriched boron
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rainisch, R.
2000-06-22
The disposition of surplus plutonium inventories by the US Department of Energy (DOE) includes the immobilization of certain plutonium materials in a borosilicate glass matrix, also referred to as vitrification. This paper addresses source terms of plutonium masses immobilized in a borosilicate glass matrix where the glass components include either natural boron or enriched boron. The calculated source terms pertain to neutron and gamma source strength (particles per second) and source spectrum changes. The source terms calculated for natural boron and for enriched boron are compared to determine the benefit (decrease in radiation source terms) of using enriched boron. The analysis of plutonium glass source terms shows that a large component of the neutron source terms is due to (α, n) reactions. The americium-241 and plutonium present in the glass emit alpha particles (α). These alpha particles interact with low-Z nuclides like B-11, B-10, and O-17 in the glass to produce neutrons. The low-Z nuclides are referred to as target particles. The reference glass contains 9.4 wt percent B2O3. Boron-11 was found to strongly support the (α, n) reactions in the glass matrix. B-11 has a natural abundance of over 80 percent. The (α, n) reaction rates for B-10 are lower than for B-11, and the analysis shows that the plutonium glass neutron source terms can be reduced by artificially enriching natural boron in B-10. The natural abundance of B-10 is 19.9 percent. Boron enriched to 96 wt percent B-10 or above can be obtained commercially. Since lower source terms imply lower dose rates to radiation workers handling the plutonium glass materials, it is important to know the achievable decrease in source terms as a result of boron enrichment. Plutonium materials are normally handled in glove boxes with shielded glass windows, and the work entails both extremity and whole-body exposures. Lowering the source terms of the plutonium batches will make the handling of these materials less difficult and will reduce radiation exposure to operating workers.
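The benefit of B-10 enrichment described above can be sketched as an abundance-weighted yield sum. The relative (α, n) yields below are placeholder numbers, not evaluated nuclear data; they only encode the qualitative statement in the abstract that B-11 out-produces B-10.

```python
# Illustrative only: yields are placeholders, not evaluated nuclear data.
Y_B11 = 1.0   # relative (alpha,n) thick-target yield of B-11 (placeholder)
Y_B10 = 0.4   # relative yield of B-10, lower than B-11 (placeholder)

def relative_neutron_source(frac_b10):
    """Boron (alpha,n) neutron source as an abundance-weighted
    sum over the two isotopic yields."""
    return frac_b10 * Y_B10 + (1.0 - frac_b10) * Y_B11

natural = relative_neutron_source(0.199)   # natural boron: 19.9% B-10
enriched = relative_neutron_source(0.96)   # commercial enrichment: 96% B-10
print(f"reduction factor: {natural / enriched:.2f}")
```

With these placeholder yields the enriched glass emits roughly half the neutrons of the natural-boron glass; the real reduction depends on the actual isotopic yields and the non-boron target nuclides.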
Directional Unfolded Source Term (DUST) for Compton Cameras.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Dean J.; Horne, Steven M.; O'Brien, Sean
2018-03-01
A Directional Unfolded Source Term (DUST) algorithm was developed to enable improved spectral analysis capabilities using data collected by Compton cameras. Achieving this objective required modification of the detector response function in the Gamma Detector Response and Analysis Software (GADRAS). Experimental data that were collected in support of this work include measurements of calibration sources at a range of separation distances and cylindrical depleted uranium castings.
Circular current loops, magnetic dipoles and spherical harmonic analysis.
Alldredge, L.R.
1980-01-01
Spherical harmonic analysis (SHA) is the most used method of describing the Earth's magnetic field, even though spherical harmonic coefficients (SHC) almost completely defy interpretation in terms of real sources. Some moderately successful efforts have been made to represent the field in terms of dipoles placed in the core in an effort to have the model come closer to representing real sources. Dipole sources are only a first approximation to the real sources which are thought to be a very complicated network of electrical currents in the core of the Earth. -Author
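The dipole approximation discussed above corresponds to keeping only the degree-1 terms of the spherical harmonic expansion. A minimal sketch, assuming a purely axial dipole with a Gauss coefficient near the modern IGRF value (an assumption; Alldredge's analysis concerns historical fields and off-center dipoles):

```python
import math

G10 = -29404.8   # nT, axial dipole Gauss coefficient (approx. IGRF-13, 2020)
A = 6371.2       # km, geomagnetic reference radius

def dipole_field(r_km, colat_deg):
    """Field of the n=1, m=0 spherical-harmonic (axial dipole) term.
    Returns (B_r, B_theta, |B|) in nT; sign convention is illustrative."""
    t = math.radians(colat_deg)
    s = (A / r_km) ** 3
    br = 2.0 * s * (-G10) * math.cos(t)
    bt = s * (-G10) * math.sin(t)
    return br, bt, math.hypot(br, bt)

# the surface field at the pole is twice the equatorial value
_, _, b_pole = dipole_field(A, 0.0)
_, _, b_eq = dipole_field(A, 90.0)
print(b_pole, b_eq)
```

The factor-of-two pole/equator ratio is the signature of the pure dipole; departures from it are what the higher-degree SHC (and the current networks they stand in for) must absorb.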
BWR ASSEMBLY SOURCE TERMS FOR WASTE PACKAGE DESIGN
DOE Office of Scientific and Technical Information (OSTI.GOV)
T.L. Lotz
1997-02-15
This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) to provide boiling water reactor (BWR) assembly radiation source term data for use during Waste Package (WP) design. The BWR assembly radiation source terms are to be used for evaluation of radiolysis effects at the WP surface, and for personnel shielding requirements during assembly or WP handling operations. The objectives of this evaluation are to generate BWR assembly radiation source terms that bound selected groupings of BWR assemblies, with regard to assembly average burnup and cooling time, which comprise the anticipated MGDS BWR commercial spent nuclear fuel (SNF) waste stream. The source term data is to be provided in a form which can easily be utilized in subsequent shielding/radiation dose calculations. Since these calculations may also be used for Total System Performance Assessment (TSPA), with appropriate justification provided by TSPA, or radionuclide release rate analysis, the grams of each element and additional cooling times out to 25 years will also be calculated and the data included in the output files.
Source term model evaluations for the low-level waste facility performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yim, M.S.; Su, S.I.
1995-12-31
The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.
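The model-to-model comparison described above can be sketched with two stylized carbon-14 release models. Neither function reproduces PRESTO-EPA-CPG, IMPACTS, DUST, or NEFTRAN-II; they are generic dissolution-limited and diffusion-limited forms, chosen only to show how different release assumptions diverge over time.

```python
import math

INV = 1.0                        # normalized initial C-14 inventory
LAMBDA = math.log(2) / 5730.0    # C-14 decay constant, 1/yr

def release_dissolution(t, k=1e-3):
    """Uniform-dissolution model: constant fractional release rate k (1/yr),
    corrected for decay. Illustrative form, not a specific code's model."""
    return INV * k * math.exp(-(k + LAMBDA) * t)

def release_diffusion(t, d_eff=1e-4):
    """Diffusion-controlled model: release rate falls as 1/sqrt(t)
    (semi-infinite-medium approximation). Illustrative form."""
    return INV * math.sqrt(d_eff / (math.pi * max(t, 1e-9))) * math.exp(-LAMBDA * t)

for t in (1.0, 10.0, 100.0):
    print(t, release_dissolution(t), release_diffusion(t))
```

Even this toy pair shows why sensitivity analysis on the leach-rate and diffusivity parameters matters: the two forms agree at no single time, so the choice of mechanism dominates the predicted source term.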
Enhancing GADRAS Source Term Inputs for Creation of Synthetic Spectra.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horne, Steven M.; Harding, Lee
The Gamma Detector Response and Analysis Software (GADRAS) team has enhanced the source term input for the creation of synthetic spectra. These enhancements include the following: allowing users to programmatically provide source information to GADRAS through memory, rather than through a string limited to 256 characters; allowing users to provide their own source decay database information; and updating the default GADRAS decay database to fix errors and include coincident gamma information.
Auditing the multiply-related concepts within the UMLS
Mougin, Fleur; Grabar, Natalia
2014-01-01
Objective: This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. Methods: We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. Results: At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Discussion: Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. PMID:24464853
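The audit of relation combinations can be sketched as a small classifier over relation labels. The labels and rules below are illustrative, not the UMLS relation schema:

```python
def classify(relations):
    """Classify the set of relations found between one concept pair
    (labels and rules are invented for illustration)."""
    kinds = set(relations)
    if len(kinds) == 1:
        return "homogeneous"
    if "broader" in kinds and "narrower" in kinds:
        return "contradictory"          # mutually exclusive directions
    return "non-homogeneous"

# hypothetical concept pairs and the relations their sources assert
pairs = {
    ("C1", "C2"): ["broader", "broader"],
    ("C3", "C4"): ["broader", "narrower"],
    ("C5", "C6"): ["broader", "related"],
}
counts = {}
for rels in pairs.values():
    label = classify(rels)
    counts[label] = counts.get(label, 0) + 1
print(counts)
```

Running such a classifier over every multiply-related pair, once per source vocabulary and once UMLS-wide, is what lets the paper attribute contradictions either to a source or to the integration step.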
Final safety analysis report for the Galileo Mission: Volume 2: Book 1, Accident model document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The Accident Model Document (AMD) is the second volume of the three-volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTGs employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTGs is termed General Purpose Heat Source (GPHS), and the RTGs are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population resulting from postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences which might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence. 27 figs., 11 tabs.
Mohammadi, A; Hassanzadeh, M; Gharib, M
2016-02-01
In this study, shielding calculations and a criticality safety analysis were carried out for a generic material testing reactor (MTR) research reactor interim storage facility and the associated transportation cask. The process comprised three major calculation tasks: source term, shielding, and criticality. The Monte Carlo transport code MCNP5 was used for the shielding and criticality safety calculations, and the ORIGEN2.1 code for the source term calculation. According to the results obtained, a cylindrical cask with body, top, and bottom thicknesses of 18, 13, and 13 cm, respectively, was accepted as the dual-purpose cask. Furthermore, it is shown that the total dose rates are below the normal transport criteria, meeting the specified standards.
Long-term variability in bright hard X-ray sources: 5+ years of BATSE data
NASA Technical Reports Server (NTRS)
Robinson, C. R.; Harmon, B. A.; McCollough, M. L.; Paciesas, W. S.; Sahi, M.; Scott, D. M.; Wilson, C. A.; Zhang, S. N.; Deal, K. J.
1997-01-01
The operation of the Compton Gamma Ray Observatory (CGRO) Burst and Transient Source Experiment (BATSE) continues to provide data for inclusion in a database for the analysis of long-term variability in bright, hard X-ray sources. The all-sky capability of BATSE provides up to 30 flux measurements per day for each source. The long baseline and the various rising and setting occultation flux measurements allow searches for periodic and quasi-periodic signals with periods between several hours and hundreds of days. The preliminary results from an analysis of the hard X-ray variability in 24 of the brightest BATSE sources are presented. Power density spectra are computed for each source, and profiles are presented of the hard X-ray orbital modulations in some X-ray binaries, together with amplitude modulations and variations in outburst durations and intensities in recurrent X-ray transients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Jerden, James
2016-10-01
The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regards to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.
Forcing scheme analysis for the axisymmetric lattice Boltzmann method under incompressible limit.
Zhang, Liangqi; Yang, Shiliang; Zeng, Zhong; Chen, Jie; Yin, Linmao; Chew, Jia Wei
2017-04-01
Because the standard lattice Boltzmann (LB) method is proposed for Cartesian Navier-Stokes (NS) equations, additional source terms are necessary in the axisymmetric LB method for representing the axisymmetric effects. Therefore, the accuracy and applicability of the axisymmetric LB models depend on the forcing schemes adopted for discretization of the source terms. In this study, three forcing schemes, namely, the trapezium rule based scheme, the direct forcing scheme, and the semi-implicit centered scheme, are analyzed theoretically by investigating their derived macroscopic equations in the diffusive scale. Particularly, the finite difference interpretation of the standard LB method is extended to the LB equations with source terms, and then the accuracy of different forcing schemes is evaluated for the axisymmetric LB method. Theoretical analysis indicates that the discrete lattice effects arising from the direct forcing scheme are part of the truncation error terms and thus would not affect the overall accuracy of the standard LB method with general force term (i.e., only the source terms in the momentum equation are considered), but lead to incorrect macroscopic equations for the axisymmetric LB models. On the other hand, the trapezium rule based scheme and the semi-implicit centered scheme both have the advantage of avoiding the discrete lattice effects and recovering the correct macroscopic equations. Numerical tests applied for validating the theoretical analysis show that both the numerical stability and the accuracy of the axisymmetric LB simulations are affected by the direct forcing scheme, which indicate that forcing schemes free of the discrete lattice effects are necessary for the axisymmetric LB method.
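The forcing-scheme question discussed above hinges on how a source term is injected into the LB update. A minimal sketch of the direct forcing scheme in a D1Q3 advection-diffusion LB model (a simpler setting than the paper's axisymmetric models, chosen for brevity) shows the scheme recovering the expected steady state of D u'' + S = 0 for a zero-mean sinusoidal source:

```python
import math

# D1Q3 lattice for the 1-D diffusion equation u_t = D u_xx + S(x)
N, TAU, S0 = 32, 1.0, 1e-4
D = (1.0 / 3.0) * (TAU - 0.5)             # lattice diffusivity, cs^2 = 1/3
w = [2.0 / 3.0, 1.0 / 6.0, 1.0 / 6.0]     # weights for c = 0, +1, -1
src = [S0 * math.sin(2.0 * math.pi * x / N) for x in range(N)]

f = [[0.0] * N for _ in range(3)]         # populations, start from u = 0

for _ in range(8000):
    rho = [f[0][x] + f[1][x] + f[2][x] for x in range(N)]
    # BGK collision with DIRECT forcing: simply add w_i * S * dt
    for i in range(3):
        for x in range(N):
            feq = w[i] * rho[x]
            f[i][x] += -(f[i][x] - feq) / TAU + w[i] * src[x]
    # streaming (periodic): c = +1 shifts right, c = -1 shifts left
    f[1] = [f[1][(x - 1) % N] for x in range(N)]
    f[2] = [f[2][(x + 1) % N] for x in range(N)]

rho = [f[0][x] + f[1][x] + f[2][x] for x in range(N)]
k = 2.0 * math.pi / N
exact_amp = S0 / (D * k * k)              # steady state of D u'' + S = 0
print(max(rho), exact_amp)
```

In this steady scalar case the direct scheme's discrete lattice effects sit in the truncation error, consistent with the paper's finding for the general force term; it is in the axisymmetric models, where the source terms couple to the solution and its gradients, that the trapezium-rule and semi-implicit centered schemes become necessary.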
The numerical dynamic for highly nonlinear partial differential equations
NASA Technical Reports Server (NTRS)
Lafon, A.; Yee, H. C.
1992-01-01
Problems associated with the numerical computation of highly nonlinear equations in computational fluid dynamics are set forth and analyzed in terms of the potential ranges of spurious behaviors. A reaction-convection equation with a nonlinear source term is employed to evaluate the effects related to spatial and temporal discretizations. The discretization of the source term is described according to several methods, and the various techniques are shown to have a significant effect on the stability of the spurious solutions. Traditional linearized stability analyses cannot provide the level of confidence required for accurate fluid dynamics computations, and the incorporation of nonlinear analysis is proposed. Nonlinear analysis based on nonlinear dynamical systems complements the conventional linear approach and is valuable in the analysis of hypersonic aerodynamics and combustion phenomena.
NASA Astrophysics Data System (ADS)
Conti, P.; Testi, D.; Grassi, W.
2017-11-01
This work reviews and compares suitable models for the thermal analysis of forced convection over a heat source in a porous medium. The set of available models refers to an infinite medium in which a fluid moves over three different heat source geometries: the moving infinite line source, the moving finite line source, and the moving infinite cylindrical source. In this perspective, the present work provides a concise and practical compendium of the above-mentioned models for forced external convection in porous media; in addition, we propose a dimensionless analysis to quantify the mutual deviations among the available models, helping the selection of the most suitable one in the specific case of interest. Under specific conditions, the advection term becomes ineffective in terms of heat transfer performance, allowing the use of purely conductive models. For that reason, available analytical and numerical solutions for purely conductive media are also reviewed and compared, again by dimensionless criteria. Therefore, one can choose the simplest solution, with significant benefits in terms of computational effort and interpretation of the results. The main outcomes presented in the paper are: the conditions under which the system can be considered subject to a Darcy flow, the minimal distance beyond which the finite dimension of the heat source does not affect the thermal field, and the critical fluid velocity needed to have a significant contribution of the advection term in the overall heat transfer process.
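The moving infinite line source mentioned above has a classical steady-state closed form, ΔT = q/(2πλ) · exp(ux/2a) · K0(ur/2a), with the medium moving along +x. A sketch, with K0 evaluated from its integral representation and illustrative (not paper-specific) parameter values:

```python
import math

def k0(z, tmax=20.0, n=4000):
    """Modified Bessel function K0 via its integral representation
    K0(z) = ∫_0^∞ exp(-z·cosh t) dt (trapezoid rule; adequate here)."""
    h = tmax / n
    total = 0.5 * (math.exp(-z) + math.exp(-z * math.cosh(tmax)))
    for i in range(1, n):
        total += math.exp(-z * math.cosh(i * h))
    return total * h

def mils_dT(q, x, y, u, lam, a):
    """Steady temperature rise of a moving infinite line source at (x, y):
    dT = q/(2*pi*lam) * exp(u*x/(2a)) * K0(u*r/(2a))."""
    r = math.hypot(x, y)
    return q / (2.0 * math.pi * lam) * math.exp(u * x / (2.0 * a)) * k0(u * r / (2.0 * a))

# illustrative numbers (not from the paper): groundwater-like conditions
q, lam, a, u = 50.0, 2.0, 1.0e-6, 1.0e-7   # W/m, W/(m·K), m²/s, m/s
t_down = mils_dT(q, 1.0, 0.0, u, lam, a)   # 1 m downstream
t_up = mils_dT(q, -1.0, 0.0, u, lam, a)    # 1 m upstream
print(t_down, t_up)
```

The downstream/upstream asymmetry produced by the exponential prefactor is exactly the advection contribution the paper's dimensionless criteria decide whether one may neglect.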
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.
1995-04-01
This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by the U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as a part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.
On the numerical treatment of nonlinear source terms in reaction-convection equations
NASA Technical Reports Server (NTRS)
Lafon, A.; Yee, H. C.
1992-01-01
The objectives of this paper are to investigate how various numerical treatments of the nonlinear source term in a model reaction-convection equation can affect the stability of steady-state numerical solutions and to show under what conditions the conventional linearized analysis breaks down. The underlying goal is to provide part of the basic building blocks toward the ultimate goal of constructing suitable numerical schemes for hypersonic reacting flows, combustions and certain turbulence models in compressible Navier-Stokes computations. It can be shown that nonlinear analysis uncovers much of the nonlinear phenomena which linearized analysis is not capable of predicting in a model reaction-convection equation.
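The breakdown of linearized analysis can be illustrated with the scalar model source S(u) = u(1 − u) under explicit Euler, a stand-in for the paper's reaction-convection setting rather than its actual scheme: below the linearized stability limit the iteration finds the true steady state u = 1, while beyond it a stable but spurious period-2 orbit appears, something the linearized analysis cannot predict.

```python
def step(u, dt):
    """Explicit Euler step for the model source term S(u) = u*(1 - u)."""
    return u + dt * u * (1.0 - u)

def run(dt, u0=0.5, n=1000):
    """Iterate the scheme and return the late-time value."""
    u = u0
    for _ in range(n):
        u = step(u, dt)
    return u

u_small = run(0.5)   # below the linearized limit dt < 2: converges to u = 1
u_large = run(2.2)   # beyond it: locks onto a spurious period-2 orbit
print(u_small, u_large)
```

The dt = 2.2 run neither diverges nor converges: it settles on a bounded oscillation that looks like a legitimate (but false) asymptotic state, which is precisely the kind of spurious numerics the nonlinear-dynamics viewpoint is meant to expose.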
Auditing the multiply-related concepts within the UMLS.
Mougin, Fleur; Grabar, Natalia
2014-10-01
This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms.
Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, D.; Brunett, A.; Passerini, S.
Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.
NASA Astrophysics Data System (ADS)
García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.
2007-10-01
A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site-dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
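The logic-tree combination step can be sketched as a weighted mean over branch hazard curves, then inverted for the 475-year return period. The branch curves and weights below are placeholders, not the zonings or ground-motion models used in the study:

```python
import math

def hazard_branch(pga, k, b):
    """Hypothetical annual frequency of exceedance for one branch,
    modeled as a simple power law (placeholder, not a real GMPE)."""
    return k * pga ** (-b)

# (weight, curve) pairs: each branch is one zoning/ground-motion combination
branches = [
    (0.3, lambda p: hazard_branch(p, 2e-4, 2.0)),
    (0.5, lambda p: hazard_branch(p, 3e-4, 2.2)),
    (0.2, lambda p: hazard_branch(p, 1e-4, 1.8)),
]

def mean_hazard(pga):
    """Weighted mean exceedance frequency over the logic-tree branches."""
    return sum(w * f(pga) for w, f in branches)

# PGA with a 475-year return period (10% exceedance in 50 years)
target = 1.0 / 475.0
lo, hi = 1e-3, 10.0
for _ in range(60):                      # geometric bisection on the curve
    mid = math.sqrt(lo * hi)
    if mean_hazard(mid) > target:
        lo = mid
    else:
        hi = mid
print(f"475-yr PGA ≈ {lo:.3f} g")
```

Repeating the inversion branch by branch, instead of on the mean curve, is what yields the coefficient-of-variation maps the paper reports alongside the hazard maps.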
2011-09-01
[Fragmentary record; only snippet text survives:] ...a NSS that lies in this negative-explosion, positive-CLVD quadrant due to the large degree of tectonic release in this event that reversed the phase... Mellman (1986) in their analysis of fundamental-mode Love and Rayleigh wave amplitude and phase for nuclear and tectonic release source terms, and... (1986). Estimating explosion and tectonic release source parameters of underground nuclear explosions from Rayleigh and Love wave observations, Air...
NASA Astrophysics Data System (ADS)
Bonhoff, H. A.; Petersson, B. A. T.
2010-08-01
For the characterization of structure-borne sound sources with multi-point or continuous interfaces, substantial simplifications and physical insight can be obtained by incorporating the concept of interface mobilities. The applicability of interface mobilities, however, relies upon the admissibility of neglecting the so-called cross-order terms. Hence, the objective of the present paper is to clarify the importance and significance of cross-order terms for the characterization of vibrational sources. From previous studies, four conditions have been identified for which the cross-order terms can become more influential. These include non-circular interface geometries, structures with distinctly differing transfer paths, suppression of the zero-order motion, and cases where the contact forces are either in phase or out of phase. In a theoretical study, these four conditions are investigated regarding the frequency range and magnitude of a possible strengthening of the cross-order terms. For an experimental analysis, two source-receiver installations are selected, suitably designed to obtain strong cross-order terms. The transmitted power and the source descriptors are predicted by the approximations of the interface mobility approach and compared with the complete calculations. Neglecting the cross-order terms can result in large misinterpretations at certain frequencies. On average, however, the cross-order terms are found to be insignificant and can be neglected with good approximation. The general applicability of interface mobilities for structure-borne sound source characterization and the description of the transmission process thereby is confirmed.
Analysis and Synthesis of Tonal Aircraft Noise Sources
NASA Technical Reports Server (NTRS)
Allen, Matthew P.; Rizzi, Stephen A.; Burdisso, Ricardo; Okcu, Selen
2012-01-01
Fixed and rotary wing aircraft operations can have a significant impact on communities in proximity to airports. Simulation of predicted aircraft flyover noise, paired with listening tests, is useful to noise reduction efforts since it allows direct annoyance evaluation of aircraft or operations currently in the design phase. This paper describes efforts to improve the realism of synthesized source noise by including short term fluctuations, specifically for inlet-radiated tones resulting from the fan stage of turbomachinery. It details analysis performed on an existing set of recorded turbofan data to isolate inlet-radiated tonal fan noise, then extract and model short term tonal fluctuations using the analytic signal. Methodologies for synthesizing time-variant tonal and broadband turbofan noise sources using measured fluctuations are also described. Finally, subjective listening test results are discussed which indicate that time-variant synthesized source noise is perceived to be very similar to recordings.
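Extracting short-term tonal fluctuations can be sketched by complex demodulation, a simple stand-in for the analytic-signal analysis described above: shift the tone to DC, low-pass it, and read the amplitude fluctuation off the complex envelope. All signal parameters below are invented, not taken from the turbofan data set:

```python
import cmath, math

FS, F0, N = 8000.0, 1000.0, 8000       # sample rate (Hz), tone (Hz), 1 s
# synthetic fan tone with slow amplitude and phase modulation (illustrative)
sig = [(1.0 + 0.1 * math.sin(2 * math.pi * 2.0 * n / FS))
       * math.cos(2 * math.pi * F0 * n / FS + 0.2 * math.sin(2 * math.pi * 3.0 * n / FS))
       for n in range(N)]

# complex demodulation: shift the tone to DC, then low-pass with a
# moving average (via prefix sums) to keep the slowly varying envelope
base = [s * cmath.exp(-2j * math.pi * F0 * n / FS) for n, s in enumerate(sig)]
pre = [0j]
for z in base:
    pre.append(pre[-1] + z)

M = 200                                 # half-window: 401 samples ≈ 50 ms
env = [2.0 * (pre[n + M + 1] - pre[n - M]) / (2 * M + 1)
       for n in range(M, N - M)]        # ×2 restores the carrier amplitude

amps = [abs(z) for z in env]            # short-term amplitude fluctuation
print(min(amps), max(amps))             # tracks the 1 ± 0.1 modulation
```

Resampling fluctuation traces like `amps` (and the corresponding phase of `env`) onto synthesized tones is the essence of the time-variant synthesis the paper evaluates in its listening tests.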
NASA Technical Reports Server (NTRS)
Cunefare, K. A.; Koopmann, G. H.
1991-01-01
This paper presents the theoretical development of an approach to active noise control (ANC) applicable to three-dimensional radiators. The active noise control technique, termed ANC Optimization Analysis, is based on minimizing the total radiated power by adding secondary acoustic sources on the primary noise source. ANC Optimization Analysis determines the optimum magnitude and phase at which to drive the secondary control sources in order to achieve the best possible reduction in the total radiated power from the noise source/control source combination. For example, ANC Optimization Analysis predicts a 20 dB reduction in the total power radiated from a sphere of radius a at a dimensionless wavenumber ka of 0.125, for a single control source representing 2.5 percent of the total area of the sphere. ANC Optimization Analysis is based on a boundary element formulation of the Helmholtz Integral Equation, and thus, the optimization analysis applies to a single frequency, while multiple frequencies can be treated through repeated analyses.
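The optimization idea can be sketched in closed form for the simplest analogue: one primary and one secondary monopole in a free field, where minimizing the total radiated power over the secondary source strength gives W/W_p = 1 − sinc²(kd) at the optimum q_s = −q_p·sinc(kd). This is not the paper's boundary-element formulation, but it reproduces the same low-frequency behavior (large reductions for small dimensionless wavenumber):

```python
import math

def sinc(x):
    """Unnormalized sinc, sin(x)/x."""
    return 1.0 if x == 0 else math.sin(x) / x

def optimal_reduction_db(kd):
    """Best achievable reduction in total radiated power for one secondary
    monopole at separation d from a primary monopole (free field):
    W/W_p = 1 - sinc(kd)**2 at the optimal strength q_s = -q_p*sinc(kd)."""
    ratio = 1.0 - sinc(kd) ** 2
    return -10.0 * math.log10(ratio)

for kd in (0.125, 0.5, 1.0, 3.14):
    print(kd, round(optimal_reduction_db(kd), 1), "dB")
```

At kd = 0.125 the closed form gives roughly 23 dB, the same order as the 20 dB the paper reports for a sphere at ka = 0.125, and the reduction collapses as kd grows, which is why such ANC schemes are inherently low-frequency tools.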
Performance evaluation of WAVEWATCH III model in the Persian Gulf using different wind resources
NASA Astrophysics Data System (ADS)
Kazeminezhad, Mohammad Hossein; Siadatmousavi, Seyed Mostafa
2017-07-01
The third-generation wave model, WAVEWATCH III, was employed to simulate bulk wave parameters in the Persian Gulf using three different wind sources: ERA-Interim, CCMP, and GFS-Analysis. Different formulations for whitecapping term and the energy transfer from wind to wave were used, namely the Tolman and Chalikov (J Phys Oceanogr 26:497-518, 1996), WAM cycle 4 (BJA and WAM4), and Ardhuin et al. (J Phys Oceanogr 40(9):1917-1941, 2010) (TEST405 and TEST451 parameterizations) source term packages. The obtained results from numerical simulations were compared to altimeter-derived significant wave heights and measured wave parameters at two stations in the northern part of the Persian Gulf through statistical indicators and the Taylor diagram. Comparison of the bulk wave parameters with measured values showed underestimation of wave height using all wind sources. However, the performance of the model was best when GFS-Analysis wind data were used. In general, when wind veering from southeast to northwest occurred, and wind speed was high during the rotation, the model underestimation of wave height was severe. Except for the Tolman and Chalikov (J Phys Oceanogr 26:497-518, 1996) source term package, which severely underestimated the bulk wave parameters during stormy condition, the performances of other formulations were practically similar. However, in terms of statistics, the Ardhuin et al. (J Phys Oceanogr 40(9):1917-1941, 2010) source terms with TEST405 parameterization were the most successful formulation in the Persian Gulf when compared to in situ and altimeter-derived observations.
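The statistical comparison described above typically reduces to a handful of indicators per simulation. A sketch of the quantities behind a Taylor-diagram comparison, computed on invented toy series rather than the Persian Gulf data:

```python
import math

def skill_stats(model, obs):
    """Bias, RMSE, Pearson correlation, and the normalized standard
    deviation plotted on a Taylor diagram."""
    n = len(obs)
    mb = sum(model) / n
    ob = sum(obs) / n
    bias = mb - ob
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    sm = math.sqrt(sum((m - mb) ** 2 for m in model) / n)
    so = math.sqrt(sum((o - ob) ** 2 for o in obs) / n)
    corr = sum((m - mb) * (o - ob) for m, o in zip(model, obs)) / (n * sm * so)
    return {"bias": bias, "rmse": rmse, "corr": corr, "sigma_norm": sm / so}

# toy series: a model that slightly underestimates wave height, as the
# abstract reports for all three wind sources
obs = [0.5, 1.0, 1.8, 2.5, 1.2, 0.8]
mod = [0.4, 0.9, 1.5, 2.2, 1.1, 0.7]
print(skill_stats(mod, obs))
```

A negative bias with high correlation and sigma_norm below one is exactly the "underestimation of wave height" signature the abstract attributes to the simulations, visible at a glance on a Taylor diagram.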
Making the right long-term prescription for medical equipment financing.
Conbeer, George P
2007-06-01
For hospital financial executives charged with assessing new technologies, obtaining access to sufficient information to support an in-depth analysis can be a daunting challenge. The information should come not only from direct sources, such as the equipment manufacturer, but also from indirect sources, such as leasing companies. A thorough knowledge of financing methods--including tax-exempt bonds, bank debt, standard leasing, tax-exempt leasing, and equipment rental terms--is critical.
Distributed and Collaborative Software Analysis
NASA Astrophysics Data System (ADS)
Ghezzi, Giacomo; Gall, Harald C.
Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of
Two-micron Laser Atmospheric Wind Sounder (LAWS) pointing/tracking study
NASA Technical Reports Server (NTRS)
Manlief, Scott
1995-01-01
The objective of the study was to identify and model major sources of short-term pointing jitter for a free-flying, full performance 2 micron LAWS system and evaluate the impact of the short-term jitter on wind-measurement performance. A fast steering mirror controls system was designed for the short-term jitter compensation. The performance analysis showed that the short-term jitter performance of the controls system over the 5.2 msec round-trip time for a realistic spacecraft environment was approximately 0.3 micro rad, rms, within the specified value of less than 0.5 micro rad, rms, derived in a 2 micron LAWS System Study. Disturbance modes were defined for: (1) the Bearing and Power Transfer Assembly (BAPTA) scan bearing, (2) the spacecraft reaction wheel torques, and (3) the solar array drive torques. The scan bearing disturbance was found to be the greatest contributing noise source to the jitter performance. Disturbances from the fast steering mirror reaction torques and a boom-mounted cross-link antenna clocking were also considered but were judged to be small compared to the three principal disturbance sources above and were not included in the final controls analysis.
Source-term development for a contaminant plume for use by multimedia risk assessment models
NASA Astrophysics Data System (ADS)
Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.
2000-02-01
Multimedia modelers from the US Environmental Protection Agency (EPA) and US Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: MEPAS, MMSOILS, PRESTO, and RESRAD. These models represent typical analytically based tools that are used in human-risk and endangerment assessments at installations containing radioactive and hazardous contaminants. The objective is to demonstrate an approach for developing an adequate source term by simplifying an existing, real-world, 90Sr plume at DOE's Hanford installation in Richland, WA, for use in a multimedia benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. Source characteristics and a release mechanism are developed and described; also described is a typical process and procedure that an analyst would follow in developing a source term for using this class of analytical tool in a preliminary assessment.
Funding analysis for long-term planning : final report
DOT National Transportation Integrated Search
2003-07-01
In existence since 1956, the Highway Trust Fund (HTF) is the source of nearly all Federal highway funding and roughly four-fifths of all Federal transit funding. The Highway Trust Fund is integral to the long-term transportation planning of all 50 St...
Upper and lower bounds of ground-motion variabilities: implication for source properties
NASA Astrophysics Data System (ADS)
Cotton, Fabrice; Reddy-Kotha, Sreeram; Bora, Sanjay; Bindi, Dino
2017-04-01
One of the key challenges of seismology is to analyse the physical factors that control earthquake and ground-motion variabilities. Such analysis is particularly important for calibrating physics-based simulations and seismic hazard estimations at high frequencies. Within the framework of ground-motion prediction equation (GMPE) development, ground-motion residuals (differences between recorded ground motions and the values predicted by a GMPE) are computed. The exponential growth of seismological near-source records and modern GMPE analysis techniques allow these residuals to be partitioned into between-event and within-event components. In particular, the between-event term quantifies all those repeatable source effects (e.g. related to stress-drop or kappa-source variability) which have not been accounted for by the magnitude-dependent term of the model. In this presentation, we first discuss the between-event variabilities computed in both the Fourier and response spectra domains, using recent high-quality global accelerometric datasets (e.g. NGA-West2, RESORCE, KiK-net). These analyses lead to the assessment of upper bounds for ground-motion variability. We then compare these upper bounds with lower bounds estimated by analysing seismic sequences that occurred on specific fault systems (e.g., located in central Italy or in Japan). We show that the lower bounds of between-event variabilities are surprisingly large, which indicates a large variability of earthquake dynamic properties even within the same fault system. Finally, these upper and lower bounds of ground-shaking variability are discussed in terms of the variability of earthquake physical properties (e.g., stress-drop and kappa-source).
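The residual partition described above can be sketched in a few lines. In GMPE practice the split is done with mixed-effects regression; the simple per-event mean used here is an illustrative approximation, and the event labels and values are invented:

```python
import numpy as np
from collections import defaultdict

def partition_residuals(event_ids, residuals):
    """Split total ground-motion residuals into a between-event term
    (one value per earthquake) and within-event terms (one per record)."""
    groups = defaultdict(list)
    for eid, r in zip(event_ids, residuals):
        groups[eid].append(r)
    # Between-event residual: the repeatable (source-related) part,
    # approximated here by the event mean
    between = {eid: float(np.mean(rs)) for eid, rs in groups.items()}
    # Within-event residual: record-to-record (path/site) scatter
    within = [r - between[eid] for eid, r in zip(event_ids, residuals)]
    return between, within

events = ["eq1", "eq1", "eq2", "eq2", "eq2"]
res = [0.3, 0.5, -0.2, -0.4, -0.3]
between, within = partition_residuals(events, res)
print(between["eq1"])  # 0.4 -> repeatable source effect for eq1
```

The spread of the `between` values across many earthquakes is what the abstract refers to as between-event variability; its upper and lower bounds constrain how much source properties such as stress-drop can vary.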
SUS Source Level Error Analysis
1978-01-20
The report provides an analysis of the major terms that contribute to signal analysis error in a proposed experiment to calibrate source levels of SUS (Signal Underwater Sound). Keywords: Fast Fourier Transform (FFT); SUS signal model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohd, Shukri; Holford, Karen M.; Pullin, Rhys
2014-02-12
Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the DeltaT location method but requires no initial acoustic calibration of the structure.
Possible Dual Earthquake-Landslide Source of the 13 November 2016 Kaikoura, New Zealand Tsunami
NASA Astrophysics Data System (ADS)
Heidarzadeh, Mohammad; Satake, Kenji
2017-10-01
A complicated earthquake (Mw 7.8) in terms of rupture mechanism occurred off the NE coast of South Island, New Zealand, on 13 November 2016 (UTC) in a complex tectonic setting comprising a transitional strike-slip zone between two subduction zones. The earthquake generated a moderate tsunami with a zero-to-crest amplitude of 257 cm at the near-field tide gauge station of Kaikoura. Spectral analysis of the tsunami observations showed dual peaks at 3.6-5.7 and 5.7-56 min, which we attribute to the potential landslide and earthquake sources of the tsunami, respectively. Tsunami simulations showed that a source model with slip on an offshore plate-interface fault reproduces the near-field tsunami observation in terms of amplitude, but fails in terms of tsunami period. On the other hand, a source model without offshore slip fails to reproduce the first peak, but the later phases are reproduced well in terms of both amplitude and period. It can be inferred that an offshore source must be involved, but it needs to be smaller in size than the plate-interface slip, which most likely points to a confined submarine landslide source, consistent with the dual-peak tsunami spectrum. We estimated the dimension of the potential submarine landslide at 8-10 km.
LONG TERM HYDROLOGICAL IMPACT ASSESSMENT (LTHIA)
LTHIA is a universal Urban Sprawl analysis tool that is available to all at no charge through the Internet. It estimates impacts on runoff, recharge and nonpoint source pollution resulting from past or proposed land use changes. It gives long-term average annual runoff for a lan...
McDonald, Brian C; Goldstein, Allen H; Harley, Robert A
2015-04-21
A fuel-based approach is used to assess long-term trends (1970-2010) in mobile source emissions of black carbon (BC) and organic aerosol (OA, including both primary emissions and secondary formation). The main focus of this analysis is the Los Angeles Basin, where a long record of measurements is available to infer trends in ambient concentrations of BC and organic carbon (OC), with OC used here as a proxy for OA. Mobile source emissions and ambient concentrations have decreased similarly, reflecting the importance of on- and off-road engines as sources of BC and OA in urban areas. In 1970, the on-road sector accounted for ∼90% of total mobile source emissions of BC and OA (primary + secondary). Over time, as on-road engine emissions have been controlled, the relative importance of off-road sources has grown. By 2010, off-road engines were estimated to account for 37 ± 20% and 45 ± 16% of total mobile source contributions to BC and OA, respectively, in the Los Angeles area. This study highlights both the success of efforts to control on-road emission sources, and the importance of considering off-road engine and other VOC source contributions when assessing long-term emission and ambient air quality trends.
ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.
2016-04-01
Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
Geocoronal hydrogen studies using Fabry Perot interferometers, part 2: Long-term observations
NASA Astrophysics Data System (ADS)
Nossal, S. M.; Mierkiewicz, E. J.; Roesler, F. L.; Reynolds, R. J.; Haffner, L. M.
2006-09-01
Long-term data sets are required to investigate sources of natural variability in the upper atmosphere. Understanding the influence of sources of natural variability such as the solar cycle is needed to characterize the thermosphere + exosphere, to understand coupling processes between atmospheric regions, and to isolate signatures of natural variability from those due to human-caused change. Multi-year comparisons of thermospheric + exospheric Balmer α emissions require cross-calibrated and well-understood instrumentation, a stable calibration source, reproducible observing conditions, separation of the terrestrial from the Galactic emission line, and consistent data analysis accounting for differences in viewing geometry. We discuss how we address these criteria in the acquisition and analysis of a mid-latitude geocoronal Balmer α column emission data set now spanning two solar cycles and taken mainly from Wisconsin and Kitt Peak, Arizona. We also discuss results and outstanding challenges for increasing the accuracy and use of these observations.
NASA Astrophysics Data System (ADS)
Hu, Minpeng; Liu, Yanmei; Wang, Jiahui; Dahlgren, Randy A.; Chen, Dingjiang
2018-06-01
Source apportionment is critical for guiding development of efficient watershed nitrogen (N) pollution control measures. The ReNuMa (Regional Nutrient Management) model, a semi-empirical, semi-process-oriented model with modest data requirements, has been widely used for riverine N source apportionment. However, the ReNuMa model has limitations for addressing long-term N dynamics because it ignores temporal changes in atmospheric N deposition rates and N-leaching lag effects. This work modified the ReNuMa model by revising the source code to allow yearly changes in atmospheric N deposition and to incorporate N-leaching lag effects into N transport processes. The appropriate N-leaching lag time was determined from cross-correlation analysis between annual watershed individual N source inputs and riverine N export. Accuracy of the modified ReNuMa model was demonstrated through analysis of a 31-year water quality record (1980-2010) from the Yongan watershed in eastern China. The revisions considerably improved the accuracy (Nash-Sutcliffe coefficient increased by ∼0.2) of the modified ReNuMa model for predicting riverine N loads. The modified model explicitly identified annual and seasonal changes in the contributions of various N sources (i.e., point vs. nonpoint source, surface runoff vs. groundwater) to riverine N loads as well as the fate of watershed anthropogenic N inputs. Model results were consistent with previously modeled or observed lag times as well as changes in riverine chloride and nitrate concentrations during the low-flow regime and available N levels in agricultural soils of this watershed. The modified ReNuMa model is applicable for addressing long-term changes in riverine N sources, providing decision-makers with critical information for guiding watershed N pollution control strategies.
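The lag-selection step above — picking the N-leaching lag from a cross-correlation between annual inputs and riverine export — can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the paper's procedure or data:

```python
import numpy as np

def best_lag(inputs, exports, max_lag=10):
    """Pick the lag (in years) that maximizes the correlation between
    watershed N inputs and later riverine N export."""
    best, best_r = 0, -np.inf
    for lag in range(max_lag + 1):
        x = np.asarray(inputs[: len(inputs) - lag], float)
        y = np.asarray(exports[lag:], float)
        if len(x) < 3:          # too few overlapping years
            break
        r = np.corrcoef(x, y)[0, 1]
        if r > best_r:
            best, best_r = lag, r
    return best, best_r

# Synthetic 30-year series where export echoes input 3 years later
rng = np.random.default_rng(0)
inp = rng.normal(100, 10, 30)
exp_ = np.roll(inp, 3) * 0.2 + rng.normal(0, 0.5, 30)
lag, r = best_lag(inp, exp_, max_lag=8)
print(lag)  # 3
```

The lag with the strongest correlation is then used to shift the source-input series before apportioning riverine loads among sources.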
Analysis of jet-airfoil interaction noise sources by using a microphone array technique
NASA Astrophysics Data System (ADS)
Fleury, Vincent; Davy, Renaud
2016-03-01
The paper is concerned with the characterization of jet noise sources and jet-airfoil interaction sources using microphone array data. The measurements were carried out in Cepra19, the anechoic open test section wind tunnel of Onera. The microphone array technique relies on the convected Lighthill and Ffowcs-Williams and Hawkings acoustic analogy equation. The cross-spectrum of the source term of the analogy equation is sought; it is defined as the optimal solution to a minimal error equation using the measured microphone cross-spectra as reference. This inverse problem is ill-posed, however, so a penalty term based on a localization operator is added to improve the recovery of jet noise sources. The analysis of isolated jet noise data in the subsonic regime shows the contribution of the conventional mixing noise source in the low frequency range, as expected, and of uniformly distributed, uncorrelated noise sources in the jet flow at higher frequencies. In the underexpanded supersonic regime, a shock-associated noise source is clearly identified as well. An additional source is detected in the vicinity of the nozzle exit in both supersonic and subsonic regimes. In the presence of the airfoil, the distribution of the noise sources is deeply modified. In particular, a strong noise source is localized on the flap. For Strouhal numbers higher than about 2 (based on the jet mixing velocity and diameter), a significant contribution from the shear layer near the flap is also observed. Indications of acoustic reflections on the airfoil are also discerned.
Anthropogenic emissions from a variety of sectors including mobile sources have decreased substantially over the past decades despite continued growth in population and economic activity. In this study, we analyze 1990-2010 trends in emission inventories, ambient observations and...
40 CFR 406.11 - Specialized definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... STANDARDS GRAIN MILLS POINT SOURCE CATEGORY Corn Wet Milling Subcategory § 406.11 Specialized definitions... and methods of analysis set forth in 40 CFR part 401 shall apply to this subpart. (b) The term corn shall mean the shelled corn delivered to a plant before processing. (c) The term standard bushel shall...
EXPERIENCES FROM THE SOURCE-TERM ANALYSIS OF A LOW AND INTERMEDIATE LEVEL RADWASTE DISPOSAL FACILITY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park,Jin Beak; Park, Joo-Wan; Lee, Eun-Young
2003-02-27
Enhancement of the computer code SAGE for evaluation of the Korean concept for a LILW waste disposal facility is discussed. Several features of source term analysis are embedded in SAGE to analyze: (1) effects of the degradation mode of an engineered barrier, (2) effects of dispersion phenomena in the unsaturated zone, and (3) effects of a time-dependent sorption coefficient in the unsaturated zone. IAEA's Vault Safety Case (VSC) approach is used to demonstrate the ability of this assessment code. Results of MASCOT are used for comparison purposes. These enhancements of the safety assessment code SAGE can contribute to realistic evaluation of the Korean concept of the LILW disposal project in the near future.
Sources and Nature of Cost Analysis Data Base Reference Manual.
1983-07-01
Interim report (update) on the sources and nature of the cost analysis data base; reference manual, report number USAAVRADCOM TM 83-F-3. Contents include a section on data for multiple applications, a glossary of cost analysis terms, references, and a bibliography.
NASA Astrophysics Data System (ADS)
Murillo, J.; García-Navarro, P.
2012-02-01
In this work, the source term discretization in hyperbolic conservation laws with source terms is considered using an approximate augmented Riemann solver. The technique is applied to the shallow water equations with bed slope and friction terms, with the focus on the friction discretization. The augmented Roe approximate Riemann solver provides a family of weak solutions for the shallow water equations that are the basis of the upwind treatment of the source term. This has proved successful in explaining and avoiding the appearance of instabilities and negative values of the thickness of the water layer in cases of variable bottom topography. Here, this strategy is extended to capture the peculiarities that may arise when defining more ambitious scenarios that include relevant stresses, as in cases of mud/debris flow. The conclusions of this analysis lead to the definition of an accurate and robust first order finite volume scheme, able to handle transient problems correctly while considering frictional stresses in both clean water and debris flow, including in this last case a correct modelling of stopping conditions.
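For intuition, a common point-wise treatment of the friction source term is sketched below. This is not the paper's augmented-Riemann formulation: it is a simple semi-implicit update of a quadratic friction law, shown only because it exhibits the key property the abstract discusses (friction may slow the flow to rest but never reverse it, which matters for stopping conditions). The friction coefficient and values are invented:

```python
def apply_friction(h, hu, dt, cf=0.01):
    """Semi-implicit update of a quadratic friction source term
    S_f ~ -cf * u * |u| in the shallow water momentum equation.
    The implicit form u_new = u / (1 + dt*cf*|u|/h) only ever
    reduces |u|, so the flow can stop but never change sign."""
    if h <= 0.0:           # dry cell: no momentum
        return 0.0
    u = hu / h
    u_new = u / (1.0 + dt * cf * abs(u) / h)
    return h * u_new

# Hypothetical cell: depth 1 m, unit discharge 2 m^2/s, large time step
hu = apply_friction(h=1.0, hu=2.0, dt=0.5, cf=0.5)
print(hu)  # reduced magnitude, same sign
```

An explicit update of the same term with a large `dt` could overshoot through zero and reverse the velocity, which is exactly the kind of non-physical behaviour that motivates the careful source-term discretization in the paper.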
Engström, Emma; Balfors, Berit; Mörtberg, Ulla; Thunvik, Roger; Gaily, Tarig; Mangold, Mikael
2015-05-15
In low-income regions, drinking water is often derived from groundwater sources, which might spread diarrheal disease if they are microbiologically polluted. This study aimed to investigate the occurrence of fecal contamination in 147 improved groundwater sources in Juba, South Sudan and to assess potential contributing risk factors, based on bivariate statistical analysis. Thermotolerant coliforms (TTCs) were detected in 66% of the investigated sources, including 95 boreholes, breaching the health-based recommendations for drinking water. A significant association (p<0.05) was determined between the presence of TTCs and the depth of cumulative, long-term prior precipitation (both within the previous five days and within the past month). No such link was found to short-term rainfall, the presence of latrines or damages in the borehole apron. However, the risk factor analysis further suggested, to a lesser degree, that the local topography and on-site hygiene were additionally significant. In summary, the analysis indicated that an important contamination mechanism was fecal pollution of the contributing groundwater, which was unlikely due to the presence of latrines; instead, infiltration from contaminated surface water was more probable. The reduction in fecal sources in the environment in Juba is thus recommended, for example, through constructing latrines or designating protection areas near water sources. The study results contribute to the understanding of microbiological contamination of groundwater sources in areas with low incomes and high population densities, tropical climates and weathered basement complex environments, which are common in urban sub-Saharan Africa. Copyright © 2015 Elsevier B.V. All rights reserved.
Assessment of macroseismic intensity in the Nile basin, Egypt
NASA Astrophysics Data System (ADS)
Fergany, Elsayed
2018-01-01
This work assesses deterministic seismic hazard and risk in terms of the maximum expected intensity map of the Egyptian Nile basin sector. A seismic source zone model of Egypt was delineated based on a compatible earthquake catalog updated in 2015, focal mechanisms, and the common tectonic elements. Four effective seismic source zones were identified along the Nile basin. The observed macroseismic intensity data along the basin were used to develop an intensity prediction equation defined in terms of moment magnitude. A maximum expected intensity map was then produced based on the developed intensity prediction equation, the identified effective seismic source zones, and the maximum expected magnitude for each zone along the basin. The earthquake hazard and risk were discussed and analyzed in view of the maximum expected moment magnitude and the maximum expected intensity values for each effective source zone. Moderate expected magnitudes are likely to pose a high risk to the Cairo and Aswan regions. The results of this study can serve as recommendations for planners charged with mitigating seismic risk in these strategic zones of Egypt.
Effect of source location and listener location on ILD cues in a reverberant room
NASA Astrophysics Data System (ADS)
Ihlefeld, Antje; Shinn-Cunningham, Barbara G.
2004-05-01
Short-term interaural level differences (ILDs) were analyzed for simulations of the signals that would reach a listener in a reverberant room. White noise was convolved with manikin head-related impulse responses measured in a classroom to simulate different locations of the source relative to the manikin and different manikin positions in the room. The ILDs of the signals were computed within each third-octave band over a relatively short time window to investigate how reliably ILD cues encode source laterality. Overall, the mean of the ILD magnitude increases with lateral angle and decreases with distance, as expected. Increasing reverberation decreases the mean ILD magnitude and increases the variance of the short-term ILD, so that the spatial information carried by ILD cues is degraded by reverberation. These results suggest that the mean ILD is not a reliable cue for determining source laterality in a reverberant room. However, by taking into account both the mean and variance, the distribution of high-frequency short-term ILDs provides some spatial information. This analysis suggests that, in order to use ILDs to judge source direction in reverberant space, listeners must accumulate information about how the short-term ILD varies over time. [Work supported by NIDCD and AFOSR.]
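The short-term ILD computation described above can be sketched as follows. This is illustrative only: the study filtered the signals into third-octave bands before windowing, which is omitted here, and the window length and signals are invented:

```python
import numpy as np

def short_term_ild(left, right, fs, win_ms=20.0):
    """Short-term interaural level difference: 10*log10 of the
    left/right power ratio in consecutive non-overlapping windows."""
    n = int(fs * win_ms / 1000.0)
    ilds = []
    for i in range(0, min(len(left), len(right)) - n + 1, n):
        pl = np.mean(left[i:i + n] ** 2) + 1e-12   # epsilon avoids log(0)
        pr = np.mean(right[i:i + n] ** 2) + 1e-12
        ilds.append(10.0 * np.log10(pl / pr))
    return np.array(ilds)

# Anechoic-like case: right-ear signal is a constant attenuated copy,
# so every window gives the same ILD (zero variance)
rng = np.random.default_rng(1)
sig = rng.normal(size=8000)
ilds = short_term_ild(sig, 0.5 * sig, fs=8000)
print(round(float(ilds.mean()), 2))  # 6.02
```

In the reverberant case studied in the abstract, the per-window ILDs would instead scatter around the mean; that variance is the degradation the analysis quantifies.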
Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan
2016-07-01
Source water areas face many potential water pollution risks, and risk assessment is an effective method to evaluate them. In this paper an integrated model based on k-means clustering analysis and set pair analysis was established for evaluating the risks associated with water pollution in source water areas, with the weights of indicators determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, which contains China's key source water area, the Danjiangkou Reservoir, the water source of the middle route of the South-to-North Water Diversion Project. The results identified eleven sources with relatively high risk values. At the regional scale, Shiyan City and Danjiangkou City would have high risk values in terms of industrial discharge; comparatively, Danjiangkou City and Yunxian County would have high risk values in terms of agricultural pollution. Overall, the risk values of the northern areas of the region of Shiyan, close to the main stream and reservoir, were higher than those in the south. The risk levels indicated that five sources were at a lower risk level (level II), two at a moderate risk level (level III), one at a higher risk level (level IV), and three at the highest risk level (level V). Risks from industrial discharge are higher than those from the agricultural sector. It is thus essential to manage the pillar industries of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
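The entropy weight method mentioned above is a standard objective-weighting technique; a minimal sketch is given below. The data matrix is invented, and the sketch assumes indicator values are already normalized so that larger means riskier (real applications normalize benefit- and cost-type indicators differently):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: indicators whose values vary more across
    the assessed objects carry more information and get larger weights.
    X: (n_objects, m_indicators), positive values."""
    X = np.asarray(X, dtype=float)
    n, m = X.shape
    P = X / X.sum(axis=0)                           # column-wise proportions
    P = np.where(P == 0, 1e-12, P)                  # avoid log(0)
    E = -np.sum(P * np.log(P), axis=0) / np.log(n)  # entropy per indicator
    d = 1.0 - E                                     # degree of divergence
    return d / d.sum()                              # weights sum to 1

# Hypothetical 3 sources x 2 indicators: the first indicator is identical
# for all sources (no information), the second varies strongly
X = [[0.2, 10.0],
     [0.2, 50.0],
     [0.2, 90.0]]
w = entropy_weights(X)
print(w)  # nearly all weight goes to the second indicator
```

These weights would then feed the set pair analysis scoring of each pollution source against the risk-level grades (II-V) reported in the abstract.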
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.
Repeat immigration: A previously unobserved source of heterogeneity?
Aradhya, Siddartha; Scott, Kirk; Smith, Christopher D
2017-07-01
Register data allow for nuanced analyses of heterogeneities between sub-groups which are not observable in other data sources. One heterogeneity for which register data is particularly useful is in identifying unique migration histories of immigrant populations, a group of interest across disciplines. Years since migration is a commonly used measure of integration in studies seeking to understand the outcomes of immigrants. This study constructs detailed migration histories to test whether misclassified migrations may mask important heterogeneities. In doing so, we identify a previously understudied group of migrants called repeat immigrants, and show that they differ systematically from permanent immigrants. In addition, we quantify the degree to which migration information is misreported in the registers. The analysis is carried out in two steps. First, we estimate income trajectories for repeat immigrants and permanent immigrants to understand the degree to which they differ. Second, we test data validity by cross-referencing migration information with changes in income to determine whether there are inconsistencies indicating misreporting. From the first part of the analysis, the results indicate that repeat immigrants systematically differ from permanent immigrants in terms of income trajectories. Furthermore, income trajectories differ based on the way in which years since migration is calculated. The second part of the analysis suggests that misreported migration events, while present, are negligible. Repeat immigrants differ in terms of income trajectories, and may differ in terms of other outcomes as well. Furthermore, this study underlines that Swedish registers provide a reliable data source to analyze groups which are unidentifiable in other data sources.
Bremsstrahlung Dose Yield for High-Intensity Short-Pulse Laser–Solid Experiments
Liang, Taiee; Bauer, Johannes M.; Liu, James C.; ...
2016-12-01
A bremsstrahlung source term has been developed by the Radiation Protection (RP) group at SLAC National Accelerator Laboratory for high-intensity short-pulse laser–solid experiments between 10^17 and 10^22 W cm^-2. This source term couples the particle-in-cell plasma code EPOCH and the radiation transport code FLUKA to estimate the bremsstrahlung dose yield from laser–solid interactions. EPOCH characterizes the energy distribution, angular distribution, and laser-to-electron conversion efficiency of the hot electrons from laser–solid interactions, and FLUKA uses this hot electron source term to calculate a bremsstrahlung dose yield (mSv per J of laser energy on target). The goal of this paper is to provide RP guidelines and hazard analysis for high-intensity laser facilities. A comparison of the calculated bremsstrahlung dose yields with radiation measurement data is also made.
Development of surrogate models for the prediction of the flow around an aircraft propeller
NASA Astrophysics Data System (ADS)
Salpigidou, Christina; Misirlis, Dimitris; Vlahostergios, Zinon; Yakinthos, Kyros
2018-05-01
In the present work, the derivation of two surrogate models (SMs) for modelling the flow around a propeller for small aircraft is presented. Both methodologies use derived functions based on computations with the detailed propeller geometry. The computations were performed using the k-ω shear stress transport model for turbulence. In the SMs, the propeller was modelled in a computational domain of disk-like geometry, where source terms were introduced in the momentum equations. In the first SM, the source terms were polynomial functions of swirl and thrust, mainly related to the propeller radius. In the second SM, regression analysis was used to correlate the source terms with the velocity distribution through the propeller. The proposed SMs achieved faster convergence than the detailed model, while also providing results closer to the available operational data. The regression-based model was the most accurate and required less computational time for convergence.
NASA Technical Reports Server (NTRS)
Jung, Y. K.; Udalski, A.; Yee, J. C.; Sumi, T.; Gould, A.; Han, C.; Albrow, M. D.; Lee, C.-U.; Bennett, D. P.; Suzuki, D.
2017-01-01
In the process of analyzing an observed light curve, one often confronts various scenarios that can mimic planetary signals, causing difficulties in the accurate interpretation of the lens system. In this paper, we present the analysis of the microlensing event OGLE-2016-BLG-0733. The light curve of the event shows a long-term asymmetric perturbation that would appear to be due to a planet. From detailed modeling of the lensing light curve, however, we find that the perturbation originates from the binarity of the source rather than the lens. This result demonstrates that binary sources with roughly equal-luminosity components can mimic long-term perturbations induced by planets with projected separations near the Einstein ring. The result also demonstrates the importance of considering various interpretations of planet-like perturbations and of high-cadence observations for ensuring the unambiguous detection of planets.
Atmospheric Tracer Inverse Modeling Using Markov Chain Monte Carlo (MCMC)
NASA Astrophysics Data System (ADS)
Kasibhatla, P.
2004-12-01
In recent years, there has been an increasing emphasis on the use of Bayesian statistical estimation techniques to characterize the temporal and spatial variability of atmospheric trace gas sources and sinks. The applications have been varied in terms of the particular species of interest, as well as in terms of the spatial and temporal resolution of the estimated fluxes. However, one common characteristic has been the use of relatively simple statistical models for describing the measurement and chemical transport model error statistics and prior source statistics. For example, multivariate normal probability distribution functions (pdfs) are commonly used to model these quantities, and inverse source estimates are derived for fixed values of pdf parameters. While the advantage of this approach is that closed-form analytical solutions for the a posteriori pdfs of interest are available, it is worth exploring Bayesian analysis approaches which allow for a more general treatment of error and prior source statistics. Here, we present an application of the Markov Chain Monte Carlo (MCMC) methodology to an atmospheric tracer inversion problem to demonstrate how more general statistical models for errors can be incorporated into the analysis in a relatively straightforward manner. The MCMC approach to Bayesian analysis, which has found wide application in a variety of fields, is a statistical simulation approach that involves computing moments of interest of the a posteriori pdf by efficiently sampling this pdf. The specific inverse problem that we focus on is the annual mean CO2 source/sink estimation problem considered by the TransCom3 project. TransCom3 was a collaborative effort involving various modeling groups and followed a common modeling and analysis protocol. As such, this problem provides a convenient case study to demonstrate the applicability of the MCMC methodology to atmospheric tracer source/sink estimation problems.
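As a concrete sketch of the approach the abstract describes, the following Python example applies a Metropolis–Hastings sampler to a toy linear tracer inversion with a non-Gaussian (Laplace) prior — exactly the kind of statistical model for which no closed-form a posteriori pdf exists. The matrix `H`, the noise level, and the prior scale are invented for illustration and are not the TransCom3 configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear tracer inversion: observations y = H @ s + noise, where s holds
# two unknown surface fluxes. H, the noise level, and the prior scale are
# illustrative assumptions, not the TransCom3 setup.
H = rng.normal(size=(50, 2))
s_true = np.array([1.0, -0.5])
y = H @ s_true + 0.1 * rng.normal(size=50)

def log_post(s):
    # Gaussian measurement error plus a Laplace prior on the fluxes -- a
    # non-Gaussian choice with no closed-form posterior.
    resid = y - H @ s
    return -0.5 * np.sum(resid**2) / 0.1**2 - np.sum(np.abs(s))

def metropolis(n_steps, step=0.05):
    s = np.linalg.lstsq(H, y, rcond=None)[0]  # start near the data fit
    lp = log_post(s)
    chain = np.empty((n_steps, 2))
    for i in range(n_steps):
        prop = s + step * rng.normal(size=2)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
            s, lp = prop, lp_prop
        chain[i] = s
    return chain

chain = metropolis(5000)
s_hat = chain[1000:].mean(axis=0)  # posterior mean flux after burn-in
```

Moments of the a posteriori pdf (here the posterior mean) are read directly off the retained samples, which is the essential advantage MCMC offers over fixed-parameter Gaussian inversions.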
Effect of Americium-241 Content on Plutonium Radiation Source Terms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rainisch, R.
1998-12-28
The management of excess plutonium by the US Department of Energy includes a number of storage and disposition alternatives. Savannah River Site (SRS) is supporting DOE with plutonium disposition efforts, including the immobilization of certain plutonium materials in a borosilicate glass matrix. Surplus plutonium inventories slated for vitrification include materials with elevated levels of Americium-241. The Am-241 content of plutonium materials generally reflects in-growth of the isotope due to decay of plutonium and is age-dependent. However, select plutonium inventories have Am-241 levels considerably above the age-based levels. Elevated levels of americium significantly impact radiation source terms of plutonium materials and will make handling of the materials more difficult. Plutonium materials are normally handled in shielded glove boxes, and the work entails both extremity and whole-body exposures. This paper reports results of an SRS analysis of plutonium materials source terms vs. the Americium-241 content of the materials. Data on the dependence and magnitude of source terms vs. Am-241 levels are presented and discussed. The investigation encompasses both vitrified and un-vitrified plutonium oxide (PuO2) batches.
NASA Astrophysics Data System (ADS)
Smith, R. A.; Moore, R. B.; Shanley, J. B.; Miller, E. K.; Kamman, N. C.; Nacci, D.
2009-12-01
Mercury (Hg) concentrations in fish and aquatic wildlife are complex functions of atmospheric Hg deposition rate, terrestrial and aquatic watershed characteristics that influence Hg methylation and export, and food chain characteristics determining Hg bioaccumulation. Because of the complexity and incomplete understanding of these processes, regional-scale models of fish tissue Hg concentration are necessarily empirical in nature, typically constructed through regression analysis of fish tissue Hg concentration data from many sampling locations on a set of potential explanatory variables. Unless the data sets are unusually long and show clear time trends, model building must rely solely on spatial correlation. Predictive regional-scale models are highly useful for improving understanding of the relevant biogeochemical processes, as well as for practical fish and wildlife management and human health protection. Mechanistically, the logical arrangement of explanatory variables is to multiply each of the individual Hg source terms (e.g. dry, wet, and gaseous deposition rates, and residual watershed Hg) for a given fish sampling location by source-specific terms pertaining to methylation, watershed transport, and biological uptake for that location (e.g. SO4 availability, hill slope, lake size). This mathematical form has the desirable property that predicted tissue concentration will approach zero as all individual source terms approach zero. One complication with this form, however, is that it is inconsistent with the standard linear multiple regression equation in which all terms (including those for sources and physical conditions) are additive.
An important practical disadvantage of a model in which the Hg source terms are additive (rather than multiplicative) with their modifying factors is that predicted concentration is not zero when all sources are zero, making it unreliable for predicting the effects of large future reductions in Hg deposition. In this paper we compare the results of using several different linear and non-linear models in an analysis of watershed and fish Hg data for 450 New England lakes. The differences in model results pertain both to their utility in interpreting methylation and export processes and to their use in fisheries management.
Analysis of CERN computing infrastructure and monitoring data
NASA Astrophysics Data System (ADS)
Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.
2015-12-01
Optimizing a computing infrastructure on the scale of the LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments collect a multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal of bringing together data sources from different services and different abstraction levels, and of implementing a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single-service boundaries and for the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats and selecting an efficient storage format for MapReduce and external access, and describes the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between CPU/wall fraction, latency/throughput constraints of network and disk, and the effective job throughput. We first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.
Insights Gained from Forensic Analysis with MELCOR of the Fukushima-Daiichi Accidents.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, Nathan C.; Gauntt, Randall O.
Since the accidents at Fukushima-Daiichi, Sandia National Laboratories has been modeling these accident scenarios using the severe accident analysis code, MELCOR. MELCOR is a widely used computer code developed at Sandia National Laboratories since ~1982 for the U.S. Nuclear Regulatory Commission. Insights from the modeling of these accidents are being used to better inform future code development and potentially improve accident management. To date, the need to better capture in-vessel thermal-hydraulics and ex-vessel melt coolability and concrete interactions has led to the implementation of new models. The most recent analyses, presented in this paper, have been in support of the Organization for Economic Cooperation and Development Nuclear Energy Agency's (OECD/NEA) Benchmark Study of the Accident at the Fukushima Daiichi Nuclear Power Station (BSAF) Project. The goal of this project is to accurately capture the source term from all three releases and then model the atmospheric dispersion. In order to do this, a forensic approach is being used in which available plant data and release timings are used to inform the modeled MELCOR accident scenario. For example, containment failures, core slumping events, and lower head failure timings are all enforced parameters in these analyses. This approach is fundamentally different from the blind code assessment analysis often used in standard problem exercises. The timings of these events are informed by representative spikes or decreases in plant data. The combination of improvements to the MELCOR source code resulting from previous accident analyses and this forensic approach has allowed Sandia to generate representative and plausible source terms for all three accidents at Fukushima Daiichi out to three weeks after the accident, capturing both early and late releases.
In particular, using the source terms developed by MELCOR as input to the MACCS code, which models atmospheric dispersion and deposition, we are able to reasonably capture the deposition of radionuclides to the northwest of the reactor site.
Numerical models analysis of energy conversion process in air-breathing laser propulsion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong Yanji; Song Junling; Cui Cunyan
The energy source is treated as the key element in describing the energy conversion process in air-breathing laser propulsion. Some secondary factors were ignored when three independent modules, a ray transmission module, an energy source term module, and a fluid dynamic module, were established by coupling the laser radiation transport equation with the fluid mechanics equations. The incident laser beam was simulated using a ray tracing method. The calculated results were in good agreement with those of theoretical analysis and experiments.
ERIC Educational Resources Information Center
Fidan, Nuray Kurtdede; Ergün, Mustafa
2016-01-01
In this study, social, literary and technological sources used by classroom teachers in social studies courses are analyzed in terms of frequency. The study employs mixed methods research and is designed following the convergent parallel design. In the qualitative part of the study, phenomenological method was used and in the quantitative…
Source and long-term behavior of transuranic aerosols in the WIPP environment.
Thakur, P; Lemons, B G
2016-10-01
The source and long-term behavior of transuranic aerosols ((239+240)Pu, (238)Pu, and (241)Am) in ambient air samples collected at and near the Waste Isolation Pilot Plant (WIPP) deep geologic repository site were investigated using historical data from an independent monitoring program conducted by the Carlsbad Environmental Monitoring and Research Center and an oversight monitoring program conducted by the management and operating contractor for WIPP at and near the facility. An analysis of historical data indicates frequent detections of (239+240)Pu and (241)Am, whereas (238)Pu is detected infrequently. Peaks in (239+240)Pu and (241)Am concentrations in ambient air generally occur in the March to June timeframe, when strong and gusty winds in the area frequently give rise to blowing dust. Long-term measurements of plutonium isotopes (1985-2015) in the WIPP environment suggest that resuspension of previously contaminated soils is likely the primary source of plutonium in ambient air samples from WIPP and its vicinity. There is no evidence that WIPP is a source of environmental contamination that can be considered significant by any health-based standard.
A model for jet-noise analysis using pressure-gradient correlations on an imaginary cone
NASA Technical Reports Server (NTRS)
Norum, T. D.
1974-01-01
The technique for determining the near and far acoustic field of a jet through measurements of pressure-gradient correlations on an imaginary conical surface surrounding the jet is discussed. The necessary analytical developments are presented, and their feasibility is checked by using a point source as the sound generator. The distribution of the apparent sources on the cone, equivalent to the point source, is determined in terms of the pressure-gradient correlations.
Sohrabpour, Abbas; Ye, Shuai; Worrell, Gregory A.; Zhang, Wenbo
2016-01-01
Objective: Combined source imaging techniques and directional connectivity analysis can provide useful information about the underlying brain networks in a non-invasive fashion. Source imaging techniques have previously been used successfully either to determine the source of activity or to extract source time-courses for Granger causality analysis. In this work, we utilize source imaging algorithms both to find the network nodes (regions of interest) and to extract the activation time series for further Granger causality analysis. The aim of this work is to find network nodes objectively from noninvasive electromagnetic signals, extract activation time-courses, and apply Granger analysis on the extracted series to study brain networks under realistic conditions. Methods: Source imaging methods are used to identify network nodes and extract time-courses, and then Granger causality analysis is applied to delineate the directional functional connectivity of underlying brain networks. Computer simulation studies in which the underlying network (nodes and connectivity pattern) is known were performed; additionally, this approach has been evaluated in partial epilepsy patients to study epilepsy networks from inter-ictal and ictal signals recorded by EEG and/or MEG. Results: Localization errors of network nodes were less than 5 mm, and normalized connectivity errors were ~20%, in estimating underlying brain networks in simulation studies. Additionally, two focal epilepsy patients were studied, and the identified nodes driving the epileptic network were concordant with clinical findings from intracranial recordings or surgical resection. Conclusion: Our study indicates that combining source imaging algorithms with Granger causality analysis can identify underlying networks precisely (both in terms of network node location and internodal connectivity).
Significance: The combined source imaging and Granger analysis technique is an effective tool for studying normal or pathological brain conditions. PMID:27740473
Sohrabpour, Abbas; Ye, Shuai; Worrell, Gregory A; Zhang, Wenbo; He, Bin
2016-12-01
Combined source-imaging techniques and directional connectivity analysis can provide useful information about the underlying brain networks in a noninvasive fashion. Source-imaging techniques have previously been used successfully either to determine the source of activity or to extract source time-courses for Granger causality analysis. In this work, we utilize source-imaging algorithms both to find the network nodes [regions of interest (ROI)] and to extract the activation time series for further Granger causality analysis. The aim of this work is to find network nodes objectively from noninvasive electromagnetic signals, extract activation time-courses, and apply Granger analysis on the extracted series to study brain networks under realistic conditions. Source-imaging methods are used to identify network nodes and extract time-courses, and then Granger causality analysis is applied to delineate the directional functional connectivity of underlying brain networks. Computer simulation studies in which the underlying network (nodes and connectivity pattern) is known were performed; additionally, this approach has been evaluated in partial epilepsy patients to study epilepsy networks from interictal and ictal signals recorded by EEG and/or magnetoencephalography (MEG). Localization errors of network nodes were less than 5 mm, and normalized connectivity errors were ∼20%, in estimating underlying brain networks in simulation studies. Additionally, two focal epilepsy patients were studied, and the identified nodes driving the epileptic network were concordant with clinical findings from intracranial recordings or surgical resection. Our study indicates that combining source-imaging algorithms with Granger causality analysis can identify underlying networks precisely (both in terms of network node location and internodal connectivity). The combined source imaging and Granger analysis technique is an effective tool for studying normal or pathological brain conditions.
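The Granger step of such pipelines can be illustrated with a minimal sketch: restricted and full autoregressive models are fitted by least squares to two time series, and the standard F-statistic measures whether adding the driver's lags improves the fit. The simulated two-node network (x drives y), the lag order, and all coefficients below are assumptions for the example, not patient data or the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Simulate a directed two-node network: x Granger-causes y, not vice versa.
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t-1] + rng.normal()
    y[t] = 0.5 * y[t-1] + 0.8 * x[t-1] + rng.normal()

def granger_f(target, driver, p=2):
    """F-statistic for 'driver Granger-causes target' with lag order p."""
    T = len(target) - p
    # Lag-k columns: element i is the series value at time (i + p) - k.
    own = np.column_stack([target[p-k-1:len(target)-k-1] for k in range(p)])
    drv = np.column_stack([driver[p-k-1:len(driver)-k-1] for k in range(p)])
    yv = target[p:]
    X_r = np.column_stack([np.ones(T), own])        # restricted: own lags only
    X_f = np.column_stack([np.ones(T), own, drv])   # full: add driver's lags
    rss = lambda X: np.sum((yv - X @ np.linalg.lstsq(X, yv, rcond=None)[0])**2)
    rss_r, rss_f = rss(X_r), rss(X_f)
    return ((rss_r - rss_f) / p) / (rss_f / (T - X_f.shape[1]))

f_xy = granger_f(y, x)   # large: x's lags help predict y
f_yx = granger_f(x, y)   # small: y's lags add nothing for x
```

Comparing the two F-statistics against an F-distribution threshold recovers the directed edge x → y, which is the directional-connectivity information the abstract refers to.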
The Iterative Reweighted Mixed-Norm Estimate for Spatio-Temporal MEG/EEG Source Reconstruction.
Strohmeier, Daniel; Bekhti, Yousra; Haueisen, Jens; Gramfort, Alexandre
2016-10-01
Source imaging based on magnetoencephalography (MEG) and electroencephalography (EEG) allows for the non-invasive analysis of brain activity with high temporal and good spatial resolution. As the bioelectromagnetic inverse problem is ill-posed, constraints are required. For the analysis of evoked brain activity, spatial sparsity of the neuronal activation is a common assumption. It is often taken into account using convex constraints based on the l1-norm. The resulting source estimates are however biased in amplitude and often suboptimal in terms of source selection due to high correlations in the forward model. In this work, we demonstrate that an inverse solver based on a block-separable penalty with a Frobenius norm per block and an l0.5-quasinorm over blocks addresses both of these issues. For solving the resulting non-convex optimization problem, we propose the iterative reweighted Mixed Norm Estimate (irMxNE), an optimization scheme based on iterative reweighted convex surrogate optimization problems, which are solved efficiently using a block coordinate descent scheme and an active set strategy. We compare the proposed sparse imaging method to the dSPM and the RAP-MUSIC approach based on two MEG data sets. We provide empirical evidence based on simulations and analysis of MEG data that the proposed method improves on the standard Mixed Norm Estimate (MxNE) in terms of amplitude bias, support recovery, and stability.
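The reweighting idea can be sketched in a few lines: each outer iteration solves a convex weighted-l1 problem (here by plain proximal gradient descent, not the block coordinate descent of irMxNE), and weights derived from the previous iterate emulate an l0.5-type penalty, reducing the amplitude bias of a single l1 fit. The gain matrix, sparsity level, and regularization value are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy sparse inverse problem: y = G @ x + noise with a 3-sparse ground truth,
# standing in for a lead-field matrix and focal source amplitudes.
G = rng.normal(size=(60, 120))
x_true = np.zeros(120)
x_true[[10, 50, 90]] = [2.0, -1.5, 1.0]
y = G @ x_true + 0.01 * rng.normal(size=60)

def ista(G, y, w, alpha, n_iter=500):
    """Weighted-l1 solver: min 0.5*||y - G x||^2 + alpha * sum(w * |x|)."""
    L = np.linalg.norm(G, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(G.shape[1])
    for _ in range(n_iter):
        z = x - G.T @ (G @ x - y) / L      # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - alpha * w / L, 0.0)  # prox
    return x

def reweighted_l1(G, y, alpha=1.0, n_outer=5, eps=1e-6):
    # Each convex surrogate uses weights from the previous solution; the
    # 1/(2*sqrt(|x|)) update corresponds to an l0.5-type quasinorm penalty,
    # so surviving coefficients are shrunk less on each pass.
    w = np.ones(G.shape[1])
    for _ in range(n_outer):
        x = ista(G, y, w, alpha)
        w = 1.0 / (2.0 * np.sqrt(np.abs(x)) + eps)
    return x

x_hat = reweighted_l1(G, y)
support = np.flatnonzero(np.abs(x_hat) > 0.5)
```

Coefficients zeroed in one pass receive very large weights and stay zero, while active coefficients are debiased toward their true amplitudes — the support-recovery and amplitude-bias improvements the abstract reports for irMxNE.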
Observed ground-motion variabilities and implication for source properties
NASA Astrophysics Data System (ADS)
Cotton, F.; Bora, S. S.; Bindi, D.; Specht, S.; Drouet, S.; Derras, B.; Pina-Valdes, J.
2016-12-01
One of the key challenges of seismology is to calibrate and analyse the physical factors that control earthquake and ground-motion variabilities. Within the framework of empirical ground-motion prediction equation (GMPE) development, ground-motion residuals (differences between recorded ground motions and the values predicted by a GMPE) are computed. The exponential growth of seismological near-field records and modern regression algorithms allow these residuals to be decomposed into between-event and within-event components. The between-event term quantifies all the residual effects of the source (e.g. stress drops) which are not accounted for by magnitude, the only source parameter of the model. Between-event residuals provide a new and rather robust way to analyse the physical factors that control earthquake source properties and associated variabilities. We will first show the correlation between classical stress drops and between-event residuals. We will also explain why between-event residuals may be a more robust way (compared to classical stress-drop analysis) to analyse earthquake source properties. We will finally calibrate between-event variabilities using recent high-quality global accelerometric datasets (NGA-West 2, RESORCE) and datasets from recent earthquake sequences (L'Aquila, Iquique, Kumamoto). The obtained between-event variabilities will be used to evaluate the variability of earthquake stress drops, but also the variability of source properties which cannot be explained by classical Brune stress-drop variations. We will finally use the between-event residual analysis to discuss regional variations of source properties, differences between aftershocks and mainshocks, and potential magnitude dependencies of source characteristics.
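A minimal numerical sketch of the residual decomposition described above: synthetic total residuals are split into a between-event term (one value per earthquake, absorbing source effects such as stress drop) and within-event residuals, with the event-term standard deviation (tau) and within-event standard deviation (phi) recovered from event means. The event counts and standard deviations are invented for illustration; real GMPE work uses mixed-effects regression rather than simple event means.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic GMPE residuals: 20 events recorded at 30 stations each.
# Total residual = between-event term (source effect) + within-event term.
n_ev, n_st = 20, 30
tau, phi = 0.3, 0.5                      # between- and within-event std devs
b_true = rng.normal(0, tau, n_ev)        # one source term per earthquake
resid = b_true[:, None] + rng.normal(0, phi, (n_ev, n_st))

# Moment-style decomposition: the event mean estimates the between-event
# term; what remains is the within-event residual.
b_hat = resid.mean(axis=1)
within = resid - b_hat[:, None]
tau_hat = b_hat.std(ddof=1)              # slightly inflated by phi**2 / n_st
phi_hat = within.std(ddof=1)
```

The estimated event terms `b_hat` are the quantities that can then be correlated with independently derived stress drops, as the abstract proposes.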
Phan, Kevin; Xie, Ashleigh; Kumar, Narendra; Wong, Sophia; Medi, Caroline; La Meir, Mark; Yan, Tristan D
2015-08-01
Simplified maze procedures involving radiofrequency, cryoenergy and microwave energy sources have been increasingly utilized for the surgical treatment of atrial fibrillation as an alternative to the traditional cut-and-sew approach. In the absence of direct comparisons, a Bayesian network meta-analysis is another alternative to assess the relative effect of different treatments, using indirect evidence. A Bayesian meta-analysis of indirect evidence was performed using 16 published randomized trials identified from 6 databases. Rank probability analysis was used to rank each intervention in terms of its probability of having the best outcome. Sinus rhythm prevalence beyond the 12-month follow-up was similar between the cut-and-sew, microwave and radiofrequency approaches, which were all ranked better than cryoablation (39, 36, and 25 vs 1%, respectively). The cut-and-sew maze was ranked worst in terms of mortality outcomes compared with microwave, radiofrequency and cryoenergy (2 vs 19, 34, and 24%, respectively). The cut-and-sew maze procedure was associated with significantly lower stroke rates compared with microwave ablation [odds ratio <0.01; 95% confidence interval 0.00, 0.82], and ranked best in terms of pacemaker requirements compared with microwave, radiofrequency and cryoenergy (81 vs 14, 1, and <0.01%, respectively). Bayesian rank probability analysis shows that the cut-and-sew approach is associated with the best outcomes in terms of sinus rhythm prevalence and stroke outcomes, and remains the gold standard approach for AF treatment. Given the limitations of indirect comparison analysis, these results should be viewed with caution and not over-interpreted. © The Author 2014. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
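Rank probability analysis of the kind used here can be sketched simply: given joint posterior draws of the effect for each treatment, the probability a treatment is "best" is the fraction of draws in which it has the top effect. The posterior means and spreads below are purely hypothetical placeholders, not the meta-analysis results.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical posterior draws of a log-odds effect for four strategies
# (illustrative numbers only, not the published estimates).
draws = {
    "cut-and-sew":    rng.normal(1.00, 0.30, 10000),
    "radiofrequency": rng.normal(0.90, 0.35, 10000),
    "microwave":      rng.normal(0.95, 0.40, 10000),
    "cryoenergy":     rng.normal(0.40, 0.30, 10000),
}

def rank_best(draws):
    """P(best) per treatment: fraction of joint draws where it has the top effect."""
    names = list(draws)
    mat = np.column_stack([draws[n] for n in names])
    winners = np.argmax(mat, axis=1)          # index of the best arm per draw
    return {n: float(np.mean(winners == i)) for i, n in enumerate(names)}

probs = rank_best(draws)
```

Because every draw has exactly one winner, the probabilities sum to one; overlapping posteriors (as for the top three arms here) translate into similar rank probabilities, mirroring the "similar ... which were all ranked better than cryoablation" pattern reported above.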
NASA Astrophysics Data System (ADS)
Roustan, Yelva; Duhanyan, Nora; Bocquet, Marc; Winiarek, Victor
2013-04-01
A sensitivity study of the numerical model and an inverse modelling approach, both applied to atmospheric dispersion after the Chernobyl disaster, are presented in this paper. On the one hand, the robustness of the source term reconstruction through advanced data assimilation techniques was tested. On the other hand, the classical approaches for sensitivity analysis were enhanced by the use of an optimised forcing field which otherwise is known to be strongly uncertain. The POLYPHEMUS air quality system was used to perform the simulations of radionuclide dispersion. Activity concentrations in air and deposited to the ground of iodine-131, caesium-137 and caesium-134 were considered. The impact of the implemented parameterizations of the physical processes (dry and wet depositions, vertical turbulent diffusion), of the forcing fields (meteorology and source terms) and of the numerical configuration (horizontal resolution) were investigated for the sensitivity study of the model. A four-dimensional variational scheme (4D-Var) based on the approximate adjoint of the chemistry transport model was used to invert the source term. The data assimilation is performed with measurements of activity concentrations in air extracted from the Radioactivity Environmental Monitoring (REM) database. For most of the investigated configurations (sensitivity study), the statistics comparing the model results to the field measurements as regards the concentrations in air are clearly improved when using a reconstructed source term. As regards the ground-deposited concentrations, an improvement can only be seen in the case of a satisfactorily modelled episode. Through these studies, the source term and the meteorological fields are shown to have a major impact on the activity concentrations in air. These studies also reinforce the use of a reconstructed source term instead of the usual estimated one.
A more detailed parameterization of the deposition process also seems able to improve the simulation results. For deposited activities the results are more complex, probably due to a strong sensitivity to some of the meteorological fields, which remain quite uncertain.
Analyzing Student and Employer Satisfaction with Cooperative Education through Multiple Data Sources
ERIC Educational Resources Information Center
Jiang, Yuheng Helen; Lee, Sally Wai Yin; Golab, Lukasz
2015-01-01
This paper reports on the analysis of three years research of undergraduate cooperative work term postings and employer and employee evaluations. The objective of the analysis was to determine the factors affecting student and employer success and satisfaction with the work-integrated learning experience. It was found that students performed…
Radiological Source Terms for Tank Farms Safety Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
COWLEY, W.L.
2000-06-27
This document provides Unit Liter Dose factors, atmospheric dispersion coefficients, breathing rates and instructions for using and customizing these factors for use in calculating radiological doses for accident analyses in the Hanford Tank Farms.
Murakami, Toshiki; Suzuki, Yoshihiro; Oishi, Hiroyuki; Ito, Kenichi; Nakao, Toshio
2013-05-15
A unique method to trace the source of "difficult-to-settle fine particles," which are a causative factor of long-term turbidity in reservoirs, was developed. This method is characterized by cluster analysis of XRD (X-ray diffraction) data and homology comparison of major component compositions between "difficult-to-settle fine particles" contained in landslide soil samples taken from upstream of a dam and suspended "long-term turbid water particles" in the reservoir, which is subject to long-term turbidity. The experiment carried out to validate the proposed method demonstrated a high possibility of making an almost identical match between "difficult-to-settle fine particles" taken from landslide soils at specific locations and "long-term turbid water particles" taken from a reservoir. This method has the potential to determine the substances causing long-term turbidity and the locations of the soils from which those substances came. Appropriate countermeasures can then be taken at those specific locations. Copyright © 2013 Elsevier Ltd. All rights reserved.
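The matching step of such a tracing method can be illustrated with a toy sketch: each candidate soil source is represented by an XRD intensity pattern, and a reservoir sample is attributed to the source whose pattern it correlates with most strongly. The patterns, site names, and the use of plain Pearson correlation (rather than the paper's cluster analysis and homology comparison) are all simplifying assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical XRD intensity patterns (counts vs. 2-theta bins) for soils
# from three landslide locations, plus one suspended-particle sample.
bins = 200
soils = {f"site_{i}": np.abs(rng.normal(100, 30, bins)) for i in range(3)}
# The reservoir sample resembles site_1 with measurement noise.
reservoir = soils["site_1"] + rng.normal(0, 10, bins)

def best_match(sample, candidates):
    """Trace a sample to its most similar source pattern by Pearson correlation."""
    scores = {name: float(np.corrcoef(sample, pat)[0, 1])
              for name, pat in candidates.items()}
    return max(scores, key=scores.get), scores

match, scores = best_match(reservoir, soils)
```

A clear winner among the correlation scores corresponds to the "almost identical match" the validation experiment demonstrated, pointing countermeasures at a specific location.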
Mix It up: Variety Is Key to a Well-Rounded Data-Analysis Plan
ERIC Educational Resources Information Center
Easton, Lois Brown
2008-01-01
Variety may be the spice of life, but in terms of data sources, variety is more than a spice--it is one of the basic food groups. Alternative data sources, such as student interviews and walk-throughs, are essential for a well-balanced diet. Data from test scores alone, whether from norm-referenced or criterion-referenced tests, state, district,…
Analysis of an entrainment model of the jet in a crossflow
NASA Technical Reports Server (NTRS)
Chang, H. S.; Werner, J. E.
1972-01-01
A theoretical model has been proposed for the problem of a round jet in an incompressible cross-flow. The method of matched asymptotic expansions has been applied to this problem. For the solution to the flow problem in the inner region, the re-entrant wake flow model was used, with the re-entrant flow representing the fluid entrained by the jet. Higher order corrections are obtained in terms of this basic solution. The perturbation terms in the outer region were found to be a line distribution of doublets and sources. The line distribution of sources represents the combined effect of the entrainment and the displacement.
NASA Astrophysics Data System (ADS)
Ni, X. Y.; Huang, H.; Du, W. P.
2017-02-01
The PM2.5 problem is proving to be a major public crisis of great public concern requiring an urgent response. Understanding and prediction of PM2.5 from the perspective of atmospheric dynamic theory are still limited owing to the complexity of the formation and development of PM2.5. In this paper, we attempt correlation analysis and short-term prediction of PM2.5 concentrations in Beijing, China, using multi-source data mining. A correlation model relating PM2.5 to physical data (meteorological data, including regional average rainfall, daily mean temperature, average relative humidity, average wind speed, and maximum wind speed, and concentrations of other pollutants, including CO, NO2, SO2, and PM10) and social media data (microblog data) was proposed, based on multivariate statistical analysis. The study found that, among these factors, average wind speed, the concentrations of CO, NO2, and PM10, and the daily number of microblog entries with the key words 'Beijing; Air pollution' show high correlation with PM2.5 concentrations. The correlation analysis was further studied with a machine learning model, the Back Propagation Neural Network (BPNN), which was found to perform better in correlation mining. Finally, an Autoregressive Integrated Moving Average (ARIMA) time series model was applied to explore short-term prediction of PM2.5; the predicted results were in good agreement with the observed data. This study helps realize real-time monitoring, analysis, and pre-warning of PM2.5, and it also broadens the application of big data and multi-source data mining methods.
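The short-term forecasting step can be illustrated with a minimal autoregressive fit, the core idea behind the ARIMA model used above; the series, coefficient, and noise level below are synthetic stand-ins, not the Beijing data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a daily PM2.5 series: an AR(1) process around a mean.
phi, mu, n = 0.8, 100.0, 500
x = np.empty(n)
x[0] = mu
for t in range(1, n):
    x[t] = mu + phi * (x[t - 1] - mu) + rng.normal(0.0, 5.0)

# Least-squares AR(1) fit: (x[t] - mean) ~ phi * (x[t-1] - mean).
xc = x - x.mean()
phi_hat = (xc[:-1] @ xc[1:]) / (xc[:-1] @ xc[:-1])

# One-step-ahead (next-day) forecast.
forecast = x.mean() + phi_hat * (x[-1] - x.mean())
print(round(phi_hat, 2))  # close to the true coefficient 0.8
```

A full ARIMA implementation (e.g., statsmodels) additionally handles differencing and moving-average terms; the least-squares fit above captures only the autoregressive part.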
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunett, Acacia J.; Bucknor, Matthew; Grabaskas, David
A vital component of the U.S. reactor licensing process is an integrated safety analysis in which a source term representing the release of radionuclides during normal operation and accident sequences is analyzed. Historically, source term analyses have utilized bounding, deterministic assumptions regarding radionuclide release. However, advancements in technical capabilities and the knowledge state have enabled the development of more realistic and best-estimate retention and release models, such that a mechanistic source term assessment can be expected to be a required component of future licensing of advanced reactors. Recently, as part of a Regulatory Technology Development Plan effort for sodium-cooled fast reactors (SFRs), Argonne National Laboratory has investigated the current state of knowledge of potential source terms in an SFR via an extensive review of previous domestic experiments, accidents, and operation. As part of this work, the significant sources and transport processes of radionuclides in an SFR have been identified and characterized. This effort examines all stages of release and source term evolution, beginning with release from the fuel pin and ending with retention in containment. Radionuclide sources considered in this effort include releases originating both in-vessel (e.g., in-core fuel, primary sodium, cover gas cleanup system, etc.) and ex-vessel (e.g., spent fuel storage, handling, and movement). Releases resulting from a primary sodium fire are also considered as a potential source. For each release group, dominant transport phenomena are identified and qualitatively discussed. The key product of this effort was the development of concise, inclusive diagrams that illustrate the release and retention mechanisms at a high level, where unique schematics have been developed for in-vessel, ex-vessel, and sodium fire releases.
This review effort has also found that despite the substantial range of phenomena affecting radionuclide release, the current state of knowledge is extensive and in most areas may be sufficient. Several knowledge gaps were identified, such as uncertainty in release from molten fuel and the availability of thermodynamic data for lanthanides and actinides in liquid sodium. However, the overall findings suggest that high retention rates can be expected within the fuel and primary sodium for all radionuclides other than noble gases.
Sample Based Unit Liter Dose Estimates
DOE Office of Scientific and Technical Information (OSTI.GOV)
JENSEN, L.
The Tank Waste Characterization Program has taken many core samples, grab samples, and auger samples from the single-shell and double-shell tanks during the past 10 years. Consequently, the amount of sample data available has increased, both in terms of the quantity of sample results and the number of tanks characterized. More and better data are available than when the current radiological and toxicological source terms used in the Basis for Interim Operation (BIO) (FDH 1999a) and the Final Safety Analysis Report (FSAR) (FDH 1999b) were developed. The Nuclear Safety and Licensing (NS and L) organization wants to use the new data to upgrade the radiological and toxicological source terms used in the BIO and FSAR. The NS and L organization requested assistance in producing a statistically based process for developing the source terms. This report describes the statistical techniques used and the assumptions made to support the development of a new radiological source term for liquid and solid wastes stored in single-shell and double-shell tanks. The results given in this report are a revision of similar results given in an earlier version of the document (Jensen and Wilmarth 1999). The main difference between the results in this document and the earlier version is that the dose conversion factors (DCF) for converting {mu}Ci/g or {mu}Ci/L to Sv/L (sieverts per liter) have changed. There are now two DCFs, one based on ICRP-68 and one based on ICRP-71 (Brevick 2000).
Learning Discriminative Sparse Models for Source Separation and Mapping of Hyperspectral Imagery
2010-10-01
allowing spectroscopic analysis. The data acquired by these spectrometers play significant roles in biomedical, environmental, land-survey, and... noisy in nature, so there are differences between the true and the observed signals. In addition, there are distortions associated with atmosphere... handwriting classification, showing advantages of using both terms instead of only using the reconstruction term as in previous approaches. C. Dictionary
Baeza, A; Corbacho, J A; Guillén, J; Salas, A; Mora, J C
2011-05-01
The present work studied the radioactivity impact of a coal-fired power plant (CFPP), a NORM industry, on the water of the Regallo river, which the plant uses for cooling. Downstream, this river passes through an important irrigated farming area, and it is a tributary of the Ebro, one of Spain's largest rivers. Although no alteration of the (210)Po or (232)Th content was detected, the (234,238)U and (226)Ra contents of the water were significantly greater immediately below the CFPP's discharge point. The (226)Ra concentration decreased progressively downstream from the discharge point, but the uranium content increased significantly again at two sampling points 8 km downstream from the CFPP's effluent. This suggested the presence of another, unexpected uranium source term different from the CFPP. The input from this second uranium source term was even greater than that from the CFPP. Different hypotheses were tested (a reservoir used for irrigation, remobilization from sediments, and the effect of fertilizers used in the area), and it was finally demonstrated that the source was the fertilizers used in the adjacent farming areas. Copyright © 2011 Elsevier Ltd. All rights reserved.
Wennberg, Richard; Cheyne, Douglas
2014-05-01
To assess the reliability of MEG source imaging (MSI) of anterior temporal spikes through detailed analysis of the localization and orientation of source solutions obtained for a large number of spikes that were separately confirmed by intracranial EEG to be focally generated within a single, well-characterized spike focus. MSI was performed on 64 identical right anterior temporal spikes from an anterolateral temporal neocortical spike focus. The effects of different volume conductors (sphere and realistic head model), removal of noise with low frequency filters (LFFs) and averaging multiple spikes were assessed in terms of the reliability of the source solutions. MSI of single spikes resulted in scattered dipole source solutions that showed reasonable reliability for localization at the lobar level, but only for solutions with a goodness-of-fit exceeding 80% using a LFF of 3 Hz. Reliability at a finer level of intralobar localization was limited. Spike averaging significantly improved the reliability of source solutions and averaging 8 or more spikes reduced dependency on goodness-of-fit and data filtering. MSI performed on topographically identical individual spikes from an intracranially defined classical anterior temporal lobe spike focus was limited by low reliability (i.e., scattered source solutions) in terms of fine, sublobar localization within the ipsilateral temporal lobe. Spike averaging significantly improved reliability. MSI performed on individual anterior temporal spikes is limited by low reliability. Reduction of background noise through spike averaging significantly improves the reliability of MSI solutions. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
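The benefit of spike averaging reported above follows from basic statistics: independent background noise averaged over N trials shrinks by roughly √N. A minimal numerical sketch, with an idealized waveform and noise level standing in for real MEG data:

```python
import numpy as np

rng = np.random.default_rng(1)

n_spikes, n_samples = 64, 200
signal = np.sin(np.linspace(0.0, np.pi, n_samples))  # idealized spike waveform
noise_sd = 2.0

# Simulate 64 identical spikes buried in independent background noise.
trials = signal + rng.normal(0.0, noise_sd, (n_spikes, n_samples))

single_err = np.std(trials[0] - signal)              # noise in one trial
avg_err = np.std(trials.mean(axis=0) - signal)       # noise after averaging

print(single_err / avg_err)  # roughly sqrt(64) = 8
```

This is why averaging 8 or more spikes, as found in the study, already gives a severalfold reduction in background noise relative to single-spike MSI.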
Brown, J. F.; Hendy, Steve
2001-01-01
In spite of repeated efforts to explain itself to a wider audience, behavior analysis remains a largely misunderstood and isolated discipline. In this article we argue that this situation is in part due to the terms we use in our technical discussions. In particular, reinforcement and punishment, with their vernacular associations of reward and retribution, are a source of much misunderstanding. Although contemporary thinking within behavior analysis holds that reinforcement and punishment are Darwinian processes whereby behavioral variants are selected and deselected by their consequences, the continued use of the terms reinforcement and punishment to account for behavioral evolution obscures this fact. To clarify and simplify matters, we propose replacing the terms reinforcement and punishment with selection and deselection, respectively. These changes would provide a terminological meeting point with other selectionist sciences, thereby increasing the likelihood that behavior analysis will contribute to Darwinian science. PMID:22478361
ERIC Educational Resources Information Center
Alavi, Seyed Mohammad; Bordbar, Soodeh
2017-01-01
Differential Item Functioning (DIF) analysis is a key element in evaluating educational test fairness and validity. One of the frequently cited sources of construct-irrelevant variance is gender which has an important role in the university entrance exam; therefore, it causes bias and consequently undermines test validity. The present study aims…
The Rush toward Universal Public Pre-K: A Media Analysis
ERIC Educational Resources Information Center
Brown, Carolyn A.; Wright, Travis S.
2011-01-01
Research has shown for decades that early childhood education contributes to long-term increases in student achievement for all children, but what is motivating the current movement toward universal pre-K? This study used a content analysis of five major print media sources to explore how the media are framing the public pre-K movement. We looked…
NASA Astrophysics Data System (ADS)
Koliopanos, Filippos; Vasilopoulos, Georgios
2018-06-01
Aims: We study the temporal and spectral characteristics of SMC X-3 during its recent (2016) outburst to probe accretion onto highly magnetized neutron stars (NSs) at the Eddington limit. Methods: We obtained XMM-Newton observations of SMC X-3 and combined them with long-term observations by Swift. We performed a detailed analysis of the temporal and spectral behavior of the source, as well as its short- and long-term evolution. We have also constructed a simple toy model (based on robust theoretical predictions) in order to gain insight into the complex emission pattern of SMC X-3. Results: We confirm the pulse period of the system that has been derived by previous works and note that the pulse has a complex three-peak shape. We find that the pulsed emission is dominated by hard photons, while at energies below 1 keV, the emission does not pulsate. We furthermore find that the shape of the pulse profile and the short- and long-term evolution of the source light curve can be explained by invoking a combination of a "fan" and a "polar" beam. The results of our temporal study are supported by our spectroscopic analysis, which reveals a two-component emission, comprised of a hard power law and a soft thermal component. We find that the latter produces the bulk of the non-pulsating emission and is most likely the result of reprocessing the primary hard emission by optically thick material that partly obscures the central source. We also detect strong emission lines from highly ionized metals. The strength of the emission lines strongly depends on the phase. Conclusions: Our findings are in agreement with previous works.
The energy and temporal evolution, as well as the shape of the pulse profile and the long-term spectral evolution of the source, are consistent with the expected emission pattern of the accretion column in the super-critical regime, while the large reprocessing region is consistent with the analysis of previously studied X-ray pulsars observed at high accretion rates. This reprocessing region is also consistent with recent theoretical and observational works suggesting that highly magnetized NSs occupy a considerable fraction of ultraluminous X-ray sources.
Arnold, Anne; Sajitz-Hermstein, Max; Nikoloski, Zoran
2015-01-01
Plants as sessile organisms cannot escape their environment and have to adapt to any changes in the availability of sunlight and nutrients. The quantification of synthesis costs of metabolites, in terms of consumed energy, is a prerequisite to understand trade-offs arising from energetic limitations. Here, we examine the energy consumption of amino acid synthesis in Arabidopsis thaliana. To quantify these costs in terms of the energy equivalent ATP, we introduce an improved cost measure based on flux balance analysis and apply it to three state-of-the-art metabolic reconstructions to ensure robust results. We present the first systematic in silico analysis of the effect of nitrogen supply (nitrate/ammonium) on individual amino acid synthesis costs as well as of the effect of photoautotrophic and heterotrophic growth conditions, integrating day/night-specific regulation. Our results identify nitrogen supply as a key determinant of amino acid costs, in agreement with experimental evidence. In addition, the association of the determined costs with experimentally observed growth patterns suggests that metabolite synthesis costs are involved in shaping regulation of plant growth. Finally, we find that simultaneous uptake of both nitrogen sources can lead to efficient utilization of the energy source, which may be the result of evolutionary optimization. PMID:25706533
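The flux-balance cost measure can be sketched as a small linear program: choose pathway fluxes that meet a fixed amino acid demand at minimum ATP expenditure. The two-route network and its ATP stoichiometries below are invented for illustration and are far simpler than the genome-scale reconstructions used in the study.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: two alternative synthesis routes for one amino acid.
# Route 1 consumes 3 ATP per unit of product, route 2 consumes 2 ATP.
atp_per_flux = np.array([3.0, 2.0])

# Steady-state constraint: total synthesis flux meets a demand of 1 unit.
A_eq = np.array([[1.0, 1.0]])
b_eq = np.array([1.0])

# Minimize total ATP consumption subject to the demand constraint.
res = linprog(c=atp_per_flux, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 2)
print(res.fun)  # minimal ATP cost per unit of amino acid: 2.0
```

The optimizer routes all flux through the cheaper pathway, which is exactly the sense in which flux balance analysis assigns an energy cost to each metabolite.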
Evaluating sources and processing of nonpoint source nitrate in a small suburban watershed in China
NASA Astrophysics Data System (ADS)
Han, Li; Huang, Minsheng; Ma, Minghai; Wei, Jinbao; Hu, Wei; Chouhan, Seema
2018-04-01
Identifying nonpoint sources of nitrate has been a long-term challenge in mixed land-use watersheds. In the present study, we combined dual nitrate isotopes, runoff, and stream water monitoring to elucidate nonpoint nitrate sources across land uses and to determine the relative importance of biogeochemical processes for nitrate export in a small suburban watershed, the Longhongjian watershed, China. Our study suggested that NH4+ fertilizer, soil NH4+, litter fall, and groundwater were the main nitrate sources in Longhongjian Stream. There were large changes in nitrate sources in response to season and land use. Runoff analysis showed that the tea plantation and forest areas contributed a dominant proportion of the TN export. Spatial analysis showed that the NO3- concentration was high in the tea plantation and forest areas, and that δ15N-NO3 and δ18O-NO3 were enriched in the step ponds. Temporal analysis showed high NO3- levels in spring, and nitrate isotopes were enriched in summer. The study also showed that the step ponds played an important role in mitigating nitrate pollution. Nitrification and plant uptake were the significant biogeochemical processes contributing to nitrogen transformation, and denitrification hardly occurred in the stream.
Next generation data harmonization
NASA Astrophysics Data System (ADS)
Armstrong, Chandler; Brown, Ryan M.; Chaves, Jillian; Czerniejewski, Adam; Del Vecchio, Justin; Perkins, Timothy K.; Rudnicki, Ron; Tauer, Greg
2015-05-01
Analysts are presented with a never-ending stream of data sources. Often, the subsets of data sources needed to solve a problem are easily identified, but the process of aligning the data sets is time consuming. Many semantic technologies, however, allow fast harmonization of data to overcome these problems. These include ontologies that serve as alignment targets, visual tools and natural language processing that generate semantic graphs in terms of the ontologies, and analytics that leverage these graphs. This research reviews a developed prototype that employs all of these approaches to perform analysis across disparate data sources documenting violent, extremist events.
A Variable Frequency, Mis-Match Tolerant, Inductive Plasma Source
NASA Astrophysics Data System (ADS)
Rogers, Anthony; Kirchner, Don; Skiff, Fred
2014-10-01
Presented here is a survey and analysis of an inductively coupled, magnetically confined, singly ionized argon plasma generated by a square-wave, variable-frequency plasma source. The helicon-style antenna is driven directly by the class "D" amplifier without a matching network, for increased efficiency, while maintaining independent control of frequency and applied power at the feed point. The survey is compared to similar data taken using a traditional exciter-power amplifier-matching network source. Specifically, the flexibility of this plasma source in terms of independent control of electron plasma temperature and density is discussed in comparison to traditional source arrangements. Supported by US DOE Grant DE-FG02-99ER54543.
Access to safe water in rural Artibonite, Haiti 16 months after the onset of the cholera epidemic.
Patrick, Molly; Berendes, David; Murphy, Jennifer; Bertrand, Fabienne; Husain, Farah; Handzel, Thomas
2013-10-01
Haiti has the lowest improved water and sanitation coverage in the Western Hemisphere and is suffering from the largest cholera epidemic on record. In May of 2012, an assessment was conducted in rural areas of the Artibonite Department to describe the type and quality of water sources and determine knowledge, access, and use of household water treatment products to inform future programs. It was conducted after emergency response was scaled back but before longer-term water, sanitation, and hygiene activities were initiated. The household survey and source water quality analysis documented low access to safe water, with only 42.3% of households using an improved drinking water source. One-half (50.9%) of the improved water sources tested positive for Escherichia coli. Of households with water to test, 12.7% had positive chlorine residual. The assessment reinforces the identified need for major investments in safe water and sanitation infrastructure and the importance of household water treatment to improve access to safe water in the near term.
Advances in audio source separation and multisource audio content retrieval
NASA Astrophysics Data System (ADS)
Vincent, Emmanuel
2012-06-01
Audio source separation aims to extract the signals of individual sound sources from a given recording. In this paper, we review three recent advances which improve the robustness of source separation in real-world challenging scenarios and enable its use for multisource content retrieval tasks, such as automatic speech recognition (ASR) or acoustic event detection (AED) in noisy environments. We present a Flexible Audio Source Separation Toolkit (FASST) and discuss its advantages compared to earlier approaches such as independent component analysis (ICA) and sparse component analysis (SCA). We explain how cues as diverse as harmonicity, spectral envelope, temporal fine structure or spatial location can be jointly exploited by this toolkit. We subsequently present the uncertainty decoding (UD) framework for the integration of audio source separation and audio content retrieval. We show how the uncertainty about the separated source signals can be accurately estimated and propagated to the features. Finally, we explain how this uncertainty can be efficiently exploited by a classifier, both at the training and the decoding stage. We illustrate the resulting performance improvements in terms of speech separation quality and speaker recognition accuracy.
NASA Astrophysics Data System (ADS)
Camero-Arranz, Ascension; Finger, M. H.; Wilson-Hodge, C.; Caballero, I.; Kretschmar, P.; Jenke, P. A.; Beklen, E.
2010-03-01
We present a long-term timing analysis of the accreting X-ray pulsar A 0535+26 using data from Fermi/GBM, RXTE and Swift/BAT. A new orbital ephemeris is obtained from normal outbursts experienced by this source since 2005, and a long-term pulse profile study is carried out. In this study we include results from the current outburst. This outburst is believed to be much larger than the previous ones.
Beyond the double banana: improved recognition of temporal lobe seizures in long-term EEG.
Rosenzweig, Ivana; Fogarasi, András; Johnsen, Birger; Alving, Jørgen; Fabricius, Martin Ejler; Scherg, Michael; Neufeld, Miri Y; Pressler, Ronit; Kjaer, Troels W; van Emde Boas, Walter; Beniczky, Sándor
2014-02-01
To investigate whether extending the 10-20 array with 6 electrodes in the inferior temporal chain and constructing computed montages increases the diagnostic value of ictal EEG activity originating in the temporal lobe. In addition, the accuracy of computer-assisted spectral source analysis was investigated. Forty EEG samples were reviewed by 7 EEG experts in various montages (longitudinal and transversal bipolar, common average, source derivation, source montage, current source density, and reference-free montages) using 2 electrode arrays (the 10-20 array and the extended one). Spectral source analysis used the source montage to calculate a density spectral array, defining the earliest oscillatory onset; from this, phase maps were calculated for localization. The reference standard was the decision of the multidisciplinary epilepsy surgery team on the seizure onset zone. Clinical performance was compared with the double banana (longitudinal bipolar montage, 10-20 array). Adding the inferior temporal electrode chain, computed montages (reference free, common average, and source derivation), and voltage maps significantly increased the sensitivity. Phase maps had the highest sensitivity and identified ictal activity at an earlier time-point than visual inspection. There was no significant difference concerning specificity. The findings advocate the use of these digital EEG technology-derived analysis methods in clinical practice.
NASA Astrophysics Data System (ADS)
Zhao, Yang; Dai, Rui-Na; Xiao, Xiang; Zhang, Zong; Duan, Lian; Li, Zheng; Zhu, Chao-Zhe
2017-02-01
Two-person neuroscience, a perspective in understanding human social cognition and interaction, involves designing immersive social interaction experiments as well as simultaneously recording brain activity of two or more subjects, a process termed "hyperscanning." Using newly developed imaging techniques, the interbrain connectivity or hyperlink of various types of social interaction has been revealed. Functional near-infrared spectroscopy (fNIRS)-hyperscanning provides a more naturalistic environment for experimental paradigms of social interaction and has recently drawn much attention. However, most fNIRS-hyperscanning studies have computed hyperlinks using sensor data directly while ignoring the fact that the sensor-level signals contain confounding noises, which may lead to a loss of sensitivity and specificity in hyperlink analysis. In this study, on the basis of independent component analysis (ICA), a source-level analysis framework is proposed to investigate the hyperlinks in a fNIRS two-person neuroscience study. The performance of five widely used ICA algorithms in extracting sources of interaction was compared in simulative datasets, and increased sensitivity and specificity of hyperlink analysis by our proposed method were demonstrated in both simulative and real two-person experiments.
Mapping water availability, projected use and cost in the western United States
NASA Astrophysics Data System (ADS)
Tidwell, Vincent C.; Moreland, Barbara D.; Zemlick, Katie M.; Roberts, Barry L.; Passell, Howard D.; Jensen, Daniel; Forsgren, Christopher; Sehlke, Gerald; Cook, Margaret A.; King, Carey W.; Larsen, Sara
2014-05-01
New demands for water can be satisfied through a variety of source options. In some basins, surface water and/or groundwater may be available through permitting with the state water management agency (termed unappropriated water); alternatively, water might be purchased and transferred out of its current use to another (termed appropriated water), or non-traditional water sources can be captured and treated (e.g., wastewater). The relative availability and cost of each source are key factors in the development decision. Unfortunately, these measures are location dependent, with no consistent or comparable set of data available for evaluating competing water sources. With the help of western water managers, water availability was mapped for over 1200 watersheds throughout the western US. Five water sources were individually examined: unappropriated surface water, unappropriated groundwater, appropriated water, municipal wastewater, and brackish groundwater. Also mapped was the projected change in consumptive water use from 2010 to 2030. The associated costs to acquire, convey, and treat the water, as necessary, for each of the five sources were estimated. These metrics were developed to support regional water planning and policy analysis, with initial application to electric transmission planning in the western US.
Numerical Simulations of Reacting Flows Using Asynchrony-Tolerant Schemes for Exascale Computing
NASA Astrophysics Data System (ADS)
Cleary, Emmet; Konduri, Aditya; Chen, Jacqueline
2017-11-01
Communication and data synchronization between processing elements (PEs) are likely to pose a major challenge in scalability of solvers at the exascale. Recently developed asynchrony-tolerant (AT) finite difference schemes address this issue by relaxing communication and synchronization between PEs at a mathematical level while preserving accuracy, resulting in improved scalability. The performance of these schemes has been validated for simple linear and nonlinear homogeneous PDEs. However, many problems of practical interest are governed by highly nonlinear PDEs with source terms, whose solution may be sensitive to perturbations caused by communication asynchrony. The current work applies the AT schemes to combustion problems with chemical source terms, yielding a stiff system of PDEs with nonlinear source terms highly sensitive to temperature. Examples shown will use single-step and multi-step CH4 mechanisms for 1D premixed and nonpremixed flames. Error analysis will be discussed both in physical and spectral space. Results show that additional errors introduced by the AT schemes are negligible and the schemes preserve their accuracy. We acknowledge funding from the DOE Computational Science Graduate Fellowship administered by the Krell Institute.
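The effect of relaxed synchronization can be illustrated with a toy 1-D diffusion solve in which one grid point reads a one-step-old neighbor value, mimicking a delayed halo exchange between PEs. This is only a sketch of the asynchrony problem, not of the AT schemes themselves; the grid size, step count, and delay pattern are invented.

```python
import numpy as np

nx, nt, alpha = 64, 200, 0.2              # grid points, time steps, diffusion number
x = np.linspace(0.0, 1.0, nx)
u_sync = np.sin(np.pi * x)                # fully synchronous reference field
u_async = u_sync.copy()                   # field advanced with one delayed value
j = nx // 2                               # "processor boundary" grid point
stale = u_async[j - 1]                    # cached neighbor value (no delay yet)

for _ in range(nt):
    # Synchronous explicit update (second-order central differences).
    u_new = u_sync.copy()
    u_new[1:-1] += alpha * (u_sync[2:] - 2 * u_sync[1:-1] + u_sync[:-2])
    u_sync = u_new

    # Same update, but point j sees a one-step-old left-neighbor value.
    v_new = u_async.copy()
    v_new[1:-1] += alpha * (u_async[2:] - 2 * u_async[1:-1] + u_async[:-2])
    v_new[j] = u_async[j] + alpha * (u_async[j + 1] - 2 * u_async[j] + stale)
    stale = u_async[j - 1]                # value the delayed read will see next step
    u_async = v_new

err = np.max(np.abs(u_async - u_sync))
print(err)  # asynchrony-induced error is small here but nonzero
```

For this smooth linear problem the delay-induced error stays small; the point of the AT schemes is to modify the stencil so that accuracy is preserved even when such delays occur, which matters far more for the stiff, temperature-sensitive chemical source terms studied above.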
Multi-Detector Analysis System for Spent Nuclear Fuel Characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reber, Edward Lawrence; Aryaeinejad, Rahmat; Cole, Jerald Donald
1999-09-01
The Spent Nuclear Fuel (SNF) Non-Destructive Analysis (NDA) program at INEEL is developing a system to characterize SNF for fissile mass, radiation source term, and fissile isotopic content. The system is based on the integration of the Fission Assay Tomography System (FATS) and the Gamma-Neutron Analysis Technique (GNAT) developed under programs supported by the DOE Office of Non-proliferation and National Security. Both FATS and GNAT were developed as separate systems, to provide information on the location of special nuclear material in weapons configuration (FATS role) and to measure isotopic ratios of fissile material to determine whether the material was from a weapon (GNAT role). FATS is capable of determining not only the presence and location of fissile material but also the quantity of fissile material present, to within 50%. GNAT determines the ratios of the fissile and fissionable material by coincidence methods that allow the two promptly produced fission fragments to be identified. Therefore, from the combination of FATS and GNAT, MDAS is able to measure the fissile mass, radiation source term, and fissile isotopic content.
ERIC Educational Resources Information Center
González-Valiente, Carlos Luis
2015-01-01
The paper presents a bibliometric analysis on the topic of Information Technology (IT) in the field of Educational Sciences, aimed at envisioning the research emerging trends. The ERIC database is used as a consultation source; the results were subjected to productivity by authors, journals, and term co-occurrence analysis indicators for the…
NASA Astrophysics Data System (ADS)
Anita, G.; Selva, J.; Laura, S.
2011-12-01
We develop a comprehensive and total probabilistic tsunami hazard assessment (TotPTHA) in which many different possible source types concur in the definition of the total tsunami hazard at given target sites. In a multi-hazard and multi-risk perspective, this innovative approach makes it possible, in principle, to consider all possible tsunamigenic sources, from seismic events to slides, asteroids, volcanic eruptions, etc. In this respect, we also formally introduce and discuss the treatment of interaction/cascade effects in the TotPTHA analysis. We demonstrate how external triggering events may induce significant temporary variations in the tsunami hazard. Because of this, such effects should always be considered, at least in short-term applications, to obtain unbiased analyses. Finally, we prove the feasibility of the TotPTHA and of the treatment of interaction/cascade effects by applying the methodology to an ideal region with realistic characteristics (Neverland).
Fermi Large Area Telescope Second Source Catalog
NASA Technical Reports Server (NTRS)
Nolan, P. L.; Abdo, A. A.; Ackermann, M.; Ajello, M.; Allafort, A.; Antolini, E.; Bonnell, J.; Cannon, A.; Celik, O.; Corbet, R.;
2012-01-01
We present the second catalog of high-energy gamma-ray sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi), derived from data taken during the first 24 months of the science phase of the mission, which began on 2008 August 4. Source detection is based on the average flux over the 24-month period. The Second Fermi-LAT catalog (2FGL) includes source location regions, defined in terms of elliptical fits to the 95% confidence regions, and spectral fits in terms of power-law, exponentially cutoff power-law, or log-normal forms. Also included are flux measurements in 5 energy bands and light curves on monthly intervals for each source. Twelve sources in the catalog are modeled as spatially extended. We provide a detailed comparison of the results from this catalog with those from the first Fermi-LAT catalog (1FGL). Although the diffuse Galactic and isotropic models used in the 2FGL analysis are improved compared to the 1FGL catalog, we attach caution flags to 162 of the sources to indicate possible confusion with residual imperfections in the diffuse model. The 2FGL catalog contains 1873 sources detected and characterized in the 100 MeV to 100 GeV range, of which we consider 127 as being firmly identified and 1171 as being reliably associated with counterparts of known or likely gamma-ray-producing source classes.
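The three spectral shapes named in the abstract can be written down explicitly. The parameterizations below follow the forms commonly used for LAT catalogs; the normalization, pivot energy, and index values are illustrative, not catalog entries.

```python
import math

# Spectral forms used in LAT catalogs (E in GeV; K, E0, indices illustrative).
def power_law(E, K, E0, gamma):
    return K * (E / E0) ** (-gamma)

def cutoff_power_law(E, K, E0, gamma, Ecut):
    return K * (E / E0) ** (-gamma) * math.exp(-E / Ecut)

def log_parabola(E, K, E0, alpha, beta):
    # the "log-normal" form: the index steepens logarithmically with energy
    return K * (E / E0) ** (-(alpha + beta * math.log(E / E0)))

# At E = E0 the power-law and log-parabola forms reduce to K:
print(power_law(1.0, 1e-9, 1.0, 2.2))   # 1e-09
```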
NASA Astrophysics Data System (ADS)
Kooyman, Timothée; Buiron, Laurent; Rimpault, Gérald
2017-09-01
Heterogeneous loading of minor actinides in radial blankets is a potential solution for implementing minor actinides transmutation in fast reactors. However, to compensate for the lower flux level experienced by the blankets, the fraction of minor actinides loaded in the blankets must be increased to maintain acceptable performance. This severely increases the decay heat and neutron source of the blanket assemblies, both before and after irradiation, by more than an order of magnitude in the case of the neutron source, for instance. We propose here an optimization methodology for blanket design with regard to various parameters, such as the local spectrum or the mass to be loaded, with the objective of minimizing the final neutron source of the spent assembly while maximizing the transmutation performance of the blankets. In a first stage, an analysis of the various contributors to the long- and short-term neutron and gamma source is carried out, while in a second stage, relevant estimators are designed for use in the effective optimization process, which is done in the last step. A comparison with core calculations is finally done for completeness and validation purposes. It is found that the use of a moderated spectrum in the blankets can be beneficial in terms of final neutron and gamma source without impacting minor actinides transmutation performance, compared to a more energetic spectrum that could be achieved using metallic fuel, for instance. It is also confirmed that, if possible, the use of hydrides as the moderating material in the blankets is a promising option to limit the total minor actinides inventory in the fuel cycle. If not, it appears that focus should be put upon an increased residence time for the blankets rather than an increase in the acceptable neutron source for handling and reprocessing.
Accuracy-preserving source term quadrature for third-order edge-based discretization
NASA Astrophysics Data System (ADS)
Nishikawa, Hiroaki; Liu, Yi
2017-09-01
In this paper, we derive a family of source term quadrature formulas for preserving third-order accuracy of the node-centered edge-based discretization for conservation laws with source terms on arbitrary simplex grids. A three-parameter family of source term quadrature formulas is derived, and as a subset, a one-parameter family of economical formulas is identified that does not require second derivatives of the source term. Among the economical formulas, a unique formula is then derived that does not require gradients of the source term at neighbor nodes, thus leading to a significantly smaller discretization stencil for source terms. All the formulas derived in this paper do not require a boundary closure, and therefore can be directly applied at boundary nodes. Numerical results are presented to demonstrate third-order accuracy at interior and boundary nodes for one-dimensional grids and linear triangular/tetrahedral grids over straight and curved geometries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul L. Wichlacz
2003-09-01
This source-term summary document is intended to describe the current understanding of contaminant source terms and the conceptual model for potential source-term release to the environment at the Idaho National Engineering and Environmental Laboratory (INEEL), as presented in published INEEL reports. The document presents a generalized conceptual model of the sources of contamination and describes the general categories of source terms, primary waste forms, and factors that affect the release of contaminants from the waste form into the vadose zone and Snake River Plain Aquifer. Where the information has previously been published and is readily available, summaries of the inventory of contaminants are also included. Uncertainties that affect the estimation of the source term release are also discussed where they have been identified by the Source Term Technical Advisory Group. Areas in which additional information is needed (i.e., research needs) are also identified.
NASA Astrophysics Data System (ADS)
Rengarajan, Rajagopalan
Moderate resolution remote sensing data offers the potential to monitor the long- and short-term trends in the condition of the Earth's resources at finer spatial scales and over longer time periods. While improved calibration (radiometric and geometric), free access (Landsat, Sentinel, CBERS), and higher level products in reflectance units have made it easier for the science community to derive the biophysical parameters from these remotely sensed data, a number of issues still affect the analysis of multi-temporal datasets. These are primarily due to sources that are inherent in the process of imaging from single or multiple sensors. Some of these undesired or uncompensated sources of variation include variation in the view angles, illumination angles, atmospheric effects, and sensor effects such as Relative Spectral Response (RSR) variation between different sensors. The complex interaction of these sources of variation would make their study extremely difficult if not impossible with real data, and therefore, a simulated analysis approach is used in this study. A synthetic forest canopy is produced using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model and its measured BRDFs are modeled using the RossLi canopy BRDF model. The simulated BRDF matches the real data to within 2% of the reflectance in the red and the NIR spectral bands studied. The BRDF modeling process is extended to model and characterize the defoliation of a forest, which is used in factor sensitivity studies to estimate the effect of each factor for varying environment and sensor conditions. Finally, a factorial experiment is designed to understand the significance of the sources of variation, and regression-based analyses are performed to understand the relative importance of the factors.
The design of experiment and the sensitivity analysis conclude that the atmospheric attenuation and variations due to the illumination angles are the dominant sources impacting the at-sensor radiance.
NASA Astrophysics Data System (ADS)
Lee, Jangho; Kim, Kwang-Yul
2018-02-01
CSEOF analysis is applied to the springtime (March, April, May) daily PM10 concentrations measured at 23 Ministry of Environment stations in Seoul, Korea for the period 2003-2012. Six meteorological variables at 12 pressure levels are also acquired from the ERA-Interim reanalysis datasets. CSEOF analysis is conducted for each meteorological variable over East Asia. Regression analysis is conducted in CSEOF space between the PM10 concentrations and individual meteorological variables to identify the associated atmospheric conditions for each CSEOF mode. By adding the regressed loading vectors to the mean meteorological fields, the daily atmospheric conditions are obtained for the first five CSEOF modes. Then, the HYSPLIT model is run with the atmospheric conditions for each CSEOF mode in order to back-trace the air parcels and dust reaching Seoul. The K-means clustering algorithm is applied to identify major source regions for each CSEOF mode of the PM10 concentrations in Seoul. Three main source regions identified based on the mean fields are: (1) the northern Taklamakan Desert (NTD), (2) the Gobi Desert (GD), and (3) the East China industrial area (ECI). The main source regions for the mean meteorological fields are consistent with those of previous studies; 41% of the source locations are located in GD, followed by ECI (37%) and NTD (21%). Back-trajectory calculations based on CSEOF analysis of meteorological variables identify distinct source characteristics associated with each CSEOF mode and greatly facilitate the interpretation of the PM10 variability in Seoul in terms of transportation route and meteorological conditions, including the source area.
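The K-means step in the abstract, grouping back-trajectory endpoints into candidate source regions, can be sketched with a minimal implementation. The (lon, lat) endpoints below are synthetic stand-ins for HYSPLIT output, not data from the study.

```python
import numpy as np

# Minimal k-means for grouping back-trajectory endpoints (lon, lat)
# into candidate source regions. Coordinates are synthetic.
def kmeans(points, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels

# Two synthetic endpoint clusters (e.g. a desert vs. an industrial region):
pts = np.array([[80.0, 39.0], [81.0, 40.0], [79.5, 39.5],
                [117.0, 32.0], [118.0, 33.0], [116.5, 32.5]])
centers, labels = kmeans(pts, k=2)
print(sorted(labels.tolist()))
```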
Real Otto and Diesel Engine Cycles.
ERIC Educational Resources Information Center
Giedd, Ronald
1983-01-01
A thermodynamic analysis of the properties of Otto/Diesel engines during the time they operate with open chambers illustrates the applicability of thermodynamics to real systems, demonstrates how delivered power is controlled, and explains the source of air pollution in terms of thermodynamic laws. (Author/JN)
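The closed-chamber baseline that such a classroom analysis builds on is the air-standard Otto-cycle result, where thermal efficiency depends only on the compression ratio r and the heat-capacity ratio gamma:

```python
# Air-standard Otto-cycle efficiency: eta = 1 - r**(1 - gamma),
# with r = compression ratio and gamma = cp/cv (about 1.4 for air).
def otto_efficiency(r, gamma=1.4):
    return 1.0 - r ** (1.0 - gamma)

print(round(otto_efficiency(8.0), 3))   # ~0.565 for r = 8
```

Real open-chamber operation, as the abstract stresses, departs from this ideal figure.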
Analysis of Unmanned Systems in Military Logistics
2016-12-01
opportunities to employ unmanned systems to support logistic operations. Subject terms: unmanned systems, robotics, UAVs, UGVs, USVs, UUVs, military... Recoverable figure/section titles: Industrial Robots at Warehouses / Distribution Centers; Autonomous Robot Gun Turret (Source: Blain, 2010); Robot Sentries for Base Patrol.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manibog, F.R.
1982-01-01
This study presents the methodology and results of: (1) a rural energy survey that was conducted in a Philippine island community; and (2) a cost-effectiveness analysis of selected conventional and renewable-energy technologies. The rural energy survey section compares different survey techniques and analyzes energy utilization by providing: (1) a breakdown of energy flows and use patterns; (2) information on energy prices, ownership patterns, social relations, and their effects in terms of differential access to energy sources; (3) per household and per capita consumption figures; and (4) a village energy-consumption table. Correlation analysis is used to determine if the stratified, independent socio-economic variables are indicators for dependent energy variables. Results of the economic analysis indicate that renewable-energy technologies are already least-cost alternatives to diesel generation in the village case study. The sensitivity analysis also shows that these technologies remain the least-cost options even if their capital costs were underestimated. The findings of the study are useful to the current Philippine renewable-energy program in terms of providing: (1) information essential for determining end-users' priority energy needs and for improving technology choice and project design; and (2) justification for promoting auto-generation based on renewable energy sources as alternatives to diesel fuel.
Shim, Kyusung; Do, Nhu Tri; An, Beongku
2017-01-01
In this paper, we study the physical layer security (PLS) of opportunistic scheduling for uplink scenarios of multiuser multirelay cooperative networks. To this end, we propose a low-complexity source-relay selection scheme with comparable secrecy performance, called the proposed source relay selection (PSRS) scheme. Specifically, the PSRS scheme first selects the least vulnerable source and then selects the relay that maximizes the system secrecy capacity for the given selected source. Additionally, the maximal ratio combining (MRC) technique and the selection combining (SC) technique are considered at the eavesdropper, respectively. Investigating the system performance in terms of secrecy outage probability (SOP), closed-form expressions of the SOP are derived. The developed analysis is corroborated through Monte Carlo simulation. Numerical results show that the PSRS scheme significantly improves the secrecy performance of the system compared to that of the random source-relay selection scheme, but does not outperform the optimal joint source relay selection (OJSRS) scheme. However, the PSRS scheme drastically reduces the required amount of channel state information (CSI) estimations compared to that required by the OJSRS scheme, especially in dense cooperative networks. PMID:28212286
NASA Astrophysics Data System (ADS)
Lavrieux, Marlène; Meusburger, Katrin; Birkholz, Axel; Alewell, Christine
2017-04-01
Slope destabilization and associated sediment transfer are among the major causes of aquatic ecosystem and surface water quality impairment. Through land use and agricultural practices, human activities modify the soil erosive risk and the catchment connectivity, becoming a key factor in sediment dynamics. Hence, restoration and management plans for water bodies can only be efficient if the sediment sources, and the proportions attributable to different land uses and agricultural practices, are identified. Several sediment fingerprinting methods, based on geochemical (elemental composition), color, magnetic or isotopic (137Cs) sediment properties, are currently in use. However, these tools are not suitable for land-use-based fingerprinting. New organic geochemical approaches are now being developed to discriminate source-soil contributions under different land uses: the compound-specific stable isotope (CSSI) technique, based on the variability of biomarker isotopic signatures (here, fatty acid δ13C) among plant species, and the analysis of highly specific (i.e. source-family- or even source-species-specific) biomarker assemblages, whose use has until now been mainly restricted to palaeoenvironmental reconstructions and which also offers promising prospects for tracing current sediment origin. The approach was applied to reconstruct the spatio-temporal variability of the main sediment sources of Baldegg Lake (Lucerne Canton, Switzerland), which suffers from substantial eutrophication despite several restoration attempts during the last 40 years. The sediment-supplying areas and the exported volumes were identified using the CSSI technique and highly specific biomarkers, coupled to a sediment connectivity model. The variability of sediment origin was defined through the analysis of suspended river sediments sampled at high-flow conditions (short term), and by the analysis of a lake sediment core covering the last 130 years (long term).
The results show the utility of biomarkers and CSSI to track organic sources in contrasted land-use settings. Associated to other fingerprinting methods, this approach could in the future become a decision support tool for catchments management.
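At its simplest, the CSSI-style unmixing behind such results is a linear end-member mixing model: the fraction f of sediment from source A follows from delta_mix = f*delta_A + (1-f)*delta_B. The end-member δ13C values below are hypothetical, not those of the Baldegg Lake study.

```python
# Two-end-member isotope mixing sketch: solve the linear mixing equation
# delta_mix = f*delta_a + (1 - f)*delta_b for the source fraction f.
def source_fraction(delta_mix, delta_a, delta_b):
    return (delta_mix - delta_b) / (delta_a - delta_b)

delta_arable, delta_forest = -30.0, -36.0   # per mil, hypothetical end-members
print(source_fraction(-33.0, delta_arable, delta_forest))  # 0.5
```

Real CSSI applications use several fatty-acid tracers and more than two sources, which turns this into an (often over-determined) system solved in a least-squares or Bayesian framework.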
NASA Astrophysics Data System (ADS)
Diapouli, E.; Manousakas, M.; Vratolis, S.; Vasilatou, V.; Maggos, Th; Saraga, D.; Grigoratos, Th; Argyropoulos, G.; Voutsa, D.; Samara, C.; Eleftheriadis, K.
2017-09-01
Metropolitan areas in Greece are known to suffer from poor air quality, due to a variety of emission sources, topography and climatic conditions favouring the accumulation of pollution. While a number of control measures have been implemented since the 1990s, resulting in reductions of atmospheric pollution and changes in emission source contributions, the financial crisis which started in 2009 has significantly altered this picture. The present study is the first effort to assess the contribution of emission sources to PM10 and PM2.5 concentration levels and their long-term variability (over 5-10 years) in the two largest metropolitan areas in Greece (Athens and Thessaloniki). Intensive measurement campaigns were conducted during 2011-2012 at suburban, urban background and urban traffic sites in these two cities. In addition, available datasets from previous measurements in Athens and Thessaloniki were used in order to assess the long-term variability of concentrations and sources. Chemical composition analysis of the 2011-2012 samples showed that carbonaceous matter was the most abundant component for both PM size fractions. A significant increase of carbonaceous particle concentrations and of the OC/EC ratio during the cold period, especially at the residential urban background sites, pointed towards domestic heating, and more particularly wood (biomass) burning, as a significant source. PMF analysis further supported this finding. Biomass burning was the largest contributing source at the two urban background sites (with mean contributions for the two size fractions in the range of 24-46%). Secondary aerosol formation (sulphate, nitrate & organics) was also a major contributing source for both size fractions at the suburban and urban background sites. At the urban traffic site, vehicular traffic (exhaust and non-exhaust emissions) was the source with the highest contributions, accounting for 44% of PM10 and 37% of PM2.5, respectively.
The long-term variability of emission sources in the two cities (over 5-10 years), assessed through a harmonized application of the PMF technique on recent and past year data, clearly demonstrates the effective reduction in emissions during the last decade due to control measures and technological development; however, it also reflects the effects of the financial crisis in Greece during these years, which has led to decreased economic activities and the adoption of more polluting practices by the local population in an effort to reduce living costs.
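The PMF technique applied above factorizes a samples-by-species concentration matrix into non-negative source contributions and source profiles. As a rough stand-in (PMF proper also weights residuals by measurement uncertainty, which this sketch omits), plain non-negative matrix factorization with Lee-Seung multiplicative updates illustrates the idea on a synthetic two-source data set:

```python
import numpy as np

# PMF-style factorization sketch: X (samples x species) ~ G @ F with
# non-negative contributions G and profiles F. Data are synthetic.
rng = np.random.default_rng(1)
F_true = np.array([[8.0, 1.0, 0.1], [0.5, 2.0, 6.0]])   # two source profiles
G_true = rng.uniform(0.0, 1.0, (40, 2))                  # contributions
X = G_true @ F_true

k = 2
G = rng.uniform(0.1, 1.0, (40, k))
F = rng.uniform(0.1, 1.0, (k, 3))
for _ in range(500):
    # Lee-Seung multiplicative updates preserve non-negativity
    G *= (X @ F.T) / (G @ F @ F.T + 1e-12)
    F *= (G.T @ X) / (G.T @ G @ F + 1e-12)

rel_err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
print(rel_err < 0.05)
```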
Brusseau, M. L.; Hatton, J.; DiGuiseppi, W.
2011-01-01
The long-term impact of source-zone remediation efforts was assessed for a large site contaminated by trichloroethene. The impact of the remediation efforts (soil vapor extraction and in-situ chemical oxidation) was assessed through analysis of plume-scale contaminant mass discharge, which was measured using a high-resolution data set obtained from 23 years of operation of a large pump-and-treat system. The initial contaminant mass discharge peaked at approximately 7 kg/d, and then declined to approximately 2 kg/d. This latter value was sustained for several years prior to the initiation of source-zone remediation efforts. The contaminant mass discharge in 2010, measured several years after completion of the two source-zone remediation actions, was approximately 0.2 kg/d, which is ten times lower than the value prior to source-zone remediation. The time-continuous contaminant mass discharge data can be used to evaluate the impact of the source-zone remediation efforts on reducing the time required to operate the pump-and-treat system, and to estimate the cost savings associated with the decreased operational period. While significant reductions have been achieved, it is evident that the remediation efforts have not completely eliminated contaminant mass discharge and associated risk. Remaining contaminant mass contributing to the current mass discharge is hypothesized to comprise poorly-accessible mass in the source zones, as well as aqueous (and sorbed) mass present in the extensive lower-permeability units located within and adjacent to the contaminant plume. The fate of these sources is an issue of critical import to the remediation of chlorinated-solvent contaminated sites, and development of methods to address these sources will be required to achieve successful long-term management of such sites and to ultimately transition them to closure. PMID:22115080
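The plume-scale metric used above is straightforward to compute for a pump-and-treat system: mass discharge is extraction flow times concentration, summed over extraction wells. The flows and TCE concentrations below are illustrative only, not site data.

```python
# Contaminant mass discharge sketch: sum of flow * concentration over
# extraction wells, converted from mg/d to kg/d.
wells = [  # (flow in L/d, TCE concentration in mg/L) -- hypothetical values
    (200000.0, 0.005),
    (150000.0, 0.002),
    (100000.0, 0.001),
]
mass_discharge_kg_per_day = sum(q * c for q, c in wells) / 1e6  # mg -> kg
print(mass_discharge_kg_per_day)  # 0.0014 kg/d
```

Tracking this quantity through time, as the study does over 23 years of operation, is what allows before/after comparison of source-zone remediation.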
Aerosols in the Atmosphere: Sources, Transport, and Multi-decadal Trends
NASA Technical Reports Server (NTRS)
Chin, M.; Diehl, T.; Bian, H.; Kucsera, T.
2016-01-01
We present our recent studies with global modeling and analysis of atmospheric aerosols. We have used the Goddard Chemistry Aerosol Radiation and Transport (GOCART) model together with satellite and in situ data to investigate (1) long-term variations of aerosols over polluted and dust source regions and downwind ocean areas in the past three decades and the causes of the changes, and (2) anthropogenic and volcanic contributions to the sulfate aerosol in the upper troposphere/lower stratosphere.
Zhang, Zili; Wang, Jian; Lu, Wenju
2018-05-01
Exposure to nitrogen dioxide (NO2) has long been linked to elevated mortality and morbidity in epidemiological evidence. However, it remains unclear whether NO2 acts directly on human health or is merely an indicator of other ambient pollutants. In this study, random-effect meta-analyses were performed examining exposure to nitrogen oxides (NOx) and their association with chronic obstructive pulmonary disease (COPD). The overall relative risk (RR) of COPD related to a 10 μg/m3 increase in NO2 exposure increased by 2.0%. The pooled effect on prevalence was 17% with an increase of 10 μg/m3 in NO2 concentration, 1.3% on hospital admissions, and 2.6% on mortality. The RR of COPD cases related to long-term NO2 exposure was 2.5%, and 1.4% for short-term exposure. The COPD effect related to a 10 μg/m3 increase in exposure to general outdoor-sourced NO2 was 1.7%, and 17.8% for exposure to exclusively traffic-sourced NO2; importantly, we did observe the effect of NO2 on COPD mortality, largely at lag 0. Long-term traffic exposure exerted more severe impairments on COPD prevalence than long-term or short-term outdoor effects; the long-term mortality effect on COPD was serious in the single-pollutant model in this meta-analysis. Overall, our study reports consistent evidence of a potential positive association between NO2 and COPD risk.
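Random-effects pooling of relative risks, as used above, is typically done on the log-RR scale with the DerSimonian-Laird estimator for the between-study variance. The per-study log-RRs and variances below are hypothetical inputs, not the study's data.

```python
import math

# DerSimonian-Laird random-effects pooling sketch (inputs on log-RR scale).
def dersimonian_laird(yi, vi):
    wi = [1.0 / v for v in vi]                            # fixed-effect weights
    ybar = sum(w * y for w, y in zip(wi, yi)) / sum(wi)
    q = sum(w * (y - ybar) ** 2 for w, y in zip(wi, yi))  # Cochran's Q
    c = sum(wi) - sum(w * w for w in wi) / sum(wi)
    tau2 = max(0.0, (q - (len(yi) - 1)) / c)              # between-study variance
    wstar = [1.0 / (v + tau2) for v in vi]                # random-effects weights
    pooled = sum(w * y for w, y in zip(wstar, yi)) / sum(wstar)
    return pooled, tau2

log_rr = [math.log(1.02), math.log(1.05), math.log(0.99)]  # hypothetical studies
var = [0.0004, 0.0009, 0.0006]
pooled, tau2 = dersimonian_laird(log_rr, var)
print(round(math.exp(pooled), 3))   # pooled RR
```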
Lopes, Marta S; Araus, José L
2008-09-01
Long-term differences in photosynthesis, respiration and growth of plants receiving distinct nitrogen (N) sources imply that N metabolism generates signals that regulate metabolism and development. The molecular basis of these signals remains unclear. Here we studied the gene expression profiles of barley (Hordeum vulgare L. cv. Graphic) seedlings fertilized either with ammonium (NH4+), with ammonium and nitrate (NH4+:NO3-), or with nitrate (NO3-) only. Our transcriptome analysis after 48 h of growth in these N sources showed major changes in the expression of genes involved in N metabolism (nitrate reductase), signalling (protein kinases and protein phosphatases), photosynthesis (chlorophyll a/b-binding protein and a PsbQ domain), where increases in NO3- as compared with NH4+ were observed. Moreover, NH4+ assimilation induced genes participating in C and sugars metabolism (phosphoglycerate kinase, glucosyltranferase and galactokinase), respiration (cytochrome c oxidase), protein fate (heat shock proteins) and development (MTN3-like protein). These changes in gene expression could well explain the long-term growth depression observed in NH4+ plants. Even if a few genes participating in protein fate (proteases) and development (OsNAC5) were upregulated in NH4+ as compared with NH4+:NO3-, the general pattern of expression was quite similar between these two N sources. Taken together, these results indicated that other downstream mechanisms should be involved in the synergetic long-term response of NH4+:NO3-.
NASA Astrophysics Data System (ADS)
Malviya, Devesh; Borage, Mangesh Balkrishna; Tiwari, Sunil
2017-12-01
This paper investigates the application of Resonant Immittance Converters (RICs) as a current source for the current-fed symmetrical Capacitor-Diode Voltage Multiplier (CDVM), with the LCL-T Resonant Converter (RC) as an example. First, a detailed characterization of the current-fed symmetrical CDVM is carried out using repeated simulations, followed by normalization of the simulation results in order to derive closed-form curve-fit equations that predict the operating modes, output voltage and ripple in terms of the operating parameters. RICs, owing to their ability to convert a voltage source into a current source, are a possible candidate for the realization of a current source for the current-fed symmetrical CDVM. Detailed analysis, optimization and design of the LCL-T RC with CDVM are performed in this paper. A step-by-step design procedure for the CDVM and the converter is proposed. A 5-stage prototype symmetrical CDVM driven by an LCL-T RC to produce a 2.5 kV, 50 mA dc output is designed, built and tested to validate the findings of the analysis and simulation.
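The paper's own curve-fit equations are not reproduced in the abstract, but the textbook half-wave Cockcroft-Walton estimates give a rough reference point for an n-stage multiplier's droop and ripple (symmetrical, current-fed CDVMs do substantially better than these figures; all component values below are assumptions for illustration).

```python
# Classic half-wave Cockcroft-Walton estimates for an n-stage multiplier:
#   droop  = I/(f*C) * (2n^3/3 + n^2/2 + n/6)
#   ripple = I * n*(n+1) / (2*f*C)
def cw_output(n, v_pk, i_load, f_sw, c):
    droop = i_load / (f_sw * c) * (2 * n**3 / 3 + n**2 / 2 + n / 6)
    ripple = i_load * n * (n + 1) / (2 * f_sw * c)
    return 2 * n * v_pk - droop, ripple

# 5 stages, 300 V peak drive, 50 mA load, 50 kHz, 1 uF (illustrative values):
v_out, v_ripple = cw_output(n=5, v_pk=300.0, i_load=0.05, f_sw=50e3, c=1e-6)
print(round(v_out, 1), round(v_ripple, 1))
```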
1988-05-01
Recoverable fragments from garbled front matter: Figure 2, Original limited-capacity channel model (From Broadbent, 1958); Figure 3, Experimental...; and a passage noting that analysis-synthesis methods electronically model the human voice, providing an unlimited variety of human voices for digital recording sources.
Patricia Lebow; Richard Ziobro; Linda Sites; Tor Schultz; David Pettry; Darrel Nicholas; Stan Lebow; Pascal Kamdem; Roger Fox; Douglas Crawford
2006-01-01
Leaching of wood preservatives affects the long-term efficacy and environmental impact of treated wood. Soil properties and wood characteristics can affect leaching of wood preservatives, but these effects are not well understood. This paper reports a statistical analysis of the effects of soil and wood properties on leaching of arsenic (As) and copper (Cu) from southern...
Understanding the Climate of Deceit.
ERIC Educational Resources Information Center
Kincheloe, Joe L.; Staley, George
1983-01-01
Briefly discusses propaganda of the past four decades, defines the term, reviews its earliest uses, and outlines today's propaganda vehicles--mass media, special interest groups, and marketing techniques. A propaganda analysis program for educating today's youth is proposed which includes eight questions for evaluating the source of media…
NASA Astrophysics Data System (ADS)
Jordan, Phil; Melland, Alice; Shore, Mairead; Mellander, Per-Erik; Shortle, Ger; Ryan, David; Crockford, Lucy; Macintosh, Katrina; Campbell, Julie; Arnscheidt, Joerg; Cassidy, Rachel
2014-05-01
A complete appraisal of material fluxes in flowing waters is really only possible with high-time-resolution data synchronous with measurements of discharge. Defined by Kirchner et al. (2004; Hydrological Processes, 18/7) as the high-frequency wave of the future, and with regard to disentangling signal noise from process pattern, this challenge has been met in terms of nutrient flux monitoring by automated bankside analysis. In Ireland, over a ten-year period, time-series nutrient data collected on a sub-hourly basis in rivers have been used to distinguish fluxes from different catchment sources and pathways and to provide more certain temporal pictures of flux for the comparative definition of catchment nutrient dynamics. In catchments where nutrient fluxes are particularly high and exhibit a mix of extreme diffuse and point source influences, high-time-resolution data analysis indicates that there are no satisfactory statistical proxies for seasonal or annual flux predictions that use coarse datasets, or at least it exposes the limits of statistical approaches at the catchment scale and for hydrological response. This has profound implications for catchment monitoring programmes that rely on modelled relationships. However, using high-resolution monitoring for long-term assessments of catchment mitigation measures comes with further challenges. Sustaining continuous wet-chemistry analysis at river stations is resource intensive in terms of capital, maintenance and quality assurance. Furthermore, big data capture requires investment in data management systems and analysis. These two institutional challenges are magnified when considering the extended time period required to identify the influences of land-based nutrient control measures on water-based systems. Separating the 'climate signal' from the 'source signal' in river nutrient flux data is a major analysis challenge; more so when tackled with anything but higher-resolution data.
Nevertheless, there is scope to lower costs in bankside analysis through technology development, and the scientific advantages of these data are clear and exciting. When integrating its use with policy appraisal, it must be made clear that the advances in river process understanding from high resolution monitoring data capture come as a package with the ability to make more informed decisions through an investment in better information.
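The flux calculation that high-resolution bankside data enables is a direct summation: load L = sum of C(t)*Q(t)*dt over the record, which coarse grab-sampling can only approximate. The sub-hourly values below are synthetic.

```python
# Nutrient load sketch from synchronous concentration and discharge series.
dt_s = 600.0                                        # 10-minute time step, s
conc_mg_l = [0.08, 0.10, 0.45, 0.90, 0.40, 0.12]    # P concentration, mg/L
flow_m3_s = [1.2, 1.3, 4.0, 7.5, 3.5, 1.5]          # discharge, m3/s

# mg/L * m3/s = g/s, so multiply by dt (s) and divide by 1000 for kg
load_kg = sum(c * q * dt_s / 1000.0 for c, q in zip(conc_mg_l, flow_m3_s))
print(round(load_kg, 3))
```

Note how the storm-peak samples (0.90 mg/L at 7.5 m3/s) dominate the total: this covariance of concentration and flow is exactly what coarse sampling misses.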
Eslinger, P W; Biegalski, S R; Bowyer, T W; Cooper, M W; Haas, D A; Hayes, J C; Hoffman, I; Korpach, E; Yi, J; Miley, H S; Rishel, J P; Ungar, K; White, B; Woods, V T
2014-01-01
Systems designed to monitor airborne radionuclides released from underground nuclear explosions detected radioactive fallout across the northern hemisphere resulting from the Fukushima Dai-ichi Nuclear Power Plant accident in March 2011. Sampling data from multiple International Monitoring System locations are combined with atmospheric transport modeling to estimate the magnitude and time sequence of releases of (133)Xe. Modeled dilution factors at five different detection locations were combined with 57 atmospheric concentration measurements of (133)Xe taken from March 18 to March 23 to estimate the source term. This analysis suggests that 92% of the 1.24 × 10(19) Bq of (133)Xe present in the three operating reactors at the time of the earthquake was released to the atmosphere over a 3 d period. An uncertainty analysis bounds the release estimates to 54-129% of the available (133)Xe inventory. Copyright © 2013 Elsevier Ltd. All rights reserved.
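The inversion underlying such a source-term estimate can be sketched as a linear least-squares problem: measured concentrations c relate to release amounts q through modeled dilution factors, c ~ D q. The matrix D and the release vector below are synthetic stand-ins for the atmospheric-transport output and are not the study's values.

```python
import numpy as np

# Inverse-modeling sketch: estimate release amounts q (Bq per period) from
# measured concentrations c and modeled dilution factors D (s/m^3).
rng = np.random.default_rng(2)
D = rng.uniform(1e-19, 1e-18, (57, 3))      # 57 samples x 3 release periods
q_true = np.array([4e18, 6e18, 1.5e18])     # synthetic "true" releases, Bq
c = D @ q_true * (1 + 0.02 * rng.standard_normal(57))  # 2% measurement noise

q_est, *_ = np.linalg.lstsq(D, c, rcond=None)
print(np.round(q_est / q_true, 2))          # close to [1. 1. 1.]
```

Real applications add non-negativity constraints and propagate transport-model uncertainty, which is how the 54-129% bounds quoted above arise.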
Survey on the Performance of Source Localization Algorithms.
Fresno, José Manuel; Robles, Guillermo; Martínez-Tarifa, Juan Manuel; Stewart, Brian G
2017-11-18
The localization of emitters using an array of sensors or antennas is a prevalent issue approached in several applications. There exist different techniques for source localization, which can be classified into multilateration, received signal strength (RSS) and proximity methods. The performance of multilateration techniques relies on measured time variables: the time of flight (ToF) of the emission from the emitter to the sensor, the time differences of arrival (TDoA) of the emission between sensors and the pseudo-time of flight (pToF) of the emission to the sensors. The multilateration algorithms presented and compared in this paper can be classified as iterative and non-iterative methods. Both standard least squares (SLS) and hyperbolic least squares (HLS) are iterative and based on the Newton-Raphson technique to solve the non-linear equation system. The metaheuristic technique particle swarm optimization (PSO) used for source localisation is also studied. This optimization technique estimates the source position as the optimum of an objective function based on HLS and is also iterative in nature. Three non-iterative algorithms, namely the hyperbolic positioning algorithms (HPA), the maximum likelihood estimator (MLE) and Bancroft algorithm, are also presented. A non-iterative combined algorithm, MLE-HLS, based on MLE and HLS, is further proposed in this paper. The performance of all algorithms is analysed and compared in terms of accuracy in the localization of the position of the emitter and in terms of computational time. The analysis is also undertaken with three different sensor layouts since the positions of the sensors affect the localization; several source positions are also evaluated to make the comparison more robust. The analysis is carried out using theoretical time differences, as well as including errors due to the effect of digital sampling of the time variables. 
It is shown that the most balanced algorithm, yielding better results than the other algorithms in terms of accuracy and short computational time, is the combined MLE-HLS algorithm.
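As a rough illustration of the iterative hyperbolic least squares (HLS) approach discussed in this abstract, the sketch below solves a small TDoA localization problem with a Gauss-Newton iteration. The sensor layout, propagation speed, source position, and noise-free timings are all invented for the example; this is a minimal sketch, not the authors' implementation.

```python
import numpy as np

c = 343.0  # assumed propagation speed (m/s), e.g. sound in air

# Hypothetical square sensor array and true emitter position
sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
source_true = np.array([3.0, 4.0])

# Noise-free TDoAs relative to sensor 0
d = np.linalg.norm(sensors - source_true, axis=1)
tdoa = (d[1:] - d[0]) / c

def hls_residuals(x):
    """Hyperbolic residuals: predicted minus measured TDoAs."""
    r = np.linalg.norm(sensors - x, axis=1)
    return (r[1:] - r[0]) / c - tdoa

x = np.array([5.0, 5.0])  # initial guess at the array centre
for _ in range(50):       # Gauss-Newton iterations
    r = np.linalg.norm(sensors - x, axis=1)
    u = (x - sensors) / r[:, None]   # unit vectors from each sensor to x
    J = (u[1:] - u[0]) / c           # Jacobian of the residuals w.r.t. x
    res = hls_residuals(x)
    step, *_ = np.linalg.lstsq(J, -res, rcond=None)
    x = x + step
    if np.linalg.norm(step) < 1e-9:
        break

print(x)  # converges to the true position (3, 4) for this noise-free case
```

With exact timings and this well-conditioned geometry the iteration converges in a few steps; with sampled (quantized) time variables, as studied in the paper, the solution would instead scatter around the true position.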
Survey on the Performance of Source Localization Algorithms
2017-01-01
Cost of care of haemophilia with inhibitors.
Di Minno, M N D; Di Minno, G; Di Capua, M; Cerbone, A M; Coppola, A
2010-01-01
In Western countries, the treatment of patients with inhibitors is presently the most challenging and serious issue in haemophilia management, with direct costs of clotting factor concentrates accounting for >98% of the highest economic burden absorbed for the healthcare of patients in this setting. Being designed to address questions of resource allocation and effectiveness, decision models are the gold standard for reliably assessing the overall economic implications of haemophilia with inhibitors in terms of mortality, bleeding-related morbidity, and severity of arthropathy. However, most data analyses presently stem from retrospective short-term evaluations, which only allow for the analysis of direct health costs. In the setting of chronic diseases, cost-utility analysis, which takes into account the beneficial effects of a given treatment or healthcare intervention in terms of health-related quality of life, is likely to be the most appropriate approach. To calculate net benefits, the quality-adjusted life year, which reflects this health gain, has to be compared with specific economic impacts. Differences in data sources, in medical practice and/or in healthcare systems and costs imply that most current pharmacoeconomic analyses are confined to a narrow healthcare-payer perspective. Long-term/lifetime prospective or observational studies, devoted to a careful definition of when to start a treatment, of regimens (dose and type of product) to employ, and of inhibitor populations (children/adults, low-responding/high-responding inhibitors) to study, are thus urgently needed to allow for newer insights, based on reliable data sources, into resource allocation, effectiveness and cost-utility analysis in the treatment of haemophiliacs with inhibitors.
Bouza, Marcos; Orejas, Jaime; López-Vidal, Silvia; Pisonero, Jorge; Bordel, Nerea; Pereiro, Rosario; Sanz-Medel, Alfredo
2016-05-23
Atmospheric pressure glow discharges have been widely used in the last decade as ion sources in ambient mass spectrometry analyses. Here, an in-house flowing atmospheric pressure afterglow (FAPA) has been developed as an alternative ion source for differential mobility analysis (DMA). The discharge source parameters determining the atmospheric plasma characteristics (inter-electrode distance, current and helium flow rate) were optimized for DMA spectral simplicity and the highest achievable sensitivity while maintaining adequate plasma stability; the FAPA working conditions finally selected were 35 mA, 1 L min⁻¹ of He and an inter-electrode distance of 8 mm. Room temperature in the DMA proved adequate for the coupling and chemical analysis with the FAPA source. Positive and negative ions of different volatile organic compounds were tested and analysed by FAPA-DMA using a Faraday cup as a detector, and proper operation in both modes was possible without changes in the FAPA operational parameters. The FAPA ionization source showed simpler ion mobility spectra with narrower peaks and better, or similar, sensitivity than conventional UV photoionization for DMA analysis in positive mode. In particular, the negative mode proved to be a promising field of further research for the FAPA ion source coupled to ion mobility, clearly competitive with more conventional plasmas such as the corona discharge.
The importance of quadrupole sources in prediction of transonic tip speed propeller noise
NASA Technical Reports Server (NTRS)
Hanson, D. B.; Fink, M. R.
1978-01-01
A theoretical analysis is presented for the harmonic noise of high speed, open rotors. Far field acoustic radiation equations based on the Ffowcs-Williams/Hawkings theory are derived for a static rotor with thin blades and zero lift. Near the plane of rotation, the dominant sources are the volume displacement and the ρu² quadrupole, where u is the disturbance velocity component in the direction of blade motion. These sources are compared in both the time domain and the frequency domain using two dimensional airfoil theories valid in the subsonic, transonic, and supersonic speed ranges. For nonlifting parabolic arc blades, the two sources are equally important at speeds between the section critical Mach number and a Mach number of one. However, for moderately subsonic or fully supersonic flow over thin blade sections, the quadrupole term is negligible. It is concluded for thin blades that significant quadrupole noise radiation is strictly a transonic phenomenon and that it can be suppressed with blade sweep. Noise calculations are presented for two rotors, one simulating a helicopter main rotor and the other a model propeller. For the latter, agreement with test data was substantially improved by including the quadrupole source term.
Access to Safe Water in Rural Artibonite, Haiti 16 Months after the Onset of the Cholera Epidemic
Patrick, Molly; Berendes, David; Murphy, Jennifer; Bertrand, Fabienne; Husain, Farah; Handzel, Thomas
2013-01-01
Haiti has the lowest improved water and sanitation coverage in the Western Hemisphere and is suffering from the largest cholera epidemic on record. In May of 2012, an assessment was conducted in rural areas of the Artibonite Department to describe the type and quality of water sources and determine knowledge, access, and use of household water treatment products to inform future programs. It was conducted after emergency response was scaled back but before longer-term water, sanitation, and hygiene activities were initiated. The household survey and source water quality analysis documented low access to safe water, with only 42.3% of households using an improved drinking water source. One-half (50.9%) of the improved water sources tested positive for Escherichia coli. Of households with water to test, 12.7% had positive chlorine residual. The assessment reinforces the identified need for major investments in safe water and sanitation infrastructure and the importance of household water treatment to improve access to safe water in the near term.
Cascella, Raffaella; Stocchi, Laura; Strafella, Claudia; Mezzaroma, Ivano; Mannazzu, Marco; Vullo, Vincenzo; Montella, Francesco; Parruti, Giustino; Borgiani, Paola; Sangiuolo, Federica; Novelli, Giuseppe; Pirazzoli, Antonella; Zampatti, Stefania; Giardina, Emiliano
2015-01-01
Our work aimed to designate the optimal DNA source for pharmacogenetic assays, such as screening for the HLA-B*57:01 allele. One saliva sample and four buccal swab samples were taken from each of 104 patients. All the samples were stored under different time and temperature conditions and then genotyped for the HLA-B*57:01 allele by SSP-PCR and classical/capillary electrophoresis. The genotyping analysis showed different performance rates depending on the storage conditions of the samples. Given our results, the buccal swab proved more resistant and stable over time than the saliva sample. Our investigation designates the buccal swab as the optimal DNA source for pharmacogenetic assays in terms of resistance, low infectivity, low invasiveness, easy sampling, and safe transport to centralized medical centers providing specialized pharmacogenetic tests.
"A Strategy of Distinction" Unfolds: Unsettling the Undergraduate Outbound Mobility Experience
ERIC Educational Resources Information Center
Sidhu, Ravinder; Dall'Alba, Gloria
2017-01-01
Although short-term mobility programmes are increasingly promoted to university students as sources of competitive advantage, there is little research on academic learnings arising from these initiatives. A "field analysis" of outbound mobility is undertaken to identify convergences and disjunctures between institutional discourses,…
40 CFR 408.11 - Specialized definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... STANDARDS CANNED AND PRESERVED SEAFOOD PROCESSING POINT SOURCE CATEGORY Farm-Raised Catfish Processing... apply to this subpart. (b) The term oil and grease shall mean those components of a waste water amenable to measurement by the method described in Methods for Chemical Analysis of Water and Wastes, 1971...
Fermi large area telescope second source catalog
Nolan, P. L.; Abdo, A. A.; Ackermann, M.; ...
2012-03-28
Here, we present the second catalog of high-energy γ-ray sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi), derived from data taken during the first 24 months of the science phase of the mission, which began on 2008 August 4. Source detection is based on the average flux over the 24 month period. The second Fermi-LAT catalog (2FGL) includes source location regions, defined in terms of elliptical fits to the 95% confidence regions, and spectral fits in terms of power-law, exponentially cutoff power-law, or log-normal forms. Also included are flux measurements in five energy bands and light curves on monthly intervals for each source. Twelve sources in the catalog are modeled as spatially extended. Furthermore, we provide a detailed comparison of the results from this catalog with those from the first Fermi-LAT catalog (1FGL). Although the diffuse Galactic and isotropic models used in the 2FGL analysis are improved compared to the 1FGL catalog, we attach caution flags to 162 of the sources to indicate possible confusion with residual imperfections in the diffuse model. Finally, the 2FGL catalog contains 1873 sources detected and characterized in the 100 MeV to 100 GeV range, of which we consider 127 as being firmly identified and 1171 as being reliably associated with counterparts of known or likely γ-ray-producing source classes.
NASA Astrophysics Data System (ADS)
Ghosh, Dipak; Sarkar, Sharmila; Sen, Sanjib; Roy, Jaya
1995-06-01
In this paper the behavior of factorial moments with rapidity window size, which is usually explained in terms of "intermittency," has been interpreted by simple quantum statistical properties of the emitting system using the concept of the "modified two-source model" as recently proposed by Ghosh and Sarkar [Phys. Lett. B 278, 465 (1992)]. The analysis has been performed using our own data of 16Ag/Br and 24Ag/Br interactions at a few tens of GeV energy regime.
Tao, Jun; Zhang, Leiming; Zhang, Renjian; Wu, Yunfei; Zhang, Zhisheng; Zhang, Xiaoling; Tang, Yixi; Cao, Junji; Zhang, Yuanhang
2016-02-01
Daily PM2.5 samples were collected at an urban site in Beijing during four one-month periods in 2009-2010, with each period in a different season. Samples were subject to chemical analysis for various chemical components, including major water-soluble ions, organic carbon (OC) and water-soluble organic carbon (WSOC), elemental carbon (EC), trace elements, the anhydrosugar levoglucosan (LG), and mannosan (MN). Three sets of source profiles of PM2.5 were first identified through positive matrix factorization (PMF) analysis using single or combined biomass tracers: non-sea-salt potassium (nss-K⁺), LG, and a combination of nss-K⁺ and LG. The six major source factors of PM2.5 included secondary inorganic aerosol, industrial pollution, soil dust, biomass burning, traffic emission, and coal burning, which were estimated to contribute 31±37%, 39±28%, 14±14%, 7±7%, 5±6%, and 4±8%, respectively, to PM2.5 mass if using the nss-K⁺ source profiles; 22±19%, 29±17%, 20±20%, 13±13%, 12±10%, and 4±6%, respectively, if using the LG source profiles; and 21±17%, 31±18%, 19±19%, 11±12%, 14±11%, and 4±6%, respectively, if using the combined nss-K⁺ and LG source profiles. The uncertainties in the estimation of biomass burning contributions to WSOC due to the different choices of biomass burning tracers were around 3% annually and up to 24% seasonally in terms of absolute percentage contributions, or a factor of 1.7 annually and up to a factor of 3.3 seasonally in terms of the actual concentrations. The uncertainty from the major source (e.g. industrial pollution) was a factor of 1.9 annually and up to a factor of 2.5 seasonally in the estimated WSOC concentrations.
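The source apportionment described above uses positive matrix factorization (PMF), which decomposes a samples-by-species concentration matrix into non-negative source contributions and source profiles. As a simplified, hedged stand-in, the sketch below uses plain non-negative matrix factorization with Lee-Seung multiplicative updates; real PMF additionally weights each data point by its measurement uncertainty, which this sketch omits. All data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic receptor data: 200 samples x 8 chemical species from 3 "sources"
F_true = rng.uniform(0.0, 1.0, (3, 8))    # source profiles (species fractions)
G_true = rng.uniform(0.0, 5.0, (200, 3))  # source contributions per sample
X = G_true @ F_true                       # observed concentration matrix

k = 3  # number of factors, assumed known here
G = rng.uniform(0.1, 1.0, (200, k))
F = rng.uniform(0.1, 1.0, (k, 8))
for _ in range(1000):  # Lee-Seung multiplicative updates (Frobenius loss)
    G *= (X @ F.T) / (G @ F @ F.T + 1e-12)
    F *= (G.T @ X) / (G.T @ G @ F + 1e-12)

err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
print(round(err, 4))  # small relative reconstruction error
```

The update rules keep G and F non-negative by construction, which is what makes the factors interpretable as physical source contributions and profiles.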
Sensitivity Analysis Tailored to Constrain 21st Century Terrestrial Carbon-Uptake
NASA Astrophysics Data System (ADS)
Muller, S. J.; Gerber, S.
2013-12-01
The long-term fate of terrestrial carbon (C) in response to climate change remains a dominant source of uncertainty in Earth-system model projections. Increasing atmospheric CO2 could be mitigated by long-term net uptake of C, through processes such as increased plant productivity due to "CO2-fertilization". Conversely, atmospheric conditions could be exacerbated by long-term net release of C, through processes such as increased decomposition due to higher temperatures. This balance is an important area of study, and a major source of uncertainty in long-term (>year 2050) projections of planetary response to climate change. We present results from an innovative application of sensitivity analysis to LM3V, a dynamic global vegetation model (DGVM), intended to identify observed/observable variables that are useful for constraining long-term projections of C-uptake. We analyzed the sensitivity of cumulative C-uptake by 2100, as modeled by LM3V in response to IPCC AR4 scenario climate data (1860-2100), to perturbations in over 50 model parameters. We concurrently analyzed the sensitivity of over 100 observable model variables, during the extant record period (1970-2010), to the same parameter changes. By correlating the sensitivities of observable variables with the sensitivity of long-term C-uptake we identified model calibration variables that would also constrain long-term C-uptake projections. LM3V employs a coupled carbon-nitrogen cycle to account for N-limitation, and we find that N-related variables have an important role to play in constraining long-term C-uptake. This work has implications for prioritizing field campaigns to collect global data that can help reduce uncertainties in the long-term land-atmosphere C-balance. 
Though the results of this study are specific to LM3V, the processes that characterize this model are not completely divorced from other DGVMs (or reality), and our approach provides valuable insights into how data can be leveraged to better constrain projections for the land carbon sink.
Cramer-Rao bound analysis of wideband source localization and DOA estimation
NASA Astrophysics Data System (ADS)
Yip, Lean; Chen, Joe C.; Hudson, Ralph E.; Yao, Kung
2002-12-01
In this paper, we derive the Cramér-Rao Bound (CRB) for wideband source localization and DOA estimation. The resulting CRB formula can be decomposed into two terms: one that depends on the signal characteristic and one that depends on the array geometry. For a uniformly spaced circular array (UCA), a concise analytical form of the CRB can be given by using some algebraic approximation. We further define a DOA beamwidth based on the resulting CRB formula. The DOA beamwidth can be used to design the sampling angular spacing for the Maximum-likelihood (ML) algorithm. For a randomly distributed array, we use an elliptical model to determine the largest and smallest effective beamwidth. The effective beamwidth and the CRB analysis of source localization allow us to design an efficient algorithm for the ML estimator. Finally, our simulation results of the Approximated Maximum Likelihood (AML) algorithm are demonstrated to match well to the CRB analysis at high SNR.
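The abstract's decomposition of the CRB into a signal-dependent term and a geometry-dependent term can be mirrored numerically: the geometry enters through the Jacobian of the TDoAs with respect to the source position, and the signal characteristics through the timing-noise covariance. The sketch below is illustrative only; the sensor layout, noise level, and common-reference correlation model are assumptions, not the paper's derivation.

```python
import numpy as np

sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
x = np.array([3.0, 4.0])   # evaluation point for the bound
c = 343.0                  # assumed propagation speed (m/s)
sigma_t = 1e-5             # assumed per-sensor arrival-time noise std (s)

r = np.linalg.norm(sensors - x, axis=1)
u = (x - sensors) / r[:, None]
J = (u[1:] - u[0]) / c  # geometry term: Jacobian of TDoAs w.r.t. position

# Signal term: TDoA covariance; differencing against a common reference
# sensor correlates the TDoAs, giving Sigma = sigma_t^2 * (I + 1 1^T)
Sigma = sigma_t**2 * (np.eye(3) + np.ones((3, 3)))

FIM = J.T @ np.linalg.inv(Sigma) @ J  # Fisher information matrix
CRB = np.linalg.inv(FIM)              # lower bound on position covariance
rmse_bound = np.sqrt(np.trace(CRB))
print(rmse_bound)  # best achievable localization RMSE (m) at this point
```

Evaluating `rmse_bound` over a grid of candidate positions is one way to compare sensor layouts, in the spirit of the paper's three-layout analysis.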
Reducing mortality risk by targeting specific air pollution sources: Suva, Fiji.
Isley, C F; Nelson, P F; Taylor, M P; Stelcer, E; Atanacio, A J; Cohen, D D; Mani, F S; Maata, M
2018-01-15
Health implications of air pollution vary depending upon pollutant sources. This work determines the value, in terms of reduced mortality, of reducing the ambient particulate matter (PM2.5: effective aerodynamic diameter 2.5 μm or less) concentration due to different emission sources. Suva, a Pacific Island city with substantial input from combustion sources, is used as a case study. Elemental concentration was determined, by ion beam analysis, for PM2.5 samples from Suva spanning one year. Sources of PM2.5 have been quantified by positive matrix factorisation. A review of recent literature has been carried out to delineate the mortality risk associated with these sources. Risk factors have then been applied for Suva to calculate the possible mortality reduction that may be achieved through reduction in pollutant levels. Higher risk ratios for black carbon and sulphur resulted in mortality predictions for PM2.5 from fossil fuel combustion, road vehicle emissions and waste burning that surpass predictions for these sources based on the health risk of PM2.5 mass alone. Predicted mortality for Suva from fossil fuel smoke exceeds the national toll from road accidents in Fiji. The greatest benefit for Suva, in terms of reduced mortality, is likely to be accomplished by reducing emissions from fossil fuel combustion (diesel), vehicles and waste burning.
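The per-source mortality attribution described above typically rests on the population attributable fraction, PAF = (RR - 1) / RR, applied to baseline deaths. The sketch below illustrates that arithmetic only; the baseline death count and relative risks are hypothetical placeholders, not the study's values.

```python
# Hypothetical inputs: baseline annual deaths and per-source relative risks
baseline_deaths = 1000.0
rr = {"fossil_fuel": 1.15, "vehicles": 1.08, "waste_burning": 1.05}

# Deaths attributable to each source: baseline * (RR - 1) / RR
attributable = {k: baseline_deaths * (v - 1.0) / v for k, v in rr.items()}
print({k: round(v, 1) for k, v in attributable.items()})
# e.g. fossil_fuel ~130.4, vehicles ~74.1, waste_burning ~47.6
```

Ranking sources by attributable deaths rather than by mass contribution is what lets higher-risk components such as black carbon dominate the mitigation priority.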
Semiotic foundation for multisensor-multilook fusion
NASA Astrophysics Data System (ADS)
Myler, Harley R.
1998-07-01
This paper explores the application of semiotic principles to the design of a multisensor-multilook fusion system. Semiotics is an approach to analysis that attempts to process media in a unified way using qualitative, as opposed to quantitative, methods. The term semiotic refers to signs, or signatory data, that encapsulate information. Semiotic analysis involves the extraction of signs from information sources and the subsequent processing of those signs into meaningful interpretations of the information content of the source. The multisensor fusion problem predicated on a semiotic system structure and incorporating semiotic analysis techniques is explored, along with the design of a multisensor system as an information fusion system. Semiotic analysis opens the possibility of using non-traditional sensor sources and modalities in the fusion process, such as verbal and textual intelligence derived from human observers. Examples of how multisensor/multimodality data might be analyzed semiotically are shown, and how a semiotic system for multisensor fusion could be realized is outlined. The architecture of a semiotic multisensor fusion processor that can accept situational awareness data is described, although an implementation has not as yet been constructed.
Sortie laboratory, phase B technical summary. [Design and operational requirements]
NASA Technical Reports Server (NTRS)
1973-01-01
The design and operational requirements which evolved from Sortie Lab (SL) analysis are summarized. A source of requirements for systems is given along with experimental support for the SL baseline. Basic design data covered include: configuration definition, mission analysis, experiment integration, safety, and logistics. A technical summary outlines characteristics which reflect the influence of the growth in SL capability and the results of the mission and operational analysis. Each of the selected areas is described in terms of objectives, equipment, operational concept, and support requirements.
77 FR 19740 - Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant Accident
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
... NUCLEAR REGULATORY COMMISSION [NRC-2010-0249] Water Sources for Long-Term Recirculation Cooling... Regulatory Guide (RG) 1.82, ``Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant... regarding the sumps and suppression pools that provide water sources for emergency core cooling, containment...
Mrad Nakhlé, M; Farah, W; Ziade, N; Abboud, M; Gerard, J; Zaarour, R; Saliba, N; Dabar, G; Abdel Massih, T; Zoghbi, A; Coussa-Koniski, M-L; Annesi-Maesano, I
2013-12-01
The effects of air pollution on human health have been the subject of much public health research. Several techniques and methods of analysis have been developed. Thus, Beirut Air Pollution and Health Effects (BAPHE) was designed to develop a methodology adapted to the context of the city of Beirut in order to quantify the short-term health effects of air pollution. The quality of data collected from emergency units was analyzed in order to properly estimate hospitalizations via these units. This study examined the process of selecting and validating health and pollution indicators. The different sources of data from emergency units were not correlated. BAPHE was therefore reoriented towards collecting health data from the emergency registry of each hospital. A pilot study determined the appropriate health indicators for BAPHE and created a classification methodology for data collection. In Lebanon, several studies have attempted to indirectly assess the impact of air pollution on health. They had limitations and weaknesses and offered no recommendations regarding the sources and quality of data. The present analysis will be useful for BAPHE and for planning further studies.
Improving the limits of detection of low background alpha emission measurements
NASA Astrophysics Data System (ADS)
McNally, Brendan D.; Coleman, Stuart; Harris, Jack T.; Warburton, William K.
2018-01-01
Alpha particle emission - even at extremely low levels - is a significant issue in the search for rare events (e.g., double beta decay, dark matter detection). Traditional measurement techniques require long counting times to measure low sample rates in the presence of much larger instrumental backgrounds. To address this, a commercially available instrument developed by XIA uses pulse shape analysis to discriminate alpha emissions produced by the sample from those produced by other surfaces of the instrument itself. Experience with this system has uncovered two residual sources of background: cosmogenics and radon emanation from internal components. An R&D program is underway to enhance the system and extend the pulse shape analysis technique further, so that these residual sources can be identified and rejected as well. In this paper, we review the theory of operation and pulse shape analysis techniques used in XIA's alpha counter, and briefly explore data suggesting the origin of the residual background terms. We will then present our approach to enhance the system's ability to identify and reject these terms. Finally, we will describe a prototype system that incorporates our concepts and demonstrates their feasibility.
NASA Technical Reports Server (NTRS)
Wu, S. T.
2000-01-01
The project has progressed successfully during this period of performance. The highlights of the Gamma Ray Astronomy team's efforts are: (1) support of daily BATSE data operations, including receipt, archival and dissemination of data, quick-look science analysis, rapid gamma-ray burst and transient monitoring and response efforts, instrument state-of-health monitoring, and instrument commanding and configuration; (2) on-going scientific analysis, including production and maintenance of gamma-ray burst, pulsed-source and occultation-source catalogs, gamma-ray burst spectroscopy, studies of the properties of pulsars and black holes, and long-term monitoring of hard X-ray sources; (3) maintenance and continuous improvement of BATSE instrument response and calibration databases; (4) investigation of the use of solid-state detectors for eventual application in an instrument to perform all-sky monitoring of X-ray and gamma-ray sources with high sensitivity; and (5) support of BATSE outreach activities, including seminars, colloquia and World Wide Web pages. The highlights of this effort are summarized in the publications and presentations list.
Zeng, Yonghui; Jiao, Nianzhi
2007-06-01
Anoxygenic photosynthesis, performed primarily by anoxygenic photosynthetic bacteria (APB), is thought to have arisen on Earth more than 3 billion years ago. The long-established APB are distributed in almost every corner that light can reach. However, the relationship between APB phylogeny and source environments has been largely unexplored. Here we retrieved the pufM sequences and related source information of 89 pufM-containing species from the public database. Phylogenetic analysis revealed that horizontal gene transfer (HGT) most likely occurred within 11 of a total of 21 pufM subgroups, not only among species within the same class but also among species of different phyla or subphyla. A clear phylogenetic distribution pattern related to source environment was observed, with species from oxic habitats and those from anoxic habitats clustering into independent subgroups, respectively. HGT among ancient APB and subsequent long-term evolution and adaptation to separated niches may have contributed to the coupling of environment and pufM phylogeny.
Defining Adapted Physical Activity: International Perspectives
ERIC Educational Resources Information Center
Hutzler, Yeshayahu; Sherrill, Claudine
2007-01-01
The purpose of this study was to describe international perspectives concerning terms, definitions, and meanings of adapted physical activity (APA) as (a) activities or service delivery, (b) a profession, and (c) an academic field of study. Gergen's social constructionism, our theory, guided analysis of multiple sources of data via qualitative…
NASA Technical Reports Server (NTRS)
Heffley, R. K.; Jewell, W. F.; Whitbeck, R. F.; Schulman, T. M.
1980-01-01
The effects of spurious delays in real time digital computing systems are examined. Various sources of spurious delays are defined and analyzed using an extant simulator system as an example. A specific analysis procedure is set forth and four cases are viewed in terms of their time and frequency domain characteristics. Numerical solutions are obtained for three single rate one- and two-computer examples, and the analysis problem is formulated for a two-rate, two-computer example.
Idowu, I A; Alkhaddar, R M; Atherton, W
2014-08-01
Mecoprop-p herbicide is often found in wells and water abstractions in many areas around Europe, the UK included. There is growing environmental and public health concern about mecoprop-p herbicide pollution in ground and surface water in England. Reviews suggest that extensive work has been carried out on the contribution of mecoprop-p herbicide from agricultural use, whilst more work needs to be carried out on its contribution from non-agricultural use. The study covers two landfill sites in the Weaver/Gowy catchment. Mecoprop-p herbicide concentrations in the leachate ranged between 0.06 and 290 μg l⁻¹ in cells. The high concentration of mecoprop-p herbicide in the leachate suggests that there is a possible source term in the waste stream. This paper addresses the gap by exploring possible source terms of mecoprop-p herbicide contamination on landfill sites and evaluates the impact of public purchase, use and disposal, alongside climate change, on seasonal variations in mecoprop-p concentrations. Mecoprop-p herbicide was found to exceed the EU drinking water quality standards at the unsaturated zone/aquifer, with observed average concentrations ranging between 0.005 and 7.96 μg l⁻¹. A route map for mecoprop-p herbicide source term contamination is essential for mitigation and pollution management, with emphasis on both consumer and producer responsibility towards use of mecoprop-p products. In addition, improvement in data collection on mecoprop-p concentrations and detailed seasonal herbicide sales for non-agricultural purposes are needed to inform the analysis and decision process.
Dynamic power balance analysis in JET
NASA Astrophysics Data System (ADS)
Matthews, G. F.; Silburn, S. A.; Challis, C. D.; Eich, T.; Iglesias, D.; King, D.; Sieglin, B.; Contributors, JET
2017-12-01
The full-scale realisation of nuclear fusion as an energy source requires a detailed understanding of power and energy balance in current experimental devices. In this paper we explore whether a global power balance model, in which some of the calibration factors applied to the source or sink terms are fitted to the data, can provide insight into possible causes of any discrepancies in power and energy balance seen in the JET tokamak. We show that the dynamics of the power balance can only be properly reproduced by including the changes in the thermal stored energy, which therefore provides an additional opportunity to cross-calibrate other terms in the power balance equation. Although the results are inconclusive with respect to the original goal of identifying the source of the discrepancies in the energy balance, we do find that with optimised parameters an extremely good prediction of the total power measured at the outer divertor target can be obtained over a wide range of pulses with time resolution up to ∼25 ms.
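Fitting calibration factors to the source and sink terms of a global power balance, as described above, reduces to a linear least-squares problem once the terms (including dW/dt, the stored-energy derivative) are measured. The sketch below uses entirely synthetic signals and invented calibration factors; it is a minimal sketch of the fitting step, not the JET analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200  # number of time samples

# Synthetic "measured" sink terms and stored-energy derivative (arb. units)
P_rad, P_div, dWdt = rng.uniform(1.0, 5.0, (3, n))

# Assume the sink diagnostics read low by fixed factors; stored energy is exact
c_true = np.array([1.10, 1.25, 1.00])
P_in = c_true[0] * P_rad + c_true[1] * P_div + c_true[2] * dWdt

# Recover the calibration factors by least squares against the known input power
A = np.column_stack([P_rad, P_div, dWdt])
c_fit, *_ = np.linalg.lstsq(A, P_in, rcond=None)
print(np.round(c_fit, 3))  # recovers [1.10, 1.25, 1.0] for this exact data
```

In practice the fitted factors absorb both genuine calibration errors and unmodelled loss channels, which is why the abstract stresses that dW/dt must be included before the remaining terms can be cross-calibrated meaningfully.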
NASA Astrophysics Data System (ADS)
Kamiyama, M.; Orourke, M. J.; Flores-Berrones, R.
1992-09-01
A new type of semi-empirical expression for scaling strong-motion peaks in terms of seismic source, propagation path, and local site conditions is derived. Peak acceleration, peak velocity, and peak displacement are analyzed in a similar fashion because they are interrelated; however, emphasis is placed on peak velocity, which is a key ground motion parameter for lifeline earthquake engineering studies. With the help of seismic source theories, the semi-empirical model is derived using strong motions recorded in Japan. In the derivation, statistical considerations guide the selection of both the model itself and the model parameters. Earthquake magnitude M and hypocentral distance r are selected as independent variables, and dummy variables are introduced to identify the amplification factor due to individual local site conditions. The resulting semi-empirical expressions for peak acceleration, velocity, and displacement are then compared with strong-motion data observed during three earthquakes in the U.S. and Mexico.
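A regression of this general shape, with magnitude, log distance, and a site dummy variable, can be sketched as follows. The functional form ln(peak) = a + b·M + c·ln(r) + d·S is an assumed simplification of the paper's model, and all coefficients and data are synthetic.

```python
import numpy as np

# Hedged sketch of a semi-empirical peak-scaling regression:
# ln(PGV) = a + b*M + c*ln(r) + d*S, with S a 0/1 site dummy.
rng = np.random.default_rng(0)
n = 200
M = rng.uniform(4.0, 7.5, n)        # magnitude
r = rng.uniform(10.0, 200.0, n)     # hypocentral distance, km
S = rng.integers(0, 2, n)           # site dummy (e.g. soft soil = 1)

true = np.array([1.2, 0.9, -1.4, 0.5])          # invented coefficients
X = np.column_stack([np.ones(n), M, np.log(r), S])
y = X @ true + rng.normal(0, 0.05, n)           # synthetic observations

coef, *_ = np.linalg.lstsq(X, y, rcond=None)    # least-squares fit
```

The fitted `coef` recovers the generating coefficients to within the noise level, which is the basic mechanism behind such attenuation relations.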
Excitation of a Buried Magmatic Pipe: A Seismic Source Model for Volcanic Tremor.
Chouet, Bernard
1985-01-01
A model of volcanic tremor is presented in which the modes of vibration of a volcanic pipe are excited by the motion of the fluid within the pipe in response to a short-term perturbation in pressure. The model shows the relative importance of the various parts constituting this composite source in the radiated elastic field at near and intermediate distances. The paper starts with the presentation of the elastic field radiated by the source, and proceeds with an analysis of the energy balance between hydraulic and elastic motions. Next, the hydraulic excitation of the source is addressed and, finally, the ground response to this excitation is analyzed in the simple case of a pipe buried in a homogeneous half space.
NASA Astrophysics Data System (ADS)
Gelmez Burakgazi, Sevinc; Yildirim, Ali; Weeth Feinstein, Noah
2016-04-01
Rooted in science education and science communication studies, this study examines 4th and 5th grade students' perceptions of science information sources (SIS) and their use in communicating science to students. It combines situated learning theory with uses and gratifications theory in a qualitative phenomenological analysis. Data were gathered through classroom observations and interviews in four Turkish elementary schools. Focus group interviews with 47 students and individual interviews with 17 teachers and 10 parents were conducted. Participants identified a wide range of SIS, including TV, magazines, newspapers, internet, peers, teachers, families, science centers/museums, science exhibitions, textbooks, science books, and science camps. Students reported using various SIS in school-based and non-school contexts to satisfy their cognitive, affective, personal, and social integrative needs. SIS were used for science courses, homework/project assignments, examination/test preparations, and individual science-related research. Students assessed SIS in terms of the perceived accessibility of the sources, the quality of the content, and the content presentation. In particular, some sources such as teachers, families, TV, science magazines, textbooks, and science centers/museums ("directive sources") predictably led students to other sources such as teachers, families, internet, and science books ("directed sources"). A small number of sources crossed context boundaries, being useful in both school and out. Results shed light on the connection between science education and science communication in terms of promoting science learning.
Bai, Mingsian R; Li, Yi; Chiang, Yi-Hao
2017-10-01
A unified framework is proposed for the analysis and synthesis of two-dimensional spatial sound fields in reverberant environments. In the sound field analysis (SFA) phase, an unbaffled 24-element circular microphone array is used to encode the sound field based on plane-wave decomposition. Depending on the sparsity of the sound sources, the SFA stage can be implemented in two ways. For sparse-source scenarios, a one-stage algorithm based on compressive sensing is used. Alternatively, a two-stage algorithm can be used, in which a minimum power distortionless response beamformer localizes the sources and Tikhonov regularization extracts the source amplitudes. In the sound field synthesis (SFS) phase, a 32-element rectangular loudspeaker array is employed to decode the target sound field using a pressure-matching technique. To establish the room response model required in the pressure-matching step, an SFA technique for nonsparse-source scenarios is used. The choice of regularization parameters is vital to the reproduced sound field. In the SFS phase, three SFS approaches are compared in terms of localization performance and voice reproduction quality. Experimental results obtained in a reverberant room reveal that an accurate room response model is vital to immersive rendering of the reproduced sound field.
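The Tikhonov-regularised amplitude-extraction step can be sketched as follows. The steering matrix, source layout, and regularization parameter are invented for illustration and do not reproduce the authors' array processing.

```python
import numpy as np

# Sketch of Tikhonov-regularised amplitude extraction: given a steering
# matrix A (array response to candidate plane waves) and measured pressures b,
# solve min ||A x - b||^2 + lam^2 ||x||^2 via the normal equations.
rng = np.random.default_rng(1)
A = rng.normal(size=(24, 8)) + 1j * rng.normal(size=(24, 8))  # 24 mics, 8 candidates
x_true = np.zeros(8, dtype=complex)
x_true[2] = 1.0 + 0.5j                          # one active source (hypothetical)
b = A @ x_true + 0.01 * rng.normal(size=24)     # measurements with small noise

lam = 0.1                                       # regularization parameter
AH = A.conj().T
x = np.linalg.solve(AH @ A + lam**2 * np.eye(8), AH @ b)
```

The regularization term stabilises the inversion when `A` is ill-conditioned, at the cost of a slight shrinkage of the recovered amplitudes.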
NASA Astrophysics Data System (ADS)
Miola, Apollonia; Ciuffo, Biagio
2011-04-01
Maritime transport plays a central role in the transport sector's sustainability debate. Its contribution to air pollution and greenhouse gases is significant. An effective policy strategy to regulate air emissions requires their robust estimation in terms of quantification and location. This paper provides a critical analysis of the ship emission modelling approaches and data sources available, identifying their limits and constraints. It classifies the main methodologies on the basis of the approach followed (bottom-up or top-down) for the evaluation and geographic characterisation of emissions. The analysis highlights the uncertainty of results from the different methods. This is mainly due to the level of uncertainty connected with the sources of information that are used as inputs to the different studies. This paper describes the sources of the information required for these analyses, paying particular attention to AIS data and to the possible problems associated with their use. One way of reducing the overall uncertainty in the results could be the simultaneous use of different sources of information. This paper presents an alternative methodology based on this approach. As a final remark, it can be expected that new approaches to the problem together with more reliable data sources over the coming years could give more impetus to the debate on the global impact of maritime traffic on the environment that, currently, has only reached agreement via the "consensus" estimates provided by IMO (2009).
Arques-Orobon, Francisco Jose; Nuñez, Neftali; Vazquez, Manuel; Gonzalez-Posadas, Vicente
2016-01-01
This work analyzes the long-term functionality of HP (high-power) UV-LEDs (ultraviolet light-emitting diodes) as the exciting light source in non-contact, continuous 24/7 real-time fluoro-sensing pollutant identification in inland water. Fluorescence is an effective alternative in the detection and identification of hydrocarbons. HP UV-LEDs are more advantageous than classical light sources (xenon and mercury lamps) and help in the development of a low-cost, non-contact, and compact system for continuous real-time fieldwork. This work analyzes the wavelength, output optical power, the effects of viscosity and temperature of the water pollutants, and the functional consistency of long-term HP UV-LED operation. To accomplish the latter, an analysis of the degradation of two types of 365 nm HP UV-LEDs under two continuous real-system working-mode conditions was done by temperature Accelerated Life Tests (ALTs). These tests estimate a mean life of 6,200 h under continuous working conditions and 66,000 h under cycled working conditions (30 s ON / 30 s OFF), i.e. over 7 years of 24/7 operating life for hydrocarbon pollution monitoring. In addition, the durability in the face of internal and external parameter system variations is evaluated.
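Arrhenius-type extrapolation, the standard basis for temperature-accelerated life tests, can be sketched as follows. The activation energy, test temperatures, and test lives are hypothetical, not the values behind the 6,200 h and 66,000 h estimates.

```python
import math

# Hedged sketch of extrapolating mean life from temperature ALTs with the
# Arrhenius model L(T) = A * exp(Ea / (k * T)). All numbers are illustrative.
K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_fit(t1, life1, t2, life2):
    """Fit activation energy Ea and prefactor A from mean lives
    observed at two absolute stress temperatures t1, t2 (K)."""
    ea = K_B * math.log(life1 / life2) / (1.0 / t1 - 1.0 / t2)
    a = life1 / math.exp(ea / (K_B * t1))
    return ea, a

def mean_life(t_use, ea, a):
    return a * math.exp(ea / (K_B * t_use))

# Hypothetical mean lives of 500 h at 85 C (358 K) and 200 h at 105 C (378 K):
ea, a = arrhenius_fit(358.0, 500.0, 378.0, 200.0)
life_25c = mean_life(298.0, ea, a)   # extrapolated use-condition life at 25 C
```

The extrapolated use-condition life is far longer than either stress-condition life, which is the whole point of accelerated testing.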
NASA Astrophysics Data System (ADS)
Yamazawa, Hiromi; Terasaka, Yuta; Mizutani, Kenta; Sugiura, Hiroki; Hirao, Shigekazu
2017-04-01
Understanding of the release of radioactivity into the atmosphere from the damaged units of the Fukushima Daiichi Nuclear Power Station has improved owing to recent analyses of atmospheric radionuclide concentrations. Our analysis of gamma-ray spectra from monitoring posts located about 100 km south of the site revealed temporal changes in the atmospheric concentrations of several key nuclides, including the noble gas Xe-133 in addition to radio-iodine and radio-cesium nuclides such as I-131 and Cs-137, at a 10-minute interval. Using these atmospheric concentration data in combination with inverse atmospheric transport modelling and a Bayesian statistical method, a modification to the widely used Katata source term was proposed, along with a source term for Xe-133. Although the atmospheric concentration data and the source terms help us understand the atmospheric transport processes of radionuclides, they still carry significant uncertainty due to the limited availability of concentration data, and limitations remain in the atmospheric transport modelling. The largest model uncertainty lies in the deposition processes. It had been pointed out that, within about 100 km of the site, there were locations at which the ambient dose rate increased significantly a few hours before precipitation detectors recorded the start of rain. According to our analysis, the dose rate increase was caused not directly by airborne radioactivity but by deposition. This phenomenon can be attributed to a deposition process in which evaporating precipitation enhances the efficiency of deposition even when no precipitation is observed at ground level.
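The Bayesian inversion idea, estimating a release rate from concentrations predicted by a transport model, can be sketched in a heavily simplified form: a constant release rate, known dilution factors, and invented numbers. The actual analysis couples a full atmospheric transport model and a time-varying source.

```python
# Minimal Bayesian source-term sketch: infer a constant release rate Q from
# observations c_i ≈ Q * d_i, where d_i are transport-model dilution factors.
# Flat prior, Gaussian likelihood, grid posterior. Illustrative values only.
d = [2.0e-6, 5.0e-7, 1.2e-6]        # s/m^3, hypothetical dilution factors
c_obs = [4.1e-6, 0.9e-6, 2.6e-6]    # Bq/m^3, hypothetical observations
sigma = 0.5e-6                       # assumed observation error

q_grid = [i * 0.01 for i in range(1, 501)]   # candidate Q values (arbitrary units)

def log_post(q):
    """Log posterior up to a constant: Gaussian misfit under a flat prior."""
    return -0.5 * sum(((c - q * di) / sigma) ** 2 for c, di in zip(c_obs, d))

q_map = max(q_grid, key=log_post)    # maximum a posteriori release rate
```

In a real inversion the grid search is replaced by sampling or variational methods, and the prior encodes what is known about the release history.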
The TEA Carbon Dioxide Laser as a Means of Generating Ultrasound in Solids
NASA Astrophysics Data System (ADS)
Taylor, Gregory Stuart
1990-01-01
Available from UMI in association with The British Library. Requires signed TDF. The aim of this thesis is to characterise the interaction between pulsed, high-power, 10.6 μm radiation and solids. The work is considered both in the general context of laser generation of ultrasound and, specifically, to gain a deeper understanding of the interaction between a laser-supported plasma and a solid. The predominant experimental tools used are the homodyne Michelson interferometer and a range of electromagnetic acoustic transducers. To complement the ultrasonic data, various plasma inspection techniques, such as high-speed streak camera photography and reflection photometry, have been used to correlate the plasma properties with those of the ultrasonic transients. The characterisation of the interaction between a laser-supported plasma and a solid, which builds on previous experimental and theoretical analysis, gives an increased understanding of the plasma's ultrasonic generation mechanism. The ability to record the entire time history of the plasma-sample interaction yields information on the internal dynamics of plasma growth and shock wave generation. The interaction of the radiation with a solid is characterised in both the plasma breakdown and non-breakdown regimes by a wide ultrasonic source. The variation in source diameter enables the transition from a point to a near-planar ultrasonic source to be studied. The resultant ultrasonic modifications are examined in terms of the wave structure and the directivity pattern. The wave structure is analysed in terms of existing wide-source bulk wave theories, extended to consider the effects on surface and Lamb waves. The directivity patterns of the longitudinal and shear waves are analysed in terms of top-hat and non-uniform source profiles, giving additional insight into the radiation-solid interaction.
The wide, one-dimensional source analysis is extended to a two-dimensional, extended ultrasonic source, generated on non-metals by the optical penetration of radiation within the target. The generation of ultrasound in both metals and non-metals using the CO_2 laser is shown to be an efficient process that may be employed almost totally non-destructively. Such a laser may therefore be used effectively on a far wider range of materials than those tested to date via laser generation, increasing the suitability of the laser technique within the field of Non-Destructive Testing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dockter, Randy E.
2017-07-31
The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.
Hanford Site Composite Analysis Technical Approach Description: Atmospheric Transport Modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, B.; Lehman, L. L.
2017-10-02
The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.
Hanford Site Composite Analysis Technical Approach Description: Waste Form Release.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardie, S.; Paris, B.; Apted, M.
2017-09-14
The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.
Hanford Site Composite Analysis Technical Approach Description: Integrated Computational Framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, K. J.
2017-09-14
The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.
NASA Astrophysics Data System (ADS)
Jiao, Yi; Duan, Zhe
2017-01-01
In a diffraction-limited storage ring, half integer resonances can have strong effects on the beam dynamics, associated with the large detuning terms from the strong focusing and strong sextupoles as required for an ultralow emittance. In this study, the limitation of half integer resonances on the available momentum acceptance (MA) was statistically analyzed based on one design of the High Energy Photon Source (HEPS). It was found that the probability of MA reduction due to crossing of half integer resonances is closely correlated with the level of beta beats at the nominal tunes, but independent of the error sources. The analysis indicated that for the presented HEPS lattice design, the rms amplitude of beta beats should be kept below 1.5% horizontally and 2.5% vertically to reach a small MA reduction probability of about 1%.
Modeling and observations of an elevated, moving infrasonic source: Eigenray methods.
Blom, Philip; Waxler, Roger
2017-04-01
The acoustic ray tracing relations are extended by the inclusion of auxiliary parameters describing variations in the spatial ray coordinates and eikonal vector due to changes in the initial conditions. Computation of these parameters allows one to define the geometric spreading factor along individual ray paths and assists in identification of caustic surfaces so that phase shifts can be easily identified. A method is developed leveraging the auxiliary parameters to identify propagation paths connecting specific source-receiver geometries, termed eigenrays. The newly introduced method is found to be highly efficient in cases where propagation is non-planar due to horizontal variations in the propagation medium or the presence of cross winds. The eigenray method is utilized in analysis of infrasonic signals produced by a multi-stage sounding rocket launch with promising results for applications of tracking aeroacoustic sources in the atmosphere and specifically to analysis of motor performance during dynamic tests.
NASA Astrophysics Data System (ADS)
Perez, Pedro B.; Hamawi, John N.
2017-09-01
Nuclear power plant radiation protection design features are based on radionuclide source terms derived from conservative assumptions that envelope expected operating experience. Two parameters that significantly affect the radionuclide concentrations in the source term are the failed fuel fraction and the effective fission product appearance rate coefficients. The failed fuel fraction may be a regulatory-based assumption, as in the U.S. Appearance rate coefficients are not specified in regulatory requirements but have been referenced to experimental data that is over 50 years old. The source terms are no doubt conservative, as demonstrated by operating experience that has included failed fuel, but they may be too conservative, leading, for example, to over-designed shielding for normal operations. Design basis source term methodologies for normal operations had not advanced until EPRI published an updated ANSI/ANS-18.1 source term basis document in 2015. Our paper revisits the fission product appearance rate coefficients as applied in the derivation of source terms following the original U.S. NRC NUREG-0017 methodology. New coefficients have been calculated based on recent EPRI results, which demonstrate the conservatism in nuclear power plant shielding design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie
2014-02-01
This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).
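The LHS-versus-SRS comparison rests on the two sampling schemes themselves, which can be sketched on a toy consequence function. The function, dimensions, and replicate sizes here are illustrative, not MACCS2 quantities.

```python
import numpy as np

# Sketch comparing Latin Hypercube Sampling (LHS) with Simple Random
# Sampling (SRS) for estimating the mean of a toy consequence function.
rng = np.random.default_rng(42)

def lhs(n, dim, rng):
    """One LHS design on [0,1)^dim: one sample per equiprobable stratum,
    with strata shuffled independently in each dimension."""
    u = (rng.random((n, dim)) + np.arange(n)[:, None]) / n
    for j in range(dim):
        u[:, j] = rng.permutation(u[:, j])
    return u

def toy_consequence(u):
    # Invented smooth function standing in for a consequence model
    return u[:, 0] ** 2 + np.sin(3 * u[:, 1])

n, reps = 1000, 3                     # replicate size and count, as in the study
srs_means = [toy_consequence(rng.random((n, 2))).mean() for _ in range(reps)]
lhs_means = [toy_consequence(lhs(n, 2, rng)).mean() for _ in range(reps)]
```

Both estimators target the same mean; the stratification in LHS typically reduces the replicate-to-replicate scatter for smooth response functions, which is why replicated designs are used to check convergence.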
GOEAST: a web-based software toolkit for Gene Ontology enrichment analysis.
Zheng, Qi; Wang, Xiu-Jie
2008-07-01
Gene Ontology (GO) analysis has become a commonly used approach for functional studies of large-scale genomic or transcriptomic data. Although many software tools with GO-related analysis functions exist, new tools are still needed to meet the requirements of data generated by newly developed technologies or for advanced analysis purposes. Here, we present the Gene Ontology Enrichment Analysis Software Toolkit (GOEAST), an easy-to-use web-based toolkit that identifies statistically overrepresented GO terms within given gene sets. Compared with available GO analysis tools, GOEAST has the following improved features: (i) GOEAST displays enriched GO terms in graphical format according to their relationships in the hierarchical tree of each GO category (biological process, molecular function and cellular component) and therefore provides a better understanding of the correlations among enriched GO terms; (ii) GOEAST supports analysis of data from various sources (probe or probe-set IDs of Affymetrix, Illumina, Agilent or customized microarrays, as well as different gene identifiers) and multiple species (about 60 prokaryote and eukaryote species); (iii) one unique feature of GOEAST is to allow cross-comparison of the GO enrichment status of multiple experiments to identify functional correlations among them. GOEAST also provides rigorous statistical tests to enhance the reliability of analysis results. GOEAST is freely accessible at http://omicslab.genetics.ac.cn/GOEAST/
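The core statistic behind GO term enrichment, a hypergeometric tail test for over-representation, can be sketched as follows. The gene counts are invented, and GOEAST's actual statistical options (and multiple-testing corrections) are not reproduced here.

```python
from math import comb

# Hypergeometric tail test: probability of seeing at least k term-annotated
# genes in a study set of n genes, drawn from N genes of which K carry the term.

def hypergeom_enrichment_p(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N, K, n)."""
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

# Invented example: 20,000 genes in total, 100 annotated with the term,
# a study set of 200 genes of which 8 carry the term (expected ~1 by chance):
p = hypergeom_enrichment_p(20000, 100, 200, 8)
```

A small p-value indicates the term is over-represented in the study set; enrichment tools then correct such p-values for the many terms tested.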
NASA Astrophysics Data System (ADS)
Lee, S. S.; Kim, H. J.; Kim, M. O.; Lee, K.; Lee, K. K.
2016-12-01
A study seeking evidence of remediation reflected in monitoring data collected before and after an intensive in situ remedial action was performed using various quantitative evaluation methods, such as mass discharge analysis, tracer data, statistical trend analysis, and analytical solutions, at a DNAPL-contaminated site in Wonju, Korea. Remediation technologies such as soil vapor extraction, soil flushing, biostimulation, and pump-and-treat have been applied to eliminate the contaminant sources of trichloroethylene (TCE) and to prevent migration of the TCE plume from the remediation target zones. Prior to the remedial action, the concentration and mass discharges of TCE at all transects were affected by seasonal recharge variation and residual DNAPL sources. After the remediation, its effect appeared clearly at the main source zone and the industrial complex. By tracing a time series of plume evolution, greater variation in TCE concentrations was detected in the plumes near the source zones compared with the relatively stable plumes downstream. The amount of residual source mass removed during the intensive remedial action was estimated using an analytical solution in order to evaluate its efficiency. The results indicate that the intensive remedial action removed about 70% of the residual source mass during the remediation period. Analytical solutions that can quantify the impacts of partial mass reduction have proven to be useful tools for quantifying unknown contaminant source mass, verifying dissolved concentrations at the DNAPL-contaminated site, and evaluating remediation efficiency using long-term monitoring data.
Acknowledgement: This work was supported by the Korea Ministry of Environment under the "GAIA project (173-092-009 and 201400540010)" and the "R&D Project on Environmental Management of Geologic CO2 Storage" from KEITI (Project number: 2014001810003).
Martian methane plume models for defining Mars rover methane source search strategies
NASA Astrophysics Data System (ADS)
Nicol, Christopher; Ellery, Alex; Lynch, Brian; Cloutis, Ed
2018-07-01
The detection of atmospheric methane on Mars implies an active methane source. This introduces the possibility of a biotic source with the implied need to determine whether the methane is indeed biotic in nature or geologically generated. There is a clear need for robotic algorithms which are capable of manoeuvring a rover through a methane plume on Mars to locate its source. We explore aspects of Mars methane plume modelling to reveal complex dynamics characterized by advection and diffusion. A statistical analysis of the plume model has been performed and compared to analyses of terrestrial plume models. Finally, we consider a robotic search strategy to find a methane plume source. We find that gradient-based techniques are ineffective, but that more sophisticated model-based search strategies are unlikely to be available in near-term rover missions.
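The advection-diffusion dynamics mentioned above can be sketched with a minimal 1-D finite-difference model. The grid, wind speed, and source strength are arbitrary illustration values, not Mars parameters, and the real plume models are three-dimensional and turbulent.

```python
# Minimal 1-D advection-diffusion plume from a point source:
# dc/dt = -U dc/dx + D d2c/dx2 + S, explicit upwind scheme.
NX, DX, DT = 200, 1.0, 0.1
U, D = 1.0, 0.5                       # wind speed and diffusivity (arbitrary units)
SRC_I, SRC_RATE = 20, 1.0             # source cell index and emission rate

c = [0.0] * NX
for _ in range(2000):
    new = c[:]
    for i in range(1, NX - 1):
        adv = -U * (c[i] - c[i - 1]) / DX                      # upwind advection
        dif = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / DX ** 2   # diffusion
        new[i] = c[i] + DT * (adv + dif)
    new[SRC_I] += DT * SRC_RATE                                # point source
    c = new
```

The steep upwind/downwind asymmetry visible in `c` is one reason naive gradient-following fails: upwind of the source the concentration decays almost to zero within a few diffusion lengths, leaving no usable gradient.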
NASA Astrophysics Data System (ADS)
Reyes-Villegas, Ernesto; Green, David C.; Priestman, Max; Canonaco, Francesco; Coe, Hugh; Prévôt, André S. H.; Allan, James D.
2016-12-01
The multilinear engine (ME-2) factorization tool is being widely used following the recent development of the Source Finder (SoFi) interface at the Paul Scherrer Institute. However, the success of this tool, when using the a-value approach, largely depends on the inputs (i.e. target profiles) applied as well as the experience of the user. A strategy to explore the solution space is proposed, in which the solution that best describes the organic aerosol (OA) sources is determined according to the systematic application of predefined statistical tests. This includes trilinear regression, which proves to be a useful tool for comparing different ME-2 solutions. Aerosol Chemical Speciation Monitor (ACSM) measurements were carried out at the urban background site of North Kensington, London from March to December 2013, where for the first time the behaviour of OA sources and their possible environmental implications were studied using an ACSM. Five OA sources were identified: biomass burning OA (BBOA), hydrocarbon-like OA (HOA), cooking OA (COA), semivolatile oxygenated OA (SVOOA) and low-volatility oxygenated OA (LVOOA). ME-2 analysis of the seasonal data sets (spring, summer and autumn) showed a higher variability in the OA sources that was not detected in the combined March-December data set; this variability was explored with the f44:f43 and f44:f60 triangle plots, in which a high variation of SVOOA relative to LVOOA was observed in the f44:f43 analysis. Hence, it was possible to conclude that, when performing source apportionment on long-term measurements, important information may be lost, and the analysis should instead be done on shorter periods of time, such as seasons. Further analysis of the atmospheric implications of these OA sources was carried out, identifying evidence of a possible contribution of heavy-duty diesel vehicles to air pollution during weekdays, compared with petrol-fuelled vehicles.
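The bilinear model underlying ME-2/PMF, X ≈ G·F with non-negative time series G and factor profiles F, can be sketched with plain multiplicative-update NMF. This is a stand-in for the actual ME-2 engine (no a-value constraints, no error weighting) on synthetic data.

```python
import numpy as np

# Illustrative NMF sketch of the bilinear source-apportionment model X ≈ G F,
# where rows of X are mass spectra over time and rows of F are factor profiles.
rng = np.random.default_rng(3)
G_true = rng.random((50, 2))          # 50 time points, 2 hypothetical sources
F_true = rng.random((2, 10))          # 2 profiles over 10 m/z bins
X = G_true @ F_true                   # synthetic noise-free data

G = rng.random((50, 2)) + 0.1         # positive random initialisation
F = rng.random((2, 10)) + 0.1
for _ in range(500):                  # Lee-Seung multiplicative updates
    G *= (X @ F.T) / (G @ F @ F.T + 1e-9)
    F *= (G.T @ X) / (G.T @ G @ F + 1e-9)

residual = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
```

ME-2 differs by allowing partial constraints (target profiles with an a-value tolerance) and by weighting residuals with measurement uncertainties, but the factorisation structure is the same.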
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
Ma, Xiao-xue; Wang, La-chun; Liao, Ling-ling
2015-01-01
Identifying the spatio-temporal distribution and sources of water pollutants is of great significance for efficient water quality management and pollution control in the Wenruitang River watershed, China. A total of twelve water quality parameters, including temperature, pH, dissolved oxygen (DO), total nitrogen (TN), ammonia nitrogen (NH4+-N), electrical conductivity (EC), turbidity (Turb), nitrite-N (NO2-), nitrate-N (NO3-), phosphate-P (PO4(3-)), total organic carbon (TOC) and silicate (SiO3(2-)), were analyzed from September 2008 to October 2009. A geographic information system (GIS) and principal component analysis (PCA) were used to determine the spatial distribution and to apportion the sources of pollutants. The results demonstrated that TN, NH4+-N and PO4(3-) were the main pollutants during the flow, wet and dry periods, respectively, mainly caused by urban point sources and agricultural and rural non-point sources. In spatial terms, the order of pollution was tertiary river > secondary river > primary river, while water quality was worse in city zones than in the suburb and wetland zones regardless of river classification. In temporal terms, the order of pollution was dry period > wet period > flow period. Population density, land use type and water transfer affected the water quality in the Wenruitang River.
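The PCA step of such a source apportionment can be sketched on synthetic data in which two latent "sources" drive the measured variables. The variable names in the comments are only suggestive of the study's parameters.

```python
import numpy as np

# Minimal PCA sketch: principal components of standardised water-quality
# parameters, with two synthetic latent sources driving four variables.
rng = np.random.default_rng(7)
n = 120
point_src = rng.normal(size=n)        # hypothetical urban point-source signal
nonpoint_src = rng.normal(size=n)     # hypothetical agricultural non-point signal

X = np.column_stack([
    2.0 * point_src, 1.5 * point_src,            # e.g. NH4+-N, TN
    1.8 * nonpoint_src, 1.2 * nonpoint_src,      # e.g. NO3-, PO4(3-)
]) + 0.1 * rng.normal(size=(n, 4))               # measurement noise

Z = (X - X.mean(0)) / X.std(0)                   # standardise each parameter
eigval, eigvec = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
explained = eigval[::-1] / eigval.sum()          # variance explained, descending
```

Here two components capture almost all the variance, and the loadings (columns of `eigvec`) split the parameters into the two source groups, which is how PCA factors are interpreted as pollution sources.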
Long-Term Stability of Radio Sources in VLBI Analysis
NASA Technical Reports Server (NTRS)
Engelhardt, Gerald; Thorandt, Volkmar
2010-01-01
Positional stability of radio sources is an important requirement for modeling a single source position over the complete span of VLBI data, presently more than 20 years. The stability of radio sources can be verified by analyzing time series of radio source coordinates. One approach is a statistical test for normal distribution of the residuals to the weighted mean for each radio source component of the time series. Systematic phenomena in the time series can thus be detected. Nevertheless, an inspection of rate estimates and of the weighted root-mean-square (WRMS) variations about the mean is also necessary. On the basis of the time series computed by the BKG group in the frame of the ICRF2 working group, 226 stable radio sources with an axis stability of 10 μas could be identified. They include 100 ICRF2 axes-defining sources, determined independently of the method applied in the ICRF2 working group. A further 29 stable radio sources with a source structure index of less than 3.0 can also be used to augment the 295 ICRF2 defining sources.
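The stability screening described above rests on residuals about a weighted mean and their WRMS scatter. A minimal sketch in Python (the function name and units are illustrative, not from the BKG analysis):

```python
import math

def weighted_mean_and_wrms(values, sigmas):
    """Weighted mean and weighted RMS (WRMS) scatter of a coordinate
    time series. values: coordinate offsets (e.g. in mas);
    sigmas: their formal errors."""
    weights = [1.0 / s ** 2 for s in sigmas]
    wsum = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / wsum
    wrms = math.sqrt(sum(w * (v - mean) ** 2
                         for w, v in zip(weights, values)) / wsum)
    return mean, wrms
```

A source whose residuals `values - mean` pass a normality test and whose WRMS stays small would count as stable under such a scheme.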
Situational Interest in Engineering Design Activities
NASA Astrophysics Data System (ADS)
Bonderup Dohn, Niels
2013-08-01
The aim of the present mixed-method study was to investigate task-based situational interest of sixth grade students (n = 46), between 12 and 14 years old, during an eight-week engineering design programme in a Science & Technology class. Students' interests were investigated by means of a descriptive interpretative analysis of qualitative data from classroom observations and informal interviews. The analysis was complemented by a self-report survey to validate findings and determine prevalence. The analysis revealed four main sources of interest: designing inventions, trial-and-error experimentation, achieved functionality of the invention, and collaboration. These sources differ in terms of stimuli factors, such as novelty, autonomy (choice), social involvement, self-generation of interest, and task goal orientation. The study shows that design tasks stimulated interest, but only to the extent that students were able to self-regulate their learning strategies.
Construction and Validation of Textbook Analysis Grids for Ecology and Environmental Education
ERIC Educational Resources Information Center
Caravita, Silvia; Valente, Adriana; Luzi, Daniela; Pace, Paul; Valanides, Nicos; Khalil, Iman; Berthou, Guillemette; Kozan-Naumescu, Adrienne; Clement, Pierre
2008-01-01
Knowledge about ecology and environmental education (EE) constitutes a basic tool for promoting a sustainable future, and was a target area of the BIOHEAD-Citizen Project. School textbooks were considered as representative sources of evidence in terms of ecology and environmental education, and were used for comparison among the countries…
Chlorine bleaches - A significant long term source of mercury pollution
NASA Technical Reports Server (NTRS)
Siegel, S. M.; Eshleman, A.
1975-01-01
Products of industrial electrolysis of brine - NaOCl-based bleaches and NaOH - yielded 17 to 1290 ppb of Hg upon flameless atomic absorption analysis. Compared with the current U.S. rejection value of 5 ppb for potable waters, these levels seem sufficiently high to be a matter of environmental concern.
Methods for nuclear air-cleaning-system accident-consequence assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrae, R.W.; Bolstad, J.W.; Gregory, W.S.
1982-01-01
This paper describes a multilaboratory research program that is directed toward addressing many questions that analysts face when performing air cleaning accident consequence assessments. The program involves developing analytical tools and supportive experimental data that will be useful in making more realistic assessments of accident source terms within and up to the atmospheric boundaries of nuclear fuel cycle facilities. The types of accidents considered in this study include fires, explosions, spills, tornadoes, criticalities, and equipment failures. The main focus of the program is developing an accident analysis handbook (AAH). We describe the contents of the AAH, which include descriptions of selected nuclear fuel cycle facilities, process unit operations, source-term development, and accident consequence analyses. Three computer codes designed to predict gas and material propagation through facility air cleaning systems are described. These computer codes address accidents involving fires (FIRAC), explosions (EXPAC), and tornadoes (TORAC). The handbook relies on many illustrative examples to show the analyst how to approach accident consequence assessments. We use the FIRAC code and a hypothetical fire scenario to illustrate the accident analysis capability.
Kloefkorn, Heidi E.; Pettengill, Travis R.; Turner, Sara M. F.; Streeter, Kristi A.; Gonzalez-Rothi, Elisa J.; Fuller, David D.; Allen, Kyle D.
2016-01-01
While rodent gait analysis can quantify the behavioral consequences of disease, significant methodological differences exist between analysis platforms and little validation has been performed to understand or mitigate these sources of variance. By providing the algorithms used to quantify gait, open-source gait analysis software can be validated and used to explore methodological differences. Our group is introducing, for the first time, a fully-automated, open-source method for the characterization of rodent spatiotemporal gait patterns, termed Automated Gait Analysis Through Hues and Areas (AGATHA). This study describes how AGATHA identifies gait events, validates AGATHA relative to manual digitization methods, and utilizes AGATHA to detect gait compensations in orthopaedic and spinal cord injury models. To validate AGATHA against manual digitization, results from videos of rodent gait, recorded at 1000 frames per second (fps), were compared. To assess one common source of variance (the effects of video frame rate), these 1000 fps videos were re-sampled to mimic several lower fps and compared again. While spatial variables were indistinguishable between AGATHA and manual digitization, low video frame rates resulted in temporal errors for both methods. At frame rates over 125 fps, AGATHA achieved a comparable accuracy and precision to manual digitization for all gait variables. Moreover, AGATHA detected unique gait changes in each injury model. These data demonstrate AGATHA is an accurate and precise platform for the analysis of rodent spatiotemporal gait patterns. PMID:27554674
Sensitivity Analysis for some Water Pollution Problem
NASA Astrophysics Data System (ADS)
Le Dimet, François-Xavier; Tran Thu, Ha; Hussaini, Yousuff
2014-05-01
Sensitivity analysis employs a response function and the variable with respect to which its sensitivity is evaluated. If the state of the system is retrieved through a variational data assimilation process, then the observations appear only in the Optimality System (OS). In many cases, observations have errors, and it is important to estimate their impact. Therefore, sensitivity analysis has to be carried out on the OS, and in that sense sensitivity analysis is a second-order property. The OS can be considered a generalized model because it contains all the available information. This presentation proposes a general method for carrying out sensitivity analysis, demonstrated with an application to a water pollution problem. The model involves the shallow water equations and an equation for the pollutant concentration, discretized using a finite volume method. The response function depends on the pollutant source, and its sensitivity with respect to the source term of the pollutant is studied. Specifically, we consider: identification of unknown parameters, and identification of sources of pollution and sensitivity with respect to the sources. We also use a Singular Evolutive Interpolated Kalman Filter to study this problem. The presentation includes a comparison of the results from these two methods.
Kloefkorn, Heidi E; Pettengill, Travis R; Turner, Sara M F; Streeter, Kristi A; Gonzalez-Rothi, Elisa J; Fuller, David D; Allen, Kyle D
2017-03-01
While rodent gait analysis can quantify the behavioral consequences of disease, significant methodological differences exist between analysis platforms and little validation has been performed to understand or mitigate these sources of variance. By providing the algorithms used to quantify gait, open-source gait analysis software can be validated and used to explore methodological differences. Our group is introducing, for the first time, a fully-automated, open-source method for the characterization of rodent spatiotemporal gait patterns, termed Automated Gait Analysis Through Hues and Areas (AGATHA). This study describes how AGATHA identifies gait events, validates AGATHA relative to manual digitization methods, and utilizes AGATHA to detect gait compensations in orthopaedic and spinal cord injury models. To validate AGATHA against manual digitization, results from videos of rodent gait, recorded at 1000 frames per second (fps), were compared. To assess one common source of variance (the effects of video frame rate), these 1000 fps videos were re-sampled to mimic several lower fps and compared again. While spatial variables were indistinguishable between AGATHA and manual digitization, low video frame rates resulted in temporal errors for both methods. At frame rates over 125 fps, AGATHA achieved a comparable accuracy and precision to manual digitization for all gait variables. Moreover, AGATHA detected unique gait changes in each injury model. These data demonstrate AGATHA is an accurate and precise platform for the analysis of rodent spatiotemporal gait patterns.
NASA Astrophysics Data System (ADS)
Blaen, Phillip; Khamis, Kieran; Lloyd, Charlotte; Krause, Stefan
2017-04-01
At the river catchment scale, storm events can drive highly variable behaviour in nutrient and water fluxes, yet short-term dynamics are frequently missed by low resolution sampling regimes. In addition, nutrient source contributions can vary significantly within and between storm events. Our inability to identify and characterise time dynamic source zone contributions severely hampers the adequate design of land use management practices in order to control nutrient exports from agricultural landscapes. Here, we utilise an 8-month high-frequency (hourly) time series of streamflow, nitrate concentration (NO3) and fluorescent dissolved organic matter concentration (FDOM) derived from optical in-situ sensors located in a headwater agricultural catchment. We characterised variability in flow and nutrient dynamics across 29 storm events. Storm events represented 31% of the time series and contributed disproportionately to nutrient loads (43% of NO3 and 36% of CDOM) relative to their duration. Principal components analysis of potential hydroclimatological controls on nutrient fluxes demonstrated that a small number of components, representing >90% of variance in the dataset, were highly significant model predictors of inter-event variability in catchment nutrient export. Hysteresis analysis of nutrient concentration-discharge relationships suggested spatially discrete source zones existed for NO3 and FDOM, and that activation of these zones varied on an event-specific basis. Our results highlight the benefits of high-frequency in-situ monitoring for characterising complex short-term nutrient dynamics and unravelling connections between hydroclimatological variability and river nutrient export and source zone activation under extreme flow conditions. These new process-based insights are fundamental to underpinning the development of targeted management measures to reduce nutrient loading of surface waters.
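The hysteresis analysis of concentration-discharge relationships mentioned above can be illustrated with a simple event-scale index (this is a generic textbook-style index, not necessarily the authors' exact metric):

```python
def hysteresis_index(discharge, concentration):
    """Crude hysteresis index for one storm event: mean normalised
    concentration on the rising limb minus that on the falling limb.
    Positive values indicate clockwise hysteresis (near-channel,
    rapidly mobilised sources); negative values indicate
    anticlockwise hysteresis (more distal source zones)."""
    peak = discharge.index(max(discharge))
    cmin, cmax = min(concentration), max(concentration)
    norm = [(c - cmin) / (cmax - cmin) for c in concentration]
    rising = norm[:peak + 1]
    falling = norm[peak:]
    return sum(rising) / len(rising) - sum(falling) / len(falling)
```

Applied per event and per solute (NO3, FDOM), such an index lets one compare source-zone activation across the 29 storm events.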
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghosh, D.; Sarkar, S.; Sen, S.
1995-06-01
In this paper the behavior of factorial moments with rapidity window size, which is usually explained in terms of "intermittency," has been interpreted through simple quantum statistical properties of the emitting system, using the concept of a "modified two-source model" as recently proposed by Ghosh and Sarkar [Phys. Lett. B 278, 465 (1992)]. The analysis has been performed using our own data on 16O-Ag/Br and 24Mg-Ag/Br interactions in the energy regime of a few tens of GeV.
Acoustic constituents of prosodic typology
NASA Astrophysics Data System (ADS)
Komatsu, Masahiko
Different languages sound different, and a considerable part of that difference derives from the typological difference of prosody. Although such difference is often described in terms of lexical accent types (stress accent, pitch accent, and tone; e.g. English, Japanese, and Chinese, respectively) and rhythm types (stress-, syllable-, and mora-timed rhythms; e.g. English, Spanish, and Japanese, respectively), it is unclear whether these types are determined in terms of acoustic properties. The thesis intends to provide a potential basis for the description of prosody in terms of acoustics. It argues, through several experimental-phonetic studies, for the hypothesis that the source component of the source-filter model (acoustic features) approximately corresponds to prosody (linguistic features). The study consists of four parts. (1) Preliminary experiment: Perceptual language identification tests were performed using English and Japanese speech samples whose frequency spectral information (i.e. the non-source component) is heavily reduced. The results indicated that humans can discriminate languages with such signals. (2) Discussion of the linguistic information that the source component contains: This part constitutes the foundation of the argument of the thesis. Perception tests of consonants with the source signal indicated that the source component carries information on broad categories of phonemes that contributes to the creation of rhythm. (3) Acoustic analysis: Speech samples of Chinese, English, Japanese, and Spanish, differing in prosodic type, were analyzed. These languages showed differences in the acoustic characteristics of the source component. (4) Perceptual experiment: A language identification test for the above four languages was performed using the source signal with its acoustic features parameterized. It revealed that humans can discriminate prosodic types solely from the source features, and that the discrimination becomes easier as acoustic information increases.
The series of studies showed the correspondence of the source component to prosodic features. In linguistics, prosodic types have not been discussed purely in terms of acoustics; they are usually related to the function of prosody or phonological units such as phonemes. The present thesis focuses on acoustics and makes a contribution to establishing the crosslinguistic description system of prosody.
IGR J16318-4848: 7 Years of INTEGRAL Observations
NASA Technical Reports Server (NTRS)
Barragan, Laura; Wilms, Joern; kreykenbohm, Ingo; Hanke, manfred; Fuerst, Felix; Pottschmidt, Katja; Rothschild, Richard
2011-01-01
Since the discovery of IGR J16318-4848 in 2003 January, INTEGRAL has accumulated more than 5.8 Ms of IBIS/ISGRI data. We present the first extensive analysis of the archival INTEGRAL data (IBIS/ISGRI, and JEM-X when available) for this source, together with the observations carried out by XMM-Newton (twice in 2003 and twice in 2004) and Suzaku (2006). The source is very variable in the long term, with periods of low activity, in which the source is barely detected, and flares with a luminosity approximately 10 times greater than its average value (5.4 cts/s). IGR J16318-4848 is an HMXB containing a sgB[e] star and a compact object (most probably a neutron star) deeply embedded in the stellar wind of the mass donor. The variability of the source (also in the short term) can be ascribed to the wind of the optical star being very clumpy. We study the variation of the spectral parameters on time scales of INTEGRAL revolutions. The photoelectric absorption, with NH around 10^24 cm^-2, is unusually high. During brighter phases the strong K-alpha iron line known from the XMM-Newton and Suzaku observations is also detectable with the JEM-X instrument.
Porous elastic system with nonlinear damping and sources terms
NASA Astrophysics Data System (ADS)
Freitas, Mirelson M.; Santos, M. L.; Langa, José A.
2018-02-01
We study the long-time behavior of a porous-elastic system, focusing on the interplay between nonlinear damping and source terms. The sources may represent restoring forces, but may also be focusing, thus potentially amplifying the total energy, which is the primary scenario of interest. By employing nonlinear semigroups and the theory of monotone operators, we obtain several results on the existence of local and global weak solutions, and on the uniqueness of weak solutions. Moreover, we prove that such unique solutions depend continuously on the initial data. Under some restrictions on the parameters, we also prove that every weak solution to our system blows up in finite time, provided the initial energy is negative and the sources are more dominant than the damping in the system. Additional results are obtained via careful analysis involving the Nehari manifold. Specifically, we prove the existence of a unique global weak solution with initial data coming from the "good" part of the potential well. For such a global solution, we prove that the total energy of the system decays exponentially or algebraically, depending on the behavior of the dissipation in the system near the origin. We also prove the existence of a global attractor.
Bayesian estimation of a source term of radiation release with approximately known nuclide ratios
NASA Astrophysics Data System (ADS)
Tichý, Ondřej; Šmídl, Václav; Hofman, Radek
2016-04-01
We are concerned with estimation of a source term in case of an accidental release from a known location, e.g. a power plant. Usually, the source term of an accidental release of radiation comprises a mixture of nuclides. The gamma dose rate measurements do not provide direct information on the source term composition. However, physical properties of the respective nuclides (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g. from the known reactor inventory. The proposed method is based on a linear inverse model where the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned, and further regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to uncertainty in the ratios, the diagonal elements of the covariance matrix are considered unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following the Bayesian approach, we estimate all parameters of the model from the data, so that y, M, and the known ratios are the only inputs of the method. Since the inference of the model is intractable, we follow the Variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on a simulated 6-hour power plant release where 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with a method assuming unknown nuclide ratios is given to demonstrate the usefulness of the proposed approach.
This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
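The core of the inverse problem above is recovering a nonnegative x from y = Mx. As a minimal stand-in for the truncated-Gaussian Variational Bayes inference (which we do not reproduce here), a projected gradient descent enforcing x ≥ 0 illustrates the structure:

```python
def estimate_source(M, y, step=0.05, iters=2000):
    """Nonnegative least-squares estimate of x in y = M x via projected
    gradient descent: repeat x <- max(x - step * 2 M^T (M x - y), 0).
    M is a list of rows; a sketch, not the paper's Bayesian algorithm."""
    n = len(M[0])
    x = [0.0] * n
    for _ in range(iters):
        r = [sum(M[i][j] * x[j] for j in range(n)) - y[i]
             for i in range(len(y))]                       # residual M x - y
        grad = [2.0 * sum(M[i][j] * r[i] for i in range(len(y)))
                for j in range(n)]                         # 2 M^T r
        x = [max(xj - step * gj, 0.0) for xj, gj in zip(x, grad)]
    return x
```

The Bayesian treatment in the paper replaces the hard positivity projection with a truncated Gaussian posterior, and additionally infers the prior covariance built from the approximately known nuclide ratios.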
Coarse Grid CFD for underresolved simulation
NASA Astrophysics Data System (ADS)
Class, Andreas G.; Viellieber, Mathias O.; Himmel, Steffen R.
2010-11-01
CFD simulation of the complete reactor core of a nuclear power plant requires exceedingly large computational resources, so this brute-force approach has not been pursued yet. The traditional approach is 1D subchannel analysis employing calibrated transport models. Coarse grid CFD is an attractive alternative technique based on strongly under-resolved CFD and the inviscid Euler equations. Obviously, using inviscid equations and coarse grids does not resolve all the physics, requiring additional volumetric source terms that model viscosity and other sub-grid effects. The source terms are implemented via correlations derived from fully resolved representative simulations, which can be tabulated or computed on the fly. The technique is demonstrated for a Carnot diffusor and a wire-wrap fuel assembly [1]. [1] Himmel, S.R., PhD thesis, Stuttgart University, Germany, 2009, http://bibliothek.fzk.de/zb/berichte/FZKA7468.pdf
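The tabulated-correlation idea can be sketched as a simple lookup: a fully resolved reference simulation yields a table of (flow parameter, volumetric source) pairs, which the coarse solver interpolates at run time. This is an illustrative one-parameter version; the actual correlations are multidimensional:

```python
def tabulated_source(value, table):
    """Volumetric source term from a lookup table of fully resolved
    reference simulations, by linear interpolation. `table` is a
    sorted list of (parameter, source) pairs; ends are clamped."""
    if value <= table[0][0]:
        return table[0][1]
    if value >= table[-1][0]:
        return table[-1][1]
    for (x0, s0), (x1, s1) in zip(table, table[1:]):
        if x0 <= value <= x1:
            t = (value - x0) / (x1 - x0)
            return s0 + t * (s1 - s0)
```

Each coarse cell would query such a table (e.g. by a local Reynolds-like parameter) to recover the momentum sink that the unresolved viscous physics would have produced.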
NASA Technical Reports Server (NTRS)
Shyy, W.; Thakur, S.; Udaykumar, H. S.
1993-01-01
A high-accuracy convection scheme using a sequential solution technique has been developed and applied to simulate longitudinal combustion instability and its active control. The scheme has been devised in the spirit of the Total Variation Diminishing (TVD) concept with special source term treatment. Because of the substantial heat release effect, a clear delineation has been made of the key elements employed by the scheme, i.e., the adjustable damping factor and the source term treatment. Compared with the first-order upwind scheme utilized previously, the present results exhibit less damping and are free from spurious oscillations, offering improved quantitative accuracy while confirming the spectral analysis reported earlier. A simple feedback type of active control has been found to be capable of enhancing or attenuating the magnitude of the combustion instability.
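A generic flavor of a TVD convection update with an explicit source term can be sketched with a minmod-limited upwind step (this is a standard MUSCL-type sketch for u_t + a u_x = S, not the paper's exact scheme or its heat-release source treatment):

```python
def minmod(a, b):
    """Minmod slope limiter: the smaller-magnitude argument when the
    signs agree, zero otherwise (suppresses spurious oscillations)."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def tvd_step(u, a, dx, dt, source):
    """One explicit step of u_t + a u_x = source (assumes a > 0), with
    minmod-limited, left-biased upwind fluxes; `source` maps a cell
    index to S_i. Boundary cells are left untouched."""
    n = len(u)
    slope = [0.0] + [minmod(u[i] - u[i - 1], u[i + 1] - u[i])
                     for i in range(1, n - 1)] + [0.0]
    flux = [a * (u[i] + 0.5 * slope[i]) for i in range(n)]
    new = list(u)
    for i in range(1, n):
        new[i] = u[i] - dt / dx * (flux[i] - flux[i - 1]) + dt * source(i)
    return new
```

The limiter keeps the update free of spurious oscillations near steep gradients, which is the property the abstract contrasts against the overly damped first-order upwind scheme.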
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2014 CFR
2014-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2014-01-01 2014-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2012 CFR
2012-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2012-01-01 2012-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2010 CFR
2010-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2010-01-01 2010-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2013 CFR
2013-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2013-01-01 2013-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2011 CFR
2011-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2011-01-01 2011-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
TRANSIENT X-RAY SOURCE POPULATION IN THE MAGELLANIC-TYPE GALAXY NGC 55
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jithesh, V.; Wang, Zhongxiang, E-mail: jithesh@shao.ac.cn
2016-04-10
We present the spectral and temporal properties of 15 candidate transient X-ray sources detected in archival XMM-Newton and Chandra observations of the nearby Magellanic-type SB(s)m galaxy NGC 55. Based on an X-ray color classification scheme, the majority of the sources may be identified as X-ray binaries (XRBs), and six sources are soft, including a likely supernova remnant. We perform a detailed spectral and variability analysis of the data for two bright candidate XRBs. Both sources displayed strong short-term X-ray variability, and their X-ray spectra and hardness ratios are consistent with those of XRBs. These results, combined with their high X-ray luminosities (~10^38 erg s^-1), strongly suggest that they are black hole (BH) binaries. Seven less luminous sources have spectral properties consistent with those of neutron star or BH XRBs in both normal and high-rate accretion modes, but one of them is the likely counterpart to a background galaxy (because of positional coincidence). From our spectral analysis, we find that the six soft sources are candidate super soft sources (SSSs) with dominant emission in the soft (0.3-2 keV) X-ray band. Archival Hubble Space Telescope optical images are available for seven sources, and the data suggest that most of them are likely to be high-mass XRBs. Our analysis has revealed the heterogeneous nature of the transient population in NGC 55 (six high-mass XRBs, one low-mass XRB, six SSSs, one active galactic nucleus), helping establish the similarity of the X-ray properties of this galaxy to those of other Magellanic-type galaxies.
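The X-ray color classification rests on hardness ratios, which in their simplest form compare counts in two bands (the band edges and any normalization are survey-specific assumptions, not taken from this paper):

```python
def hardness_ratio(soft, hard):
    """X-ray hardness ratio HR = (H - S) / (H + S), with S and H the
    background-subtracted counts in the soft and hard bands.
    HR runs from -1 (all soft) to +1 (all hard)."""
    return (hard - soft) / (hard + soft)
```

Sources clustering at low HR would fall into the "soft" / SSS region of the color scheme, while XRB candidates typically sit at harder colors.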
Novel techniques for characterization of hydrocarbon emission sources in the Barnett Shale
NASA Astrophysics Data System (ADS)
Nathan, Brian Joseph
Changes in ambient atmospheric hydrocarbon concentrations can have both short-term and long-term effects on the atmosphere and on human health. Thus, accurate characterization of emissions sources is critically important. The recent boom in shale gas production has led to an increase in hydrocarbon emissions from associated processes, though the exact extent is uncertain. As an original quantification technique, a model airplane equipped with a specially-designed, open-path methane sensor was flown multiple times over a natural gas compressor station in the Barnett Shale in October 2013. A linear optimization was introduced to a standard Gaussian plume model in an effort to determine the most probable emission rate coming from the station. This is shown to be a suitable approach given an ideal source with a single, central plume. Separately, an analysis was performed to characterize the nonmethane hydrocarbons in the Barnett during the same period. Starting with ambient hourly concentration measurements of forty-six hydrocarbon species, Lagrangian air parcel trajectories were implemented in a meteorological model to extend the resolution of these measurements and achieve domain-fillings of the region for the period of interest. A self-organizing map (a type of unsupervised classification) was then utilized to reduce the dimensionality of the total multivariate set of grids into characteristic one-dimensional signatures. By also introducing a self-organizing map classification of the contemporary wind measurements, the spatial hydrocarbon characterizations are analyzed for periods with similar wind conditions. The accuracy of the classification is verified through assessment of observed spatial mixing ratio enhancements of key species, through site-comparisons with a related long-term study, and through a random forest analysis (an ensemble learning method of supervised classification) to determine the most important species for defining key classes. 
The hydrocarbon classification is shown to have performed very well in identifying expected signatures near and downwind of oil and gas facilities with active permits, which showcases this method's usefulness for future regional hydrocarbon source-apportionment analyses.
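Because the emission rate enters a Gaussian plume model linearly, the "linear optimization" step described above reduces to a one-parameter least-squares fit: compute the dispersion factor g_i at each measurement location, then project the concentrations onto it. A minimal sketch assuming the standard textbook plume form (parameter names are ours, not the authors'):

```python
import math

def plume_factor(y, z, u, sigma_y, sigma_z, H):
    """Gaussian plume dispersion factor (concentration per unit emission
    rate) at crosswind offset y and height z, for wind speed u, plume
    spreads sigma_y / sigma_z, and effective release height H, with the
    usual ground-reflection term."""
    return (1.0 / (2.0 * math.pi * u * sigma_y * sigma_z)
            * math.exp(-0.5 * (y / sigma_y) ** 2)
            * (math.exp(-0.5 * ((z - H) / sigma_z) ** 2)
               + math.exp(-0.5 * ((z + H) / sigma_z) ** 2)))

def estimate_rate(factors, concentrations):
    """Least-squares emission rate Q for c_i ~ Q * g_i; since the rate
    is linear in the model, the optimum is a closed-form projection."""
    return (sum(g * c for g, c in zip(factors, concentrations))
            / sum(g * g for g in factors))
```

Each airborne methane measurement contributes one (g_i, c_i) pair, and the most probable emission rate follows directly, which is why repeated transects over the compressor station suffice for a single, central plume.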
Analysis of source/drain engineered 22nm FDSOI using high-k spacers
NASA Astrophysics Data System (ADS)
Malviya, Abhishek Kumar; Chauhan, R. K.
2018-04-01
Current classical scaling of devices brings numerous short-channel effects into consideration. In this paper, a novel device structure is proposed that improves on the Modified Source (MS) FDSOI in terms of electrical performance: higher on-current and reduced off-state leakage current, with a higher Ion/Ioff ratio that enables fast switching of low-power nanoelectronic devices. The proposed structure has modified drain and source regions with two different doping profiles at a 22 nm gate length. In the upper part of the engineered regions (MD and MS) the doping concentration is kept high, and it is lower in the lower part. The purpose is to achieve low parasitic capacitance in the source and drain regions by reducing the doping concentration [1].
Propagation of sound waves through a linear shear layer: A closed form solution
NASA Technical Reports Server (NTRS)
Scott, J. N.
1978-01-01
Closed-form solutions are presented for sound propagation from a line source in or near a shear layer. The analysis is exact for all frequencies and was developed assuming a linear velocity profile in the shear layer. This assumption allows the solution to be expressed in terms of parabolic cylinder functions. The solution is presented for a line monopole source first embedded in the uniform flow and then in the shear layer. Solutions are also discussed for certain types of dipole and quadrupole sources. Asymptotic expansions of the exact solutions for small and large values of the Strouhal number give expressions that correspond to solutions previously obtained for these limiting cases.
Sun, Feng-xia; Zhang, Wei-hua; Xu, Ming-gang; Zhang, Wen-ju; Li, Zhao-qiang; Zhang, Jing-ye
2010-11-01
In order to explore the effects of long-term fertilization on the microbiological characteristics of red soil, soil samples were collected from a 19-year long-term experimental field in Qiyang, Hunan, and their microbial biomass carbon (MBC) and nitrogen (MBN) and microbial utilization rates of carbon sources were analyzed. The results showed that after 19 years of fertilization, the soil MBC and MBN under the application of organic manure and of organic manure plus inorganic fertilizers were 231 and 81 mg x kg(-1) soil, and 148 and 73 mg x kg(-1) soil, respectively, being significantly higher than those under non-fertilization, inorganic fertilization, and inorganic fertilization plus straw incorporation. The ratio of soil MBN to total N under the application of organic manure and of organic manure plus inorganic fertilizers averaged 6.0%, significantly higher than that under non-fertilization and inorganic fertilization. Biolog-ECO analysis showed that the average well color development (AWCD) value was in the order of applying organic manure plus inorganic fertilizers = applying organic manure > non-fertilization > inorganic fertilization = inorganic fertilization plus straw incorporation. Under the application of organic manure or of organic manure plus inorganic fertilizers, the microbial utilization rates of carbon sources, including carbohydrates, carboxylic acids, amino acids, polymers, phenols, and amines, increased; while under inorganic fertilization plus straw incorporation, the utilization rate of polymers was the highest, and that of carbohydrates was the lowest. Our results suggested that long-term application of organic manure could increase the red soil MBC, MBN, and microbial utilization rates of carbon sources, improve soil fertility, and maintain better crop productivity.
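The AWCD metric used in the Biolog-ECO analysis above is simply the mean blank-corrected absorbance across the plate's substrate wells. A minimal sketch (clipping negatives to zero is common practice but an assumption here):

```python
def awcd(absorbances, blank):
    """Average well color development (AWCD) of one Biolog-ECO plate
    reading: mean blank-corrected absorbance over the substrate wells,
    with negative corrected values clipped to zero."""
    return sum(max(a - blank, 0.0) for a in absorbances) / len(absorbances)
```

Tracking AWCD over incubation time for each treatment yields the ordering of microbial carbon-source utilization reported in the abstract.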
Fuks, Kateryna B; Weinmayr, Gudrun; Hennig, Frauke; Tzivian, Lilian; Moebus, Susanne; Jakobs, Hermann; Memmesheimer, Michael; Kälsch, Hagen; Andrich, Silke; Nonnemacher, Michael; Erbel, Raimund; Jöckel, Karl-Heinz; Hoffmann, Barbara
2016-08-01
Long-term exposure to fine particulate matter (PM2.5) may lead to increased blood pressure (BP). The role of industry- and traffic-specific PM2.5 remains unclear. We investigated the associations of residential long-term source-specific PM2.5 exposure with arterial BP and incident hypertension in the population-based Heinz Nixdorf Recall cohort study. We defined hypertension as systolic BP ≥140 mmHg, or diastolic BP ≥90 mmHg, or current use of BP-lowering medication. Long-term concentrations of PM2.5 from all local sources (PM2.5ALL), local industry (PM2.5IND) and traffic (PM2.5TRA) were modeled with a dispersion and chemistry transport model (EURAD-CTM) at a 1 km(2) resolution. We performed a cross-sectional analysis with BP and prevalent hypertension at baseline, using linear and logistic regression, respectively, and a longitudinal analysis with incident hypertension at 5-year follow-up, using Poisson regression with robust variance estimation. We adjusted for age, sex, body mass index, lifestyle, education, and major road proximity. Change in BP (mmHg), odds ratio (OR) and relative risk (RR) for hypertension were calculated per 1 μg/m(3) of exposure concentration. PM2.5ALL was highly correlated with PM2.5IND (Spearman's ρ=0.92) and moderately with PM2.5TRA (ρ=0.42). In adjusted cross-sectional analysis with 4539 participants, we found positive associations of PM2.5ALL with systolic (0.42 [95%-CI: 0.03, 0.80]) and diastolic (0.25 [0.04, 0.46]) BP. Higher, but less precise estimates were found for PM2.5IND (systolic: 0.55 [-0.05, 1.14]; diastolic: 0.35 [0.03, 0.67]) and PM2.5TRA (systolic: 0.88 [-1.55, 3.31]; diastolic: 0.41 [-0.91, 1.73]). We found a crude positive association of PM2.5TRA with prevalence (OR 1.41 [1.10, 1.80]) and incidence of hypertension (RR 1.38 [1.03, 1.85]), attenuating after adjustment (OR 1.19 [0.90, 1.58] and RR 1.28 [0.94, 1.72]). We found no association of PM2.5ALL and PM2.5IND with hypertension.
Long-term exposures to all-source and industry-specific PM2.5 were positively related to BP. We could not separate the effects of industry-specific PM2.5 from all-source PM2.5. Estimates with traffic-specific PM2.5 were generally higher but inconclusive. Copyright © 2016. Published by Elsevier GmbH.
Aström, Johan; Pettersson, Thomas J R; Reischer, Georg H; Hermansson, Malte
2013-09-01
The protection of drinking water from pathogens such as Cryptosporidium and Giardia requires an understanding of the short-term microbial release from faecal contamination sources in the catchment. Flow-weighted samples were collected during two rainfall events in a stream draining an area with on-site sewers and during two rainfall events in surface runoff from a cattle pasture. Samples were analysed for human (BacH) and ruminant (BacR) Bacteroidales genetic markers through quantitative polymerase chain reaction (qPCR) and for sorbitol-fermenting bifidobacteria through culturing as a complement to traditional faecal indicator bacteria, somatic coliphages and the parasitic protozoa Cryptosporidium spp. and Giardia spp. analysed by standard methods. Significant positive correlations were observed between BacH, Escherichia coli, intestinal enterococci, sulphite-reducing Clostridia, turbidity, conductivity and UV254 in the stream contaminated by on-site sewers. For the cattle pasture, no correlation was found between any of the genetic markers and the other parameters. Although parasitic protozoa were not detected, the analysis for genetic markers provided baseline data on the short-term faecal contamination due to these potential sources of parasites. Background levels of BacH and BacR markers in soil emphasise the need to include soil reference samples in qPCR-based analyses for Bacteroidales genetic markers.
Templeton, David W.; Sluiter, Justin B.; Sluiter, Amie; ...
2016-10-18
In an effort to find economical, carbon-neutral transportation fuels, biomass feedstock compositional analysis methods are used to monitor, compare, and improve biofuel conversion processes. These methods are empirical, and the analytical variability seen in the feedstock compositional data propagates into variability in the conversion yields, component balances, mass balances, and ultimately the minimum ethanol selling price (MESP). We report the average composition and standard deviations of 119 individually extracted National Institute of Standards and Technology (NIST) bagasse [Reference Material (RM) 8491] run by seven analysts over 7 years. Two additional datasets, using bulk-extracted bagasse (containing 58 and 291 replicates each), were examined to separate out the effects of batch, analyst, sugar recovery standard calculation method, and extractions from the total analytical variability seen in the individually extracted dataset. We believe this is the world's largest NIST bagasse compositional analysis dataset and it provides unique insight into the long-term analytical variability. Understanding the long-term variability of the feedstock analysis will help determine the minimum difference that can be detected in yield, mass balance, and efficiency calculations. The long-term data show consistent bagasse component values through time and by different analysts. This suggests that the standard compositional analysis methods were performed consistently and that the bagasse RM itself remained unchanged during this time period. The long-term variability seen here is generally higher than short-term variabilities. It is worth noting that the effect of short-term or long-term feedstock compositional variability on MESP is small, about $0.03 per gallon. The long-term analysis variabilities reported here are plausible minimum values for these methods, though not necessarily average or expected variabilities.
We must emphasize the importance of the training and good analytical procedures needed to generate these data. When combined with a robust QA/QC oversight protocol, these empirical methods can be relied upon to generate high-quality data over a long period of time.
NASA Astrophysics Data System (ADS)
Karamehmedović, Mirza; Kirkeby, Adrian; Knudsen, Kim
2018-06-01
We consider the multi-frequency inverse source problem for the scalar Helmholtz equation in the plane. The goal is to reconstruct the source term in the equation from measurements of the solution on a surface outside the support of the source. We study the problem in a certain finite dimensional setting: from measurements made at a finite set of frequencies we uniquely determine and reconstruct sources in a subspace spanned by finitely many Fourier–Bessel functions. Further, we obtain a constructive criterion for identifying a minimal set of measurement frequencies sufficient for reconstruction, and under an additional, mild assumption, the reconstruction method is shown to be stable. Our analysis is based on a singular value decomposition of the source-to-measurement forward operators and the distribution of positive zeros of the Bessel functions of the first kind. The reconstruction method is implemented numerically and our theoretical findings are supported by numerical experiments.
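The SVD-based reconstruction idea can be sketched on a generic discretized problem. Everything below is an illustrative stand-in: a random matrix plays the role of the source-to-measurement forward operator, not the actual Helmholtz forward map or Fourier–Bessel basis of the paper.

```python
import numpy as np

# Toy linear inverse source problem: a forward operator A maps source
# coefficients to boundary measurements; we recover the coefficients from
# slightly noisy data by truncating small singular values of A.
rng = np.random.default_rng(0)
n_coeff, n_meas = 8, 32
A = rng.standard_normal((n_meas, n_coeff))           # stand-in forward operator
x_true = rng.standard_normal(n_coeff)                # true source coefficients
y = A @ x_true + 1e-6 * rng.standard_normal(n_meas)  # noisy measurements

U, s, Vt = np.linalg.svd(A, full_matrices=False)
keep = s > 1e-8 * s[0]                               # truncate tiny singular values
x_rec = Vt.T[:, keep] @ ((U[:, keep].T @ y) / s[keep])

print(np.allclose(x_rec, x_true, atol=1e-3))         # stable reconstruction
```

In the paper's setting the singular values come from the Bessel-function structure of the forward operator, which is what ties stability to the choice of measurement frequencies.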
The time variability of Jupiter's synchrotron radiation
NASA Astrophysics Data System (ADS)
Bolton, Scott Jay
1991-02-01
The time variability of the Jovian synchrotron emission is investigated by analyzing radio observations of Jupiter at decimetric wavelengths. The observations are composed from two distinct sets of measurements addressing both short term (days to weeks) and long term (months to years) variability. The study of long term variations utilizes a set of measurements made several times each month with the NASA Deep Space Network (DSN) antennas operating at 2295 MHz (13.1 cm). The DSN data set, covering 1971 through 1985, is compared with a set of measurements of the solar wind from a number of Earth orbiting spacecraft. The analysis indicates a maximum correlation between the synchrotron emission and the solar wind ram pressure with a two year time lag. Physical mechanisms affecting the synchrotron emission are discussed with an emphasis on radial diffusion. Calculations are performed that suggest the correlation is consistent with inward adiabatic diffusion of solar wind particles driven by Brice's model of ionospheric neutral wind convection (Brice 1972). The implication is that the solar wind could be a source of particles of Jupiter's radiation belts. The investigation of short term variability focuses on a three year Jupiter observing program using the University of California's Hat Creek radio telescope operating at 1400 MHz (21 cm). Measurements are made every two days during the months surrounding opposition. Results from the three year program suggest short term variability near the 10-20 percent level but should be considered inconclusive due to scheduling and observational limitations. A discussion of magnetospheric processes on short term timescales identifies wave-particle interactions as a candidate source. Further analysis finds that the short term variations could be related to whistler mode wave-particle interactions in the radiation belts associated with atmospheric lightning on Jupiter.
However, theoretical calculations on wave-particle interactions impose constraints if whistler mode waves are to interact with the synchrotron-emitting electrons.
PANDORA: keyword-based analysis of protein sets by integration of annotation sources.
Kaplan, Noam; Vaaknin, Avishay; Linial, Michal
2003-10-01
Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing the testing of thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge on individual proteins.
Multi-criteria analysis for PM10 planning
NASA Astrophysics Data System (ADS)
Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa
To implement sound air quality policies, Regulatory Agencies require tools to evaluate outcomes and costs associated with different emission reduction strategies. These tools are even more useful when considering atmospheric PM10 concentrations due to the complex nonlinear processes that affect production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
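The non-dominated scenario set that such a multi-objective solution delivers can be illustrated with a minimal sketch; the scenario names, indicator values, and costs below are invented:

```python
# Extract the non-dominated (Pareto) set from candidate emission-reduction
# scenarios, each scored by two objectives to be minimized:
# an air-quality indicator and a technology cost.
scenarios = {
    "A": (40.0, 10.0),   # (PM10 indicator, cost) -- all numbers invented
    "B": (35.0, 14.0),
    "C": (38.0, 16.0),   # worse than B in both objectives
    "D": (30.0, 25.0),
}

def dominates(p, q):
    """p dominates q if p is no worse in every objective and better in one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

pareto = {k for k, v in scenarios.items()
          if not any(dominates(w, v) for w in scenarios.values() if w != v)}
print(sorted(pareto))  # -> ['A', 'B', 'D'] (C is dominated by B)
```

Each surviving scenario is an efficient trade-off: improving its air-quality indicator is impossible without paying a higher emission-reduction cost.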
Hanford Site Composite Analysis Technical Approach Description: Vadose Zone
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, M. D.; Nichols, W. E.; Ali, A.
2017-10-30
The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, and DOE M 435.1 Chg 1, Radioactive Waste Management Manual, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure that disposal facility authorization will not result in long-term compliance problems, or to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.
Piecewise synonyms for enhanced UMLS source terminology integration.
Huang, Kuo-Chuan; Geller, James; Halper, Michael; Cimino, James J
2007-10-11
The UMLS contains more than 100 source vocabularies and is growing via the integration of others. When integrating a new source, the source terms already in the UMLS must first be found. The easiest approach to this is simple string matching. However, string matching usually does not find all concepts that should be found. A new methodology, based on the notion of piecewise synonyms, for enhancing the process of concept discovery in the UMLS is presented. This methodology is supported by first creating a general synonym dictionary based on the UMLS. Each multi-word source term is decomposed into its component words, allowing for the generation of separate synonyms for each word from the general synonym dictionary. The recombination of these synonyms into new terms creates an expanded pool of matching candidates for terms from the source. The methodology is demonstrated with respect to an existing UMLS source. It shows a 34% improvement over simple string matching.
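The recombination step can be sketched as follows, with an invented two-entry synonym dictionary (the actual method builds its general synonym dictionary from the UMLS itself):

```python
from itertools import product

# Piecewise-synonym expansion: decompose a multi-word term into words,
# look up synonyms for each word, and recombine them into an expanded
# pool of candidate strings for matching. Dictionary contents are
# illustrative, not drawn from the UMLS.
synonyms = {
    "kidney": ["kidney", "renal"],
    "failure": ["failure", "insufficiency"],
}

def candidate_terms(term):
    """Generate all recombinations of per-word synonyms for a term."""
    word_lists = [synonyms.get(w, [w]) for w in term.lower().split()]
    return {" ".join(combo) for combo in product(*word_lists)}

print(sorted(candidate_terms("kidney failure")))
# -> ['kidney failure', 'kidney insufficiency', 'renal failure', 'renal insufficiency']
```

Matching each candidate against existing UMLS strings then catches concepts such as "renal insufficiency" that exact string matching on "kidney failure" would miss.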
Factors influencing atmospheric composition over subarctic North America during summer
NASA Technical Reports Server (NTRS)
Wofsy, Steven C.; Fan, S. -M.; Blake, D. R.; Bradshaw, J. D.; Sandholm, S. T.; Singh, H. B.; Sachse, G. W.; Harriss, R. C.
1994-01-01
Elevated concentrations of hydrocarbons, CO, and nitrogen oxides were observed in extensive haze layers over northeastern Canada in the summer of 1990, during ABLE 3B. Halocarbon concentrations remained near background in most layers, indicating a source from biomass wildfires. Elevated concentrations of C2Cl4 provided a sensitive indicator for pollution from urban/industrial sources. Detailed analysis of regional budgets for CO and hydrocarbons indicates that biomass fires accounted for approximately 70% of the input to the subarctic for most hydrocarbons and for acetone, and more than 50% for CO. Regional sources for many species (including CO) exceeded chemical sinks during summer, and the boreal region provided a net source to midlatitudes. Interannual variations and long-term trends in atmospheric composition are sensitive to climatic change; a shift to warmer, drier conditions could increase the areas burned and thus the sources of many trace gases.
Orejas, Jaime; Pfeuffer, Kevin P; Ray, Steven J; Pisonero, Jorge; Sanz-Medel, Alfredo; Hieftje, Gary M
2014-11-01
Ambient desorption/ionization (ADI) sources coupled to mass spectrometry (MS) offer outstanding analytical features: direct analysis of real samples without sample pretreatment, combined with the selectivity and sensitivity of MS. Since ADI sources typically work in the open atmosphere, ambient conditions can affect the desorption and ionization processes. Here, the effects of internal source parameters and ambient humidity on the ionization processes of the flowing atmospheric pressure afterglow (FAPA) source are investigated. The interaction of reagent ions with a range of analytes is studied in terms of sensitivity and based upon the processes that occur in the ionization reactions. The results show that internal parameters which lead to higher gas temperatures afforded higher sensitivities, although fragmentation is also affected. In the case of humidity, only extremely dry conditions led to higher sensitivities, while fragmentation remained unaffected.
The future of meat: a qualitative analysis of cultured meat media coverage.
Goodwin, J N; Shoulders, C W
2013-11-01
This study sought to explore the informational themes and information sources cited by the media to cover stories of cultured meat in both the United States and the European Union. The results indicated that cultured meat news articles in both the United States and the European Union commonly discuss cultured meat in terms of benefits, history, process, time, livestock production problems, and skepticism. Additionally, the information sources commonly cited in the articles included cultured meat researchers, sources from academia, People for the Ethical Treatment of Animals (PETA), New Harvest, Winston Churchill, restaurant owners/chefs, and sources from the opposing region (e.g., US articles citing some EU sources and vice versa). The implications of this study will allow meat scientists to understand how the media is influencing consumers' perceptions of the topic, and also allow them to strategize how to shape future communication about cultured meat. Published by Elsevier Ltd.
Liu, Haiwei; Zhang, Yan; Zhou, Xue; You, Xiuxuan; Shi, Yi; Xu, Jialai
2017-02-01
Samples of surface soil from tobacco (Nicotiana tabacum L.) fields were analysed for heavy metals and showed the following concentrations (mean of 246 samples, mg/kg): As, 5.10; Cd, 0.11; Cr, 49.49; Cu, 14.72; Hg, 0.08; Ni, 19.28; Pb, 20.20 and Zn, 30.76. The values of the index of geoaccumulation (I geo ) and of the enrichment factor indicated modest enrichment with As, Cd, Cr, Hg, Ni or Pb. Principal component analysis and cluster analysis correctly allocated each investigated element to its source, whether anthropogenic or natural. The results were consistent with estimated inputs of heavy metals from fertilizers, irrigation water and atmospheric deposition. The variation in the concentrations of As, Cd, Cu, Pb and Zn in the soil was mainly due to long-term agricultural practices, and that of Cr and Ni was mainly due to the soil parent material, whereas the source of Hg was industrial activity, which ultimately led to atmospheric deposition. Atmospheric deposition was the main exogenous source of heavy metals, and fertilizers also played an important role in the accumulation of these elements in soil. Identifying the sources of heavy metals in agricultural soils can serve as a basis for appropriate action to control and reduce the addition of heavy metals to cultivated soils.
Cohen, Michael X; Gulbinaite, Rasa
2017-02-15
Steady-state evoked potentials (SSEPs) are rhythmic brain responses to rhythmic sensory stimulation, and are often used to study perceptual and attentional processes. We present a data analysis method for maximizing the signal-to-noise ratio of the narrow-band steady-state response in the frequency and time-frequency domains. The method, termed rhythmic entrainment source separation (RESS), is based on denoising source separation approaches that take advantage of the simultaneous but differential projection of neural activity to multiple electrodes or sensors. Our approach is a combination and extension of existing multivariate source separation methods. We demonstrate that RESS performs well on both simulated and empirical data, and outperforms conventional SSEP analysis methods based on selecting electrodes with the strongest SSEP response, as well as several other linear spatial filters. We also discuss the potential confound of overfitting, whereby the filter captures noise in absence of a signal. Matlab scripts are available to replicate and extend our simulations and methods. We conclude with some practical advice for optimizing SSEP data analyses and interpreting the results. Copyright © 2016 Elsevier Inc. All rights reserved.
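The core of such denoising source separation methods, a generalized eigendecomposition that maximizes narrow-band signal covariance against a reference covariance, can be sketched as follows. The channel count, mixing weights, and noise level are invented, and the narrow-band covariance is idealized as the clean SSEP projection rather than a filtered version of the data:

```python
import numpy as np

# RESS-style spatial filtering sketch: find the channel weighting w that
# maximizes signal covariance S against reference covariance R, i.e. the
# leading eigenvector of R^-1 S.
rng = np.random.default_rng(1)
fs = 1000
t = np.arange(10_000) / fs
mix = np.array([0.2, 1.0, 0.5, -0.3])          # projection of the SSEP onto 4 channels
ssep = np.sin(2 * np.pi * 12 * t)              # 12 Hz steady-state response
data = np.outer(mix, ssep) + 0.3 * rng.standard_normal((4, t.size))

S = np.cov(np.outer(mix, ssep))                # idealized narrow-band covariance
R = np.cov(data)                               # broadband reference covariance
evals, evecs = np.linalg.eig(np.linalg.inv(R) @ S)
w = np.real(evecs[:, np.argmax(np.real(evals))])  # spatial filter

component = w @ data                           # single filtered time series
print(abs(np.corrcoef(component, ssep)[0, 1]) > 0.85)
```

The filtered component recovers the rhythmic response more cleanly than any single electrode, which is the advantage over picking the channel with the strongest SSEP.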
Relationship of oil seep in Kudat Peninsula with surrounding rocks based on geochemical analysis
NASA Astrophysics Data System (ADS)
Izzati Azman, Nurul; Nur Fathiyah Jamaludin, Siti
2017-10-01
This study aims to investigate the relation of the oil seepage in the Sikuati area to the structural and petroleum system of Kudat Peninsula. The abundance of highly carbonaceous rocks with lamination in the Sikuati Member outcrop at Kudat Peninsula may explain the presence of oil seepage in this area. A detailed geochemical analysis of a source rock sample and oil seepage from the Sikuati area was carried out for their characterization and correlation. The hydrocarbon prospectivity of the Sikuati Member source rock is poor to good, with Total Organic Carbon (TOC) values of 0.11% to 1.48%, and the rock is categorized as immature to early mature oil window with Vitrinite Reflectance (VRo) values of 0.43% to 0.50%. Based on biomarker distributions from Gas Chromatography (GC) and Gas Chromatography-Mass Spectrometry (GC-MS) analysis, the source rock sample shows Pr/Ph, CPI and WI of 2.22 to 2.68, 2.17 to 2.19 and 2.46 to 2.74 respectively, indicating that the source rock is immature and of terrestrial origin. The source rock might be rich in carbonaceous organic matter resulting from planktonic/bacterial activity in a fluvial to fluvio-deltaic environment. Overall, the source rock at outcrop level in the Kudat Peninsula is moderately prolific in terms of prospectivity and maturity. However, deeper beneath the surface, more mature source rock can be expected, generating and expelling hydrocarbons that migrate through deep-seated faults beneath the Sikuati area.
pyJac: Analytical Jacobian generator for chemical kinetics
NASA Astrophysics Data System (ADS)
Niemeyer, Kyle E.; Curtis, Nicholas J.; Sung, Chih-Jen
2017-06-01
Accurate simulations of combustion phenomena require the use of detailed chemical kinetics in order to capture limit phenomena such as ignition and extinction as well as predict pollutant formation. However, the chemical kinetic models for hydrocarbon fuels of practical interest typically have large numbers of species and reactions and exhibit high levels of mathematical stiffness in the governing differential equations, particularly for larger fuel molecules. In order to integrate the stiff equations governing chemical kinetics, reactive-flow simulations generally rely on implicit algorithms that require frequent Jacobian matrix evaluations. Some in situ and a posteriori computational diagnostics methods also require accurate Jacobian matrices, including computational singular perturbation and chemical explosive mode analysis. Typically, these are approximated numerically by finite differences, but for larger chemical kinetic models this poses significant computational demands since the number of chemical source term evaluations scales with the square of species count. Furthermore, existing analytical Jacobian tools do not optimize evaluations or support emerging SIMD processors such as GPUs. Here we introduce pyJac, a Python-based open-source program that generates analytical Jacobian matrices for use in chemical kinetics modeling and analysis. In addition to producing the necessary customized source code for evaluating reaction rates (including all modern reaction rate formulations), the chemical source terms, and the Jacobian matrix, pyJac uses an optimized evaluation order to minimize computational and memory operations. As a demonstration, we first establish the correctness of the Jacobian matrices for kinetic models of hydrogen, methane, ethylene, and isopentanol oxidation (with species counts ranging from 13 to 360) by showing agreement within 0.001% of matrices obtained via automatic differentiation.
We then demonstrate the performance achievable on CPUs and GPUs using pyJac via matrix evaluation timing comparisons; the routines produced by pyJac outperformed first-order finite differences by 3-7.5 times and the existing analytical Jacobian software TChem by 1.1-2.2 times on a single-threaded basis. It is noted that TChem is not thread-safe, while pyJac is easily parallelized, and hence can greatly outperform TChem on multicore CPUs. The Jacobian matrix generator we describe here will be useful for reducing the cost of integrating chemical source terms with implicit algorithms in particular and algorithms that require an accurate Jacobian matrix in general. Furthermore, the open-source release of the program and Python-based implementation will enable wide adoption.
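The trade-off between analytical and finite-difference Jacobians can be illustrated on a toy system. This is not pyJac's generated code, just a two-species reversible reaction with a hand-written Jacobian, but it shows why finite differencing costs one perturbed source-term call per species:

```python
import numpy as np

# Toy reversible reaction A <=> B with rate constants kf, kr.
kf, kr = 2.0, 0.5

def source(y):
    """Chemical source term dy/dt for concentrations y = [A, B]."""
    r = kf * y[0] - kr * y[1]
    return np.array([-r, r])

def jacobian_analytical(y):
    """Exact Jacobian d(source)/dy (constant here, since rates are linear)."""
    return np.array([[-kf, kr], [kf, -kr]])

def jacobian_fd(y, eps=1e-7):
    """First-order finite differences: one extra source call per species,
    so the total work grows with the square of the species count."""
    n = len(y)
    J = np.empty((n, n))
    f0 = source(y)
    for j in range(n):
        yp = y.copy()
        yp[j] += eps
        J[:, j] = (source(yp) - f0) / eps
    return J

y = np.array([1.0, 0.3])
err = np.max(np.abs(jacobian_analytical(y) - jacobian_fd(y)))
print(err)  # small; FD is near-exact for this linear system
```

For real kinetic models with hundreds of species and nonlinear, stiff rate laws, the finite-difference error and the quadratic call count are what make generated analytical Jacobians attractive.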
Yin, Zi; Jin, Haosheng; Ma, Tingting; Zhou, Yu; Yu, Min; Jian, Zhixiang
2018-04-30
The optimal management choice, in terms of long-term overall survival (OS) and disease-free survival (DFS), for patients at BCLC very early stage is a matter of debate. A systematic review and meta-analysis was conducted to evaluate the efficacy of liver resection (RES) and radiofrequency ablation (RFA) for single HCC of 2 cm or less. The primary sources of the reviewed studies through December 2017, without restriction on language or region, were Pubmed and Embase. The hazard ratio (HR) was used as a summary statistic for long-term outcomes. A total of 5 studies qualified for inclusion in this quantitative meta-analysis, with a total of 729 HCC patients at BCLC very early stage. Only postoperative 1-year OS was comparable between the RES and RFA groups. For the long-term outcomes of 3-year and 5-year OS, RES was significantly better than RFA, with HRs of 0.64 (95%CI: 0.41, 1.00; P = 0.05) and 0.63 (95%CI: 0.42, 0.95; P = 0.03) respectively. In terms of postoperative DFS, reduced tumor recurrence was observed with RES, and both short- and long-term outcomes favored RES. RES offers better long-term oncologic outcomes than RFA based on current clinical evidence. Copyright © 2018. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Bliss, Donald; Franzoni, Linda; Rouse, Jerry; Manning, Ben
2005-09-01
An analysis method for time-dependent broadband diffuse sound fields in enclosures is described. Beginning with a formulation utilizing time-dependent broadband intensity boundary sources, the strength of these wall sources is expanded in a series in powers of an absorption parameter, thereby giving a separate boundary integral problem for each power. The temporal behavior is characterized by a Taylor expansion in the delay time for a source to influence an evaluation point. The lowest-order problem has a uniform interior field proportional to the reciprocal of the absorption parameter, as expected, and exhibits relatively slow exponential decay. The next-order problem gives a mean-square pressure distribution that is independent of the absorption parameter and is primarily responsible for the spatial variation of the reverberant field. This problem, which is driven by input sources and the lowest-order reverberant field, depends on source location and the spatial distribution of absorption. Additional problems proceed at integer powers of the absorption parameter, but are essentially higher-order corrections to the spatial variation. Temporal behavior is expressed in terms of an eigenvalue problem, with boundary source strength distributions expressed as eigenmodes. Solutions exhibit rapid short-time spatial redistribution followed by long-time decay of a predominant spatial mode.
NASA Astrophysics Data System (ADS)
Weatherill, Graeme; Burton, Paul W.
2010-09-01
The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach bring a degree of objectivity and reproducibility to the otherwise subjective process of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation (NGA) model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the NGA model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models.
Site condition and fault-type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak Ground Accelerations for some sites in these regions reach as high as 500-600 cm s -2 using European/NGA attenuation models, and 400-500 cm s -2 using Greek attenuation models.
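The Monte Carlo hazard procedure described above can be sketched in a few lines. Everything below is an invented toy (the Gutenberg-Richter parameters, the attenuation relation `gmpe_pga`, the distance sampling), not the study's Aegean source models or its Greek/European/NGA attenuation relations:

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Invented toy inputs (not the study's models) ---
years = 20_000                      # length of the synthetic catalogue
rate = 5.0                          # mean events/year above m_min (Poisson)
b, m_min, m_max = 1.0, 4.5, 7.5     # truncated Gutenberg-Richter law
dist_range_km = (5.0, 80.0)         # site-to-source distances, uniform

def sample_magnitudes(n):
    """Invert the truncated Gutenberg-Richter CDF."""
    beta = b * np.log(10.0)
    c = 1.0 - np.exp(-beta * (m_max - m_min))
    return m_min - np.log(1.0 - rng.random(n) * c) / beta

def gmpe_pga(m, r_km):
    """Toy attenuation relation: PGA in g with aleatory sigma_lnPGA = 0.6."""
    ln_pga = -3.5 + 1.0 * m - 1.2 * np.log(r_km + 10.0)
    return np.exp(ln_pga + rng.normal(0.0, 0.6, size=m.shape))

# One long synthetic catalogue; keep the annual maximum PGA at the site.
annual_max = np.zeros(years)
for y, n_eq in enumerate(rng.poisson(rate, size=years)):
    if n_eq:
        m = sample_magnitudes(n_eq)
        r = rng.uniform(*dist_range_km, size=n_eq)
        annual_max[y] = gmpe_pga(m, r).max()

# 10% in 50 years corresponds to an annual exceedance probability of
# about 0.0021, i.e. the familiar 475-year return period.
p_annual = 1.0 - (1.0 - 0.10) ** (1.0 / 50.0)
pga = np.quantile(annual_max, 1.0 - p_annual)
print(f"10%-in-50-years PGA (toy): {pga:.3f} g")
```

The same skeleton extends to epistemic uncertainty by drawing each simulated catalogue's source model and attenuation relation from a weighted set of alternatives.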
Caroline Leland; John Hom; Nicholas Skowronski; F. Thomas Ledig; Paul J. Krusic; Edward R. Cook; Dario Martin-Benito; Javier Martin-Fernandez; Neil Pederson; Dusan Gomory
2016-01-01
Provenance studies are an increasingly important analog for understanding how trees adapted to particular climatic conditions might respond to climate change. Dendrochronological analysis can illuminate differences among trees from different seed sources in terms of absolute annual growth and sensitivity to external growth factors. We analyzed annual radial growth of...
Convenience or Credibility? A Study of College Student Online Research Behaviors
ERIC Educational Resources Information Center
Biddix, J. Patrick; Chung, Chung Joo; Park, Han Woo
2011-01-01
The purpose of this study was to investigate where students turn for course-related assignments, whether an ordered pattern could be described in terms of which sources students turn to and how students evaluated the information they chose to use. Data were drawn from open-ended questionnaires (n = 282). Semantic network analysis was conducted…
Processing the Army’s Wartime Replacements: The Preferred CONUS Replacement Center Concept.
1987-12-01
Replacement System. The first model, the macro model, was a network flow model which was used to analyze the flow of replacements from their source through...individual CRCs. Through the analysis of the macro model, recommendations were made on how the CRC system should be configured in terms of size, location
An Open Source Agenda for Research Linking Text and Image Content Features.
ERIC Educational Resources Information Center
Goodrum, Abby A.; Rorvig, Mark E.; Jeong, Ki-Tai; Suresh, Chitturi
2001-01-01
Proposes methods to utilize image primitives to support term assignment for image classification. Proposes to release code for image analysis in a common tool set for other researchers to use. Of particular focus is the expansion of work by researchers in image indexing to include image content-based feature extraction capabilities in their work.…
FT-IR and C-13 NMR analysis of soil humic fractions from a long term cropping systems study
USDA-ARS?s Scientific Manuscript database
Increased knowledge of humic fractions is important due to their involvement in many soil ecosystem processes. Soil humic acid (HA) and fulvic acid (FA) from a nine-year agroecosystem study with different tillage, cropping system, and N source treatments were characterized using FT-IR and solid-state ...
Telling the story of tree species’ range shifts in a complex landscape
Sharon M. Stanton; Vicente J. Monleon; Heather E. Lintz; Joel Thompson
2015-01-01
The Forest Inventory and Analysis Program is the unrivaled source for long-term, spatially balanced, publicly available data. FIA will continue to be a provider of data, but the program is growing and adapting, including a shift in how we communicate information and knowledge derived from those data. Online applications, interactive mapping, and infographics provide...
ERIC Educational Resources Information Center
Bidyuk, Natalya
2014-01-01
The problem of the professional development of young researchers in terms of Master's training has been analyzed. Analysis of the literature, documentary and other sources gave grounds to state that the basic principle of Master's professional training is a research-oriented paradigm. The necessity of using the innovative ideas of…
Energy Conversion Chain Analysis of Sustainable Energy Systems: A Transportation Case Study
ERIC Educational Resources Information Center
Evans, Robert L.
2008-01-01
In general terms there are only three primary energy sources: fossil fuels, renewable energy, and nuclear fission. For fueling road transportation, there has been much speculation about the use of hydrogen as an energy carrier, which would usher in the "hydrogen economy." A parallel situation would use a simple battery to store electricity…
Analysis of Scientific Research Related Anxiety Levels of Undergraduate Students'
ERIC Educational Resources Information Center
Yildirim, Sefa; Hasiloglu, Mehmet Akif
2018-01-01
In this study, it was aimed to identify the scientific research-related anxiety levels of the undergraduate students studying in the department of faculty of science and letters and faculty of education to analyse these anxiety levels in terms of various variables (students' gender, using web based information sources, going to the library,…
Tivesten, Emma; Wiberg, Henrik
2013-03-01
Accident data play an important role in vehicle safety development. Accident data sources are generally limited in terms of how much information is provided on driver states and behaviour prior to an accident. However, the precise limitations vary between databases, due to differences in analysis focus and data collection procedures between organisations. If information about a specific accident can be retrieved from more than one data source, it should be possible to combine the available information sets so that data from one source compensate for limitations in the other(s). To investigate the viability of such compensation, this study identified a set of accidents recorded in two different data sources. The first data source investigated was an accident mail survey; the second was insurance claims documents, consisting predominantly of claims forms completed by the involved road users. An analysis of survey variables was compared to a case analysis including word data derived from the same survey and filed insurance claims documents. For each accident, the added value of having access to more than one source of information was assessed. To limit the scope of this study, three particular topics were investigated: available information on low vigilance (e.g., being drowsy, ill); secondary task distraction (e.g., talking with passengers, mobile phone use); and distraction related to the driving task (e.g., looking for approaching vehicles). Results suggest that for low vigilance and secondary task distraction, a combination of the mail survey and insurance claims documents provides more reliable and detailed pre-crash information than survey variables alone. However, driving-related distraction appears to be more difficult to capture. 
In order to gain a better understanding of the above issues and how frequently they occur in accidents, the data sources and analysis methods suggested here may be combined with other investigation methods such as in-depth accident investigations and pre-crash data recordings. Copyright © 2012 Elsevier Ltd. All rights reserved.
Low birth weight and air pollution in California: Which sources and components drive the risk?
Laurent, Olivier; Hu, Jianlin; Li, Lianfa; Kleeman, Michael J; Bartell, Scott M; Cockburn, Myles; Escobedo, Loraine; Wu, Jun
2016-01-01
Intrauterine growth restriction has been associated with exposure to air pollution, but there is a need to clarify which sources and components are most likely responsible. This study investigated the associations between low birth weight (LBW, <2500g) in term born infants (≥37 gestational weeks) and air pollution by source and composition in California, over the period 2001-2008. Complementary exposure models were used: an empirical Bayesian kriging model for the interpolation of ambient pollutant measurements, a source-oriented chemical transport model (using California emission inventories) that estimated fine and ultrafine particulate matter (PM2.5 and PM0.1, respectively) mass concentrations (4km×4km) by source and composition, a line-source roadway dispersion model at fine resolution, and traffic index estimates. Birth weight was obtained from California birth certificate records. A case-cohort design was used. Five controls per term LBW case were randomly selected (without covariate matching or stratification) from among term births. The resulting datasets were analyzed by logistic regression with a random effect by hospital, using generalized additive mixed models adjusted for race/ethnicity, education, maternal age and household income. In total 72,632 singleton term LBW cases were included. Term LBW was positively and significantly associated with interpolated measurements of ozone but not total fine PM or nitrogen dioxide. No significant association was observed between term LBW and primary PM from all sources grouped together. A positive significant association was observed for secondary organic aerosols. Exposure to elemental carbon (EC), nitrates and ammonium were also positively and significantly associated with term LBW, but only for exposure during the third trimester of pregnancy. Significant positive associations were observed between term LBW risk and primary PM emitted by on-road gasoline and diesel or by commercial meat cooking sources. 
Primary PM from wood burning was inversely associated with term LBW. Significant positive associations were also observed between term LBW and ultrafine particle numbers modeled with the line-source roadway dispersion model, traffic density and proximity to roadways. This large study based on complementary exposure metrics suggests that not only primary pollution sources (traffic and commercial meat cooking) but also EC and secondary pollutants are risk factors for term LBW. Copyright © 2016 Elsevier Ltd. All rights reserved.
Bayesian analyses of time-interval data for environmental radiation monitoring.
Luo, Peng; Sharp, Julia L; DeVol, Timothy A
2013-01-01
Time-interval (time difference between two consecutive pulses) analysis based on the principles of Bayesian inference was investigated for online radiation monitoring. Using experimental and simulated data, Bayesian analysis of time-interval data [Bayesian (ti)] was compared with Bayesian and a conventional frequentist analysis of counts in a fixed count time [Bayesian (cnt) and single interval test (SIT), respectively]. The performances of the three methods were compared in terms of average run length (ARL) and detection probability for several simulated detection scenarios. Experimental data were acquired with a DGF-4C system in list mode. Simulated data were obtained using Monte Carlo techniques to obtain a random sampling of the Poisson distribution. All statistical algorithms were developed using the R Project for statistical computing. Bayesian analysis of time-interval information provided a similar detection probability as Bayesian analysis of count information, but the authors were able to make a decision with fewer pulses at relatively higher radiation levels. In addition, for the cases with very short presence of the source (< count time), time-interval information is more sensitive to detect a change than count information since the source data is averaged by the background data over the entire count time. The relationships of the source time, change points, and modifications to the Bayesian approach for increasing detection probability are presented.
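A minimal sketch of the sequential time-interval idea described above, assuming a conjugate Gamma prior on the count rate and exponentially distributed inter-arrival times; the rates, prior and 0.95 decision threshold are illustrative placeholders, not the authors' DGF-4C configuration:

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(7)

# --- Assumed toy setup (not the authors' exact model) ---
bkg_rate = 2.0                       # background count rate (counts/s)
src_rate = 10.0                      # rate while a source is present
alpha0, beta0 = 1.0, 1.0 / bkg_rate  # Gamma prior on the true rate

# Sequential Bayesian update from inter-arrival times: after every pulse,
# recompute P(rate > background) and alarm as soon as it exceeds 0.95.
# This is what lets a decision be made with fewer pulses than waiting
# out a fixed count time.
intervals = rng.exponential(1.0 / src_rate, size=100)
a, b = alpha0, beta0
decided_after = None
for i, dt in enumerate(intervals, start=1):
    a += 1.0                         # one more pulse observed
    b += dt                          # total observation time so far
    p_elevated = gamma.sf(bkg_rate, a, scale=1.0 / b)
    if p_elevated > 0.95:
        decided_after = i
        break

print(f"alarm raised after {decided_after} pulses")
```

With count data the update would instead use the Gamma-Poisson pair (alpha0 + n counts, beta0 + count time), which only resolves at the end of each fixed counting interval.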
NASA Technical Reports Server (NTRS)
1972-01-01
Various methods used in the organic analysis of lunar samples are reviewed. The scope, advantages, and limitations of these methods are discussed, with particular emphasis on possible sources of contamination and experimental artifacts inherent in their use. A broad survey of the organogenic elements and compounds found in lunar samples covers the search for biogenic structures and viable organisms; the abundance and isotopic composition of various elements and compounds; the search for porphyrins, amino acids, or amino acid precursors; and the presence of heterocyclics, aromatic hydrocarbons, and other organic compounds. The sources of the organogenic elements and compounds detected in lunar samples are discussed. The significance of the lunar organic analysis for exobiology is considered in terms of its relevance to and implications for the studies of chemical evolution and terrestrial organic geochemistry. Individual items are announced in this issue.
Clinical reasoning: concept analysis.
Simmons, Barbara
2010-05-01
This paper is a report of a concept analysis of clinical reasoning in nursing. Clinical reasoning is an ambiguous term that is often used synonymously with decision-making and clinical judgment. Clinical reasoning has not been clearly defined in the literature. Healthcare settings are increasingly filled with uncertainty, risk and complexity due to increased patient acuity, multiple comorbidities, and enhanced use of technology, all of which require clinical reasoning. Data sources: literature for this concept analysis was retrieved from several databases, including CINAHL, PubMed, PsycINFO, ERIC and OvidMEDLINE, for the years 1980 to 2008. Rodgers's evolutionary method of concept analysis was used because of its applicability to concepts that are still evolving. Multiple terms have been used synonymously to describe the thinking skills that nurses use. Research in the past 20 years has elucidated differences among these terms and identified the cognitive processes that precede judgment and decision-making. Our concept analysis defines one of these terms, 'clinical reasoning,' as a complex process that uses cognition, metacognition, and discipline-specific knowledge to gather and analyse patient information, evaluate its significance, and weigh alternative actions. This concept analysis provides a middle-range descriptive theory of clinical reasoning in nursing that helps clarify meaning and gives direction for future research. Appropriate instruments to operationalize the concept need to be developed. Research is needed to identify additional variables that have an impact on clinical reasoning and what are the consequences of clinical reasoning in specific situations.
Pinto, U; Maheshwari, B L; Ollerton, R L
2013-06-01
The Hawkesbury-Nepean River (HNR) system in South-Eastern Australia is the main source of water supply for the Sydney Metropolitan area and is one of the more complex river systems due to the influence of urbanisation and other activities in the peri-urban landscape through which it flows. The long-term monitoring of river water quality is likely to suffer from data gaps due to funding cuts, changes in priority and related reasons. Nevertheless, we need to assess river health based on the available information. In this study, we demonstrated how Factor Analysis (FA), Hierarchical Agglomerative Cluster Analysis (HACA) and Trend Analysis (TA) can be applied to evaluate long-term historic data sets. Six water quality parameters, viz., temperature, chlorophyll-a, dissolved oxygen, oxides of nitrogen, suspended solids and reactive silicates, measured at weekly intervals between 1985 and 2008 at 12 monitoring stations located along the 300 km length of the HNR system, were evaluated to understand the human and natural influences on the river system in a peri-urban landscape. The application of FA extracted three latent factors which explained more than 70 % of the total variance of the data and related to the 'bio-geographical', 'natural' and 'nutrient pollutant' dimensions of the HNR system. The bio-geographical and nutrient pollution factors were most likely related to the direct influence of peri-urban changes and activities, and accounted for approximately 50 % of the variability in water quality. The application of HACA indicated two major clusters representing clean and polluted zones of the river. On the spatial scale, one cluster was represented by the upper and lower sections of the river (clean zone) and accounted for approximately 158 km of the river. The other cluster was represented by the middle section (polluted zone) with a length of approximately 98 km. 
Trend Analysis indicated how the point sources influence river water quality on spatio-temporal scales, taking into account the various effects of nutrient and other pollutant loads from sewerage effluents, agriculture and other point and non-point sources along the river and major tributaries of the HNR. Over the past 26 years, water temperature has significantly increased while suspended solids have significantly decreased (p < 0.05). The analysis of water quality data through FA, HACA and TA helped to characterise the key sections and cluster the key water quality variables of the HNR system. The insights gained from this study have the potential to improve the effectiveness of river health-monitoring programs in terms of cost, time and effort, particularly in a peri-urban context.
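A compact stand-in for the FA/HACA workflow above, using synthetic data in place of the HNR weekly measurements. sklearn's `FactorAnalysis` and Ward-linkage clustering are one reasonable toolset; the study's exact rotation and linkage choices are not reported here, so everything below is a sketch:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# --- Synthetic stand-in for the weekly water-quality matrix ---
# rows = samples (station-weeks), columns = the six parameters:
# temperature, chlorophyll-a, DO, NOx, suspended solids, silicates.
n_samples, n_params = 500, 6
latent = rng.normal(size=(n_samples, 3))        # three hidden drivers
loadings = rng.normal(size=(3, n_params))
X = latent @ loadings + 0.3 * rng.normal(size=(n_samples, n_params))

# Factor Analysis: extract three latent factors, as in the study.
fa = FactorAnalysis(n_components=3, random_state=0)
scores = fa.fit_transform(X)

# Hierarchical agglomerative clustering (Ward linkage) of stations,
# here represented by mean factor scores of 10 pseudo-stations, cut
# into two clusters (the study's clean vs polluted zones).
stations = scores.reshape(10, 50, 3).mean(axis=1)
Z = linkage(stations, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print("factor loadings shape:", fa.components_.shape)
print("station cluster labels:", labels)
```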
Improved Nitrogen Removal Effect In Continuous Flow A2/O Process Using Typical Extra Carbon Source
NASA Astrophysics Data System (ADS)
Wu, Haiyan; Gao, Junyan; Yang, Dianhai; Zhou, Qi; Cai, Bijing
2010-11-01
In order to provide a basis for optimal selection of carbon source, three typical external carbon sources (i.e. methanol, sodium acetate and leachate) were applied to examine the nitrogen removal efficiency of a continuous flow A2/O system fed with the effluent of the grit chamber in the second Kunming wastewater treatment plant. The best dosage was determined, and the specific nitrogen removal rate and carbon consumption rate were calculated for each external carbon source in the A2/O system. An economic and technological analysis was also conducted to select a suitable carbon source with a low operation cost. Experimental results showed that the typical external carbon sources caused a remarkable enhancement of the system's nitrate degradation ability. In comparison with the blank test, the average TN and NH3-N removal efficiency of the system with different dosing quantities of external carbon source was improved by 15.2% and 34.2%, respectively. The optimal dosage of methanol, sodium acetate and leachate was respectively up to 30 mg/L, 40 mg/L and 100 mg COD/L in terms of a high nitrogen degradation effect. The highest removal efficiencies of COD, TN and NH3-N reached respectively 92.3%, 73.9% and 100% with a methanol dosage of 30 mg/L. The kinetic analysis and calculation revealed that the greatest denitrification rate was 0.0107 mg TN/(mg MLVSS·d) with sodium acetate of 60 mg/L. As to carbon consumption rate, however, the highest value occurred in the blank test with a rate of 0.1955 mg COD/(mg MLVSS·d). Further economic analysis proved leachate to be a pragmatic external carbon source, with a cost far cheaper than methanol.
NASA Astrophysics Data System (ADS)
Ou, Jiamin; Guo, Hai; Zheng, Junyu; Cheung, Kalam; Louie, Peter K. K.; Ling, Zhenhao; Wang, Dawei
2015-02-01
To understand the long-term variations of nonmethane hydrocarbons (NMHCs) and their emission sources, real-time speciated NMHCs have been monitored in Hong Kong since 2005. Data analysis showed that the concentrations of C3-C5 and C6-C7 alkanes slightly increased from 2005 to 2013 at a rate of 0.0015 and 0.0005 μg m-3 yr-1 (p < 0.05), respectively, while aromatics decreased at a rate of 0.006 μg m-3 yr-1 (p < 0.05). Positive Matrix Factorization (PMF) model was applied to identify and quantify the NMHC sources. Vehicular exhaust, gasoline evaporation and liquefied petroleum gas (LPG) usage, consumer product and printing, architectural paints, and biogenic emissions were identified and on average accounted for 20.2 ± 6.2%, 25.4 ± 6.3%, 32.6 ± 5.8%, 21.5 ± 4.5%, and 3.3 ± 1.5% of the ambient NMHC concentrations, respectively. From 2005 to 2013, the contributions of both traffic-related sources and solvent-related sources showed no significant changes, different from the trends in emission inventory. On O3 episode days dominated by local air masses, the increase ratio of NMHC species from non-episode to episode days was found to be a natural function of the reactivity of NMHC species, suggesting that photochemical reaction would significantly change the NMHCs composition between emission sources and the receptors. Effect of photochemical reaction loss on receptor-oriented source apportionment analysis needs to be quantified in order to identify the NMHCs emission sources on O3 episode days.
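PMF itself weights each observation by its measurement uncertainty; as a rough, unweighted stand-in, the factorization step can be sketched with plain non-negative matrix factorization on synthetic data. The species count, source count and profiles below are invented, not the Hong Kong dataset:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)

# --- Synthetic stand-in for the NMHC data matrix ---
# rows = hourly samples, columns = measured NMHC species concentrations.
n_samples, n_species, n_sources = 300, 12, 5
true_profiles = rng.gamma(2.0, 1.0, size=(n_sources, n_species))
true_contrib = rng.gamma(2.0, 1.0, size=(n_samples, n_sources))
X = true_contrib @ true_profiles + rng.gamma(1.0, 0.05, size=(n_samples, n_species))

# PMF is, at heart, a non-negative factorization X ~ G @ F with
# uncertainty weighting; plain NMF is used here as an unweighted proxy.
model = NMF(n_components=n_sources, init="nndsvda", max_iter=1000, random_state=1)
G = model.fit_transform(X)   # source contributions per sample
F = model.components_        # source profiles (species fingerprints)

# Average fractional contribution of each resolved source, analogous to
# the percentage shares quoted in the abstract.
mass_by_source = (G * F.sum(axis=1)).sum(axis=0)
shares = mass_by_source / mass_by_source.sum()
print("mean source shares:", np.round(shares, 3))
```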
High-order scheme for the source-sink term in a one-dimensional water temperature model
Jing, Zheng; Kang, Ling
2017-01-01
The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method was assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in an excellent agreement with measured data. PMID:28264005
A Study of Regional Waveform Calibration in the Eastern Mediterranean Region.
NASA Astrophysics Data System (ADS)
di Luccio, F.; Pino, A.; Thio, H.
2002-12-01
We modeled Pnl phases from several moderate-magnitude events in the eastern Mediterranean to test methods and to develop path calibrations for source determination. The study region, spanning from the eastern part of the Hellenic arc to the eastern Anatolian fault, is mostly affected by moderate earthquakes that can nevertheless cause significant damage. The selected area comprises several tectonic environments, which increases the difficulty of waveform modeling. The results of this study are useful for the analysis of regional seismicity and for seismic hazard assessment, in particular because very few broadband seismic stations are available in the selected area. The obtained velocity model gives a 30 km crustal thickness and low upper-mantle velocities. The inversion procedure applied to determine the source mechanism was successful, including in terms of depth discrimination, for the entire range of selected paths. We conclude that, using a true calibration of the seismic structure and high-quality broadband data, it is possible to determine the seismic source mechanism even with a single station.
Analysis of neutron and gamma-ray streaming along the maze of NRCAM thallium production target room.
Raisali, G; Hajiloo, N; Hamidi, S; Aslani, G
2006-08-01
The shielding performance of a thallium-203 production target room is investigated in this work. Neutron and gamma-ray equivalent dose rates at various points of the maze are calculated by simulating the transport of streaming neutrons and photons using the Monte Carlo method. To determine the neutron and gamma-ray source intensities and their energy spectra, the SRIM 2003 and ALICE91 computer codes were applied to the Tl target and its Cu substrate for a 145 microA beam of 28.5 MeV protons. The MCNP/4C code has been applied with the neutron source term in mode n p to consider both prompt neutrons and secondary gamma-rays. The code was then applied with the prompt gamma-rays as the source term. The neutron-flux energy spectrum and equivalent dose rates for neutrons and gamma-rays at various positions in the maze have been calculated. The deviation between calculated and measured dose values along the maze was found to be less than 20%.
An Improved Elastic and Nonelastic Neutron Transport Algorithm for Space Radiation
NASA Technical Reports Server (NTRS)
Clowdsley, Martha S.; Wilson, John W.; Heinbockel, John H.; Tripathi, R. K.; Singleterry, Robert C., Jr.; Shinn, Judy L.
2000-01-01
A neutron transport algorithm including both elastic and nonelastic particle interaction processes for use in space radiation protection for arbitrary shield material is developed. The algorithm is based upon a multiple energy grouping and analysis of the straight-ahead Boltzmann equation by using a mean value theorem for integrals. The algorithm is then coupled to the Langley HZETRN code through a bidirectional neutron evaporation source term. Evaluation of the neutron fluence generated by the solar particle event of February 23, 1956, for an aluminum water shield-target configuration is then compared with MCNPX and LAHET Monte Carlo calculations for the same shield-target configuration. With the Monte Carlo calculation as a benchmark, the algorithm developed in this paper showed a great improvement in results over the unmodified HZETRN solution. In addition, a high-energy bidirectional neutron source based on a formula by Ranft showed even further improvement of the fluence results over previous results near the front of the water target, where diffusion out of the front surface is important. Effects of improved interaction cross sections are modest compared with the addition of the high-energy bidirectional source terms.
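The grouped straight-ahead transport idea can be illustrated with a toy forward march in depth. The five-group cross sections and downscatter fractions below are invented, not HZETRN data:

```python
import numpy as np

# Toy multigroup straight-ahead attenuation with downscatter. Five energy
# groups (index 0 = highest energy); cross sections are invented, and the
# depth march is simple forward Euler rather than the paper's mean-value
# treatment.
n_groups, nx, dx = 5, 200, 0.5                 # 100 g/cm^2 of shield
sigma_t = np.linspace(0.12, 0.05, n_groups)    # total removal (cm^2/g)

# Each group g' loses 40% of its removed flux to lower-energy groups,
# split evenly among them (pure downscatter: matrix is lower triangular).
sigma_s = np.zeros((n_groups, n_groups))
for gp in range(n_groups - 1):
    sigma_s[gp + 1:, gp] = 0.4 * sigma_t[gp] / (n_groups - 1 - gp)

phi = np.zeros((nx + 1, n_groups))
phi[0, 0] = 1.0                                # unit fluence in group 0
for i in range(nx):
    phi[i + 1] = phi[i] + dx * (sigma_s @ phi[i] - sigma_t * phi[i])

print("exit fluence by group:", np.round(phi[-1], 4))
```

The top group attenuates nearly exponentially while the lower groups build up from downscatter, which is the qualitative behaviour the evaporation source term adds to HZETRN.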
Textural Maturity Analysis and Sedimentary Environment Discrimination Based on Grain Shape Data
NASA Astrophysics Data System (ADS)
Tunwal, M.; Mulchrone, K. F.; Meere, P. A.
2017-12-01
Morphological analysis of clastic sedimentary grains is an important source of information regarding the processes involved in their formation, transportation and deposition. However, a standardised approach for quantitative grain shape analysis is generally lacking. In this contribution we report on a study where fully automated image analysis techniques were applied to loose sediment samples collected from glacial, aeolian, beach and fluvial environments. A range of shape parameters are evaluated for their usefulness in textural characterisation of populations of grains. The utility of grain shape data in ranking textural maturity of samples within a given sedimentary environment is evaluated. Furthermore, discrimination of sedimentary environment on the basis of grain shape information is explored. The data gathered demonstrates a clear progression in textural maturity in terms of roundness, angularity, irregularity, fractal dimension, convexity, solidity and rectangularity. Textural maturity can be readily categorised using automated grain shape parameter analysis. However, absolute discrimination between different depositional environments on the basis of shape parameters alone is less certain. For example, the aeolian environment is quite distinct whereas fluvial, glacial and beach samples are inherently variable and tend to overlap each other in terms of textural maturity. This is most likely due to a collection of similar processes and sources operating within these environments. This study strongly demonstrates the merit of quantitative population-based shape parameter analysis of texture and indicates that it can play a key role in characterising both loose and consolidated sediments. This project is funded by the Irish Petroleum Infrastructure Programme (www.pip.ie)
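Three of the shape parameters named above can be computed from a digitised grain outline with a few lines of array code: circularity (a roundness proxy), solidity, and convexity. The noisy-circle "grain" below is synthetic, standing in for an outline from automated image analysis:

```python
import numpy as np
from scipy.spatial import ConvexHull

def shoelace_area(pts):
    x, y = pts[:, 0], pts[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def perimeter(pts):
    d = np.diff(np.vstack([pts, pts[:1]]), axis=0)
    return np.hypot(d[:, 0], d[:, 1]).sum()

def shape_params(outline):
    """Circularity, solidity and convexity for one grain outline."""
    hull_pts = outline[ConvexHull(outline).vertices]
    A, P = shoelace_area(outline), perimeter(outline)
    return {
        "circularity": 4.0 * np.pi * A / P**2,    # 1.0 for a circle
        "solidity": A / shoelace_area(hull_pts),  # area / hull area
        "convexity": perimeter(hull_pts) / P,     # hull perim. / perim.
    }

# A jagged synthetic "grain": a circle with noisy radius.
rng = np.random.default_rng(3)
theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
r = 1.0 + 0.15 * rng.standard_normal(200)
grain = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
print(shape_params(grain))
```

All three parameters approach 1 for a smooth, rounded grain and fall as angularity and irregularity increase, which is how population statistics of such values rank textural maturity.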
Barbe, Anna Greta
2017-09-01
Interventions for the management of radiotherapy-induced xerostomia and hyposalivation: A systematic review and meta-analysis. Mercadante V, Hamad AA, Lodi G, Porter S, Fedele S. Oral Oncol 2017;66:64-74. The authors reported that no external funding sources directly supported this study. TYPE OF STUDY/DESIGN: Systematic review and meta-analysis. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
1979-01-01
Results of a study leading to the preliminary design of a five passenger hybrid vehicle utilizing two energy sources (electricity and gasoline/diesel fuel) to minimize petroleum usage on a fleet basis are presented. The study methodology is described. Vehicle characterizations, the mission description, characterization, and impact on potential sales, and the rationale for the selection of the reference internal combustion engine vehicle are presented. Conclusions and recommendations of the mission analysis and performance specification report are included.
1989-05-01
NUMERICAL ANALYSIS OF STEFAN PROBLEMS FOR GENERALIZED MULTI-DIMENSIONAL PHASE-CHANGE STRUCTURES USING THE ENTHALPY TRANSFORMING MODEL. (Only a nomenclature fragment of this record survives: St, Stefan number, cs(Tm-Tw)/H or cs(Tm-Ti)/H; s, circumferential distance coordinate (m); ρ, density of fluid (kg/m³); Φ, viscous dissipation term in the energy equation; τ, dimensionless time, tα/L²; σ, Stefan-Boltzmann constant.)
Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Waithe, Kenrick A.
2004-01-01
A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
Todd, Cameron A; Bareiss, Anna K; McCoul, Edward D; Rodriguez, Kimsey H
2017-11-01
Objective To determine the impact of adenotonsillectomy on the quality of life of pediatric patients with obstructive sleep apnea (OSA) and to identify gaps in the current research. Data Sources The MEDLINE, EMBASE, and Cochrane databases were systematically searched via the Ovid portal on June 18, 2016, for English-language articles. Review Methods Full-text articles were selected that studied boys and girls <18 years of age who underwent adenotonsillectomy for OSA or sleep-disordered breathing and that recorded validated, quantitative quality-of-life outcomes. Studies that lacked such measures, performed adenotonsillectomy for indications other than OSA or sleep-disordered breathing, or grouped adenotonsillectomy with other procedures were excluded. Results Of the 328 articles initially identified, 37 were included for qualitative analysis. The level of evidence was generally low. All studies involving short-term follow-up (≤6 months) showed improvement in quality-of-life scores after adenotonsillectomy as compared with preoperative values. Studies involving long-term follow-up (>6 months) showed mixed results. Modifications to and concurrent procedures with conventional adenotonsillectomy were also identified that showed quality-of-life improvements. Three studies were identified for meta-analysis that compared pre- and postoperative Obstructive Sleep Apnea-18 scores. Short- and long-term follow-up versus preoperative scores showed significant improvement ( P < .001). Short- and long-term scores showed no significant difference. Conclusion This systematic review and meta-analysis demonstrate adenotonsillectomy's effectiveness in improving the quality of life of pediatric patients with OSA. This is well demonstrated in the short term and has strong indications in the long term.
A Well-Balanced Path-Integral f-Wave Method for Hyperbolic Problems with Source Terms
2014-01-01
Systems of hyperbolic partial differential equations with source terms (balance laws) arise in many applications where it is important to compute accurate time-dependent solutions modeling small perturbations of equilibrium solutions in which the source terms balance the hyperbolic part. The f-wave version of the wave-propagation algorithm is one approach, but requires the use of a particular averaged value of the source terms at each cell interface in order to be “well balanced” and exactly maintain steady states. A general approach to choosing this average is developed using the theory of path conservative methods. A scalar advection equation with a decay or growth term is introduced as a model problem for numerical experiments. PMID:24563581
Validation of a mixture-averaged thermal diffusion model for premixed lean hydrogen flames
NASA Astrophysics Data System (ADS)
Schlup, Jason; Blanquart, Guillaume
2018-03-01
The mixture-averaged thermal diffusion model originally proposed by Chapman and Cowling is validated using multiple flame configurations. Simulations using detailed hydrogen chemistry are done on one-, two-, and three-dimensional flames. The analysis spans flat and stretched, steady and unsteady, and laminar and turbulent flames. Quantitative and qualitative results using the thermal diffusion model compare very well with the more complex multicomponent diffusion model. Comparisons are made using flame speeds, surface areas, species profiles, and chemical source terms. Once validated, this model is applied to three-dimensional laminar and turbulent flames. For these cases, thermal diffusion causes an increase in the propagation speed of the flames as well as increased product chemical source terms in regions of high positive curvature. The results illustrate the necessity for including thermal diffusion, and the accuracy and computational efficiency of the mixture-averaged thermal diffusion model.
NASA Astrophysics Data System (ADS)
Valença, J. V. B.; Silveira, I. S.; Silva, A. C. A.; Dantas, N. O.; Antonio, P. L.; Caldas, L. V. E.; d'Errico, F.; Souza, S. O.
2017-11-01
The OSL characteristics of three different borate glass matrices containing magnesia (LMB), quicklime (LCB) or potassium carbonate (LKB) were examined. Five different formulations for each composition were produced using a melt-quenching method and analyzed in terms of both dose-response curves and OSL decay shape. The samples were irradiated using a 90Sr/90Y beta source with doses up to 30 Gy. Dose-response curves were plotted using the initial OSL intensity as the chosen parameter. The OSL analysis showed that LKB glasses are the most sensitive to beta irradiation. For the most sensitive LKB composition, irradiation was also performed using a 60Co gamma source in a dose range from 200 to 800 Gy. In all cases, no saturation was observed. A fitting process using a three-term exponential function was performed for the most sensitive formulations of each composition, which suggested a similar behavior in the OSL decay.
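The three-term exponential fit mentioned above can be sketched as a standard nonlinear least-squares problem. The amplitudes and lifetimes below are synthetic placeholders, not measured LKB values.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit an OSL decay curve with a three-term exponential:
# I(t) = A1*exp(-t/tau1) + A2*exp(-t/tau2) + A3*exp(-t/tau3)

def osl3(t, a1, t1, a2, t2, a3, t3):
    return a1*np.exp(-t/t1) + a2*np.exp(-t/t2) + a3*np.exp(-t/t3)

t = np.linspace(0.0, 60.0, 600)                 # seconds of stimulation
true_params = (100.0, 0.8, 40.0, 5.0, 10.0, 25.0)  # illustrative values
rng = np.random.default_rng(1)
signal = osl3(t, *true_params) + rng.normal(0.0, 0.5, t.size)

# initial guesses must roughly separate the three time scales
popt, pcov = curve_fit(osl3, t, signal, p0=(80, 1, 30, 4, 8, 20))
taus = np.sort(popt[[1, 3, 5]])                 # recovered lifetimes
```

Comparing the fitted lifetimes across formulations is one way to quantify the "similar behavior in the OSL decay" reported in the abstract.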
Madjidi, Faramarz; Behroozy, Ali
2014-01-01
Exposure to visible light and near-infrared (NIR) radiation in the wavelength region of 380 to 1400 nm may cause thermal retinal injury. In this analysis, the effective spectral radiance of a hot source is replaced by its temperature in the exposure limit values for the 380-1400 nm region. This article describes the development and implementation of a computer code to predict the temperatures corresponding to the exposure limits proposed by the American Conference of Governmental Industrial Hygienists (ACGIH). Viewing duration and the apparent diameter of the source were inputs to the computer code. At the first stage, an infinite series was created for the calculation of spectral radiance by integration of Planck's law. At the second stage, for the calculation of effective spectral radiance, the initial terms of this infinite series were selected and integration was performed by multiplying these terms by a weighting factor R(λ) over the wavelength region 380-1400 nm. At the third stage, using the computer code, the source temperature that emits the same effective spectral radiance was found. As a result, based only on measuring the source temperature and accounting for the exposure time and the apparent diameter of the source, it is possible to decide whether exposure to visible light and NIR in any 8-hr workday is permissible. The substitution of source temperature for effective spectral radiance provides a convenient way to evaluate exposure to visible light and NIR.
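The inversion described above (find the blackbody temperature producing a given weighted radiance) can be sketched with direct numerical integration of Planck's law and a bisection search; the flat weighting used here is a placeholder for the ACGIH burn-hazard function R(λ), which must be substituted for any real assessment.

```python
import numpy as np

H_PLANCK, C_LIGHT, K_B = 6.626e-34, 2.998e8, 1.381e-23

def spectral_radiance(lam_m, T):
    """Planck spectral radiance, W / (m^2 sr m)."""
    return (2.0 * H_PLANCK * C_LIGHT**2 / lam_m**5 /
            np.expm1(H_PLANCK * C_LIGHT / (lam_m * K_B * T)))

def effective_radiance(T, weight=lambda lam: np.ones_like(lam)):
    """R(lambda)-weighted radiance over 380-1400 nm (flat placeholder weight)."""
    lam = np.linspace(380e-9, 1400e-9, 2000)
    f = weight(lam) * spectral_radiance(lam, T)
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(lam)))  # trapezoid rule

def temperature_for(L_target, lo=500.0, hi=6000.0, tol=0.01):
    """Bisection: effective radiance increases monotonically with T."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if effective_radiance(mid) < L_target else (lo, mid)
    return 0.5 * (lo + hi)

T_found = temperature_for(effective_radiance(2500.0))  # recovers 2500 K
```

Because the weighted radiance is strictly monotone in temperature, the mapping between the exposure limit and a limiting source temperature is one-to-one, which is what makes the substitution in the article workable.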
NASA Astrophysics Data System (ADS)
Kwiatek, Grzegorz; Martínez-Garzón, Patricia; Dresen, Georg; Bohnhoff, Marco; Sone, Hiroki; Hartline, Craig
2015-10-01
The long-term temporal and spatial changes in statistical, source, and stress characteristics of one cluster of induced seismicity recorded at The Geysers geothermal field (U.S.) are analyzed in relation to the field operations, fluid migration, and constraints on the maximum likely magnitude. Two injection wells, Prati-9 and Prati-29, located in the northwestern part of the field and their associated seismicity composed of 1776 events recorded throughout a 7 year period were analyzed. The seismicity catalog was relocated, and the source characteristics including focal mechanisms and static source parameters were refined using first-motion polarity, spectral fitting, and mesh spectral ratio analysis techniques. The source characteristics together with statistical parameters (b value) and cluster dynamics were used to investigate and understand the details of the fluid migration scheme in the vicinity of the injection wells. The observed temporal, spatial, and source characteristics were clearly attributed to fluid injection and fluid migration toward greater depths, involving increasing pore pressure in the reservoir. The seasonal changes of injection rates were found to directly impact the shape and spatial extent of the seismic cloud. A tendency of larger seismic events to occur closer to the injection wells and a correlation between the spatial extent of the seismic cloud and the source sizes of the largest events were observed, suggesting geometrical constraints on the maximum likely magnitude and its correlation to the average injection rate and volume of fluids present in the reservoir.
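The b value mentioned among the statistical parameters is conventionally estimated with the Aki maximum-likelihood formula, which can be sketched on a synthetic Gutenberg-Richter catalogue (the catalogue below is random stand-in data, not Geysers seismicity).

```python
import numpy as np

# Aki maximum-likelihood b-value: b = log10(e) / (mean(M) - Mc),
# for magnitudes M at or above the completeness magnitude Mc.

def b_value(mags, mc):
    mags = np.asarray(mags, dtype=float)
    mags = mags[mags >= mc]
    return np.log10(np.e) / (mags.mean() - mc)

rng = np.random.default_rng(0)
b_true, mc = 1.0, 1.2
# Gutenberg-Richter law N(>=M) ~ 10**(-b*M) implies M - Mc is
# exponentially distributed with mean 1 / (b * ln 10)
mags = mc + rng.exponential(1.0 / (b_true * np.log(10.0)), size=20000)
b_est = b_value(mags, mc)                 # close to b_true = 1.0
```

Tracking `b_est` in moving time windows is the usual way such induced-seismicity studies relate b-value changes to injection rates.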
Cell phones and brain tumors: a review including the long-term epidemiologic data.
Khurana, Vini G; Teo, Charles; Kundi, Michael; Hardell, Lennart; Carlberg, Michael
2009-09-01
The debate regarding the health effects of low-intensity electromagnetic radiation from sources such as power lines, base stations, and cell phones has recently been reignited. In the present review, the authors attempt to address the following question: is there epidemiologic evidence for an association between long-term cell phone usage and the risk of developing a brain tumor? Included with this meta-analysis of the long-term epidemiologic data are a brief overview of cell phone technology and discussion of laboratory data, biological mechanisms, and brain tumor incidence. In order to be included in the present meta-analysis, studies were required to have met all of the following criteria: (i) publication in a peer-reviewed journal; (ii) inclusion of participants using cell phones for > or = 10 years (ie, minimum 10-year "latency"); and (iii) incorporation of a "laterality" analysis of long-term users (ie, analysis of the side of the brain tumor relative to the side of the head preferred for cell phone usage). This is a meta-analysis incorporating all 11 long-term epidemiologic studies in this field. The results indicate that using a cell phone for > or = 10 years approximately doubles the risk of being diagnosed with a brain tumor on the same ("ipsilateral") side of the head as that preferred for cell phone use. The data achieve statistical significance for glioma and acoustic neuroma but not for meningioma. The authors conclude that there is adequate epidemiologic evidence to suggest a link between prolonged cell phone usage and the development of an ipsilateral brain tumor.
NASA Astrophysics Data System (ADS)
Campbell, Ian S.; Ton, Alain T.; Mulligan, Christopher C.
2011-07-01
An ambient mass spectrometric method based on desorption electrospray ionization (DESI) has been developed to allow rapid, direct analysis of contaminated water samples, and the technique was evaluated through analysis of a wide array of pharmaceutical and personal care product (PPCP) contaminants. Incorporating direct infusion of aqueous sample and thermal assistance into the source design has allowed low ppt detection limits for the target analytes in drinking water matrices. With this methodology, mass spectral information can be collected in less than 1 min, consuming ~100 μL of total sample. Quantitative ability was also demonstrated without the use of an internal standard, yielding decent linearity and reproducibility. Initial results suggest that this source configuration is resistant to carryover effects and robust towards multi-component samples. The rapid, continuous analysis afforded by this method offers advantages in terms of sample analysis time and throughput over traditional hyphenated mass spectrometric techniques.
NASA Astrophysics Data System (ADS)
Duncan, J. M.; Band, L. E.; Groffman, P.
2017-12-01
Discharge, land use, and watershed management practices (stream restoration and stormwater control measures) have been found to be important determinants of nitrogen (N) export to receiving waters. We used data from long-term water quality stations of the Baltimore Ecosystem Study Long-Term Ecological Research (BES LTER) site to quantify nitrogen export across streamflow conditions at the small watershed scale. We calculated nitrate and total nitrogen fluxes using a methodology that allows for changes over time: weighted regressions on time, discharge, and seasonality. Here we tested the hypotheses that (a) while the largest N stream fluxes occur during storm events, there is not a clear relationship between N flux and discharge, and (b) N export patterns are aseasonal in developed watersheds where sources are larger and retention capacity is lower. The goal is to scale understanding from small watersheds to larger ones. Developing a better understanding of hydrologic controls on nitrogen export is essential for successful adaptive watershed management at societally meaningful spatial scales.
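The regression on time, discharge, and seasonality can be sketched as a weighted linear model for log concentration with annual harmonic terms. This is a simplified, assumed form of that family of models; the coefficients and data below are synthetic.

```python
import numpy as np

# ln(c) = b0 + b1*t + b2*ln(Q) + b3*sin(2*pi*t) + b4*cos(2*pi*t),
# fitted by weighted least squares (weights all equal here for simplicity).

def wrtds_style_fit(t_years, q, conc, weights):
    X = np.column_stack([np.ones_like(t_years), t_years, np.log(q),
                         np.sin(2*np.pi*t_years), np.cos(2*np.pi*t_years)])
    w = np.sqrt(weights)
    beta, *_ = np.linalg.lstsq(X * w[:, None], w * np.log(conc), rcond=None)
    return beta  # [intercept, time trend, discharge, sin, cos]

rng = np.random.default_rng(3)
t = rng.uniform(0.0, 10.0, 500)              # decimal years
q = np.exp(rng.normal(0.0, 1.0, 500))        # synthetic daily discharge
true = np.array([0.5, 0.02, 0.6, 0.3, -0.1]) # illustrative coefficients
lnc = true @ np.array([np.ones_like(t), t, np.log(q),
                       np.sin(2*np.pi*t), np.cos(2*np.pi*t)])
conc = np.exp(lnc + rng.normal(0.0, 0.05, 500))

beta = wrtds_style_fit(t, q, conc, np.ones_like(t))
```

In the full methodology the weights vary with distance in time, discharge, and season around each estimation point, so the coefficients themselves drift; the sketch above is a single global fit.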
NASA Astrophysics Data System (ADS)
Yoshida, Satoshi
Applications of inductively coupled plasma mass spectrometry (ICP-MS) to the determination of long-lived radionuclides in environmental samples are summarized. In order to predict the long-term behavior of the radionuclides, related stable elements were also determined. Compared with radioactivity measurements, the ICP-MS method has advantages in terms of its simple analytical procedures, prompt measurement time, and capability of determining isotope ratios such as 240Pu/239Pu, which cannot be resolved by radioactivity measurements. Concentrations of U and Th in Japanese surface soils were determined to establish the background level of these natural radionuclides. The 235U/238U ratio was successfully used to detect the release of enriched U from reconversion facilities to the environment and to understand the source term. The 240Pu/239Pu ratios in environmental samples varied widely depending on the Pu sources. Applications of ICP-MS to the measurement of I and Tc isotopes are also described. The ratio between radiocesium and stable Cs is useful for judging the equilibrium of deposited radiocesium in a forest ecosystem.
ON THE CONNECTION OF THE APPARENT PROPER MOTION AND THE VLBI STRUCTURE OF COMPACT RADIO SOURCES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moor, A.; Frey, S.; Lambert, S. B.
2011-06-15
Many of the compact extragalactic radio sources that are used as fiducial points to define the celestial reference frame are known to have proper motions detectable with long-term geodetic/astrometric very long baseline interferometry (VLBI) measurements. These changes can be as high as several hundred microarcseconds per year for certain objects. When imaged with VLBI at milliarcsecond (mas) angular resolution, these sources (radio-loud active galactic nuclei) typically show structures dominated by a compact, often unresolved 'core' and a one-sided 'jet'. The positional instability of compact radio sources is believed to be connected with changes in their brightness distribution structure. For the first time, we test this assumption in a statistical sense on a large sample rather than on only individual objects. We investigate a sample of 62 radio sources for which reliable long-term time series of astrometric positions as well as detailed 8 GHz VLBI brightness distribution models are available. We compare the characteristic direction of their extended jet structure and the direction of their apparent proper motion. We present our data and analysis method, and conclude that there is indeed a correlation between the two characteristic directions. However, there are cases where the ~1-10 mas scale VLBI jet directions are significantly misaligned with respect to the apparent proper motion direction.
Analysis and Modeling of Parallel Photovoltaic Systems under Partial Shading Conditions
NASA Astrophysics Data System (ADS)
Buddala, Santhoshi Snigdha
Since the industrial revolution, fossil fuels like petroleum, coal, oil, natural gas and other non-renewable energy sources have been used as the primary energy source. The consumption of fossil fuels releases various harmful gases into the atmosphere as byproducts; these are hazardous in nature and tend to deplete the protective atmospheric layers and affect the overall environmental balance. Fossil fuels are also finite energy resources, and their rapid depletion has prompted the need to investigate alternative sources of energy, known as renewable energy. One such promising source of renewable energy is solar/photovoltaic energy. This work focuses on investigating a new solar array architecture with solar cells connected in a parallel configuration. While retaining the structural simplicity of the parallel architecture, a theoretical small-signal model of the solar cell is proposed and used to analyze the variations in the module parameters when subjected to partial shading conditions. Simulations were run in SPICE to validate the model implemented in Matlab. The voltage limitations of the proposed architecture are addressed by adopting a simple dc-dc boost converter and evaluating the performance of the architecture in terms of efficiency by comparing it with the traditional architectures. SPICE simulations are used to compare the architectures and identify the best one in terms of power conversion efficiency under partial shading conditions.
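The behavior of parallel-connected cells under partial shading can be sketched with the standard single-diode cell model: at a common voltage, cell currents simply add, and shading reduces the photocurrent of the affected cell. The parameter values are illustrative, not those of the thesis's SPICE model.

```python
import numpy as np

# Single-diode PV cell model: I = Iph - I0 * (exp(V / (n * Vt)) - 1),
# with photocurrent Iph, saturation current I0, ideality factor n,
# and thermal voltage Vt (values illustrative).

def cell_current(v, iph, i0=1e-9, n=1.3, vt=0.02585):
    return iph - i0 * (np.exp(v / (n * vt)) - 1.0)

v = np.linspace(0.0, 0.7, 200)                 # common terminal voltage
i_full = cell_current(v, iph=5.0)              # unshaded cell
i_shaded = cell_current(v, iph=2.0)            # 60% shaded cell
i_parallel = i_full + i_shaded                 # parallel: currents add
```

This additive behavior is the reason the parallel architecture degrades gracefully under partial shading, at the cost of a low string voltage, which motivates the boost converter discussed above.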
Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil
NASA Astrophysics Data System (ADS)
de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.
2018-05-01
A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.
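The core PSHA calculation can be sketched for a single areal source: integrate a truncated Gutenberg-Richter magnitude model against a ground-motion model's exceedance probability. The ground-motion relation and all rates below are toy assumptions for illustration, not the models used in the Angra dos Reis study.

```python
import numpy as np
from scipy.stats import norm

def hazard_curve(pga_levels, rate_m4=0.05, b=1.0, m_min=4.0, m_max=7.0,
                 dist_km=50.0, sigma_ln=0.6):
    """Annual exceedance rates for PGA levels from one areal source."""
    mags = np.linspace(m_min, m_max, 61)
    beta = b * np.log(10.0)
    # truncated Gutenberg-Richter weights over the magnitude bins
    pdf = beta * np.exp(-beta * (mags - m_min))
    pdf /= pdf.sum()
    # toy ground-motion model (assumed): ln PGA = -1.0 + 1.1*M - 1.5*ln(R)
    ln_median = -1.0 + 1.1 * mags - 1.5 * np.log(dist_km)
    rates = []
    for a in pga_levels:
        p_exceed = norm.sf((np.log(a) - ln_median) / sigma_ln)
        rates.append(rate_m4 * np.sum(pdf * p_exceed))
    return np.array(rates)

levels = np.array([0.05, 0.1, 0.2, 0.4])       # PGA in g
lam = hazard_curve(levels)                     # decreases with level
```

A logic tree, as used in the study, would evaluate such curves for each branch combination and weight the results to represent epistemic uncertainty.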
Accuracy analysis and design of A3 parallel spindle head
NASA Astrophysics Data System (ADS)
Ni, Yanbing; Zhang, Biao; Sun, Yupeng; Zhang, Yuan
2016-03-01
As functional components of machine tools, parallel mechanisms are widely used in the high-efficiency machining of aviation components, and accuracy is one of their critical technical indexes. Many researchers have focused on the accuracy of parallel mechanisms, but further effort is required to control errors and improve accuracy at the design and manufacturing stage. Aiming at the accuracy design of a 3-DOF parallel spindle head (A3 head), its error model, sensitivity analysis and tolerance allocation are investigated. Based on the inverse kinematic analysis, the error model of the A3 head is established by using first-order perturbation theory and the vector chain method. According to the mapping property of the motion and constraint Jacobian matrix, the compensatable and uncompensatable error sources which affect the accuracy of the end-effector are separated. Furthermore, sensitivity analysis is performed on the uncompensatable error sources. A sensitivity probabilistic model is established and a global sensitivity index is proposed to analyze the influence of the uncompensatable error sources on the accuracy of the end-effector of the mechanism. The results show that orientation error sources have a greater effect on the accuracy of the end-effector. Based upon the sensitivity analysis results, the tolerance design is converted into a nonlinearly constrained optimization problem with minimum manufacturing cost as the objective. By utilizing a genetic algorithm, the allocation of the tolerances on each component is finally determined. According to the tolerance allocation results, the tolerance ranges of ten kinds of geometric error sources are obtained. These research achievements can provide fundamental guidelines for component manufacturing and assembly of this kind of parallel mechanism.
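The tolerance-allocation step can be sketched as a small constrained optimization: minimize a manufacturing cost that rises as tolerances tighten, subject to an accuracy budget weighted by the error sensitivities. The cost model sum(c_i / t_i), the sensitivities, and the budget below are illustrative assumptions (and a gradient-based solver stands in for the paper's genetic algorithm).

```python
import numpy as np
from scipy.optimize import minimize

cost_coef = np.array([1.0, 2.0, 0.5])   # relative cost factors (assumed)
sens = np.array([0.8, 0.3, 1.5])        # end-effector error sensitivities (assumed)
budget = 0.10                           # allowable end-effector error, mm (assumed)

# cost(t) = sum(c_i / t_i): tighter tolerances cost more;
# constraint: weighted sum of tolerances must stay within the accuracy budget
res = minimize(lambda t: float(np.sum(cost_coef / t)),
               x0=np.full(3, 0.02),
               bounds=[(1e-4, 1.0)] * 3,
               constraints={"type": "ineq",
                            "fun": lambda t: budget - float(sens @ t)})
tol = res.x   # allocated tolerances: tighter where sensitivity is high
```

For this convex cost the optimum satisfies t_i proportional to sqrt(c_i / s_i), so the most sensitive error source (the third) receives the tightest tolerance, mirroring the qualitative conclusion of the sensitivity analysis.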
Zhou, Peiyu; Chen, Changshu; Ye, Jianjun; Shen, Wenjie; Xiong, Xiaofei; Hu, Ping; Fang, Hongda; Huang, Chuguang; Sun, Yongge
2015-04-15
Oil fingerprints have been a powerful tool widely used for determining the source of spilled oil. In most cases, this tool works well. However, it is usually difficult to identify the source if an oil spill accident occurs during offshore petroleum exploration, owing to the highly similar physicochemical characteristics of suspected oils from the same drilling platform. In this report, a case study from the waters of the South China Sea is presented, and multidimensional scaling analysis (MDS) is introduced to demonstrate how oil fingerprints can be combined with mathematical methods to identify the source of spilled oil among highly similar suspected sources. The results suggest that the MDS calculation based on oil fingerprints, subsequently integrated with specific biomarkers in the spilled oils, is an effective method with great potential for discriminating among highly similar suspected oils. Copyright © 2015 Elsevier Ltd. All rights reserved.
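The MDS step can be sketched with classical multidimensional scaling on a matrix of pairwise fingerprint distances (for example, Euclidean distances between normalized biomarker ratios). The fingerprint data below are synthetic stand-ins, not measured oil chemistry.

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical MDS: embed points from a pairwise distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D**2) @ J                 # double-centred Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]             # top-k eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

rng = np.random.default_rng(7)
source_a = rng.normal(0.0, 0.05, (3, 6))      # 3 samples, platform A fingerprints
source_b = rng.normal(1.0, 0.05, (3, 6))      # 3 samples, platform B fingerprints
X = np.vstack([source_a, source_b])
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)

coords = classical_mds(D)                     # samples cluster by source
```

In the reported workflow, a spill sample plotting inside one platform's cluster in the MDS map, then confirmed by specific biomarkers, is what identifies the source.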
The Muon Conditions Data Management:. Database Architecture and Software Infrastructure
NASA Astrophysics Data System (ADS)
Verducci, Monica
2010-04-01
The management of the Muon Conditions Database will be one of the most challenging applications for the Muon System, both in terms of data volumes and rates and in terms of the variety of data stored and their analysis. The Muon conditions database is responsible for the storage of almost all of the 'non-event' data and detector quality flags needed for debugging the detector operations and for performing the reconstruction and the analysis. In particular for the early data, knowledge of the detector performance and of the corrections in terms of efficiency and calibration will be extremely important for the correct reconstruction of the events. In this work, an overview of the entire Muon conditions database architecture is given, covering in particular the different sources of the data and the storage model used, including the associated database technology. Particular emphasis is given to the Data Quality chain: the flow of the data, the analysis and the final results are described. In addition, the software interfaces used to access the conditions data are described, in particular within the ATLAS offline reconstruction framework ATHENA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-06-01
The bibliography contains citations concerning standards and standard tests for water quality in drinking water sources, reservoirs, and distribution systems. Standards from domestic and international sources are presented. Glossaries and vocabularies that concern water quality analysis, testing, and evaluation are included. Standard test methods for individual elements, selected chemicals, sensory properties, radioactivity, and other chemical and physical properties are described. Discussions for proposed standards on new pollutant materials are briefly considered. (Contains a minimum of 203 citations and includes a subject term index and title list.)
Measurement of Fukushima Aerosol Debris in Sequim and Richland, WA and Ketchikan, AK
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miley, Harry S.; Bowyer, Ted W.; Engelmann, Mark D.
2013-05-01
Aerosol collections were initiated at several locations by PNNL shortly after the Great East Japan Earthquake of March 2011. Aerosol samples were transferred to laboratory high-resolution gamma spectrometers for analysis. Similar to treaty monitoring stations operating across the Northern hemisphere, iodine and other isotopes that could be volatilized at high temperature were detected. Though these locations are not far apart, they have significant variations with respect to water, mountain-range placement, and local topography. Variation in the computed source terms will be shown to bound the variability of this approach to source estimation.
Hynds, Paul D; Misstear, Bruce D; Gill, Laurence W
2013-09-30
While the safety of public drinking water supplies in the Republic of Ireland is governed and monitored at both local and national levels, there are currently no legislative tools in place relating to private supplies. It is therefore paramount that private well owners (and users) be aware of source specifications and potential contamination risks, to ensure adequate water quality. The objective of this study was to investigate the level of awareness among private well owners in the Republic of Ireland relating to source characterisation and groundwater contamination issues. This was undertaken through interviews with 245 private well owners. Statistical analysis indicates that respondents' source type significantly influences owner awareness, particularly regarding well construction and design parameters. Water treatment, source maintenance and regular water quality testing are considered the three primary "protective actions" (or "stewardship activities") against the consumption of contaminated groundwater, and were reported as being absent in 64%, 72% and 40% of cases, respectively. Results indicate that the level of awareness exhibited by well users did not significantly affect the likelihood of their source being contaminated (source susceptibility); increased awareness on behalf of well users was associated with increased levels of protective action, particularly among borehole owners. Hence, lower levels of awareness may result in increased contraction of waterborne illnesses where contaminants have entered the well. Accordingly, focused educational strategies to increase awareness among private groundwater users are advocated in the short term; the development and introduction of formal legislation is recommended in the long term, including an integrated programme of well inspections and risk assessments. Copyright © 2013 Elsevier Ltd. All rights reserved.
Effects of volcano topography on seismic broad-band waveforms
NASA Astrophysics Data System (ADS)
Neuberg, Jürgen; Pointer, Tim
2000-10-01
Volcano seismology often deals with rather shallow seismic sources and seismic stations deployed in their near field. The complex stratigraphy on volcanoes and near-field source effects have a strong impact on the seismic wavefield, complicating the interpretation techniques that are usually employed in earthquake seismology. In addition, as most volcanoes have a pronounced topography, the interference of the seismic wavefield with the stress-free surface results in severe waveform perturbations that affect seismic interpretation methods. In this study we deal predominantly with the surface effects, but take into account the impact of a typical volcano stratigraphy as well as near-field source effects. We derive a correction term for plane seismic waves and a plane-free surface such that for smooth topographies the effect of the free surface can be totally removed. Seismo-volcanic sources radiate energy in a broad frequency range with a correspondingly wide range of different Fresnel zones. A 2-D boundary element method is employed to study how the size of the Fresnel zone is dependent on source depth, dominant wavelength and topography in order to estimate the limits of the plane wave approximation. This approximation remains valid if the dominant wavelength does not exceed twice the source depth. Further aspects of this study concern particle motion analysis to locate point sources and the influence of the stratigraphy on particle motions. Furthermore, the deployment strategy of seismic instruments on volcanoes, as well as the direct interpretation of the broad-band waveforms in terms of pressure fluctuations in the volcanic plumbing system, are discussed.
26 CFR 1.737-1 - Recognition of precontribution gain.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Property A1 and Property A2 is long-term, U.S.-source capital gain or loss. The character of gain on Property A3 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real... long-term, U.S.-source capital gain ($10,000 gain on Property A1 and $8,000 loss on Property A2) and $1...
Data-optimized source modeling with the Backwards Liouville Test–Kinetic method
Woodroffe, J. R.; Brito, T. V.; Jordanova, V. K.; ...
2017-09-14
In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event-triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production, and the induced fission source responsible for multiplication. Our study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and the statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.
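The bootstrap approach mentioned above can be sketched directly: resample the list of per-trigger neutron counts with replacement and recompute the first three (reduced) factorial moments each time, taking the spread as the uncertainty. The Poisson count distribution below is a synthetic stand-in, not PSMC data.

```python
import numpy as np

def factorial_moments(counts):
    """First three reduced factorial moments of a count distribution."""
    n = np.asarray(counts, dtype=float)
    return np.array([n.mean(),
                     (n * (n - 1)).mean() / 2.0,
                     (n * (n - 1) * (n - 2)).mean() / 6.0])

rng = np.random.default_rng(5)
counts = rng.poisson(2.0, size=5000)          # stand-in multiplicity data

# bootstrap: resample triggers with replacement, recompute the moments
boot = np.array([factorial_moments(rng.choice(counts, counts.size, replace=True))
                 for _ in range(500)])
sigma = boot.std(axis=0)                      # uncertainty of m1, m2, m3
```

Propagating `sigma` through the point-model equations then yields the mass uncertainty, which is the quantity the three methods in the abstract are compared on.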
MICROBIAL LABORATORY GUIDANCE MANUAL FOR THE ...
The Long-Term 2 Enhanced Surface Water Treatment Rule Laboratory Instruction Manual will be a compilation of all information needed by laboratories and field personnel to collect, analyze, and report the microbiological data required under the rule. The manual will provide laboratories with a single source of information that currently is available from various sources, including the latest versions of Methods 1622 and 1623, including all approved, equivalent modifications; the procedures for E. coli methods approved for use under the LT2ESWTR; lists of vendor sources; data recording forms; data reporting requirements; information on the Laboratory Quality Assurance Evaluation Program for the Analysis of Cryptosporidium in Water; and sample collection procedures. Although most of this information is available elsewhere, a single, comprehensive compendium containing this information is needed to aid utilities and laboratories performing the sampling and analysis activities required under the LT2 rule. This manual will serve as an instruction manual for laboratories to use when collecting data for Cryptosporidium, E. coli, and turbidity.
NASA Astrophysics Data System (ADS)
Goldstein, Janna; Veitch, John; Sesana, Alberto; Vecchio, Alberto
2018-04-01
Super-massive black hole binaries are expected to produce a gravitational wave (GW) signal in the nano-Hertz frequency band which may be detected by pulsar timing arrays (PTAs) in the coming years. The signal is composed of both stochastic and individually resolvable components. Here we develop a generic Bayesian method for the analysis of resolvable sources based on the construction of `null-streams' which cancel the part of the signal held in common for each pulsar (the Earth-term). For an array of N pulsars there are N - 2 independent null-streams that cancel the GW signal from a particular sky location. This method is applied to the localisation of quasi-circular binaries undergoing adiabatic inspiral. We carry out a systematic investigation of the scaling of the localisation accuracy with signal strength and number of pulsars in the PTA. Additionally, we find that source sky localisation with the International PTA data release one is vastly superior to that achieved by its constituent regional PTAs.
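The null-stream construction can be sketched with linear algebra: the Earth-term response of each pulsar to a GW from a given sky location is a fixed pattern over the two polarizations, and combinations of timing residuals orthogonal to both pattern vectors cancel the common signal, leaving N - 2 independent null-streams. The response vectors below are random stand-ins for the geometric antenna patterns.

```python
import numpy as np

rng = np.random.default_rng(11)
N = 6                                        # number of pulsars in the array
F = rng.normal(size=(N, 2))                  # plus/cross response per pulsar (stand-in)

# the left null space of F gives the null-stream combinations:
# F = U[:, :2] @ diag(s) @ Vt, so U[:, 2:] is orthogonal to both patterns
U, s, Vt = np.linalg.svd(F, full_matrices=True)
null_proj = U[:, 2:].T                       # (N - 2) x N projector rows

# a common Earth-term signal with arbitrary polarization amplitudes
signal = F @ np.array([0.7, -0.3])
null_streams = null_proj @ signal            # cancels to round-off
```

Scanning the assumed sky location and checking where the null-streams actually empty of signal is one way such a construction localises a resolvable source.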
Observation-based source terms in the third-generation wave model WAVEWATCH
NASA Astrophysics Data System (ADS)
Zieger, Stefan; Babanin, Alexander V.; Erick Rogers, W.; Young, Ian R.
2015-12-01
Measurements collected during the AUSWEX field campaign at Lake George (Australia) resulted in new insights into the processes of wind-wave interaction and whitecapping dissipation, and consequently in new parameterizations of the input and dissipation source terms. The new nonlinear wind input term accounts for the dependence of growth on wave steepness, for airflow separation, and for negative growth rates under adverse winds. The new dissipation terms feature an inherent breaking term, a cumulative dissipation term, and a term due to the production of turbulence by waves, which is particularly relevant for decaying seas and for swell. The latter is consistent with the observed decay rate of ocean swell. This paper describes these source terms as implemented in WAVEWATCH III® and evaluates their performance against existing source terms in academic duration-limited tests, against buoy measurements for windsea-dominated conditions, under conditions of extreme wind forcing (Hurricane Katrina), and against altimeter data in global hindcasts. Results show agreement in growth curves as well as in integral and spectral parameters between the simulations and the hindcasts.
Bayesian source term determination with unknown covariance of measurements
NASA Astrophysics Data System (ADS)
Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav
2017-04-01
Determination of the source term of a release of hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimating the source term in the conventional linear inverse problem, y = Mx, where the relationship between the vector of observations y and the unknown source term x is described by the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as the optimization problem min_x (y - Mx)^T R^{-1} (y - Mx) + x^T B^{-1} x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization takes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as on the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix R of the likelihood is also unknown. We consider two potential choices for the structure of the matrix R: the first is a diagonal matrix, and the second is a locally correlated structure using information on the topology of the measuring network. Since inference in the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated by applying the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014, Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
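For fixed R and B, the objective above has the closed-form minimiser x_hat = (M^T R^{-1} M + B^{-1})^{-1} M^T R^{-1} y. A minimal numpy sketch on synthetic data (the matrix sizes, noise level, and covariance choices are assumptions for illustration, not the ETEX setup or the variational algorithm itself):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 40, 10
M = rng.standard_normal((m, n))          # hypothetical SRS matrix
x_true = np.abs(rng.standard_normal(n))  # non-negative "release" profile
y = M @ x_true + 0.01 * rng.standard_normal(m)

R = np.eye(m) * 0.01**2                  # measurement-error covariance
B = np.eye(n) * 10.0                     # prior covariance on the source term

# Minimiser of (y - Mx)^T R^-1 (y - Mx) + x^T B^-1 x
Ri = np.linalg.inv(R)
x_hat = np.linalg.solve(M.T @ Ri @ M + np.linalg.inv(B), M.T @ Ri @ y)

rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(rel_err)                           # small (noise-limited)
```

With B proportional to the identity this reduces to Tikhonov regularization, as the abstract notes; the Bayesian treatment in the paper additionally infers R and B rather than fixing them.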
Hohoff, Ariane; Rabe, Heike; Ehmer, Ulrike; Harms, Erik
2005-01-01
Background The evidence that prematurity is a priori a risk factor for palatal disturbances that increase the need for orthodontic or orthognathic treatment is still weak. Further well-designed clinical studies are needed. The objective of this review is to provide a fundamental analysis of the methodologies, confounding factors, and outcomes of studies on palatal development. One focus of this review is the analysis of studies on the palate of the term newborn, since knowing what is 'normal' is a precondition of being able to assess abnormalities. Methods A search profile based on Cochrane search strategies applied to 10 medical databases was used to identify existing studies. Articles, mainly those published before 1960, were identified from hand searches in textbooks, encyclopedias, reference lists and bibliographies. Sources in English, German, and French spanning more than a century were included. Data for term infants were recalculated if particular information about weight, length, or maturity was given. The extracted values, especially those from non-English paper sources, were provided unfiltered for comparison. Results The search strategy yielded 182 articles, of which 155 remained for final analysis. The morphology of the term newborn's palate was of great interest in the first half of the last century. Two general methodologies were used to assess palatal morphology: visual and metrical descriptions. Most of the studies on term infants suffer from a lack of reliability tests. The groove system was recognized as the distinctive feature of the infant palate. The shape of the palate of the term infant may vary considerably, both visually and metrically. Gender, race, mode of delivery, and nasal deformities were identified as causes contributing to altered palatal morphology. To this day, the anatomical features of the newborn's palate remain subject to a non-uniform nomenclature.
Conclusion Today's knowledge of a newborn's 'normal' palatal morphology is based on non-standardized and limited methodologies for measuring a three-dimensional shape. This shortcoming increases bias and is the reason for contradictory research results, especially if pathologic conditions like syndromes or prematurity are involved. Adequate measurement techniques are needed and the 'normal palatal morphology' should be defined prior to new clinical studies on palatal development. PMID:16270908
Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Waithe, Kenrick A.
2005-01-01
A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparison with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well when compared to a two-dimensional flat plate simulation that used a steady mass flow boundary condition to represent the micro jet. The model was also compared to two three-dimensional flat plate cases that used a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of the velocity distribution were made before and after the jet, and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or multiple steady micro jets, and to conduct a preliminary investigation with minimal grid generation and computational time.
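A minimal sketch of the general idea (not the OVERFLOW implementation): the jet's mass flow and momentum flux are distributed as volumetric source terms over the cells covering the orifice, so the solver sees the jet without a body-fitted jet grid. All numbers below (orifice area, jet velocity, cell indices and volumes) are illustrative assumptions.

```python
import numpy as np

rho = 1.225                           # air density, kg/m^3
A_jet = 1e-6                          # orifice area, m^2 (illustrative)
V_jet = np.array([0.0, 50.0, 0.0])    # jet exit velocity, m/s

mdot = rho * A_jet * np.linalg.norm(V_jet)   # jet mass flow, kg/s
jet_cells = [10, 11]                  # cells covering the orifice
cell_vol = 1e-7                       # cell volume, m^3

S_mass = np.zeros(20)                 # continuity source, kg/(m^3 s)
S_mom = np.zeros((20, 3))             # momentum source, N/m^3
for c in jet_cells:
    S_mass[c] = mdot / (len(jet_cells) * cell_vol)
    S_mom[c] = mdot * V_jet / (len(jet_cells) * cell_vol)

# Integrating the sources over the cells recovers the injected mass flow
print(np.isclose((S_mass * cell_vol).sum(), mdot))   # True
```

These arrays would be added to the right-hand sides of the discrete continuity and momentum equations in the cells containing the jet, which is what lets jet locations be changed without regenerating the grid.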
Unveiling the physics of AGN through X-ray variability
NASA Astrophysics Data System (ADS)
Hernández-García, L.; González-Martín, O.; Masegosa, J.; Márquez, I.
2017-03-01
Although variability is a general property characterizing active galactic nuclei (AGN), it is not well established whether the changes occur in the same way in every nucleus. The main purpose of this work is to study the X-ray variability pattern(s) in AGN selected at optical wavelengths in a large sample, including low ionization nuclear emission line regions (LINERs) and type 1.8, 1.9, and 2 Seyferts, using the public archives of Chandra and/or XMM-Newton. Spectra of the same source gathered at different epochs were simultaneously fitted to study long-term variations; the variability patterns were studied by allowing different parameters to vary during the spectral fit. Whenever possible, short-term variations from the analysis of the light curves and long-term UV flux variability were studied. Variations at X-rays on timescales of months/years are very common in all AGN families, but short-term variations are only found in type 1.8 and 1.9 Seyferts. The main driver of the long-term X-ray variations seems to be related to changes in the nuclear power. Other variability patterns cannot be discarded in a few cases. We discuss the geometry and physics of AGN through the X-ray variability analysis.
ERIC Educational Resources Information Center
Gardner, Sheena; Rea-Dickins, Pauline
2001-01-01
Investigates teacher representations of language in relation to assessment contexts. Analyzes not only what is represented in teachers' use of metalanguage, but also how it is presented--in terms of expression, voice, and source. The analysis is based on interviews with teachers, transcripts of lessons, and classroom-based assessments, formal…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-18
... on the occurrence of Cryptosporidium or E. coli in their source waters. Systems that are placed into... categories (i.e., bins) a public drinking water system (PWS) should be placed. The second topic is the... of Escherichia coli as a screen to identify small filtered PWSs that need to perform Cryptosporidium...
2014-10-01
Subject terms: Soldier shooting performance, anthropometrics, weapon design, range, sex. Report sections include Conditions and Test Matrix, Data Analysis, and Data Stratification.
Performance Analysis of AeroRP with Ground Station Advertisements
2012-03-12
Results showed that AeroRP outperforms the traditional MANET routing protocols in terms of throughput and packet delivery ratio (PDR) [5, 6]. Waiting for the source to resend a packet increases the end-to-end delay. The AeroNP header carries a corruption indicator and an HEC-CRC (header error check); its final fields (Dev ID, NP HEC CRC-16) precede the AeroTP payload.
An Improved Neutron Transport Algorithm for Space Radiation
NASA Technical Reports Server (NTRS)
Heinbockel, John H.; Clowdsley, Martha S.; Wilson, John W.
2000-01-01
A low-energy neutron transport algorithm for use in space radiation protection is developed. The algorithm is based upon a multigroup analysis of the straight-ahead Boltzmann equation using a mean value theorem for integrals. This analysis is accomplished by solving a realistic but simplified neutron transport test problem. The test problem is analyzed using numerical and analytical procedures to obtain an accurate solution within specified error bounds. Results from the test problem are then used to determine mean values for the rescattering terms in a multigroup solution of the straight-ahead Boltzmann equation. The algorithm is then coupled to the Langley HZETRN code through the evaporation source term. Evaluation of the neutron fluence generated by the solar particle event of February 23, 1956, for a water and an aluminum-water shield-target configuration is then compared with LAHET and MCNPX Monte Carlo code calculations for the same shield-target configuration. The algorithm developed showed a great improvement in results over the unmodified HZETRN solution. In addition, a two-directional solution of the evaporation source showed even further improvement of the fluence near the front of the water target, where diffusion from the front surface is important.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mark DeHart; William Skerjanc; Sean Morrell
2012-06-01
Analysis of the performance of the ATR with a LEU fuel design shows promise in terms of a core design that will yield the same neutron sources in target locations. A proposed integral cladding burnable absorber design appears to meet power profile requirements that will satisfy power distributions for safety limits. Evaluation of this fuel design is ongoing; the current work is the initial assessment of core performance with increasing burnup. Results show that LEU fuel may have a longer lifetime than HEU fuel; however, such limits may be set by the mechanical performance of the fuel rather than by available reactivity. Changes seen in the radial fuel power distribution with burnup in LEU fuel will require further study to ascertain the impact on neutron fluxes in target locations. Source terms for discharged fuel have also been studied. By its very nature, LEU fuel produces much more plutonium than is present in HEU fuel at discharge. However, the plutonium inventory appears to have little effect on the radiotoxicity or decay heat of the fuel.
Monitoring the Black Hole Binary GRS 1758-258 with INTEGRAL and RXTE
NASA Technical Reports Server (NTRS)
Pottschmidt, Katja; Chernyakova, Masha; Lubinski, Piotr; Migliari, Simone; Smith, David M.; Zdziarski, Andrzej A.; Tomsick, John A.; Bezayiff, N.; Kreykenbohm, Ingo; Kretschmar, Peter;
2008-01-01
The microquasar GRS 1758-258 is one of only three persistent black hole binaries that spend most of their time in the hard spectral state, the other two being Cyg X-1 and 1E 1740.7-2942. It therefore provides the rare opportunity for an extensive long-term study of this important black hole state, which is associated with strong variability and radio jet emission. INTEGRAL has been monitoring the source since the first Galactic Center Deep Exposure season in spring 2003, during two 2-3 month long Galactic Center viewing epochs each year, amounting to 11 epochs including spring 2008. With the exception of the last epoch, quasi-simultaneous RXTE monitoring observations are available as well. Here we present an analysis of the epoch-averaged broadband spectra, which display considerable long-term variability, most notably the occurrence of two soft/off states, extreme examples of the hysteretic behavior of black hole binaries. The hard source spectrum and long exposures allow us to extend the analysis for several epochs to approximately 800 keV using PICsIT data and to address the question of the presence of a non-thermal Comptonization component.
Silvente, Sonia; Sobolev, Anatoly P.; Lara, Miguel
2012-01-01
Soybean (Glycine max L.) is an important source of protein for human and animal nutrition, as well as a major source of vegetable oil. The soybean crop requires adequate water throughout its growth period to attain its yield potential, and a lack of soil moisture at critical stages of growth profoundly impacts productivity. In this study, utilizing 1H NMR-based metabolite analysis combined with physiological studies, we assessed the effects of short-term water stress on overall growth, nitrogen fixation, ureide and proline dynamics, and metabolic changes in drought-tolerant (NA5009RG) and sensitive (DM50048) genotypes of soybean, in order to elucidate metabolite adjustments in relation to the physiological responses of the nitrogen-fixing plants to water limitation. The results of our analysis demonstrated critical differences in physiological responses between these two genotypes and identified the metabolic pathways that are affected by short-term water limitation in soybean plants. Metabolic changes in response to drought conditions highlighted pools of metabolites that play a role in the adjustment of the metabolism and physiology of the soybean varieties to meet drought effects. PMID:22685583
Global threat to agriculture from invasive species.
Paini, Dean R; Sheppard, Andy W; Cook, David C; De Barro, Paul J; Worner, Susan P; Thomas, Matthew B
2016-07-05
Invasive species present significant threats to global agriculture, although how the magnitude and distribution of the threats vary between countries and regions remains unclear. Here, we present an analysis of almost 1,300 known invasive insect pests and pathogens, calculating the total potential cost of these species invading each of 124 countries of the world, as well as determining which countries present the greatest threat to the rest of the world given their trading partners and incumbent pool of invasive species. We find that countries vary in terms of potential threat from invasive species and also their role as potential sources, with apparently similar countries sometimes varying markedly depending on specifics of agricultural commodities and trade patterns. Overall, the biggest agricultural producers (China and the United States) could experience the greatest absolute cost from further species invasions. However, developing countries, in particular, Sub-Saharan African countries, appear most vulnerable in relative terms. Furthermore, China and the United States represent the greatest potential sources of invasive species for the rest of the world. The analysis reveals considerable scope for ongoing redistribution of known invasive pests and highlights the need for international cooperation to slow their spread.
Point focusing using loudspeaker arrays from the perspective of optimal beamforming.
Bai, Mingsian R; Hsieh, Yu-Hao
2015-06-01
Sound focusing aims to create a concentrated acoustic field in a region surrounded by a loudspeaker array. This problem was tackled in previous research via the Helmholtz integral approach, brightness control, acoustic contrast control, etc. In this paper, the same problem is revisited from the perspective of beamforming. A source array model is reformulated in terms of the steering matrix between the source and the field points, which lends itself to the use of beamforming algorithms such as minimum variance distortionless response (MVDR) and linearly constrained minimum variance (LCMV), originally intended for sensor arrays. The beamforming methods are compared with the conventional methods in terms of beam pattern, directional index, and control effort. Objective tests are conducted to assess audio quality using perceptual evaluation of audio quality (PEAQ). Experiments on the produced sound field and listening tests are conducted in a listening room, with the results processed using analysis of variance and regression analysis. In contrast to the conventional energy-based methods, the results show that the proposed methods are phase-sensitive, in light of the distortionless constraint used in formulating the array filters, which helps enhance audio quality and focusing performance.
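A minimal sketch of the MVDR formulation applied to a source array: the weights w = R^{-1}d / (d^H R^{-1} d) enforce the distortionless constraint w^H d = 1 at the focal point. The circular geometry, frequency, free-field monopole steering model, and identity weighting matrix below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

# Free-field steering vector (monopole Green's function) from each
# loudspeaker position to a field point, at wavenumber k.
def steering(src_pos, pt, k):
    r = np.linalg.norm(src_pos - pt, axis=1)
    return np.exp(-1j * k * r) / (4 * np.pi * r)

k = 2 * np.pi * 1000 / 343.0                     # wavenumber at 1 kHz in air
theta = np.linspace(0, 2 * np.pi, 16, endpoint=False)
src = 1.5 * np.c_[np.cos(theta), np.sin(theta)]  # circular array, radius 1.5 m

focus = np.array([0.3, 0.0])                     # desired focal point
d = steering(src, focus, k)                      # steering vector to the focus

# MVDR: minimise output power subject to a distortionless response
R = np.eye(len(d))                               # identity weighting (sketch)
w = np.linalg.solve(R, d) / (d.conj() @ np.linalg.solve(R, d))

# The constraint w^H d = 1 holds exactly at the focal point
print(np.allclose(w.conj() @ d, 1.0))            # True
```

The phase of w matters here, which is the phase-sensitivity the abstract contrasts with energy-based methods; in practice R would encode the field at control points away from the focus rather than being the identity.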
Analysis of Solar Spectral Irradiance Measurements from the SBUV/2-Series and the SSBUV Instruments
NASA Technical Reports Server (NTRS)
Cebula, Richard P.; DeLand, Matthew T.; Hilsenrath, Ernest
1997-01-01
During this period of performance, 1 March 1997 - 31 August 1997, the NOAA-11 SBUV/2 solar spectral irradiance data set was validated using both internal and external assessments. Initial quality checking revealed minor problems with the data (e.g., residual goniometric errors that were manifest as differences between the two scans acquired each day). The sources of these errors were determined and the errors were corrected. Time series were constructed for selected wavelengths, and the solar irradiance changes measured by the instrument were compared to a Mg II proxy-based model of short- and long-term solar irradiance variations. This analysis suggested that errors due to residual, uncorrected long-term instrument drift have been reduced to less than 1-2% over the entire 5.5 year NOAA-11 data record. A detailed statistical analysis was performed. This analysis, which will be documented in a manuscript now in preparation, conclusively demonstrates the evolution of solar rotation periodicity and strength during solar cycle 22.
Immunocompetence analysis of the aquatic snail Lymnaea stagnalis exposed to urban wastewaters.
Boisseaux, Paul; Noury, Patrice; Delorme, Nicolas; Perrier, Lucile; Thomas-Guyon, Helene; Garric, Jeanne
2018-04-02
Wastewater treatment plant effluents from urban areas are a well-known source of chronic exposure to multiple micropollutants for downstream organisms. In this study, ecologically relevant laboratory-bred freshwater gastropods, Lymnaea stagnalis, were exposed for 29 days to raw effluents of a wastewater treatment plant in the Lyon area (France). A time-course analysis of individual markers of immunocompetence (hemocyte density and viability, hemocyte NADPH activity, phenol oxidase activity, and capacity of phagocytosis) showed slight trends of inflammatory-like responses induced by the 100% effluents. So far, no short-term hazard for L. stagnalis can be identified. However, over the long term, such environmental stress stimulating immune responses could provoke deleterious life-history trade-offs, because the immune system is known to be highly energy-consuming.
Thomas, François; Cébron, Aurélie
2016-01-01
Over the last decades, understanding of the effects of plants on soil microbiomes has greatly advanced. However, knowledge of the assembly of rhizospheric communities in aged contaminated industrial soils is still limited, especially with regard to transcriptionally active microbiomes and their link to the quality or quantity of carbon sources. We compared the short-term (2–10 days) dynamics of bacterial communities and potential PAH-degrading bacteria in bare or ryegrass-planted aged contaminated soil spiked with phenanthrene, in relation to dissolved organic carbon (DOC) sources and polycyclic aromatic hydrocarbon (PAH) pollution. Both resident and active bacterial communities (analyzed from DNA and RNA, respectively) showed higher species richness and smaller dispersion between replicates in planted soils. Root development strongly favored the activity of Pseudomonadales within the first 2 days, and of members of Actinobacteria, Caulobacterales, Rhizobiales, and Xanthomonadales within 6–10 days. Plants slowed down the dissipation of phenanthrene, while root exudation provided a cocktail of labile substrates that might preferentially fuel microbial growth. Although the abundance of PAH-degrading genes increased in planted soil, their transcription level stayed similar to that in bare soil. In addition, network analysis revealed that plants induced an early shift in the identity of potential phenanthrene degraders, which might influence PAH dissipation in the long term. PMID:26903971
SPECTRAL SURVEY OF X-RAY BRIGHT ACTIVE GALACTIC NUCLEI FROM THE ROSSI X-RAY TIMING EXPLORER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rivers, Elizabeth; Markowitz, Alex; Rothschild, Richard, E-mail: erivers@ucsd.edu
2011-03-15
Using long-term monitoring data from the Rossi X-ray Timing Explorer (RXTE), we have selected 23 active galactic nuclei (AGNs) with sufficient brightness and overall observation time to derive broadband X-ray spectra from 3 to ≳100 keV. Our sample includes mainly radio-quiet Seyferts, as well as seven radio-loud sources. Given the longevity of the RXTE mission, the greater part of our data is spread out over more than a decade, providing truly long-term average spectra and eliminating inconsistencies arising from variability. We present long-term average values of absorption, Fe line parameters, Compton reflection strengths, and photon indices, as well as fluxes and luminosities for the hard and very hard energy bands, 2-10 keV and 20-100 keV, respectively. We find tentative evidence for high-energy rollovers in three of our objects. We improve upon previous surveys of the very hard X-ray energy band in terms of accuracy and sensitivity, particularly with respect to confirming and quantifying the Compton reflection component. This survey is meant to provide a baseline for future analysis with respect to the long-term averages for these sources and to cement the legacy of RXTE, and especially its High Energy X-ray Timing Experiment, as a contributor to AGN spectral science.
Short-term Wind Forecasting at Wind Farms using WRF-LES and Actuator Disk Model
NASA Astrophysics Data System (ADS)
Kirkil, Gokhan
2017-04-01
Short-term wind forecasts are obtained for a wind farm on mountainous terrain using WRF-LES. Multi-scale simulations are also performed using different PBL parameterizations. Turbines are parameterized using the Actuator Disc Model. The LES models improved the forecasts. Statistical error analysis is performed and ramp events are analyzed. The complex topography of the study area affects model performance; in particular, the accuracy of the wind forecasts was poor for cross valley-mountain flows. By means of LES, we gain new knowledge about the sources of spatial and temporal variability of wind fluctuations, such as the configuration of wind turbines.
Gravitational wave source counts at high redshift and in models with extra dimensions
DOE Office of Scientific and Technical Information (OSTI.GOV)
García-Bellido, Juan; Nesseris, Savvas; Trashorras, Manuel, E-mail: juan.garciabellido@uam.es, E-mail: savvas.nesseris@csic.es, E-mail: manuel.trashorras@csic.es
2016-07-01
Gravitational wave (GW) source counts have recently been shown to be able to test how gravitational radiation propagates with distance from the source. Here, we extend this formalism to cosmological scales, i.e. the high-redshift regime, and we discuss the complications of applying this methodology to high-redshift sources. We also allow for models with compactified extra dimensions, as in the Kaluza-Klein model. Furthermore, we consider the case of intermediate redshifts, 0 < z ≲ 1, where we show it is possible to find an analytical approximation for the source counts dN/d(S/N). This can be done in terms of cosmological parameters, such as the matter density Ω_m,0 of the cosmological constant model or the cosmographic parameters for a general dark energy model. Our analysis is as general as possible, but it depends on two important factors: a source model for the black hole binary mergers and the GW source to galaxy bias. This methodology also allows us to obtain the higher-order corrections of the source counts in terms of the signal-to-noise ratio S/N. We then forecast the sensitivity of future observations in constraining GW physics as well as the underlying cosmology by simulating sources distributed over a finite range of signal-to-noise ratios, with the number of sources ranging from 10 to 500 as expected from future detectors. We find that with 500 events it will be possible to constrain the present matter density parameter Ω_m,0 to within a few percent, with the precision growing fast with the number of events. In the case of extra dimensions we find that, depending on the degeneracies of the model, with 500 events it may be possible to place stringent limits on the existence of the extra dimensions if the aforementioned degeneracies can be broken.
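For intuition on the Euclidean baseline that the cosmological counts deviate from: for sources distributed uniformly in a local Euclidean volume with S/N inversely proportional to distance, N(> S/N) scales as (S/N)^-3. A quick Monte Carlo reproduces this (the setup below is a generic illustration under those standard assumptions, not the paper's source model or bias prescription):

```python
import numpy as np

rng = np.random.default_rng(2)

# Sources uniform in volume inside a unit sphere: radii have CDF r^3,
# so draw u ~ U(0,1) and take r = u^(1/3). Take S/N = 1/r (arbitrary units).
r = rng.uniform(0.0, 1.0, 200_000) ** (1.0 / 3.0)
snr = 1.0 / r

# Euclidean scaling: N(> rho) ~ rho^-3, hence dN/d(S/N) ~ rho^-4
rho = np.array([2.0, 4.0])
counts = np.array([(snr > x).sum() for x in rho])
print(counts[0] / counts[1])   # close to 8, i.e. (4/2)^3
```

Departures of the measured counts from this (S/N)^-3 slope at high redshift are what carry the cosmological and extra-dimensional information discussed in the abstract.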
Micropollutants in urban watersheds : substance flow analysis as management tool
NASA Astrophysics Data System (ADS)
Rossi, L.; Copin, P. J.; Barry, A. D.; Bader, H.-P.; Scheidegger, R.; Chèvre, N.
2009-04-01
Micropollutants released by cities into water are of increasing concern, as they are suspected of inducing long-term effects on both aquatic organisms and humans (e.g., hormonally active substances). Substances found in the urban water cycle have different sources in the urban area and different fates in this cycle. For example, pollutants emitted from traffic, like copper or PAHs, reach surface water during rain events often without any treatment. Pharmaceuticals resulting from human medical treatments reach surface water mainly through wastewater treatment plants, where they are only partly treated and eliminated. Another source of contamination in urban areas for these compounds is combined sewer overflows (CSOs). Once in the receiving waters (lakes, rivers, groundwater), these substances may re-enter the cycle through drinking water. It is therefore crucial to study the behaviour of micropollutants in the urban water cycle and to develop flexible tools for urban water management. Substance flow analysis (SFA) has recently been proposed as an instrument for water pollution management in urban water systems. This kind of analysis is an extension of material flow analysis (MFA), originally developed in the economic sector and later adapted to regional investigations. In this study, we test the application of SFA to a large number of classes of micropollutants to evaluate its use for urban water management. We chose the city of Lausanne as a case study since the receiving water of this city (Lake Geneva) is an important source of drinking water for the surrounding population. Moreover, profound system knowledge and many data were available, both on the sewer system and the water quality. We focus our study on one heavy metal (copper) and four pharmaceuticals (diclofenac, ibuprofen, carbamazepine and naproxen). Results for copper reveal that around 1500 kg enter the aquatic compartment yearly.
This amount contributes to sediment enrichment, which may pose a long-term risk for benthic organisms. The major sources (a total of 73%) of copper in the receiving surface water are roofs and the contact lines of trolleybuses. Thus, technical solutions have to be found to manage this specific source of contamination. Application of the SFA approach to the four pharmaceuticals reveals that CSOs represent an important source of contamination: between 14% (carbamazepine) and 61% (ibuprofen) of the total annual loads from the city of Lausanne to the lake are due to CSOs. These results will help in defining the best management strategy to limit the contamination of Lake Geneva. SFA is thus a promising tool for integrated urban water management.
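At its core, a substance flow analysis of this kind is a mass balance over source pathways. The toy balance below is in the spirit of the copper analysis: only the roughly 1500 kg/yr total and the 73% share of roofs plus trolleybus contact lines come from the abstract; the split into individual loads is invented to match those two figures.

```python
# Illustrative copper loads to the receiving water, kg/yr (invented split)
loads = {
    "roofs": 800.0,
    "trolleybus contact lines": 295.0,
    "traffic (brakes, tyres)": 250.0,
    "other diffuse sources": 155.0,
}

total = sum(loads.values())
shares = {src: 100.0 * kg / total for src, kg in loads.items()}

print(f"total load: {total:.0f} kg/yr")
top_two = shares["roofs"] + shares["trolleybus contact lines"]
print(f"roofs + contact lines: {top_two:.0f}%")
```

Ranking the pathway shares like this is what lets an SFA point management measures at the dominant sources (here, roofs and contact lines) rather than at the total load.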
Information Communication using Knowledge Engine on Flood Issues
NASA Astrophysics Data System (ADS)
Demir, I.; Krajewski, W. F.
2012-04-01
The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to and visualization of flood inundation maps, real-time flood conditions, short-term and seasonal flood forecasts, and other flood-related data for communities in Iowa. The system is designed for use by the general public, often people with no domain knowledge and a limited general science background. To communicate more effectively with such an audience, we have introduced a new way to get information on flood-related issues in IFIS: instead of navigating among the hundreds of features and interfaces of the information system and web-based sources, users obtain dynamic computations based on a collection of built-in data, analyses, and methods. The IFIS Knowledge Engine connects to distributed sources of real-time stream gauges and to in-house data sources, analysis, and visualization tools to answer questions grouped into several categories. Users are able to provide input for queries within the categories of rainfall, flood conditions, forecasts, inundation maps, flood risk, and data sensors. Our goal is the systematization of knowledge on flood-related issues and the provision of a single source of definitive answers to factual queries. The long-term goal of this knowledge engine is to make all flood-related knowledge easily accessible to everyone and to provide an educational geoinformatics tool. A future implementation of the system will accept free-form input and offer voice-recognition capabilities within browser and mobile applications. We intend to deliver increasing capabilities for the system over the coming releases of IFIS. This presentation provides an overview of our Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses future plans for providing knowledge on flood-related issues and resources.
Transfer function analysis of thermospheric perturbations
NASA Technical Reports Server (NTRS)
Mayr, H. G.; Harris, I.; Varosi, F.; Herrero, F. A.; Spencer, N. W.
1986-01-01
Applying perturbation theory, a spectral model in terms of vector spherical harmonics (Legendre polynomials) is used to describe short-term thermospheric perturbations originating in the auroral regions. The source may be Joule heating, particle precipitation, or ExB ion drift-momentum coupling. A multiconstituent atmosphere is considered, allowing for collisional momentum exchange between species including Ar, O2, N2, O, He, and H. The coupled equations of energy, mass, and momentum conservation are solved simultaneously for the major species N2 and O. Applying homogeneous boundary conditions, the integration is carried out from the Earth's surface up to 700 km. In the analysis, the spherical harmonics are treated as eigenfunctions, assuming that the Earth's rotation (and prevailing circulation) does not significantly affect perturbations with periods that are typically much less than one day. Under these simplifying assumptions, and given a particular source distribution in the vertical, a two-dimensional transfer function is constructed to describe the three-dimensional response of the atmosphere. In order of increasing horizontal wave number (order of the polynomials), this transfer function reveals five components. Compiling the transfer function is computationally very time consuming (about 100 hours on a VAX for one particular vertical source distribution). However, given the transfer function, the atmospheric response in space and time (using a Fourier integral representation) can be constructed in a few seconds of CPU time. This model is applied in a case study of wind and temperature measurements from the Dynamics Explorer B, which show features characteristic of a ringlike excitation source in the auroral oval. The data can be interpreted as gravity waves that are focused (and amplified) in the polar region and then reflected to propagate toward lower latitudes.
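The computational pay-off described above, computing the transfer function once at great cost and then synthesizing any response cheaply, can be sketched in one dimension; the low-pass transfer function below is a hypothetical stand-in for the precomputed thermospheric response, not the actual model.

```python
import numpy as np

# 1-D sketch of the transfer-function approach: the expensive step is
# computing H(omega) once; afterwards the response to ANY source time
# series is a cheap FFT multiply. H below is a hypothetical low-pass
# response, standing in for the precomputed thermospheric transfer function.

n = 256
t = np.arange(n)
source = np.zeros(n)
source[20] = 1.0                      # impulsive auroral forcing at t = 20

freq = np.fft.rfftfreq(n)
H = 1.0 / (1.0 + (freq / 0.05) ** 2)  # placeholder (zero-phase) transfer function

# Fourier synthesis: response = IFFT( H * FFT(source) )
response = np.fft.irfft(np.fft.rfft(source) * H, n)
print(f"peak response {response.max():.3f} at t = {t[np.argmax(response)]}")
```

Once H is tabulated, swapping in a different source history costs only two FFTs, which mirrors the "100 hours once, seconds per response" trade-off described in the abstract.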
Grba, Nenad; Krčmar, Dejan; Isakovski, Marijana Kragulj; Jazić, Jelena Molnar; Maletić, Snežana; Pešić, Vesna; Dalmacija, Božo
2016-11-01
Surface sediments were subject to systematic long-term monitoring (2002-2014) in the Republic of Serbia (Province of Vojvodina). Eight heavy metals (Ni, Zn, Cd, Cr, Cu, Pb, As and Hg), mineral oils (total petroleum hydrocarbons), the 16 EPA PAHs, selected pesticides and polychlorinated biphenyls (PCBs) were monitored. As part of this research, this paper presents a spatial and temporal trend study of sediment contamination from diverse pollution sources and the ecological risk status of the alluvial sediments of Carska Bara at three representative sampling sites (S1-S3), in order to establish the status of contamination and recommend substances of interest for more widespread future monitoring. Multivariate statistical methods, including factor analysis based on principal component analysis (PCA/FA), Pearson correlation and several synthetic indicators, were used to evaluate the extent and origin of contamination (anthropogenic or natural, geogenic sources) and potential ecological risks. Hg, Cd, As, mineral oils and PAHs (dominated by dibenzo(a,h)anthracene and benzo(a)pyrene, contributing 85.7% of the total) derive from several anthropogenic sources, whereas Ni, Cu, Cr and Zn are chiefly of geogenic origin or exhibit dual origins. Cd and Hg significantly raise the levels of potential ecological risk at all sampling locations, demonstrating the effect of long-term bioaccumulation and biomagnification. Pb is isolated from the other parameters, implying a distinct source. This research suggests that four heavy metals (Zn, Cr, Cu and As) and dibenzo(a,h)anthracene be added to the list of priority pollutants within the context of the application of the European Water Framework Directive (WFD), in accordance with significant national and similar environmental data from countries in the region.
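The PCA step of such a source-apportionment analysis can be sketched on a synthetic concentration matrix; the metal loadings and hidden "sources" below are invented for illustration, not the monitored Vojvodina data.

```python
import numpy as np

# PCA on a synthetic (samples x metals) concentration matrix, sketching the
# PCA/FA step used for source apportionment. Two hidden "sources" generate
# the data, so the first two principal components capture nearly all variance.

rng = np.random.default_rng(0)
n_samples = 200
anthropogenic = rng.lognormal(0.0, 0.5, n_samples)   # hidden source 1
geogenic = rng.lognormal(0.0, 0.5, n_samples)        # hidden source 2

# Hypothetical loadings: Hg/Cd follow source 1, Ni/Cr follow source 2.
X = np.column_stack([
    1.0 * anthropogenic + 0.1 * geogenic,   # "Hg"
    0.9 * anthropogenic + 0.2 * geogenic,   # "Cd"
    0.1 * anthropogenic + 1.0 * geogenic,   # "Ni"
    0.2 * anthropogenic + 0.9 * geogenic,   # "Cr"
])

Xc = X - X.mean(axis=0)                     # center each metal column
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
explained = eigvals[::-1] / eigvals.sum()   # descending variance shares
print("variance explained by PC1, PC2:", explained[:2].round(3))
```

In a real study the component loadings (eigenvectors) are then inspected to label each component as an anthropogenic or geogenic source signature.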
Versatile new ion source for the analysis of materials in open air under ambient conditions.
Cody, Robert B; Laramée, James A; Durst, H Dupont
2005-04-15
A new ion source has been developed for rapid, noncontact analysis of materials at ambient pressure and at ground potential. The new source, termed DART (for "Direct Analysis in Real Time"), is based on the reactions of electronic or vibronic excited-state species with reagent molecules and polar or nonpolar analytes. DART has been installed on a high-resolution time-of-flight mass spectrometer (TOFMS) that provides improved selectivity and accurate elemental composition assignment through exact mass measurements. Although DART has been applied to the analysis of gases, liquids, and solids, a unique application is the direct detection of chemicals on surfaces without requiring sample preparation, such as wiping or solvent extraction. DART has demonstrated success in sampling hundreds of chemicals, including chemical agents and their signatures, pharmaceuticals, metabolites, peptides and oligosaccharides, synthetic organics, organometallics, drugs of abuse, explosives, and toxic industrial chemicals. These species were detected on various surfaces, such as concrete, asphalt, human skin, currency, airline boarding passes, business cards, fruits, vegetables, spices, beverages, body fluids, horticultural leaves, cocktail glasses, and clothing. DART employs no radioactive components and is more versatile than devices using radioisotope-based ionization. Because its response is instantaneous, DART provides real-time information, a critical requirement for screening or high-throughput applications.
Augmented classical least squares multivariate spectral analysis
Haaland, David M.; Melgaard, David K.
2004-02-03
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
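The idea of augmenting a CLS calibration with shapes derived from its own spectral residuals can be sketched with synthetic spectra; this is a minimal illustration of the approach, not the authors' algorithm, and all spectra, concentrations, and the baseline drift are invented.

```python
import numpy as np

# Sketch of CLS calibration augmented with a residual-derived basis shape
# (the ACLS idea). Synthetic spectra: two known components plus an
# unmodeled baseline drift present in the calibration set.

rng = np.random.default_rng(1)
wl = np.linspace(0.0, 1.0, 120)
k1 = np.exp(-((wl - 0.3) / 0.05) ** 2)         # pure spectrum, component 1
k2 = np.exp(-((wl - 0.7) / 0.05) ** 2)         # pure spectrum, component 2
drift = wl                                     # unmodeled source of variation

C = rng.uniform(0.2, 1.0, (30, 2))             # known calibration concentrations
A = C @ np.vstack([k1, k2]) + rng.uniform(0.0, 0.5, (30, 1)) * drift

# Classical least squares calibration: estimate pure spectra from C and A.
K_cls = np.linalg.pinv(C) @ A
R = A - C @ K_cls                              # spectral residuals

# Augment: add the dominant residual shape (top right singular vector) as an
# extra "component", with its per-sample scores as extra concentrations.
u, s, vt = np.linalg.svd(R, full_matrices=False)
scores = R @ vt[0]
C_aug = np.column_stack([C, scores])
K_acls = np.linalg.pinv(C_aug) @ A

res_cls = np.linalg.norm(A - C @ K_cls)
res_acls = np.linalg.norm(A - C_aug @ K_acls)
print(f"calibration residual  CLS: {res_cls:.3f}  ACLS: {res_acls:.3f}")
```

Because the augmented basis explicitly carries the unmodeled drift, the augmented model fits the calibration spectra more closely while keeping the interpretable pure-spectrum structure of CLS.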
Henry, S B; Holzemer, W L; Reilly, C A; Campbell, K E
1994-01-01
OBJECTIVE: To analyze the terms used by nurses in a variety of data sources and to test the feasibility of using SNOMED III to represent nursing terms. DESIGN: Prospective research design with manual matching of terms to the SNOMED III vocabulary. MEASUREMENTS: The terms used by nurses to describe patient problems during 485 episodes of care for 201 patients hospitalized for Pneumocystis carinii pneumonia were identified. Problems from four data sources (nurse interview, intershift report, nursing care plan, and nurse progress note/flowsheet) were classified based on the substantive area of the problem and on the terminology used to describe the problem. A test subset of the 25 most frequently used terms from the two written data sources (nursing care plan and nurse progress note/flowsheet) was manually matched to SNOMED III terms to test the feasibility of using that existing vocabulary to represent nursing terms. RESULTS: Nurses most frequently described patient problems as signs/symptoms in the verbal nurse interview and intershift report. In the written data sources, problems were recorded as North American Nursing Diagnosis Association (NANDA) terms and signs/symptoms with similar frequencies. Of the nursing terms in the test subset, 69% were represented using one or more SNOMED III terms. PMID:7719788
Dust Storm over the Middle East: Retrieval Approach, Source Identification, and Trend Analysis
NASA Astrophysics Data System (ADS)
Moridnejad, A.; Karimi, N.; Ariya, P. A.
2014-12-01
The Middle East region has been considered responsible for approximately 25% of the Earth's global emissions of dust particles. By developing the Middle East Dust Index (MEDI) and applying it to 70 dust storms identified on MODIS images during the period 2001-2012, we present a new high-resolution map of the major atmospheric dust source points in this region. To assist environmental managers and decision makers in taking proper, prioritized measures, we then categorize the identified sources by intensity, based on indices extracted for the Deep Blue algorithm, and use a frequency-of-occurrence approach to find the sensitive sources. In the next step, by applying spectral mixture analysis to Landsat TM images (1984 and 2012), a novel desertification map is presented. The aim is to understand how human perturbations and land-use change have influenced dust storm source points in the region. Preliminary results of this study indicate, for the first time, that ca. 39% of all detected source points are located in this newly, anthropogenically desertified area. A large number of low-frequency sources are located within or close to the newly desertified areas. These severely desertified regions require immediate concern at a global scale. Over the next six months, further research will be performed to confirm these preliminary results.
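The spectral mixture analysis step can be sketched as linear unmixing of a pixel spectrum into endmember fractions; the 6-band endmember signatures below are synthetic placeholders, not Landsat TM endmembers.

```python
import numpy as np

# Linear spectral unmixing sketch: express a pixel spectrum as a linear
# combination of endmember spectra and recover the mixing fractions.
# Endmembers are hypothetical 6-band reflectance signatures.

endmembers = np.array([
    [0.10, 0.15, 0.20, 0.30, 0.35, 0.30],   # "soil"
    [0.05, 0.08, 0.06, 0.40, 0.20, 0.10],   # "vegetation"
    [0.30, 0.35, 0.40, 0.45, 0.50, 0.55],   # "sand"
]).T                                         # shape: (bands, endmembers)

true_fractions = np.array([0.2, 0.3, 0.5])
pixel = endmembers @ true_fractions          # noiseless mixed pixel

# Least-squares solve; with a noiseless pixel the fractions are recovered exactly
fractions, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
print("estimated fractions:", fractions.round(3))
```

Mapping the "sand" or bare-soil fraction over two acquisition dates is the kind of per-pixel quantity from which a desertification change map can be derived.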
Çakar, Mehmet; Tari Kasnakoglu, Berna; Ökem, Zeynep Güldem; Okuducu, Ümmühan; Beksaç, M Sinan
2016-12-01
The goal is to explore the effects of age, education, obstetric history and information sources on the (Beck) anxiety levels of pregnant women undergoing invasive prenatal testing. Questionnaire results from 152 pregnant women are utilized. Results are analyzed through an independent-samples t-test and a two-step cluster analysis that attempts to categorize patients in terms of the chosen variables. The t-tests reveal that age, education and a bad obstetric history do not significantly affect anxiety levels. Descriptive statistics indicate that almost 60% of patients feel anxious mostly because of the fear of receiving bad news, followed by the fear of miscarriage, the fear of pain and the fear of hurting the baby. According to the cluster analysis, patients who use doctors or nurses as information sources have significantly lower anxiety levels, while those who do not receive information from any source have the second-lowest level of anxiety. Patients who receive information from personal sources (i.e. friends and family) have the highest level of anxiety. Anxiety levels do not change according to test type. Doctors and nurses should allocate enough time to providing information about prenatal diagnosis before the procedure. This will reduce anxiety as well as the felt necessity to search for information from other sources, such as personal or popular ones, which would further increase anxiety.
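The independent-samples comparison used in the study can be sketched with a directly computed Welch t statistic; the anxiety scores below are simulated for two hypothetical information-source groups, not the questionnaire data.

```python
import numpy as np

# Welch's independent-samples t statistic, computed directly, on simulated
# Beck-style anxiety scores for two hypothetical information-source groups.

rng = np.random.default_rng(42)
personal = rng.normal(17.0, 4.0, 60)       # informed by friends/family
professional = rng.normal(12.0, 4.0, 60)   # informed by doctors/nurses

def welch_t(a, b):
    """Welch t statistic: mean difference over its (unpooled) standard error."""
    va, vb = a.var(ddof=1) / a.size, b.var(ddof=1) / b.size
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

t_stat = welch_t(personal, professional)
print(f"Welch t = {t_stat:.2f}")
```

A large positive t here reflects the study's pattern of higher anxiety in the personal-sources group; in practice the statistic would be referred to a t distribution for a p-value.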
Flow of GE90 Turbofan Engine Simulated
NASA Technical Reports Server (NTRS)
Veres, Joseph P.
1999-01-01
The objective of this task was to create and validate a three-dimensional model of the GE90 turbofan engine (General Electric) using the APNASA (average passage) flow code. This was a joint effort between GE Aircraft Engines and the NASA Lewis Research Center. The goal was to perform an aerodynamic analysis of the engine primary flow path, in under 24 hours of CPU time, on a parallel distributed workstation system. Enhancements were made to the APNASA Navier-Stokes code to make it faster and more robust and to allow for the analysis of more arbitrary geometry. The resulting simulation exploited the use of parallel computations by using two levels of parallelism, with extremely high efficiency. The primary flow path of the GE90 turbofan consists of a nacelle and inlet, 49 blade rows of turbomachinery, and an exhaust nozzle. Secondary flows entering and exiting the primary flow path, such as bleed, purge, and cooling flows, were modeled macroscopically as source terms to accurately simulate the engine. The information on these source terms came from detailed descriptions of the cooling flow and from thermodynamic cycle system simulations. These provided boundary condition data to the three-dimensional analysis. A simplified combustor was used to feed boundary conditions to the turbomachinery. Flow simulations of the fan, high-pressure compressor, and high- and low-pressure turbines were completed with the APNASA code.
Kothari, Anita; Boyko, Jennifer A; Campbell-Davison, Andrea
2015-09-09
Informal knowledge is used in public health practice to make sense of research findings. Although knowledge translation theories highlight the importance of informal knowledge, it is not clear to what extent the same literature provides guidance in terms of how to use it in practice. The objective of this study was to address this gap by exploring what planned action theories suggest in terms of using three types of informal knowledge: local, experiential and expert. We carried out an exploratory secondary analysis of the planned action theories that informed the development of a popular knowledge translation theory. Our sample included twenty-nine (n = 29) papers. We extracted information from these papers about sources of and guidance for using informal knowledge, and then carried out a thematic analysis. We found that theories of planned action provide guidance (including sources of, methods for identifying, and suggestions for use) for using local, experiential and expert knowledge. This study builds on previous knowledge translation related work to provide insight into the practical use of informal knowledge. Public health practitioners can refer to the guidance summarized in this paper to inform their decision-making. Further research about how to use informal knowledge in public health practice is needed given the value being accorded to using informal knowledge in public health decision-making processes.
A Semi-implicit Treatment of Porous Media in Steady-State CFD.
Domaingo, Andreas; Langmayr, Daniel; Somogyi, Bence; Almbauer, Raimund
There are many situations in computational fluid dynamics that require the definition of source terms in the Navier-Stokes equations. These source terms not only allow one to model the physics of interest but also have a strong impact on the reliability, stability, and convergence of the numerics involved. Therefore, sophisticated numerical approaches exist for the description of such source terms. In this paper, we focus on the source terms present in the Navier-Stokes or Euler equations due to porous media, in particular the Darcy-Forchheimer equation. We introduce a method for the numerical treatment of the source term that is independent of the spatial discretization and based on linearization. In this description, the source term is treated in a fully implicit way, whereas the other flow variables can be computed in an implicit or explicit manner. This leads to a more robust description in comparison with a fully explicit approach. The method is well suited to being combined with coarse-grid CFD on Cartesian grids, which makes it especially favorable for the accelerated solution of coupled 1D-3D problems. To demonstrate the applicability and robustness of the proposed method, a proof-of-concept example in 1D, as well as more complex examples in 2D and 3D, is presented.
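The linearization idea can be sketched for a single cell: freeze the nonlinear part of the Darcy-Forchheimer sink at the previous iterate so the new velocity can be solved for implicitly. The coefficients below are illustrative, and this is a minimal sketch of the linearization, not the authors' full discretization.

```python
# Sketch of a semi-implicit (linearized) treatment of a Darcy-Forchheimer
# momentum sink  S = -(mu/K) u - (rho*Cf/sqrt(K)) |u| u.
# The nonlinear term is linearized about the previous iterate u_old, so the
# new velocity is obtained from a scalar implicit update, which stays stable
# even when the drag is stiff relative to the time step.

mu_over_K = 50.0          # Darcy coefficient (1/s), deliberately stiff
forch = 5.0               # lumped Forchheimer coefficient (1/m)
g = 1.0                   # constant driving force per unit mass (m/s^2)
dt = 0.1                  # time step (s), large vs. the 1/50 s drag time scale
u = 1.0                   # initial velocity (m/s)

for _ in range(200):
    # implicit update: u_new * (1 + dt*(a + b*|u_old|)) = u_old + dt*g
    u = (u + dt * g) / (1.0 + dt * (mu_over_K + forch * abs(u)))

# The converged state satisfies the force balance g = (mu/K) u + Cf |u| u.
residual = g - (mu_over_K * u + forch * abs(u) * u)
print(f"steady velocity {u:.6f} m/s, force balance residual {residual:.2e}")
```

A fully explicit update with the same dt would be unstable here (dt times the drag coefficient exceeds the explicit stability limit), which is the robustness argument made in the abstract.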
Mantini, D; Franciotti, R; Romani, G L; Pizzella, V
2008-03-01
The major limitation for the acquisition of high-quality magnetoencephalography (MEG) recordings is the presence of disturbances of physiological and technical origin: eye movements, cardiac signals, muscular contractions, and environmental noise are serious problems for MEG signal analysis. In recent years, multi-channel MEG systems have undergone rapid technological development in terms of noise reduction, and many processing methods have been proposed for artifact rejection. Independent component analysis (ICA) has already been shown to be an effective and generally applicable technique for concurrently removing artifacts and noise from MEG recordings. However, no standardized automated system based on ICA has become available so far, because of the intrinsic difficulty of reliably categorizing the source signals obtained with this technique. In this work, approximate entropy (ApEn), a measure of data regularity, is successfully used for the classification of the signals produced by ICA, allowing for automated artifact rejection. The proposed method has been tested using MEG data sets collected during somatosensory, auditory, and visual stimulation. It was demonstrated to be effective in attenuating both biological artifacts and environmental noise, in order to reconstruct clear signals that can be used to improve brain source localization.
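Approximate entropy itself is straightforward to compute directly; the sketch below follows the standard Pincus definition and contrasts a regular signal with white noise. It is a generic illustration of the regularity measure, not the authors' MEG pipeline.

```python
import numpy as np

# Approximate entropy (ApEn) sketch: a regularity measure that is low for
# predictable signals and higher for irregular ones. Direct O(N^2)
# implementation of the standard definition (self-matches included).

def apen(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()                                # common tolerance choice
    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])   # length-m templates
        # Chebyshev distances between all template pairs (self-matches kept)
        d = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
        c = (d <= r).mean(axis=1)                        # match fractions
        return np.log(c).mean()
    return phi(m) - phi(m + 1)

t = np.arange(400)
regular = np.sin(0.2 * t)                                # periodic signal
noisy = np.random.default_rng(3).normal(size=400)        # white noise
print(f"ApEn sine: {apen(regular):.3f}  ApEn noise: {apen(noisy):.3f}")
```

In an ICA-based cleanup, a component with very low ApEn (highly regular, e.g. cardiac or line noise) or very high ApEn (noise-like) is a candidate for rejection; the thresholds would be tuned to the data.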
MultiElec: A MATLAB Based Application for MEA Data Analysis.
Georgiadis, Vassilis; Stephanou, Anastasis; Townsend, Paul A; Jackson, Thomas R
2015-01-01
We present MultiElec, an open source MATLAB based application for data analysis of microelectrode array (MEA) recordings. MultiElec displays an extremely user-friendly graphic user interface (GUI) that allows the simultaneous display and analysis of voltage traces for 60 electrodes and includes functions for activation-time determination and the production of activation-time heat maps with isoline display. Furthermore, local conduction velocities are semi-automatically calculated along with their corresponding vector plots. MultiElec allows ad hoc signal suppression, enabling the user to easily and efficiently handle signal artefacts and to analyse incomplete data sets. Voltage traces and heat maps can be simply exported for figure production and presentation. In addition, our platform is able to produce 3D videos of signal progression over all 60 electrodes. Functions are controlled entirely by a single GUI, with no need for command-line input or any understanding of MATLAB code. MultiElec is open source under the terms of the GNU General Public License as published by the Free Software Foundation, version 3. Both the program and source code are available to download from http://www.cancer.manchester.ac.uk/MultiElec/.
The evolution of methods for noise prediction of high speed rotors and propellers in the time domain
NASA Technical Reports Server (NTRS)
Farassat, F.
1986-01-01
Linear wave-equation models that have been used over the years at NASA Langley to describe noise emission from high-speed rotating blades are summarized. The noise sources are assumed to lie on a moving surface, and the analysis is based on the Ffowcs Williams-Hawkings (FW-H) equation. Although the equation accounts for two surface source terms and one volume source term, the NASA analyses have considered only the surface terms. Several variations on the FW-H model are delineated for various types of applications, noting the computational benefits of removing the frequency dependence of the calculations. Formulations are also provided for compact and noncompact sources, and features of Long's subsonic integral equation and Farassat's high-speed integral equation are discussed. The choice between the subsonic and high-speed models depends on the Mach number of the blade surface where the source is located.
Flow diagram analysis of electrical fatalities in construction industry.
Chi, Chia-Fen; Lin, Yuan-Yuan; Ikhwan, Mohamad
2012-01-01
The current study reanalyzed 250 electrical fatalities in the construction industry from 1996 to 2002, classifying them into seven patterns based on the source of electricity (power line, energized equipment, improperly installed or damaged equipment) and on direct contact or indirect contact through some source of injury (boom vehicle, metal bar or pipe, or other conductive material). Each fatality was coded in terms of age, company size, experience, task being performed, source of injury, accident cause, and hazard pattern. The Chi-square Automatic Interaction Detector (CHAID) was applied to the coded data on fatal electrocutions to find a subset of predictors that might yield meaningful classifications or accident scenarios. A series of flow diagrams was constructed from the CHAID results to illustrate the flow of electricity from the electrical source to the human body. Each flow diagram can be directly linked with feasible prevention strategies that cut the flow of electricity.
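The chi-square test of independence that underlies CHAID's splitting criterion can be sketched on a small contingency table; the counts below are invented for illustration, not the coded fatality data.

```python
import numpy as np

# Chi-square test of independence, the splitting criterion behind CHAID,
# computed directly on a hypothetical contingency table of electricity
# source (rows) versus contact type (columns). Counts are illustrative.

observed = np.array([
    [30, 10],    # power line:          direct, indirect
    [15, 25],    # energized equipment: direct, indirect
    [ 5, 15],    # damaged equipment:   direct, indirect
])

row = observed.sum(axis=1, keepdims=True)
col = observed.sum(axis=0, keepdims=True)
expected = row @ col / observed.sum()          # counts expected under independence
chi2 = ((observed - expected) ** 2 / expected).sum()
dof = (observed.shape[0] - 1) * (observed.shape[1] - 1)
print(f"chi2 = {chi2:.2f} with {dof} degrees of freedom")
```

CHAID repeatedly evaluates such statistics over candidate predictors and splits on the most significant one, which is how it derives the accident-scenario groupings described above.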
NASA Astrophysics Data System (ADS)
Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne
2014-01-01
Inverse modelling techniques can be used to estimate the amount of radionuclides and the temporal profile of the source term released into the atmosphere during the accident at the Fukushima Daiichi nuclear power plant in March 2011. In Winiarek et al. (2012b), the lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques, using activity concentration measurements. The importance of an objective assessment of prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such a critical context, where the meteorological conditions can make the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, the retrieved source term being very sensitive to this estimation. We propose to extend the use of these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to compute an estimate of the caesium-137 source term jointly using all available data on this radionuclide, such as activity concentrations in the air, but also daily fallout measurements and total cumulated fallout measurements. It is crucial to properly and simultaneously estimate the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition for reliably estimating the a posteriori uncertainty of the estimated source term. Using such techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6-19.3 PBq, with an estimated standard deviation range of 15-20% depending on the method and the data sets. The “blind” time intervals of the source term have also been strongly mitigated compared with the first estimations based only on activity concentration data.
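The role of the prior (background and observation) error variances in such an inversion can be sketched with a linear Gaussian toy problem; the transport operator, source profile, and error levels below are all synthetic.

```python
import numpy as np

# Sketch of linear source-term inversion with Gaussian priors:
# observations y = H x + noise, background (prior) source x_b.
# The estimate minimizes ||y - Hx||^2/sigma_o^2 + ||x - x_b||^2/sigma_b^2,
# so the assumed prior error variances directly shape the retrieval.

rng = np.random.default_rng(7)
n_obs, n_src = 40, 10
H = rng.uniform(0.0, 1.0, (n_obs, n_src))     # stand-in transport operator
x_true = np.zeros(n_src)
x_true[3:6] = [2.0, 5.0, 3.0]                 # release pulse over 3 time steps
sigma_o, sigma_b = 0.1, 2.0                   # observation / background std devs

y = H @ x_true + rng.normal(0.0, sigma_o, n_obs)
x_b = np.zeros(n_src)                         # background guess: no release

# Normal equations of the regularized least-squares problem
A = H.T @ H / sigma_o**2 + np.eye(n_src) / sigma_b**2
b = H.T @ y / sigma_o**2 + x_b / sigma_b**2
x_hat = np.linalg.solve(A, b)
print("total release  true:", x_true.sum(), " retrieved:", round(x_hat.sum(), 2))
```

Misjudging sigma_o or sigma_b skews the balance between data and background, which is why the abstract stresses objective estimation of these prior errors before trusting the retrieved total.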
Sani, Norrakiah Abdullah; Odeyemi, Olumide A
2015-01-01
Cronobacter species are motile, non-spore-forming, Gram-negative emerging opportunistic pathogens mostly associated with bacteremia, meningitis, septicemia, brain abscesses and necrotizing enterocolitis in infected neonates, infants and immunocompromised adults. Members of the genus Cronobacter have previously been associated with powdered infant formula, although the main reservoirs and routes of contamination are yet to be ascertained. This study therefore aims to summarize the occurrence and prevalence of Cronobacter spp. from different food-related sources. A retrospective systematic review and meta-analysis of peer-reviewed primary studies reported between 2008 and 2014 on the occurrence and prevalence of Cronobacter spp. in animal- and plant-related sources was conducted using "Cronobacter isolation", "Cronobacter detection" and "Cronobacter enumeration" as search terms in the following databases: Web of Science (Science Direct) and ProQuest. Data extracted from the primary studies were then analyzed with meta-analysis techniques for effect rates, and fixed-effects models were used to explore heterogeneity between the sources. Publication bias was evaluated using a funnel plot. A total of 916 articles were retrieved from the databases, of which 28 met the inclusion criteria. Cronobacter spp. could be isolated from only 103 (5.7%) samples of animal-related food, while 123 (19%) samples of plant-related food harbored the bacteria. The results of this study show that the occurrence of Cronobacter was more prevalent in plant-related sources, with an overall prevalence rate of 20.1% (95% CI 0.168-0.238), than in animal-originated sources, with an overall prevalence rate of 8% (95% CI 0.066-0.096). High heterogeneity (I² = 84%) was observed mostly in plant-related sources such as herbs, spices and vegetables, compared to animal-related sources (I² = 82%). It can be observed from this study that plant-related sources serve as reservoirs and contamination routes of Cronobacter spp.
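The fixed-effect pooling behind such overall prevalence estimates can be sketched with inverse-variance weighting; the per-study counts below are illustrative, not the reviewed Cronobacter data.

```python
import numpy as np

# Fixed-effect pooling sketch for prevalence estimates: inverse-variance
# weighting of per-study proportions. Study counts are illustrative only.

events = np.array([12, 30, 8, 25])        # positive samples per study
totals = np.array([60, 120, 50, 100])     # samples tested per study

p = events / totals                       # per-study prevalence estimates
var = p * (1 - p) / totals                # binomial variance of each estimate
w = 1.0 / var                             # inverse-variance weights
p_pooled = (w * p).sum() / w.sum()
se = np.sqrt(1.0 / w.sum())
lo, hi = p_pooled - 1.96 * se, p_pooled + 1.96 * se
print(f"pooled prevalence {p_pooled:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

Larger, more precise studies get proportionally more weight, which is how the review arrives at a single overall prevalence rate with a confidence interval for each source category.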
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friese, Judah I.; Kephart, Rosara F.; Lucas, Dawn D.
2013-05-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) provides for remote radionuclide monitoring followed by an On-Site Inspection (OSI) to clarify the nature of a suspect event. An important aspect of radionuclide measurements on site is the discrimination of other potential sources of similar radionuclides, such as reactor accidents or medical isotope production. The Chernobyl and Fukushima nuclear reactor disasters offer two different reactor source term environmental inputs that can be compared against historical measurements of nuclear explosions. The comparison of whole-sample gamma spectrometry measurements from these three events and the analysis of similarities and differences are presented. This analysis is a step toward confirming what is needed for measurements during an OSI under the auspices of the Comprehensive Test Ban Treaty.
Association of Internet search trends with suicide death in Taipei City, Taiwan, 2004-2009.
Yang, Albert C; Tsai, Shi-Jen; Huang, Norden E; Peng, Chung-Kang
2011-07-01
Although the Internet has become an important source for affected people seeking suicide information, the connection between Internet searches for suicide information and suicidal death remains largely unknown. This study aims to evaluate the association between suicide and Internet search trends for 37 suicide-related terms representing major known risks of suicide. This study analyzes suicide death data in Taipei City, Taiwan and corresponding local Internet search trend data provided by Google Insights for Search during the period from January 2004 to December 2009. The investigation uses cross correlation analysis to estimate the temporal relationship between suicide and Internet search trends and multiple linear regression analysis to identify significant factors associated with suicide from a pool of search trend data that either coincides or precedes the suicide death. Results show that a set of suicide-related search terms, the trends of which either temporally coincided or preceded trends of suicide data, were associated with suicide death. These search factors varied among different suicide samples. Searches for "major depression" and "divorce" accounted for, at most, 30.2% of the variance in suicide data. When considering only leading suicide trends, searches for "divorce" and the pro-suicide term "complete guide of suicide," accounted for 22.7% of variance in suicide data. Appropriate filtering and detection of potentially harmful sources in keyword-driven search results by search engine providers may be a reasonable strategy to reduce suicide deaths. Copyright © 2011 Elsevier B.V. All rights reserved.
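The lagged cross-correlation step described above can be sketched as follows; the function name, the `max_lag` window, and the input series are illustrative assumptions, not the study's actual code:

```python
import numpy as np

def lagged_cross_correlation(search_trend, suicide_counts, max_lag=6):
    """Pearson correlation between a monthly search-trend series and
    suicide counts at lags 0..max_lag, with the trend leading."""
    x = np.asarray(search_trend, dtype=float)
    y = np.asarray(suicide_counts, dtype=float)
    results = {}
    for lag in range(max_lag + 1):
        # trend at month t compared against deaths at month t + lag
        a = x[:len(x) - lag] if lag else x
        b = y[lag:]
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        results[lag] = float(np.mean(a * b))
    return results
```

Terms whose correlation peaks at a positive lag would be the "leading" trends the study feeds into the regression stage.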
Long Term 2 Second Round Source Water Monitoring and Bin Placement Memo
The Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) applies to all public water systems served by a surface water source or public water systems served by a ground water source under the direct influence of surface water.
Uncertainty, variability, and earthquake physics in ground‐motion prediction equations
Baltay, Annemarie S.; Hanks, Thomas C.; Abrahamson, Norm A.
2017-01-01
Residuals between ground‐motion data and ground‐motion prediction equations (GMPEs) can be decomposed into terms representing earthquake source, path, and site effects. These terms can be cast in terms of repeatable (epistemic) residuals and the random (aleatory) components. Identifying the repeatable residuals leads to a GMPE with reduced uncertainty for a specific source, site, or path location, which in turn can yield a lower hazard level at small probabilities of exceedance. We illustrate a schematic framework for this residual partitioning with a dataset from the ANZA network, which straddles the central San Jacinto fault in southern California. The dataset consists of more than 3200 1.15≤M≤3 earthquakes and their peak ground accelerations (PGAs), recorded at close distances (R≤20 km). We construct a small‐magnitude GMPE for these PGA data, incorporating VS30 site conditions and geometrical spreading. Identification and removal of the repeatable source, path, and site terms yield an overall reduction in the standard deviation from 0.97 (in ln units) to 0.44, for a nonergodic assumption, that is, for a single‐source location, single site, and single path. We give examples of relationships between independent seismological observables and the repeatable terms. We find a correlation between location‐based source terms and stress drops in the San Jacinto fault zone region; an explanation of the site term as a function of kappa, the near‐site attenuation parameter; and a suggestion that the path component can be related directly to elastic structure. These correlations allow the repeatable source location, site, and path terms to be determined a priori using independent geophysical relationships. Those terms could be incorporated into location‐specific GMPEs for more accurate and precise ground‐motion prediction.
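The residual-partitioning idea can be illustrated with a small simulation; the counts and term variances here are hypothetical stand-ins, not the ANZA values:

```python
import numpy as np

rng = np.random.default_rng(0)
n_events, n_sites, n_rec = 50, 20, 2000

# synthetic residuals: repeatable event and site terms plus aleatory noise
event_term = rng.normal(0.0, 0.5, n_events)
site_term = rng.normal(0.0, 0.6, n_sites)
ev = rng.integers(0, n_events, n_rec)
st = rng.integers(0, n_sites, n_rec)
resid = event_term[ev] + site_term[st] + rng.normal(0.0, 0.4, n_rec)

# estimate and remove the repeatable terms as simple group means
ev_mean = np.array([resid[ev == i].mean() for i in range(n_events)])
resid1 = resid - ev_mean[ev]
st_mean = np.array([resid1[st == j].mean() for j in range(n_sites)])
resid2 = resid1 - st_mean[st]

print(resid.std(), resid2.std())  # total sigma vs remaining aleatory sigma
```

The drop from the first to the second standard deviation mirrors the paper's reduction from the ergodic to the single-source, single-site, single-path sigma.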
Bonkosky, M; Hernández-Delgado, E A; Sandoz, B; Robledo, I E; Norat-Ramírez, J; Mattei, H
2009-01-01
Human fecal contamination of coral reefs is a major cause of concern. Conventional methods used to monitor microbial water quality cannot be used to discriminate between different fecal pollution sources. Fecal coliforms, enterococci, and human-specific Bacteroides (HF183, HF134), general Bacteroides-Prevotella (GB32), and Clostridium coccoides group (CP) 16S rDNA PCR assays were used to test for the presence of non-point source fecal contamination across the southwestern Puerto Rico shelf. Inshore waters were highly turbid, consistently receiving fecal pollution from variable sources, and showing the highest frequency of positive molecular marker signals. Signals were also detected at offshore waters in compliance with existing microbiological quality regulations. Phylogenetic analysis showed that most isolates were of human fecal origin. The geographic extent of non-point source fecal pollution was large and impacted extensive coral reef systems. This could have deleterious long-term impacts on public health, local fisheries and in tourism potential if not adequately addressed.
Aerosol Source Attributions and Source-Receptor Relationships Across the Northern Hemisphere
NASA Technical Reports Server (NTRS)
Bian, Huisheng; Chin, Mian; Kucsera, Tom; Pan, Xiaohua; Darmenov, Anton; Colarco, Peter; Torres, Omar; Shults, Michael
2014-01-01
Emissions and long-range transport of air pollution pose major concerns for air quality and climate change. To better assess the impact of intercontinental transport of air pollution on regional and global air quality, ecosystems, and near-term climate change, the UN Task Force on Hemispheric Transport of Air Pollution (HTAP) is organizing a phase II activity (HTAP2) that includes global and regional model experiments and data analysis, focusing on ozone and aerosols. This study presents the initial results of HTAP2 global aerosol modeling experiments. We will (a) evaluate the model results with surface and aircraft measurements, (b) examine the relative contributions of regional emissions and extra-regional sources to surface PM concentrations and column aerosol optical depth (AOD) over several NH pollution and dust source regions and the Arctic, and (c) quantify the source-receptor relationships in the pollution regions that reflect the sensitivity of regional aerosol amount to the regional and extra-regional emission reductions.
NASA Astrophysics Data System (ADS)
Zhang, Ye; Gong, Rongfang; Cheng, Xiaoliang; Gulliksson, Mårten
2018-06-01
This study considers the inverse source problem for elliptic partial differential equations with both Dirichlet and Neumann boundary data. The unknown source term is to be determined by additional boundary conditions. Unlike the existing methods found in the literature, which usually employ the first-order in time gradient-like system (such as the steepest descent methods) for numerically solving the regularized optimization problem with a fixed regularization parameter, we propose a novel method with a second-order in time dissipative gradient-like system and a dynamical selected regularization parameter. A damped symplectic scheme is proposed for the numerical solution. Theoretical analysis is given for both the continuous model and the numerical algorithm. Several numerical examples are provided to show the robustness of the proposed algorithm.
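A minimal sketch of a second-order in time dissipative gradient flow with a symplectic-type step, assuming a simple Tikhonov-regularized least-squares functional and a fixed (rather than dynamically selected) regularization parameter:

```python
import numpy as np

def damped_dynamic_solver(A, b, alpha=1e-3, eta=1.0, dt=0.05, steps=5000):
    """Integrate  x'' + eta * x' = -grad J(x)  with
    J(x) = 0.5*||Ax - b||^2 + 0.5*alpha*||x||^2, using a damped
    symplectic-Euler step (a simplified stand-in for the paper's scheme)."""
    n = A.shape[1]
    x = np.zeros(n)
    v = np.zeros(n)
    for _ in range(steps):
        grad = A.T @ (A @ x - b) + alpha * x
        v = (v - dt * grad) / (1.0 + dt * eta)  # implicit treatment of damping
        x = x + dt * v                           # symplectic position update
    return x
```

As the dissipation removes energy, the trajectory settles at the minimizer of the regularized functional, i.e. the solution of (AᵀA + αI)x = Aᵀb.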
Surface Wave Dynamics in the Coastal Zone
2014-09-30
also collected from the Duck measurement site, operated by the USACE Field Research Facility at Duck, North Carolina. The collection and validation...similar analysis for 10 storm periods using wave data collected at Duck, North Carolina. The preparations consist of creating a dedicated unstructured...validated in the Southern North Sea and Duck validation studies. The shallow water source terms for wave breaking and triad interactions are being
ERIC Educational Resources Information Center
Scrimshaw, Susan
This guidebook is both a practical tool and a source book to aid health planners assess the importance, extent, and impact of indigenous and private sector medical systems in developing nations. Guidelines are provided for assessment in terms of: use patterns; the meaning and importance to users of various available health services; and ways of…
Navy Future Fleet Platform Architecture Study
2016-07-01
Aircraft Carriers Source: GAO Report GAO/NSIAD-98-1, Navy Aircraft Carriers: Cost-Effectiveness of Conventionally and Nuclear-Powered Carriers...and Russia. The analysis shows the U.S. Navy has a decisive advantage in terms of striking power from aircraft carriers, surface combatants, and...conventional power, but roughly the same displacement and an emphasis on containing costs now that some of the Nuclear Propulsion Program requirements no
Interlaboratory study of the ion source memory effect in 36Cl accelerator mass spectrometry
NASA Astrophysics Data System (ADS)
Pavetich, Stefan; Akhmadaliev, Shavkat; Arnold, Maurice; Aumaître, Georges; Bourlès, Didier; Buchriegler, Josef; Golser, Robin; Keddadouche, Karim; Martschini, Martin; Merchel, Silke; Rugel, Georg; Steier, Peter
2014-06-01
Understanding and minimization of contaminations in the ion source due to cross-contamination and long-term memory effect is one of the key issues for accurate accelerator mass spectrometry (AMS) measurements of volatile elements. The focus of this work is on the investigation of the long-term memory effect for the volatile element chlorine, and the minimization of this effect in the ion source of the Dresden accelerator mass spectrometry facility (DREAMS). For this purpose, one of the two original HVE ion sources at the DREAMS facility was modified, allowing the use of larger sample holders having individual target apertures. Additionally, a more open geometry was used to improve the vacuum level. To evaluate this improvement in comparison to other up-to-date ion sources, an interlaboratory comparison had been initiated. The long-term memory effect of the four Cs sputter ion sources at DREAMS (two sources: original and modified), ASTER (Accélérateur pour les Sciences de la Terre, Environnement, Risques) and VERA (Vienna Environmental Research Accelerator) had been investigated by measuring samples of natural 35Cl/37Cl-ratio and samples highly-enriched in 35Cl (35Cl/37Cl ∼ 999). Besides investigating and comparing the individual levels of long-term memory, recovery time constants could be calculated. The tests show that all four sources suffer from long-term memory, but the modified DREAMS ion source showed the lowest level of contamination. The recovery times of the four ion sources were widely spread between 61 and 1390 s, where the modified DREAMS ion source with values between 156 and 262 s showed the fastest recovery in 80% of the measurements.
Master, Suely; Guzman, Marco; Carlos de Miranda, Helder; Lloyd, Adam
2013-03-01
Previous studies with long-term average spectrum (LTAS) showed the importance of the glottal source for understanding the projected voices of actresses. In this study, electroglottographic (EGG) analysis was used to investigate the contribution of the glottal source to the projected voice, comparing actresses' and nonactresses' voices at different levels of intensity. Thirty actresses and 30 nonactresses sustained vowels in habitual, moderate, and loud intensity levels. The EGG variables were contact quotient (CQ), closing quotient (QCQ), and opening quotient (QOQ). Other variables were sound pressure level (SPL) and fundamental frequency (F0). A KayPENTAX EGG was used. Variables were entered into a general linear model. Actresses showed significantly higher values for SPL in all levels, and both groups increased SPL significantly while changing from habitual to moderate and further to loud. There were no significant differences between groups for EGG quotients. There were significant differences between the levels only for F0 and CQ for both groups. SPL was significantly higher among actresses in all intensity levels, but in the EGG analysis, no differences were found. This apparently weak contribution of the glottal source in the supposedly projected voices of actresses, contrary to previous LTAS studies, might be because of a higher subglottal pressure or perhaps a greater vocal tract contribution to SPL. Results from the present study suggest that trained subjects did not produce a significantly higher SPL than untrained individuals by increasing the cost in terms of higher vocal fold collision and hence more impact stress. Future research should explore the difference between trained and nontrained voices by aerodynamic measurements to evaluate the relationship between physiologic findings and the acoustic and EGG data. Moreover, further studies should consider both types of vocal tasks, sustained vowel and running speech, for both EGG and LTAS analysis.
Copyright © 2013 The Voice Foundation. Published by Mosby, Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Turbelin, Grégory; Singh, Sarvesh Kumar; Issartel, Jean-Pierre
2014-12-01
In the event of an accidental or intentional contaminant release in the atmosphere, it is imperative, for managing emergency response, to diagnose the release parameters of the source from measured data. Reconstruction of the source information exploiting measured data is called an inverse problem. To solve such a problem, several techniques are currently being developed. The first part of this paper provides a detailed description of one of them, known as the renormalization method. This technique, proposed by Issartel (2005), has been derived using an approach different from that of standard inversion methods and gives a linear solution to the continuous Source Term Estimation (STE) problem. In the second part of this paper, the discrete counterpart of this method is presented. By using matrix notation, common in data assimilation and suitable for numerical computing, it is shown that the discrete renormalized solution belongs to a family of well-known inverse solutions (minimum weighted norm solutions), which can be computed by using the concept of generalized inverse operator. It is shown that, when the weight matrix satisfies the renormalization condition, this operator satisfies the criteria used in geophysics to define good inverses. Notably, by means of the Model Resolution Matrix (MRM) formalism, we demonstrate that the renormalized solution fulfils optimal properties for the localization of single point sources. Throughout the article, the main concepts are illustrated with data from a wind tunnel experiment conducted at the Environmental Flow Research Centre at the University of Surrey, UK.
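The minimum-weighted-norm generalized inverse mentioned above can be sketched as follows; `G`, `mu`, and `W` are generic stand-ins for the sensitivity matrix, the measurement vector, and the renormalizing weight matrix:

```python
import numpy as np

def weighted_min_norm_solution(G, mu, W):
    """Minimum W-weighted-norm solution of the underdetermined system
    G s = mu:  s = W^{-1} G^T (G W^{-1} G^T)^{-1} mu.
    With W = I this reduces to the Moore-Penrose (minimum-norm) solution."""
    Winv_Gt = np.linalg.solve(W, G.T)          # W^{-1} G^T
    s = Winv_Gt @ np.linalg.solve(G @ Winv_Gt, mu)
    return s
```

In the renormalization method the weight is chosen so that the renormalization condition holds, which is what gives the solution its favorable point-source localization properties.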
Elemental composition and size distribution of particulates in Cleveland, Ohio
NASA Technical Reports Server (NTRS)
King, R. B.; Fordyce, J. S.; Neustadter, H. E.; Leibecki, H. F.
1975-01-01
Measurements were made of the elemental particle size distribution at five contrasting urban environments with different source-type distributions in Cleveland, Ohio. Air quality conditions ranged from normal to air pollution alert levels. A parallel network of high-volume cascade impactors (5-state) were used for simultaneous sampling on glass fiber surfaces for mass determinations and on Whatman-41 surfaces for elemental analysis by neutron activation for 25 elements. The elemental data are assessed in terms of distribution functions and interrelationships and are compared between locations as a function of resultant wind direction in an attempt to relate the findings to sources.
Elemental composition and size distribution of particulates in Cleveland, Ohio
NASA Technical Reports Server (NTRS)
Leibecki, H. F.; King, R. B.; Fordyce, J. S.; Neustadter, H. E.
1975-01-01
Measurements have been made of the elemental particle size distribution at five contrasting urban environments with different source-type distributions in Cleveland, Ohio. Air quality conditions ranged from normal to air pollution alert levels. A parallel network of high-volume cascade impactors (5-stage) were used for simultaneous sampling on glass fiber surfaces for mass determinations and on Whatman-41 surfaces for elemental analysis by neutron activation for 25 elements. The elemental data are assessed in terms of distribution functions and interrelationships and are compared between locations as a function of resultant wind direction in an attempt to relate the findings to sources.
NASA Technical Reports Server (NTRS)
Yee, H. C.; Shinn, J. L.
1986-01-01
Some numerical aspects of finite-difference algorithms for nonlinear multidimensional hyperbolic conservation laws with stiff nonhomogeneous (source) terms are discussed. If the stiffness is entirely dominated by the source term, a semi-implicit shock-capturing method is proposed, provided that the Jacobian of the source terms possesses certain properties. The proposed semi-implicit method can be viewed as a variant of the Bussing and Murman point-implicit scheme with a more appropriate numerical dissipation for the computation of strong shock waves. However, if the stiffness is not solely dominated by the source terms, a fully implicit method would be a better choice. The situation is complicated in problems of more than one dimension, and the presence of stiff source terms further complicates the solution procedures for alternating direction implicit (ADI) methods. Several alternatives are discussed. The primary motivation for constructing these schemes was to address thermally and chemically nonequilibrium flows in the hypersonic regime. Due to the unique structure of the eigenvalues and eigenvectors for fluid flows of this type, the computation can be simplified, thus providing a more efficient solution procedure than one might have anticipated.
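A point-implicit update of the Bussing-Murman type can be sketched for a scalar stiff relaxation problem; the specific ODE and rate constant are illustrative, not taken from the paper:

```python
def point_implicit_step(u, dt, source, dsource_du):
    """One point-implicit update for u_t = S(u):
    solve (1 - dt * dS/du) * du = dt * S(u), then u <- u + du.
    Stable for stiff S where an explicit step would need a tiny dt."""
    denom = 1.0 - dt * dsource_du(u)
    return u + dt * source(u) / denom

# stiff relaxation toward equilibrium u_eq = 1 with rate k = 1e4;
# an explicit Euler step with dt = 0.1 (dt*k = 1000) would blow up
k, u_eq = 1.0e4, 1.0
S = lambda u: -k * (u - u_eq)
dS = lambda u: -k

u = 0.0
for _ in range(10):
    u = point_implicit_step(u, dt=0.1, source=S, dsource_du=dS)
```

Each step damps the error by a factor 1/(1 + dt*k), so the iterate relaxes monotonically to equilibrium regardless of the stiffness.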
Common Calibration Source for Monitoring Long-term Ozone Trends
NASA Technical Reports Server (NTRS)
Kowalewski, Matthew
2004-01-01
Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and the limited lifetimes of satellite monitoring instruments demand that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Monitoring Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.
Watershed nitrogen and phosphorus balance: The upper Potomac River basin
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jaworski, N.A.; Groffman, P.M.; Keller, A.A.
1992-01-01
Nitrogen and phosphorus mass balances were estimated for the portion of the Potomac River basin watershed located above Washington, D.C. The total nitrogen (N) balance included seven input source terms, six sinks, and one 'change-in-storage' term, but was simplified to five input terms and three output terms. The phosphorus (P) balance had four input and three output terms. The estimated balances are based on watershed data from seven information sources. Major sources of nitrogen are animal waste and atmospheric deposition. The major sources of phosphorus are animal waste and fertilizer. The major sink for nitrogen is combined denitrification, volatilization, and change-in-storage. The major sink for phosphorus is change-in-storage. River exports of N and P were 17% and 8%, respectively, of the total N and P inputs. Over 60% of the N and P were volatilized or stored. The major input and output terms in the budget are estimated from direct measurements, but the change-in-storage term is calculated by difference. The factors regulating retention and storage processes are discussed and research needs are identified.
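The budget-closure arithmetic (the combined denitrification/volatilization/change-in-storage term taken as the residual of inputs minus measured outputs) can be sketched with hypothetical numbers, chosen only so that river export comes out near the 17% reported for N:

```python
# hypothetical N budget terms (arbitrary units/yr), not the paper's values
inputs = {"animal_waste": 60.0, "atmospheric_deposition": 40.0,
          "fertilizer": 25.0, "fixation": 10.0, "point_sources": 5.0}
outputs = {"river_export": 24.0, "harvest": 30.0}

total_in = sum(inputs.values())
total_out = sum(outputs.values())

# residual sink: denitrification + volatilization + change-in-storage,
# obtained by difference because it is not directly measured
residual = total_in - total_out
river_fraction = outputs["river_export"] / total_in
print(residual, round(100 * river_fraction))
```

Because the residual absorbs every measurement error in the directly estimated terms, its uncertainty is the sum of the others', which is why the paper singles it out for discussion.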
The long-term intensity behavior of Centaurus X-3
NASA Technical Reports Server (NTRS)
Schreier, E. J.; Swartz, K.; Giacconi, R.; Fabbiano, G.; Morin, J.
1976-01-01
In three years of observation, the X-ray source Cen X-3 appears to alternate between 'high states', with an intensity of 150 counts/s (2-6 keV) or greater, and 'low states', where the source is barely detectable. The time scale of this behavior is of the order of months, and no apparent periodicity has been observed. Analysis of two transitions between these states is reported. During two weeks in July 1972, the source increased from about 20 counts/s to 150 counts/s. The detailed nature of this turn-on is interpreted in terms of a model in which the supergiant's stellar wind decreases in density. A second transition, a turnoff in February 1973, is similarly analyzed and found to be consistent with a simple decrease in accretion rate. The presence of absorption dips during transitions at orbital phases 0.4-0.5 as well as at phase 0.75 is discussed. The data are consistent with a stellar-wind accretion model and with different kinds of extended lows caused by increased wind density masking the X-ray emission or by decreased wind density lowering the accretion rate.
Industry funding and the reporting quality of large long-term weight loss trials
Thomas, Olivia; Thabane, Lehana; Douketis, James; Chu, Rong; Westfall, Andrew O.; Allison, David B.
2009-01-01
Background Quality of reporting (QR) in industry-funded research is a concern of the scientific community. Greater scrutiny of industry-sponsored research reporting has been suggested, although differences in QR by sponsorship type have not been evaluated in weight loss interventions. Objective To evaluate the association of funding source and QR of long-term obesity randomized clinical trials. Methods We analyzed papers that reported long-term weight loss trials. Articles were obtained through searches of MEDLINE, HealthStar, and the Cochrane Controlled Trials Register between the years 1966–2003. QR scores were determined for each study based upon expanded criteria from the Consolidated Standards for Reporting Trials (CONSORT) checklist for a maximum score of 44 points. Studies were coded by category of industry support (0=no industry support, 1= industry support, 2= in kind contribution from industry and 3=duality of interest reported). Individual CONSORT reporting criteria were tabulated by funding type. An independent samples t-test compared differences in QR scores by funding source and the Wilcoxon-Mann-Whitney test and generalized estimating equations (GEE) were used for sensitivity analyses. Results Of the 63 RCTs evaluated, 67% were industry-supported trials. Industry funding was associated with higher QR score in long-term weight loss trials compared to non-industry funded studies (Mean QR (SD): Industry = 27.9 (4.1), Non-Industry =23.4 (4.1); p < 0.0005). The Wilcoxon-Mann-Whitney test confirmed this result (p<0.0005). Controlling for the year of publication and whether paper was published before the CONSORT statement was released in a GEE regression analysis, the direction and magnitude of effect was similar and statistically significant (p=0.035).
Of the individual criteria that prior research has associated with biases, industry funding was associated with greater reporting of intent-to-treat analysis (p=0.0158), but was not different from non-industry studies in reporting of treatment allocation and blinding. Conclusion Our findings suggest that efforts to improve reporting quality be directed at all obesity RCTs irrespective of funding source. PMID:18711388
Industry funding and the reporting quality of large long-term weight loss trials.
Thomas, O; Thabane, L; Douketis, J; Chu, R; Westfall, A O; Allison, D B
2008-10-01
Quality of reporting (QR) in industry-funded research is a concern of the scientific community. Greater scrutiny of industry-sponsored research reporting has been suggested, although differences in QR by sponsorship type have not been evaluated in weight loss interventions. To evaluate the association of funding source and QR of long-term obesity randomized clinical trials (RCT). We analysed papers that reported long-term weight loss trials. Articles were obtained through searches of Medline, HealthStar, and the Cochrane Controlled Trials Register between the years 1966 and 2003. QR scores were determined for each study based upon expanded criteria from the Consolidated Standards for Reporting Trials (CONSORT) checklist for a maximum score of 44 points. Studies were coded by category of industry support (0=no industry support, 1=industry support, 2=in kind contribution from industry and 3=duality of interest reported). Individual CONSORT reporting criteria were tabulated by funding type. An independent samples t-test compared the differences in QR scores by funding source and the Wilcoxon-Mann-Whitney test and generalised estimating equations (GEE) were used for sensitivity analyses. Of the 63 RCTs evaluated, 67% were industry-supported trials. Industry funding was associated with higher QR score in long-term weight loss trials compared with nonindustry-funded studies (mean QR (s.d.): industry=27.9 (4.1), nonindustry=23.4 (4.1); P<0.0005). The Wilcoxon-Mann-Whitney test confirmed this result (P<0.0005). Controlling for the year of publication and whether the paper was published before the CONSORT statement was released in the GEE regression analysis, the direction and magnitude of effect were similar and statistically significant (P=0.035).
Of the individual criteria that prior research has associated with biases, industry funding was associated with greater reporting of intent-to-treat analysis (P=0.0158), but was not different from nonindustry studies in reporting of treatment allocation and blinding. Our findings suggest that the efforts to improve reporting quality be directed to all obesity RCTs, irrespective of funding source.
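The reported group means and standard deviations allow a rough reconstruction of the pooled independent-samples t statistic; the group sizes (42 industry, 21 non-industry, inferred from 67% of 63 trials) are an assumption, so the result is only indicative:

```python
import math

def two_sample_t(mean1, sd1, n1, mean2, sd2, n2):
    """Pooled-variance independent-samples t statistic from summary stats."""
    sp2 = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(sp2 * (1.0 / n1 + 1.0 / n2))

# summary statistics reported in the abstract; n's are assumed
t = two_sample_t(27.9, 4.1, 42, 23.4, 4.1, 21)
print(t)  # a t of roughly 4 on ~61 df is consistent with P < 0.0005
```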
Radioactive waste management complex low-level waste radiological composite analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCarthy, J.M.; Becker, B.H.; Magnuson, S.O.
1998-05-01
The composite analysis estimates the projected cumulative impacts to future members of the public from the disposal of low-level radioactive waste (LLW) at the Idaho National Engineering and Environmental Laboratory (INEEL) Radioactive Waste Management Complex (RWMC) and all other sources of radioactive contamination at the INEEL that could interact with the LLW disposal facility to affect the radiological dose. Based upon the composite analysis evaluation, waste buried in the Subsurface Disposal Area (SDA) at the RWMC is the only source at the INEEL that will significantly interact with the LLW facility. The source term used in the composite analysis consists of all historical SDA subsurface disposals of radionuclides as well as the authorized LLW subsurface disposal inventory and projected LLW subsurface disposal inventory. Exposure scenarios evaluated in the composite analysis include the all-pathways and groundwater protection scenarios. The projected dose of 58 mrem/yr exceeds the composite analysis guidance dose constraint of 30 mrem/yr; therefore, an options analysis was conducted to determine the feasibility of reducing the projected annual dose. Three options for creating such a reduction were considered: (1) lowering infiltration of precipitation through the waste by providing a better cover, (2) maintaining control over the RWMC and portions of the INEEL indefinitely, and (3) extending the period of institutional control beyond the 100 years assumed in the composite analysis. Of the three options investigated, maintaining control over the RWMC and a small part of the present INEEL appears to be feasible and cost effective.
Seeley, T D; Mikheyev, A S; Pagano, G J
2000-09-01
For more than 50 years, investigators of the honey bee's waggle dance have reported that richer food sources seem to elicit longer-lasting and livelier dances than do poorer sources. However, no one had measured both dance duration and liveliness as a function of food-source profitability. Using video analysis, we found that nectar foragers adjust both the duration (D) and the rate (R) of waggle-run production, thereby tuning the number of waggle runs produced per foraging trip (W, where W= DR) as a function of food-source profitability. Both duration and rate of waggle-run production increase with rising food-source profitability. Moreover, we found that a dancing bee adjusts the rate of waggle-run production (R) in relation to food-source profitability by adjusting the mean duration of the return-phase portion of her dance circuits. This finding raises the possibility that bees can use return-phase duration as an index of food-source profitability. Finally, dances having different levels of liveliness have different mean durations of the return phase, indicating that dance liveliness can be quantified in terms of the time interval between consecutive waggle runs.
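The tuning relationship W = DR from the text can be expressed directly; the numbers below are hypothetical, not measured values from the study:

```python
def waggle_runs(duration_s, rate_runs_per_s):
    """W = D * R: waggle runs produced per foraging trip, where D is the
    dance duration and R the rate of waggle-run production."""
    return duration_s * rate_runs_per_s

# hypothetical: a 12 s dance at 0.5 waggle runs per second yields 6 runs
print(waggle_runs(12.0, 0.5))
```

Since both D and R rise with food-source profitability, W amplifies the profitability signal multiplicatively.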
Multicriteria analysis for sources of renewable energy using data from remote sensing
NASA Astrophysics Data System (ADS)
Matejicek, L.
2015-04-01
Renewable energy sources are major components of the strategy to reduce harmful emissions and to replace depleting fossil energy resources. Data from remote sensing can provide information for multicriteria analysis for sources of renewable energy. Advanced land cover quantification makes it possible to search for suitable sites. Multicriteria analysis, together with other data, is used to determine the energy potential and social acceptability of suggested locations. The described case study is focused on an area of surface coal mines in the northwestern region of the Czech Republic, where the impacts of surface mining and reclamation constitute a dominant force in land cover changes. High-resolution satellite images represent the main input datasets for identification of suitable sites. Solar mapping, wind predictions, the location of weirs in watersheds, road maps and demographic information complement the data from remote sensing for multicriteria analysis, which is implemented in a geographic information system (GIS). The input spatial datasets for multicriteria analysis in GIS are reclassified to a common scale and processed with raster algebra tools to identify suitable sites for sources of renewable energy. The selection of suitable sites is limited by the CORINE land cover database to mining and agricultural areas. The case study is focused on long-term land cover changes in the 1985-2015 period. Multicriteria analysis based on CORINE data shows moderate changes in mapping of suitable sites for utilization of selected sources of renewable energy in 1990, 2000, 2006 and 2012. The results represent map layers showing the energy potential on a scale of a few preference classes (1-7), where the first class is linked to minimum preference and the last class to maximum preference. The attached histograms show the moderate variability of preference classes due to land cover changes caused by mining activities.
The results also show a slight increase in the more preferred classes for utilization of sources of renewable energy due to an increase in the area of reclaimed sites. Using data from remote sensing, such as multispectral images and the CORINE land cover datasets, can reduce the financial resources currently required for finding and assessing suitable areas.
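The reclassify-then-overlay workflow described above can be sketched with plain raster algebra: each input layer is mapped to a common preference scale, then the layers are combined cell by cell. This is a minimal sketch; the layer values, class breaks, and weights are illustrative, not taken from the study.

```python
def reclassify(raster, breaks):
    """Map raw cell values to preference classes 1..len(breaks)+1."""
    def cls(v):
        c = 1
        for b in breaks:
            if v > b:
                c += 1
        return c
    return [[cls(v) for v in row] for row in raster]

def weighted_overlay(layers, weights):
    """Cell-wise weighted sum of reclassified layers, clamped to classes 1-7."""
    rows, cols = len(layers[0]), len(layers[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for layer, w in zip(layers, weights):
        for i in range(rows):
            for j in range(cols):
                out[i][j] += w * layer[i][j]
    return [[min(7, max(1, round(v))) for v in row] for row in out]

# Hypothetical 2x2 rasters: solar irradiation and mean wind speed.
solar = reclassify([[120, 180], [220, 90]], breaks=[100, 150, 200])
wind = reclassify([[3.0, 5.5], [7.2, 2.1]], breaks=[4.0, 6.0])
suitability = weighted_overlay([solar, wind], weights=[0.6, 0.4])
print(suitability)  # [[2, 3], [4, 1]]
```

In a real GIS the same operation is run over CORINE-masked rasters with one layer per criterion; the toy version only makes the cell-wise arithmetic explicit.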
Asymptotic/numerical analysis of supersonic propeller noise
NASA Technical Reports Server (NTRS)
Myers, M. K.; Wydeven, R.
1989-01-01
An asymptotic analysis based on the Mach surface structure of the field of a supersonic helical source distribution is applied to predict thickness and loading noise radiated by high-speed propeller blades. The theory utilizes an integral representation of the Ffowcs Williams-Hawkings equation in a fully linearized form. The asymptotic results are used for chordwise strips of the blade, while required spanwise integrations are performed numerically. The form of the analysis enables predicted waveforms to be interpreted in terms of Mach surface propagation. A computer code developed to implement the theory is described and found to yield results in close agreement with more exact computations.
NASA Astrophysics Data System (ADS)
Li, Xinyi; Bao, Jingfu; Huang, Yulin; Zhang, Benfeng; Omori, Tatsuya; Hashimoto, Ken-ya
2018-07-01
In this paper, we propose the use of the hierarchical cascading technique (HCT) for the finite element method (FEM) analysis of bulk acoustic wave (BAW) devices. First, the implementation of this technique is presented for the FEM analysis of BAW devices. It is shown that the traveling-wave excitation sources proposed by the authors are fully compatible with the HCT. Furthermore, an HCT-based absorbing mechanism is also proposed to replace the perfectly matched layer (PML). Finally, it is demonstrated that the technique is much more efficient in terms of memory consumption and execution time than the full FEM analysis.
Another collision for the Coma cluster
NASA Technical Reports Server (NTRS)
Vikhlinin, A.; Forman, W.; Jones, C.
1996-01-01
The wavelet transform analysis of the ROSAT position sensitive proportional counter (PSPC) images of the Coma cluster is presented. The analysis shows, on small scales, a substructure dominated by two extended sources surrounding the two bright galaxies NGC 4874 and NGC 4889. On scales of about 2 arcmin to 3 arcmin, the analysis reveals a tail of X-ray emission originating near the cluster center, curving to the south and east for approximately 25 arcmin and ending near the galaxy NGC 4911. The results are interpreted in terms of a merger of a group, having a core mass of approximately 10^13 solar masses, with the main body of the Coma cluster.
Runge, Michael C.; LaGory, Kirk E.; Russell, Kendra; Balsom, Janet R.; Butler, R. Alan; Coggins, Lewis G.; Grantz, Katrina A.; Hayse, John; Hlohowskyj, Ihor; Korman, Josh; May, James E.; O'Rourke, Daniel J.; Poch, Leslie A.; Prairie, James R.; VanKuiken, Jack C.; Van Lonkhuyzen, Robert A.; Varyu, David R.; Verhaaren, Bruce T.; Veselka, Thomas D.; Williams, Nicholas T.; Wuthrich, Kelsey K.; Yackulic, Charles B.; Billerbeck, Robert P.; Knowles, Glen W.
2016-01-07
The results of the decision analysis are meant to serve as only one of many sources of information that can be used to evaluate the alternatives proposed in the Environmental Impact Statement. These results focus only on those resource goals for which quantitative performance metrics could be formulated and evaluated; there are other important aspects of the resource goals that also need to be considered. Not all the stakeholders who were invited to participate in the decision analysis chose to do so; thus, the Bureau of Reclamation, National Park Service, and U.S. Department of the Interior may want to consider other input.
clusterProfiler: an R package for comparing biological themes among gene clusters.
Yu, Guangchuang; Wang, Li-Gen; Han, Yanyan; He, Qing-Yu
2012-05-01
Increasing amounts of quantitative data generated from transcriptomics and proteomics require integrative strategies for analysis. Here, we present an R package, clusterProfiler, that automates the process of biological-term classification and the enrichment analysis of gene clusters. The analysis module and visualization module were combined into a reusable workflow. Currently, clusterProfiler supports three species: humans, mice, and yeast. Methods provided in this package can be easily extended to other species and ontologies. The clusterProfiler package is released under the Artistic-2.0 License within the Bioconductor project. The source code and vignette are freely available at http://bioconductor.org/packages/release/bioc/html/clusterProfiler.html.
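The enrichment analysis that tools like clusterProfiler automate is, at its core, an over-representation test: a hypergeometric tail probability that at least k of the n cluster genes fall in a term's gene set. clusterProfiler itself is an R package; the Python sketch below only illustrates the underlying statistic, and all gene counts are hypothetical.

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N, K, n): at least k of the n
    sampled genes are annotated to a term covering K of the N genes."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Hypothetical counts: 20,000 background genes, 200 annotated to the term,
# 50 genes in the cluster, 8 of them annotated.
p = hypergeom_enrichment_p(20000, 200, 50, 8)
print(f"p = {p:.3e}")
```

In practice such raw p-values are computed per term and then corrected for multiple testing before a term is reported as enriched.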
Local tsunamis and earthquake source parameters
Geist, Eric L.; Dmowska, Renata; Saltzman, Barry
1999-01-01
This chapter establishes the relationship among earthquake source parameters and the generation, propagation, and run-up of local tsunamis. In general terms, displacement of the seafloor during the earthquake rupture is modeled using elastic dislocation theory, for which the displacement field is dependent on the slip distribution, fault geometry, and the elastic response and properties of the medium. Specifically, nonlinear long-wave theory governs the propagation and run-up of tsunamis. A parametric study is devised to examine the relative importance of individual earthquake source parameters on local tsunamis, because the physics that describes tsunamis from generation through run-up is complex. Analysis of the source parameters of various tsunamigenic earthquakes has indicated that the details of the earthquake source, namely, nonuniform distribution of slip along the fault plane, have a significant effect on the local tsunami run-up. Numerical methods have been developed to address realistic bathymetric and shoreline conditions. The accuracy of determining the run-up on shore is directly dependent on the source parameters of the earthquake, which provide the initial conditions used for the hydrodynamic models.
Water use sources of desert riparian Populus euphratica forests.
Si, Jianhua; Feng, Qi; Cao, Shengkui; Yu, Tengfei; Zhao, Chunyan
2014-09-01
Desert riparian forests are the main body of natural oases in the lower reaches of inland rivers; their growth and distribution are closely related to water use sources. However, how does the desert riparian forest obtain a stable water source, and which water sources does it use to effectively avoid or overcome water stress and survive? This paper describes an analysis of the water sources of desert riparian Populus euphratica forests growing at sites with different groundwater depths and conditions, using the stable oxygen isotope technique and a linear mixing model of the isotopic values. The results showed that the main water source of Populus euphratica changes from water in a single soil layer or groundwater to deep subsoil water and groundwater as the depth of groundwater increases. This appears to be an adaptive selection to arid and water-deficient conditions and is a primary reason for the long-term survival of P. euphratica in the desert riparian forest of an extremely arid region. Water contributions from the various soil layers and from groundwater differed, and the desert riparian P. euphratica forests in different habitats had dissimilar water use strategies.
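The partitioning of plant water among sources from stable-isotope values rests on linear mixing. A minimal two-end-member sketch follows; the study considered multiple soil layers plus groundwater, and all δ18O values here are hypothetical.

```python
def two_source_fractions(d_xylem, d_a, d_b):
    """Fractions of xylem water drawn from sources A and B,
    from their δ18O signatures (per mil)."""
    f_a = (d_xylem - d_b) / (d_a - d_b)
    if not 0.0 <= f_a <= 1.0:
        raise ValueError("xylem signature lies outside the source range")
    return f_a, 1.0 - f_a

# Hypothetical signatures: xylem -8.0, deep soil water -6.0, groundwater -10.0
f_soil, f_ground = two_source_fractions(-8.0, -6.0, -10.0)
print(f_soil, f_ground)  # 0.5 0.5
```

With more than two candidate sources the system is underdetermined and is typically solved with a multi-source mixing approach rather than this closed-form expression.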
Thinking the "unthinkable": why Philip Morris considered quitting
Smith, E; Malone, R
2003-01-01
Objective: To investigate the genesis and development of tobacco company Philip Morris's recent image enhancement strategies and analyse their significance. Data sources: Internal Philip Morris documents, made available by the terms of the Master Settlement Agreement between the tobacco companies and the attorneys general of 46 states, and secondary newspaper sources. Study selection: Searches of the Philip Morris documents website (www.pmdocs.com), beginning with terms such as "image management" and "identity" and expanding as relevant new terms (consultant names, project names, and dates) were identified, using a "snowball" sampling strategy. Findings and conclusions: In the early 1990s, Philip Morris, faced with increasing pressures generated both externally, from the non-smokers' rights and public health communities, and internally, from the conflicts among its varied operating companies, seriously considered leaving the tobacco business. Discussions of this option, which occurred at the highest levels of management, focused on the changing social climate regarding tobacco and smoking that the tobacco control movement had effected. However, this option was rejected in favour of the image enhancement strategy that culminated with the recent "Altria" name change. This analysis suggests that advocacy efforts have the potential to significantly denormalise tobacco as a corporate enterprise. PMID:12773733
NASA Astrophysics Data System (ADS)
He, Han; Wang, Huaning; Zhang, Mei; Mehrabi, Ahmad; Yan, Yan; Yun, Duo
2018-05-01
The light curves of solar-type stars present both periodic fluctuations and flare spikes. The gradual periodic fluctuation is interpreted as the rotational modulation of magnetic features on the stellar surface and is used to deduce magnetic feature activity properties. The flare spikes in light curves are used to derive flare activity properties. In this paper, we analyze the light curve data of three solar-type stars (KIC 6034120, KIC 3118883, and KIC 10528093) observed with the Kepler space telescope and investigate the relationship between their magnetic feature activities and flare activities. The analysis shows that: (1) both the magnetic feature activity and the flare activity exhibit long-term variations as the Sun does; (2) unlike the Sun, the long-term variations of magnetic feature activity and flare activity are not in phase with each other; (3) the analysis of star KIC 6034120 suggests that the long-term variations of magnetic feature activity and flare activity have a similar cycle length. Our analysis and results indicate that the magnetic features that dominate rotational modulation and the flares possibly have different source regions, although they may be influenced by the magnetic field generated through the same dynamo process.
Efficient Development of High Fidelity Structured Volume Grids for Hypersonic Flow Simulations
NASA Technical Reports Server (NTRS)
Alter, Stephen J.
2003-01-01
A new technique for the control of grid line spacing and intersection angles of a structured volume grid, using elliptic partial differential equations (PDEs) is presented. Existing structured grid generation algorithms make use of source term hybridization to provide control of grid lines, imposing orthogonality implicitly at the boundary and explicitly on the interior of the domain. A bridging function between the two types of grid line control is typically used to blend the different orthogonality formulations. It is shown that utilizing such a bridging function with source term hybridization can result in the excessive use of computational resources and diminishes robustness. A new approach, Anisotropic Lagrange Based Trans-Finite Interpolation (ALBTFI), is offered as a replacement to source term hybridization. The ALBTFI technique captures the essence of the desired grid controls while improving the convergence rate of the elliptic PDEs when compared with source term hybridization. Grid generation on a blunt cone and a Shuttle Orbiter is used to demonstrate and assess the ALBTFI technique, which is shown to be as much as 50% faster, more robust, and produces higher quality grids than source term hybridization.
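For context, plain two-dimensional transfinite interpolation, the algebraic scheme that Lagrange-based approaches like ALBTFI build on, blends four boundary curves with a Boolean sum of edge interpolants minus a corner correction. The sketch below is the standard bilinear TFI with an illustrative geometry, not the paper's ALBTFI formulation or its elliptic smoothing.

```python
def tfi_2d(bottom, top, left, right, ni, nj):
    """Bilinear transfinite interpolation of four boundary curves,
    each a callable on [0, 1] returning an (x, y) point."""
    grid = []
    for i in range(ni):
        u = i / (ni - 1)
        row = []
        for j in range(nj):
            v = j / (nj - 1)
            # Boolean sum: edge terms minus the doubly counted corners.
            pt = tuple(
                (1 - v) * bottom(u)[d] + v * top(u)[d]
                + (1 - u) * left(v)[d] + u * right(v)[d]
                - ((1 - u) * (1 - v) * bottom(0)[d]
                   + u * (1 - v) * bottom(1)[d]
                   + (1 - u) * v * top(0)[d]
                   + u * v * top(1)[d])
                for d in range(2))
            row.append(pt)
        grid.append(row)
    return grid

# Unit square with a slightly curved (parabolic) top edge.
g = tfi_2d(bottom=lambda u: (u, 0.0),
           top=lambda u: (u, 1.0 + 0.2 * u * (1 - u)),
           left=lambda v: (0.0, v),
           right=lambda v: (1.0, v),
           ni=5, nj=5)
print(g[0][0], g[2][2])
```

An elliptic generator would then take such an algebraic grid as the initial condition and smooth it, which is where the source-term control discussed above enters.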
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiefelbein, C.; Ho, T.
Changes in the physical properties (measured in terms of vitrinite reflectance, elemental analysis, and C-13 nuclear magnetic resonance) of an immature coal (0.46% R_o) from Craig County, Colorado, that was thermally altered using hydrous pyrolysis were used to establish a correspondence between hydrous pyrolysis time/temperature reaction conditions and relative maturity (expressed in terms of vitrinite reflectance). This correspondence was used to determine the oil generation maturity limits for an immature hydrogen-rich (Type I fluorescing amorphous oil-prone kerogen) source rock from an offshore Congo well that was thermally altered using the same reaction conditions as applied to the immature coal. The resulting changes in the physical properties of the altered source rock, measured in terms of decreasing reactive carbon content (from Rock-Eval pyrolysis), were used to construct a hydrocarbon yield curve from which the relative maturity associated with the onset, main phase, and peak of oil generation was determined. Results, substantiated by anhydrous pyrolysis techniques, indicate that the source rock from Congo has a late onset of appreciable (>10% transformation) oil generation (0.9% R_o ± 0.1%), generates maximum quantities of oil from about 1.1 to 1.3% R_o, and reaches the end (or peak) of the primary oil generating window at approximately 1.4% R_o (± 0.1%) when secondary cracking reactions become important. However, the bottom of the oil window can be extended to about 1.6% R_o because the heavy molecular weight degradation by-products (asphaltenes) that are not efficiently expelled from source rocks continue to degrade into progressively lower molecular weight hydrocarbons.
McIDAS-V: Advanced Visualization for 3D Remote Sensing Data
NASA Astrophysics Data System (ADS)
Rink, T.; Achtor, T. H.
2010-12-01
McIDAS-V is a Java-based, open-source, freely available software package for analysis and visualization of geophysical data. Its advanced capabilities provide very interactive 4-D displays, including 3D volumetric rendering and fast sub-manifold slicing, linked to an abstract mathematical data model with built-in metadata for units, coordinate system transforms and sampling topology. A Jython interface provides user-defined analysis and computation in terms of the internal data model. These powerful capabilities to integrate data, analysis and visualization are being applied to hyperspectral sounding retrievals, e.g., AIRS and IASI, of moisture and cloud density to interrogate and analyze their 3D structure, as well as to validate with instruments such as CALIPSO, CloudSat and MODIS. The object-oriented framework design allows for specialized extensions for novel displays and new sources of data. Community-defined CF conventions for gridded data are understood by the software, and such data can be immediately imported into the application. This presentation will show examples of how McIDAS-V is used in 3-dimensional data analysis, display and evaluation.
NASA Astrophysics Data System (ADS)
Volpe, M.; Selva, J.; Tonini, R.; Romano, F.; Lorito, S.; Brizuela, B.; Argyroudis, S.; Salzano, E.; Piatanesi, A.
2016-12-01
Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) is a methodology to assess the exceedance probability for different thresholds of tsunami hazard intensity, at a specific site or region in a given time period, due to seismic sources. A large number of high-resolution inundation simulations is typically required to take into account the full variability of potential seismic sources and their slip distributions. Starting from regional SPTHA offshore results, the computational cost can be reduced by considering only a subset of `important' scenarios for the inundation calculations. Here we use a method based on an event tree for the treatment of the seismic source aleatory variability; a cluster analysis on the offshore results to define the important sources; and epistemic uncertainty treatment through an ensemble modeling approach. We consider two target sites in the Mediterranean (Milazzo, Italy, and Thessaloniki, Greece) where coastal (non-nuclear) critical infrastructures (CIs) are located. After performing a regional SPTHA covering the whole Mediterranean, for each target site a few hundred representative scenarios are filtered out of all the potential seismic sources and the tsunami inundation is explicitly modeled, obtaining a site-specific SPTHA with a complete characterization of the tsunami hazard in terms of flow depth and velocity time histories. Moreover, we also explore the variability of SPTHA at the target site accounting for coseismic deformation (i.e. uplift or subsidence) due to near-field sources located in very shallow water. The results are suitable for, and will be applied to, subsequent multi-hazard risk analysis for the CIs. These applications have been developed in the framework of the Italian Flagship Project RITMARE, the EC FP7 ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389) projects, and the INGV-DPC Agreement.
Modeling the contribution of point sources and non-point sources to Thachin River water pollution.
Schaffner, Monika; Bader, Hans-Peter; Scheidegger, Ruth
2009-08-15
Major rivers in developing and emerging countries suffer increasingly from severe degradation of water quality. The current study uses a mathematical Material Flow Analysis (MMFA) as a complementary approach to address the degradation of river water quality due to nutrient pollution in the Thachin River Basin in Central Thailand. This paper gives an overview of the origins and flow paths of the various point and non-point pollution sources in the Thachin River Basin (in terms of nitrogen and phosphorus) and quantifies their relative importance within the system. The key parameters influencing the main nutrient flows are determined and possible mitigation measures discussed. The results show that aquaculture (as a point source) and rice farming (as a non-point source) are the key nutrient sources in the Thachin River Basin. Other point sources such as pig farms, households and industries, which were previously cited as the most relevant pollution sources in terms of organic pollution, play less significant roles in comparison. This order of importance shifts when considering the model results at the provincial level. Crosschecks with secondary data and field studies confirm the plausibility of our simulations. Specific nutrient loads for the pollution sources are derived; these can be used for a first broad quantification of nutrient pollution in comparable river basins. Based on an identification of the sensitive model parameters, possible mitigation scenarios are determined and their potential to reduce the nutrient load evaluated. A comparison of simulated nutrient loads with measured nutrient concentrations shows that nutrient retention in the river system may be significant. Sedimentation in the slow-flowing surface water network, as well as nitrogen emission to the air from the warm, oxygen-deficient waters, is certainly partly responsible, but wetlands along the river banks could also play an important role as nutrient sinks.
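The core bookkeeping of such a material flow analysis can be sketched as gross source emissions scaled by transfer coefficients and ranked by their contribution to the river load. All source names, emissions, and coefficients below are hypothetical placeholders, not the study's data.

```python
# Gross nitrogen emissions (t/yr) and the fraction of each that
# reaches the river; every number here is illustrative.
sources_t_per_yr = {
    "aquaculture": 1200.0,
    "rice_farming": 2500.0,
    "pig_farms": 400.0,
    "households": 300.0,
    "industry": 150.0,
}
transfer_to_river = {
    "aquaculture": 0.9,
    "rice_farming": 0.35,
    "pig_farms": 0.5,
    "households": 0.6,
    "industry": 0.7,
}

# Load delivered to the river per source, then ranked by contribution.
river_load = {s: sources_t_per_yr[s] * transfer_to_river[s]
              for s in sources_t_per_yr}
total = sum(river_load.values())
for s, load in sorted(river_load.items(), key=lambda kv: -kv[1]):
    print(f"{s:13s} {load:7.1f} t/yr ({100 * load / total:.0f}%)")
```

A full MMFA additionally balances flows between compartments (soil, groundwater, river reaches) and propagates parameter uncertainty, but the ranking step above is the part that identifies key sources.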
Fermi Large Area Telescope Second Source Catalog
NASA Astrophysics Data System (ADS)
Nolan, P. L.; Abdo, A. A.; Ackermann, M.; Ajello, M.; Allafort, A.; Antolini, E.; Atwood, W. B.; Axelsson, M.; Baldini, L.; Ballet, J.; Barbiellini, G.; Bastieri, D.; Bechtol, K.; Belfiore, A.; Bellazzini, R.; Berenji, B.; Bignami, G. F.; Blandford, R. D.; Bloom, E. D.; Bonamente, E.; Bonnell, J.; Borgland, A. W.; Bottacini, E.; Bouvier, A.; Brandt, T. J.; Bregeon, J.; Brigida, M.; Bruel, P.; Buehler, R.; Burnett, T. H.; Buson, S.; Caliandro, G. A.; Cameron, R. A.; Campana, R.; Cañadas, B.; Cannon, A.; Caraveo, P. A.; Casandjian, J. M.; Cavazzuti, E.; Ceccanti, M.; Cecchi, C.; Çelik, Ö.; Charles, E.; Chekhtman, A.; Cheung, C. C.; Chiang, J.; Chipaux, R.; Ciprini, S.; Claus, R.; Cohen-Tanugi, J.; Cominsky, L. R.; Conrad, J.; Corbet, R.; Cutini, S.; D'Ammando, F.; Davis, D. S.; de Angelis, A.; DeCesar, M. E.; DeKlotz, M.; De Luca, A.; den Hartog, P. R.; de Palma, F.; Dermer, C. D.; Digel, S. W.; Silva, E. do Couto e.; Drell, P. S.; Drlica-Wagner, A.; Dubois, R.; Dumora, D.; Enoto, T.; Escande, L.; Fabiani, D.; Falletti, L.; Favuzzi, C.; Fegan, S. J.; Ferrara, E. C.; Focke, W. B.; Fortin, P.; Frailis, M.; Fukazawa, Y.; Funk, S.; Fusco, P.; Gargano, F.; Gasparrini, D.; Gehrels, N.; Germani, S.; Giebels, B.; Giglietto, N.; Giommi, P.; Giordano, F.; Giroletti, M.; Glanzman, T.; Godfrey, G.; Grenier, I. A.; Grondin, M.-H.; Grove, J. E.; Guillemot, L.; Guiriec, S.; Gustafsson, M.; Hadasch, D.; Hanabata, Y.; Harding, A. K.; Hayashida, M.; Hays, E.; Hill, A. B.; Horan, D.; Hou, X.; Hughes, R. E.; Iafrate, G.; Itoh, R.; Jóhannesson, G.; Johnson, R. P.; Johnson, T. E.; Johnson, A. S.; Johnson, T. J.; Kamae, T.; Katagiri, H.; Kataoka, J.; Katsuta, J.; Kawai, N.; Kerr, M.; Knödlseder, J.; Kocevski, D.; Kuss, M.; Lande, J.; Landriu, D.; Latronico, L.; Lemoine-Goumard, M.; Lionetto, A. M.; Llena Garde, M.; Longo, F.; Loparco, F.; Lott, B.; Lovellette, M. N.; Lubrano, P.; Madejski, G. M.; Marelli, M.; Massaro, E.; Mazziotta, M. N.; McConville, W.; McEnery, J. 
E.; Mehault, J.; Michelson, P. F.; Minuti, M.; Mitthumsiri, W.; Mizuno, T.; Moiseev, A. A.; Mongelli, M.; Monte, C.; Monzani, M. E.; Morselli, A.; Moskalenko, I. V.; Murgia, S.; Nakamori, T.; Naumann-Godo, M.; Norris, J. P.; Nuss, E.; Nymark, T.; Ohno, M.; Ohsugi, T.; Okumura, A.; Omodei, N.; Orlando, E.; Ormes, J. F.; Ozaki, M.; Paneque, D.; Panetta, J. H.; Parent, D.; Perkins, J. S.; Pesce-Rollins, M.; Pierbattista, M.; Pinchera, M.; Piron, F.; Pivato, G.; Porter, T. A.; Racusin, J. L.; Rainò, S.; Rando, R.; Razzano, M.; Razzaque, S.; Reimer, A.; Reimer, O.; Reposeur, T.; Ritz, S.; Rochester, L. S.; Romani, R. W.; Roth, M.; Rousseau, R.; Ryde, F.; Sadrozinski, H. F.-W.; Salvetti, D.; Sanchez, D. A.; Saz Parkinson, P. M.; Sbarra, C.; Scargle, J. D.; Schalk, T. L.; Sgrò, C.; Shaw, M. S.; Shrader, C.; Siskind, E. J.; Smith, D. A.; Spandre, G.; Spinelli, P.; Stephens, T. E.; Strickman, M. S.; Suson, D. J.; Tajima, H.; Takahashi, H.; Takahashi, T.; Tanaka, T.; Thayer, J. G.; Thayer, J. B.; Thompson, D. J.; Tibaldo, L.; Tibolla, O.; Tinebra, F.; Tinivella, M.; Torres, D. F.; Tosti, G.; Troja, E.; Uchiyama, Y.; Vandenbroucke, J.; Van Etten, A.; Van Klaveren, B.; Vasileiou, V.; Vianello, G.; Vitale, V.; Waite, A. P.; Wallace, E.; Wang, P.; Werner, M.; Winer, B. L.; Wood, D. L.; Wood, K. S.; Wood, M.; Yang, Z.; Zimmer, S.
2012-04-01
We present the second catalog of high-energy γ-ray sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi), derived from data taken during the first 24 months of the science phase of the mission, which began on 2008 August 4. Source detection is based on the average flux over the 24 month period. The second Fermi-LAT catalog (2FGL) includes source location regions, defined in terms of elliptical fits to the 95% confidence regions and spectral fits in terms of power-law, exponentially cutoff power-law, or log-normal forms. Also included are flux measurements in five energy bands and light curves on monthly intervals for each source. Twelve sources in the catalog are modeled as spatially extended. We provide a detailed comparison of the results from this catalog with those from the first Fermi-LAT catalog (1FGL). Although the diffuse Galactic and isotropic models used in the 2FGL analysis are improved compared to the 1FGL catalog, we attach caution flags to 162 of the sources to indicate possible confusion with residual imperfections in the diffuse model. The 2FGL catalog contains 1873 sources detected and characterized in the 100 MeV to 100 GeV range of which we consider 127 as being firmly identified and 1171 as being reliably associated with counterparts of known or likely γ-ray-producing source classes. We dedicate this paper to the memory of our colleague Patrick Nolan, who died on 2011 November 6. His career spanned much of the history of high-energy astronomy from space and his work on the Large Area Telescope (LAT) began nearly 20 years ago when it was just a concept. Pat was a central member in the operation of the LAT collaboration and he is greatly missed.
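The three spectral shapes named above (power law, exponentially cutoff power law, and log-normal, i.e. LogParabola) can be written down directly. The sketch below uses one common parameterization of each form; the pivot energy and all parameter values are illustrative, not fits from the catalog.

```python
from math import exp, log10

def power_law(E, K, E0, gamma):
    """dN/dE = K (E/E0)^-gamma."""
    return K * (E / E0) ** (-gamma)

def cutoff_power_law(E, K, E0, gamma, Ec):
    """Power law with an exponential cutoff at energy Ec."""
    return K * (E / E0) ** (-gamma) * exp(-(E - E0) / Ec)

def log_parabola(E, K, E0, alpha, beta):
    """Log-normal (LogParabola) form: the index steepens with log(E/E0)."""
    return K * (E / E0) ** (-(alpha + beta * log10(E / E0)))

E0 = 1000.0  # pivot energy in MeV; parameters below are illustrative
for E in (100.0, 1000.0, 10000.0):
    print(E, power_law(E, 1e-11, E0, 2.2),
          log_parabola(E, 1e-11, E0, 2.2, 0.1))
```

At the pivot energy all three forms reduce to the prefactor K, which is why catalog flux normalizations are quoted there.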
Imbalances of Water and Solutes in Experimental Watersheds: Spatial or Temporal Origin?
NASA Astrophysics Data System (ADS)
Ruiz, L.; Fovet, O.; Sekhar, M.; Riotte, J.; Braun, J.; Gascuel-odoux, C.; Durand, P.
2012-12-01
Experimental watersheds were originally conceived as a tool to measure water balance, and in particular evapotranspiration fluxes, in different landscapes and climates. Pioneering experimentalists paid attention to all possible causes of unmeasured losses, for example by ensuring that the watershed outlet lay on impervious bedrock and that the hydrological boundary was consistent with the topographic divide. Nowadays, watershed studies encompass a large range of objectives, from hydrological process quantification to diffuse pollution assessment. In many cases, the above-mentioned experimental precautions are given little consideration and closure of the water balance is rarely achieved, although this fact is even more rarely publicized in scientific communications. As a consequence, it is often very difficult to determine whether an observed difference between input and output of water or solutes is due to hidden deep losses or to variation of storage in internal compartments of the watershed. In this presentation, we discuss this issue based on long-term hydrological and geochemical monitoring of experimental watersheds belonging to the research observatories BVET (http://bvet.omp.obs-mip.fr/ ) in South India and AgrHys (http://www.inra.fr/ore_agrhys/) in Western France. In the South Indian forested watershed of Mule Hole (10 years of monitoring) we demonstrated that transpiration by deep tree roots was a significant component of the water balance, and that the main pathway for hydrological and geochemical fluxes was groundwater underflow. In the French agricultural watersheds of Kerbernez and Kerrien (20 years of monitoring), significant water and solute losses through groundwater underflow were also demonstrated. In both sites, a model with flexible structure was calibrated and validated on the observations, and long-term simulations were then produced using available long-term weather data series of 50 years. 
Results demonstrated that the water or solute imbalances at the watershed scale depended, to a large extent, on the duration considered in the analysis. In the Mule Hole watershed, water storage in the unsaturated weathered bedrock was the major cause of water imbalance for short time series (less than 10 years), while deep losses were the only source of imbalance for long-term analysis (more than 30 years). On the contrary, in the Kerrien and Kerbernez watersheds, solute imbalances were mainly attributed to underflow for short-term analysis (less than 10 years), while variation of solute storage in the weathered bedrock became a major source of imbalance for long-term analysis (more than 20 years). Discussion will focus on the consequences of these results for the validity of the hypotheses used in hydrological and hydrochemical modeling studies, and on the interest of long-term environmental observatories for understanding water and element cycles.
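The imbalance bookkeeping discussed here reduces to evaluating P - ET - Q over different averaging windows and asking whether the residual is storage change or a hidden loss. A minimal sketch with hypothetical annual series (none of these numbers come from the observatories):

```python
def imbalance(P, ET, Q):
    """Mean annual water imbalance (mm/yr): precipitation minus
    evapotranspiration minus discharge; nonzero residual must be
    storage change and/or unmeasured losses such as underflow."""
    return sum(p - e - q for p, e, q in zip(P, ET, Q)) / len(P)

# Ten hypothetical years of fluxes (mm/yr).
P = [1100, 900, 1000, 950, 1050, 980, 1020, 990, 1010, 1000]
ET = [600] * 10
Q = [350, 320, 340, 330, 360, 335, 345, 330, 340, 338]

short = imbalance(P[:3], ET[:3], Q[:3])  # 3-year window
long_ = imbalance(P, ET, Q)              # full record
print(round(short, 1), round(long_, 1))
```

Comparing the short-window and full-record residuals mirrors the point made above: over a few years the residual is dominated by storage dynamics, while over decades storage tends to average out and any persistent residual points to deep losses.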
Automorphic properties of low energy string amplitudes in various dimensions
NASA Astrophysics Data System (ADS)
Green, Michael B.; Russo, Jorge G.; Vanhove, Pierre
2010-04-01
This paper explores the moduli-dependent coefficients of higher-derivative interactions that appear in the low-energy expansion of the four-supergraviton amplitude of maximally supersymmetric string theory compactified on a d-torus. These automorphic functions are determined for terms up to order ∂^6 R^4 and various values of d by imposing a variety of consistency conditions. They satisfy Laplace eigenvalue equations with or without source terms, whose solutions are given in terms of Eisenstein series, or more general automorphic functions, for certain parabolic subgroups of the relevant U-duality groups. The ultraviolet divergences of the corresponding supergravity field theory limits are encoded in various logarithms, although the string theory expressions are finite. This analysis includes intriguing representations of SL(d) and SO(d,d) Eisenstein series in terms of toroidally compactified one- and two-loop string and supergravity amplitudes.
PDF-ECG in clinical practice: A model for long-term preservation of digital 12-lead ECG data.
Sassi, Roberto; Bond, Raymond R; Cairns, Andrew; Finlay, Dewar D; Guldenring, Daniel; Libretti, Guido; Isola, Lamberto; Vaglio, Martino; Poeta, Roberto; Campana, Marco; Cuccia, Claudio; Badilini, Fabio
In clinical practice, data archiving of resting 12-lead electrocardiograms (ECGs) is mainly achieved by storing a PDF report in the hospital electronic health record (EHR). When available, digital ECG source data (raw samples) are only retained within the ECG management system. The widespread availability of the ECG source data would undoubtedly permit subsequent analysis and facilitate longitudinal studies, with both scientific and diagnostic benefits. PDF-ECG is a hybrid archival format which allows both the standard graphical report of an ECG and its source ECG data (waveforms) to be stored in the same file. Using PDF-ECG as a model to address the challenge of ECG data portability, long-term archiving and documentation, a real-world proof-of-concept test was conducted in a northern Italy hospital. A set of volunteers underwent a basic ECG using routine hospital equipment, and the source data were captured. Using dedicated web services, PDF-ECG documents were then generated and seamlessly uploaded to the hospital EHR, replacing the standard PDF reports automatically generated at the time of acquisition. Finally, the PDF-ECG files could be successfully retrieved and re-analyzed. Adding PDF-ECG to an existing EHR had a minimal impact on the hospital's workflow, while preserving the ECG digital data. Copyright © 2017 Elsevier Inc. All rights reserved.
Source-term development for a contaminant plume for use by multimedia risk assessment models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whelan, Gene; McDonald, John P.; Taira, Randal Y.
1999-12-01
Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments for use at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world, Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.
Wang, S.W.; Iverson, S.J.; Springer, A.M.; Hatch, Shyla A.
2007-01-01
Procellariiforms are unique among seabirds in storing dietary lipids in both adipose tissue and stomach oil. Thus, both lipid sources are potentially useful for trophic studies using fatty acid (FA) signatures. However, little is known about the relationship between FA signatures in stomach oil and adipose tissue of individuals or whether these signatures provide similar information about diet and physiology. We compared the FA composition of stomach oil and adipose tissue biopsies of individual northern fulmars (N = 101) breeding at three major colonies in Alaska. Fatty acid signatures differed significantly between the two lipid sources, reflecting differences in dietary time scales, metabolic processing, or both. However, these signatures exhibited a relatively consistent relationship between individuals, such that the two lipid sources provided a similar ability to distinguish foraging differences among individuals and colonies. Our results, including the exclusive presence of dietary wax esters in stomach oil but not adipose tissue, are consistent with the notion that stomach oil FA signatures represent lipids retained from prey consumed during recent foraging and reflect little metabolic processing, whereas adipose tissue FA signatures represent a longer-term integration of dietary intake. Our study illustrates the potential for elucidating short- versus longer-term diet information in Procellariiform birds using different lipid sources. © 2007 Springer-Verlag.
Washburn, Richard A.; Szabo, Amanda N.; Lambourne, Kate; Willis, Erik A.; Ptomey, Lauren T.; Honas, Jeffery J.; Herrmann, Stephen D.; Donnelly, Joseph E.
2014-01-01
Background Differences in biological changes from weight loss by energy restriction and/or exercise may be associated with differences in long-term weight loss/regain. Objective To assess the effect of weight loss method on long-term changes in weight, body composition and chronic disease risk factors. Data Sources PubMed and Embase were searched (January 1990-October 2013) for studies with data on the effect of energy restriction, exercise (aerobic and resistance) on long-term weight loss. Twenty articles were included in this review. Study Eligibility Criteria Primary source, peer reviewed randomized trials published in English with an active weight loss period of >6 months, or active weight loss with a follow-up period of any duration, conducted in overweight or obese adults were included. Study Appraisal and Synthesis Methods Considerable heterogeneity across trials existed for important study parameters, therefore a meta-analysis was considered inappropriate. Results were synthesized and grouped by comparisons (e.g. diet vs. aerobic exercise, diet vs. diet + aerobic exercise etc.) and study design (long-term or weight loss/follow-up). Results Forty percent of trials reported significantly greater long-term weight loss with diet compared with aerobic exercise, while results for differences in weight regain were inconclusive. Diet+aerobic exercise resulted in significantly greater weight loss than diet alone in 50% of trials. However, weight regain (∼55% of loss) was similar in diet and diet+aerobic exercise groups. Fat-free mass tended to be preserved when interventions included exercise. PMID:25333384
What do popular Spanish women's magazines say about caesarean section? A 21-year survey.
Torloni, M R; Campos Mansilla, B; Merialdi, M; Betrán, A P
2014-04-01
Caesarean section (CS) rates are increasing worldwide and maternal request is cited as one of the main reasons for this trend. Women's preferences for route of delivery are influenced by popular media, including magazines. We assessed the information on CS presented in Spanish women's magazines. Systematic review. Women's magazines printed from 1989 to 2009 with the largest national distribution. Articles with any information on CS. Articles were selected, read and abstracted in duplicate. Sources of information, scientific accuracy, comprehensiveness and women's testimonials were objectively extracted using a content analysis form designed for this study. Accuracy, comprehensiveness and sources of information. Most (67%) of the 1223 selected articles presented exclusively personal opinion/birth stories, 12% reported the potential benefits of CS, 26% mentioned the short-term and 10% mentioned the long-term maternal risks, and 6% highlighted the perinatal risks of CS. The most frequent short-term risks were the increased time for maternal recovery (n = 86), frustration/feelings of failure (n = 83) and increased post-surgical pain (n = 71). The most frequently cited long-term risks were uterine rupture (n = 57) and the need for another CS in any subsequent pregnancy (n = 42). Less than 5% of the selected articles reported that CS could increase the risks of infection (n = 53), haemorrhage (n = 31) or placenta praevia/accreta in future pregnancies (n = 6). The sources of information were not reported by 68% of the articles. The portrayal of CS in Spanish women's magazines is not sufficiently comprehensive and does not provide adequate important information to help the readership to understand the real benefits and risks of this route of delivery. © 2014 The Authors. BJOG An International Journal of Obstetrics and Gynaecology published by John Wiley & Sons Ltd on behalf of Royal College of Obstetricians and Gynaecologists.
What do popular Spanish women's magazines say about caesarean section? A 21-year survey
Torloni, MR; Campos Mansilla, B; Merialdi, M; Betrán, AP
2014-01-01
Objectives Caesarean section (CS) rates are increasing worldwide and maternal request is cited as one of the main reasons for this trend. Women's preferences for route of delivery are influenced by popular media, including magazines. We assessed the information on CS presented in Spanish women's magazines. Design Systematic review. Setting Women's magazines printed from 1989 to 2009 with the largest national distribution. Sample Articles with any information on CS. Methods Articles were selected, read and abstracted in duplicate. Sources of information, scientific accuracy, comprehensiveness and women's testimonials were objectively extracted using a content analysis form designed for this study. Main outcome measures Accuracy, comprehensiveness and sources of information. Results Most (67%) of the 1223 selected articles presented exclusively personal opinion/birth stories, 12% reported the potential benefits of CS, 26% mentioned the short-term and 10% mentioned the long-term maternal risks, and 6% highlighted the perinatal risks of CS. The most frequent short-term risks were the increased time for maternal recovery (n = 86), frustration/feelings of failure (n = 83) and increased post-surgical pain (n = 71). The most frequently cited long-term risks were uterine rupture (n = 57) and the need for another CS in any subsequent pregnancy (n = 42). Less than 5% of the selected articles reported that CS could increase the risks of infection (n = 53), haemorrhage (n = 31) or placenta praevia/accreta in future pregnancies (n = 6). The sources of information were not reported by 68% of the articles. Conclusions The portrayal of CS in Spanish women's magazines is not sufficiently comprehensive and does not provide adequate important information to help the readership to understand the real benefits and risks of this route of delivery. PMID:24467797
Anomalous Low States and Long Term Variability in the Black Hole Binary LMC X-3
NASA Technical Reports Server (NTRS)
Smale, Alan P.; Boyd, Patricia T.
2012-01-01
Rossi X-ray Timing Explorer observations of the black hole binary LMC X-3 reveal an extended very low X-ray state lasting from 2003 December 13 until 2004 March 18, unprecedented both in terms of its low luminosity (>15 times fainter than ever before seen in this source) and long duration (approx 3 times longer than a typical low/hard state excursion). During this event little to no source variability is observed on timescales of approx hours-weeks, and the X-ray spectrum implies a luminosity upper limit of 1.2 x 10^35 erg/s. Five years later another extended low state occurs, lasting from 2008 December 11 until 2009 June 17. This event lasts nearly twice as long as the first, and while significant variability is observed, the source remains reliably in the low/hard spectral state for the approx 188 day duration. These episodes share some characteristics with the "anomalous low states" in the neutron star binary Her X-1. The average period and amplitude of the variability of LMC X-3 have different values between these episodes. We characterize the long-term variability of LMC X-3 before and after the two events using conventional and nonlinear time series analysis methods, and show that, as is the case in Her X-1, the characteristic amplitude of the variability is related to its characteristic timescale. Furthermore, the relation is in the same direction in both systems. This suggests that a similar mechanism gives rise to the long-term variability, which in the case of Her X-1 is reliably modeled with a tilted, warped precessing accretion disk.
LOOKING FOR GRANULATION AND PERIODICITY IMPRINTS IN THE SUNSPOT TIME SERIES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lopes, Ilídio; Silva, Hugo G., E-mail: ilidio.lopes@tecnico.ulisboa.pt, E-mail: hgsilva@uevora.pt
2015-05-10
The sunspot activity is the end result of the cyclic destruction and regeneration of magnetic fields by the dynamo action. We propose a new method to analyze the daily sunspot areas data recorded since 1874. By computing the power spectral density of the daily data series using the Mexican hat wavelet, we found a power spectrum with a well-defined shape, characterized by three features. The first term is the 22 yr solar magnetic cycle, estimated in our work to be 18.43 yr. The second term is related to the daily volatility of sunspots. This term is most likely produced by the turbulent motions linked to the solar granulation. The last term corresponds to a periodic source associated with the solar magnetic activity, for which the maximum power spectral density occurs at 22.67 days. This value is part of the 22–27 day periodicity region that shows an above-average intensity in the power spectra. The origin of this 22.67 day periodic process is not clearly identified, and there is a possibility that it can be produced by convective flows inside the star. The study clearly shows a north–south asymmetry. The 18.43 yr periodical source is correlated between the two hemispheres, but the 22.67 day one is not correlated. It is shown that toward the large timescales an excess occurs in the northern hemisphere, especially near the previous two periodic sources. To further investigate the 22.67 day periodicity, we made a Lomb–Scargle spectral analysis. The study suggests that this periodicity is distinct from others found nearby.
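The scale-by-scale power computation described above can be sketched with a hand-rolled Ricker (Mexican hat) wavelet. This is an illustrative reconstruction on synthetic data, not the authors' pipeline; the scale-to-period conversion T ≈ π√2·a used in the comments is a property of the Ricker wavelet's Fourier peak.

```python
import numpy as np

def ricker(points, a):
    # Mexican hat (Ricker) wavelet of width parameter a, unit L2 norm
    t = np.arange(points) - (points - 1) / 2.0
    A = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return A * (1.0 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def wavelet_power(series, scales):
    # Mean squared wavelet coefficient per scale: a crude power spectral density
    return np.array([
        np.mean(np.convolve(series, ricker(10 * int(np.ceil(a)), a), mode="same") ** 2)
        for a in scales
    ])

# Synthetic daily "sunspot area" series with a 27-day rotational modulation.
# A Ricker wavelet of scale a responds most strongly to periods near pi*sqrt(2)*a,
# so the 27-day signal should peak near scale a ~ 6.
rng = np.random.default_rng(0)
t = np.arange(5000)
series = 20.0 * np.sin(2 * np.pi * t / 27.0) + rng.normal(0.0, 5.0, t.size)
scales = np.arange(2, 40)
power = wavelet_power(series, scales)
best_scale = scales[np.argmax(power)]
```

On real daily sunspot areas, the same scan over scales would separate the long cycle, the day-to-day volatility floor, and the rotational periodicity the authors report.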
Time-frequency approach to underdetermined blind source separation.
Xie, Shengli; Yang, Liu; Yang, Jun-Mei; Zhou, Guoxu; Xiang, Yong
2012-02-01
This paper presents a new time-frequency (TF) underdetermined blind source separation approach based on the Wigner-Ville distribution (WVD) and the Khatri-Rao product to separate N non-stationary sources from M (M < N) mixtures. First, an improved method is proposed for estimating the mixing matrix, where the negative values of the auto WVD of the sources are fully considered. Then, after extracting all the auto-term TF points, the auto WVD value of the sources at every auto-term TF point can be recovered exactly with the proposed approach, no matter how many active sources there are, as long as N ≤ 2M-1. Further discussion of the extraction of auto-term TF points is given, and finally numerical simulation results are presented to show the superiority of the proposed algorithm over existing ones.
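The WVD machinery underlying the approach can be sketched in a few lines of NumPy. This is a textbook discrete Wigner-Ville distribution on a test tone, not the authors' algorithm; it illustrates why auto-term TF points concentrate energy: the lag kernel x[n+τ]·x*[n−τ] doubles the frequency, so a tone at normalized frequency f0 peaks at FFT bin 2·f0·N.

```python
import numpy as np

def wvd(x):
    # Discrete Wigner-Ville distribution: FFT over the lag tau of the
    # instantaneous autocorrelation x[n+tau] * conj(x[n-tau]).
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        taumax = min(n, N - 1 - n)          # lags available at this time index
        tau = np.arange(-taumax, taumax + 1)
        kernel = np.zeros(N, dtype=complex)
        kernel[tau % N] = x[n + tau] * np.conj(x[n - tau])
        W[n] = np.fft.fft(kernel).real
    return W

# A single tone at normalized frequency 0.125 concentrates at bin 2*0.125*64 = 16
N = 64
x = np.exp(2j * np.pi * 0.125 * np.arange(N))
W = wvd(x)
```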
NASA Astrophysics Data System (ADS)
Abro, Kashif Ali; Memon, Anwar Ahmed; Uqaili, Muhammad Aslam
2018-03-01
This article presents a comparative study of RL and RC electrical circuits using the newly introduced Atangana-Baleanu and Caputo-Fabrizio fractional derivatives. The governing ordinary differential equations of the RL and RC circuits are fractionalized in terms of fractional operators in the ranges 0 ≤ ξ ≤ 1 and 0 ≤ η ≤ 1. The analytic solutions of the fractional differential equations are obtained using the Laplace transform and its inverse. General solutions are investigated for periodic and exponential sources by applying the Atangana-Baleanu and Caputo-Fabrizio fractional operators separately. The solutions are expressed in terms of simple elementary functions with convolution products. On the basis of these new fractional derivatives with and without singular kernel, the voltage and current show interesting behavior, with several similarities and differences between the periodic and exponential sources.
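The two operators can be written out explicitly. These are the standard definitions from the Caputo-Fabrizio and Atangana-Baleanu papers, with normalization functions M(ξ), B(ξ) (M(0) = M(1) = 1) and Mittag-Leffler function E_ξ; the article's exact nondimensionalization of the circuit equations may differ from the sketch in the last line.

```latex
% Caputo-Fabrizio derivative (exponential, non-singular kernel), 0 < \xi < 1
{}^{CF}\!D_t^{\xi} f(t) = \frac{M(\xi)}{1-\xi}\int_0^t f'(\tau)\,
    \exp\!\left(-\frac{\xi\,(t-\tau)}{1-\xi}\right) d\tau

% Atangana-Baleanu (Caputo sense) derivative, Mittag-Leffler kernel
{}^{ABC}\!D_t^{\xi} f(t) = \frac{B(\xi)}{1-\xi}\int_0^t f'(\tau)\,
    E_{\xi}\!\left(-\frac{\xi\,(t-\tau)^{\xi}}{1-\xi}\right) d\tau

% Natural fractionalization of the circuit equations (assumed form)
L\,D_t^{\xi}\, i(t) + R\, i(t) = v(t) \quad (\text{RL}), \qquad
R\,C\,D_t^{\eta}\, v_c(t) + v_c(t) = v(t) \quad (\text{RC})
```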
Possible consequences of severe accidents at the Lubiatowo site, Poland
NASA Astrophysics Data System (ADS)
Seibert, Petra; Philipp, Anne; Hofman, Radek; Gufler, Klaus; Sholly, Steven
2014-05-01
The construction of a nuclear power plant is under consideration in Poland. One of the sites under discussion is near Lubiatowo, located on the coast of the Baltic Sea northwest of Gdansk. An assessment of possible environmental consequences is carried out for 88 real meteorological cases with the Lagrangian particle dispersion model FLEXPART. Based on a literature review, three reactor designs (ABWR, EPR, AP 1000) were identified as being under discussion in Poland. For each of the designs, a set of accident scenarios was evaluated and two source terms per reactor design were selected for analysis. One of the selected source terms was a relatively large release, while the second was a severe accident with an intact containment. The endpoints considered in the calculations are ground contamination with Cs-137 and time-integrated concentrations of I-131 in air, as well as committed doses. They are evaluated on a grid of ca. 3 km mesh size covering eastern Central Europe.
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 15 2010-04-01 2010-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 26 Internal Revenue 15 2011-04-01 2011-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 26 Internal Revenue 15 2012-04-01 2012-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 26 Internal Revenue 15 2014-04-01 2014-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 26 Internal Revenue 15 2013-04-01 2013-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
Multi-Scale Analysis of Trends in Northeastern Temperate Forest Springtime Phenology
NASA Astrophysics Data System (ADS)
Moon, M.; Melaas, E. K.; Sulla-menashe, D. J.; Friedl, M. A.
2017-12-01
The timing of spring leaf emergence is highly variable in many ecosystems, exerts first-order control on growing season length, and significantly modulates seasonally-integrated photosynthesis. Numerous studies have reported trends toward earlier spring phenology in temperate forests, with some papers indicating that this trend is also leading to increased carbon uptake. At broad spatial scales, however, most of these studies have used data from coarse spatial resolution instruments such as MODIS, which does not resolve ecologically important landscape-scale patterns in phenology. In this work, we examine how long-term trends in spring phenology differ across three data sources acquired at different scales of measurement at the Harvard Forest in central Massachusetts. Specifically, we compared trends in the timing of phenology based on long-term in-situ measurements of phenology, estimates based on eddy-covariance measurements of net carbon uptake transition dates, and two sources of satellite-based remote sensing (MODIS and Landsat) land surface phenology (LSP) data. Our analysis focused on the flux footprint surrounding the Harvard Forest Environmental Measurements (EMS) tower. Our results reveal clearly defined trends toward earlier springtime phenology in Landsat LSP and in the timing of tower-based net carbon uptake. However, we find no statistically significant trend in springtime phenology measured from MODIS LSP data products, possibly because the time series of MODIS observations is relatively short (13 years). The trend in tower-based transition dates exhibited a larger negative value than the trend derived from Landsat LSP data (-0.42 and -0.28 days per year for 21 and 28 years, respectively). More importantly, these results have two key implications for how changes in spring phenology are impacting carbon uptake at the landscape scale. First, long-term trends in spring phenology can be quite different depending on which data source is used to estimate the trend; and second, the response of carbon uptake to climate change may be more sensitive than the response of land surface phenology itself.
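The days-per-year trends quoted above reduce to a least-squares slope over a series of spring-onset dates. A minimal sketch on synthetic data; the series and its built-in -0.35 day/yr trend are invented for illustration, not the Harvard Forest record.

```python
import numpy as np

# Hypothetical spring-onset day-of-year series with an earlier-spring trend
years = np.arange(1990, 2018)
rng = np.random.default_rng(2)
onset = 130.0 - 0.35 * (years - years[0]) + rng.normal(0.0, 3.0, years.size)

# Least-squares linear trend; slope is in days per year (negative = earlier)
slope, intercept = np.polyfit(years, onset, 1)
```

With 3-day interannual noise over 28 years, the fitted slope recovers the underlying trend to within roughly ±0.15 day/yr, which is why trend comparisons across short records (e.g., 13 years of MODIS) are statistically fragile.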
Analysis of capital spending and capital financing among large US nonprofit health systems.
Stewart, Louis J
2012-01-01
This article examines recent trends (2006 to 2009) in capital spending among 25 of the largest nonprofit health systems in the United States and analyzes the financing sources that these large nonprofit health care systems used to fund their capital spending. Total capital spending for these 25 nonprofit health entities exceeded $41 billion over the four-year period of this study. Less than 3 percent of total capital spending resulted from mergers and acquisition activities. Total annual capital spending grew at an average annual rate of 17.6 percent during the first three years of the study period. Annual capital spending for 2009 fell by more than 22 percent from the prior year's level due to the impact of widespread disruption in US tax-exempt variable rate debt markets. While cash inflow from long-term debt issues was a significant source of capital financing, this study's primary finding was that operating cash flow was the predominant source of funding for capital spending. Key words: nonprofit, mergers and acquisitions (M&A), capital spending, capital financing.
LittleQuickWarp: an ultrafast image warping tool.
Qu, Lei; Peng, Hanchuan
2015-02-01
Warping images into a standard coordinate space is critical for many image computing related tasks. However, for multi-dimensional and high-resolution images, an accurate warping operation itself is often very expensive in terms of computer memory and computational time. For high-throughput image analysis studies such as brain mapping projects, it is desirable to have high performance image warping tools that are compatible with common image analysis pipelines. In this article, we present LittleQuickWarp, a swift and memory efficient tool that boosts 3D image warping performance dramatically and at the same time has high warping quality similar to the widely used thin plate spline (TPS) warping. Compared to the TPS, LittleQuickWarp can improve the warping speed 2-5 times and reduce the memory consumption 6-20 times. We have implemented LittleQuickWarp as an Open Source plug-in program on top of the Vaa3D system (http://vaa3d.org). The source code and a brief tutorial can be found in the Vaa3D plugin source code repository. Copyright © 2014 Elsevier Inc. All rights reserved.
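The thin plate spline baseline that LittleQuickWarp is compared against can be sketched in pure NumPy: fit kernel-plus-affine coefficients at paired control points, then evaluate U(r) = r² log r terms at query points. This is a textbook 2-D implementation with made-up landmarks, not the Vaa3D plug-in code.

```python
import numpy as np

def tps_kernel(r):
    # TPS radial basis U(r) = r^2 log(r), with U(0) = 0
    out = np.zeros_like(r)
    nz = r > 0
    out[nz] = r[nz] ** 2 * np.log(r[nz])
    return out

def tps_fit(src, dst):
    # Solve the exact-interpolation TPS system mapping src -> dst (2-D points)
    n = len(src)
    K = tps_kernel(np.linalg.norm(src[:, None] - src[None, :], axis=-1))
    P = np.hstack([np.ones((n, 1)), src])      # affine part: 1, x, y
    A = np.zeros((n + 3, n + 3))
    A[:n, :n], A[:n, n:], A[n:, :n] = K, P, P.T
    b = np.zeros((n + 3, 2))
    b[:n] = dst
    return np.linalg.solve(A, b)               # kernel weights + affine coeffs

def tps_warp(coef, src, pts):
    n = len(src)
    K = tps_kernel(np.linalg.norm(pts[:, None] - src[None, :], axis=-1))
    P = np.hstack([np.ones((len(pts), 1)), pts])
    return K @ coef[:n] + P @ coef[n:]

# Hypothetical landmarks in the subject image and in the standard space
src = np.array([[0, 0], [0, 10], [10, 0], [10, 10], [5, 5]], dtype=float)
dst = src + np.array([[1, 0], [0, 1], [-1, 0], [0, -1], [0.5, 0.5]])
coef = tps_fit(src, dst)
warped = tps_warp(coef, src, np.array([[2.0, 3.0]]))
```

The cost driver the paper targets is visible here: warping an image means evaluating the kernel term against all n control points for every voxel, which is what a fast approximation can shortcut.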
Development of a computer code to couple PWR-GALE output and PC-CREAM input
NASA Astrophysics Data System (ADS)
Kuntjoro, S.; Budi Setiawan, M.; Nursinta Adi, W.; Deswandri; Sunaryo, G. R.
2018-02-01
Radionuclide dispersion analysis is an important part of reactor safety analysis. From it, the doses received by radiation workers and by communities around a nuclear reactor can be obtained. Radionuclide dispersion analysis under normal operating conditions is carried out using the PC-CREAM code, which requires input data such as the source term and the population distribution. These inputs are derived from the output of another program, PWR-GALE, and from population distribution data written in a specific format. Compiling PC-CREAM inputs manually requires high accuracy, since it involves large amounts of data in fixed formats, and manual compilation is error-prone. To minimize such errors, a coupling program for PWR-GALE and PC-CREAM was developed, together with a program that writes population distribution data in the required PC-CREAM input format. This work therefore covers both the coupling between PWR-GALE output and PC-CREAM input and the formatted writing of population data. The programs are written in Python, which has the advantages of being multiplatform, object-oriented and interactive. The result is software that couples the source term data and writes the population distribution data, so that PC-CREAM inputs can be produced easily and without formatting errors.
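The shape of such a coupling is easy to sketch in Python. Note that both the PWR-GALE output layout and the PC-CREAM record format below are invented placeholders; the abstract does not describe the real file formats.

```python
# Assumption: illustrative formats only, not the real PWR-GALE/PC-CREAM ones.

def parse_gale_output(text):
    # Pull (nuclide, activity) pairs from a simple whitespace-separated table
    source_term = {}
    for line in text.strip().splitlines():
        parts = line.split()
        if len(parts) == 2:
            source_term[parts[0]] = float(parts[1])
    return source_term

def format_cream_input(source_term):
    # Hypothetical fixed-width records: 8-char nuclide name, 12.4e activity
    return "\n".join(f"{nuc:<8s}{act:12.4e}"
                     for nuc, act in sorted(source_term.items()))

gale = """
Cs-137   1.5e-3
I-131    2.0e-2
"""
print(format_cream_input(parse_gale_output(gale)))
```

The point of automating this step is exactly what the abstract argues: fixed-width formats make hand-edited inputs error-prone, while a parser/writer pair makes the conversion deterministic.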
Donnelly, Aoife; Naughton, Owen; Misstear, Bruce; Broderick, Brian
2016-10-14
This article describes a new methodology for increasing the spatial representativeness of individual monitoring sites. Air pollution levels at a given point are influenced by emission sources in the immediate vicinity. Since emission sources are rarely uniformly distributed around a site, concentration levels will inevitably be most affected by the sources in the prevailing upwind direction. The methodology provides a means of capturing this effect and providing additional information regarding source/pollution relationships. The methodology allows for the division of the air quality data from a given monitoring site into a number of sectors or wedges based on wind direction and estimation of annual mean values for each sector, thus optimising the information that can be obtained from a single monitoring station. The method corrects for short-term data, diurnal and seasonal variations in concentrations (which can produce uneven weighting of data within each sector) and uneven frequency of wind directions. Significant improvements in correlations between the air quality data and the spatial air quality indicators were obtained after application of the correction factors. This suggests the application of these techniques would be of significant benefit in land-use regression modelling studies. Furthermore, the method was found to be very useful for estimating long-term mean values and wind direction sector values using only short-term monitoring data. The methods presented in this article can result in cost savings through minimising the number of monitoring sites required for air quality studies while also capturing a greater degree of variability in spatial characteristics. In this way, more reliable, but also more expensive monitoring techniques can be used in preference to a higher number of low-cost but less reliable techniques. 
The methods described in this article have applications in local air quality management, source receptor analysis, land-use regression mapping and modelling and population exposure studies.
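The core sectoring step described above can be sketched as follows. The eight 45-degree sectors, the synthetic north-east source, and the equal-weight annual mean are illustrative assumptions, not the authors' exact correction factors (which also adjust for diurnal and seasonal variation).

```python
import numpy as np

# Synthetic hourly record: a source in the 0-90 degree arc raises concentrations
rng = np.random.default_rng(1)
wd = rng.uniform(0.0, 360.0, 8760)                      # wind direction, degrees
conc = 10.0 + 5.0 * (wd < 90.0) + rng.normal(0.0, 1.0, wd.size)

# Divide observations into eight 45-degree wind-direction sectors
sector = np.minimum((wd // 45.0).astype(int), 7)
sector_means = np.array([conc[sector == s].mean() for s in range(8)])

# Reweighted annual mean: each sector contributes equally, removing the
# bias introduced by an uneven frequency of wind directions
annual_mean_corrected = sector_means.mean()
```

Comparing `sector_means` against upwind land-use indicators, rather than a single site-wide mean, is what lets one monitoring station stand in for several in a regression study.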
Gaines, Tommi L; Urada, Lianne A; Martinez, Gustavo; Goldenberg, Shira M; Rangel, Gudelia; Reed, Elizabeth; Patterson, Thomas L; Strathdee, Steffanie A
2015-06-01
This study quantitatively examined the prevalence and correlates of short-term sex work cessation among female sex workers who inject drugs (FSW-IDUs) and determined whether injection drug use was independently associated with cessation. We used data from FSW-IDUs (n=467) enrolled into an intervention designed to increase condom use and decrease sharing of injection equipment but was not designed to promote sex work cessation. We applied a survival analysis that accounted for quit-re-entry patterns of sex work over 1-year stratified by city, Tijuana and Ciudad Juarez, Mexico. Overall, 55% of participants stopped sex work at least once during follow-up. Controlling for other characteristics and intervention assignment, injection drug use was inversely associated with short-term sex work cessation in both cities. In Ciudad Juarez, women receiving drug treatment during follow-up had a 2-fold increase in the hazard of stopping sex work. In both cities, income from sources other than sex work, police interactions and healthcare access were independently and significantly associated with shorter-term cessation. Short-term sex work cessation was significantly affected by injection drug use. Expanded drug treatment and counseling coupled with supportive services such as relapse prevention, job training, and provision of alternate employment opportunities may promote longer-term cessation among women motivated to leave the sex industry. Copyright © 2015 Elsevier Ltd. All rights reserved.
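The quit/re-entry survival model itself is not reproducible from the abstract, but the underlying estimator can be sketched: a plain Kaplan-Meier estimate, here with invented durations (months to first cessation, with 0 marking women censored at study end).

```python
import numpy as np

def kaplan_meier(durations, events):
    # S(t) = product over event times of (1 - d_t / n_t),
    # where d_t = events at t and n_t = subjects still at risk at t
    durations = np.asarray(durations, dtype=float)
    events = np.asarray(events, dtype=int)
    curve = []
    surv = 1.0
    for t in np.unique(durations[events == 1]):
        at_risk = np.sum(durations >= t)
        d = np.sum((durations == t) & (events == 1))
        surv *= 1.0 - d / at_risk
        curve.append((t, surv))
    return curve

# Hypothetical follow-up: months until first sex-work cessation (1 = observed)
curve = kaplan_meier([2, 3, 3, 5, 8, 12, 12], [1, 1, 0, 1, 1, 0, 0])
```

The study's actual analysis extends this idea to repeated quit/re-entry spells and adds covariates (drug use, drug treatment, income sources) via regression on the hazard.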
Gaines, Tommi L.; Urada, Lianne; Martinez, Gustavo; Goldenberg, Shira M.; Rangel, Gudelia; Reed, Elizabeth; Patterson, Thomas L.; Strathdee, Steffanie A.
2015-01-01
Objective This study quantitatively examined the prevalence and correlates of short-term sex work cessation among female sex workers who inject drugs (FSW-IDUs) and determined whether injection drug use was independently associated with cessation. Methods We used data from FSW-IDUs (n=467) enrolled into an intervention designed to increase condom use and decrease sharing of injection equipment but was not designed to promote sex work cessation. We applied a survival analysis that accounted for quit-re-entry patterns of sex work over 1-year stratified by city, Tijuana and Ciudad Juarez, Mexico. Results Overall, 55% of participants stopped sex work at least once during follow-up. Controlling for other characteristics and intervention assignment, injection drug use was inversely associated with short-term sex work cessation in both cities. In Ciudad Juarez, women receiving drug treatment during follow-up had a 2-fold increase in the hazard of stopping sex work. In both cities, income from sources other than sex work, police interactions and healthcare access were independently and significantly associated with shorter-term cessation. Conclusions Short-term sex work cessation was significantly affected by injection drug use. Expanded drug treatment and counseling coupled with supportive services such as relapse prevention, job training, and provision of alternate employment opportunities may promote longer-term cessation among women motivated to leave the sex industry. PMID:25644589
Warr, Benjamin; Magerl, Andreas
2016-01-01
Summary In the past few years, resource use and resource efficiency have been implemented in the European Union (EU) environmental policy programs as well as international sustainable development programs. In their programs, the EU focuses on four resource types that should be addressed: materials, energy (or carbon dioxide [CO2] emissions), water, and land. In this article, we first discuss different perspectives on energy use and present the results of a long‐term exergy and useful work analysis of the Austrian economy for the period 1900–2012, using the methodology developed by Ayres and Warr. Second, we discuss Austrian resource efficiency by comparing the presented exergy and useful work data with material use, CO2 emissions, and land‐use data taken from statistical sources. This comparison provides, for the first time, a long‐term analysis of Austrian resource efficiency based on a broad understanding thereof and evaluates Austrian development in relation to EU and Austrian policy targets. PMID:29353991
Evaluating Sustainability Models for Interoperability through Brokering Software
NASA Astrophysics Data System (ADS)
Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew
2016-04-01
Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding sources, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis, suggesting that hybrid funding models present the most likely avenue to long-term sustainability.
A Growing Role for Gender Analysis in Air Pollution Epidemiology
Clougherty, Jane E.
2010-01-01
Objective Epidemiologic studies of air pollution effects on respiratory health report significant modification by sex, although results are not uniform. Importantly, it remains unclear whether modifications are attributable to socially derived gendered exposures, to sex-linked physiological differences, or to some interplay thereof. Gender analysis, which aims to disaggregate social from biological differences between males and females, may help to elucidate these possible sources of effect modification. Data sources and data extraction A PubMed literature search was performed in July 2009, using the terms “respiratory” and any of “sex” or “gender” or “men and women” or “boys and girls” and either “PM2.5” (particulate matter ≤ 2.5 μm in aerodynamic diameter) or “NO2” (nitrogen dioxide). I reviewed the identified studies, and others cited therein, to summarize current evidence of effect modification, with attention to authors’ interpretation of observed differences. Owing to broad differences in exposure mixes, outcomes, and analytic techniques, with few studies examining any given combination thereof, meta-analysis was not deemed appropriate at this time. Data synthesis More studies of adults report stronger effects among women, particularly for older persons or where using residential exposure assessment. Studies of children suggest stronger effects among boys in early life and among girls in later childhood. Conclusions The qualitative review describes possible sources of difference in air pollution response between women and men, which may vary by life stage, coexposures, hormonal status, or other factors. The sources of observed effect modifications remain unclear, although gender analytic approaches may help to disentangle gender and sex differences in pollution response. A framework for incorporating gender analysis into environmental epidemiology is offered, along with several potentially useful methods from gender analysis. PMID:20123621
12 CFR 201.4 - Availability and terms of credit.
Code of Federal Regulations, 2014 CFR
2014-01-01
... overnight, as a backup source of funding to a depository institution that is in generally sound financial... to a few weeks as a backup source of funding to a depository institution if, in the judgment of the... very short-term basis, usually overnight, as a backup source of funding to a depository institution...
2006-09-01
Lavoie, D. Kurts, SYNTHETIC ENVIRONMENTS AT THE ENTERPRISE LEVEL: OVERVIEW OF A GOVERNMENT OF CANADA (GOC), ACADEMIA and INDUSTRY DISTRIBUTED...vehicle (UAV) focused to locate the radiological source, and by comparing the performance of these assets in terms of various capability based...framework to analyze homeland security capabilities • Illustrate how a rapidly configured distributed simulation involving academia, industry and
Verification of Methods for Assessing the Sustainability of Monitored Natural Attenuation (MNA)
2013-01-01
sugars TOC total organic carbon TSR thermal source removal USACE U.S. Army Corps of Engineers USEPA U.S. Environmental Protection Agency USGS...the SZD function for long-term DNAPL dissolution simulations. However, the sustainability assessment was easily implemented using an alternative...neutral sugars [THNS]). Chapelle et al. (2009) suggested THAA and THNS as measures of the bioavailability of organic carbon based on an analysis of
Characterization and Reliability of Vertical N-Type Gallium Nitride Schottky Contacts
2016-09-01
barrier diode SEM scanning electron microscopy SiC silicon carbide SMU source measure unit xvi THIS PAGE INTENTIONALLY LEFT BLANK xvii...arguably the Schottky barrier diode (SBD). The SBD is a fundamental component in the majority of power electronic devices; specifically, those used in...Ishizuka, and Ueno demonstrated the long-term reliability of vertical metal-GaN Schottky barrier diodes through their analysis of the degradation
The Navy’s Superior Supplier Incentive Program: Analysis of Supplier Proposed Benefits
2015-12-01
rather than “short-term, easy, expendable and replaceable sources of goods and services” (Carter & Choi, 2008, p. 2). Japanese automakers Toyota ...was the key element behind Toyota and Honda’s strategic successes. Under the supplier keiretsu, the automakers worked closely with the selected...suppliers to achieve mutually beneficial objectives. Toyota and Honda implemented the keiretsu model in their North American plants and achieved
Spectrophotometers for plutonium monitoring in HB-line
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lascola, R. J.; O'Rourke, P. E.; Kyser, E. A.
2016-02-12
This report describes the equipment, control software, calibrations for total plutonium and plutonium oxidation state, and qualification studies for the instrument. It also provides a detailed description of the uncertainty analysis, which includes source terms associated with plutonium calibration standards, instrument drift, and inter-instrument variability. Also included are work instructions for instrument, flow cell, and optical fiber setup, work instructions for routine maintenance, and drawings and schematic diagrams.
An airport cargo inspection system based on X-ray and thermal neutron analysis (TNA).
Ipe, Nisy E; Akery, A; Ryge, P; Brown, D; Liu, F; Thieu, J; James, B
2005-01-01
A cargo inspection system incorporating a high-resolution X-ray imaging system with a material-specific detection system based on Ancore Corporation's patented thermal neutron analysis (TNA) technology can detect bulk quantities of explosives and drugs concealed in trucks or cargo containers. The TNA process utilises a 252Cf neutron source surrounded by a moderator. The neutron interactions with the inspected object result in strong and unique gamma-ray signals from nitrogen, which is a key ingredient in modern high explosives, and from chlorinated drugs. The TNA computer analyses the gamma-ray signals and automatically determines the presence of explosives or drugs. The radiation source terms and shielding design of the facility are described. For the X-ray generator, the primary beam, leakage radiation, and scattered primary and leakage radiation were considered. For the TNA, the primary neutrons and tunnel scattered neutrons as well as the neutron-capture gamma rays were considered.
NASA Technical Reports Server (NTRS)
Kraft, R. E.
1996-01-01
The objective of this effort is to develop an analytical model for the coupling of active noise control (ANC) piston-type actuators that are mounted flush to the inner and outer walls of an annular duct to the modes in the duct generated by the actuator motion. The analysis will be used to couple the ANC actuators to the modal analysis propagation computer program for the annular duct, to predict the effects of active suppression of fan-generated engine noise sources. This combined program will then be available to assist in the design or evaluation of ANC systems in fan engine annular exhaust ducts. An analysis has been developed to predict the modes generated in an annular duct due to the coupling of flush-mounted ring actuators on the inner and outer walls of the duct. The analysis has been combined with a previous analysis for the coupling of modes to a cylindrical duct in a FORTRAN computer program to perform the computations. The method includes the effects of uniform mean flow in the duct. The program can be used for design or evaluation purposes for active noise control hardware for turbofan engines. Predictions for some sample cases modeled after the geometry of the NASA Lewis ANC Fan indicate very efficient coupling in both the inlet and exhaust ducts for the m = 6 spinning mode at frequencies where only a single radial mode is cut-on. Radial mode content in higher order cut-off modes at the source plane and the required actuator displacement amplitude to achieve 110 dB SPL levels in the desired mode were predicted. Equivalent cases with and without flow were examined for the cylindrical and annular geometry, and little difference was found for a duct flow Mach number of 0.1. The actuator ring coupling program will be adapted as a subroutine to the cylindrical duct modal analysis and the exhaust duct modal analysis. 
This will allow the fan source to be defined in terms of characteristic modes at the fan source plane and predict the propagation to the arbitrarily-located ANC source plane. The actuator velocities can then be determined to generate the anti-phase mode. The resulting combined fan source/ANC pressure can then be calculated at any desired wall sensor position. The actuator velocities can be determined manually or using a simulation of a control system feedback loop. This will provide a very useful ANC system design and evaluation tool.
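The cut-on behavior described above (the m = 6 spinning mode propagating only above its cut-off frequency) can be sketched numerically. For a hard-walled cylindrical duct, mode (m, n) cuts on at f_c = c·j'_mn/(2πa), where j'_mn is the n-th positive zero of the derivative of the Bessel function J_m. The sound speed and duct radius below are illustrative assumptions, not the NASA Lewis ANC Fan geometry:

```python
import math

def bessel_j(m, x, steps=2000):
    """J_m(x) via the integral representation (1/pi) * int_0^pi cos(m*t - x*sin(t)) dt."""
    h = math.pi / steps
    total = 0.5 * (math.cos(0.0) + math.cos(m * math.pi - x * math.sin(math.pi)))
    for k in range(1, steps):
        t = k * h
        total += math.cos(m * t - x * math.sin(t))
    return total * h / math.pi

def djm(m, x, h=1e-4):
    """Central-difference estimate of J_m'(x)."""
    return (bessel_j(m, x + h) - bessel_j(m, x - h)) / (2 * h)

def first_deriv_zero(m, lo, hi):
    """Bisection for the zero of J_m' bracketed in [lo, hi]."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if djm(m, lo) * djm(m, mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

m = 6                                   # spinning mode order from the abstract
jprime = first_deriv_zero(m, 7.0, 8.0)  # j'_{6,1}, approximately 7.5
c, a = 343.0, 0.5                       # assumed sound speed (m/s) and duct radius (m)
f_cuton = c * jprime / (2 * math.pi * a)
```

Below f_cuton the (6, 1) mode is evanescent; only above it does the mode carry acoustic power down the duct, which is why single-radial-mode frequencies are attractive for ANC coupling studies.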
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ford, S R; Dreger, D S; Phillips, W S
2008-07-16
Inversions for regional attenuation (1/Q) of Lg are performed in two different regions. The path attenuation component of the Lg spectrum is isolated using the coda-source normalization method, which corrects the Lg spectral amplitude for the source using the stable, coda-derived source spectra. Tomographic images of Northern California agree well with one-dimensional (1-D) Lg Q estimated from five different methods. We note there is some tendency for tomographic smoothing to increase Q relative to targeted 1-D methods. For example, in the San Francisco Bay Area, which has high attenuation relative to the rest of its region, Q is over-estimated by ~30. Coda-source normalized attenuation tomography is also carried out for the Yellow Sea/Korean Peninsula (YSKP), where output parameters (site, source, and path terms) are compared with those from the amplitude tomography method of Phillips et al. (2005) as well as a new method that ties the source term to the MDAC formulation (Walter and Taylor, 2001). The source terms show similar scatter between coda-source corrected and MDAC source perturbation methods, whereas the amplitude method has the greatest correlation with estimated true source magnitude. The coda-source better represents the source spectra compared to the estimated magnitude and could be the cause of the scatter. The similarity in the source terms between the coda-source and MDAC-linked methods shows that the latter method may approximate the effect of the former, and therefore could be useful in regions without coda-derived sources. The site terms from the MDAC-linked method correlate slightly with global Vs30 measurements. While the coda-source and amplitude ratio methods do not correlate with Vs30 measurements, they do correlate with one another, which provides confidence that the two methods are consistent.
The path Q⁻¹ values are very similar between the coda-source and amplitude ratio methods except for small differences in the Daxing'anling Mountains, in the northern YSKP. However, there is one large difference between the MDAC-linked method and the others in the region near stations TJN and INCN, which points to site effects as the cause for the difference.
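After coda-source correction, the path-attenuation measurement above reduces to fitting an exponential spectral decay, A(f) = S(f)·exp(−πfr/(Qv)). A minimal sketch on synthetic data (the Q, distance, and velocity values are illustrative, not from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
Q_true, r, v = 300.0, 400.0, 3.5       # quality factor, path length (km), Lg group velocity (km/s)
f = np.linspace(0.5, 8.0, 40)          # frequency band (Hz)

# Source-corrected spectral amplitudes: A/S = exp(-pi * f * r / (Q * v)), plus small noise
amp = np.exp(-np.pi * f * r / (Q_true * v)) * np.exp(rng.normal(0.0, 0.01, f.size))

# ln(A/S) is linear in f with slope -pi*r/(Q*v); recover Q from the fitted slope
slope, _ = np.polyfit(f, np.log(amp), 1)
Q_est = -np.pi * r / (v * slope)
```

A tomographic inversion generalizes this single-path fit by distributing the measured attenuation over Q⁻¹ cells along many crossing paths.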
The Impact of Long-Term Climate Change on Nitrogen Runoff at the Watershed Scale.
NASA Astrophysics Data System (ADS)
Dorley, J.; Duffy, C.; Arenas Amado, A.
2017-12-01
The impact of agricultural runoff is a major concern for water quality of Midwestern streams. This concern is largely due to excessive use of agricultural fertilizer, a major source of nutrients in many Midwestern watersheds. In order to improve water quality in these watersheds, understanding the long-term trends in nutrient concentration and discharge is an important water quality problem. This study attempts to analyze the role of long-term temperature and precipitation on nitrate runoff in an agriculturally dominated watershed in Iowa. The approach attempts to establish the concentration-discharge (C-Q) signature for the watershed using time series analysis, frequency analysis and model simulation. The climate data is from the Intergovernmental Panel on Climate Change (IPCC), model GFDL-CM3 (Geophysical Fluid Dynamic Laboratory Coupled Model 3). The historical water quality data was made available by IIHR-Hydroscience & Engineering at the University of Iowa for the Clear Creek Watershed (CCW). The CCW is located in east-central Iowa and is representative of many Midwestern watersheds, with a humid-continental climate and predominantly agricultural land use. The study shows how long-term climate changes in temperature and precipitation affect the C-Q dynamics and how a relatively simple approach to data analysis and model projections can be applied to best management practices at the site.
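A C-Q signature of the kind described above is commonly summarized as a power law, C = aQ^b, with the exponent b fit in log-log space (b < 0 indicates dilution-dominated behavior). A minimal sketch on synthetic data; the parameter values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
Q = rng.lognormal(mean=1.0, sigma=0.8, size=200)   # synthetic daily discharge series
a_true, b_true = 5.0, -0.2                         # dilution-type C-Q behavior (b < 0)
C = a_true * Q ** b_true                           # nitrate concentration, noise-free here

# Fit log C = log a + b log Q by ordinary least squares
b_est, loga_est = np.polyfit(np.log(Q), np.log(C), 1)
a_est = np.exp(loga_est)
```

Comparing b fitted over historical windows against windows driven by GFDL-CM3 projections is one simple way to test whether the watershed's C-Q dynamics shift under long-term climate change.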
NASA Astrophysics Data System (ADS)
Hao, Ming; Rohrdantz, Christian; Janetzko, Halldór; Keim, Daniel; Dayal, Umeshwar; Haug, Lars-Erik; Hsu, Mei-Chun
2012-01-01
Twitter currently receives over 190 million tweets (small text-based Web posts) and manufacturing companies receive over 10 thousand web product surveys a day, in which people share their thoughts regarding a wide range of products and their features. A large number of tweets and customer surveys include opinions about products and services. However, with Twitter being a relatively new phenomenon, these tweets are underutilized as a source for determining customer sentiments. To explore high-volume customer feedback streams, we integrate three time series-based visual analysis techniques: (1) feature-based sentiment analysis that extracts, measures, and maps customer feedback; (2) a novel idea of term associations that identify attributes, verbs, and adjectives frequently occurring together; and (3) new pixel cell-based sentiment calendars, geo-temporal map visualizations and self-organizing maps to identify co-occurring and influential opinions. We have combined these techniques into a well-fitted solution for an effective analysis of large customer feedback streams such as for movie reviews (e.g., Kung-Fu Panda) or web surveys (buyers).
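The "term associations" idea, attributes, verbs, and adjectives frequently occurring together, can be sketched as pairwise co-occurrence counting within each feedback item. The toy corpus below is invented; the system described in the abstract uses a far richer extraction pipeline:

```python
from collections import Counter
from itertools import combinations

feedback = [
    "battery life short",
    "battery drains fast",
    "screen bright battery short",
]

pair_counts = Counter()
for item in feedback:
    tokens = sorted(set(item.split()))            # deduplicate, fix pair order
    pair_counts.update(combinations(tokens, 2))   # count unordered co-occurring pairs

top_pair, top_count = pair_counts.most_common(1)[0]
```

Frequent pairs such as ('battery', 'short') are exactly the attribute-adjective associations that the pixel-cell calendars and self-organizing maps then visualize over time.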
Nuclear risk analysis of the Ulysses mission
NASA Astrophysics Data System (ADS)
Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.
1991-01-01
The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source term probability distributions provided by the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor determined, and the functional relationship among all the factors established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDF) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
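The Monte Carlo combination of factor distributions into a CCDF can be sketched as follows. The three lognormal factors, their parameters, and the product relationship are purely illustrative stand-ins, not the distributions or functional form used in the FSAR:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Illustrative accident-consequence factors, each sampled from its own distribution
release_fraction = rng.lognormal(mean=-4.0, sigma=1.0, size=n)
dispersion = rng.lognormal(mean=-2.0, sigma=0.5, size=n)
dose_factor = rng.lognormal(mean=0.0, sigma=0.3, size=n)

# Assumed functional relationship among the factors: a simple product
consequence = release_fraction * dispersion * dose_factor

# Complementary cumulative distribution: ccdf[i] = P(consequence >= c_sorted[i])
c_sorted = np.sort(consequence)
ccdf = 1.0 - np.arange(n) / n
```

Plotting ccdf against c_sorted on log-log axes gives the familiar risk curve: the probability that consequences equal or exceed any given value.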
NASA Astrophysics Data System (ADS)
Ravanel, X.; Trouiller, C.; Juhel, M.; Wyon, C.; Kwakman, L. F. Tz.; Léonard, D.
2008-12-01
Recent time-of-flight secondary ion mass spectrometry studies using primary ion cluster sources such as Aun+, SF5+, Bin+ or C60+ have shown the great advantages in terms of secondary ion yield enhancement and ion formation efficiency of polyatomic ion sources as compared to monoatomic ion sources like the commonly used Ga+. In this work, the effective gains provided by such a source in the static ToF-SIMS analysis of microelectronics devices were investigated. Firstly, the influence of the number of atoms in the primary cluster ion on secondary ion formation was studied for physically adsorbed di-isononyl phthalate (DNP) (plasticizer) and perfluoropolyether (PFPE). A drastic increase in secondary ion formation efficiency and a much lower detection limit were observed when using a polyatomic primary ion. Moreover, the yield of the higher mass species was much enhanced, indicating a lower degree of fragmentation that can be explained by the fact that the primary ion energy is spread out more widely, or that there is a lower energy per incoming ion. Secondly, the influence of the number of Bi atoms in the Bin primary ion on the information depth was studied using reference thermally grown silicon oxide samples. The information depth provided by a Bin cluster was shown to be lowered when the number of atoms in the aggregate was increased.
Source apportionment of wet-deposited atmospheric mercury in Tampa, Florida
NASA Astrophysics Data System (ADS)
Michael, Ryan; Stuart, Amy L.; Trotz, Maya A.; Akiwumi, Fenda
2016-03-01
In this paper, sources of mercury deposition to the Tampa area (Florida, USA) are investigated by analysis of one year (March 2000-March 2001) of daily wet deposition data. HYSPLIT back-trajectory modeling was performed to assess potential source locations for high versus low concentration events in data stratified by precipitation level. Positive matrix factorization (PMF) was also applied to apportion the elemental compositions from each event and to identify sources. Increased total mercury deposition was observed during summer months, corresponding to increased precipitation. However, mercury concentration in deposited samples was not strongly correlated with precipitation amount. Back-trajectories show air masses passing over Florida land in the short (12 h) and medium (24 h) term prior to deposition for high mercury concentration events. PMF results indicate that eleven factors contribute to the deposited elements in the event data. Diagnosed elemental profiles suggest the sources that contribute to mercury wet deposition at the study site are coal combustion (52% of the deposited mercury mass), municipal waste incineration (23%), medical waste incineration (19%), and crustal dust (6%). Overall, results suggest that sources local to the county and in Florida likely contributed substantially to mercury deposition at the study site, but distant sources may also contribute.
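Positive matrix factorization decomposes the event-by-element data matrix X (events × elements) into nonnegative factor contributions G and profiles F with X ≈ GF. The sketch below uses plain multiplicative-update NMF on synthetic data as a simplified stand-in: real PMF additionally weights every matrix entry by its measurement uncertainty:

```python
import numpy as np

rng = np.random.default_rng(0)
n_events, n_elements, n_factors = 60, 12, 3

# Synthetic nonnegative data built from known factors plus small positive noise
G_true = rng.uniform(0, 1, (n_events, n_factors))
F_true = rng.uniform(0, 1, (n_factors, n_elements))
X = G_true @ F_true + rng.uniform(0, 0.01, (n_events, n_elements))

# Multiplicative updates (Lee-Seung) keep both factors nonnegative throughout
G = rng.uniform(0.1, 1, (n_events, n_factors))
F = rng.uniform(0.1, 1, (n_factors, n_elements))
err0 = np.linalg.norm(X - G @ F)
for _ in range(500):
    F *= (G.T @ X) / (G.T @ G @ F + 1e-12)
    G *= (X @ F.T) / (G @ F @ F.T + 1e-12)
err = np.linalg.norm(X - G @ F)
```

In source apportionment, each row of F is then inspected for tracer elements (e.g., selenium for coal combustion) to label the factor, and the columns of G give each source's contribution per event.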
10 CFR 40.41 - Terms and conditions of licenses.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Terms and conditions of licenses. 40.41 Section 40.41 Energy NUCLEAR REGULATORY COMMISSION DOMESTIC LICENSING OF SOURCE MATERIAL Licenses § 40.41 Terms and... the regulations in this part shall confine his possession and use of source or byproduct material to...
Bosire Onyancha, Omwoyo
2008-05-01
As channels of communicating HIV/AIDS research information, serial publications and particularly journals are increasingly used in response to the pandemic. The last few decades have witnessed a proliferation of sources of HIV/AIDS-related information, bringing many challenges to collection-development librarians as well as to researchers. This study uses an informetric approach to examine the growth, productivity and scientific impact of these sources, during the period 1980 to 2005, and especially to measure performance in the publication and dissemination of HIV/AIDS research about or from eastern or southern Africa. Data were collected from MEDLINE, Science Citation Index (SCI), Social Sciences Citation Index (SSCI), and Ulrich's Periodical Directory. The analysis used Sitkis version 1.5, Microsoft Office Access, Microsoft Office Excel, Bibexcel, and Citespace version 2.0.1. The specific objectives were to identify the number of sources of HIV/AIDS-related information that have been published in the region, the coverage of these in key bibliographic databases, the most commonly used publication type for HIV/AIDS research, the countries in which the sources are published, the sources' productivity in terms of numbers of papers and citations, the most influential sources, the subject coverage of the sources, and the core sources of HIV/AIDS-information.
Recent changes in the oxidized to reduced nitrogen ratio in atmospheric precipitation
NASA Astrophysics Data System (ADS)
Kurzyca, Iwona; Frankowski, Marcin
2017-10-01
In this study, the characteristics of precipitation in terms of various nitrogen forms (NO3-, NO2-, NH4+, Norganic, Ntotal) are presented. The samples were collected in areas of different anthropogenic pressure (urban area vs. ecologically protected woodland area, ∼30 km distant from each other; Wielkopolska region, Poland). Based on the Nox and Nred emission profiles (Nox/Nred ratio), a temporal and spatial comparison was carried out. For both sites, during a decade of observation, more than 60% of samples had a higher contribution of N-NH4+ than N-NO3-, the amount of N-NO2- was negligible, and organic nitrogen amounted to 30% of total nitrogen content, which varied up to 16 mg/l. The precipitation events with high concentrations of nitrogen species were investigated in terms of possible local and remote sources of nitrogen (synoptic meteorology), to indicate the areas which can act as potential sources of N-compounds. Based on the chemometric analysis, it was found that Nred implies Nox and vice versa, due to interactions between them in the atmosphere. Taking into account the analysis of precipitation occurring simultaneously in both locations (about 50% of all rainfall episodes), it was observed that a factor such as anthropogenic pressure differentiates but does not determine the chemical composition of precipitation in the investigated areas (urban vs. woodland area; distance of ∼30 km). Thermodynamics of the atmosphere had a significant impact on concentrations of N-NO3- and N-NH4+ in precipitation, as did the circulation of air masses and remote N sources responsible for transboundary inflow of pollutants.
Lv, Ying; Huang, Guohe; Sun, Wei
2013-01-01
A scenario-based interval two-phase fuzzy programming (SITF) method was developed for water resources planning in a wetland ecosystem. The SITF approach incorporates two-phase fuzzy programming, interval mathematical programming, and scenario analysis within a general framework. It can tackle fuzzy and interval uncertainties in terms of cost coefficients, resource availabilities, water demands, hydrological conditions and other parameters within a multi-source supply and multi-sector consumption context. The SITF method has the advantage of effectively improving the membership degrees of the system objective and all fuzzy constraints, so that both a higher satisfactory grade of the objective and more efficient utilization of system resources can be guaranteed. Under systematic consideration of water demands by the ecosystem, the SITF method was successfully applied to Baiyangdian Lake, which is the largest wetland in North China. Multi-source supplies (including the inter-basin water sources of Yuecheng Reservoir and the Yellow River) and multiple water users (including agricultural, industrial and domestic sectors) were taken into account. The results indicated that the SITF approach would generate useful solutions to identify long-term water allocation and transfer schemes under multiple economic, environmental, ecological, and system-security targets. It can provide a comparative analysis of the satisfactory degrees of decisions under various policy scenarios. Moreover, it is of significance to quantify the relationship between hydrological change and human activities, such that a scheme for ecologically sustainable water supply to Baiyangdian Lake can be achieved. Copyright © 2012 Elsevier B.V. All rights reserved.
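The core of two-phase fuzzy programming is a max-min first phase: maximize the smallest membership degree λ over the fuzzy objective and fuzzy constraints (the second phase then raises individual memberships with λ held fixed). A toy one-source, two-crop sketch with linear membership functions; all coefficients are invented, and the inner allocation is solved greedily here (crop 1 has the higher profit rate), where a real model would call an LP solver:

```python
def feasible(lam):
    """Can a nonnegative allocation reach membership degree lam on both the
    fuzzy profit goal (rising linearly from 20 to 34) and the fuzzy water
    limit (10 units, tolerable up to 12)?"""
    water = 12.0 - 2.0 * lam          # water available at satisfaction level lam
    best_profit = 3.0 * water         # greedy: all water to crop 1 (profit rate 3 > 2)
    return water >= 0 and best_profit >= 20.0 + 14.0 * lam

# Phase 1: bisection for the largest feasible lam (max-min solution)
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if feasible(mid):
        lo = mid
    else:
        hi = mid
lam_star = lo                         # analytically: 3*(12 - 2*lam) = 20 + 14*lam, lam = 0.8
```

The interval and scenario layers of SITF wrap this same max-min kernel, re-solving it for the lower and upper bounds of each interval parameter and for each policy scenario.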
A Wearable Inertial Measurement Unit for Long-Term Monitoring in the Dependency Care Area
Rodríguez-Martín, Daniel; Pérez-López, Carlos; Samà, Albert; Cabestany, Joan; Català, Andreu
2013-01-01
Human movement analysis is a field of wide interest since it enables the assessment of a large variety of variables related to quality of life. Human movement can be accurately evaluated through Inertial Measurement Units (IMU), which are wearable and comfortable devices with long battery life. The IMU's movement signals might be, on the one hand, stored in a digital support, in which an analysis is performed a posteriori. On the other hand, the signal analysis might take place in the same IMU at the same time as the signal acquisition through online classifiers. The new sensor system presented in this paper is designed for both collecting movement signals and analyzing them in real-time. This system is a flexible platform useful for collecting data via a triaxial accelerometer, a gyroscope and a magnetometer, with the possibility to incorporate other information sources in real-time. A μSD card can store all inertial data and a Bluetooth module is able to send information to other external devices and receive data from other sources. The system presented is being used in the real-time detection and analysis of Parkinson's disease symptoms, in gait analysis, and in a fall detection system. PMID:24145917
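One of the simplest online classifiers that can run on such an IMU is an impact detector thresholding the acceleration magnitude. This is a deliberately minimal sketch with an assumed threshold; the fall-detection algorithm actually deployed on the device is more sophisticated:

```python
import math

G = 9.81                      # gravity (m/s^2)
IMPACT_THRESHOLD = 2.5 * G    # assumed impact threshold, not the system's tuned value

def magnitude(ax, ay, az):
    """Euclidean norm of a triaxial accelerometer sample."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_impacts(samples):
    """Indices of samples whose acceleration magnitude exceeds the threshold."""
    return [i for i, (ax, ay, az) in enumerate(samples)
            if magnitude(ax, ay, az) > IMPACT_THRESHOLD]

# Synthetic trace: quiet standing (~1 g), one sharp spike as from an impact, quiet again
trace = [(0.1, 0.2, 9.8)] * 5 + [(20.0, 15.0, 25.0)] + [(0.1, 0.2, 9.8)] * 5
hits = detect_impacts(trace)
```

On the real platform this kind of check would run in the online classifier path, with the raw samples simultaneously logged to the μSD card for a-posteriori analysis.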
Fish-Eye Observing with Phased Array Radio Telescopes
NASA Astrophysics Data System (ADS)
Wijnholds, S. J.
The radio astronomical community is currently developing and building several new radio telescopes based on phased array technology. These telescopes provide a large field-of-view that may in principle span a full hemisphere. This makes calibration and imaging very challenging tasks due to the complex source structures and direction-dependent radio wave propagation effects. In this thesis, calibration and imaging methods are developed based on least squares estimation of instrument and source parameters. Monte Carlo simulations and actual observations with several prototypes show that this model-based approach provides statistically and computationally efficient solutions. The error analysis provides a rigorous mathematical framework to assess the imaging performance of current and future radio telescopes in terms of the effective noise, which is the combined effect of propagated calibration errors, noise in the data and source confusion.
NASA Technical Reports Server (NTRS)
Chin, Mian
2012-01-01
We present a global model analysis of the impact of long-range transport and anthropogenic emissions on the aerosol trends in the major pollution regions in the northern hemisphere and in the Arctic in the past three decades. We will use the Goddard Chemistry Aerosol Radiation and Transport (GOCART) model to analyze the multi-spatial and temporal scale data, including observations from Terra, Aqua, and CALIPSO satellites and from the long-term surface monitoring stations. We will analyze the source attribution (SA) and source-receptor (SR) relationships in North America, Europe, East Asia, South Asia, and the Arctic at the surface and free troposphere and establish the quantitative linkages between emissions from different source regions. We will discuss the implications for regional air quality and climate change.
Tunno, Brett J; Dalton, Rebecca; Michanowicz, Drew R; Shmool, Jessie L C; Kinnee, Ellen; Tripathy, Sheila; Cambal, Leah; Clougherty, Jane E
2016-01-01
Health effects of fine particulate matter (PM2.5) vary by chemical composition, and composition can help to identify key PM2.5 sources across urban areas. Further, this intra-urban spatial variation in concentrations and composition may vary with meteorological conditions (e.g., mixing height). Accordingly, we hypothesized that spatial sampling during atmospheric inversions would help to better identify localized source effects, and reveal more distinct spatial patterns in key constituents. We designed a 2-year monitoring campaign to capture fine-scale intra-urban variability in PM2.5 composition across Pittsburgh, PA, and compared both spatial patterns and source effects during “frequent inversion” hours vs 24-h weeklong averages. Using spatially distributed programmable monitors, and a geographic information systems (GIS)-based design, we collected PM2.5 samples across 37 sampling locations per year to capture variation in local pollution sources (e.g., proximity to industry, traffic density) and terrain (e.g., elevation). We used inductively coupled plasma mass spectrometry (ICP-MS) to determine elemental composition, and unconstrained factor analysis to identify source suites by sampling scheme and season. We examined spatial patterning in source factors using land use regression (LUR), wherein GIS-based source indicators served to corroborate factor interpretations. Under both summer sampling regimes, and for winter inversion-focused sampling, we identified six source factors, characterized by tracers associated with brake and tire wear, steel-making, soil and road dust, coal, diesel exhaust, and vehicular emissions. For winter 24-h samples, four factors suggested traffic/fuel oil, traffic emissions, coal/industry, and steel-making sources. In LURs, as hypothesized, GIS-based source terms better explained spatial variability in inversion-focused samples, including a greater contribution from roadway, steel, and coal-related sources. 
Factor analysis produced source-related constituent suites under both sampling designs, though factors were more distinct under inversion-focused sampling. PMID:26507005
The sustainability of healthcare innovations: a concept analysis.
Fleiszer, Andrea R; Semenic, Sonia E; Ritchie, Judith A; Richer, Marie-Claire; Denis, Jean-Louis
2015-07-01
To report on an analysis of the concept of the sustainability of healthcare innovations. While there have been significant empirical, theoretical and practical contributions made towards the development and implementation of healthcare innovations, there has been less attention paid to their sustainability. Yet many desired healthcare innovations are not sustained over the long term. There is a need to increase clarity around the concept of innovation sustainability to guide the advancement of knowledge on this topic. Concept analysis. We included literature reviews, theoretical and empirical articles, books and grey literature obtained through database searching (ABI/INFORM, Academic Search Complete, Business Source Complete, CINAHL, Embase, MEDLINE and Web of Science) from 1996-May 2014, reference harvesting and citation searching. We examined sources according to terms and definitions, characteristics, preconditions, outcomes and boundaries to evaluate the maturity of the concept. This concept is partially mature. Healthcare innovation sustainability remains a multi-dimensional, multi-factorial notion that is used inconsistently or ambiguously and takes on different meanings at different times in different contexts. We propose a broad conceptualization that consists of three characteristics: benefits, routinization or institutionalization, and development. We also suggest that sustained innovations are influenced by a variety of preconditions or factors, which are innovation-, context-, leadership- and process-related. Further conceptual development is essential to continue advancing our understanding of the sustainability of healthcare innovations, especially in nursing where this topic remains largely unexplored. © 2015 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Kis, A.; Lemperger, I.; Wesztergom, V.; Menvielle, M.; Szalai, S.; Novák, A.; Hada, T.; Matsukiyo, S.; Lethy, A. M.
2016-12-01
The magnetotelluric method is widely applied for the investigation of subsurface structures by imaging the spatial distribution of electric conductivity. The method is based on the experimental determination of the surface electromagnetic impedance tensor (Z) from surface geomagnetic and telluric recordings in two perpendicular orientations. In practical explorations, the accurate estimation of Z necessitates the application of robust statistical methods for two reasons: (1) the geomagnetic and telluric time series are contaminated by man-made noise components, and (2) the non-homogeneous behavior of ionospheric current systems in the period range of interest (ELF-ULF and longer periods) results in systematic deviation of the impedance of individual time windows. Robust statistics mitigate both effects on Z for the purpose of subsurface investigations. However, accurate analysis of the long-term temporal variation of the first and second statistical moments of Z may provide valuable information about the characteristics of the ionospheric source current systems. Temporal variation of the extent, spatial variability and orientation of the ionospheric source currents has specific effects on the surface impedance tensor. Twenty-year-long geomagnetic and telluric recordings of the Nagycenk Geophysical Observatory provide a unique opportunity to reconstruct the so-called magnetotelluric source effect and obtain information about the spatial and temporal behavior of ionospheric source currents at mid-latitudes. A detailed investigation of time series of the surface electromagnetic impedance tensor has been carried out in different frequency classes of the ULF range. The presentation aims to provide a brief review of our results related to long-term periodic modulations, up to solar-cycle scale, and to eventual deviations of the electromagnetic impedance and thus the reconstructed equivalent ionospheric source effects.
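At each frequency the impedance tensor relates the horizontal electric and magnetic field spectra via E = Z H, and the ordinary least-squares estimate over many time windows is Z = (E Hᴴ)(H Hᴴ)⁻¹; robust variants then iteratively downweight outlying windows. A synthetic sketch with an invented 2×2 impedance:

```python
import numpy as np

rng = np.random.default_rng(3)
n_windows = 200

# Illustrative true 2x2 complex impedance tensor at one frequency
Z_true = np.array([[0.1 + 0.2j, 2.0 + 1.0j],
                   [-1.8 - 0.9j, -0.1 + 0.05j]])

# Field spectra: rows = components (x, y), columns = time windows
H = rng.normal(size=(2, n_windows)) + 1j * rng.normal(size=(2, n_windows))
noise = 0.01 * (rng.normal(size=(2, n_windows)) + 1j * rng.normal(size=(2, n_windows)))
E = Z_true @ H + noise

# Least-squares impedance estimate: Z = (E H^H)(H H^H)^-1
HH = H @ H.conj().T
Z_est = (E @ H.conj().T) @ np.linalg.inv(HH)
```

Tracking the first and second moments of such window-by-window Z estimates over years of recordings is precisely how the long-term source-effect modulations described above can be quantified.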
Localization of sound sources in a room with one microphone
NASA Astrophysics Data System (ADS)
Peić Tukuljac, Helena; Lissek, Hervé; Vandergheynst, Pierre
2017-08-01
Estimation of the location of sound sources is usually done using microphone arrays. Such settings provide an environment in which we know the difference between the signals received at different microphones in terms of phase or attenuation, which enables localization of the sound sources. In our solution we exploit the properties of the room transfer function in order to localize a sound source inside a room with only one microphone. The shape of the room and the position of the microphone are assumed to be known. The design guidelines and limitations of the sensing matrix are given. The implementation is based on sparsity in terms of the voxels of the room that are occupied by a source. What is especially interesting about our solution is that it localizes sound sources not only in the horizontal plane but in terms of full 3D coordinates inside the room.
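The sparse-recovery step behind such a scheme can be illustrated with orthogonal matching pursuit over a sensing matrix whose columns are candidate-voxel responses; the matrix A and its construction below are placeholders for the room-transfer-function dictionary the authors describe, not their actual algorithm:

```python
import numpy as np

def localize_sparse(y, A, k=1):
    """Orthogonal matching pursuit: find a k-sparse x with y ~ A x, where
    column j of A holds the (assumed known) response for a source in voxel j.
    Returns the list of selected voxel indices."""
    r = y.astype(float).copy()
    support = []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))       # best-matching voxel column
        support.append(j)
        As = A[:, support]
        xs, *_ = np.linalg.lstsq(As, y, rcond=None)
        r = y - As @ xs                            # residual after projection
    return support
```

With a single active source, the recovered support is the occupied voxel, giving a 3D position once voxel indices are mapped back to room coordinates.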
A theoretical prediction of the acoustic pressure generated by turbulence-flame front interactions
NASA Technical Reports Server (NTRS)
Huff, R. G.
1984-01-01
The equations of momentum and continuity are combined and linearized yielding the one dimensional nonhomogeneous acoustic wave equation. Three terms in the non-homogeneous equation act as acoustic sources and are taken to be forcing functions acting on the homogeneous wave equation. The three source terms are: fluctuating entropy, turbulence gradients, and turbulence-flame interactions. Each source term is discussed. The turbulence-flame interaction source is used as the basis for computing the source acoustic pressure from the Fourier transformed wave equation. Pressure fluctuations created in turbopump gas generators and turbines may act as a forcing function for turbine and propellant tube vibrations in Earth to orbit space propulsion systems and could reduce their life expectancy. A preliminary assessment of the acoustic pressure fluctuations in such systems is presented.
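In the generic form implied by the abstract (the symbols are illustrative, since the report's exact notation is not reproduced here), the linearized result reads

```latex
\frac{\partial^2 p'}{\partial t^2} - c^2\,\frac{\partial^2 p'}{\partial x^2}
  = S_{\mathrm{entropy}} + S_{\mathrm{turbulence}} + S_{\mathrm{flame}},
```

where each right-hand-side term acts as a forcing function on the homogeneous wave equation, and the turbulence-flame interaction term is the one retained when the Fourier-transformed equation is solved for the source acoustic pressure.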
The Measles Vaccination Narrative in Twitter: A Quantitative Analysis.
Radzikowski, Jacek; Stefanidis, Anthony; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L
2016-01-01
The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. This paper presents a study of the Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, both in terms of its cyber and physical characteristics. We aimed to contribute to the analysis of the data and to present a quantitative interdisciplinary approach to analyzing such open-source data in the context of health narratives. We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. Analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues.
Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more effective strategies that take into account the complex and evolving public views of health issues.
Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun
2016-09-14
Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS, including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. Currently, OBCS comprises 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs .
The Ontology of Biological and Clinical Statistics (OBCS) is a community-based open source ontology in the domain of biological and clinical statistics. OBCS is a timely ontology that represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.
DCCA analysis of renewable and conventional energy prices
NASA Astrophysics Data System (ADS)
Paiva, Aureliano Sancho Souza; Rivera-Castro, Miguel Angel; Andrade, Roberto Fernandes Silva
2018-01-01
Here we investigate the inter-influence of oil prices and renewable energy sources. The non-stationary time series are scrutinized within the Detrended Cross-Correlation Analysis (DCCA) framework, where the resulting DCCA coefficient provides a useful and reliable index to evaluate the cross-correlation between events at the same time instant as well as at suitably chosen time lags. The analysis is based on the quotient of two successive daily closing oil prices and composite indices of renewable energy sources in the USA and Europe in the period 2006-2015, which was subject to several social and economic driving forces, such as the increase of social pressure in favor of the use of non-fossil energy sources and the worldwide economic crisis that started in 2008. The DCCA coefficient is evaluated for different window sizes, extracting information on short- and long-term correlation between the indices. In particular, strong correlation between the behavior of the two distinct economic sectors is observed for large time intervals during the worst period of the economic crisis (2008-2012), hinting at a very cautious behavior of the economic agents. Before and after this period, the behaviors of the two economic sectors are overwhelmingly uncorrelated or very weakly correlated. The results reported here may be useful to select proper strategies in future similar scenarios.
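The DCCA coefficient named above can be sketched as follows; this is a minimal implementation of the standard definition rho = F2_xy / (F_xx F_yy), with sliding boxes and linear detrending of the integrated profiles, and is not the authors' production code:

```python
import numpy as np

def dcca_coefficient(x, y, n):
    """Detrended cross-correlation coefficient for window size n:
    integrate both series, remove the local linear trend in each sliding
    box, and form the ratio of detrended covariance to the detrended
    standard deviations.  Returns a value in [-1, 1]."""
    X = np.cumsum(x - np.mean(x))       # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    t = np.arange(n)
    cov_xy = var_x = var_y = 0.0
    for i in range(len(x) - n + 1):
        xb, yb = X[i:i + n], Y[i:i + n]
        px = np.polyval(np.polyfit(t, xb, 1), t)   # local linear trends
        py = np.polyval(np.polyfit(t, yb, 1), t)
        dx, dy = xb - px, yb - py
        cov_xy += np.mean(dx * dy)
        var_x += np.mean(dx * dx)
        var_y += np.mean(dy * dy)
    return cov_xy / np.sqrt(var_x * var_y)
```

Evaluating this over a range of window sizes n separates short-term from long-term cross-correlation, as done in the abstract for the oil and renewable-energy indices.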
NASA Astrophysics Data System (ADS)
Hindshaw, Ruth S.; Tosca, Nicholas J.; Piotrowski, Alexander M.; Tipper, Edward T.
2018-03-01
The identification of sediment sources to the ocean is a prerequisite to using marine sediment cores to extract information on past climate and ocean circulation. Sr and Nd isotopes are classical tools with which to trace source provenance. Despite considerable interest in the Arctic Ocean, the circum-Arctic source regions are poorly characterised in terms of their Sr and Nd isotopic compositions. In this study we present Sr and Nd isotope data from the Paleogene Central Basin sediments of Svalbard, including the first published data of stream suspended sediments from Svalbard. The stream suspended sediments exhibit considerable isotopic variation (ɛNd = -20.6 to -13.4; 87Sr / 86Sr = 0.73421 to 0.74704) which can be related to the depositional history of the sedimentary formations from which they are derived. In combination with analysis of the clay mineralogy of catchment rocks and sediments, we suggest that the Central Basin sedimentary rocks were derived from two sources. One source is Proterozoic sediments derived from Greenlandic basement rocks which are rich in illite and have high 87Sr / 86Sr and low ɛNd values. The second source is Carboniferous to Jurassic sediments derived from Siberian basalts which are rich in smectite and have low 87Sr / 86Sr and high ɛNd values. Due to a change in depositional conditions throughout the Paleogene (from deep sea to continental) the relative proportions of these two sources vary in the Central Basin formations. The modern stream suspended sediment isotopic composition is then controlled by modern processes, in particular glaciation, which determines the present-day exposure of the formations and therefore the relative contribution of each formation to the stream suspended sediment load. 
This study demonstrates that the Nd isotopic composition of stream suspended sediments exhibits seasonal variation, which likely mirrors longer-term hydrological changes, with implications for source provenance studies based on fixed end-members through time.
A Chandra X-Ray Study of NGC 1068. II. The Luminous X-Ray Source Population
NASA Technical Reports Server (NTRS)
Smith, David A.; Wilson, Andrew S.
2003-01-01
We present an analysis of the compact X-ray source population in the Seyfert 2 galaxy NGC 1068, imaged with an approx. 50 ks Chandra observation. We find a total of 84 compact sources on the S3 chip, of which 66 are located within the 25.0 B-mag arcsec(exp -2) isophote of the galactic disk of NGC 1068. Spectra have been obtained for the 21 sources with at least 50 counts and modeled with both multicolor disk blackbody and power-law models. The power-law model provides the better description of the spectrum for 18 of these sources. For fainter sources, the spectral index has been estimated from the hardness ratio. Five sources have 0.4 - 8 keV intrinsic luminosities greater than 10(exp 39) ergs/s, assuming that their emission is isotropic and that they are associated with NGC 1068. We refer to these sources as intermediate-luminosity X-ray objects (IXOs). If these five sources are X-ray binaries accreting with luminosities that are both sub-Eddington and isotropic, then the implied source masses are approximately greater than 7 solar masses, and so they are inferred to be black holes. Most of the spectrally modeled sources have spectral shapes similar to Galactic black hole candidates. However, the brightest compact source in NGC 1068 has a spectrum that is much harder than that found in Galactic black hole candidates and other IXOs. The brightest source also shows large-amplitude variability on both short-term and long-term timescales, with the count rate possibly decreasing by a factor of 2 in approx. 2 ks during our Chandra observation, and the source flux decreasing by a factor of 5 between our observation and the grating observations taken just over 9 months later. The ratio of the number of sources with luminosities greater than 2.1 x 10(exp 38) ergs/s in the 0.4 - 8 keV band to the rate of massive (greater than 5 solar mass) star formation is the same, to within a factor of 2, for NGC 1068, the Antennae, NGC 5194 (the main galaxy in M51), and the Circinus galaxy.
This suggests that the rate of production of X-ray binaries per massive star is approximately the same for galaxies with currently active star formation, including "starbursts."
NASA Astrophysics Data System (ADS)
Xu, L.; Suresh, S.; Guo, H.; Weber, R. J.; Ng, N. L.
2015-04-01
We deployed a High-Resolution Time-of-Flight Aerosol Mass Spectrometer (HR-ToF-AMS) and an Aerosol Chemical Speciation Monitor (ACSM) to characterize the chemical composition of submicron non-refractory particles (NR-PM1) in the southeastern US. Measurements were performed in both rural and urban sites in the greater Atlanta area, GA and Centreville, AL for approximately one year, as part of the Southeastern Center for Air Pollution and Epidemiology study (SCAPE) and Southern Oxidant and Aerosol Study (SOAS). Organic aerosol (OA) accounts for more than half of NR-PM1 mass concentration regardless of sampling sites and seasons. Positive matrix factorization (PMF) analysis of HR-ToF-AMS measurements identified various OA sources, depending on location and season. Hydrocarbon-like OA (HOA) and cooking OA (COA) have important but not dominant contributions to total OA in urban sites. Biomass burning OA (BBOA) concentration shows a distinct seasonal variation with a larger enhancement in winter than summer. We find a good correlation between BBOA and brown carbon, indicating biomass burning is an important source for brown carbon, although an additional, unidentified brown carbon source is likely present at the rural Yorkville site. Isoprene-derived OA (Isoprene-OA) is only deconvolved in warmer months and contributes 18-36% of total OA. The presence of Isoprene-OA factor in urban sites is more likely from local production in the presence of NOx than transport from rural sites. More-oxidized and less-oxidized oxygenated organic aerosol (MO-OOA and LO-OOA, respectively) are dominant fractions (47-79%) of OA in all sites. MO-OOA correlates well with ozone in summer, but not in winter, indicating MO-OOA sources may vary with seasons. LO-OOA, which reaches a daily maximum at night, correlates better with estimated nitrate functionality from organic nitrates than total nitrates.
Based on the HR-ToF-AMS measurements, we estimate that the nitrate functionality from organic nitrates contributes 63-100% of total measured nitrates in summer. Further, the contribution of organic nitrates to total OA is estimated to be 5-12% in summer, suggesting that organic nitrates are important components in the ambient aerosol in the southeastern US. The spatial distribution of OA is investigated by comparing simultaneous HR-ToF-AMS measurements with ACSM measurements at two different sampling sites. OA is found to be spatially homogeneous in summer, possibly due to stagnant air mass and a dominant amount of regional SOA in the southeastern US. The homogeneity is less in winter, which is likely due to spatial variation of primary emissions. We observed that the seasonality of OA concentration shows a clear urban/rural contrast. While OA exhibits weak seasonal variation in the urban sites, its concentration is higher in summer than winter for rural sites. This observation from our year-long measurements is consistent with 14 years of organic carbon (OC) data from the SouthEastern Aerosol Research and Characterization (SEARCH) network. The comparison between short-term measurements with advanced instruments and long-term measurements of basic air quality indicators not only tests the robustness of the short-term measurements but also provides insights into interpreting long-term measurements. We find that OA factors resolved from PMF analysis on HR-ToF-AMS measurements have distinctly different diurnal variations. The compensation of OA factors with different diurnal trends is one possible reason for the repeatedly observed, relatively flat OA diurnal profile in the southeastern US. In addition, analysis of long-term measurements shows that the correlation between OC and sulfate is substantially higher in summer than winter.
This seasonality could be partly due to the effects of sulfate on isoprene SOA formation as revealed by the short-term, intensive measurements.
NASA Astrophysics Data System (ADS)
Xu, L.; Suresh, S.; Guo, H.; Weber, R. J.; Ng, N. L.
2015-07-01
We deployed a High-Resolution Time-of-Flight Aerosol Mass Spectrometer (HR-ToF-AMS) and an Aerosol Chemical Speciation Monitor (ACSM) to characterize the chemical composition of submicron non-refractory particulate matter (NR-PM1) in the southeastern USA. Measurements were performed in both rural and urban sites in the greater Atlanta area, Georgia (GA), and Centreville, Alabama (AL), for approximately 1 year as part of Southeastern Center for Air Pollution and Epidemiology study (SCAPE) and Southern Oxidant and Aerosol Study (SOAS). Organic aerosol (OA) accounts for more than half of NR-PM1 mass concentration regardless of sampling sites and seasons. Positive matrix factorization (PMF) analysis of HR-ToF-AMS measurements identified various OA sources, depending on location and season. Hydrocarbon-like OA (HOA) and cooking OA (COA) have important, but not dominant, contributions to total OA in urban sites (i.e., 21-38 % of total OA depending on site and season). Biomass burning OA (BBOA) concentration shows a distinct seasonal variation with a larger enhancement in winter than summer. We find a good correlation between BBOA and brown carbon, indicating biomass burning is an important source for brown carbon, although an additional, unidentified brown carbon source is likely present at the rural Yorkville site. Isoprene-derived OA factor (isoprene-OA) is only deconvolved in warmer months and contributes 18-36 % of total OA. The presence of isoprene-OA factor in urban sites is more likely from local production in the presence of NOx than transport from rural sites. More-oxidized and less-oxidized oxygenated organic aerosol (MO-OOA and LO-OOA, respectively) are dominant fractions (47-79 %) of OA in all sites. MO-OOA correlates well with ozone in summer but not in winter, indicating MO-OOA sources may vary with seasons. LO-OOA, which reaches a daily maximum at night, correlates better with estimated nitrate functionality from organic nitrates than total nitrates. 
Based on the HR-ToF-AMS measurements, we estimate that the nitrate functionality from organic nitrates contributes 63-100 % to the total measured nitrates in summer. Furthermore, the contribution of organic nitrates to total OA is estimated to be 5-12 % in summer, suggesting that organic nitrates are important components in the ambient aerosol in the southeastern USA. The spatial distribution of OA is investigated by comparing simultaneous HR-ToF-AMS measurements with ACSM measurements at two different sampling sites. OA is found to be spatially homogeneous in summer due possibly to stagnant air mass and a dominant amount of regional secondary organic aerosol (SOA) in the southeastern USA. The homogeneity is less in winter, which is likely due to spatial variation of primary emissions. We observe that the seasonality of OA concentration shows a clear urban/rural contrast. While OA exhibits weak seasonal variation in the urban sites, its concentration is higher in summer than winter for rural sites. This observation from our year-long measurements is consistent with 14 years of organic carbon (OC) data from the SouthEastern Aerosol Research and Characterization (SEARCH) network. The comparison between short-term measurements with advanced instruments and long-term measurements of basic air quality indicators not only tests the robustness of the short-term measurements but also provides insights into interpreting long-term measurements. We find that OA factors resolved from PMF analysis on HR-ToF-AMS measurements have distinctly different diurnal variations. The compensation of OA factors with different diurnal trends is one possible reason for the repeatedly observed, relatively flat OA diurnal profile in the southeastern USA. In addition, analysis of long-term measurements shows that the correlation between OC and sulfate is substantially stronger in summer than winter.
This seasonality could be partly due to the effects of sulfate on isoprene SOA formation as revealed by the short-term intensive measurements.
NASA Technical Reports Server (NTRS)
Gates, Thomas S.; Johnson, Theodore F.; Whitley, Karen S.
2005-01-01
The objective of this report is to contribute to the independent assessment of the Space Shuttle External Tank Foam Material. This report specifically addresses material modeling, characterization testing, data reduction methods, and data pedigree. A brief description of the External Tank foam materials, locations, and standard failure modes is provided to develop suitable background information. A review of mechanics based analysis methods from the open literature is used to provide an assessment of the state-of-the-art in material modeling of closed cell foams. Further, this report assesses the existing material property database and investigates sources of material property variability. The report presents identified deficiencies in testing methods and procedures, recommendations for additional testing as required, identification of near-term improvements that should be pursued, and long-term capabilities or enhancements that should be developed.
Long-term monitoring of the Sedlec Ossuary - Analysis of hygrothermal conditions
NASA Astrophysics Data System (ADS)
Pavlík, Zbyšek; Balík, Lukáš; Maděra, Jiří; Černý, Robert
2016-07-01
The Sedlec Ossuary is one of the twelve UNESCO World Heritage Sites in the Czech Republic. Although the ossuary is listed among the most visited Czech tourist attractions, its technical state is almost critical and a radical renovation is necessary. On this account, the hygrothermal performance of the ossuary is experimentally investigated in this paper in order to obtain information on moisture sources and the data necessary for an optimized design of renovation treatments and reconstruction solutions that will allow the historical significance of this attractive heritage site to be preserved. Within the performed experimental analysis, the interior and exterior climatic conditions are monitored over an almost three-year period, together with relative humidity and temperature profiles measured in the most damaged parts of the ossuary chapel. On the basis of the measured data, the long-term hygrothermal state of the ossuary building is assessed and the periods of possible surface condensation are identified.
Analysis of airframe/engine interactions - An integrated control perspective
NASA Technical Reports Server (NTRS)
Schmidt, David K.; Schierman, John D.; Garg, Sanjay
1990-01-01
Techniques for the analysis of the dynamic interactions between airframe/engine dynamical systems are presented. Critical coupling terms are developed that determine the significance of these interactions with regard to the closed-loop stability and performance of the feedback systems. A conceptual model is first used to indicate the potential sources of the coupling, how the coupling manifests itself, and how the magnitudes of these critical coupling terms are used to quantify the effects of the airframe/engine interactions. A case study is also presented involving an unstable airframe with thrust vectoring for attitude control. It is shown for this system with classical, decentralized control laws that there is little airframe/engine interaction, and that the stability and performance with those control laws are not affected. Implications of parameter uncertainty in the coupling dynamics are also discussed, and the effects of these parameter variations are demonstrated to be small for this vehicle configuration.
NASA Technical Reports Server (NTRS)
Diamante, J. M.; Englar, T. S., Jr.; Jazwinski, A. H.
1977-01-01
Estimation theory, which originated in guidance and control research, is applied to the analysis of air quality measurements and atmospheric dispersion models to provide reliable area-wide air quality estimates. A method for low dimensional modeling (in terms of the estimation state vector) of the instantaneous and time-average pollutant distributions is discussed. In particular, the fluctuating plume model of Gifford (1959) is extended to provide an expression for the instantaneous concentration due to an elevated point source. Individual models are also developed for all parameters in the instantaneous and the time-average plume equations, including the stochastic properties of the instantaneous fluctuating plume.
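As a hedged illustration of the fluctuating-plume idea (a narrow Gaussian about a meandering centerline, with a ground-reflection image term), the instantaneous concentration from an elevated point source might be written as below; all symbol names and the exact reflection treatment are assumptions for this sketch, not Gifford's or the authors' exact formulation:

```python
import numpy as np

def instantaneous_plume(y, z, yc, zc, Q, u, sy, sz, H):
    """Instantaneous concentration at crosswind distance y and height z for
    an elevated point source: a narrow Gaussian of spreads (sy, sz) about a
    meandering centerline offset (yc, zc), with a ground-reflection image
    term.  Q: source strength, u: mean wind speed, H: release height."""
    g_y = np.exp(-(y - yc) ** 2 / (2.0 * sy ** 2))
    g_z = (np.exp(-(z - (H + zc)) ** 2 / (2.0 * sz ** 2))
           + np.exp(-(z + (H + zc)) ** 2 / (2.0 * sz ** 2)))   # image source
    return Q / (2.0 * np.pi * u * sy * sz) * g_y * g_z
```

Treating the centerline offsets (yc, zc) as random variables, as the fluctuating-plume model does, is what distinguishes the instantaneous field from the familiar time-average plume equation.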
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woodroffe, J. R.; Brito, T. V.; Jordanova, V. K.
In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event-triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production, and the induced fission source responsible for multiplication. Our study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and the statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.
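Of the three methods compared, the bootstrap is the simplest to sketch: resample measurement cycles with replacement and recompute the estimate each time. The `estimator` callable below is a placeholder, since the abstract does not spell out the moments-to-mass mapping:

```python
import numpy as np

def bootstrap_mass_uncertainty(cycles, estimator, n_boot=1000, seed=0):
    """Bootstrap the statistical uncertainty of a quantity estimated from
    multiplicity-counting cycle data: resample the cycles with replacement
    and recompute the estimate each time.  `estimator` maps an array of
    cycles to a scalar (here a stand-in for the moments-to-mass chain)."""
    rng = np.random.default_rng(seed)
    cycles = np.asarray(cycles)
    n = len(cycles)
    boot = np.array([estimator(cycles[rng.integers(0, n, n)])
                     for _ in range(n_boot)])
    return boot.mean(), boot.std(ddof=1)   # central value and 1-sigma spread
```

The spread of the bootstrap replicates serves as the statistical uncertainty on the mass, which can then be compared against the variance-propagation and cycle-data-analysis results.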
NASA Astrophysics Data System (ADS)
Fawole, Olusegun G.; Cai, Xiaoming; Levine, James G.; Pinker, Rachel T.; MacKenzie, A. R.
2016-12-01
The West African region, with its peculiar climate and atmospheric dynamics, is a prominent source of aerosols. Reliable, long-term in situ measurements of aerosol properties are not readily available across the region. In this study, Version 2 Level 1.5 Aerosol Robotic Network (AERONET) data were used to study the absorption and size-distribution properties of aerosols from dominant sources identified by trajectory analysis. The trajectory analysis was used to define four sources of aerosols over a 10-year period. Sorting the AERONET aerosol retrievals by these putative sources, we tested the hypothesis that there exists an optically distinct gas-flaring signal. The dominance of each source cluster varies with season: desert-dust (DD) and biomass-burning (BB) aerosols are dominant in the months prior to the West African Monsoon (WAM); urban (UB) and gas-flaring (GF) aerosols are dominant during the WAM months. BB aerosol, with a single scattering albedo (SSA) at 675 nm of 0.86 ± 0.03, and GF aerosol, with an SSA (675 nm) of 0.90 ± 0.07, are the most absorbing of the aerosol categories. The ranges of the Absorption Ångström Exponent (AAE) for the DD, BB, UB, and GF classes are 1.99 ± 0.35, 1.45 ± 0.26, 1.21 ± 0.38, and 0.98 ± 0.25, respectively, indicating a different aerosol composition for each source. The AAE (440-870 nm) and Ångström Exponent (AE) (440-870 nm) relationships further show the spread and overlap of these optical and microphysical properties, presumably due in part to similarity in the sources of aerosols and in part to mixing of air parcels from different sources en route to the measurement site.
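The two-wavelength AAE quoted for each class follows from the absorption aerosol optical depth (AAOD) under the usual power-law dependence AAOD ∝ λ^(−AAE); a minimal sketch of that relationship:

```python
import numpy as np

def absorption_angstrom_exponent(aaod1, lam1, aaod2, lam2):
    """Absorption Angstrom Exponent between two wavelengths, assuming the
    standard power law AAOD ~ lambda**(-AAE).  Wavelengths in consistent
    units (e.g. nm); here typically 440 and 870 nm as in the abstract."""
    return -np.log(aaod1 / aaod2) / np.log(lam1 / lam2)
```

Values near 1 are characteristic of black-carbon-dominated absorption, while values well above 1 (as for the DD class) indicate additional spectrally selective absorbers such as dust or brown carbon.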
NASA Astrophysics Data System (ADS)
Galluzzi, V.; Massardi, M.; Bonaldi, A.; Casasola, V.; Gregorini, L.; Trombetti, T.; Burigana, C.; Bonato, M.; De Zotti, G.; Ricci, R.; Stevens, J.; Ekers, R. D.; Bonavera, L.; di Serego Alighieri, S.; Liuzzo, E.; López-Caniego, M.; Paladino, R.; Toffolatti, L.; Tucci, M.; Callingham, J. R.
2018-03-01
We present high sensitivity (σP ≃ 0.6 mJy) polarimetric observations in seven bands, from 2.1 to 38 GHz, of a complete sample of 104 compact extragalactic radio sources brighter than 200 mJy at 20 GHz. Polarization measurements in six bands, in the range 5.5-38 GHz, for 53 of these objects were reported by Galluzzi et al. We have added new measurements in the same six bands for another 51 sources and measurements at 2.1 GHz for the full sample of 104 sources. Also, the previous measurements at 18, 24, 33, and 38 GHz were re-calibrated using the updated model for the flux density absolute calibrator, PKS1934-638, not available for the earlier analysis. The observations, carried out with the Australia Telescope Compact Array, achieved a 90 per cent detection rate (at 5σ) in polarization. 89 of our sources have a counterpart in the 72-231 MHz GLEAM (GaLactic and Extragalactic All-sky Murchison Widefield Array) survey, providing an unparalleled spectral coverage of 2.7 decades of frequency for these sources. While the total intensity data from 5.5 to 38 GHz could be interpreted in terms of single component emission, a joint analysis of more extended total intensity spectra presented here, and of the polarization spectra, reveals that over 90 per cent of our sources show clear indications of at least two emission components. We interpret this as an evidence of recurrent activity. Our high sensitivity polarimetry has allowed a 5σ detection of the weak circular polarization for ˜ 38 per cent of the data set, and a deeper estimate of 20 GHz polarization source counts than has been possible so far.
General analytical solutions for DC/AC circuit-network analysis
NASA Astrophysics Data System (ADS)
Rubido, Nicolás; Grebogi, Celso; Baptista, Murilo S.
2017-06-01
In this work, we present novel general analytical solutions for the currents that develop in the edges of network-like circuits when some nodes of the network act as sources/sinks of DC or AC current. We assume that Ohm's law is valid at every edge and that charge at every node is conserved (with the exception of the source/sink nodes). The resistive, capacitive, and/or inductive properties of the lines in the circuit define a complex network structure with given impedances for each edge. Our solution for the currents at each edge is derived in terms of the eigenvalues and eigenvectors of the Laplacian matrix of the network defined from the impedances. This derivation also allows us to compute the equivalent impedance between any two nodes of the circuit and relate it to the currents in a closed circuit that has a single voltage generator instead of many input/output source/sink nodes, simplifying the treatment that could be done via Thévenin's theorem. In contrast to solving Kirchhoff's equations directly, our derivation makes it easy to calculate the redistribution of currents that occurs when the location of sources and sinks changes within the network. Finally, we show that our solutions are identical to the ones found from Circuit Theory nodal analysis.
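A minimal numerical sketch of the Laplacian-based solution described above, using the Moore-Penrose pseudoinverse in place of an explicit eigendecomposition (equivalent for a connected network, since the pseudoinverse is built from the same eigenvalues and eigenvectors); the interface and names are illustrative:

```python
import numpy as np

def edge_currents(n_nodes, edges, impedances, injections):
    """Edge currents in an impedance network with nodal current injections
    (which must sum to zero), via the pseudoinverse of the weighted
    Laplacian built from edge admittances.  edges: list of (a, b) pairs."""
    L = np.zeros((n_nodes, n_nodes), dtype=complex)
    for (a, b), z in zip(edges, impedances):
        y = 1.0 / z                     # edge admittance
        L[a, a] += y; L[b, b] += y
        L[a, b] -= y; L[b, a] -= y
    V = np.linalg.pinv(L) @ np.asarray(injections)   # node potentials
    return [(V[a] - V[b]) / z for (a, b), z in zip(edges, impedances)]
```

Because only the injection vector changes when sources and sinks are relocated, the same pseudoinverse can be reused to recompute all edge currents, which is the convenience the abstract highlights over re-solving Kirchhoff's equations.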
NASA Astrophysics Data System (ADS)
Hadi, Nik Azran Ab; Rashid, Wan Norhisyam Abd; Hashim, Nik Mohd Zarifie; Mohamad, Najmiah Radiah; Kadmin, Ahmad Fauzan
2017-10-01
Electricity is the most powerful energy source in the world. Engineers and technologists have cooperated to invent new low-cost, carbon-free technologies, as carbon emissions are now a major concern due to global warming. Renewable energy sources such as hydro, wind and wave are becoming widespread as a means of reducing carbon emissions; compared to coal-based power, however, this effort requires several novel methods, techniques and technologies. The power quality of renewable sources needs in-depth research and continued study to improve renewable energy technologies. The aim of this project is to investigate the impact of a renewable electric generator on its local distribution system. The power farm was designed to connect to the local distribution system, and it was investigated and analyzed to ensure that the energy supplied to customers is clean. MATLAB tools were used to simulate the overall analysis. At the end of the project, a summary identifying various sources of voltage fluctuation is presented in terms of voltage flicker. A suggestion for analyzing the impact of wave power generation on the local distribution system is also presented for the development of wave generator farms.
NASA Astrophysics Data System (ADS)
Čufar, Aljaž; Batistoni, Paola; Conroy, Sean; Ghani, Zamir; Lengar, Igor; Milocco, Alberto; Packer, Lee; Pillon, Mario; Popovichev, Sergey; Snoj, Luka; JET Contributors
2017-03-01
At the Joint European Torus (JET), the ex-vessel fission chambers and in-vessel activation detectors are used as the neutron production rate and neutron yield monitors, respectively. In order to ensure that these detectors produce accurate measurements, they need to be experimentally calibrated. A new calibration of neutron detectors to 14 MeV neutrons, resulting from deuterium-tritium (DT) plasmas, is planned at JET using a compact accelerator-based neutron generator (NG) in which a D/T beam impinges on a solid target containing T/D, producing neutrons by DT fusion reactions. This paper presents the analysis that was performed to model the neutron source characteristics in terms of energy spectrum, angle-energy distribution and the effect of the neutron generator geometry. Different codes capable of simulating accelerator-based DT neutron sources are compared, and sensitivities to uncertainties in the generator's internal structure are analysed. The analysis was performed to support preparation for the experimental measurements carried out to characterize the NG as a calibration source. Further extensive neutronics analyses, performed with this model of the NG, will be needed to support the neutron calibration experiments and take into account various differences between the calibration experiment and experiments using the plasma as a source of neutrons.
NASA Astrophysics Data System (ADS)
Johnson, J. Bruce; Reeve, S. W.; Burns, W. A.; Allen, Susan D.
2010-04-01
Termed Special Nuclear Material (SNM) by the Atomic Energy Act of 1954, fissile materials, such as 235U and 239Pu, are the primary components used to construct modern nuclear weapons. Detecting the clandestine presence of SNM represents an important capability for Homeland Security. An ideal SNM sensor must be able to detect fissile materials present at ppb levels, distinguish the source of the detected fissile material, i.e., 235U, 239Pu, 233U or another fission source, and perform this discrimination in near real time. A sensor with such capabilities would provide not only rapid identification of a threat but, ultimately, information on the potential source of the threat. For example, current detection schemes for monitoring clandestine nuclear testing and nuclear fuel reprocessing to provide weapons-grade fissile material rely largely on passive air sampling combined with subsequent instrumental analysis or some type of wet chemical analysis of the collected material. It would be highly useful to have a noncontact method of measuring isotopes capable of providing forensic information rapidly at ppb levels of detection. Here we compare the use of Kr, Xe and I as "canary" species for distinguishing between 235U and 239Pu fission sources by spectroscopic methods.
Tarrass, Faissal; Benjelloun, Meryem; Benjelloun, Omar
2008-07-01
Water is a vital aspect of hemodialysis. During the procedure, large volumes of water are used to prepare dialysate and clean and reprocess machines. This report evaluates the technical and economic feasibility of recycling hemodialysis wastewater for irrigation uses, such as watering gardens and landscape plantings. Water characteristics, possible recycling methods, and production costs of treated water are discussed in terms of the quality of the generated wastewater. A cost-benefit analysis is also performed through comparison of intended cost with that of seawater desalination, which is widely used in irrigation.
Amplitude and Recurrence Time of LP activity at Mt. Etna, Italy
NASA Astrophysics Data System (ADS)
Cauchie, Léna; Saccorotti, Gilberto; Bean, Christopher
2013-04-01
The manifestation of Long-Period (LP) activity is attested on many volcanoes worldwide and is thought to be associated with the resonant oscillations of subsurface, fluid-filled cracks and conduits. Nonetheless, the actual source mechanism that originates the resonance is still unclear. Different models have been proposed so far, including (i) fluid-flow instabilities such as periodic degassing and (ii) brittle failure in viscous magmas. Since LP activity usually precedes and accompanies volcanic eruptions, understanding these sources is crucial for hazard assessment and eruption early warning. This work aims at improving the understanding of the LP source mechanism through a statistical analysis of detailed LP catalogues. The behaviour of LP activity is compared with the empirical laws governing earthquake recurrence (e.g., Gutenberg-Richter [GR] and Gamma-law distributions), in order to understand what relationships, if any, exist between these two apparently different earthquake classes. In particular, about 13000 events were detected on Mount Etna in August 2005 through a STA/LTA method. During this period, the volcano did not present particular signs of unrest. LP activity was sustained in time over the whole period of analysis. From the analysis of the directional properties, it turns out that the events of this first catalogue propagate from two distinct sources. Furthermore, the events exhibit a high degree of waveform similarity, which provides a criterion for classification/source separation. The events were then grouped into families of comparable waveforms, resulting also in a separation of their source locations. We then used template signals of each family for a matched-filtering of the continuous data streams, in order to detect small-amplitude events previously missed by the STA/LTA triggering method. This procedure allowed for a significant enrichment of the catalogues.
The retrieved amplitude distributions, similar for both families, instead differ significantly from the Gutenberg-Richter law, and the inter-event time distributions do not follow a typical Gamma law. In order to compare these results with a catalogue for which the source mechanism is well established, we applied the same analysis procedure to a dataset from Stromboli Volcano, where LP activity is closely related to VLP (Very-Long-Period) pulses, in turn associated with the summit explosions. Again, catalogues of thousands of LP events were obtained over one month of seismic records (July 2011). Our results indicate a similar behaviour in terms of both amplitude and inter-event time distributions with respect to what was observed at Mt. Etna. This suggests that the Etna LP data are likely related to a degassing process occurring at depth. Nonetheless, further studies are needed in order to quantify the time recurrence and amplitude distribution of brittle failure in viscous, stressed magmas. Hopefully, these steps will lead to an improved understanding of LP activity in different volcanic contexts, in turn clarifying its significance in terms of eruption forecasting.
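The STA/LTA trigger used to build the Etna catalogue is a standard energy-ratio detector. A minimal sketch (with assumed window lengths and trigger threshold, not the authors' parameters) on a synthetic trace:

```python
import numpy as np

def sta_lta(signal, fs, sta_win=0.5, lta_win=10.0):
    """Classic STA/LTA: short-term average energy over long-term average energy,
    computed with a cumulative-sum trick and aligned at the window end."""
    sta_n = int(sta_win * fs)
    lta_n = int(lta_win * fs)
    energy = signal ** 2
    csum = np.cumsum(np.concatenate(([0.0], energy)))
    sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n
    lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n
    # keep only the samples where both windows fit (trailing alignment)
    m = min(len(sta), len(lta))
    return sta[-m:] / np.maximum(lta[-m:], 1e-12)

# Synthetic test: Gaussian noise with one transient burst
fs = 100.0
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 3000)
x[2000:2100] += 10 * np.sin(2 * np.pi * 5 * np.arange(100) / fs)

ratio = sta_lta(x, fs)
triggered = ratio.max() > 3.0  # a typical trigger threshold
```

In practice the triggered windows would then be cross-correlated against the family templates (matched filtering) to recover the smaller events this simple detector misses.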
NASA Astrophysics Data System (ADS)
Harmon, T. C.; Rat'ko, A.; Dietrich, H.; Park, Y.; Wijsboom, Y. H.; Bendikov, M.
2008-12-01
Inorganic nitrogen (nitrate (NO3-) and ammonium (NH4+)) from chemical fertilizer and livestock waste is a major source of pollution in groundwater, surface water and the air. While some sources of these chemicals, such as waste lagoons, are well-defined, their application as fertilizer has the potential to create distributed or non-point source pollution problems. Scalable nitrate sensors (small and inexpensive) would enable us to better assess non-point source pollution processes in agronomic soils, groundwater and rivers subject to non-point source inputs. This work describes the fabrication and testing of inexpensive PVC-membrane-based ion selective electrodes (ISEs) for monitoring nitrate levels in soil water environments. ISE-based sensors have the advantages of being easy to fabricate and use, but suffer several shortcomings, including limited sensitivity, poor precision, and calibration drift. However, modern materials have begun to yield more robust ISE types in laboratory settings. This work emphasizes the in situ behavior of commercial and fabricated sensors in soils subject to irrigation with dairy manure water. Results are presented in the context of deployment techniques (in situ versus soil lysimeters), temperature compensation, and uncertainty analysis. Observed temporal responses of the nitrate sensors exhibited diurnal cycling, with elevated nitrate levels at night and depressed levels during the day. Conventional samples collected via lysimeters validated this response. It is concluded that while modern ISEs are not yet ready for long-term, unattended deployment, short-term installations (on the order of 2 to 4 days) are viable and may provide valuable insights into nitrogen dynamics in complex soil systems.
Data Science: History repeated? - The heritage of the Free and Open Source GIS community
NASA Astrophysics Data System (ADS)
Löwe, Peter; Neteler, Markus
2014-05-01
Data Science is described as the process of knowledge extraction from large data sets by means of scientific methods. The discipline draws heavily on techniques and theories from many fields, which are jointly used to further develop information retrieval on structured or unstructured very large datasets. While the term Data Science was already coined in 1960, the current perception places this field still in the first section of the hype cycle according to Gartner, being well en route from the technology-trigger stage to the peak of inflated expectations. In our view, the future development of Data Science could benefit from the analysis of experiences from related evolutionary processes. One predecessor is the area of Geographic Information Systems (GIS). The intrinsic scope of GIS is the integration and storage of spatial information from often heterogeneous sources, data analysis, and the sharing of reconstructed or aggregated results in visual form or via data transfer. GIS is successfully applied to process and analyse spatially referenced content in a wide and still expanding range of science areas, spanning from human and social sciences like archeology, politics and architecture to environmental and geoscientific applications, even including planetology. This paper presents proven patterns for innovation and organisation derived from the evolution of GIS, which can be ported to Data Science. Within the GIS landscape, three strategic interacting tiers can be denoted: i) standardisation, ii) applications based on closed-source software, without the option of access to and analysis of the implemented algorithms, and iii) Free and Open Source Software (FOSS) based on freely accessible program code enabling analysis, education and improvement by everyone. This paper focuses on patterns gained from the synthesis of three decades of FOSS development.
We identified best-practices which evolved from long term FOSS projects, describe the role of community-driven global umbrella organisations such as OSGeo, as well as the standardization of innovative services. The main driver is the acknowledgement of a meritocratic attitude. These patterns follow evolutionary processes of establishing and maintaining a web-based democratic culture spawning new kinds of communication and projects. This culture transcends the established compartmentation and stratification of science by creating mutual benefits for the participants, irrespective of their respective research interest and standing. Adopting these best practices will enable the emerging Data Science communities to avoid pitfalls and to accelerate the progress to stages of productivity.
Kuang, Yuan-wen; Zhou, Guo-yi; Wen, Da-zhi; Li, Jiong; Sun, Fang-fang
2011-09-01
Concentrations of polycyclic aromatic hydrocarbons (PAHs) were examined and potential sources of PAHs were identified from the dated tree-rings of Masson pine (Pinus massoniana L.) near two industrial sites (Danshuikeng, DSK and Xiqiaoshan, XQS) in the Pearl River Delta of south China. Total concentrations of PAHs (∑PAHs) showed similar temporal trends in the tree-rings at both sites, suggesting that tree-rings recorded the historical variation in atmospheric PAHs. The differences in individual PAHs and in ∑PAHs detected in the tree-rings between the two sites reflected the historical differences in airborne PAHs. Regional changes in industrial activities might contribute to the site-specific and period-specific patterns of the tree-ring PAHs. The diagnostic PAH ratios Ant/(Ant + PA), FL/(FL + Pyr), and BaA/(BaA + Chr) revealed that PAHs in the tree-rings at both sites mainly stemmed from combustion processes (pyrogenic sources). Principal component analysis further confirmed that wood burning, coal combustion, and diesel- and gasoline-powered vehicular emissions were the dominant contributors to PAH sources at DSK, while diesel combustion, gasoline and natural gas combustion, and incomplete coal combustion were the main origins of PAHs at XQS. Tree-ring analysis of PAHs was indicative of PAHs from a mixture of combustion sources, thus minimizing the bias of short-term active air sampling.
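Diagnostic-ratio screening of the kind used in this study is easy to automate. The sketch below applies the FL/(FL + Pyr) thresholds commonly cited in the PAH source-apportionment literature (≈0.4 and ≈0.5); the concentration values are hypothetical and the thresholds are indicative only:

```python
def classify_fl_pyr(fl, pyr):
    """Screen the fluoranthene/pyrene diagnostic ratio against the commonly
    cited literature thresholds (indicative, not definitive)."""
    r = fl / (fl + pyr)
    if r < 0.4:
        return "petrogenic"          # unburned petroleum sources
    elif r <= 0.5:
        return "petroleum combustion"  # e.g. vehicular emissions
    return "biomass/coal combustion"

# Hypothetical concentrations (same units for both compounds)
src = classify_fl_pyr(fl=12.0, pyr=8.0)  # ratio 0.6
```

Combining several such ratios, as the paper does with Ant/(Ant + PA) and BaA/(BaA + Chr), reduces the risk of misclassification from any single ratio.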
Soltani, Amanallah; Roslan, Samsilah
2013-03-01
Reading decoding ability is a fundamental skill for acquiring the word-specific orthographic information necessary for skilled reading. Decoding ability and its underlying phonological processing skills have been heavily investigated among typically developing students. However, the issue has rarely been examined among students with intellectual disability, who commonly suffer from reading decoding problems. This study aims to determine the contributions of phonological awareness, phonological short-term memory, and rapid automated naming, three well-known phonological processing skills, to decoding ability among 60 participants with mild intellectual disability of unspecified origin, ranging from 15 to 23 years old. The results of the correlation analysis revealed that all three aspects of phonological processing are significantly correlated with decoding ability. Furthermore, a series of hierarchical regression analyses indicated that, after controlling for the effect of IQ, phonological awareness and rapid automated naming are two distinct sources of decoding ability, but phonological short-term memory contributes significantly to decoding ability under the realm of phonological awareness. Copyright © 2013 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genn Saji
2006-07-01
The term 'ultimate risk' is used here to describe the probabilities and radiological consequences that should be incorporated in siting, containment design and accident management of nuclear power plants for hypothetical accidents. It is closely related to the source terms specified in siting criteria, which assure an adequate separation of the radioactive inventories of the plants from the public in the event of a hypothetical and severe accident situation. The author would like to point out that current source terms, which are based on the information from the Windscale accident (1957) through TID-14844, are very outdated and do not incorporate lessons learned from either the Three Mile Island (TMI, 1979) or the Chernobyl (1986) accident, two of the most severe accidents ever experienced. As a result of the observations of benign radionuclides released at TMI, the technical community in the US felt that a more realistic evaluation of severe reactor accident source terms was necessary. Against this background, the 'source term research project' was organized in 1984 to respond to these challenges. Unfortunately, soon after the final report from this project was released, the Chernobyl accident occurred. Due to the enormous consequences of the accident, the once optimistic perspectives on establishing a more realistic source term were completely shattered. The Chernobyl accident, with its human death toll and the dispersion of a large part of the fission fragment inventories into the environment, created a significant degradation in the public's acceptance of nuclear energy throughout the world. In spite of this, nuclear communities have been prudent in responding to the public's anxiety towards the ultimate safety of nuclear plants, since many unknown points still remained revolving around the mechanism of the Chernobyl accident.
In order to resolve some of these mysteries, the author has performed a scoping study of the dispersion and deposition mechanisms of fuel particles and fission fragments during the initial phase of the Chernobyl accident. Through this study, it is now possible to generally reconstruct the radiological consequences by using a dispersion calculation technique, combined with the meteorological data at the time of the accident and the land contamination densities of ¹³⁷Cs measured and reported around the Chernobyl area. Although it is challenging to incorporate lessons learned from the Chernobyl accident into the source term issues, the author has already developed an example of safety goals by incorporating the radiological consequences of the accident. The example provides safety goals by specifying source term releases in a graded approach in combination with probabilities, i.e. risks. The author believes that future source term specifications should be directly linked with safety goals. (author)
The long-term problems of contaminated land: Sources, impacts and countermeasures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baes, C.F. III
1986-11-01
This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed ''broken arrows'').
NASA Astrophysics Data System (ADS)
Hyer, E. J.; Zhang, J. L.; Reid, J. S.; Curtis, C. A.; Westphal, D. L.
2007-12-01
Quantitative models of the transport and evolution of atmospheric pollution have graduated from the laboratory to become a part of the operational activity of forecast centers. Scientists studying the composition and variability of the atmosphere put great efforts into developing methods for accurately specifying sources of pollution, including natural and anthropogenic biomass burning. These methods must be adapted for use in operational contexts, which impose additional strictures on input data and methods. First, only input data sources available in near real-time are suitable for use in operational applications. Second, operational applications must make use of redundant data sources whenever possible. This is a shift in philosophy: in a research context, the most accurate and complete data set will be used, whereas in an operational context, the system must be designed with maximum redundancy. The goal in an operational context is to produce, to the extent possible, consistent and timely output, given sometimes inconsistent inputs. The Naval Aerosol Analysis and Prediction System (NAAPS), a global operational aerosol analysis and forecast system, recently began incorporating assimilation of satellite-derived aerosol optical depth. Assimilation of satellite AOD retrievals has dramatically improved aerosol analyses and forecasts from this system. The use of aerosol data assimilation also changes the strategy for improving the smoke source function. The absolute magnitude of emissions events can be refined through feedback from the data assimilation system, both in real-time operations and in post-processing analysis of data assimilation results. In terms of the aerosol source functions, the largest gains in model performance are now to be gained by reducing data latency and minimizing missed detections.
In this presentation, recent model development work on the Fire Locating and Monitoring of Burning Emissions (FLAMBE) system that provides smoke aerosol boundary conditions for NAAPS is described, including redundant integration of multiple satellite platforms and development of feedback loops between the data assimilation system and smoke source.
The effects of pre-processing strategies in sentiment analysis of online movie reviews
NASA Astrophysics Data System (ADS)
Zin, Harnani Mat; Mustapha, Norwati; Murad, Masrah Azrifah Azmi; Sharef, Nurfadhlina Mohd
2017-10-01
With the ever-increasing number of internet applications and social networking sites, people nowadays can easily express their feelings towards any products and services. These online reviews act as an important source for further analysis and improved decision making. These reviews are mostly unstructured by nature and thus need processing, like sentiment analysis and classification, to provide meaningful information for future uses. In text analysis tasks, the appropriate selection of words/features has a huge impact on the effectiveness of the classifier. Thus, this paper explores the effect of pre-processing strategies on the sentiment analysis of online movie reviews. In this paper, a supervised machine learning method was used to classify the reviews. The support vector machine (SVM) with linear and non-linear kernels has been considered as the classifier for the classification of the reviews. The performance of the classifier is critically examined based on the results of precision, recall, f-measure, and accuracy. Two different feature representations were used: term frequency and term frequency-inverse document frequency. Results show that the pre-processing strategies have a significant impact on the classification process.
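The two feature representations compared in the paper differ only in the inverse-document-frequency weighting. A minimal, dependency-free sketch of the computation on toy documents (not the authors' corpus or parameters):

```python
import math
from collections import Counter

# Toy "reviews" for illustration only
docs = [
    "great movie great acting",
    "terrible movie boring plot",
    "great movie plot and acting",
]

# Term frequency per document
tf = [Counter(d.split()) for d in docs]

# Inverse document frequency: idf(t) = ln(N / df(t))
N = len(docs)
vocab = {w for d in tf for w in d}
df = {t: sum(1 for d in tf if t in d) for t in vocab}
idf = {t: math.log(N / df[t]) for t in vocab}

def tfidf(doc_counts):
    """TF-IDF weights for one document's term counts."""
    return {t: c * idf[t] for t, c in doc_counts.items()}

weights = tfidf(tf[0])
# "movie" occurs in every document, so idf = ln(1) = 0 and it carries no weight;
# "great" is rarer across documents and keeps a positive weight.
```

This is exactly why TF-IDF can outperform raw term frequency in review classification: ubiquitous, sentiment-neutral words are down-weighted before the SVM ever sees them.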
Inverse Source Data-Processing Strategies for Radio-Frequency Localization in Indoor Environments.
Gennarelli, Gianluca; Al Khatib, Obada; Soldovieri, Francesco
2017-10-27
Indoor positioning of mobile devices plays a key role in many aspects of our daily life. These include real-time people tracking and monitoring, activity recognition, emergency detection, navigation, and numerous location-based services. Although many wireless technologies and data-processing algorithms have been developed in recent years, indoor positioning is still the subject of intensive research. This paper deals with active radio-frequency (RF) source localization in indoor scenarios. The localization task is carried out at the physical layer by means of receiving sensor arrays, which are deployed on the border of the surveillance region to record the signal emitted by the source. The localization problem is formulated as an imaging one by taking advantage of the inverse source approach. Different measurement configurations and data-processing/fusion strategies are examined to investigate their effectiveness in terms of localization accuracy under both line-of-sight (LOS) and non-line-of-sight (NLOS) conditions. Numerical results based on full-wave synthetic data are reported to support the analysis.
Biotic Nitrogen Enrichment Regulates Calcium Sources to Forests
NASA Astrophysics Data System (ADS)
Pett-Ridge, J. C.; Perakis, S. S.; Hynicka, J. D.
2015-12-01
Calcium is an essential nutrient in forest ecosystems that is susceptible to leaching loss and depletion. Calcium depletion can affect plant and animal productivity, soil acid buffering capacity, and fluxes of carbon and water. Excess nitrogen supply and associated soil acidification are often implicated in short-term calcium loss from soils, but the long-term role of nitrogen enrichment on calcium sources and resupply is unknown. Here we use strontium isotopes (87Sr/86Sr) as a proxy for calcium to investigate how soil nitrogen enrichment from biological nitrogen fixation interacts with bedrock calcium to regulate both short-term available supplies and the long-term sources of calcium in montane conifer forests. Our study examines 22 sites in western Oregon, spanning a 20-fold range of bedrock calcium on sedimentary and basaltic lithologies. In contrast to previous studies emphasizing abiotic control of weathering as a determinant of long-term ecosystem calcium dynamics and sources (via bedrock fertility, climate, or topographic/tectonic controls), we find instead that biotic nitrogen enrichment of soil can strongly regulate calcium sources and supplies in forest ecosystems. For forests on calcium-rich basaltic bedrock, increasing nitrogen enrichment causes calcium sources to shift from rock-weathering to atmospheric dominance, with minimal influence from other major soil-forming factors, despite regionally high rates of tectonic uplift and erosion that can rejuvenate weathering supply of soil minerals. For forests on calcium-poor sedimentary bedrock, we find that atmospheric inputs dominate regardless of the degree of nitrogen enrichment. Short-term measures of soil and ecosystem calcium fertility are decoupled from calcium source sustainability, with fundamental implications for understanding nitrogen impacts, both in natural ecosystems and in the context of global change.
Our finding that long-term nitrogen enrichment increases forest reliance on atmospheric calcium helps explain reports of greater ecological calcium limitation in an increasingly nitrogen-rich world.
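The 87Sr/86Sr proxy can be turned into a source apportionment with a standard two-endmember mixing calculation. The sketch below uses hypothetical endmember ratios (not the study's values) and assumes comparable Sr concentrations in both endmembers; otherwise each term must be weighted by its Sr concentration:

```python
# Two-endmember mixing: fraction of strontium (calcium proxy) derived from
# atmospheric deposition, inferred from the 87Sr/86Sr ratio.
# Endmember values below are hypothetical, for illustration only.
r_rock = 0.7035  # assumed basaltic bedrock weathering endmember
r_atm = 0.7092   # assumed atmospheric deposition endmember

def atmospheric_fraction(r_sample):
    """Linear mixing fraction between the rock and atmospheric endmembers."""
    return (r_sample - r_rock) / (r_atm - r_rock)

# Hypothetical soil-exchangeable Sr measurement from a nitrogen-rich site
f = atmospheric_fraction(0.7080)
```

A sample whose ratio sits near the atmospheric endmember (f close to 1) is the isotopic signature of the atmospheric dominance the study reports under high nitrogen enrichment.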
Peptidome analysis of human skim milk in term and preterm milk
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wan, Jun; Cui, Xian-wei; Zhang, Jun
Highlights: •A method was developed for the preparation of peptide extracts from human milk. •Analysis of the extracts by LC–MS/MS resulted in the detection of 1000–3000 peptide-like features. •419 peptides were identified by LC–MS/MS from 34 proteins. •Isotope dimethyl labeling analysis revealed 41 peptides differentially expressed. -- Abstract: The abundant proteins in human milk have been well characterized and are known to provide nutritional, protective, and developmental advantages to both term and preterm infants. Due to the difficulties associated with detection technology for peptides, the expression of the peptides present in human milk is not widely known. In recent years, peptidome analysis has received increasing attention. In this report, the analysis of endogenous peptides in human milk was done by mass spectrometry. A method was also developed for the extraction of peptides from human milk. Analysis of the extracts by LC–MS/MS resulted in the detection of 1000–3000 Da peptide-like features. Out of these, 419 peptides were identified by MS/MS. The identified peptides were found to originate from 34 proteins, several of which have been reported previously. Analysis of the peptides' cleavage sites showed that the peptides are cleaved in a regulated manner. This may reflect protease activity and distribution in the human body, and may also represent the biological state of the tissue and provide a fresh source for biomarker discovery. Isotope dimethyl labeling analysis was also used to test the effects of premature delivery on milk protein composition. Differences in peptide expression between term milk (38–41 weeks gestation) and preterm milk (28–32 weeks gestation) were investigated in this study. 41 peptides in these two groups were found to be expressed differently: 23 were present at higher levels in preterm milk, and 18 at higher levels in term milk.
Analyzing qualitative data with computer software.
Weitzman, E A
1999-01-01
OBJECTIVE: To provide health services researchers with an overview of the qualitative data analysis process and the role of software within it; to provide a principled approach to choosing among software packages to support qualitative data analysis; to alert researchers to the potential benefits and limitations of such software; and to provide an overview of the developments to be expected in the field in the near future. DATA SOURCES, STUDY DESIGN, METHODS: This article does not include reports of empirical research. CONCLUSIONS: Software for qualitative data analysis can benefit the researcher in terms of speed, consistency, rigor, and access to analytic methods not available by hand. Software, however, is not a replacement for methodological training. PMID:10591282
Stansfield, Claire; Brunton, Ginny; Rees, Rebecca
2014-06-01
When literature searching for systematic reviews, it is good practice to search widely across different information sources. Little is known about the contributions of different publication formats (e.g. journal article and book chapter) and sources, especially for studies of people's views. Studies from four reviews spanning three public health areas (active transport, motherhood and obesity) were analysed in terms of publication formats and the information sources they were identified from. They comprised 229 studies exploring people's perceptions, beliefs and experiences ('views studies') and were largely qualitative. Although most (61%) research studies were published within journals, nearly a third (29%) were published as research reports and 5% were published in books. The remainder consisted of theses, conference papers and raw datasets. Two-thirds of studies (66%) were located in a total of 19 bibliographic databases, and 15 databases provided studies that were not identified elsewhere. PubMed was a good source for all reviews. Supplementary information sources were important for identifying studies in all publication formats. Undertaking sensitive searches across a range of information sources is essential for locating views studies in all publication formats. We discuss some benefits and challenges of utilising different information sources. Copyright © 2013 John Wiley & Sons, Ltd.
Discriminating Simulated Vocal Tremor Source Using Amplitude Modulation Spectra
Carbonell, Kathy M.; Lester, Rosemary A.; Story, Brad H.; Lotto, Andrew J.
2014-01-01
Objectives/Hypothesis Sources of vocal tremor are difficult to categorize perceptually and acoustically. This paper describes a preliminary attempt to discriminate vocal tremor sources through the use of spectral measures of the amplitude envelope. The hypothesis is that different vocal tremor sources are associated with distinct patterns of acoustic amplitude modulations. Study Design Statistical categorization methods (discriminant function analysis) were used to discriminate signals from simulated vocal tremor with different sources using only acoustic measures derived from the amplitude envelopes. Methods Simulations of vocal tremor were created by modulating parameters of a vocal fold model corresponding to oscillations of respiratory driving pressure (respiratory tremor), degree of vocal fold adduction (adductory tremor) and fundamental frequency of vocal fold vibration (F0 tremor). The acoustic measures were based on spectral analyses of the amplitude envelope computed across the entire signal and within select frequency bands. Results The signals could be categorized (with accuracy well above chance) in terms of the simulated tremor source using only measures of the amplitude envelope spectrum even when multiple sources of tremor were included. Conclusions These results supply initial support for an amplitude-envelope based approach to identify the source of vocal tremor and provide further evidence for the rich information about talker characteristics present in the temporal structure of the amplitude envelope. PMID:25532813
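As a rough illustration of the general approach (not the authors' actual vocal fold simulations or acoustic measures), the sketch below uses entirely invented synthetic signals with two made-up modulation rates, extracts amplitude-envelope spectra, and feeds them to a linear discriminant classifier:

```python
import numpy as np
from scipy.signal import hilbert
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def envelope_spectrum_features(signal, n_bins=8):
    """Magnitude spectrum of the amplitude envelope (first few modulation bins)."""
    envelope = np.abs(hilbert(signal))      # amplitude envelope via analytic signal
    envelope -= envelope.mean()             # drop the DC component
    return np.abs(np.fft.rfft(envelope))[1:n_bins + 1]

fs = 8000
t = np.arange(0, 1, 1 / fs)                 # 1 s signals -> 1 Hz FFT resolution
rng = np.random.default_rng(0)
X, y = [], []
for label, mod_hz in [(0, 5.0), (1, 8.0)]:  # two invented "tremor" modulation rates
    for _ in range(20):
        carrier = np.sin(2 * np.pi * 150 * t)
        am = 1 + 0.3 * np.sin(2 * np.pi * mod_hz * t + rng.uniform(0, 2 * np.pi))
        X.append(envelope_spectrum_features(am * carrier + 0.01 * rng.standard_normal(t.size)))
        y.append(label)

clf = LinearDiscriminantAnalysis().fit(X, y)  # discriminant function analysis
accuracy = clf.score(X, y)
```

Because the two classes concentrate envelope-spectrum energy in different modulation bins, the discriminant analysis separates them from the envelope features alone.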
Follow-up of high energy neutrinos detected by the ANTARES telescope
NASA Astrophysics Data System (ADS)
Mathieu, Aurore
2016-04-01
The ANTARES telescope is well-suited to detect high energy neutrinos produced in astrophysical transient sources as it can observe a full hemisphere of the sky with a high duty cycle. Potential neutrino sources are gamma-ray bursts, core-collapse supernovae and flaring active galactic nuclei. To enhance the sensitivity of ANTARES to such sources, a detection method based on follow-up observations from the neutrino direction has been developed. This program, denoted as TAToO, includes a network of robotic optical telescopes (TAROT, Zadko and MASTER) and the Swift-XRT telescope, which are triggered when an "interesting" neutrino is detected by ANTARES. A follow-up of special events, such as neutrino doublets in time/space coincidence or a single neutrino having a very high energy or in the specific direction of a local galaxy, significantly improves the perspective for the detection of transient sources. The analysis of early and long term follow-up observations to search for fast and slowly varying transient sources, respectively, has been performed and the results covering optical and X-ray data are presented in this contribution.
Vergeynst, Lidewei L; Sause, Markus G R; Hamstad, Marvin A; Steppe, Kathy
2015-01-01
When drought occurs in plants, acoustic emission (AE) signals can be detected, but the actual causes of these signals are still unknown. By analyzing the waveforms of the measured signals, it should, however, be possible to trace the characteristics of the AE source and get information about the underlying physiological processes. A problem encountered during this analysis is that the waveform changes significantly from source to sensor, and a lack of knowledge of wave propagation impedes research progress in this field. We used finite element modeling and the well-known pencil lead break source to investigate wave propagation in a branch. A cylindrical rod of polyvinyl chloride (PVC) was first used to identify the theoretical propagation modes. Two wave propagation modes could be distinguished, and we used the finite element model to interpret their behavior in terms of source position for both the PVC rod and a wooden rod. Both wave propagation modes were also identified in drying-induced signals from woody branches, and we used the obtained insights to provide recommendations for further AE research in plant science.
The Simultaneous Medicina-Planck Experiment: data acquisition, reduction and first results
NASA Astrophysics Data System (ADS)
Procopio, P.; Massardi, M.; Righini, S.; Zanichelli, A.; Ricciardi, S.; Libardi, P.; Burigana, C.; Cuttaia, F.; Mack, K.-H.; Terenzi, L.; Villa, F.; Bonavera, L.; Morgante, G.; Trigilio, C.; Trombetti, T.; Umana, G.
2011-10-01
The Simultaneous Medicina-Planck Experiment (SiMPlE) is aimed at observing a selected sample of 263 extragalactic and Galactic sources with the Medicina 32-m single-dish radio telescope in the same epoch as the Planck satellite observations. The data, acquired with a frequency coverage down to 5 GHz and combined with Planck at frequencies above 30 GHz, will constitute a useful reference catalogue of bright sources over the whole Northern hemisphere. Furthermore, source observations performed in different epochs and comparisons with other catalogues will allow the investigation of source variabilities on different time-scales. In this work, we describe the sample selection, the ongoing data acquisition campaign, the data reduction procedures, the developed tools and the comparison with other data sets. We present 5 and 8.3 GHz data for the SiMPlE Northern sample, consisting of 79 sources with δ≥ 45° selected from our catalogue and observed during the first 6 months of the project. A first analysis of their spectral behaviour and long-term variability is also presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rioja, M.; Dodson, R., E-mail: maria.rioja@icrar.org
2011-04-15
We describe a new method which achieves high-precision very long baseline interferometry (VLBI) astrometry in observations at millimeter (mm) wavelengths. It combines fast frequency-switching observations, to correct for the dominant non-dispersive tropospheric fluctuations, with slow source-switching observations, for the remaining ionospheric dispersive terms. We call this method source-frequency phase referencing. Provided that the switching cycles match the properties of the propagation media, one can recover the source astrometry. We present an analytic description of the two-step calibration strategy, along with an error analysis to characterize its performance. Also, we provide observational demonstrations of a successful application with observations using the Very Long Baseline Array at 86 GHz of the pairs of sources 3C274 and 3C273, and 1308+326 and 1308+328, under various conditions. We conclude that this method is widely applicable to mm-VLBI observations of many target sources, and unique in providing bona fide astrometrically registered images and high-precision relative astrometric measurements in mm-VLBI using existing and newly built instruments, including space VLBI.
The RATIO method for time-resolved Laue crystallography
Coppens, Philip; Pitak, Mateusz; Gembicky, Milan; Messerschmidt, Marc; Scheins, Stephan; Benedict, Jason; Adachi, Shin-ichi; Sato, Tokushi; Nozawa, Shunsuke; Ichiyanagi, Kohei; Chollet, Matthieu; Koshihara, Shin-ya
2009-01-01
A RATIO method for analysis of intensity changes in time-resolved pump–probe Laue diffraction experiments is described. The method eliminates the need for scaling the data with a wavelength curve representing the spectral distribution of the source and removes the effect of possible anisotropic absorption. It does not require relative scaling of series of frames and removes errors due to all but very short term fluctuations in the synchrotron beam. PMID:19240334
2013-09-30
...Circulation (HC) in terms of the meridional streamfunction. The interannual variability of the Atlantic HC in boreal summer was examined using the EOF... large-scale circulations in the NAVGEM model and the source of predictability for the seasonal variation of the Atlantic TCs. We have been working... [Figure: EOF analysis of Meridional Circulation (JAS). (a) The leading mode (M1); (b) variance explained by the first 10 modes.]
2013-09-30
founded Quantum Intelligence, Inc. She was principal investigator (PI) for six contracts awarded by the DoD Small Business Innovation Research (SBIR... Quantum Intelligence, Inc. CLA is a computer-based learning agent, or agent collaboration, capable of ingesting and processing data sources. We have... opportunities all need to be addressed consciously and consistently. Following a series of deliberate experiments, long-term procedural improvements to the...
Battlespace Awareness: Heterogeneous Sensor Maps of Large Scale, Complex Environments
2017-06-13
reference frames enable a system designer to describe the position of any sensor or platform at any point in time. This section introduces the... analysis to evaluate the quality of reconstructions created by our algorithms. CloudCompare is an open-source tool designed for this purpose [65]. In... structure of the data. The data term seeks to keep the proposed solution (u) similar to the originally observed values (f). A systems designer must...
Fatty Acids Modulate Excitability in Guinea-Pig Hippocampal Slices
1991-01-01
141-147. 32. Taube J. S. and Schwartzkroin P. A. (1988) Mechanisms of long-term potentiation: a current-source density analysis. J. Neurosci. 8, 1645... given volley size to elicit a synaptic potential, while stearic acid (100 μM) and... pyramidale to record the resultant population postsynaptic potentials... [Figure: population spike amplitude and population PSP size with exposure to 250 μM capric acid in a representative experiment.] Synaptic potentials...
Zhao, Fangkun; Shi, Bei; Liu, Ruixin; Zhou, Wenkai; Shi, Dong; Zhang, Jinsong
2018-04-03
The distribution pattern and knowledge structure of choroidal neovascularization (CNV) was surveyed based on the literature in PubMed. Published scientific papers about CNV were retrieved from Jan 1st, 2012 to May 31st, 2017. Extracted MeSH terms were analyzed quantitatively using the Bibliographic Item Co-Occurrence Matrix Builder (BICOMB), and high-frequency MeSH terms were identified. Hierarchical cluster analysis was conducted with SPSS 19.0 according to the MeSH term-source article matrix. A high-frequency MeSH term co-occurrence matrix was constructed to support the strategic diagram and social network analysis (SNA). According to the search strategy, altogether 2366 papers were included, and the number of annual papers changed slightly from Jan 1st, 2012 to May 31st, 2017. Among all the extracted MeSH terms, 44 high-frequency MeSH terms were identified and hotspots were clustered into 6 categories. In the strategic diagram, clinical drug therapy, pathology and diagnosis related research on CNV was well developed. In contrast, the metabolism, etiology, complications, prevention and control of CNV in animal models, and genetics related research on CNV, were relatively immature, which offers potential research space for future study. As for the SNA result, the position status of each component was described by the centrality values. The studies on CNV are relatively divergent, and the 6 research categories concluded from this study could reflect the publication trends on CNV to some extent. By providing a quantitative bibliometric analysis across a 5-year span, this study can help depict an overall picture of the latest topics and provide some hints for researchers when launching new projects.
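The core bibliometric step, turning a MeSH term-by-article matrix into a co-occurrence matrix for network analysis, can be sketched with a toy dataset (the terms and assignments below are invented for illustration; the study used BICOMB and SPSS):

```python
import numpy as np

# Toy MeSH term-by-article incidence matrix (1 = term assigned to the article).
# Terms and assignments here are invented for illustration.
terms = ["drug therapy", "pathology", "diagnosis", "etiology"]
M = np.array([
    [1, 1, 0, 1],   # drug therapy
    [1, 0, 1, 1],   # pathology
    [0, 1, 1, 0],   # diagnosis
    [1, 0, 0, 0],   # etiology
])

# Co-occurrence matrix: entry (i, j) counts articles carrying both term i and term j.
C = M @ M.T
np.fill_diagonal(C, 0)   # self-counts are not edges in the social-network view
```

The symmetric matrix C is exactly what feeds a strategic diagram or SNA centrality computation: each nonzero entry is a weighted edge between two terms.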
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, S.; Toll, J.; Cothern, K.
1995-12-31
The authors have performed robust sensitivity studies of the physico-chemical Hudson River PCB model PCHEPM to identify the parameters and process uncertainties contributing the most to uncertainty in predictions of water column and sediment PCB concentrations, over the time period 1977-1991 in one segment of the lower Hudson River. The term "robust sensitivity studies" refers to the use of several sensitivity analysis techniques to obtain a more accurate depiction of the relative importance of different sources of uncertainty. Local sensitivity analysis provided data on the sensitivity of PCB concentration estimates to small perturbations in nominal parameter values. Range sensitivity analysis provided information about the magnitude of prediction uncertainty associated with each input uncertainty. Rank correlation analysis indicated which parameters had the most dominant influence on model predictions. Factorial analysis identified important interactions among model parameters. Finally, term analysis looked at the aggregate influence of combinations of parameters representing physico-chemical processes. The authors scored the results of the local and range sensitivity and rank correlation analyses. The authors considered parameters that scored high on two of the three analyses to be important contributors to PCB concentration prediction uncertainty, and treated them probabilistically in simulations. They also treated probabilistically parameters identified in the factorial analysis as interacting with important parameters. The authors used the term analysis to better understand how uncertain parameters were influencing the PCB concentration predictions. The importance analysis allowed the authors to reduce the number of parameters to be modeled probabilistically from 16 to 5. This reduced the computational complexity of Monte Carlo simulations, and more importantly, provided a more lucid depiction of prediction uncertainty and its causes.
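A minimal sketch of one of these techniques, rank (Spearman) correlation screening, on a hypothetical three-parameter toy model (not PCHEPM; coefficients and threshold are invented):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 500

# Toy model: output depends strongly on p0, weakly (and nonlinearly) on p1,
# and not at all on p2.
params = rng.uniform(0.0, 1.0, size=(n, 3))
output = 5.0 * params[:, 0] + 2.0 * params[:, 1] ** 2 + 0.05 * rng.standard_normal(n)

# Rank correlation of each sampled parameter with the model output:
# a simple screening score for which inputs dominate prediction uncertainty.
scores = []
for i in range(3):
    rho, _ = spearmanr(params[:, i], output)
    scores.append(abs(rho))

important = [i for i, s in enumerate(scores) if s > 0.2]  # arbitrary cutoff
```

Parameters passing the cutoff would be the candidates to treat probabilistically in the Monte Carlo stage; the irrelevant one can be fixed at its nominal value.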
NASA Astrophysics Data System (ADS)
Tichý, Ondřej; Šmídl, Václav; Hofman, Radek; Šindelářová, Kateřina; Hýža, Miroslav; Stohl, Andreas
2017-10-01
In the fall of 2011, iodine-131 (131I) was detected at several radionuclide monitoring stations in central Europe. After investigation, the International Atomic Energy Agency (IAEA) was informed by Hungarian authorities that 131I was released from the Institute of Isotopes Ltd. in Budapest, Hungary. It was reported that a total activity of 342 GBq of 131I was emitted between 8 September and 16 November 2011. In this study, we use the ambient concentration measurements of 131I to determine the location of the release as well as its magnitude and temporal variation. As the location of the release and an estimate of the source strength became eventually known, this accident represents a realistic test case for inversion models. For our source reconstruction, we use no prior knowledge. Instead, we estimate the source location and emission variation using only the available 131I measurements. Subsequently, we use the partial information about the source term available from the Hungarian authorities for validation of our results. For the source determination, we first perform backward runs of atmospheric transport models and obtain source-receptor sensitivity (SRS) matrices for each grid cell of our study domain. We use two dispersion models, FLEXPART and Hysplit, driven with meteorological analysis data from the global forecast system (GFS) and from European Centre for Medium-range Weather Forecasts (ECMWF) weather forecast models. Second, we use a recently developed inverse method, least-squares with adaptive prior covariance (LS-APC), to determine the 131I emissions and their temporal variation from the measurements and computed SRS matrices. For each grid cell of our simulation domain, we evaluate the probability that the release was generated in that cell using Bayesian model selection. The model selection procedure also provides information about the most suitable dispersion model for the source term reconstruction. 
Third, we select the most probable location of the release with its associated source term and perform a forward model simulation to study the consequences of the iodine release. Results of these procedures are compared with the known release location and reported information about its time variation. We find that our algorithm could successfully locate the actual release site. The estimated release period is also in agreement with the values reported by IAEA and the reported total released activity of 342 GBq is within the 99 % confidence interval of the posterior distribution of our most likely model.
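The inversion step, estimating emissions x from measurements y ≈ SRS·x, can be illustrated with a toy nonnegative least-squares fit. The SRS matrix and emission series below are synthetic, and nonnegativity is a far weaker prior than the LS-APC method actually used in the paper:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
n_obs, n_times = 30, 6

# Hypothetical source-receptor sensitivity matrix: one row per concentration
# measurement, one column per emission time segment.
SRS = rng.uniform(0.0, 1.0, size=(n_obs, n_times))

# "True" emission time series (unknown in practice) and noisy observations.
x_true = np.array([0.0, 2.0, 5.0, 3.0, 0.0, 0.0])
y = SRS @ x_true + 0.01 * rng.standard_normal(n_obs)

# Nonnegative least squares recovers the emission time series.
x_est, _ = nnls(SRS, y)
```

With more unknowns than measurements, or strongly correlated SRS columns, plain NNLS becomes ill-posed, which is why regularized Bayesian methods such as LS-APC are needed in practice.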
Cesari, D; De Benedetto, G E; Bonasoni, P; Busetto, M; Dinoi, A; Merico, E; Chirizzi, D; Cristofanelli, P; Donateo, A; Grasso, F M; Marinoni, A; Pennetta, A; Contini, D
2018-01-15
Comparisons of fine and coarse fractions in terms of sources and dynamics are scarce in southeast Mediterranean countries; differences are relevant because of the importance of natural sources like sea spray and Saharan dust advection, and because most monitoring networks are limited to PM10. In this work, the main seasonal variabilities of sources and processes involving fine and coarse PM (particulate matter) were studied at the Environmental-Climate Observatory of Lecce (Southern Italy). Simultaneous PM2.5 and PM10 samples were collected between July 2013 and July 2014 and chemically analysed to determine concentrations of several species: OC (organic carbon) and EC (elemental carbon) via thermo-optical analysis, 9 major ions via IC, and 23 metals via ICP-MS. Data were processed through mass closure analysis and the Positive Matrix Factorization (PMF) receptor model, characterizing the seasonal variability of nine source contributions. Organic and inorganic secondary aerosol accounts for 43% of PM2.5 and 12% of PM2.5-10, with small seasonal changes. The SIA (secondary inorganic aerosol) seasonal pattern is opposite to that of SOC (secondary organic carbon): SOC is larger during the cold period, while sulphate (the major contributor to SIA) is larger during summer. Two forms of nitrate were identified: NaNO3, correlated with chloride depletion and aging of sea spray, mainly present in PM2.5-10; and NH4NO3, more abundant in PM2.5. Biomass burning is a relevant source with a larger contribution during autumn and winter because of the influence of domestic heating; however, it is not negligible in spring and summer because of the contributions of fires and agricultural practices. Mass closure analysis and PMF results identify two soil sources: crustal, associated with long-range transport, and carbonates, associated with local resuspended dust.
Both sources contribute to the coarse fraction and have different dynamics, with the crustal source contributing mainly during high winds from the SE and the carbonate source during high winds from the north. Copyright © 2017 Elsevier B.V. All rights reserved.
The mass-zero spin-two field and gravitational theory.
NASA Technical Reports Server (NTRS)
Coulter, C. A.
1972-01-01
Demonstration that the conventional theory of the mass-zero spin-two field with sources introduces extraneous non-spin-two field components in source regions and fails to be covariant under the full or restricted conformal group. A modified theory is given, expressed in terms of the physical components of the mass-zero spin-two field rather than in terms of 'potentials,' which has no extraneous components inside or outside sources, and which is covariant under the full conformal group. For a proper choice of source term, this modified theory has the correct Newtonian limit and automatically implies that a symmetric second-rank source tensor has zero divergence. It is shown that possibly a generally covariant form of the spin-two theory derived here can be constructed to agree with general relativity in all currently accessible experimental situations.
Technical note: A linear model for predicting δ13 Cprotein.
Pestle, William J; Hubbe, Mark; Smith, Erin K; Stevenson, Joseph M
2015-08-01
Development of a model for the prediction of δ13Cprotein from δ13Ccollagen and Δ13Cap-co. Model-generated values could, in turn, serve as "consumer" inputs for multisource mixture modeling of paleodiet. Linear regression analysis of previously published controlled diet data facilitated the development of a mathematical model for predicting δ13Cprotein (and an experimentally generated error term) from isotopic data routinely generated during the analysis of osseous remains (δ13Cco and Δ13Cap-co). Regression analysis resulted in a two-term linear model (δ13Cprotein (‰) = 0.78 × δ13Cco − 0.58 × Δ13Cap-co − 4.7), possessing a high r value of 0.93 (r² = 0.86, P < 0.01), and experimentally generated error terms of ±1.9‰ for any predicted individual value of δ13Cprotein. This model was tested using isotopic data from Formative Period individuals from northern Chile's Atacama Desert. The model presented here appears to hold significant potential for the prediction of the carbon isotope signature of dietary protein using only such data as is routinely generated in the course of stable isotope analysis of human osseous remains. These predicted values are ideal for use in multisource mixture modeling of dietary protein source contribution. © 2015 Wiley Periodicals, Inc.
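The reported regression equation is straightforward to apply; a minimal helper (the input values in the example are hypothetical, not from the Atacama dataset):

```python
def predict_d13C_protein(d13C_collagen, spacing_ap_co):
    """Two-term linear model from the abstract (all values in permil):
    d13Cprotein = 0.78 * d13Cco - 0.58 * D13Cap-co - 4.7, error about +/-1.9 permil.
    """
    return 0.78 * d13C_collagen - 0.58 * spacing_ap_co - 4.7

# Hypothetical inputs: collagen at -20.0 permil, apatite-collagen spacing of 5.0 permil.
estimate = predict_d13C_protein(-20.0, 5.0)  # -> -23.2 permil
```

The returned value (with its ±1.9‰ band) is what would feed a multisource mixture model as the consumer's protein signature.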
Becker, H; Albera, L; Comon, P; Nunes, J-C; Gribonval, R; Fleureau, J; Guillotel, P; Merlet, I
2017-08-15
Over the past decades, a multitude of different brain source imaging algorithms have been developed to identify the neural generators underlying the surface electroencephalography measurements. While most of these techniques focus on determining the source positions, only a small number of recently developed algorithms provides an indication of the spatial extent of the distributed sources. In a recent comparison of brain source imaging approaches, the VB-SCCD algorithm has been shown to be one of the most promising algorithms among these methods. However, this technique suffers from several problems: it leads to amplitude-biased source estimates, it has difficulties in separating close sources, and it has a high computational complexity due to its implementation using second order cone programming. To overcome these problems, we propose to include an additional regularization term that imposes sparsity in the original source domain and to solve the resulting optimization problem using the alternating direction method of multipliers. Furthermore, we show that the algorithm yields more robust solutions by taking into account the temporal structure of the data. We also propose a new method to automatically threshold the estimated source distribution, which permits to delineate the active brain regions. The new algorithm, called Source Imaging based on Structured Sparsity (SISSY), is analyzed by means of realistic computer simulations and is validated on the clinical data of four patients. Copyright © 2017 Elsevier Inc. All rights reserved.
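To illustrate the role of a sparsity-inducing regularization term only (this is not the SISSY algorithm itself, which adds a spatial-gradient penalty and solves with ADMM), here is a toy L1-regularized source estimate solved by iterative soft thresholding, with an invented lead-field matrix:

```python
import numpy as np

def ista_l1(L, b, lam, n_iter=3000):
    """Iterative soft-thresholding for min 0.5*||L s - b||^2 + lam*||s||_1,
    a minimal stand-in for the sparsity term used in source imaging."""
    step = 1.0 / np.linalg.norm(L, 2) ** 2        # 1 / Lipschitz constant
    s = np.zeros(L.shape[1])
    for _ in range(n_iter):
        g = s - step * (L.T @ (L @ s - b))        # gradient step on the data fit
        s = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft threshold
    return s

rng = np.random.default_rng(3)
L = rng.standard_normal((32, 100))                # toy "lead field": 32 sensors, 100 sources
s_true = np.zeros(100)
s_true[10], s_true[50] = 2.0, -1.5                # two focal sources
b = L @ s_true + 0.01 * rng.standard_normal(32)
s_hat = ista_l1(L, b, lam=0.5)
```

The L1 term drives most of the 100 coefficients to exactly zero, recovering the two active sources from only 32 measurements; SISSY's extra structured term additionally shapes the spatial extent of each active patch.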
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gambino, Nadia, E-mail: gambinon@ethz.ch; Brandstätter, Markus; Rollinger, Bob
2014-09-15
In this work, a new diagnostic tool for laser-produced plasmas (LPPs) is presented. The detector is based on a multiple array of six motorized Langmuir probes. It measures the dynamics of an LPP in terms of charged-particle detection, with particular attention to droplet-based LPP sources for EUV lithography. The system design permits temporal resolution of the angular and radial plasma charge distribution and a hemispherical mapping of the ions and electrons around the droplet plasma. Understanding these dynamics is fundamental to improving debris mitigation techniques for droplet-based LPP sources. The device has been developed, built, and employed at the Laboratory for Energy Conversion, ETH Zürich. The experimental results have been obtained on the droplet-based LPP source ALPS II. For the first time, 2D mappings of the ion kinetic energy distribution around the droplet plasma have been obtained with an array of multiple Langmuir probes. These measurements show an anisotropic expansion of the ions in terms of kinetic energy and amount of ion charge around the droplet target. First estimates of the plasma density and electron temperature were also obtained from the analysis of the probe current signals.
Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C
2018-01-01
This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
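For context, a mono-exponential ADC map from two b-values can be computed as follows; the signal values below are invented, and real pipelines fit the chosen diffusion model voxel-wise over the full DWI series:

```python
import numpy as np

b0, b1 = 0.0, 1000.0                      # b-values in s/mm^2
true_adc = np.array([1.1e-3, 2.0e-3])     # plausible tissue / fluid ADCs, mm^2/s
S0 = np.array([1200.0, 900.0])            # hypothetical signals at b = b0
Sb = S0 * np.exp(-(b1 - b0) * true_adc)   # simulated signals at b = b1

# Two-point mono-exponential fit: S(b) = S0 * exp(-b * ADC)
# => ADC = ln(S0 / Sb) / (b1 - b0)
adc_map = np.log(S0 / Sb) / (b1 - b0)
```

The metadata the paper flags as essential (model type, b-values, ADC units and scale, source-image references) is exactly what makes a stored map like `adc_map` reproducible across sites.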
Singh, Nandita; Murari, Vishnu; Kumar, Manish; Barman, S C; Banerjee, Tirthankar
2017-04-01
Fine particulates (PM2.5) constitute the dominant proportion of airborne particulates and have often been associated with human health disorders, changes in regional climate, the hydrological cycle and, more recently, food security. Intrinsic properties of particulates are a direct function of their sources. This motivates a comprehensive review of PM2.5 sources over South Asia, which in turn may be valuable for developing emission control strategies. Particulate source apportionment (SA) through receptor models is one of the existing tools to quantify the contribution of particulate sources. A review of 51 SA studies was performed, of which 48 (94%) appeared within the span 2007-2016. Almost half of the SA studies (55%) were concentrated in a few typical urban stations (Delhi, Dhaka, Mumbai, Agra and Lahore). Due to the lack of local particulate source profiles and emission inventories, positive matrix factorization and principal component analysis (62% of studies) were the primary choices, followed by chemical mass balance (CMB, 18%). Metallic species were most regularly used as source tracers, while the use of organic molecular markers and gas-to-particle conversion was minimal. Across all SA sites, vehicular emissions (mean ± sd: 37 ± 20%) emerged as the most dominant PM2.5 source, followed by industrial emissions (23 ± 16%), secondary aerosols (22 ± 12%) and natural sources (20 ± 15%). Vehicular emissions (39 ± 24%) were also identified as the dominant source for highly polluted sites (PM2.5 > 100 μg m-3, n = 15), while site-specific influences of industrial, secondary aerosol and natural sources, alone or in combination, were recognized. Source-specific trends varied considerably in terms of region and seasonality. Both natural and industrial sources were most influential over Pakistan and Afghanistan, while over the Indo-Gangetic plain vehicular, natural and industrial emissions appeared dominant.
Vehicular emissions were the single dominant source over the southern part, while over Bangladesh vehicular, biomass burning and industrial sources were all significant. Copyright © 2016 Elsevier Ltd. All rights reserved.
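Receptor-model factorizations like PMF decompose a sample-by-species concentration matrix into nonnegative source contributions and source profiles. As a rough stand-in (plain NMF shares PMF's nonnegativity constraint but lacks its uncertainty weighting), with a fully synthetic dataset:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(4)
n_samples, n_species, n_sources = 120, 10, 3

# Synthetic concentrations: three invented source profiles mixed day by day.
profiles = rng.uniform(0.0, 1.0, size=(n_sources, n_species))      # species signatures
contributions = rng.uniform(0.0, 5.0, size=(n_samples, n_sources)) # daily strengths
X = contributions @ profiles + 0.01 * rng.uniform(0.0, 1.0, size=(n_samples, n_species))

# Nonnegative factorization X ~ G F as an illustrative receptor-model analogue.
model = NMF(n_components=n_sources, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)   # estimated source contributions per sample
F = model.components_        # estimated source profiles (species signatures)
```

In a real apportionment study, the recovered profiles F would then be matched against known source signatures (vehicular, industrial, crustal, etc.) to label each factor.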
Phase I of the Near Term Hybrid Passenger Vehicle Development Program. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1980-10-01
The results of Phase I of the Near-Term Hybrid Vehicle Program are summarized. This phase of the program was a study leading to the preliminary design of a 5-passenger hybrid vehicle utilizing two energy sources (electricity and gasoline/diesel fuel) to minimize petroleum usage on a fleet basis. This report presents the following: overall summary of the Phase I activity; summary of the individual tasks; summary of the hybrid vehicle design; summary of the alternative design options; summary of the computer simulations; summary of the economic analysis; summary of the maintenance and reliability considerations; summary of the design for crash safety; and bibliography.
Solar-powered irrigation systems. Technical progress report, July 1977--January 1978
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1978-02-28
Dispersed solar thermal power systems applied to farm irrigation energy needs are analyzed. The 17 western states, containing 84% of nationwide irrigated croplands and consuming 93% of nationwide irrigation energy, have been selected to determine where solar irrigation systems can compete most favorably with conventional energy sources. Financial analysis of farms, according to size and ownership, was accomplished to permit realistic comparative analyses of system lifetime costs. Market potential of optimized systems has been estimated for the 17-state region for near-term (1985) and intermediate-term (2000) applications. Technical, economic, and institutional factors bearing on penetration and capture of this market are being identified.
Long Term Hydrological (Radiological) Site Monitoring Data
Quality Data Asset includes all current and historical data on the quality of water with regard to the presence of water pollutants of all kinds regulated by the Clean Water Act. Under the new Interagency Agreement with the Department of Energy (DOE), the Radiation & Indoor Environments National Laboratory (R&IE), Office of Radiation and Indoor Air (ORIA), EPA, located in Las Vegas, NV, conducts a Long-Term Hydrological Monitoring Program (LTHMP) providing laboratory sampling/analysis and Quality Assurance and Control to measure radioactivity concentrations in the water sources near the sites of former underground nuclear explosions. The results of the LTHMP provide assurance that radioactive material from the tests has not migrated into water supplies.
Inflationary cosmology with Chaplygin gas in Palatini formalism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borowiec, Andrzej; Wojnar, Aneta; Stachowski, Aleksander
2016-01-01
We present a simple generalisation of the ΛCDM model which on the one hand reaches very good agreement with present-day experimental data and on the other hand provides an internal inflationary mechanism. It is based on Palatini modified gravity with a quadratic Starobinsky term and generalized Chaplygin gas as a matter source, providing, besides the current accelerated expansion, an epoch of endogenous inflation driven by a type III freeze singularity. It follows from our statistical analysis that astronomical data favour a negative value of the parameter coupling the quadratic term to the Einstein-Hilbert Lagrangian, and as a consequence a bounce instead of an initial Big-Bang singularity is preferred.
Evaluation and Cross-Comparison of Lexical Entities of Biological Interest (LexEBI)
Rebholz-Schuhmann, Dietrich; Kim, Jee-Hyub; Yan, Ying; Dixit, Abhishek; Friteyre, Caroline; Hoehndorf, Robert; Backofen, Rolf; Lewin, Ian
2013-01-01
Motivation Biomedical entities, their identifiers and names, are essential in the representation of biomedical facts and knowledge. In the same way, the complete set of biomedical and chemical terms, i.e. the biomedical "term space" (the "Lexeome"), forms a key resource to achieve the full integration of the scientific literature with biomedical data resources: any identified named entity can immediately be normalized to the correct database entry. This goal not only requires that we are aware of all existing terms, but would also profit from knowing all their senses and their semantic interpretation (ambiguities, nestedness). Result This study compiles a resource for lexical terms of biomedical interest in a standard format (called "LexEBI"), determines the overall number of terms, their reuse in different resources and the nestedness of terms. LexEBI comprises references for protein and gene entries and their term variants and chemical entities amongst other terms. In addition, disease terms have been identified from Medline and PubmedCentral and added to LexEBI. Our analysis demonstrates that the baseforms of terms from the different semantic types show little polysemous use. Nonetheless, the term variants of protein and gene names (PGNs) frequently contain species mentions, which should have been avoided according to protein annotation guidelines. Furthermore, the protein and gene entities, as well as the chemical entities, comprise enzymes, leading to hierarchical polysemy, and a large portion of PGNs make reference to a chemical entity. Altogether, according to our analysis based on the Medline distribution, 401,869 unique PGNs in the documents contain a reference to 25,022 chemical entities, 3,125 disease terms or 1,576 species mentions. Conclusion LexEBI delivers the complete biomedical and chemical Lexeome in a standardized representation (http://www.ebi.ac.uk/Rebholz-srv/LexEBI/).
The resource provides the disease terms as open source content, and fully interlinks terms across resources. PMID:24124474
Wu, Tsung-Jung; Schriml, Lynn M.; Chen, Qing-Rong; Colbert, Maureen; Crichton, Daniel J.; Finney, Richard; Hu, Ying; Kibbe, Warren A.; Kincaid, Heather; Meerzaman, Daoud; Mitraka, Elvira; Pan, Yang; Smith, Krista M.; Srivastava, Sudhir; Ward, Sari; Yan, Cheng; Mazumder, Raja
2015-01-01
Bio-ontologies provide terminologies for the scientific community to describe biomedical entities in a standardized manner. There are multiple initiatives that are developing biomedical terminologies for the purpose of providing better annotation, data integration and mining capabilities. Terminology resources devised for multiple purposes inherently diverge in content and structure. A major issue of biomedical data integration is the development of overlapping terms, ambiguous classifications and inconsistencies represented across databases and publications. The disease ontology (DO) was developed over the past decade to address data integration, standardization and annotation issues for human disease data. We have established a DO cancer project to be a focused view of cancer terms within the DO. The DO cancer project mapped 386 cancer terms from the Catalogue of Somatic Mutations in Cancer (COSMIC), The Cancer Genome Atlas (TCGA), International Cancer Genome Consortium, Therapeutically Applicable Research to Generate Effective Treatments, Integrative Oncogenomics and the Early Detection Research Network into a cohesive set of 187 DO terms represented by 63 top-level DO cancer terms. For example, the COSMIC term ‘kidney, NS, carcinoma, clear_cell_renal_cell_carcinoma’ and TCGA term ‘Kidney renal clear cell carcinoma’ were both grouped to the term ‘Disease Ontology Identification (DOID):4467 / renal clear cell carcinoma’ which was mapped to the TopNodes_DOcancerslim term ‘DOID:263 / kidney cancer’. Mapping of diverse cancer terms to DO and the use of top level terms (DO slims) will enable pan-cancer analysis across datasets generated from any of the cancer term sources where pan-cancer means including or relating to all or multiple types of cancer. The terms can be browsed from the DO web site (http://www.disease-ontology.org) and downloaded from the DO’s Apache Subversion or GitHub repositories. Database URL: http://www.disease-ontology.org PMID:25841438
ERIC Educational Resources Information Center
Mitchell, Karen J.; Raye, Carol L.; McGuire, Joseph T.; Frankel, Hillary; Greene, Erich J.; Johnson, Marcia K.
2008-01-01
A short-term source monitoring procedure with functional magnetic resonance imaging assessed neural activity when participants made judgments about the format of 1 of 4 studied items (picture, word), the encoding task performed (cost, place), or whether an item was old or new. The results support findings from long-term memory studies showing that…
Evaluation of Long-term Performance of Enhanced Anaerobic Source Zone Bioremediation using mass flux
NASA Astrophysics Data System (ADS)
Haluska, A.; Cho, J.; Hatzinger, P.; Annable, M. D.
2017-12-01
Chlorinated ethene DNAPL source zones in groundwater act as potential long-term sources of contamination as they dissolve, yielding concentrations well above MCLs and posing an ongoing public health risk. Enhanced bioremediation has been applied to treat many source zones with significant promise, but the long-term sustainability of this technology has not been thoroughly assessed. This study evaluated the long-term effectiveness of enhanced anaerobic source zone bioremediation at chloroethene-contaminated sites to determine if the treatment prevented contaminant rebound and removed NAPL from the source zone. Long-term performance was evaluated based on achieving MCL-based contaminant mass fluxes in parent compound concentrations during different monitoring periods. Groundwater concentration versus time data were compiled for six sites, and post-remedial contaminant mass flux data were then measured using passive flux meters at wells both within and down-gradient of the source zone. Post-remedial mass flux data were then combined with pre-remedial water quality data to estimate pre-remedial mass flux. This information was used to characterize a DNAPL dissolution source strength function, such as the Power Law Model and the Equilibrium Streamtube Model. The six sites characterized for this study were (1) Former Charleston Air Force Base, Charleston, SC; (2) Dover Air Force Base, Dover, DE; (3) Treasure Island Naval Station, San Francisco, CA; (4) Former Raritan Arsenal, Edison, NJ; (5) Naval Air Station, Jacksonville, FL; and (6) Former Naval Air Station, Alameda, CA. Contaminant mass fluxes decreased for all the sites by the end of the post-treatment monitoring period, and rebound was limited within the source zone. Post-remedial source strength function estimates suggest that decreases in contaminant mass flux will continue to occur at these sites, but a mass flux based on MCL levels may never be achieved. 
Thus, site clean-up goals should be evaluated as order-of-magnitude reductions. Additionally, sites may require monitoring for a minimum of 5 years in order to sufficiently evaluate remedial performance. The study shows that enhanced anaerobic source zone bioremediation contributed to a modest reduction of source zone contaminant mass discharge and appears to have mitigated rebound of chlorinated ethenes.
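The source strength functions named above relate the flux-averaged source concentration to the remaining DNAPL mass. A minimal sketch of the Power Law Model form, C(t)/C0 = (M(t)/M0)^Γ, with purely illustrative numbers (none taken from the six study sites):

```python
# Power-law source-depletion relation: C(t)/C0 = (M(t)/M0)**gamma.
# All numeric values below are hypothetical, for illustration only.

def power_law_concentration(c0, m_frac, gamma):
    """Flux-averaged source concentration after depletion, where
    m_frac = M(t)/M0 is the remaining source-mass fraction."""
    return c0 * m_frac ** gamma

# Example: initial concentration 10 mg/L, 90% of source mass removed
# by remediation, gamma = 1 (linear mass-flux relationship).
c = power_law_concentration(10.0, 0.1, 1.0)
print(c)  # 1.0 mg/L: a 90% mass reduction gives a 90% flux reduction
```

With Γ > 1 the flux drops faster than mass early on; with Γ < 1 concentrations stay high until most mass is gone, which is one way rebound-prone sites are characterized.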
Is amplitude loss of sonic waveforms due to intrinsic attenuation or source coupling to the medium?
Lee, Myung W.
2006-01-01
Sonic waveforms acquired in gas-hydrate-bearing sediments indicate strong amplitude loss associated with an increase in sonic velocity. Because the gas hydrate increases sonic velocities, the amplitude loss has been interpreted as due to intrinsic attenuation caused by the gas hydrate in the pore space, which apparently contradicts conventional wave propagation theory. For a sonic source in a fluid-filled borehole, the signal amplitude transmitted into the formation depends on the physical properties of the formation, including any pore contents, in the immediate vicinity of the source. A signal in acoustically fast material, such as gas-hydrate-bearing sediments, has a smaller amplitude than a signal in acoustically slower material. Therefore, it is reasonable to interpret the amplitude loss in the gas-hydrate-bearing sediments in terms of source coupling to the surrounding medium as well as intrinsic attenuation. An analysis of sonic waveforms measured at the Mallik 5L-38 well, Northwest Territories, Canada, indicates that a significant part of the sonic waveform's amplitude loss is due to a source-coupling effect. All amplitude analyses of sonic waveforms should include the effect of source coupling in order to accurately characterize the formation's intrinsic attenuation.
NASA thesaurus. Volume 3: Definitions
NASA Technical Reports Server (NTRS)
1988-01-01
Publication of NASA Thesaurus definitions began with Supplement 1 to the 1985 NASA Thesaurus. The definitions given here represent the complete file of over 3,200 definitions, complemented by nearly 1,000 use references. Definitions of more common or general scientific terms are given a NASA slant if one exists. Certain terms are not defined as a matter of policy: common names, chemical elements, specific models of computers, and nontechnical terms. The NASA Thesaurus predates by a number of years the systematic effort to define terms; therefore, not all Thesaurus terms have been defined. Nevertheless, definitions of older terms are continually being added. The following data are provided for each entry: term in uppercase/lowercase form, definition, source, and year the term (not the definition) was added to the NASA Thesaurus. The NASA History Office is the authority for capitalization in satellite and spacecraft names. Definitions with no source given were constructed by lexicographers at the NASA Scientific and Technical Information (STI) Facility, who rely on the following sources for their information: experts in the field, literature searches from the NASA STI database, and specialized references.
NASA Astrophysics Data System (ADS)
Twohig, Sarah; Pattison, Ian; Sander, Graham
2017-04-01
Fine sediment poses a significant threat to UK river systems in terms of vegetation, aquatic habitats and morphology. Deposition of fine sediment onto the river bed reduces channel capacity, resulting in decreased volume to contain high flow events. Once the in-channel problem has been identified, managers are under pressure to sustainably mitigate flood risk. With climate change and land use adaptations increasing future pressures on river catchments, it is important to consider the connectivity of fine sediment throughout the river catchment and its influence on channel capacity, particularly in systems experiencing long-term aggradation. Fine sediment erosion is a continuing concern in the River Eye, Leicestershire. The predominantly rural catchment has a history of flooding within the town of Melton Mowbray. Fine sediment from agricultural fields has been identified as a major contributor of sediment delivery into the channel. Current mitigation measures are not sustainable or successful in preventing the continued transfer of sediment through the catchment. Identifying the potential sources and connections of fine sediment would provide insight into targeted catchment management. 'Sensitive Catchment Integrated Modelling Analysis Platforms' (SCIMAP) is a tool often used by UK catchment managers to identify potential sources and routes of sediment within a catchment. SCIMAP is a risk-based model that combines hydrological (rainfall) and geomorphic controls (slope, land cover) to identify the risk of fine sediment being transported from source into the channel. A desktop version of SCIMAP was run for the River Eye at a catchment scale using 5 m terrain, rainfall and land cover data. A series of SCIMAP model runs were conducted, changing individual parameters to determine the sensitivity of the model. Climate change prediction data for the catchment were used to identify potential areas of future connectivity and erosion risk for catchment managers. 
The results have been subjected to field validation as part of a wider research project which provides an indication of the robustness of widespread models as effective management tools.
Numerical Analysis of 2-D and 3-D MHD Flows Relevant to Fusion Applications
Khodak, Andrei
2017-08-21
Here, the analysis of many fusion applications such as liquid-metal blankets requires application of computational fluid dynamics (CFD) methods for electrically conductive liquids in geometrically complex regions and in the presence of a strong magnetic field. A current state-of-the-art general purpose CFD code allows modeling of the flow in complex geometric regions, with simultaneous conjugated heat transfer analysis in liquid and surrounding solid parts. Together with a magnetohydrodynamics (MHD) capability, the general purpose CFD code will be a valuable tool for the design and optimization of fusion devices. This paper describes an introduction of MHD capability into the general purpose CFD code CFX, part of the ANSYS Workbench. The code was adapted for MHD problems using a magnetic induction approach. CFX allows introduction of user-defined variables using transport or Poisson equations. For MHD adaptation of the code, three additional transport equations were introduced for the components of the magnetic field, in addition to the Poisson equation for electric potential. The Lorentz force is included in the momentum transport equation as a source term. Fusion applications usually involve very strong magnetic fields, with values of the Hartmann number of up to tens of thousands. In this situation the system of MHD equations becomes very stiff, with very large source terms and very strong variable gradients. To increase system robustness, special measures were introduced during the iterative convergence process, such as linearization using source coefficients for the momentum equations. The MHD implementation in the general purpose CFD code was tested against benchmarks specifically selected for liquid-metal blanket applications. Results of numerical simulations using the present implementation closely match analytical solutions for a Hartmann number of up to 1500 for a 2-D laminar flow in a duct of square cross section, with conducting and nonconducting walls. 
Results for a 3-D test case are also included.
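For orientation, the magnetic-induction formulation described in this abstract is commonly written as below. This is a generic textbook form under standard incompressible-MHD assumptions, not necessarily the exact equations implemented in CFX:

```latex
\begin{aligned}
\frac{\partial \mathbf{B}}{\partial t}
  &= \nabla \times (\mathbf{u} \times \mathbf{B})
     + \frac{1}{\mu_0 \sigma}\,\nabla^{2}\mathbf{B}
  && \text{(transport equations for the induced field components)}\\
\nabla^{2}\varphi
  &= \nabla \cdot (\mathbf{u} \times \mathbf{B})
  && \text{(Poisson equation for the electric potential)}\\
\mathbf{J} &= \frac{1}{\mu_0}\,\nabla \times \mathbf{B},
\qquad
\mathbf{F}_{L} = \mathbf{J} \times \mathbf{B}
  && \text{(Lorentz force source term in the momentum equation)}
\end{aligned}
```

The stiffness noted in the abstract comes from the Lorentz term: at Hartmann numbers of order 10^3-10^4 it dominates viscous and inertial terms by orders of magnitude, which motivates the source-coefficient linearization mentioned.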
NASA Astrophysics Data System (ADS)
Braban, Christine; Tang, Sim; Bealey, Bill; Roberts, Elin; Stephens, Amy; Galloway, Megan; Greenwood, Sarah; Sutton, Mark; Nemitz, Eiko; Leaver, David
2017-04-01
Ambient ammonia measurements have been undertaken both to understand sources and concentrations at background and vulnerable ecosystems and for long-term monitoring of concentrations. For a pollutant whose concentrations are projected to increase in the coming decades, with significant policy challenges to implementing mitigation strategies, it is useful to assess what has been measured, where, and why. In this study a review of the literature has shown that ammonia measurements are frequently not publicly reported and in general not deposited in open data centres available for research. The specific sectors where measurements have been undertaken are: agricultural point source assessments, agricultural surface exchange measurements, sensitive ecosystem monitoring, landscape/regional studies and governmental long-term monitoring. Less frequently, ammonia is measured as part of an intensive atmospheric chemistry field campaign. Technology is developing, which means a shift from chemical denuder methods to spectroscopic techniques may be possible; however, chemical denuder techniques with off-line laboratory analysis will likely remain an economical approach for some time to come. This paper reviews existing datasets from the different sectors of research and integrates them into a global picture to allow both a long-term understanding and comparison with future measurements.
On the inclusion of mass source terms in a single-relaxation-time lattice Boltzmann method
NASA Astrophysics Data System (ADS)
Aursjø, Olav; Jettestuen, Espen; Vinningland, Jan Ludvig; Hiorth, Aksel
2018-05-01
We present a lattice Boltzmann algorithm for incorporating a mass source in a fluid flow system. The proposed mass source/sink term, included in the lattice Boltzmann equation, maintains the Galilean invariance and the accuracy of the overall method, while introducing a mass source/sink term in the fluid dynamical equations. The method can, for instance, be used to inject or withdraw fluid from any preferred lattice node in a system. This suggests that injection and withdrawal of fluid does not have to be introduced through cumbersome, and sometimes less accurate, boundary conditions. The method also suggests that, through a chosen equation of state relating mass density to pressure, the proposed mass source term will render it possible to set a preferred pressure at any lattice node in a system. We demonstrate how this model handles injection and withdrawal of a fluid. And we show how it can be used to incorporate pressure boundaries. The accuracy of the algorithm is identified through a Chapman-Enskog expansion of the model and supported by the numerical simulations.
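As a rough illustration of the idea (not the authors' scheme, which adds correction terms so that Galilean invariance and overall accuracy are preserved), a naive D2Q9 BGK code can inject mass at a chosen node by distributing a source over the lattice weights:

```python
import numpy as np

# Minimal D2Q9 BGK lattice Boltzmann sketch with a naive per-node mass
# source distributed by the lattice weights. Illustration only: the paper's
# actual source term includes corrections this simple version lacks.

W = np.array([4/9] + [1/9]*4 + [1/36]*4)            # D2Q9 weights (sum to 1)
C = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])  # lattice velocities

def step(f, tau=1.0, source=0.0, src_node=(8, 8)):
    rho = f.sum(axis=-1)                             # density
    u = (f[..., None] * C).sum(axis=-2) / rho[..., None]  # velocity
    cu = u @ C.T
    usq = (u**2).sum(axis=-1, keepdims=True)
    feq = W * rho[..., None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)
    f = f - (f - feq) / tau                          # BGK collision
    f[src_node] += W * source                        # inject mass at one node
    for i, (cx, cy) in enumerate(C):                 # streaming (periodic)
        f[..., i] = np.roll(np.roll(f[..., i], cx, axis=0), cy, axis=1)
    return f

f = np.ones((16, 16, 9)) * W                         # uniform fluid, rho = 1
m0 = f.sum()
f = step(f, source=0.01)
print(round(f.sum() - m0, 6))                        # 0.01: mass added per step
```

Collision and streaming conserve mass exactly, so the total mass grows by exactly the injected amount each step, which is the behavior a source term replaces inflow boundary conditions with.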
Work Life Stress and Career Resilience of Licensed Nursing Facility Administrators.
Myers, Dennis R; Rogers, Rob; LeCrone, Harold H; Kelley, Katherine; Scott, Joel H
2018-04-01
Career resilience provided a frame for understanding how Licensed Nursing Facility Administrators (LNFAs) sustain role performance and even thrive in stressful skilled nursing facility work environments. Quantitative and qualitative analyses of in-depth interviews with 18 LNFAs, averaging 24 years of experience, were conducted by a five-member research team. Analysis was informed by evidence-based frameworks for career resilience in the health professions as well as the National Association of Long-Term Care Administrator Boards' (NAB) five domains of competent administrative practice. Findings included six sources of work stressors and six sources of professional satisfaction. Also, participants identified seven strategic principles and 10 administrative practices for addressing major sources of stress. Recommendations are provided for research and evidence-based application of the career resilience perspective to LNFA practice aimed at reducing role abandonment and energizing the delivery of the quality of care that each resident deserves.
NASA Astrophysics Data System (ADS)
Yuan, Zibing; Yadav, Varun; Turner, Jay R.; Louie, Peter K. K.; Lau, Alexis Kai Hon
2013-09-01
Despite extensive emission control measures targeting motor vehicles and to a lesser extent other sources, annual-average PM10 mass concentrations in Hong Kong have remained relatively constant for the past several years, and for some air quality metrics, such as the frequency of poor visibility days, conditions have degraded. The underlying drivers for these long-term trends were examined by performing source apportionment on eleven years (1998-2008) of data for seven monitoring sites in the Hong Kong PM10 chemical speciation network. Nine factors were resolved using Positive Matrix Factorization. These factors were assigned to emission source categories that were classified as local (operationally defined as within the Hong Kong Special Administrative Region) or non-local based on temporal and spatial patterns in the source contribution estimates. This data-driven analysis provides strong evidence that local controls on motor vehicle emissions have been effective in reducing motor vehicle-related ambient PM10 burdens, with annual-average contributions at neighborhood- and larger-scale monitoring stations decreasing by ~6 μg m-3 over the eleven-year period. However, this improvement has been offset by an increase in annual-average contributions from non-local sources, especially secondary sulfate and nitrate, of ~8 μg m-3 over the same time period. As a result, non-local source contributions to urban-scale PM10 have increased from 58% in 1998 to 70% in 2008. Most of the motor vehicle-related decrease and non-local source driven increase occurred over the period 1998-2004, with more modest changes thereafter. Non-local contributions increased most dramatically for secondary sulfate and secondary nitrate factors, and thus combustion-related control strategies, including but not limited to power plants, are needed for sources located in the Pearl River Delta and more distant regions to improve air quality conditions in Hong Kong. 
PMF-resolved source contribution estimates were also used to examine differential contributions of emission source categories during high PM episodes compared to study-average behavior. While contributions from all source categories increased to some extent on high PM days, the increases were disproportionately high for the non-local sources. Thus, controls on emission sources located outside the Hong Kong Special Administrative Region will be needed to effectively decrease the frequency and severity of high PM episodes.
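The factor-analytic model underlying PMF is X ≈ GF with nonnegative factors, where rows of F are source chemical profiles and G holds per-sample source contributions. The toy sketch below uses plain multiplicative-update NMF on synthetic data as a stand-in; real PMF additionally weights residuals by measurement uncertainty (and is usually run with dedicated tools such as EPA PMF), and the two "profiles" here are invented:

```python
import numpy as np

# Toy stand-in for the PMF factor model X ~ G @ F using unweighted NMF
# (Lee-Seung multiplicative updates). Profiles and contributions invented.

rng = np.random.default_rng(0)
F_true = np.array([[0.7, 0.2, 0.1],       # hypothetical "vehicle" profile
                   [0.1, 0.1, 0.8]])      # hypothetical "sulfate" profile
G_true = rng.uniform(0.5, 2.0, (200, 2))  # daily source contributions
X = G_true @ F_true                        # synthetic speciated PM data

G = rng.uniform(size=(200, 2))
F = rng.uniform(size=(2, 3))
for _ in range(2000):                      # multiplicative updates
    G *= (X @ F.T) / (G @ F @ F.T + 1e-12)
    F *= (G.T @ X) / (G.T @ G @ F + 1e-12)

residual = float(np.abs(X - G @ F).max())
print(residual)                            # small if the updates converged
```

The nonnegativity constraint is what lets the resolved factors be read as physical source categories, which is the step the abstract describes when assigning factors to local versus non-local sources.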
Yo, Chia-Hung; Lee, Si-Huei; Chang, Shy-Shin; Lee, Matthew Chien-Hung; Lee, Chien-Chang
2014-01-01
Objectives We performed a systematic review and meta-analysis of studies on high-sensitivity C-reactive protein (hs-CRP) assays to see whether these tests are predictive of atrial fibrillation (AF) recurrence after cardioversion. Design Systematic review and meta-analysis. Data sources PubMed, EMBASE and Cochrane databases as well as a hand search of the reference lists in the retrieved articles from inception to December 2013. Study eligibility criteria This review selected observational studies in which the measurements of serum CRP were used to predict AF recurrence. An hs-CRP assay was defined as any CRP test capable of measuring serum CRP to below 0.6 mg/dL. Primary and secondary outcome measures We summarised test performance characteristics with the use of forest plots, hierarchical summary receiver operating characteristic curves and bivariate random effects models. Meta-regression analysis was performed to explore the source of heterogeneity. Results We included nine qualifying studies comprising a total of 347 patients with AF recurrence and 335 controls. A CRP level higher than the optimal cut-off point was an independent predictor of AF recurrence after cardioversion (summary adjusted OR: 3.33; 95% CI 2.10 to 5.28). The estimated pooled sensitivity and specificity for hs-CRP was 71.0% (95% CI 63% to 78%) and 72.0% (61% to 81%), respectively. Most studies used a CRP cut-off point of 1.9 mg/L to predict long-term AF recurrence (77% sensitivity, 65% specificity), and 3 mg/L to predict short-term AF recurrence (73% sensitivity, 71% specificity). Conclusions hs-CRP assays are moderately accurate in predicting AF recurrence after successful cardioversion. PMID:24556243
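For intuition, pooling proportions such as per-study sensitivities is often done by inverse-variance weighting on the logit scale. The sketch below uses that simpler fixed-effect approach with hypothetical study data; it is not the bivariate random-effects model used in the review, which models sensitivity and specificity jointly:

```python
import math

# Fixed-effect pooling of proportions on the logit scale. Study inputs
# below are hypothetical, not taken from the nine included studies.

def pool_logit(props, ns):
    """Inverse-variance weighted mean of proportions on the logit scale."""
    num = den = 0.0
    for p, n in zip(props, ns):
        logit = math.log(p / (1 - p))
        var = 1.0 / (n * p * (1 - p))   # approximate variance of the logit
        num += logit / var
        den += 1.0 / var
    pooled = num / den
    return 1.0 / (1.0 + math.exp(-pooled))  # back-transform to a proportion

# Three hypothetical studies: sensitivities and numbers of AF-recurrence cases.
sens = pool_logit([0.68, 0.74, 0.71], [40, 55, 35])
print(round(sens, 3))
```

Random-effects variants add a between-study variance term to each weight, widening the interval when studies disagree, which matters here given the heterogeneity explored by the meta-regression.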
Decision Analysis Tools for Volcano Observatories
NASA Astrophysics Data System (ADS)
Hincks, T. H.; Aspinall, W.; Woo, G.
2005-12-01
Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
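The event-tree logic behind such tools reduces, at its simplest, to multiplying conditional probabilities along a branch path to get a scenario probability. A minimal sketch with invented nodes and numbers (real tools like those in EXPLORIS elicit these from expert judgment and monitoring data):

```python
# Minimal volcanic event-tree sketch: scenario probability is the product
# of conditional branch probabilities. All nodes and values are invented.

tree = {
    "unrest": 0.5,             # P(unrest in the assessment period)
    "magmatic|unrest": 0.4,    # conditional branch probabilities
    "eruption|magmatic": 0.3,
    "explosive|eruption": 0.6,
}

def scenario_probability(*branches):
    p = 1.0
    for b in branches:
        p *= tree[b]
    return p

p_explosive = scenario_probability("unrest", "magmatic|unrest",
                                   "eruption|magmatic", "explosive|eruption")
print(round(p_explosive, 3))   # 0.5 * 0.4 * 0.3 * 0.6 = 0.036
```

Attaching loss estimates to terminal branches turns the same structure into the probabilistic risk assessment described for exclusion-zone planning.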
Source apportionment of airborne particulates through receptor modeling: Indian scenario
NASA Astrophysics Data System (ADS)
Banerjee, Tirthankar; Murari, Vishnu; Kumar, Manish; Raju, M. P.
2015-10-01
Airborne particulate chemistry is mostly governed by the associated sources, and apportionment of specific sources is essential to delineate explicit control strategies. The present submission initially deals with publications (1980s-2010s) of Indian origin which report regional heterogeneities of particulate concentrations with reference to associated species. Such meta-analyses clearly indicate the presence of reservoirs of both primary and secondary aerosols in different geographical regions. Further, identification of specific signatory molecules for each individual source category was also evaluated in terms of scientific merit and repeatability. Source signatures mostly resemble international profiles, while in selected cases they lack appropriateness. In India, source apportionment (SA) of airborne particulates was initiated in 1985 through factor analysis; however, principal component analysis (PCA) shares a major proportion of applications (34%), followed by enrichment factor (EF, 27%), chemical mass balance (CMB, 15%) and positive matrix factorization (PMF, 9%). Mainstream SA analyses identify earth crust and road dust resuspension (traced by Al, Ca, Fe, Na and Mg) as a principal source (6-73%), followed by vehicular emissions (traced by Fe, Cu, Pb, Cr, Ni, Mn, Ba and Zn; 5-65%), industrial emissions (traced by Co, Cr, Zn, V, Ni, Mn, Cd; 0-60%), fuel combustion (traced by K, NH4+, SO4-, As, Te, S, Mn; 4-42%), marine aerosols (traced by Na, Mg, K; 0-15%) and biomass/refuse burning (traced by Cd, V, K, Cr, As, TC, Na, K, NH4+, NO3-, OC; 1-42%). In most cases, temporal variations of individual source contributions for a specific geographic region exhibit radical heterogeneity, possibly due to unscientific orientation of individual tracers for a specific source, exaggerated by methodological weaknesses, inappropriate sample sizes, implications of secondary aerosols and inadequate emission inventories. 
Conclusively, a number of challenging issues and specific recommendations have been included which need to be considered for a scientific apportionment of particulate sources in different geographical regions of India.
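Of the methods listed, the enrichment factor is the simplest to state: an element X is flagged as non-crustal when its concentration ratio to a crustal reference element (usually Al) in the aerosol greatly exceeds the same ratio in average crust. A sketch with order-of-magnitude crustal abundances and an invented sample:

```python
# Crustal enrichment factor: EF(X) = (X/Al)_sample / (X/Al)_crust.
# EF >> 1 indicates a non-crustal (e.g. anthropogenic) source.
# The sample values and abundances below are illustrative only.

def enrichment_factor(x_sample, ref_sample, x_crust, ref_crust):
    return (x_sample / ref_sample) / (x_crust / ref_crust)

# Hypothetical aerosol sample: Pb 50 ng/m3, Al 2000 ng/m3;
# order-of-magnitude crustal abundances: Pb ~17 ppm, Al ~80,000 ppm.
ef_pb = enrichment_factor(50, 2000, 17, 80000)
print(round(ef_pb, 1))  # ~117.6: strongly enriched, consistent with non-crustal Pb
```

A common rule of thumb treats EF below about 10 as crustal and well above 10 as enriched, though cutoffs vary between studies, which is one source of the inter-study heterogeneity the abstract notes.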
An atlas of thermal data for biomass and other fuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaur, S.; Reed, T.B.
1995-06-01
Biomass is recognized as a major source of renewable energy. In order to convert biomass energy to more useful forms, it is necessary to have accurate scientific data on the thermal properties of biomass. This Atlas has been written to supply a uniform source of that information. In the last few decades, thermal analysis (TA) tools such as thermogravimetry, differential thermal analysis, thermomechanical analysis, etc. have become more important. The data obtained from these techniques can provide useful information in terms of reaction mechanism, kinetic parameters, thermal stability, phase transformation, heat of reaction, etc. for gas-solid and gas-liquid systems. Unfortunately, there are no ASTM standards set for the collection of these types of data using TA techniques; therefore, different investigators use different conditions which suit their requirements for measuring this thermal data. As a result, the information obtained from different laboratories is not comparable. This Atlas provides the ability to compare new laboratory results with a wide variety of related data available in the literature and helps ensure consistency in using these data.
Substance flow analysis as a tool for urban water management.
Chèvre, N; Guignard, C; Rossi, L; Pfeifer, H-R; Bader, H-P; Scheidegger, R
2011-01-01
Human activity results in the production of a wide range of pollutants that can enter the water cycle through stormwater or wastewater. Among others, heavy metals are still detected in high concentrations around urban areas and their impact on aquatic organisms is of major concern. In this study, we propose to use a substance flow analysis as a tool for heavy metals management in urban areas. We illustrate the approach with the case of copper in Lausanne, Switzerland. The results show that around 1,500 kg of copper enter the aquatic compartment yearly. This amount contributes to sediment enrichment, which may pose a long-term risk for benthic organisms. The major sources of copper in receiving surface water are roofs and catenaries of trolleybuses. They represent 75% of the total input of copper into the urban water system. Actions to reduce copper pollution should therefore focus on these sources. Substance flow analysis also highlights that copper enters surface water mainly during rain events, i.e., without passing through any treatment procedure. A reduction in pollution could also be achieved by improving stormwater management. In conclusion, the study showed that substance flow analysis is a very effective tool for sustainable urban water management.
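At its core, a substance flow analysis is a bookkeeping of source-resolved mass flows through a defined system boundary. The sketch below reproduces the reported ~75% share of roofs plus trolleybus catenaries and the ~1,500 kg/yr total using hypothetical per-pathway values (the individual splits are invented, only the total scale and combined share follow the abstract):

```python
# Toy substance flow analysis balance: sum copper inputs to the urban water
# system by pathway and compute each pathway's share. Individual flow values
# are hypothetical; the 1,500 kg/yr scale and 75% combined share are from
# the abstract above.

flows_kg_per_year = {
    "roofs": 800,
    "trolleybus_catenaries": 325,
    "brake_wear": 225,
    "other": 150,
}

total = sum(flows_kg_per_year.values())
shares = {k: v / total for k, v in flows_kg_per_year.items()}

print(total)                                                        # 1500
print(round(shares["roofs"] + shares["trolleybus_catenaries"], 2))  # 0.75
```

Ranking pathways by share is precisely how such an analysis points mitigation at roofs and catenaries rather than, say, wastewater treatment upgrades.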
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferreira, Eduardo G.A.; Marumo, Julio T.; Vicente, Roberto
2012-07-01
Portland cement materials are widely used as engineered barriers in repositories for radioactive waste. The capacity of such barriers to prevent the disposed radionuclides from entering the biosphere in the long term depends on the service life of those materials. Thus, the performance assessment of structural materials under a series of environmental conditions prevailing at the environs of repositories is a matter of interest. The durability of cement paste foreseen as backfill in a deep borehole for disposal of disused sealed radioactive sources is investigated in the development of the repository concept. Results are intended to be part of the body of evidence in the safety case of the proposed disposal technology. This paper presents the results of X-Ray Diffraction (XRD) analysis of cement paste exposed to varying temperatures and simulated groundwater after samples received the radiation dose that the cement paste will accumulate until complete decay of the radioactive sources. The XRD analysis of cement paste samples realized in this work allowed observing some differences in the results of cement paste specimens that were submitted to different treatments. The cluster analysis of results was able to group tested samples according to the applied treatments. Mineralogical differences, however, are tenuous and, apart from ettringite, are hardly observed. The absence of ettringite in the seven specimens that were kept in dry storage at high temperature could hardly have occurred through natural variations in the composition of hydrated cement paste, because ettringite is observed in all tested specimens except those seven. Therefore this absence is certainly the result of the treatments and could be explained by the decomposition of ettringite. Although the decomposition temperature of ettringite is about 110-120 deg. C, it may initially decompose to meta-ettringite, an amorphous compound, above 50 deg. C in the absence of water. 
No influence of irradiation on the mineralogical composition was observed, whether the treatment was analyzed individually or in possible synergy with the other treatments. However, the radiation dose to which the specimens were exposed is only a fraction of the dose that the cement paste will accumulate until complete decay of some sources. Therefore, in the short term, the conditions deemed to prevail in the repository environment may not influence the properties of cement paste at detectable levels. Under the conditions presented in this work, it is not possible to predict the long-term evolution of these properties. (authors)
Erratum to: Surface-wave Green's tensors in the near field
Haney, Matthew M.; Nakahara, Hisashi
2016-01-01
Haney and Nakahara (2014) derived expressions for surface‐wave Green’s tensors that included near‐field behavior. Building on the result for a force source, Haney and Nakahara (2014) further derived expressions for a general point moment tensor source using the exact Green’s tensors. However, it has come to our attention that, although the Green’s tensors were correct, the resulting expressions for a general point moment tensor source were missing some terms. In this erratum, we provide updated expressions with these missing terms. The inclusion of the missing terms changes the example given in Haney and Nakahara (2014).
Flowsheets and source terms for radioactive waste projections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forsberg, C.W.
1985-03-01
Flowsheets and source terms used to generate radioactive waste projections in the Integrated Data Base (IDB) Program are given. Volumes of each waste type generated per unit product throughput have been determined for the following facilities: uranium mining, UF6 conversion, uranium enrichment, fuel fabrication, boiling-water reactors (BWRs), pressurized-water reactors (PWRs), and fuel reprocessing. Source terms for DOE/defense wastes have been developed. Expected wastes from typical decommissioning operations for each facility type have been determined. All wastes are also characterized by isotopic composition at time of generation and by general chemical composition. 70 references, 21 figures, 53 tables.
The importance of geospatial data to calculate the optimal distribution of renewable energies
NASA Astrophysics Data System (ADS)
Díaz, Paula; Masó, Joan
2013-04-01
Especially during the last three years, renewable energies have been revolutionizing international trade while geographically diversifying markets. Renewables are experiencing rapid growth in power generation. According to REN21 (2012), during the last six years the total installed renewables capacity grew at record rates. In 2011, the EU raised its share of global new renewables capacity to 44%. The BRICS nations (Brazil, Russia, India and China) accounted for about 26% of the global total. Moreover, almost twenty countries in the Middle East, North Africa, and sub-Saharan Africa currently have active markets in renewables. Energy return ratios are commonly used to quantify the efficiency of traditional energy sources. The Energy Return On Investment (EROI) compares the energy returned by a given source with the energy used to obtain it (explore, find, develop, produce, extract, transform, harvest, grow, process, etc.). These energy return ratios have demonstrated a general decrease in the efficiency of fossil fuels and gas. Considering the limits on the quantity of energy produced by some sources, the energy invested to obtain them, and the difficulty of finding optimal locations for renewables farms (e.g. due to an ever-increasing scarcity of appropriate land), the EROI becomes relevant to renewables as well. A spatialized EROI, which uses variables with spatial distribution, enables identifying the optimal locations in terms of both energy production and associated costs. It is important to note that the spatialized EROI can be mathematically formalized and calculated the same way for different locations in a reproducible manner. This means that, once a concrete EROI methodology is established, it is possible to generate a continuous map that highlights the most productive zones for renewable energies in terms of maximum energy return at minimum cost.
Relevant variables for calculating the real energy invested are the grid connections between production and consumption, transportation losses, and the efficiency of the grid. If appropriate, the spatialized EROI analysis could include any indirect costs that the energy source might produce, such as visual impacts, food market impacts, and land price. Such a spatialized study requires GIS tools to compute operations using both spatial relations, like distances and frictions, and topological relations, like connectivity, which are not easy to consider in the way EROI is currently calculated. In a broader perspective, by applying the EROI to various energy sources, a quantitative comparative analysis of the efficiency of obtaining different sources can be carried out. The increase in energy investment is also accompanied by growth in manufacturing and policy activity. Further efforts will be necessary in the coming years to provide energy access through smart grids and to determine the most efficient areas in terms of cost of production and energy returned on investment. The authors present the EROI as a reliable way to address the input-output energy relationship and increase the efficiency of energy investment by considering the appropriate geospatial variables. The spatialized EROI can be a useful tool for decision makers when designing energy policies and programming energy funds, because it is an objective demonstration of which energy sources are more convenient in terms of costs and efficiency.
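The spatialized EROI described above amounts to a cell-by-cell ratio over a grid. A minimal sketch, with all energy figures invented for illustration:

```python
import numpy as np

# Toy 2x2 grid. Each cell holds the lifetime energy a hypothetical plant
# sited there would return and the energy invested to build, connect and
# operate it. All numbers are invented (units: GWh); a real study would
# derive them from GIS layers (resource maps, grid distance, losses).
energy_returned = np.array([[120.0, 80.0],
                            [ 60.0, 40.0]])
energy_invested = np.array([[ 10.0, 10.0],
                            [ 12.0, 20.0]])

# Spatialized EROI: the return/investment ratio computed per cell,
# yielding a continuous map whose maximum marks the best location.
eroi_map = energy_returned / energy_invested
best_cell = np.unravel_index(np.argmax(eroi_map), eroi_map.shape)
```

Because the same formula is evaluated identically in every cell, the resulting map is reproducible in the sense the abstract emphasizes: re-running it on the same input layers always highlights the same zones.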
Cruz Minguillón, María; Querol, Xavier; Alastuey, Andrés; Monfort, Eliseo; Vicente Miró, José
2007-10-01
Principal component analysis (PCA) coupled with a multilinear regression analysis (MLRA) was applied to PM10 speciation data series (2002-2005) from four sampling sites in a highly industrialised area (ceramic production) in the process of implementing emission abatement technology. Five common factors with similar chemical profiles were identified at all the sites: mineral, regional background (influenced by the industrial estate located on the coast: an oil refinery and a power plant), sea spray, industrial 1 (manufacture and use of glaze components, including frit fusion) and road traffic. The contribution of the regional background differs slightly from site to site. The mineral factor, attributed to the sum of several sources (mainly the ceramic industry, but also with minor contributions from soil resuspension and African dust outbreaks), contributes between 9 and 11 µg m-3 at all the sites. Source industrial 1 entails an increase in PM10 levels of between 4 and 5 µg m-3 at the urban sites and 2 µg m-3 at the suburban background site. However, after 2004, this source contributed less than 2 µg m-3 at most sites, whereas the remaining sources did not show an upward or downward trend over the study period. This gradual decrease in the contribution of source industrial 1 coincides with the implementation of PM abatement technology in the frit fusion kilns of the area. This relationship enables us to assess the efficiency of the implementation of environmental technologies in terms of their impact on air quality.
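The PCA-plus-MLRA receptor-modeling scheme can be sketched in two steps: extract common factors from the speciation matrix, then regress total PM on the factor scores so the coefficients convert unitless scores into mass contributions. The data below are synthetic, and scikit-learn is used for convenience; it is not the software the authors used.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Synthetic PM10 speciation matrix: 200 daily samples x 6 chemical species.
X = rng.normal(size=(200, 6))
# Synthetic total PM10 built from an invented source profile plus a
# 30 microg/m3 background and measurement noise.
pm10 = X @ np.array([2.0, 1.0, 0.5, 0.0, 0.0, 0.0]) + 30 + rng.normal(scale=0.1, size=200)

# Step 1 (PCA): identify common factors in the speciation data.
scores = PCA(n_components=3).fit_transform(X)

# Step 2 (MLRA): regress total PM10 on the factor scores; the slope for
# each factor is its average mass contribution per unit score, and the
# intercept absorbs the mean level not apportioned to any factor.
reg = LinearRegression().fit(scores, pm10)
contributions = reg.coef_
background = reg.intercept_
```

Because PCA scores are mean-centered, the fitted intercept equals the mean PM10 level, which is why the intercept is often interpreted as an unapportioned background in this kind of analysis.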
Musicians have better memory than nonmusicians: A meta-analysis.
Talamini, Francesca; Altoè, Gianmarco; Carretti, Barbara; Grassi, Massimo
2017-01-01
Several studies have found that musicians perform better than nonmusicians in memory tasks, but this is not always the case, and the strength of this apparent advantage is unknown. Here, we conducted a meta-analysis with the aim of clarifying whether musicians perform better than nonmusicians in memory tasks. Education Source; PEP (WEB)-Psychoanalytic Electronic Publishing; Psychology and Behavioral Science (EBSCO); PsycINFO (Ovid); PubMed; ScienceDirect-AllBooks Content (Elsevier API); SCOPUS (Elsevier API); SocINDEX with Full Text (EBSCO) and Google Scholar were searched for eligible studies. The selected studies involved two groups of participants: young adult musicians and nonmusicians. All the studies included memory tasks (loading long-term, short-term or working memory) that contained tonal, verbal or visuospatial stimuli. Three meta-analyses were run separately for long-term memory, short-term memory and working memory. We collected 29 studies, including 53 memory tasks. The results showed that musicians performed better than nonmusicians in terms of long-term memory, g = .29, 95% CI (.08-.51), short-term memory, g = .57, 95% CI (.41-.73), and working memory, g = .56, 95% CI (.33-.80). To further explore the data, we included a moderator (the type of stimulus presented, i.e., tonal, verbal or visuospatial), which was found to influence the effect size for short-term and working memory, but not for long-term memory. In terms of short-term and working memory, the musicians' advantage was large with tonal stimuli, moderate with verbal stimuli, and small or null with visuospatial stimuli. The three meta-analyses revealed a small effect size for long-term memory, and a medium effect size for short-term and working memory, suggesting that musicians perform better than nonmusicians in memory tasks. Moreover, the effect of the moderator suggested that the type of stimulus influences this advantage.
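The effect sizes reported above are Hedges' g, a standardized mean difference with a small-sample correction. It can be computed from group summary statistics; the means, SDs and sample sizes below are hypothetical, not values from any study in the meta-analysis.

```python
import math

def hedges_g(mean1, mean2, sd1, sd2, n1, n2):
    """Hedges' g: Cohen's d rescaled by the small-sample correction J."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd           # Cohen's d
    correction = 1 - 3 / (4 * (n1 + n2) - 9)  # correction factor J
    return d * correction

# Hypothetical example: musicians vs. nonmusicians on a digit-span task
# (invented numbers for illustration only).
g = hedges_g(mean1=7.2, mean2=6.5, sd1=1.1, sd2=1.3, n1=30, n2=30)
```

Values near .3 are conventionally read as small and values near .5-.6 as medium, which matches the abstract's description of the long-term versus short-term/working-memory results.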
Hirsch, Robert M.; De Cicco, Laura A.
2015-01-01
Evaluating long-term changes in river conditions (water quality and discharge) is an important use of hydrologic data. To carry out such evaluations, the hydrologist needs tools to facilitate several key steps in the process: acquiring the data records from a variety of sources, structuring it in ways that facilitate the analysis, processing the data with routines that extract information about changes that may be happening, and displaying findings with graphical techniques. A pair of tightly linked R packages, called dataRetrieval and EGRET (Exploration and Graphics for RivEr Trends), have been developed for carrying out each of these steps in an integrated manner. They are designed to easily accept data from three sources: U.S. Geological Survey hydrologic data, U.S. Environmental Protection Agency (EPA) STORET data, and user-supplied flat files. The dataRetrieval package not only serves as a “front end” to the EGRET package, it can also be used to easily download many types of hydrologic data and organize it in ways that facilitate many other hydrologic applications. The EGRET package has components oriented towards the description of long-term changes in streamflow statistics (high flow, average flow, and low flow) as well as changes in water quality. For the water-quality analysis, it uses Weighted Regressions on Time, Discharge and Season (WRTDS) to describe long-term trends in both concentration and flux. EGRET also creates a wide range of graphical presentations of the water-quality data and of the WRTDS results. This report serves as a user guide to these two R packages, providing detailed guidance on installation and use of the software, documentation of the analysis methods used, as well as guidance on some of the kinds of questions and approaches that the software can facilitate.
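The WRTDS idea of regressing log concentration on time, discharge, and season can be illustrated schematically. EGRET's actual implementation uses locally weighted fits of this form; the sketch below uses a single ordinary least-squares fit of the same functional form, on synthetic data, purely to show the structure of the model.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic 15-year daily-ish record: decimal time, log discharge, and
# log concentration built with a known trend (0.02/yr), a discharge
# effect (-0.3), and a seasonal cycle. All values are invented.
t = np.linspace(2000, 2015, 500)
logQ = rng.normal(size=500)
logC = (0.02 * (t - 2000) - 0.3 * logQ
        + 0.5 * np.sin(2 * np.pi * t) + rng.normal(scale=0.1, size=500))

# Design matrix with the WRTDS-style terms: intercept, time, discharge,
# and annual sine/cosine seasonality.
X = np.column_stack([np.ones_like(t), t - 2000, logQ,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
beta, *_ = np.linalg.lstsq(X, logC, rcond=None)
trend_per_year = beta[1]   # long-term trend in log concentration
```

Fitting this form locally (weighting observations by how close they are in time, discharge, and season to the point being estimated) is what distinguishes WRTDS from the single global fit shown here.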
Identifying Atmospheric Pollutant Sources Using Artificial Neural Networks
NASA Astrophysics Data System (ADS)
Paes, F. F.; Campos, H. F.; Luz, E. P.; Carvalho, A. R.
2008-05-01
The estimation of area-source pollutant strength is a relevant issue for the atmospheric environment, and it characterizes an inverse problem in atmospheric pollution dispersion. In the inverse analysis, an area-source domain is considered, where the strength of the area source term is assumed unknown. The inverse problem is solved using a supervised artificial neural network: a multi-layer perceptron. The connection weights of the neural network are computed with the delta-rule learning process. The neural network inversion is compared with results from a standard inverse analysis (regularized inverse solution). In the regularization method, the inverse problem is formulated as a non-linear optimization problem, whose objective function is given by the square difference between the measured pollutant concentrations and the model predictions, associated with a regularization operator. In our numerical experiments, the forward problem is addressed by a source-receptor scheme, where a regressive Lagrangian model is applied to compute the transition matrix. Second-order maximum entropy regularization is used, and the regularization parameter is calculated by the L-curve technique. The objective function is minimized employing a deterministic scheme (a quasi-Newton algorithm) [1] and a stochastic technique (PSO: particle swarm optimization) [2]. The inverse problem methodology is tested with synthetic observational data from six measurement points in the physical domain. The best inverse solutions were obtained with the neural networks. References: [1] D. R. Roberti, D. Anfossi, H. F. Campos Velho, G. A. Degrazia (2005): Estimating Emission Rate and Pollutant Source Location, Ciencia e Natura, p. 131-134. [2] E.F.P. da Luz, H.F. de Campos Velho, J.C. Becceneri, D.R. Roberti (2007): Estimating Atmospheric Area Source Strength Through Particle Swarm Optimization. Inverse Problems, Design and Optimization Symposium IPDO-2007, April 16-18, Miami (FL), USA, vol 1, p. 354-359.
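The "regularized inverse solution" used as the comparison baseline can be illustrated with a zeroth-order Tikhonov inversion. This is a deliberately simplified stand-in: the paper uses a second-order maximum-entropy operator with an L-curve-chosen parameter, and the toy transition matrix below is random rather than the output of a Lagrangian dispersion model.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical source-receptor setup: 6 receptors, 4 area-source cells.
# M[i, j] = concentration at receptor i per unit emission from cell j.
M = rng.uniform(0.1, 1.0, size=(6, 4))
true_S = np.array([2.0, 0.5, 1.5, 1.0])            # unknown source strengths
c_obs = M @ true_S + rng.normal(scale=0.01, size=6)  # noisy observations

# Tikhonov-regularized inversion: minimize ||M S - c||^2 + lam ||S||^2,
# whose normal equations are (M^T M + lam I) S = M^T c.
lam = 1e-3
S_est = np.linalg.solve(M.T @ M + lam * np.eye(4), M.T @ c_obs)
```

The regularization term keeps the solution stable when the transition matrix is ill-conditioned, at the cost of a small bias; swapping the identity for a second-difference operator gives the higher-order smoothing closer in spirit to what the paper describes.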
Discrimination of particulate matter emission sources using stochastic methods
NASA Astrophysics Data System (ADS)
Szczurek, Andrzej; Maciejewska, Monika; Wyłomańska, Agnieszka; Sikora, Grzegorz; Balcerek, Michał; Teuerle, Marek
2016-12-01
Particulate matter (PM) is one of the criteria pollutants that has been determined to be harmful to public health and the environment. For this reason, the ability to recognize its emission sources is very important. There are a number of measurement methods that allow PM to be characterized in terms of concentration, particle size distribution, and chemical composition. All this information is useful for establishing a link between the dust found in the air, its emission sources, and its influence on humans as well as the environment. However, these methods are typically quite sophisticated and not applicable outside laboratories. In this work, we considered a PM emission source discrimination method based on continuous measurements of PM concentration with a relatively cheap instrument and on stochastic analysis of the obtained data. The stochastic analysis focuses on the temporal variation of PM concentration and involves two steps: (1) recognition of the category of distribution for the data, i.e. stable or in the domain of attraction of a stable distribution, and (2) finding the best-matching distribution among the Gaussian, stable, and normal-inverse Gaussian (NIG) distributions. We examined six PM emission sources, associated with material processing in an industrial environment, namely machining and welding of aluminum, forged carbon steel, and plastic with various tools. As shown by the obtained results, PM emission sources may be distinguished based on the statistical distribution of PM concentration variations. The major factor responsible for the differences detectable with our method was the type of material processing and the tool applied. When different materials were processed with the same tool, distinguishing the emission sources was difficult. For successful discrimination, it was crucial to consider size-segregated mass fraction concentrations. In our opinion the presented approach is very promising and deserves further study and development.
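Step (2), finding the best-matching distribution, can be sketched by fitting each candidate by maximum likelihood and comparing them with an information criterion. The sketch below compares only the Gaussian and NIG candidates (the stable fit is omitted because its likelihood evaluation is computationally heavy), on invented data, and AIC is one reasonable selection rule rather than necessarily the one the authors used.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Invented stand-in for a series of PM concentration increments.
increments = rng.normal(loc=0.0, scale=1.0, size=500)

def aic(dist, data):
    """Akaike information criterion for a scipy distribution fitted by MLE."""
    params = dist.fit(data)
    loglik = np.sum(dist.logpdf(data, *params))
    return 2 * len(params) - 2 * loglik

# Compare candidate models for the increment distribution; the candidate
# with the lowest AIC best balances fit quality against parameter count.
candidates = {"gaussian": stats.norm, "nig": stats.norminvgauss}
aics = {name: aic(dist, increments) for name, dist in candidates.items()}
best = min(aics, key=aics.get)
```

Applied to increments recorded near different machining and welding operations, a procedure like this yields a per-source distributional fingerprint that can then be compared across sources.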
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1988-01-01
This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
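The linear, quadratic, and exponential model fitting the Standard recommends can be sketched with ordinary least squares; the exponential case is handled by linearizing with a logarithm. The quarterly counts below are invented for illustration.

```python
import numpy as np

# Invented quarterly counts of some program metric (12 quarters).
t = np.arange(12, dtype=float)
y = np.array([3., 4., 4., 5., 7., 8., 10., 11., 14., 15., 18., 21.])

linear = np.polyfit(t, y, 1)      # [slope, intercept]
quadratic = np.polyfit(t, y, 2)   # [a, b, c] for a*t**2 + b*t + c

# Exponential model y = a*exp(b*t), fitted by regressing log y on t.
b_exp, log_a = np.polyfit(t, np.log(y), 1)
```

Comparing residuals from the three fits is the usual way to decide which trend model describes the series best; here the steadily accelerating counts favor the quadratic or exponential form over the straight line.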
NASA Astrophysics Data System (ADS)
Ciprini, S.; Takalo, L. O.; Tosti, G.; Raiteri, C. M.; Fiorucci, M.; Villata, M.; Nucciarelli, G.; Lanteri, L.; Nilsson, K.; Ros, J. A.
2007-05-01
Aims: New data and results on the optical behavior of the prominent blazar PKS 0735+178 (also known as OI 158, S3 0735+17, DA 237, 1ES 0735+178, 3EG J0737+1721) are presented, through the most continuous BVRI data available in the period 1994-2004 (about 500 nights of observations). In addition, the whole historical light curve and a new photometric calibration of comparison stars in the field of this source are reported. Methods: Several methods for time series analysis of sparse data sets are developed, adapted, and applied to the reconstructed historical light curve and to each observing season of our unpublished optical database on PKS 0735+178. Optical spectral indexes are calculated from the multi-band observations and studied over long-term (years) durations as well. For the first time for this source, variability modes, characteristic timescales, and the signal power spectrum are explored and identified over three decades in time with sufficient statistics. The novel investigation of mid-term optical scales (days, weeks) could also be applied and compared to the blazar gamma-ray light curves that will be provided, on the same timescales, by the forthcoming GLAST observatory. Results: In the last 10 years the optical emission of PKS 0735+178 exhibited a rather achromatic behavior and a variability mode resembling shot noise. The source was at an intermediate or low brightness level, showing mild flaring activity and a superimposition/succession of rapid and slower flares, without extraordinary isolated outbursts, but characterized by one major active phase in 2001. Several mid-term scales of variability were found, the most common falling into duration intervals of about 27-28 days, 50-56 days, and 76-79 days. Rapid variability in the historical light curve appears to be modulated by a general, slower, rather oscillating temporal trend, where typical amplitudes of about 4.5, 8.5, and 11-13 years can be identified.
This spectral and temporal analysis, accompanying our data publication, suggests the occurrence of distinctive signatures at mid-term durations that are likely of a transitory nature. On the other hand, the possible pseudo-cyclical or multi-component modulations at long timescales could be more stable and recurrent, and correlated with the bimodal radio flux behavior and the twisted radio structure observed over several years in this blazar.
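The abstract does not name its periodogram techniques in detail, but the Lomb-Scargle periodogram is one standard tool for finding characteristic timescales in unevenly sampled light curves like these. The sketch below recovers an injected 28-day modulation (one of the mid-term scales the paper reports) from a fully synthetic, sparsely sampled series.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(4)
# Synthetic sparse light curve: 300 irregular epochs over 500 days with a
# 28-day sinusoidal modulation plus photometric noise (all values invented).
t = np.sort(rng.uniform(0, 500, size=300))
mag = 0.3 * np.sin(2 * np.pi * t / 28.0) + rng.normal(scale=0.05, size=300)

# Scan trial periods from 5 to 100 days; lombscargle expects angular
# frequencies and a zero-mean signal.
periods = np.linspace(5, 100, 2000)
ang_freqs = 2 * np.pi / periods
power = lombscargle(t, mag - mag.mean(), ang_freqs)
best_period = periods[np.argmax(power)]
```

On real blazar photometry, significance of such a peak would also have to be assessed against the red-noise (shot-noise-like) background the paper identifies, since pure power-law noise can mimic quasi-periodic peaks.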