Source term model evaluations for the low-level waste facility performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yim, M.S.; Su, S.I.
1995-12-31
The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.
Evaluation of Long-term Performance of Enhanced Anaerobic Source Zone Bioremediation using mass flux
NASA Astrophysics Data System (ADS)
Haluska, A.; Cho, J.; Hatzinger, P.; Annable, M. D.
2017-12-01
Chlorinated ethene DNAPL source zones in groundwater act as potential long-term sources of contamination as they dissolve, yielding concentrations well above MCLs and posing an ongoing public health risk. Enhanced bioremediation has been applied to treat many source zones with significant promise, but the long-term sustainability of this technology has not been thoroughly assessed. This study evaluated the long-term effectiveness of enhanced anaerobic source zone bioremediation at chloroethene-contaminated sites to determine whether the treatment prevented contaminant rebound and removed NAPL from the source zone. Long-term performance was evaluated based on achieving MCL-based contaminant mass fluxes in parent compound concentrations during different monitoring periods. Groundwater concentration-versus-time data were compiled for six sites, and post-remedial contaminant mass flux data were measured using passive flux meters at wells both within and down-gradient of the source zone. Post-remedial mass flux data were then combined with pre-remedial water quality data to estimate pre-remedial mass flux. This information was used to characterize a DNAPL dissolution source strength function, such as the Power Law Model or the Equilibrium Streamtube Model. The six sites characterized for this study were (1) Former Charleston Air Force Base, Charleston, SC; (2) Dover Air Force Base, Dover, DE; (3) Treasure Island Naval Station, San Francisco, CA; (4) Former Raritan Arsenal, Edison, NJ; (5) Naval Air Station, Jacksonville, FL; and (6) Former Naval Air Station, Alameda, CA. Contaminant mass fluxes decreased for all the sites by the end of the post-treatment monitoring period, and rebound was limited within the source zone. Post-remedial source strength function estimates suggest that decreases in contaminant mass flux will continue at these sites, but MCL-based mass flux levels may never be achieved.
Thus, site clean-up goals should be evaluated as order-of-magnitude reductions. Additionally, sites may require monitoring for a minimum of five years to sufficiently evaluate remedial performance. The study shows that enhanced anaerobic source zone bioremediation contributed to a modest reduction of source zone contaminant mass discharge and appears to have mitigated rebound of chlorinated ethenes.
BWR ASSEMBLY SOURCE TERMS FOR WASTE PACKAGE DESIGN
DOE Office of Scientific and Technical Information (OSTI.GOV)
T.L. Lotz
1997-02-15
This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) to provide boiling water reactor (BWR) assembly radiation source term data for use during Waste Package (WP) design. The BWR assembly radiation source terms are to be used for evaluation of radiolysis effects at the WP surface, and for personnel shielding requirements during assembly or WP handling operations. The objectives of this evaluation are to generate BWR assembly radiation source terms that bound selected groupings of BWR assemblies, with regard to assembly average burnup and cooling time, which comprise the anticipated MGDS BWR commercial spent nuclear fuel (SNF) waste stream. The source term data are to be provided in a form that can easily be utilized in subsequent shielding/radiation dose calculations. Since these calculations may also be used for Total System Performance Assessment (TSPA), with appropriate justification provided by TSPA, or radionuclide release rate analysis, the grams of each element and additional cooling times out to 25 years will also be calculated and the data included in the output files.
10 CFR 960.3-1-5 - Basis for site evaluations.
Code of Federal Regulations, 2013 CFR
2013-01-01
... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...
Building Assessment Survey and Evaluation Study: Summarized Data - Test Space Pollutant Sources
Information collected regarding sources that may have a potential impact on the building in terms of indoor air quality, including sources such as past or current water damage, pesticide application practices, special-use spaces, etc.
Generation of Alternative Assessment Scores using TEST and online data sources
Alternatives assessment frameworks such as DfE (Design for the Environment) evaluate chemical alternatives in terms of human health effects, ecotoxicity, and fate. T.E.S.T. (Toxicity Estimation Software Tool) can be utilized to evaluate human health in terms of acute oral rat tox...
Generation of GHS Scores from TEST and online sources ...
Alternatives assessment frameworks such as DfE (Design for the Environment) evaluate chemical alternatives in terms of human health effects, ecotoxicity, and fate. T.E.S.T. (Toxicity Estimation Software Tool) can be utilized to evaluate human health in terms of acute oral rat toxicity, developmental toxicity, endocrine activity, and mutagenicity. It can be used to evaluate ecotoxicity (in terms of acute fathead minnow toxicity) and fate (in terms of bioconcentration factor). It can also be used to estimate a variety of key physicochemical properties such as melting point, boiling point, vapor pressure, water solubility, and bioconcentration factor. A web-based version of T.E.S.T. is currently being developed to allow predictions to be made from other web tools. Online data sources such as NCCT's Chemistry Dashboard, REACH dossiers, or ChemHat.org can also be utilized to obtain GHS (Globally Harmonized System) scores for comparing alternatives. The purpose of this talk is to show how GHS data can be obtained from literature sources and from T.E.S.T. These data will be used to compare chemical alternatives in the alternatives assessment dashboard (a 2018 CSS product).
Observation-based source terms in the third-generation wave model WAVEWATCH
NASA Astrophysics Data System (ADS)
Zieger, Stefan; Babanin, Alexander V.; Erick Rogers, W.; Young, Ian R.
2015-12-01
Measurements collected during the AUSWEX field campaign at Lake George (Australia) resulted in new insights into the processes of wind-wave interaction and whitecapping dissipation, and consequently new parameterizations of the input and dissipation source terms. The new nonlinear wind input term accounts for the dependence of growth on wave steepness and airflow separation, and for negative growth rates under adverse winds. The new dissipation terms feature the inherent breaking term, a cumulative dissipation term, and a term due to the production of turbulence by waves, which is particularly relevant for decaying seas and for swell. The latter is consistent with the observed decay rate of ocean swell. This paper describes these source terms as implemented in WAVEWATCH III® and evaluates their performance against existing source terms in academic duration-limited tests, against buoy measurements for windsea-dominated conditions, under extreme wind forcing (Hurricane Katrina), and against altimeter data in global hindcasts. Results show agreement in growth curves as well as in integral and spectral parameters for both the simulations and the hindcasts.
Short-Term Memory Stages in Sign vs. Speech: The Source of the Serial Span Discrepancy
ERIC Educational Resources Information Center
Hall, Matthew L.; Bavelier, Daphne
2011-01-01
Speakers generally outperform signers when asked to recall a list of unrelated verbal items. This phenomenon is well established, but its source has remained unclear. In this study, we evaluate the relative contribution of the three main processing stages of short-term memory--perception, encoding, and recall--in this effect. The present study…
Unsupervised Segmentation of Head Tissues from Multi-modal MR Images for EEG Source Localization.
Mahmood, Qaiser; Chodorowski, Artur; Mehnert, Andrew; Gellermann, Johanna; Persson, Mikael
2015-08-01
In this paper, we present and evaluate an automatic unsupervised segmentation method, hierarchical segmentation approach (HSA)-Bayesian-based adaptive mean shift (BAMS), for use in the construction of a patient-specific head conductivity model for electroencephalography (EEG) source localization. It is based on an HSA and BAMS for segmenting the tissues from multi-modal magnetic resonance (MR) head images. The evaluation of the proposed method was performed both directly, in terms of segmentation accuracy, and indirectly, in terms of source localization accuracy. The direct evaluation was performed relative to a commonly used reference method, brain extraction tool (BET) combined with FMRIB's automated segmentation tool (FAST), and to four variants of the HSA, using both synthetic data and real data from ten subjects. The synthetic data include multiple realizations of four different noise levels and several realizations of typical noise with a 20% bias field level. The Dice index and Hausdorff distance were used to measure segmentation accuracy. The indirect evaluation was performed relative to the reference method BET-FAST using synthetic two-dimensional (2D) multimodal MR data with 3% noise and synthetic EEG (generated for a prescribed source). The source localization accuracy was determined in terms of localization error and relative error of potential. The experimental results demonstrate the efficacy of HSA-BAMS, its robustness to noise and the bias field, and that it provides better segmentation accuracy than the reference method and the variants of the HSA. They also show that it leads to more accurate source localization than the commonly used reference method, and suggest that it has potential as a surrogate for expert manual segmentation for the EEG source localization problem.
Air Pollution Manual, Part 1--Evaluation. Second Edition.
ERIC Educational Resources Information Center
Giever, Paul M., Ed.
Due to the great increase in technical knowledge and improvement in procedures, this second edition has been prepared to update existing information. Air pollution legislation is reviewed. Sources of air pollution are examined extensively. They are treated in terms of natural sources, man-made sources, metropolitan regional emissions, emission…
Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, D.; Brunett, A.; Passerini, S.
Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence-specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.
NASA Astrophysics Data System (ADS)
Marques, G.; Fraga, C. C. S.; Medellin-Azuara, J.
2016-12-01
The expansion and operation of urban water supply systems under growing demands, hydrologic uncertainty, and water scarcity require a strategic combination of supply sources for reliability, reduced costs, and improved operational flexibility. The design and operation of such a portfolio of water supply sources involves integrating long- and short-term planning to determine what and when to expand, and how much to use of each supply source, accounting for interest rates, economies of scale, and hydrologic variability. This research presents an integrated methodology coupling dynamic programming optimization with quadratic programming to optimize the expansion (long term) and operations (short term) of multiple water supply alternatives. Lagrange multipliers produced by the short-term model provide a signal about the marginal opportunity cost of expansion to the long-term model, in an iterative procedure. A simulation model hosts the water supply infrastructure and hydrologic conditions. Results allow (a) identification of trade-offs between cost and reliability of different expansion paths and water use decisions; (b) evaluation of water transfers between urban supply systems; and (c) evaluation of potential gains from reducing water system losses as a portfolio component. The latter is critical in several developing countries where water supply system losses are high and often neglected in favor of more system expansion.
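The marginal-cost signal that the short-term model passes to the long-term model can be illustrated with a two-source quadratic allocation, where the Lagrange multiplier of the demand constraint equals the common marginal cost at the optimum. The cost coefficients and demand below are hypothetical, not values from the study:

```python
# Minimize 0.5*a1*x1**2 + 0.5*a2*x2**2  subject to  x1 + x2 = demand.
# First-order conditions: a1*x1 = a2*x2 = lam (the Lagrange multiplier),
# so the closed-form optimum is:

def allocate(demand, a1, a2):
    """Least-cost split of demand across two supply sources with
    quadratic operating costs; returns allocation and multiplier."""
    x1 = demand * a2 / (a1 + a2)
    x2 = demand * a1 / (a1 + a2)
    lam = a1 * x1          # marginal cost; equals a2 * x2 at the optimum
    return x1, x2, lam

x1, x2, lam = allocate(demand=100.0, a1=0.5, a2=1.5)
```

At the optimum the marginal costs a1\*x1 and a2\*x2 coincide; that shared value is the signal a long-term expansion model would read as the opportunity cost of additional capacity.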
Fuel-conservative engine technology
NASA Technical Reports Server (NTRS)
Dugan, J. F., Jr.; Mcaulay, J. E.; Reynolds, T. W.; Strack, W. C.
1975-01-01
Aircraft fuel consumption is discussed in terms of its efficient use, and the conversion of energy from sources other than petroleum. Topics discussed include: fuel from coal and oil shale, hydrogen deficiency of alternate sources, alternate fuels evaluation program, and future engines.
Short-term energy outlook. Volume 2. Methodology
NASA Astrophysics Data System (ADS)
1983-05-01
Recent changes in forecasting methodology for nonutility distillate fuel oil demand and for the near-term petroleum forecasts are discussed. The accuracy of previous short-term forecasts of most of the major energy sources published in the last 13 issues of the Outlook is evaluated. Macroeconomic and weather assumptions are included in this evaluation. Energy forecasts for 1983 are compared. Structural change in US petroleum consumption, the use of appropriate weather data in energy demand modeling, and petroleum inventories, imports, and refinery runs are discussed.
Seismic hazard assessment over time: Modelling earthquakes in Taiwan
NASA Astrophysics Data System (ADS)
Chan, Chung-Han; Wang, Yu; Wang, Yu-Ju; Lee, Ya-Ting
2017-04-01
To assess how the seismic hazard in Taiwan changes over time, we develop a new approach combining the Brownian Passage Time (BPT) model and the Coulomb stress change, and implement the seismogenic source parameters of the Taiwan Earthquake Model (TEM). The BPT model was adopted to describe the rupture recurrence intervals of specific fault sources, together with the time elapsed since the last fault rupture, to derive their long-term rupture probability. We also evaluate the short-term seismicity rate change based on the static Coulomb stress interaction between seismogenic sources. By considering these time-dependent factors, our new combined model suggests an increased long-term seismic hazard in the vicinity of active faults along the western Coastal Plain and the Longitudinal Valley, where active faults have short recurrence intervals and long elapsed times since their last ruptures, and/or short-term elevated hazard levels right after the occurrence of large earthquakes due to the stress triggering effect. The stress imparted by the February 6th, 2016, Meinong ML 6.6 earthquake also significantly increased the rupture probabilities of several neighbouring seismogenic sources in southwestern Taiwan and raised the hazard level in the near future. Our approach draws on the advantage of incorporating long- and short-term models to provide time-dependent earthquake probability constraints. Our time-dependent model considers more detailed information than previously published models, and thus offers decision-makers and public officials an adequate basis for rapid evaluation of and response to future emergency scenarios such as victim relocation and sheltering.
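The long-term rupture probability from a BPT model is the probability of rupture within a forecast window, conditioned on the elapsed quiescent time since the last event. A sketch with hypothetical mean recurrence and aperiodicity values (not TEM fault parameters):

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time density; mu = mean recurrence interval,
    alpha = aperiodicity (coefficient of variation)."""
    if t <= 0.0:
        return 0.0
    return math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3)) * \
           math.exp(-(t - mu)**2 / (2.0 * mu * alpha**2 * t))

def bpt_cdf(t, mu, alpha, n=20000):
    """CDF by trapezoidal integration of the density from 0 to t."""
    if t <= 0.0:
        return 0.0
    dt = t / n
    s = 0.5 * (bpt_pdf(0.0, mu, alpha) + bpt_pdf(t, mu, alpha))
    for i in range(1, n):
        s += bpt_pdf(i * dt, mu, alpha)
    return s * dt

def conditional_probability(elapsed, window, mu, alpha):
    """P(rupture within `window` yr | no rupture in the past `elapsed` yr)."""
    F_e = bpt_cdf(elapsed, mu, alpha)
    F_w = bpt_cdf(elapsed + window, mu, alpha)
    return (F_w - F_e) / (1.0 - F_e)

p = conditional_probability(elapsed=150.0, window=50.0, mu=200.0, alpha=0.5)
```

Because the BPT hazard rate grows as elapsed time approaches the mean recurrence interval, faults with long quiescence relative to their recurrence interval receive elevated long-term probabilities, which is the behavior the abstract describes for the western Coastal Plain and Longitudinal Valley faults.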
A routinely applied atmospheric dispersion model was modified to evaluate alternative modeling techniques which allowed for more detailed source data, onsite meteorological data, and several dispersion methodologies. These were evaluated with hourly SO2 concentrations measured at...
Moraghebi, Roksana; Kirkeby, Agnete; Chaves, Patricia; Rönn, Roger E; Sitnicka, Ewa; Parmar, Malin; Larsson, Marcus; Herbst, Andreas; Woods, Niels-Bjarne
2017-08-25
Mesenchymal stromal cells (MSCs) are currently being evaluated in numerous pre-clinical and clinical cell-based therapy studies. Furthermore, there is an increasing interest in exploring alternative uses of these cells in disease modelling, pharmaceutical screening, and regenerative medicine by applying reprogramming technologies. However, the limited availability of MSCs from various sources restricts their use. Term amniotic fluid has been proposed as an alternative source of MSCs. Previously, only low volumes of term fluid and its cellular constituents have been collected, and current knowledge of the MSCs derived from this fluid is limited. In this study, we collected amniotic fluid at term using a novel collection system and evaluated amniotic fluid MSC content and their characteristics, including their feasibility to undergo cellular reprogramming. Amniotic fluid was collected at term caesarean section deliveries using a closed catheter-based system. Following fluid processing, amniotic fluid was assessed for cellularity, MSC frequency, in-vitro proliferation, surface phenotype, differentiation, and gene expression characteristics. Cells were also reprogrammed to the pluripotent stem cell state and differentiated towards neural and haematopoietic lineages. The average volume of term amniotic fluid collected was approximately 0.4 litres per donor, containing an average of 7 million viable mononuclear cells per litre, and a CFU-F content of 15 per 100,000 MNCs. Expanded CFU-F cultures showed similar surface phenotype, differentiation potential, and gene expression characteristics to MSCs isolated from traditional sources, and showed extensive expansion potential and rapid doubling times. Given the high proliferation rates of these neonatal source cells, we assessed them in a reprogramming application, where the derived induced pluripotent stem cells showed multigerm layer lineage differentiation potential. 
The potentially large donor base from caesarean section deliveries, the high yield of term amniotic fluid MSCs obtainable, the properties of the MSCs identified, and the suitability of the cells to be reprogrammed into the pluripotent state demonstrated these cells to be a promising and plentiful resource for further evaluation in bio-banking, cell therapy, disease modelling, and regenerative medicine applications.
Interlaboratory study of the ion source memory effect in 36Cl accelerator mass spectrometry
NASA Astrophysics Data System (ADS)
Pavetich, Stefan; Akhmadaliev, Shavkat; Arnold, Maurice; Aumaître, Georges; Bourlès, Didier; Buchriegler, Josef; Golser, Robin; Keddadouche, Karim; Martschini, Martin; Merchel, Silke; Rugel, Georg; Steier, Peter
2014-06-01
Understanding and minimizing contamination in the ion source due to cross-contamination and the long-term memory effect is one of the key issues for accurate accelerator mass spectrometry (AMS) measurements of volatile elements. The focus of this work is the investigation of the long-term memory effect for the volatile element chlorine, and the minimization of this effect in the ion source of the Dresden accelerator mass spectrometry facility (DREAMS). For this purpose, one of the two original HVE ion sources at the DREAMS facility was modified to allow the use of larger sample holders with individual target apertures. Additionally, a more open geometry was used to improve the vacuum level. To evaluate this improvement in comparison with other up-to-date ion sources, an interlaboratory comparison was initiated. The long-term memory effect of the four Cs sputter ion sources at DREAMS (two sources: original and modified), ASTER (Accélérateur pour les Sciences de la Terre, Environnement, Risques) and VERA (Vienna Environmental Research Accelerator) was investigated by measuring samples with a natural 35Cl/37Cl ratio and samples highly enriched in 35Cl (35Cl/37Cl ∼ 999). Besides investigating and comparing the individual levels of long-term memory, recovery time constants could be calculated. The tests show that all four sources suffer from long-term memory, but the modified DREAMS ion source showed the lowest level of contamination. The recovery times of the four ion sources were widely spread, between 61 and 1390 s; the modified DREAMS ion source, with values between 156 and 262 s, showed the fastest recovery in 80% of the measurements.
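A recovery time constant of the kind reported here can be estimated by fitting an exponential decay of the measured isotope ratio toward its natural baseline. A sketch on synthetic, noise-free data (the values are illustrative, not DREAMS measurements):

```python
import math

def recovery_time_constant(times, ratios, baseline):
    """Fit ratio(t) = baseline + A * exp(-t / tau) by log-linear
    least squares on (ratio - baseline); returns tau."""
    ys = [math.log(r - baseline) for r in ratios]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(times, ys)) / \
            sum((x - mx) ** 2 for x in times)
    return -1.0 / slope

# Synthetic data: ratio relaxing from an enriched sample back toward
# the natural 35Cl/37Cl baseline with tau = 200 s.
tau_true, baseline = 200.0, 0.32
times = [0.0, 100.0, 200.0, 400.0, 800.0]
ratios = [baseline + 500.0 * math.exp(-t / tau_true) for t in times]
tau_est = recovery_time_constant(times, ratios, baseline)
```

On real sputter-source data the baseline itself would need to be measured or jointly fitted, and noise would make a weighted or nonlinear fit preferable; the log-linear form is the simplest estimator consistent with a single exponential recovery.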
1987-06-15
INTRODUCTION ... The Inner Harbor Navigation Canal Complex ... Sources Used During Research: The archival sources utilized during the research for the project include Annual Reports to the Chief of ... the term used to describe a condition where the earth is eroded by an underground source of water. The effect of this type of erosion would be that ...
Source term evaluation model for high-level radioactive waste repository with decay chain build-up.
Chopra, Manish; Sunny, Faby; Oza, R B
2016-09-18
A source term model based on a two-component leach flux concept is developed for a high-level radioactive waste repository. The long-lived radionuclides associated with high-level waste may give rise to a build-up of activity because of radioactive decay chains. The ingrowth of progeny is incorporated in the model using Bateman decay chain build-up equations. The model is applied to different radionuclides present in high-level radioactive waste that form part of decay chains (the 4n to 4n + 3 series), and the activity of the parent and daughter radionuclides leaching out of the waste matrix is estimated. Two cases are considered: one in which only the parent is initially present in the waste, and another in which daughters are also initially present in the waste matrix. The incorporation of in situ production of daughter radionuclides in the source is important for realistic estimates. It is shown that including decay chain build-up is essential to avoid underestimating the radiological impact of the repository. The model can be a useful tool for evaluating the source term of the radionuclide transport models used for the radiological impact assessment of high-level radioactive waste repositories.
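For a two-member chain with the daughter initially absent, the Bateman solution has a simple closed form; a sketch with illustrative decay constants, not tied to any specific series in the paper:

```python
import math

def bateman_two_member(N1_0, lam1, lam2, t):
    """Atoms of parent (N1) and daughter (N2) at time t for a
    two-member decay chain, daughter initially absent (Bateman)."""
    N1 = N1_0 * math.exp(-lam1 * t)
    N2 = N1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) -
                                        math.exp(-lam2 * t))
    return N1, N2

# Illustrative decay constants (1/yr): long-lived parent, shorter-lived daughter
lam1, lam2 = 1e-4, 1e-2
N1, N2 = bateman_two_member(1e20, lam1, lam2, t=500.0)
A1, A2 = lam1 * N1, lam2 * N2   # activities A = lambda * N
```

When lam2 greatly exceeds lam1, the daughter activity grows in until it tracks the parent activity (secular equilibrium), which is exactly the ingrowth contribution that is lost if decay chain build-up is omitted from the source term.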
pyJac: Analytical Jacobian generator for chemical kinetics
NASA Astrophysics Data System (ADS)
Niemeyer, Kyle E.; Curtis, Nicholas J.; Sung, Chih-Jen
2017-06-01
Accurate simulations of combustion phenomena require the use of detailed chemical kinetics in order to capture limit phenomena such as ignition and extinction as well as to predict pollutant formation. However, the chemical kinetic models for hydrocarbon fuels of practical interest typically have large numbers of species and reactions and exhibit high levels of mathematical stiffness in the governing differential equations, particularly for larger fuel molecules. To integrate the stiff equations governing chemical kinetics, reactive-flow simulations generally rely on implicit algorithms that require frequent Jacobian matrix evaluations. Some in situ and a posteriori computational diagnostics methods also require accurate Jacobian matrices, including computational singular perturbation and chemical explosive mode analysis. Typically, these are approximated numerically with finite differences, but for larger chemical kinetic models this poses significant computational demands, since the number of chemical source term evaluations scales with the square of the species count. Furthermore, existing analytical Jacobian tools do not optimize evaluations or support emerging SIMD processors such as GPUs. Here we introduce pyJac, a Python-based open-source program that generates analytical Jacobian matrices for use in chemical kinetics modeling and analysis. In addition to producing the necessary customized source code for evaluating reaction rates (including all modern reaction rate formulations), the chemical source terms, and the Jacobian matrix, pyJac uses an optimized evaluation order to minimize computational and memory operations. As a demonstration, we first establish the correctness of the Jacobian matrices for kinetic models of hydrogen, methane, ethylene, and isopentanol oxidation (with species counts ranging from 13 to 360) by showing agreement within 0.001% of matrices obtained via automatic differentiation.
We then demonstrate the performance achievable on CPUs and GPUs using pyJac via matrix evaluation timing comparisons; the routines produced by pyJac outperformed first-order finite differences by 3-7.5 times and the existing analytical Jacobian software TChem by 1.1-2.2 times on a single-threaded basis. It is noted that TChem is not thread-safe, while pyJac is easily parallelized, and hence can greatly outperform TChem on multicore CPUs. The Jacobian matrix generator we describe here will be useful for reducing the cost of integrating chemical source terms with implicit algorithms in particular and algorithms that require an accurate Jacobian matrix in general. Furthermore, the open-source release of the program and Python-based implementation will enable wide adoption.
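The finite-difference cost that motivates pyJac can be seen in a toy forward-difference Jacobian, which needs one full source-term evaluation per species column on top of the baseline evaluation. The two-species reversible reaction below is a hypothetical example, not the pyJac interface:

```python
def fd_jacobian(f, y, eps=1e-8):
    """Forward-difference Jacobian of f: R^n -> R^n.
    Requires n + 1 full source-term evaluations per Jacobian."""
    n = len(y)
    f0 = f(y)
    J = [[0.0] * n for _ in range(n)]
    for j in range(n):
        yp = list(y)
        h = eps * max(abs(y[j]), 1.0)   # scale step to the state
        yp[j] += h
        fj = f(yp)
        for i in range(n):
            J[i][j] = (fj[i] - f0[i]) / h
    return J

# Toy source term: reversible reaction A <-> B with rate r = kf*A - kr*B,
# so the exact Jacobian is [[-kf, kr], [kf, -kr]].
kf, kr = 2.0, 1.0
def source(y):
    r = kf * y[0] - kr * y[1]
    return [-r, r]

J = fd_jacobian(source, [1.0, 0.5])
```

An analytical generator avoids both the repeated source-term calls and the truncation/round-off trade-off in choosing the step h, which is where the reported 3-7.5x speedups over first-order finite differences come from.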
EXPERIENCES FROM THE SOURCE-TERM ANALYSIS OF A LOW AND INTERMEDIATE LEVEL RADWASTE DISPOSAL FACILITY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park,Jin Beak; Park, Joo-Wan; Lee, Eun-Young
2003-02-27
Enhancements of the computer code SAGE for evaluation of the Korean concept for a LILW waste disposal facility are discussed. Several features of source term analysis are embedded in SAGE to analyze: (1) effects of the degradation mode of an engineered barrier, (2) effects of dispersion phenomena in the unsaturated zone, and (3) effects of a time-dependent sorption coefficient in the unsaturated zone. IAEA's Vault Safety Case (VSC) approach is used to demonstrate the capability of this assessment code. Results from MASCOT are used for comparison purposes. These enhancements of the safety assessment code SAGE can contribute to realistic evaluation of the Korean concept of the LILW disposal project in the near future.
2016-04-01
Phosphate use by these recombinant strains was evaluated because carbon use by these strains is still undergoing optimization by LBNL. The E. coli ... plasmids had successful growth when transformed into a different E. coli background, which correlated with IMPA degradation. Ultimately, the ... transformed E. coli strains, optimized at ECBC, were able to grow using IMPA as the phosphate source.
NASA Astrophysics Data System (ADS)
Lee, S. S.; Kim, H. J.; Kim, M. O.; Lee, K.; Lee, K. K.
2016-12-01
A study seeking evidence of remediation in monitoring data collected before and after intensive in situ remedial action was performed with various quantitative evaluation methods, including mass discharge analysis, tracer data, statistical trend analysis, and analytical solutions, at a DNAPL-contaminated site in Wonju, Korea. Remediation technologies such as soil vapor extraction, soil flushing, biostimulation, and pump-and-treat have been applied to eliminate the contaminant sources of trichloroethylene (TCE) and to prevent the migration of the TCE plume from remediation target zones. Prior to the remedial action, the concentrations and mass discharges of TCE at all transects were affected by seasonal recharge variation and residual DNAPL sources. After the remediation, the effect of remediation was clear at the main source zone and the industrial complex. By tracing a time series of plume evolution, a greater variation in TCE concentrations was detected in the plumes near the source zones compared with the relatively stable plumes downstream. The amount of residual source mass removed during the intensive remedial action was estimated using an analytical solution in order to evaluate the action's efficiency. The results of this quantitative evaluation indicate that the intensive remedial action performed effectively, with a removal efficiency of 70% of the residual source mass during the remediation period. Analytical solutions that can quantify the impacts of partial mass reduction have proven to be useful tools for estimating unknown contaminant source mass, verifying dissolved concentrations at the DNAPL-contaminated site, and evaluating remediation efficiency using long-term monitoring data.
Acknowledgement: This work was supported by the Korea Ministry of Environment under the GAIA Project (173-092-009 and 201400540010) and by the R&D Project on Environmental Management of Geologic CO2 Storage from KEITI (project number 2014001810003).
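The 70% removal-efficiency figure above is a simple mass-balance ratio; a minimal sketch follows (the mass values below are hypothetical illustrations, not the Wonju site estimates):

```python
def removal_efficiency(mass_pre, mass_post):
    """Fraction of residual source mass removed during remediation,
    from source-mass estimates before and after treatment."""
    return (mass_pre - mass_post) / mass_pre

# Hypothetical source-mass estimates (kg TCE) inferred from an
# analytical dissolution model before and after intensive remediation.
m_pre, m_post = 120.0, 36.0
eff = removal_efficiency(m_pre, m_post)
print(f"removal efficiency: {eff:.0%}")  # → 70%
```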
A new sensor system for mobile and aerial emission sampling was developed for open area pollutant sources, such as prescribed forest burns. The sensor system, termed “Kolibri”, consists of multiple low-cost air quality sensors measuring CO2, CO, samplers for particulate matter wi...
A new sensor system for mobile and aerial emission sampling was developed for open area sources, such as open burning. The sensor system, termed “Kolibri”, consists of multiple low-cost air quality sensors measuring CO2, CO, and black carbon, samplers for particulate matter with ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Jerden, James
2016-10-01
The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transport and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal-fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify and prioritize any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal of identifying gaps in the current knowledge base. The second path, performed by an independent contractor, involved sensitivity analyses to determine the importance of particular radionuclides and transport phenomena with regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.
Evaluating Uncertainty in Integrated Environmental Models: A Review of Concepts and Tools
This paper reviews concepts for evaluating integrated environmental models and discusses a list of relevant software-based tools. A simplified taxonomy for sources of uncertainty and a glossary of key terms with standard definitions are provided in the context of integrated appro...
Ma, Jing; Hipel, Keith W; Hanson, Mark L
2017-12-21
A comprehensive evaluation of public participation in rural domestic waste (RDW) source-separated collection in China was carried out within a social-dimension framework, specifically in terms of public perception, awareness, attitude, and willingness to pay for RDW management. The evaluation was based on a case study conducted in Guilin, Guangxi Zhuang Autonomous Region, China, which is representative of most inland areas of the country, with a GDP around the national average. It was found that unlike urban residents, rural residents maintained a high rate of recycling, but in a spontaneous manner; they paid more attention to issues closely related to their daily lives, and less to those at the general level; their awareness of RDW source-separated collection was low, and different age groups showed significantly different preferences regarding sources of knowledge acquisition. Among potential information sources, village committees played a very important role in knowledge dissemination; for the respondents' pro-environmental attitudes, the influencing factor of "lack of legislation/policy" was considered significant; mandatory charges for waste collection and disposal had a high rate of acceptance among rural residents; and high monthly incomes were positively correlated with both pro-environmental attitudes and willingness to pay extra charges levied by RDW management. These observations imply that, for decision-makers, implementing mandatory RDW source-separated collection programs with enforced guidelines and economic compensation is more effective in the short term, while promoting pro-environmental education among rural residents is more important in the long run.
ERIC Educational Resources Information Center
Valle, Victor M.
Intended as a contribution to a workshop discussion on program evaluation in higher education, the paper covers five major evaluation issues. First, it deals with evaluation concepts, explaining the purposes of evaluation; pertinent terms; and the sources of evaluation in public health procedures, the scientific method, the systems approach, and…
Evaluation of actuator energy storage and power sources for spacecraft applications
NASA Technical Reports Server (NTRS)
Simon, William E.; Young, Fred M.
1993-01-01
The objective of this evaluation is to determine an optimum energy storage/power source combination for electrical actuation systems for existing (Solid Rocket Booster (SRB), Shuttle) and future (Advanced Launch System (ALS), Shuttle Derivative) vehicles. Characteristic of these applications is the requirement for high power pulses (50-200 kW) for short times (milliseconds to seconds), coupled with longer-term base or 'housekeeping' requirements (5-16 kW). Specific study parameters (e.g., weight, volume, etc.) as stated in the proposal and specified in the Statement of Work (SOW) are included.
K. Ellum; H.O. Liechty; M.A. Blazier
2013-01-01
Straw harvesting can supplement traditional revenues generated by loblolly pine (Pinus taeda L.) plantation management. However, repeated raking may alter soil properties and nutrition. In northcentral Louisiana, a study was conducted to evaluate the long-term effects of intensive straw raking and fertilizer source (inorganic or organic) on nitrogen...
ERIC Educational Resources Information Center
Mylonas, Kostas; Furnham, Adrian; Divale, William; Leblebici, Cigdem; Gondim, Sonia; Moniz, Angela; Grad, Hector; Alvaro, Jose Luis; Cretu, Romeo Zeno; Filus, Ania; Boski, Pawel
2014-01-01
Several sources of bias can plague research data and individual assessment. When cultural groups are considered, across or even within countries, it is essential that the constructs assessed and evaluated are as free as possible from any source of bias and specifically from bias caused due to culturally specific characteristics. Employing the…
Cross-Disciplinary and Intermode Agreement on the Description and Evaluation of Landscape Resources
ERIC Educational Resources Information Center
Zube, Ervin H.
1974-01-01
Data obtained from ground reconnaissance and from office studies employing aerial photography show that the extent of agreement between environmental designers and resource managers on the use of descriptive and evaluative landscape terms is generally high. Use of remote data sources for science resource assessments is supported. (DT)
Evaluation of a Brief Homework Assignment Designed to Reduce Citation Problems
ERIC Educational Resources Information Center
Schuetze, Pamela
2004-01-01
I evaluated a brief homework assignment designed to reduce citation problems in research-based term papers. Students in 2 developmental psychology classes received a brief presentation and handout defining plagiarism with tips on how to cite sources to avoid plagiarizing. In addition, students in 1 class completed 2 brief homework assignments in…
Matrix effect and recovery terminology issues in regulated drug bioanalysis.
Huang, Yong; Shi, Robert; Gee, Winnie; Bonderud, Richard
2012-02-01
Understanding the meaning of the terms used in the bioanalytical method validation guidance is essential for practitioners to implement best practice. However, terms that have several meanings or that have different interpretations exist within bioanalysis, and this may give rise to differing practices. In this perspective we discuss an important but often confusing term - 'matrix effect (ME)' - in regulated drug bioanalysis. The ME can be interpreted as either the ionization change or the measurement bias of the method caused by the nonanalyte matrix. The ME definition dilemma makes its evaluation challenging. The matrix factor is currently used as a standard method for evaluation of ionization changes caused by the matrix in MS-based methods. Standard additions to pre-extraction samples have been suggested to evaluate the overall effects of a matrix from different sources on the analytical system, because it covers ionization variation and extraction recovery variation. We also provide our personal views on the term 'recovery'.
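As commonly defined in bioanalytical method validation, the matrix factor compares analyte response in post-extraction spiked matrix against response in neat solution, and an IS-normalized MF divides the analyte MF by the internal standard's MF. A minimal sketch (the peak areas below are invented for illustration):

```python
def matrix_factor(area_in_matrix, area_in_neat):
    """Matrix factor: analyte peak area in post-extraction spiked
    matrix divided by the area in neat solution. MF = 1 means no
    ionization change; <1 suppression; >1 enhancement."""
    return area_in_matrix / area_in_neat

# Hypothetical peak areas for analyte and internal standard (IS).
mf_analyte = matrix_factor(8.5e4, 1.0e5)   # 0.85 → ion suppression
mf_is = matrix_factor(9.0e4, 1.0e5)        # 0.90
is_normalized_mf = mf_analyte / mf_is      # ≈ 0.94
```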
Owen, Julia P; Wipf, David P; Attias, Hagai T; Sekihara, Kensuke; Nagarajan, Srikantan S
2012-03-01
In this paper, we present an extensive performance evaluation of a novel source localization algorithm, Champagne. It is derived in an empirical Bayesian framework that yields sparse solutions to the inverse problem. It is robust to correlated sources and learns the statistics of non-stimulus-evoked activity to suppress the effect of noise and interfering brain activity. We tested Champagne on both simulated and real M/EEG data. The source locations used for the simulated data were chosen to test the performance on challenging source configurations. In simulations, we found that Champagne outperforms the benchmark algorithms in terms of both the accuracy of the source localizations and the correct estimation of source time courses. We also demonstrate that Champagne is more robust to correlated brain activity present in real MEG data and is able to resolve many distinct and functionally relevant brain areas with real MEG and EEG data.
Evaluation of the communications impact of a low power arcjet thruster
NASA Technical Reports Server (NTRS)
Carney, Lynnette M.
1988-01-01
The interaction of a 1 kW arcjet thruster plume with a communications signal is evaluated. A two-parameter, source flow equation has been used to represent the far flow field distribution of the arcjet plume in a realistic spacecraft configuration. Modelling the plume as a plasma slab, the interaction of the plume with a 4 GHz communications signal is then evaluated in terms of signal attenuation and phase shift between transmitting and receiving antennas. Except for propagation paths which pass very near the arcjet source, the impacts to transmission appear to be negligible. The dominant signal loss mechanism is refraction of the beam rather than absorption losses due to collisions. However, significant reflection of the signal at the sharp vacuum-plasma boundary may also occur for propagation paths which pass near the source.
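Reflection at the sharp vacuum-plasma boundary becomes strong where the electron density approaches the critical density at which the plasma frequency equals the signal frequency. A quick order-of-magnitude check for a 4 GHz signal using the standard cold-plasma formula (this value is a textbook result, not a number from the study):

```python
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m
M_E = 9.109e-31    # electron mass, kg
Q_E = 1.602e-19    # elementary charge, C

def critical_density(freq_hz):
    """Electron density at which the plasma frequency equals the
    signal frequency: n_c = eps0 * m_e * omega^2 / e^2."""
    omega = 2.0 * math.pi * freq_hz
    return EPS0 * M_E * omega**2 / Q_E**2

n_c = critical_density(4.0e9)
print(f"critical density at 4 GHz: {n_c:.2e} m^-3")  # ~2e17 m^-3
```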
Comparison of advanced rechargeable batteries for autonomous underwater vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Descroix, J.P.; Chagnon, G.
1994-12-31
For AUVs to fulfill their promise in military, oceanographic, and scientific missions, their power sources must meet system needs. Accordingly, this article addresses present and near-term options for electric power sources. The evaluation is based on a hypothetical AUV and is expected to yield useful conclusions regarding the feasible options and the cost of manufacturing such power sources. 5 refs.
Final safety analysis report for the Galileo Mission: Volume 2: Book 1, Accident model document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The Accident Model Document (AMD) is the second volume of the three-volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTGs employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTGs is termed General Purpose Heat Source (GPHS), and the RTGs are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population resulting from postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences which might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release, and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence. 27 figs., 11 tabs.
Analysis and Synthesis of Tonal Aircraft Noise Sources
NASA Technical Reports Server (NTRS)
Allen, Matthew P.; Rizzi, Stephen A.; Burdisso, Ricardo; Okcu, Selen
2012-01-01
Fixed and rotary wing aircraft operations can have a significant impact on communities in proximity to airports. Simulation of predicted aircraft flyover noise, paired with listening tests, is useful to noise reduction efforts since it allows direct annoyance evaluation of aircraft or operations currently in the design phase. This paper describes efforts to improve the realism of synthesized source noise by including short term fluctuations, specifically for inlet-radiated tones resulting from the fan stage of turbomachinery. It details analysis performed on an existing set of recorded turbofan data to isolate inlet-radiated tonal fan noise, then extract and model short term tonal fluctuations using the analytic signal. Methodologies for synthesizing time-variant tonal and broadband turbofan noise sources using measured fluctuations are also described. Finally, subjective listening test results are discussed which indicate that time-variant synthesized source noise is perceived to be very similar to recordings.
NASA Astrophysics Data System (ADS)
Taha, M. P. M.; Drew, G. H.; Longhurst, P. J.; Smith, R.; Pollard, S. J. T.
The passive and active release of bioaerosols during green waste composting, measured at source, is reported for a commercial composting facility in South East (SE) England as part of a research programme focused on improving risk assessments at composting facilities. Aspergillus fumigatus and actinomycetes concentrations of 9.8–36.8×10⁶ and 18.9–36.0×10⁶ cfu m⁻³, respectively, measured during the active turning of green waste compost, were typically 3-log higher than previously reported concentrations from static compost windrows. Source depletion curves constructed for A. fumigatus during compost turning and modelled using SCREEN3 suggest that bioaerosol concentrations could reduce to background concentrations of 10³ cfu m⁻³ within 100 m of this site. Authentic source term data produced from this study will help to refine the risk assessment methodologies that support improved permitting of compost facilities.
Ohm's Law and Solar Energy. Courseware Evaluation for Vocational and Technical Education.
ERIC Educational Resources Information Center
Gates, Earl; And Others
This courseware evaluation rates the Ohm's Law and Solar Energy program developed by the Iowa Department of Public Instruction. (The program--not contained in this document--covers Ohm's law and resistance problems, passive solar energy, and project ideas and sources.) Part A describes the program in terms of subject area (construction and…
Johnson, W B; Lall, R; Bongar, B; Nordlund, M D
1999-01-01
Objective personality assessment instruments offer a comparatively underutilized source of clinical data in attempts to evaluate and predict risk for suicide. In contrast to focal suicide risk measures, global personality inventories may be useful in identification of long-standing styles that predispose persons to eventual suicidal behavior. This article reviews the empirical literature regarding the efficacy of established personality inventories in predicting suicidality. The authors offer several recommendations for future research with these measures and conclude that such objective personality instruments offer only marginal utility as sources of clinical information in comprehensive suicide risk evaluations. Personality inventories may offer greatest utility in long-term assessment of suicide risk.
The numerical dynamic for highly nonlinear partial differential equations
NASA Technical Reports Server (NTRS)
Lafon, A.; Yee, H. C.
1992-01-01
Problems associated with the numerical computation of highly nonlinear equations in computational fluid dynamics are set forth and analyzed in terms of the potential ranges of spurious behaviors. A reaction-convection equation with a nonlinear source term is employed to evaluate the effects related to spatial and temporal discretizations. The discretization of the source term is described according to several methods, and the various techniques are shown to have a significant effect on the stability of the spurious solutions. Traditional linearized stability analyses cannot provide the level of confidence required for accurate fluid dynamics computations, and the incorporation of nonlinear analysis is proposed. Nonlinear analysis based on nonlinear dynamical systems complements the conventional linear approach and is valuable in the analysis of hypersonic aerodynamics and combustion phenomena.
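The spurious behavior from source-term discretization can be illustrated with a minimal model: explicit Euler applied to a logistic source term S(u) = μu(1 − u), with the convection term dropped for clarity. The true steady state u = 1 is reached only when Δt·μ lies below the linearized stability limit of 2; above it, the iteration locks onto a spurious stable period-2 orbit, exactly the kind of behavior linearized analysis misses. This is an illustrative model problem, not the paper's exact test case:

```python
def euler_logistic(u0, r, nsteps):
    """Explicit Euler for du/dt = mu*u*(1-u), with r = dt*mu
    the scaled time step. Returns the state after nsteps."""
    u = u0
    for _ in range(nsteps):
        u = u + r * u * (1.0 - u)
    return u

# Below the stability limit (dt*mu = 0.5): converges to u = 1.
print(euler_logistic(0.5, 0.5, 1000))  # → 1.0
# Above it (dt*mu = 2.2): a spurious period-2 orbit away from u = 1.
print(euler_logistic(0.5, 2.2, 1000), euler_logistic(0.5, 2.2, 1001))
```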
Modeling Vortex Generators in the Wind-US Code
NASA Technical Reports Server (NTRS)
Dudek, Julianne C.
2010-01-01
A source term model which simulates the effects of vortex generators was implemented into the Wind-US Navier-Stokes code. The source term added to the Navier-Stokes equations simulates the lift force that would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied, and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single-vane vortex generator, supersonic flow in a rectangular duct with a counter-rotating vortex generator pair, and subsonic flow in an S-duct with 22 co-rotating vortex generators. The validation results indicate that the source term vortex generator model provides a useful tool for screening vortex generator configurations and gives comparable results to solutions computed using a gridded vane.
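A vane-type source term of this kind can be sketched as a lift force proportional to local dynamic pressure, vane planform area, and incidence angle, in the spirit of lifting-force vortex generator models. The functional form and the constant `c_vg` below are assumptions for illustration, not the Wind-US implementation:

```python
import math

def vg_lift_force(rho, u, planform_area, incidence_deg, c_vg=1.0):
    """Approximate lift-force magnitude from one vane-type vortex
    generator, to be distributed over the user-specified grid cells:
    F = c_vg * S * (1/2) * rho * u^2 * sin(alpha). Sketch only."""
    alpha = math.radians(incidence_deg)
    return c_vg * planform_area * 0.5 * rho * u**2 * math.sin(alpha)

# Hypothetical vane: 2 cm^2 planform at 16 deg incidence, air at 50 m/s.
f = vg_lift_force(rho=1.2, u=50.0, planform_area=2e-4, incidence_deg=16.0)
```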
The long-term effectiveness of a FeSO4 + Na2S2O4 reductant solution blend for in situ saturated zone treatment of dissolved and solid phase Cr(VI) in a high pH chromite ore processing solid waste (COPSW) fill material was investigated. Two field pilot injection studies were cond...
Performance evaluation of WAVEWATCH III model in the Persian Gulf using different wind resources
NASA Astrophysics Data System (ADS)
Kazeminezhad, Mohammad Hossein; Siadatmousavi, Seyed Mostafa
2017-07-01
The third-generation wave model, WAVEWATCH III, was employed to simulate bulk wave parameters in the Persian Gulf using three different wind sources: ERA-Interim, CCMP, and GFS-Analysis. Different formulations for whitecapping term and the energy transfer from wind to wave were used, namely the Tolman and Chalikov (J Phys Oceanogr 26:497-518, 1996), WAM cycle 4 (BJA and WAM4), and Ardhuin et al. (J Phys Oceanogr 40(9):1917-1941, 2010) (TEST405 and TEST451 parameterizations) source term packages. The obtained results from numerical simulations were compared to altimeter-derived significant wave heights and measured wave parameters at two stations in the northern part of the Persian Gulf through statistical indicators and the Taylor diagram. Comparison of the bulk wave parameters with measured values showed underestimation of wave height using all wind sources. However, the performance of the model was best when GFS-Analysis wind data were used. In general, when wind veering from southeast to northwest occurred, and wind speed was high during the rotation, the model underestimation of wave height was severe. Except for the Tolman and Chalikov (J Phys Oceanogr 26:497-518, 1996) source term package, which severely underestimated the bulk wave parameters during stormy condition, the performances of other formulations were practically similar. However, in terms of statistics, the Ardhuin et al. (J Phys Oceanogr 40(9):1917-1941, 2010) source terms with TEST405 parameterization were the most successful formulation in the Persian Gulf when compared to in situ and altimeter-derived observations.
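Statistical comparison of modeled versus measured significant wave height commonly uses bias, RMSE, and scatter index. A minimal sketch with made-up values (a negative bias corresponds to the underestimation reported above):

```python
import math

def wave_stats(model, obs):
    """Bias, RMSE, and scatter index (RMSE normalized by the mean
    of observations) for paired model/observation values."""
    n = len(obs)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    si = rmse / (sum(obs) / n)
    return bias, rmse, si

# Hypothetical Hs pairs (m): the model underestimates every value.
model = [1.1, 1.8, 2.4, 0.9]
obs = [1.3, 2.0, 2.9, 1.0]
bias, rmse, si = wave_stats(model, obs)  # bias < 0 → underestimation
```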
An Evaluation Methodology for Longitudinal Studies of Short Term Cancer Research Training Programs
Padilla, Luz A.; Venkatesh, Raam; Daniel, Casey L.; Desmond, Renee A.; Brooks, C. Michael; Waterbor, John W.
2014-01-01
The need to familiarize medical students and graduate health professional students with research training opportunities that cultivate the appeal of research careers is vital to the future of research. Comprehensive evaluation of a cancer research training program can be achieved through longitudinal tracking of program alumni to assess the program’s impact on each participant’s career path and professional achievements. With advances in technology and smarter means of communication, effective ways to track alumni have changed. In order to collect data on the career outcomes and achievements of nearly 500 short-term cancer research training program alumni from 1999–2013, we sought to contact each alumnus to request completion of a survey instrument online, or by means of a telephone interview. The effectiveness of each contact method that we used was quantified according to ease of use and time required. The most reliable source of contact information for tracking alumni from the early years of the program was previous tracking results; and for alumni from the later years, the most important source of contact information was university alumni records that provided email addresses and telephone numbers. Personal contacts with former preceptors were sometimes helpful, as were generic search engines and people search engines. Social networking was of little value for most searches. Using information from two or more sources in combination was most effective in tracking alumni. These results provide insights and tools for other research training programs that wish to track their alumni for long-term program evaluation. PMID:25412722
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genn Saji
2006-07-01
The term 'ultimate risk' is used here to describe the probabilities and radiological consequences that should be incorporated in the siting, containment design, and accident management of nuclear power plants for hypothetical accidents. It is closely related to the source terms specified in siting criteria, which assure an adequate separation of the plants' radioactive inventories from the public in the event of a hypothetical, severe accident. The author points out that current source terms, which are based on information from the Windscale accident (1957) through TID-14844, are badly outdated and do not incorporate lessons learned from either the Three Mile Island (TMI, 1979) or the Chernobyl (1986) accident, two of the most severe accidents ever experienced. As a result of the observations of benign radionuclide releases at TMI, the technical community in the US felt that a more realistic evaluation of severe reactor accident source terms was necessary. Against this background, the 'source term research project' was organized in 1984 to respond to these challenges. Unfortunately, soon after the final report from this project was released, the Chernobyl accident occurred. Due to the enormous consequences of that accident, the once-optimistic prospects for establishing a more realistic source term were completely shattered. The Chernobyl accident, with its human death toll and the dispersion of a large part of the fission-fragment inventory into the environment, significantly degraded public acceptance of nuclear energy throughout the world. In spite of this, nuclear communities have been prudent in responding to the public's anxiety about the ultimate safety of nuclear plants, since many unknowns still surrounded the mechanism of the Chernobyl accident.
In order to resolve some of these mysteries, the author has performed a scoping study of the dispersion and deposition mechanisms of fuel particles and fission fragments during the initial phase of the Chernobyl accident. Through this study, it is now possible to broadly reconstruct the radiological consequences using a dispersion calculation technique, combined with the meteorological data at the time of the accident and the ¹³⁷Cs land contamination densities measured and reported around the Chernobyl area. Although it is challenging to incorporate lessons learned from the Chernobyl accident into the source term issues, the author has already developed an example of safety goals incorporating the radiological consequences of the accident. The example provides safety goals by specifying source term releases in a graded approach in combination with probabilities, i.e., risks. The author believes that future source term specifications should be directly linked with safety goals. (author)
The Development of Lifecycle Data for Hydrogen Fuel Production and Delivery
DOT National Transportation Integrated Search
2017-10-01
An evaluation of renewable hydrogen production technologies anticipated to be available in the short, mid- and long-term timeframes was conducted. Renewable conversion pathways often rely on a combination of renewable and fossil energy sources, with ...
Casanova, Lisa M; Walters, Adam; Naghawatte, Ajith; Sobsey, Mark D
2012-06-01
Sri Lanka was devastated by the 2004 Indian Ocean tsunami. During recovery, the Red Cross distributed approximately 12,000 free ceramic water filters. This cross-sectional study was an independent post-implementation assessment of 452 households that received filters, to determine the proportion still using filters, household characteristics associated with use, and quality of household drinking water. The proportion of continued users was high (76%). The most common household water sources were taps or shallow wells. The majority (82%) of users used filtered water for drinking only. Mean filter flow rate was 1.12 L/hr (0.80 L/hr for households with taps and 0.71 for those with wells). Water quality varied by source; households using tap water had source water of high microbial quality. Filters improved water quality, reducing Escherichia coli for households (largely well users) with high levels in their source water. Households were satisfied with filters and are potentially long-term users. To promote sustained use, recovery filter distribution efforts should try to identify households at greatest long-term risk, particularly those who have not moved to safer water sources during recovery. They should be joined with long-term commitment to building supply chains and local production capacity to ensure safe water access.
Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan
2016-07-01
Source water areas are facing many potential water pollution risks, and risk assessment is an effective method for evaluating them. In this paper, an integrated model based on k-means clustering analysis and set pair analysis was established for evaluating the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, where Danjiangkou Reservoir, China's key source water area for the middle route of the South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City had high risk values in terms of industrial discharge, while Danjiangkou City and Yunxian County had high risk values in terms of agricultural pollution. Overall, the risk values of the northern areas close to the main stream and reservoir of the region of Shiyan were higher than those in the south. The risk levels indicated that five sources were at the lower risk level (level II), two at the moderate level (level III), one at the higher level (level IV), and three at the highest level (level V). Risks from industrial discharge were also higher than those from the agricultural sector. It is thus essential to manage the pillar industry of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce water pollution risks in source water areas.
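The entropy weight method mentioned above assigns larger weights to indicators whose values vary more across the assessed alternatives. A common formulation is sketched below; the normalization details and the data are illustrative assumptions, not the paper's:

```python
import math

def entropy_weights(matrix):
    """Entropy weights for an m x n decision matrix (m alternatives,
    n indicators, all values positive). Indicators with greater
    dispersion across alternatives receive larger weights."""
    m, n = len(matrix), len(matrix[0])
    divergence = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        # Shannon entropy normalized to [0, 1] by log(m).
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        divergence.append(1.0 - e)
    s = sum(divergence)
    return [d / s for d in divergence]

# Hypothetical pollution-risk indicators for four source areas: the
# first indicator is constant, the second varies strongly.
data = [[0.2, 30.0], [0.2, 10.0], [0.2, 50.0], [0.2, 90.0]]
w = entropy_weights(data)  # nearly all weight on the varying indicator
```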
Forcing scheme analysis for the axisymmetric lattice Boltzmann method under incompressible limit.
Zhang, Liangqi; Yang, Shiliang; Zeng, Zhong; Chen, Jie; Yin, Linmao; Chew, Jia Wei
2017-04-01
Because the standard lattice Boltzmann (LB) method was formulated for Cartesian Navier-Stokes (NS) equations, additional source terms are necessary in the axisymmetric LB method to represent the axisymmetric effects. Therefore, the accuracy and applicability of the axisymmetric LB models depend on the forcing schemes adopted for discretization of the source terms. In this study, three forcing schemes, namely, the trapezium-rule-based scheme, the direct forcing scheme, and the semi-implicit centered scheme, are analyzed theoretically by investigating their derived macroscopic equations in the diffusive scale. In particular, the finite difference interpretation of the standard LB method is extended to the LB equations with source terms, and the accuracy of the different forcing schemes is then evaluated for the axisymmetric LB method. Theoretical analysis indicates that the discrete lattice effects arising from the direct forcing scheme are part of the truncation error terms and thus would not affect the overall accuracy of the standard LB method with a general force term (i.e., when only the source terms in the momentum equation are considered), but lead to incorrect macroscopic equations for the axisymmetric LB models. On the other hand, the trapezium-rule-based scheme and the semi-implicit centered scheme both have the advantage of avoiding the discrete lattice effects and recovering the correct macroscopic equations. Numerical tests applied to validate the theoretical analysis show that both the numerical stability and the accuracy of the axisymmetric LB simulations are affected by the direct forcing scheme, indicating that forcing schemes free of the discrete lattice effects are necessary for the axisymmetric LB method.
Francová, Anna; Chrastný, Vladislav; Šillerová, Hana; Vítková, Martina; Kocourková, Jana; Komárek, Michael
2017-01-01
Samples of lichens, snow and particulate matter (PM10, 24 h) were used for source identification of air pollution in the heavily industrialized region of Ostrava, Upper Silesia, Czech Republic. An integrated approach using different environmental samples for metal concentration and Pb isotope analyses was applied. The broad range of isotope ratios in the samples indicates a combination of different pollution sources, the strongest among them being the metallurgical industry, bituminous coal combustion and traffic. Snow samples proved to be the most relevant indicator for tracing metal(loid)s and recent local contamination in the atmosphere. Lichens can be successfully used as tracers of the long-term activity of local and remote sources of contamination. The combination of PM10 with snow can provide very useful information for evaluating current pollution sources. Copyright © 2016 Elsevier Ltd. All rights reserved.
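A common way to apportion Pb between two dominant endmember sources from an isotope ratio is a binary mixing model. A minimal sketch follows, with purely illustrative ratio values; the actual endmember ratios for coal combustion and metallurgy would have to come from the measured data:

```python
def binary_mixing_fraction(r_sample, r_a, r_b):
    """Fraction of Pb from source A in a two-endmember mixture,
    based on a measured isotope ratio (e.g. 206Pb/207Pb).

    Assumes simple linear mixing of the ratio, which holds when
    the two endmembers have comparable Pb concentrations.
    """
    return (r_sample - r_b) / (r_a - r_b)

# Hypothetical endmembers: coal combustion (A) vs. metallurgical dust (B).
f_coal = binary_mixing_fraction(r_sample=1.17, r_a=1.19, r_b=1.15)
```

With the sample ratio exactly halfway between the endmembers, the model attributes half of the Pb to each source.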
Two-micron Laser Atmospheric Wind Sounder (LAWS) pointing/tracking study
NASA Technical Reports Server (NTRS)
Manlief, Scott
1995-01-01
The objective of the study was to identify and model major sources of short-term pointing jitter for a free-flying, full-performance 2 micron LAWS system and to evaluate the impact of the short-term jitter on wind-measurement performance. A fast-steering-mirror control system was designed for short-term jitter compensation. The performance analysis showed that the short-term jitter of the control system over the 5.2 msec round-trip time for a realistic spacecraft environment was approximately 0.3 microradian rms, within the specified value of less than 0.5 microradian rms derived in a 2 micron LAWS System Study. Disturbance modes were defined for: (1) the Bearing and Power Transfer Assembly (BAPTA) scan bearing, (2) the spacecraft reaction wheel torques, and (3) the solar array drive torques. The scan bearing disturbance was found to be the noise source contributing most to the jitter. Disturbances from the fast steering mirror reaction torques and boom-mounted cross-link antenna clocking were also considered but were judged to be small compared with the three principal disturbance sources above and were not included in the final control analysis.
Natural convection in symmetrically heated vertical parallel plates with discrete heat sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manca, O.; Nardini, S.; Naso, V.
Laminar air natural convection in a symmetrically heated vertical channel with uniform flush-mounted discrete heat sources has been experimentally investigated. The effects of the location and number of the heated strips are pointed out in terms of the maximum wall temperatures. Flow visualization in the entrance region of the channel was carried out, and air temperatures and velocities in two cross sections were measured. Dimensionless local heat transfer coefficients have been evaluated, and monomial correlations among the relevant parameters have been derived in the local Rayleigh number range 10 to 10^6. The channel Nusselt number has been correlated in a polynomial form in terms of the channel Rayleigh number.
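A monomial correlation of the kind mentioned, Nu = a·Ra^b, is typically obtained by a least-squares fit in log-log coordinates. A minimal sketch with synthetic data follows; the coefficients 0.6 and 0.25 are illustrative values, not the paper's:

```python
import numpy as np

def fit_monomial(ra, nu):
    """Fit Nu = a * Ra**b by least squares in log-log space,
    where the power law becomes a straight line."""
    b, log_a = np.polyfit(np.log(ra), np.log(nu), 1)
    return np.exp(log_a), b

# Synthetic data following Nu = 0.6 * Ra**0.25 (illustrative only).
ra = np.logspace(1, 6, 20)
nu = 0.6 * ra**0.25
a, b = fit_monomial(ra, nu)
```

On exact power-law data the fit recovers the prefactor and exponent to machine precision; with experimental scatter it returns the best-fit pair.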
Radionuclides in the Arctic seas from the former Soviet Union: Potential health and ecological risks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Layton, D W; Edson, R; Varela, M
1999-11-15
The primary goal of the assessment reported here is to evaluate the health and environmental threat to coastal Alaska posed by radioactive-waste dumping in the Arctic and Northwest Pacific Oceans by the FSU. In particular, the FSU discarded 16 nuclear reactors from submarines and an icebreaker in the Kara Sea near the island of Novaya Zemlya, of which 6 contained spent nuclear fuel (SNF); disposed of liquid and solid wastes in the Sea of Japan; lost a 90Sr-powered radioisotope thermoelectric generator in the Sea of Okhotsk; and disposed of liquid wastes at several sites in the Pacific Ocean, east of the Kamchatka Peninsula. In addition to these known sources in the oceans, the RAIG evaluated FSU waste-disposal practices at inland weapons-development sites that have contaminated major rivers flowing into the Arctic Ocean. The RAIG evaluated these sources for the potential for release to the environment, transport, and impact on Alaskan ecosystems and peoples through a variety of scenarios, including a worst-case total instantaneous and simultaneous release of the sources under investigation. The risk-assessment process described in this report is applicable to and can be used by other circumpolar countries, with the addition of information about specific ecosystems and human lifestyles. They can use the ANWAP risk-assessment framework and approach used by ONR to establish potential doses for Alaska, but add their own specific data sets on human and ecological factors. The ANWAP risk assessment addresses the following Russian wastes, media, and receptors: dumped nuclear submarines and icebreaker in the Kara Sea--marine pathways; solid reactor parts in the Sea of Japan and Pacific Ocean--marine pathways; thermoelectric generator in the Sea of Okhotsk--marine pathways; current known aqueous wastes in the Mayak reservoirs and Asanov Marshes--riverine to marine pathways; and Alaska as receptor. 
For the waste and source terms addressed, other pathways, such as atmospheric transport, could be considered in future funded research efforts on impacts to Alaska. The ANWAP risk assessment does not address the following wastes, media, and receptors: radioactive sources in Alaska (except to add perspective for the Russian source term); radioactive wastes associated with Russian naval military operations and decommissioning; nonaqueous source terms from Russian production reactors and spent-fuel reprocessing facilities; atmospheric, terrestrial and other nonaqueous pathways; and dose calculations for any circumpolar locality other than Alaska. These other, potentially serious sources of radioactivity to the Arctic environment, while outside the scope of the current ANWAP mandate, should be considered for future research funding.
NASA Astrophysics Data System (ADS)
Lee, Seong-Sun; Lee, Seung Hyun; Lee, Kang-Kun
2016-04-01
Research on contamination by chlorinated ethenes such as trichloroethylene (TCE) at an industrial complex in Wonju, Korea, was carried out based on 17 rounds of groundwater quality data collected from 2009 to 2015. Remediation technologies such as soil vapor extraction, soil flushing, biostimulation, and pump-and-treat have been applied to eliminate the TCE contaminant sources and to prevent migration of the TCE plume from the remediation target zones to the groundwater discharge area, a stream. The remediation efficiency of the remedial actions was evaluated by tracing a time series of plume evolution and temporal mass discharge at three transects (Source, Transect-1, Transect-2) assigned along the groundwater flow path. In addition, based on long-term monitoring data, the dissolved TCE concentration and the mass of residual TCE in the initial stage of disposal were estimated to evaluate the efficiency of in situ remediation. Temporal and spatial monitoring before the remedial actions showed that a TCE plume originating from the main and local source zones was continuously discharged to the stream. However, after the intensive remedial actions of 2012 to 2013, the aqueous TCE concentrations at and around the main source areas decreased significantly. In particular, during the intensive remediation period, the average mass discharge at the source transect decreased from 26.58 g/day to 4.99 g/day. The estimated initial dissolved concentration and residual mass of TCE decreased rapidly after an intensive remedial action in 2013 and are expected to continue decreasing through 2020. This study demonstrates that long-term monitoring data are useful in assessing the effectiveness of remedial actions at a site contaminated with chlorinated ethenes. 
Acknowledgements: This project is supported by the Korea Ministry of Environment under "The GAIA Project (173-092-009)" and "R&D Project on Environmental Management of Geologic CO2 storage" from the KEITI (Project number: 2014001810003).
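Transect-based mass discharge of the kind used above is, in essence, the sum of concentration times flux times cell area over the monitoring cells of a transect. A minimal sketch with hypothetical cell values:

```python
def mass_discharge(conc_mg_per_L, darcy_flux_m_per_day, area_m2):
    """Contaminant mass discharge (g/day) across a transect,
    summed over monitoring cells: M = sum(C_i * q_i * A_i).

    1 mg/L equals 1 g/m^3, so the product comes out in g/day.
    """
    return sum(c * q * a for c, q, a in
               zip(conc_mg_per_L, darcy_flux_m_per_day, area_m2))

# Hypothetical three-cell transect.
m = mass_discharge(conc_mg_per_L=[0.5, 1.2, 0.1],
                   darcy_flux_m_per_day=[0.05, 0.04, 0.05],
                   area_m2=[20.0, 20.0, 20.0])
```

Comparing such sums before and after treatment, as done at the Source transect above, gives a flow-weighted measure of remediation performance that a point concentration alone cannot.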
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.
1995-04-01
This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal-computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated, and results are presented, for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also discuss specific accident analysis data and guidance used or consulted in this report.
McDonnell, S; Troiano, R P; Barker, N; Noji, E; Hlady, W G; Hopkins, R
1995-12-01
Two three-stage cluster surveys were conducted in South Dade County, Florida, 14 months apart, to assess recovery following Hurricane Andrew. Response rates were 75 per cent and 84 per cent. Sources of assistance used in recovery from Hurricane Andrew differed according to race, per capita income, ethnicity, and education. Reports of an improved living situation post-hurricane were not associated with receiving relief assistance, but reports of a worse situation were associated with loss of income, being exploited, or job loss. The number of households reporting problems with crime and community violence doubled between the two surveys. Disaster relief efforts had less impact on subjective long-term recovery than did job or income loss or housing repair difficulties. Existing sources of assistance were used more often than specific post-hurricane relief resources. The demographic make-up of a community may determine the most effective means of informing residents after a disaster and which sources of assistance are likely to be useful.
NASA Astrophysics Data System (ADS)
Nan, Tongchao; Li, Kaixuan; Wu, Jichun; Yin, Lihe
2018-04-01
Sustainability has been one of the key criteria for effective water exploitation. Groundwater exploitation and water-table decline at the Haolebaoji water source site in the Ordos basin in NW China have drawn public attention due to concerns about potential threats to ecosystems and grazing land in the area. To better investigate the impact of the production wells at Haolebaoji on the water table, an adapted algorithm called the random walk on grid method (WOG) is applied to simulate the hydraulic head in the unconfined and confined aquifers. This is the first attempt to apply WOG to a real groundwater problem. The method evaluates not only the head values but also the contribution made by each source/sink term, allowing the impact of the source/sink terms to be analyzed just as if an analytical solution were available. The head values evaluated by WOG match the values derived from the software Groundwater Modeling System (GMS). This suggests that WOG is effective and applicable to practical problems in a heterogeneous aquifer, and that the resultant information is useful for groundwater management.
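The idea behind a walk-on-grid estimate of hydraulic head can be illustrated in one dimension, where Laplace's equation with fixed heads at both ends has a linear exact solution. This is a toy sketch only, not the adapted WOG algorithm of the study (which also accounts for source/sink terms and heterogeneity):

```python
import random

def head_by_random_walk(k, n, h_left, h_right, walks=20000, seed=0):
    """Estimate the steady head at interior node k of a 1-D grid of
    n+1 nodes with fixed heads at both ends, using unbiased random
    walks.

    Each walk steps left or right with equal probability until it
    hits a boundary; the estimate averages the boundary head at
    which each walk is absorbed. For Laplace's equation the exact
    answer is linear in k.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(walks):
        pos = k
        while 0 < pos < n:
            pos += 1 if rng.random() < 0.5 else -1
        total += h_right if pos == n else h_left
    return total / walks

# Exact solution at k=3 of 10 is 100 + (90 - 100) * 3/10 = 97.0.
h = head_by_random_walk(k=3, n=10, h_left=100.0, h_right=90.0)
```

Because each walk contributes a boundary value weighted by its absorption probability, the same bookkeeping can tally contributions per boundary (or per source/sink), which is the feature the abstract highlights.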
Suspended-sediment sources in an urban watershed, Northeast Branch Anacostia River, Maryland
Devereux, Olivia H.; Prestegaard, Karen L.; Needelman, Brian A.; Gellis, Allen C.
2010-01-01
Fine sediment sources were characterized by chemical composition in an urban watershed, the Northeast Branch Anacostia River, which drains to the Chesapeake Bay. Concentrations of 63 elements and two radionuclides were measured in possible land-based sediment sources and in suspended sediment collected from the water column at the watershed outlet during storm events. These tracer concentrations were used to determine the relative quantity of suspended sediment contributed by each source. Although this is an urbanized watershed, there was no distinct urban signature that could be evaluated except for the contributions from road surfaces. We identified the sources of fine sediment both by physiographic province (Piedmont and Coastal Plain) and by source locale (streambanks, upland and street residue) by using different sets of elemental tracers. The Piedmont contributed the majority of the fine sediment for seven of the eight measured storms. The streambanks contributed the greatest quantity of fine sediment when evaluated by source locale. Street residue contributed 13% of the total suspended sediment on average and was the source most concentrated in anthropogenically enriched elements. Combining results from the source locale and physiographic province analyses, most fine sediment in the Northeast Branch watershed is derived from streambanks that contain sediment eroded from the Piedmont physiographic province of the watershed. Sediment fingerprinting analyses are most useful when longer term evaluations of sediment erosion and storage are also available from streambank-erosion measurements, sediment budgets, and other methods.
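Sediment fingerprinting of this kind reduces to an unmixing problem: find source fractions whose tracer mix best reproduces the suspended-sediment tracers. A minimal least-squares sketch with a synthetic three-source, three-tracer example follows; the tracer values are hypothetical, and operational fingerprinting models add constraints, tracer selection and uncertainty analysis:

```python
import numpy as np

def unmix(sources, mixture):
    """Estimate relative source contributions from tracer data by
    least squares, then renormalize so fractions are non-negative
    and sum to 1.

    sources: (n_tracers, n_sources) mean tracer concentrations.
    mixture: (n_tracers,) tracer concentrations in the suspended
    sediment sampled at the outlet.
    """
    f, *_ = np.linalg.lstsq(np.asarray(sources, float),
                            np.asarray(mixture, float), rcond=None)
    f = np.clip(f, 0.0, None)
    return f / f.sum()

# Hypothetical tracers (rows) for streambank, upland, street sources.
S = np.array([[10.0, 40.0,  5.0],
              [ 2.0,  1.0,  9.0],
              [30.0, 25.0, 60.0]])
mix = S @ np.array([0.6, 0.3, 0.1])   # synthetic mixture, known fractions
frac = unmix(S, mix)
```

On noise-free synthetic data the known fractions are recovered exactly; with field data the residual indicates how well the chosen tracers discriminate the sources.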
NASA Astrophysics Data System (ADS)
Johnson, Lawrence; Ferry, Cécile; Poinssot, Christophe; Lovera, Patrick
2005-11-01
A source-term model for the short-term release of radionuclides from spent nuclear fuel (SNF) has been developed. It provides quantitative estimates of the fraction of various radionuclides that are expected to be released rapidly (the instant release fraction, or IRF) when water contacts the UO2 or MOX fuel after container breaching in a geological repository. The estimates are based on correlation of leaching data for radionuclides with fuel burnup and fission gas release. Extrapolation of the data to higher fuel burnup values is based on examination of data on fuel restructuring, such as rim development, and on fission gas release data, which permits bounding IRF values to be estimated assuming that radionuclide releases will be less than fission gas release. The consideration of long-term solid-state changes influencing the IRF prior to canister breaching is addressed by evaluating alpha self-irradiation enhanced diffusion, which may gradually increase the accumulation of fission products at grain boundaries.
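The bounding logic can be written compactly. Assuming, as the abstract states, that the rapid release of a labile radionuclide j cannot exceed the fission gas release (FGR) at the same burnup (BU), and taking a linear correlation form purely for illustration (the notation and functional form are assumptions, not taken from the paper):

```latex
\mathrm{IRF}_j(\mathrm{BU}) = a_j + b_j\,\mathrm{FGR}(\mathrm{BU}),
\qquad
\mathrm{IRF}_j(\mathrm{BU}) \le \mathrm{FGR}(\mathrm{BU})
```

The bound is what makes the extrapolation to high burnup defensible: whatever the fitted correlation predicts, the IRF estimate is capped by the measured or extrapolated fission gas release.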
Mapping water availability, projected use and cost in the western United States
NASA Astrophysics Data System (ADS)
Tidwell, Vincent C.; Moreland, Barbara D.; Zemlick, Katie M.; Roberts, Barry L.; Passell, Howard D.; Jensen, Daniel; Forsgren, Christopher; Sehlke, Gerald; Cook, Margaret A.; King, Carey W.; Larsen, Sara
2014-05-01
New demands for water can be satisfied through a variety of source options. In some basins surface and/or groundwater may be available through permitting with the state water management agency (termed unappropriated water), alternatively water might be purchased and transferred out of its current use to another (termed appropriated water), or non-traditional water sources can be captured and treated (e.g., wastewater). The relative availability and cost of each source are key factors in the development decision. Unfortunately, these measures are location dependent with no consistent or comparable set of data available for evaluating competing water sources. With the help of western water managers, water availability was mapped for over 1200 watersheds throughout the western US. Five water sources were individually examined, including unappropriated surface water, unappropriated groundwater, appropriated water, municipal wastewater and brackish groundwater. Also mapped was projected change in consumptive water use from 2010 to 2030. Associated costs to acquire, convey and treat the water, as necessary, for each of the five sources were estimated. These metrics were developed to support regional water planning and policy analysis with initial application to electric transmission planning in the western US.
Modeling Vortex Generators in a Navier-Stokes Code
NASA Technical Reports Server (NTRS)
Dudek, Julianne C.
2011-01-01
A source-term model that simulates the effects of vortex generators was implemented into the Wind-US Navier-Stokes code. The source term added to the Navier-Stokes equations simulates the lift force that would result from a vane-type vortex generator in the flowfield. The implementation is user-friendly, requiring the user to specify only three quantities for each desired vortex generator: the range of grid points over which the force is to be applied and the planform area and angle of incidence of the physical vane. The model behavior was evaluated for subsonic flow in a rectangular duct with a single vane vortex generator, subsonic flow in an S-duct with 22 corotating vortex generators, and supersonic flow in a rectangular duct with a counter-rotating vortex-generator pair. The model was also used to successfully simulate microramps in supersonic flow by treating each microramp as a pair of vanes with opposite angles of incidence. The validation results indicate that the source-term vortex-generator model provides a useful tool for screening vortex-generator configurations and gives comparable results to solutions computed using gridded vanes.
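A vane-type vortex-generator source term of the general kind described can be sketched as a lift force computed from the local flow and the vane's planform area and incidence, distributed over the tagged grid cells. The thin-airfoil lift slope and the even force split below are illustrative assumptions, not the actual Wind-US formulation:

```python
import numpy as np

def vg_lift_source(u, rho, vane_area, alpha_deg, n_hat, cells):
    """Distribute a vane lift force as a momentum source term.

    Lift ~ 0.5 * rho * |u|^2 * A * C_L(alpha), with a thin-airfoil
    slope C_L = 2*pi*alpha (alpha in radians). The force acts along
    n_hat, normal to the local flow, split evenly over the tagged
    cells. This is a simplified sketch of the idea only.
    """
    alpha = np.radians(alpha_deg)
    cl = 2.0 * np.pi * alpha
    lift = 0.5 * rho * np.dot(u, u) * vane_area * cl
    n = np.asarray(n_hat, dtype=float)
    n = n / np.linalg.norm(n)
    return [lift / len(cells) * n for _ in cells]

# Hypothetical vane: 16 deg incidence in a 50 m/s stream, 4 cells.
forces = vg_lift_source(u=np.array([50.0, 0.0, 0.0]), rho=1.2,
                        vane_area=1e-3, alpha_deg=16.0,
                        n_hat=[0.0, 1.0, 0.0], cells=range(4))
```

Reversing the sign of the incidence angle reverses the force, which is how the microramp case above can be treated as a pair of opposed vanes.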
EVALUATING THE POTENTIAL FOR CHLORINATED SOLVENT DEGRADATION FROM HYDROGEN CONCENTRATIONS
Long-term monitoring of a large trichloroethylene (TCE) and 1,1,1-trichloroethane (TCA) ground water plume in Minnesota indicated that these contaminants attenuated with distance from the source. Mathematical modelling indicated that sufficient time had passed for the plume to fu...
Energy requirement for the production of silicon solar arrays
NASA Technical Reports Server (NTRS)
Lindmayer, J.; Wihl, M.; Scheinne, A.; Morrison, A. D.
1977-01-01
Photovoltaics is the subject of an extensive technology assessment in terms of its net energy potential as an alternate energy source. Reduction of quartzite pebbles, refinement, crystal growth, cell processing and panel building are evaluated for energy expenditure in terms of direct, indirect, and overhead energies.
Understanding the Climate of Deceit.
ERIC Educational Resources Information Center
Kincheloe, Joe L.; Staley, George
1983-01-01
Briefly discusses propaganda of the past four decades, defines the term, reviews its earliest uses, and outlines today's propaganda vehicles--mass media, special interest groups, and marketing techniques. A propaganda analysis program for educating today's youth is proposed which includes eight questions for evaluating the source of media…
Possible consequences of severe accidents at the Lubiatowo site, Poland
NASA Astrophysics Data System (ADS)
Seibert, Petra; Philipp, Anne; Hofman, Radek; Gufler, Klaus; Sholly, Steven
2014-05-01
The construction of a nuclear power plant is under consideration in Poland. One of the sites under discussion is near Lubiatowo, located on the Baltic Sea coast northwest of Gdansk. An assessment of possible environmental consequences is carried out for 88 real meteorological cases with the Lagrangian particle dispersion model FLEXPART. Based on literature research, three reactor designs (ABWR, EPR, AP 1000) were identified as being under discussion in Poland. For each of the designs, a set of accident scenarios was evaluated and two source terms per reactor design were selected for analysis. One of the selected source terms was a relatively large release, while the second was a severe accident with an intact containment. The endpoints considered in the calculations are ground contamination with Cs-137 and time-integrated concentrations of I-131 in air, as well as committed doses. They are evaluated on a grid of ca. 3 km mesh size covering eastern Central Europe.
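The committed-dose endpoint is, at its core, the product of a time-integrated air concentration, a breathing rate, and a nuclide-specific inhalation dose coefficient. A minimal sketch with order-of-magnitude example values, not values from the study:

```python
def committed_inhalation_dose(tic_bq_s_per_m3, breathing_m3_per_s,
                              dose_coeff_sv_per_bq):
    """Committed dose (Sv) from a time-integrated air concentration:
    dose = TIC * breathing rate * inhalation dose coefficient."""
    return tic_bq_s_per_m3 * breathing_m3_per_s * dose_coeff_sv_per_bq

# Illustrative I-131 numbers: TIC of 1e6 Bq s/m3, an adult breathing
# rate of roughly 2.57e-4 m3/s, and an inhalation dose coefficient of
# roughly 7.4e-9 Sv/Bq (order-of-magnitude values only).
dose = committed_inhalation_dose(1e6, 2.57e-4, 7.4e-9)
```

The dispersion model supplies the time-integrated concentration field on the grid; the other two factors convert each grid cell's value into dose.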
Physical/chemical closed-loop water-recycling
NASA Technical Reports Server (NTRS)
Herrmann, Cal C.; Wydeven, Theodore
1991-01-01
Water needs, water sources, and means for recycling water are examined in terms appropriate to the water quality requirements of a small crew and spacecraft intended for long duration exploration missions. Inorganic, organic, and biological hazards are estimated for waste water sources. Sensitivities to these hazards for human uses are estimated. The water recycling processes considered are humidity condensation, carbon dioxide reduction, waste oxidation, distillation, reverse osmosis, pervaporation, electrodialysis, ion exchange, carbon sorption, and electrochemical oxidation. Limitations and applications of these processes are evaluated in terms of water quality objectives. Computerized simulation of some of these chemical processes is examined. Recommendations are made for development of new water recycling technology and improvement of existing technology for near term application to life support systems for humans in space. The technological developments are equally applicable to water needs on Earth, in regions where extensive water recycling is needed or where advanced water treatment is essential to meet EPA health standards.
Schedl, Markus
2012-01-01
Different term weighting techniques such as tf-idf or BM25 have been used intensively for manifold text-based information retrieval tasks. Their use for modeling term profiles for named entities, and the subsequent calculation of similarities between these named entities, has been studied to a much smaller extent. The recent trend of microblogging has made available massive amounts of information about almost every topic around the world. Microblogs therefore represent a valuable source for text-based named entity modeling. In this paper, we present a systematic and comprehensive evaluation of different term weighting measures, normalization techniques, query schemes, index term sets, and similarity functions for the task of inferring similarities between named entities, based on data extracted from microblog posts. We analyze several thousand combinations of choices for the above-mentioned dimensions, which influence the similarity calculation process, and we investigate how they impact the quality of the similarity estimates. Evaluation is performed using three real-world data sets: two collections of microblogs related to music artists and one related to movies. For the music collections, we present results of genre classification experiments using genre information from allmusic.com as benchmark. For the movie collection, we present results of multi-class classification experiments using categories from IMDb as benchmark. We show that microblogs can indeed be exploited to model named entity similarity with remarkable accuracy, provided the correct settings for the analyzed aspects are used. We further compare the results to those obtained when using Web pages as the data source.
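A minimal sketch of one cell of the evaluated design space: raw term-frequency weighting with a logarithmic idf and cosine similarity, where each "document" aggregates the microblog posts about one entity. The toy posts are invented, and the paper evaluates many more weighting, normalization and query variants than this:

```python
import math
from collections import Counter

def tfidf_profiles(docs):
    """Build tf-idf term profiles, one per named entity, where each
    document is the concatenation of posts about that entity."""
    n = len(docs)
    df = Counter()
    tfs = []
    for doc in docs:
        tf = Counter(doc.split())
        tfs.append(tf)
        df.update(tf.keys())      # document frequency per term
    return [{t: tf[t] * math.log(n / df[t]) for t in tf} for tf in tfs]

def cosine(p, q):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(w * q.get(t, 0.0) for t, w in p.items())
    norm_p = math.sqrt(sum(w * w for w in p.values()))
    norm_q = math.sqrt(sum(w * w for w in q.values()))
    return dot / (norm_p * norm_q) if norm_p and norm_q else 0.0

profiles = tfidf_profiles([
    "guitar rock album tour",       # entity A (invented posts)
    "guitar rock single tour",      # entity B
    "orchestra symphony concerto",  # entity C
])
sim_ab = cosine(profiles[0], profiles[1])
sim_ac = cosine(profiles[0], profiles[2])
```

The two rock-leaning entities come out more similar to each other than either is to the orchestral one, which is exactly the property the genre-classification benchmark tests at scale.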
Wright, Chris; Heneghan, Nicola; Eveleigh, Gillian; Calvert, Melanie; Freemantle, Nick
2011-01-01
Objective To evaluate the effectiveness of physiotherapy management in patients experiencing whiplash associated disorder II, on clinically relevant outcomes in the short and longer term. Design Systematic review and meta-analysis. Two reviewers independently searched information sources, assessed studies for inclusion, evaluated risk of bias and extracted data. A third reviewer mediated disagreement. Assessment of risk of bias was tabulated across included trials. Quantitative synthesis was conducted on comparable outcomes across trials with similar interventions. Meta-analyses compared effect sizes, with random effects as the primary analyses. Data sources Predefined terms were employed to search electronic databases. Additional studies were identified from key journals, reference lists, authors and experts. Eligibility criteria for selecting studies Randomised controlled trials (RCTs) published in English before 31 December 2010 evaluating physiotherapy management of patients (>16 years) experiencing whiplash associated disorder II. Any physiotherapy intervention was included, when compared with other types of management, placebo/sham, or no intervention. Measurements reporting on ≥1 outcome from the domains within the International Classification of Functioning, Disability and Health were included. Results 21 RCTs (2126 participants, 9 countries) were included. Interventions were categorised as active physiotherapy or a specific physiotherapy intervention. 20/21 trials were evaluated as high risk of bias and one as unclear. 1395 participants were incorporated in the meta-analyses on 12 trials. In evaluating short term outcome in the acute/sub-acute stage, there was some evidence that active physiotherapy intervention reduces pain and improves range of movement, and that a specific physiotherapy intervention may reduce pain. However, moderate/considerable heterogeneity suggested that treatments may differ in nature or effect in different trial patients. 
Differences between participants, interventions and trial designs limited potential meta-analyses. Conclusions Inconclusive evidence exists for the effectiveness of physiotherapy management for whiplash associated disorder II. There is potential benefit for improving range of movement and pain short term through active physiotherapy, and for improving pain through a specific physiotherapy intervention. PMID:22102642
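The random-effects pooling used in these meta-analyses can be sketched with the DerSimonian-Laird estimator. The two effect sizes and variances below are invented, purely to show the mechanics:

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird).

    Estimates between-trial heterogeneity tau^2 from Cochran's Q,
    then pools with weights 1 / (v_i + tau^2).
    """
    k = len(effects)
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sw
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)
    wr = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(wr, effects)) / sum(wr)
    return pooled, tau2

# Two invented trials: effect sizes 0.3 and 0.5, equal variance 0.1.
pooled, tau2 = dersimonian_laird([0.3, 0.5], [0.1, 0.1])
```

When Q does not exceed its degrees of freedom, tau^2 is truncated to zero and the estimate reduces to the fixed-effect inverse-variance mean; large tau^2 is the numerical counterpart of the "moderate/considerable heterogeneity" flagged in the abstract.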
Nutrient concentrations and loads in the northeastern United States - Status and trends, 1975-2003
Trench, Elaine C. Todd; Moore, Richard B.; Ahearn, Elizabeth A.; Mullaney, John R.; Hickman, R. Edward; Schwarz, Gregory E.
2012-01-01
The U.S. Geological Survey (USGS) National Water-Quality Assessment Program (NAWQA) began regional studies in 2003 to synthesize information on nutrient concentrations, trends, stream loads, and sources. In the northeastern United States, a study area that extends from Maine to central Virginia, nutrient data were evaluated for 130 USGS water-quality monitoring stations. Nutrient data were analyzed for trends in flow-adjusted concentrations, modeled instream (non-flow-adjusted) concentrations, and stream loads for 32 stations with 22 to 29 years of water-quality and daily mean streamflow record during 1975-2003 (termed the long-term period), and for 46 stations during 1993-2003 (termed the recent period), by using a coupled statistical model of streamflow and water quality developed by the USGS. Recent trends in flow-adjusted concentrations of one or more nutrients also were analyzed for 90 stations by using Tobit regression. Annual stream nutrient loads were estimated, and annual nutrient yields were calculated, for 47 stations for the long-term and recent periods, and for 37 additional stations that did not have a complete streamflow and water-quality record for 1993-2003. Nutrient yield information was incorporated for 9 drainage basins evaluated in a national NAWQA study, for a total of 93 stations evaluated for nutrient yields. Long-term downward trends in flow-adjusted concentrations of total nitrogen and total phosphorus (18 and 19 of 32 stations, respectively) indicate regional improvements in nutrient-related water-quality conditions. Most of the recent trends detected for total phosphorus were upward (17 of 83 stations), indicating possible reversals to the long-term improvements. Concentrations of nutrients in many streams persist at levels that are likely to affect aquatic habitat adversely and promote freshwater or coastal eutrophication. 
Recent trends for modeled instream concentrations, and modeled reference concentrations, were evaluated relative to ecoregion-based nutrient criteria proposed by the U.S. Environmental Protection Agency. Instream concentrations of total nitrogen and total phosphorus persist at levels higher than proposed criteria at more than one-third and about one-half, respectively, of the 46 stations analyzed. Long-term trends in nutrient loads were primarily downward, with downward trends in total nitrogen and total phosphorus loads detected at 12 and 17 of 32 stations, respectively. Upward trends were rare, with one upward trend for total nitrogen loads and none for total phosphorus. Trends in loads of nitrite-plus-nitrate nitrogen included 7 upward and 8 downward trends among 32 stations. Downward trends in loads of ammonia nitrogen and total Kjeldahl nitrogen were detected at all six stations evaluated. Long-term downward trends detected in four of the five largest drainage basins evaluated include: total nitrogen loads for the Connecticut, Delaware, and James Rivers; total Kjeldahl nitrogen and ammonia nitrogen loads for the Susquehanna River; ammonia nitrogen and nitrite-plus-nitrate nitrogen loads for the James River; and total phosphorus loads for the Connecticut and Delaware Rivers. No trends in load were detected for the Potomac River. Nutrient yields were evaluated relative to the extent of land development in 93 drainage basins. The undeveloped land-use category included forested drainage basins with undeveloped land ranging from 75 to 100 percent of basin area. Median total nitrogen yields for the 27 undeveloped drainage basins evaluated, including 9 basins evaluated in a national NAWQA study, ranged from 290 to 4,800 pounds per square mile per year (lb/mi2/yr). 
Total nitrogen yields even in the most pristine drainage basins may be elevated relative to natural conditions, because of high rates of atmospheric deposition of nitrogen in parts of the northeastern United States. Median total phosphorus yields ranged from 12 to 330 lb/mi2/yr for the 26 undeveloped basins evaluated. The undeveloped category includes some large drainage basins with point-source discharges and small percentages of developed land; in these basins, streamflow from undeveloped headwater areas dilutes streamflow in more urbanized reaches, and dampens but does not eliminate the point-source "signal" of higher nutrient loads. Median total nitrogen yields generally do not exceed 1,700 lb/mi2/yr, and median total phosphorus yields generally do not exceed 100 lb/mi2/yr, in the drainage basins that are least affected by human land-use and waste-disposal practices. Agricultural and urban land use has increased nutrient yields substantially relative to undeveloped drainage basins. Median total nitrogen yields for 24 agricultural basins ranged from 1,700 to 26,000 lb/mi2/yr, and median total phosphorus yields ranged from 94 to 1,000 lb/mi2/yr. The maximum estimated total nitrogen and total phosphorus yields, 32,000 and 16,000 lb/mi2/yr, respectively, for all stations in the region were in small (less than 50 square miles (mi2)) agricultural drainage basins. Median total nitrogen yields ranged from 1,400 to 17,000 lb/mi2/yr in 26 urbanized drainage basins, and median total phosphorus yields ranged from 43 to 1,900 lb/mi2/yr. Urbanized drainage basins with the highest nutrient yields are generally small (less than 300 mi2) and are drained by streams that receive major point-source discharges. 
Instream nutrient loads were evaluated relative to loads from point-source discharges in four drainage basins: the Quinebaug River Basin in Connecticut, Massachusetts, and Rhode Island; the Raritan River Basin in New Jersey; the Patuxent River Basin in Maryland; and the James River Basin in Virginia. Long-term downward trends in nutrient loads, coupled with similar trends in flow-adjusted nutrient concentrations, indicate long-term reductions in the delivery of most nutrients to these streams. However, the absence of recent downward trends in load for most nutrients, coupled with instream concentrations that exceed proposed nutrient criteria in several of these waste-receiving streams, indicates that challenges remain in reducing delivery of nutrients to streams from point sources. During dry years, the total nutrient load from point sources in some of the drainage basins approached or equaled the nutrient load transported by the stream.
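The load and yield quantities underlying these comparisons follow from two simple conversions. A sketch in metric units follows; the report itself works in pounds per square mile per year, which differs only by constant unit factors:

```python
def load_kg_per_day(conc_mg_per_L, flow_m3_per_s):
    """Instantaneous nutrient load. Since mg/L equals g/m^3,
    C * Q gives g/s; times 86400 s/day and divided by 1000
    gives kg/day, hence the factor 86.4."""
    return conc_mg_per_L * flow_m3_per_s * 86.4

def yield_kg_per_km2_per_yr(annual_load_kg, basin_area_km2):
    """Annual yield normalizes the load by drainage area, which is
    what allows basins of different sizes to be compared."""
    return annual_load_kg / basin_area_km2

# Hypothetical station: 2 mg/L total nitrogen at 10 m3/s in a
# 500 km2 basin.
daily = load_kg_per_day(2.0, 10.0)            # 1728 kg/day
yearly_yield = yield_kg_per_km2_per_yr(daily * 365, 500.0)
```

In practice the daily loads come from a statistical concentration-streamflow model rather than a single sample, but the unit arithmetic is the same.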
Bayesian source term estimation of atmospheric releases in urban areas using LES approach.
Xue, Fei; Kikumoto, Hideki; Li, Xiaofeng; Ooka, Ryozo
2018-05-05
The estimation of source information from limited measurements of a sensor network is a challenging inverse problem, which can be viewed as an assimilation process between the observed concentration data and the predicted concentration data. When dealing with releases in built-up areas, the predicted data are generally obtained from the Reynolds-averaged Navier-Stokes (RANS) equations, which yield building-resolving results; however, RANS-based models are outperformed by large-eddy simulation (LES) in predicting both airflow and dispersion. It is therefore important to explore whether the LES approach can improve the estimation of the source parameters. In this paper, a novel source term estimation method based on the LES approach is proposed using Bayesian inference. The source-receptor relationship is obtained by solving the adjoint equations constructed from the time-averaged flow field simulated by the LES approach under the gradient diffusion hypothesis. A wind tunnel experiment with a constant point source downwind of a single building model is used to evaluate the performance of the proposed method, which is compared with that of an existing method using a RANS model. The results show that the proposed method reduces the errors in source location and release strength by 77% and 28%, respectively. Copyright © 2018 Elsevier B.V. All rights reserved.
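The Bayesian assimilation step described above can be illustrated independently of any CFD solver. The sketch below is a toy 1-D stand-in: a hypothetical algebraic forward model replaces the LES-derived source-receptor relationship, and the grid prior, noise level, and sensor layout are all illustrative assumptions, not values from the study.

```python
import math
import random

# Toy 1-D stand-in for Bayesian source term estimation: sensors observe a
# point source through a hypothetical algebraic forward model (a proxy for
# the LES-derived source-receptor relationship).
random.seed(0)
true_xs, true_q = 3.0, 2.0            # source location and strength
sensors = [0.0, 1.0, 2.0, 4.0, 5.0]   # sensor positions
sigma = 0.05                          # measurement noise (std)

def forward(xs, q, x):
    """Predicted concentration at sensor x for source (xs, q)."""
    return q / (1.0 + (x - xs) ** 2)

obs = [forward(true_xs, true_q, x) + random.gauss(0.0, sigma) for x in sensors]

# Uniform prior over a discrete grid of candidates; posterior ∝ likelihood.
grid = [(i / 10.0, j / 10.0) for i in range(0, 61) for j in range(1, 41)]

def log_likelihood(xs, q):
    return -sum((o - forward(xs, q, x)) ** 2
                for o, x in zip(obs, sensors)) / (2.0 * sigma ** 2)

# Maximum a posteriori estimate over the candidate grid.
map_xs, map_q = max(grid, key=lambda c: log_likelihood(*c))
```

With a sharp likelihood (small sigma), the MAP estimate lands on or next to the true grid cell; a full posterior over the grid would also quantify the location/strength uncertainty the paper reports.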
Source term evaluation for combustion modeling
NASA Technical Reports Server (NTRS)
Sussman, Myles A.
1993-01-01
A modification is developed for application to the source terms used in combustion modeling. The modification accounts for the error of the finite difference scheme in regions where chain-branching chemical reactions produce exponential growth of species densities. The modification is first applied to a one-dimensional scalar model problem. It is then generalized to multiple chemical species, and used in quasi-one-dimensional computations of shock-induced combustion in a channel. Grid refinement studies demonstrate the improved accuracy of the method using this modification. The algorithm is applied in two spatial dimensions and used in simulations of steady and unsteady shock-induced combustion. Comparisons with ballistic range experiments give confidence in the numerical technique and the 9-species hydrogen-air chemistry model.
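The core difficulty, finite-difference error where the chemical source term drives exponential growth, can be seen in a one-dimensional scalar model problem like the one the paper starts from. The sketch below contrasts a plain explicit source treatment with an exponentially fitted update; it illustrates the general idea only and is not the paper's actual modification.

```python
import math

# Scalar model of chain-branching growth dy/dt = k*y: a plain explicit
# update accumulates discretization error over the growth region, while an
# exponentially fitted source update reproduces the growth exactly.
k, dt, steps = 5.0, 0.1, 10
y_euler = 1.0
y_fitted = 1.0
for _ in range(steps):
    y_euler += dt * k * y_euler      # explicit Euler source treatment
    y_fitted *= math.exp(k * dt)     # exponentially fitted treatment

y_exact = math.exp(k * dt * steps)   # exact solution at t = 1.0
```

For this stiff growth rate the explicit update underpredicts the exact solution badly, which is exactly the regime where grid refinement alone is expensive and a source-term correction pays off.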
Coordinated Home Care Training Manual.
ERIC Educational Resources Information Center
Michigan Univ., Ann Arbor. Home Care Training Center.
This manual is intended as a source of information and assistance in the planning, organization, implementation, and evaluation of home care programs. There are ten major sections: (1) Introduction (review of the history of home care and definition of pertinent terms), (2) Program Planning, (3) Organizational Structure, (4) Coordination and…
USDA-ARS?s Scientific Manuscript database
A long-term systems trial was established to evaluate management practices for organic production of northern highbush blueberry (Vaccinium corymbosum L.). The factorial experiment included two planting bed treatments (flat and raised beds), source and rate of fertilizer (feather meal and fish emuls...
NASA Astrophysics Data System (ADS)
García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.
2007-10-01
A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and have also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis focuses on three sites of the province, namely the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zoning appears strongly site-dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although the contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
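The logic-tree mechanics behind such a PSHA can be sketched in a few lines: each branch carries a weight and a hazard (exceedance-rate) curve, the branches are combined as a weighted mean, and the design PGA is read off at the target return period. The branch weights and exponential rate curves below are illustrative placeholders, not the study's source zonings or ground-motion models.

```python
import math

# Illustrative logic-tree combination: three ground-motion branches with
# weights summing to 1; the exceedance-rate curves are placeholders.
branches = [
    (0.40, lambda a: 0.01 * math.exp(-3.0 * a)),
    (0.35, lambda a: 0.01 * math.exp(-2.5 * a)),
    (0.25, lambda a: 0.01 * math.exp(-3.5 * a)),
]

def annual_rate(pga):
    """Weighted-mean annual rate of exceeding a given PGA (in g)."""
    return sum(w * curve(pga) for w, curve in branches)

# PGA at the 475-year return period (10% probability in 50 years):
target = 1.0 / 475.0
pga_475 = 0.0
while annual_rate(pga_475) > target:
    pga_475 += 1e-4
```

The spread of the individual branch answers around the weighted mean is what the COV maps in the abstract quantify.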
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.C. Ryman
This calculation is a revision of a previous calculation (Ref. 7.5) that bears the same title and has the document identifier BBAC00000-01717-0210-00006 REV 01. The purpose of this revision is to remove TBV (to-be-verified)-4110 associated with the output files of the previous version (Ref. 7.30). The purpose of this and the previous calculation is to generate source terms for a representative boiling water reactor (BWR) spent nuclear fuel (SNF) assembly for the first one million years after the SNF is discharged from the reactors. This calculation includes an examination of several ways to represent BWR assemblies and operating conditions in SAS2H in order to quantify the effects these representations may have on source terms. These source terms provide information characterizing the neutron and gamma spectra in particles per second, the decay heat in watts, and radionuclide inventories in curies. Source terms are generated for a range of burnups and enrichments (see Table 2) that are representative of the waste stream and stainless steel (SS) clad assemblies. During this revision, it was determined that the burnups used for the computer runs of the previous revision were actually about 1.7% less than the stated, or nominal, burnups. See Section 6.6 for a discussion of how to account for this effect before using any source terms from this calculation. The source term due to the activation of corrosion products deposited on the surfaces of the assembly from the coolant is also calculated. The results of this calculation support many areas of the Monitored Geologic Repository (MGR), which include thermal evaluation, radiation dose determination, radiological safety analyses, surface and subsurface facility designs, and total system performance assessment. This includes MGR items classified as Quality Level 1, for example, the Uncanistered Spent Nuclear Fuel Disposal Container (Ref. 7.27, page 7).
Therefore, this calculation is subject to the requirements of the Quality Assurance Requirements and Description (Ref. 7.28). The performance of the calculation and development of this document are carried out in accordance with AP-3.124, "Design Calculation and Analyses" (Ref. 7.29).
Angular dependence of source-target-detector in active mode standoff infrared detection
NASA Astrophysics Data System (ADS)
Pacheco-Londoño, Leonardo C.; Castro-Suarez, John R.; Aparicio-Bolaños, Joaquín A.; Hernández-Rivera, Samuel P.
2013-06-01
Active mode standoff measurements using infrared spectroscopy were carried out in which the angle between the target and the source was varied from 0° to 70° with respect to the surface normal of substrates containing traces of highly energetic materials (explosives). The experiments were made using three infrared sources: a modulated source (Mod-FTIR), an unmodulated source (UnMod-FTIR) and a scanning quantum cascade laser (QCL), part of a dispersive mid-infrared (MIR) spectrometer. The targets consisted of PETN at 200 μg/cm2 deposited on aluminum plates placed at 1 m from the sources. The evaluation of the three modalities was aimed at verifying the influence of the highly collimated laser beam on the detection in comparison with the other sources. The Mod-FTIR performed better than the QCL source in terms of the decrease of MIR signal intensity with increasing angle.
Life-cycle energy impacts for adapting an urban water supply system to droughts.
Lam, Ka Leung; Stokes-Draut, Jennifer R; Horvath, Arpad; Lane, Joe L; Kenway, Steven J; Lant, Paul A
2017-12-15
In recent years, cities in some water-stressed regions have explored alternative water sources such as seawater desalination and potable water recycling in spite of concerns over increasing energy consumption. In this study, we evaluate the current and future life-cycle energy impacts of four alternative water supply strategies introduced during a decade-long drought in South East Queensland (SEQ), Australia. These strategies were: seawater desalination, indirect potable water recycling, network integration, and rainwater tanks. Our work highlights the energy burden of alternative water supply strategies, which added approximately 24% to the life-cycle energy use of the existing supply system (with surface water sources) in SEQ even at the current post-drought low utilisation. Over half of this additional life-cycle energy use was from the centralised alternative supply strategies. Rainwater tanks contributed an estimated 3% to regional water supply, but added over 10% life-cycle energy use to the existing system. In the future scenario analysis, we compare the life-cycle energy use between "Normal", "Dry", "High water demand" and "Design capacity" scenarios. In the "Normal" scenario, long-term low utilisation of the desalination system and the water recycling system greatly reduces the energy burden of these centralised strategies, to only 13%. In contrast, higher utilisation in the unlikely "Dry" and "Design capacity" scenarios adds 86% and 140%, respectively, to the life-cycle energy use of the existing system. In the "High water demand" scenario, a 20% increase in per capita water use over 20 years "consumes" more energy than is used by the four alternative strategies in the "Normal" scenario. This research provides insight for developing more realistic long-term scenarios to evaluate and compare life-cycle energy impacts of drought-adaptation infrastructure and regional decentralised water sources.
Scenario building for life-cycle assessments of water supply systems should consider i) climate variability and, therefore, infrastructure utilisation rate; ii) potential under-utilisation of both installed centralised and decentralised sources; and iii) the potential energy penalty for operating infrastructure well below its design capacity (e.g., the operational energy intensity of the desalination system is three times higher at low utilisation rates). This study illustrates that evaluating the life-cycle energy use and intensity of these types of supply sources without considering their realistic long-term operating scenario(s) can distort and overemphasise their energy implications. For other water-stressed regions, this work shows that managing long-term water demand is also important, in addition to acknowledging the energy-intensive nature of some alternative water sources. Copyright © 2017 Elsevier Ltd. All rights reserved.
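The utilisation effect flagged above (operational energy intensity roughly three times higher at low utilisation) follows directly from amortising a fixed standby energy draw over less delivered water. A minimal sketch with placeholder plant numbers, not SEQ data:

```python
# Why energy intensity climbs at low utilisation: a fixed standby draw is
# spread over less delivered water. All numbers are illustrative placeholders.
def energy_intensity(utilisation, fixed_kwh_per_day=50_000.0,
                     variable_kwh_per_kl=3.5, capacity_kl_per_day=125_000.0):
    """kWh per kL delivered at a given utilisation fraction (0, 1]."""
    delivered_kl = utilisation * capacity_kl_per_day
    return variable_kwh_per_kl + fixed_kwh_per_day / delivered_kl

low_util = energy_intensity(0.1)    # drought-standby style operation
full_util = energy_intensity(1.0)   # operation at design capacity
```

The per-kL intensity at 10% utilisation is nearly double the design-capacity figure in this toy parameterisation; with a larger fixed fraction the tripling noted in the abstract is easily reproduced.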
Prediction of discretization error using the error transport equation
NASA Astrophysics Data System (ADS)
Celik, Ismail B.; Parsons, Don Roscoe
2017-06-01
This study focuses on an approach to quantify the discretization error associated with numerical solutions of partial differential equations by solving an error transport equation (ETE). The goal is to develop a method that can be used to adequately predict the discretization error using the numerical solution on only one grid/mesh. The primary problem associated with solving the ETE is the formulation of the error source term which is required for accurately predicting the transport of the error. In this study, a novel approach is considered which involves fitting the numerical solution with a series of locally smooth curves and then blending them together with a weighted spline approach. The result is a continuously differentiable analytic expression that can be used to determine the error source term. Once the source term has been developed, the ETE can easily be solved using the same solver that is used to obtain the original numerical solution. The new methodology is applied to the two-dimensional Navier-Stokes equations in the laminar flow regime. A simple unsteady flow case is also considered. The discretization error predictions based on the methodology presented in this study are in good agreement with the 'true error'. While in most cases the error predictions are not quite as accurate as those from Richardson extrapolation, the results are reasonable and only require one numerical grid. The current results indicate that there is much promise going forward with the newly developed error source term evaluation technique and the ETE.
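The ETE workflow can be illustrated on a scalar model problem: march a coarse-grid solution, estimate the truncation-error source term from the numerical data itself, and integrate the error equation alongside. The sketch below solves u' = cos(x) with explicit Euler and uses simple central differences in place of the paper's weighted-spline fit; it is a minimal analogue of the method, not the paper's implementation.

```python
import math

# ETE sketch for u' = cos(x), u(0) = 0 (exact solution: sin x), on a single
# coarse grid. The truncation-error source term tau ≈ (h/2) u'' is estimated
# from the numerical solution, and the error equation e' = tau is integrated.
h = 0.1
xs = [i * h for i in range(32)]
u = [0.0]
for x in xs[:-1]:
    u.append(u[-1] + h * math.cos(x))        # coarse-grid Euler solution

e = [0.0]                                     # predicted discretization error
for i in range(1, len(xs)):
    j = min(max(i - 1, 1), len(u) - 2)
    u_xx = (u[j + 1] - 2.0 * u[j] + u[j - 1]) / h ** 2
    e.append(e[-1] + h * (h / 2.0) * u_xx)    # integrate e' = (h/2) u''

true_err = [math.sin(x) - ui for x, ui in zip(xs, u)]
```

The predicted error tracks the true error using only the one coarse-grid solution, which is the practical appeal of the ETE over Richardson extrapolation's second grid.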
2011-02-18
Control Limit Lower Control Limit Reaction Plan 1 Complaints from other suppliers (synopsis, award) SCG During award process Identify Sole-Source...Parts 0.0 1.0 0.0 Evaluate complaint, if valid remove item from contract. 2 Tracking timeline for procurement/reviews SCG During pre-award process...Review Solicitation 100.0 Determine where the document stands in the approval process. Adjust milestones and follow-up. 3 FAR/DPAP guidance SCG
An Investigation of the Influence of Waves on Sediment Processes in Skagit Bay
2011-09-30
source term parameterizations common to most surface wave models, including wave generation by wind, energy dissipation from whitecapping, and...I. Total energy and peak frequency. Coastal Engineering (29), 47-78. Zijlema, M. Computation of wind-wave spectra in coastal waters with SWAN on unstructured grids. Coastal Engineering, 2010, 57, 267-277. ...supply and wind on tidal flat sediment transport. It will be used to evaluate the capabilities of state-of-the-art open source sediment models and to
Long term field evaluation reveals HLB resistance in Citrus relatives
USDA-ARS?s Scientific Manuscript database
Citrus huanglongbing (HLB) is a destructive disease with no known cure. To identify sources of HLB resistance in the subfamily Aurantioideae to which citrus belongs, we conducted a six-year field trial under natural disease challenge conditions in an HLB endemic region. The study included 65 Citrus ...
NEXT GENERATION LEACHING TESTS FOR EVALUATING LEACHING OF INORGANIC CONSTITUENTS
In the U.S. as in other countries, there is increased interest in using industrial by-products as alternative or secondary materials, helping to conserve virgin or raw materials. The LEAF and associated test methods are being used to develop the source term for leaching or any i...
Chatterji, Madhabi
2016-12-01
This paper explores avenues for navigating evaluation design challenges posed by complex social programs (CSPs) and their environments when conducting studies that call for generalizable, causal inferences on the intervention's effectiveness. A definition is provided of a CSP drawing on examples from different fields, and an evaluation case is analyzed in depth to derive seven (7) major sources of complexity that typify CSPs, threatening assumptions of textbook-recommended experimental designs for performing impact evaluations. Theoretically-supported, alternative methodological strategies are discussed to navigate assumptions and counter the design challenges posed by the complex configurations and ecology of CSPs. Specific recommendations include: sequential refinement of the evaluation design through systems thinking, systems-informed logic modeling; and use of extended term, mixed methods (ETMM) approaches with exploratory and confirmatory phases of the evaluation. In the proposed approach, logic models are refined through direct induction and interactions with stakeholders. To better guide assumption evaluation, question-framing, and selection of appropriate methodological strategies, a multiphase evaluation design is recommended. Copyright © 2016 Elsevier Ltd. All rights reserved.
Multicompare tests of the performance of different metaheuristics in EEG dipole source localization.
Escalona-Vargas, Diana Irazú; Lopez-Arevalo, Ivan; Gutiérrez, David
2014-01-01
We study the use of nonparametric multicompare statistical tests on the performance of simulated annealing (SA), genetic algorithm (GA), particle swarm optimization (PSO), and differential evolution (DE) when used for electroencephalographic (EEG) source localization. Such a task can be posed as an optimization problem for which the referred metaheuristic methods are well suited. Hence, we evaluate the localization performance in terms of the metaheuristics' operational parameters and for a fixed number of evaluations of the objective function. In this way, we are able to link the efficiency of the metaheuristics with a common measure of computational cost. Our results did not show significant differences in the metaheuristics' performance for the case of single source localization. In the case of localizing two correlated sources, we found that PSO (ring and tree topologies) and DE performed the worst; hence, they should not be considered in large-scale EEG source localization problems. Overall, the multicompare tests allowed us to demonstrate the little effect that the selection of a particular metaheuristic and the variations in their operational parameters have on this optimization problem.
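A standard nonparametric multicompare procedure of the kind described is the Friedman test: each algorithm is ranked within each problem instance, and the rank sums yield a chi-square statistic. The error scores below are made-up placeholders (lower is better), not the study's data, and the study's exact test battery may differ.

```python
# Friedman-style multicompare sketch: rank each metaheuristic within each
# benchmark problem, then form the chi-square statistic from the rank sums.
scores = {  # localization error per algorithm on 6 problems (illustrative)
    "SA":  [2.1, 1.9, 2.4, 2.2, 2.0, 2.3],
    "GA":  [2.0, 2.1, 2.2, 2.1, 2.2, 2.1],
    "PSO": [3.0, 2.9, 3.2, 3.1, 2.8, 3.0],
    "DE":  [3.1, 3.0, 3.3, 3.0, 2.9, 3.1],
}
algs = list(scores)
n = len(scores["SA"])                 # number of problems
k = len(algs)                         # number of algorithms

rank_sums = dict.fromkeys(algs, 0.0)
for i in range(n):
    for rank, alg in enumerate(sorted(algs, key=lambda a: scores[a][i]), 1):
        rank_sums[alg] += rank        # rank 1 = best on this problem

# Friedman statistic; compare against the chi-square(k-1) critical value.
chi2 = (12.0 / (n * k * (k + 1))) * sum(r ** 2 for r in rank_sums.values()) \
       - 3.0 * n * (k + 1)
```

With these placeholder scores the statistic exceeds the 5% critical value for 3 degrees of freedom (7.815), so a post-hoc pairwise comparison would be warranted, mirroring the PSO/DE finding in the abstract.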
Lárraga-Gutiérrez, José Manuel; García-Garduño, Olivia Amanda; Treviño-Palacios, Carlos; Herrera-González, José Alfredo
2018-03-01
Flatbed scanners are the most frequently used reading instrument for radiochromic film dosimetry because of their low cost and high spatial resolution, among other advantages. These scanners use a fluorescent lamp and a CCD array as light source and detector, respectively. Recently, manufacturers of flatbed scanners replaced the fluorescent lamp with light-emitting diodes (LEDs) as the light source. The goal of this work is to evaluate the performance of a commercial flatbed scanner with an LED-based light source for radiochromic film dosimetry. Film read-out consistency, response uniformity, film-scanner sensitivity, long-term stability and total dose uncertainty were evaluated. Overall, the performance of the LED flatbed scanner is comparable to that of a cold cathode fluorescent lamp (CCFL) scanner. There are important spectral differences between LED and CCFL lamps that result in a higher sensitivity of the LED scanner in the green channel. Total dose uncertainty, film response reproducibility and long-term stability of the LED scanner are slightly better than those of the CCFL. However, the LED-based scanner has a strongly non-uniform response, up to 9%, that must be adequately corrected for radiotherapy dosimetry QA. The differences in light emission spectra between LED and CCFL lamps and their potential impact on film-scanner sensitivity suggest that the design of a dedicated flatbed scanner with LEDs may improve sensitivity and dose uncertainty in radiochromic film dosimetry. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
2002-03-01
source term. Several publications provided a thorough accounting of the accident, including “Chernobyl Record” [Mould], and the NRC technical report...“Report on the Accident at the Chernobyl Nuclear Power Station” [NUREG-1250]. The most comprehensive study of transport models to predict the...from the Chernobyl Accident: The ATMES Report” [Klug, et al.]. The Atmospheric Transport Model Evaluation Study (ATMES) report used data
Hybrid BEM/empirical approach for scattering of correlated sources in rocket noise prediction
NASA Astrophysics Data System (ADS)
Barbarino, Mattia; Adamo, Francesco P.; Bianco, Davide; Bartoccini, Daniele
2017-09-01
Empirical models such as the Eldred standard model are commonly used for rocket noise prediction. Such models directly provide a definition of the Sound Pressure Level through the quadratic pressure term of uncorrelated sources. In this paper, an improvement of the Eldred standard model has been formulated. This new formulation contains an explicit expression for the acoustic pressure of each noise source, in terms of amplitude and phase, in order to investigate source correlation effects and to propagate them through a wave equation. In particular, the correlation effects between adjacent and non-adjacent sources have been modeled and analyzed. The noise prediction obtained with the revised Eldred-based model was then used to formulate an empirical/BEM (Boundary Element Method) hybrid approach that allows an evaluation of the scattering effects. In the framework of the European Space Agency funded programme VECEP (VEga Consolidation and Evolution Programme), these models have been applied to the prediction of the aeroacoustic loads of the VEGA (Vettore Europeo di Generazione Avanzata - Advanced Generation European Carrier Rocket) launch vehicle at lift-off, and the results have been compared with experimental data.
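The difference between the uncorrelated quadratic-pressure summation of the standard model and a correlated, phase-aware summation can be shown in a two-source sketch; the amplitudes and phases below are illustrative, not values from the Eldred model or the VEGA data.

```python
import cmath
import math

# Two equal sources: uncorrelated sources add in mean-square pressure, while
# correlated in-phase sources add in complex amplitude before squaring.
p_ref = 20e-6                      # Pa, reference pressure for SPL
amps = [1.0, 1.0]                  # Pa, source pressure amplitudes
phases = [0.0, 0.0]                # rad, fully correlated and in phase

# Uncorrelated: Sound Pressure Level from the sum of mean-square pressures.
spl_uncorr = 10.0 * math.log10(sum(a ** 2 / 2.0 for a in amps) / p_ref ** 2)

# Correlated: sum complex amplitudes, then form the mean-square pressure.
p_sum = sum(a * cmath.exp(1j * ph) for a, ph in zip(amps, phases))
spl_corr = 10.0 * math.log10(abs(p_sum) ** 2 / 2.0 / p_ref ** 2)
```

For two equal in-phase sources the coherent sum is 3 dB above the energetic sum (and anti-phase sources would cancel), which is why retaining amplitude and phase per source changes the predicted loads.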
Parvez, Shahid; Frost, Kali; Sundararajan, Madhura
2017-01-01
In the absence of shorter term disinfectant byproducts (DBPs) data on regulated Trihalomethanes (THMs) and Haloacetic acids (HAAs), epidemiologists and risk assessors have used long-term annual compliance (LRAA) or quarterly (QA) data to evaluate the association between DBP exposure and adverse birth outcomes, which resulted in inconclusive findings. Therefore, we evaluated the reliability of using long-term LRAA and QA data as an indirect measure for short-term exposure. Short-term residential tap water samples were collected in peak DBP months (May–August) in a community water system with five separate treatment stations and were sourced from surface or groundwater. Samples were analyzed for THMs and HAAs per the EPA (U.S. Environmental Protection Agency) standard methods (524.2 and 552.2). The measured levels of total THMs and HAAs were compared temporally and spatially with LRAA and QA data, which showed significant differences (p < 0.05). Most samples from surface water stations showed higher levels than LRAA or QA. Significant numbers of samples in surface water stations exceeded regulatory permissible limits: 27% had excessive THMs and 35% had excessive HAAs. Trichloromethane, trichloroacetic acid, and dichloroacetic acid were the major drivers of variability. This study suggests that LRAA and QA data are not good proxies of short-term exposure. Further investigation is needed to determine if other drinking water systems show consistent findings for improved regulation. PMID:28531123
Numerical and experimental evaluations of the flow past nested chevrons
NASA Technical Reports Server (NTRS)
Foss, J. F.; Foss, J. K.; Spalart, P. R.
1989-01-01
An effort is made to contribute to the development of CFD by relating the successful use of vortex dynamics in the computation of the pressure drop past a planar array of chevron-shaped obstructions. An ensemble of results was used to compute the loss coefficient k, stimulating an experimental program for the assessment of the measured loss coefficient for the same geometry. The most provocative result of this study has been the representation of kinetic energy production in terms of vorticity source terms.
ACG Clinical Guideline: Diagnosis and Management of Small Bowel Bleeding.
Gerson, Lauren B; Fidler, Jeff L; Cave, David R; Leighton, Jonathan A
2015-09-01
Bleeding from the small intestine remains a relatively uncommon event, accounting for ~5-10% of all patients presenting with gastrointestinal (GI) bleeding. Given advances in small bowel imaging with video capsule endoscopy (VCE), deep enteroscopy, and radiographic imaging, the cause of bleeding in the small bowel can now be identified in most patients. The term small bowel bleeding is therefore proposed as a replacement for the previous classification of obscure GI bleeding (OGIB). We recommend that the term OGIB should be reserved for patients in whom a source of bleeding cannot be identified anywhere in the GI tract. A source of small bowel bleeding should be considered in patients with GI bleeding after performance of a normal upper and lower endoscopic examination. Second-look examinations using upper endoscopy, push enteroscopy, and/or colonoscopy can be performed if indicated before small bowel evaluation. VCE should be considered a first-line procedure for small bowel investigation. Any method of deep enteroscopy can be used when endoscopic evaluation and therapy are required. VCE should be performed before deep enteroscopy if there is no contraindication. Computed tomographic enterography should be performed in patients with suspected obstruction before VCE or after negative VCE examinations. When there is acute overt hemorrhage in the unstable patient, angiography should be performed emergently. In patients with occult hemorrhage or stable patients with active overt bleeding, multiphasic computed tomography should be performed after VCE or CTE to identify the source of bleeding and to guide further management. If a source of bleeding is identified in the small bowel that is associated with significant ongoing anemia and/or active bleeding, the patient should be managed with endoscopic therapy. 
Conservative management is recommended for patients without a source found after small bowel investigation, whereas repeat diagnostic investigations are recommended for patients with initial negative small bowel evaluations and ongoing overt or occult bleeding.
Numerical modeling of materials processing applications of a pulsed cold cathode electron gun
NASA Astrophysics Data System (ADS)
Etcheverry, J. I.; Martínez, O. E.; Mingolo, N.
1998-04-01
A numerical study of the application of a pulsed cold cathode electron gun to materials processing is performed. A simple semiempirical model of the discharge is used, together with backscattering and energy deposition profiles obtained by a Monte Carlo technique, in order to evaluate the energy source term inside the material. The numerical computation of the heat equation with the calculated source term is performed in order to obtain useful information on melting and vaporization thresholds, melted radius and depth, and on the dependence of these variables on processing parameters such as operating pressure, initial voltage of the discharge and cathode-sample distance. Numerical results for stainless steel are presented, which demonstrate the need for several modifications of the experimental design in order to achieve a better efficiency.
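The last step of the chain described above, solving the heat equation with the deposited energy source term, can be sketched with a 1-D explicit scheme. Here a hypothetical Gaussian deposition profile stands in for the Monte Carlo energy deposition result, and the thermal constants are placeholders rather than the stainless steel values used in the paper.

```python
import math

# 1-D explicit heat-equation march with a deposited energy source term.
# Gaussian deposition profile and material constants are placeholders.
nx, dx, dt = 101, 1e-5, 1e-6          # cells, cell size (m), time step (s)
alpha = 4e-6                          # thermal diffusivity (m^2/s)
assert alpha * dt / dx ** 2 <= 0.5    # explicit stability limit

# Heating rate (K/s) from the beam, peaked at the domain centre.
src = [1e7 * math.exp(-(((i - 50) * dx) / 2e-5) ** 2) for i in range(nx)]

T = [0.0] * nx                        # temperature rise, fixed ends
for _ in range(200):                  # march 0.2 ms of beam-on time
    Tn = T[:]
    for i in range(1, nx - 1):
        lap = (T[i + 1] - 2.0 * T[i] + T[i - 1]) / dx ** 2
        Tn[i] = T[i] + dt * (alpha * lap + src[i])
    T = Tn

peak_rise = max(T)                    # compare against a melting threshold
```

Sweeping the source amplitude or pulse duration in this kind of loop is how melting and vaporization thresholds, and their dependence on processing parameters, are located numerically.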
On the structure of pressure fluctuations in simulated turbulent channel flow
NASA Technical Reports Server (NTRS)
Kim, John
1989-01-01
Pressure fluctuations in a turbulent channel flow are investigated by analyzing a database obtained from a direct numerical simulation. Detailed statistics associated with the pressure fluctuations are presented. Characteristics associated with the rapid (linear) and slow (nonlinear) pressure are discussed. It is found that the slow pressure fluctuations are larger than the rapid pressure fluctuations throughout the channel except very near the wall, where they are about the same magnitude. This is contrary to the common belief that the nonlinear source terms are negligible compared to the linear source terms. Probability density distributions, power spectra, and two-point correlations are examined to reveal the characteristics of the pressure fluctuations. The global dependence of the pressure fluctuations and pressure-strain correlations is also examined by evaluating the integrals associated with their Green's function representations. In the wall region, where the pressure-strain terms are large, most contributions to the pressure-strain terms are from the wall region (i.e., local), whereas away from the wall, where the pressure-strain terms are small, contributions are global. Structures of instantaneous pressure and pressure gradients at the wall and the corresponding vorticity field are examined.
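The local-versus-global bookkeeping via Green's functions can be mimicked on a model 1-D Poisson problem p'' = -s with homogeneous Dirichlet conditions: the solution at a point is the source weighted by G(x, ξ), so its value can be attributed to source regions. The source distribution below (a strong "near-wall" band, weak elsewhere) is purely illustrative.

```python
# Model 1-D Poisson problem p'' = -s, p(0) = p(1) = 0: the solution at x is
# the integral of the source weighted by the Green's function G(x, xi).
def green(x, xi):
    return x * (1.0 - xi) if x <= xi else xi * (1.0 - x)

def source(xi):
    # Illustrative: strong source band near the "wall" xi < 0.2, weak elsewhere.
    return 1.0 if xi < 0.2 else 0.1

n = 2000
h = 1.0 / n
cells = [(j + 0.5) * h for j in range(n)]     # midpoint quadrature
p_mid = sum(green(0.5, xi) * source(xi) * h for xi in cells)

# Share of p(0.5) contributed by the strong source band xi < 0.2:
local = sum(green(0.5, xi) * source(xi) * h for xi in cells if xi < 0.2)
local_share = local / p_mid
```

Splitting the quadrature by source region, as done for `local_share` here, is the 1-D analogue of asking whether the pressure-strain terms at a wall-normal location are fed locally or globally.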
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cabrera-Palmer, Belkis
Predicting the performance of radiation detection systems at field sites based on measured performance acquired under controlled conditions at test locations, e.g., the Nevada National Security Site (NNSS), remains an unsolved and standing issue within DNDO’s testing methodology. Detector performance can be defined in terms of the system’s ability to detect and/or identify a given source or set of sources, and depends on the signal generated by the detector for the given measurement configuration (i.e., source strength, distance, time, surrounding materials, etc.) and on the quality of the detection algorithm. Detector performance is usually evaluated in the performance and operational testing phases, where the measurement configurations are selected to represent radiation source and background configurations of interest to security applications.
Joint optimization of source, mask, and pupil in optical lithography
NASA Astrophysics Data System (ADS)
Li, Jia; Lam, Edmund Y.
2014-03-01
Mask topography effects need to be taken into consideration for more advanced resolution enhancement techniques in optical lithography. However, a rigorous 3D mask model achieves high accuracy only at a large computational cost. This work develops a combined source, mask, and pupil optimization (SMPO) approach by taking advantage of the fact that pupil phase manipulation can partially compensate for mask topography effects. We first design the pupil wavefront function by incorporating primary and secondary spherical aberration through the coefficients of the Zernike polynomials, and achieve an optimal source-mask pair under the condition of an aberrated pupil. Evaluations against conventional source-mask optimization (SMO) without pupil aberrations show that SMPO provides improved performance in terms of pattern fidelity and process window size.
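The pupil wavefront manipulation described, injecting primary and secondary spherical aberration through Zernike coefficients, can be sketched as follows; the radial polynomials are the standard Z(4,0) and Z(6,0) terms, while the coefficient values c9 and c16 are arbitrary placeholders for the quantities SMPO would optimize.

```python
import cmath
import math

# Pupil wavefront from primary and secondary spherical Zernike terms; c9 and
# c16 are the free parameters the optimization would tune (values arbitrary).
def z_primary(rho):      # Z(4,0) = 6*rho^4 - 6*rho^2 + 1
    return 6.0 * rho ** 4 - 6.0 * rho ** 2 + 1.0

def z_secondary(rho):    # Z(6,0) = 20*rho^6 - 30*rho^4 + 12*rho^2 - 1
    return 20.0 * rho ** 6 - 30.0 * rho ** 4 + 12.0 * rho ** 2 - 1.0

def pupil(rho, c9=0.05, c16=-0.02):
    """Complex pupil function P = exp(i*2*pi*W(rho)) inside the unit disk;
    W is the wavefront in waves, zero transmission outside the aperture."""
    if rho > 1.0:
        return 0.0
    w = c9 * z_primary(rho) + c16 * z_secondary(rho)
    return cmath.exp(2j * math.pi * w)
```

Because the aberration only modulates phase, the pupil has unit modulus inside the aperture; the imaging model then propagates this pupil together with the optimized source and mask to evaluate pattern fidelity.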
Coral proxy record of decadal-scale reduction in base flow from Moloka'i, Hawaii
Prouty, Nancy G.; Jupiter, Stacy D.; Field, Michael E.; McCulloch, Malcolm T.
2009-01-01
Groundwater is a major resource in Hawaii and is the principal source of water for municipal, agricultural, and industrial use. With a growing population, a long-term downward trend in rainfall, and the need for proper groundwater management, a better understanding of the hydroclimatological system is essential. Proxy records from corals can supplement long-term observational networks, offering an accessible source of hydrologic and climate information. To develop a qualitative proxy for historic groundwater discharge to coastal waters, a suite of rare earth elements and yttrium (REYs) was analyzed from coral cores collected along the south shore of Moloka'i, Hawaii. The coral REY to calcium (Ca) ratios were evaluated against hydrological parameters, yielding the strongest relationship to base flow. Dissolution of REYs from labradorite and olivine in the basaltic rock aquifers is likely the primary source of coastal ocean REYs. There was a statistically significant downward trend (−40%) in subannually resolved REY/Ca ratios over the last century. This is consistent with long-term records of stream discharge from Moloka'i, which imply a downward trend in base flow since 1913. A decrease in base flow is observed statewide, consistent with the long-term downward trend in annual rainfall over much of the state. With greater demands on freshwater resources, it is appropriate for withdrawal scenarios to consider long-term trends and short-term climate variability. It is possible that coral paleohydrological records can be used to conduct model-data comparisons in groundwater flow models used to simulate changes in groundwater level and coastal discharge.
Industry funding and the reporting quality of large long-term weight loss trials
Thomas, Olivia; Thabane, Lehana; Douketis, James; Chu, Rong; Westfall, Andrew O.; Allison, David B.
2009-01-01
Background Quality of reporting (QR) in industry-funded research is a concern of the scientific community. Greater scrutiny of industry-sponsored research reporting has been suggested, although differences in QR by sponsorship type have not been evaluated in weight loss interventions. Objective To evaluate the association of funding source and QR of long-term obesity randomized clinical trials. Methods We analyzed papers that reported long-term weight loss trials. Articles were obtained through searches of MEDLINE, HealthStar, and the Cochrane Controlled Trials Register between 1966 and 2003. QR scores were determined for each study based upon expanded criteria from the Consolidated Standards for Reporting Trials (CONSORT) checklist, for a maximum score of 44 points. Studies were coded by category of industry support (0=no industry support, 1=industry support, 2=in-kind contribution from industry and 3=duality of interest reported). Individual CONSORT reporting criteria were tabulated by funding type. An independent-samples t-test compared differences in QR scores by funding source, and the Wilcoxon-Mann-Whitney test and generalized estimating equations (GEE) were used for sensitivity analyses. Results Of the 63 RCTs evaluated, 67% were industry-supported trials. Industry funding was associated with a higher QR score in long-term weight loss trials compared to non-industry funded studies (Mean QR (SD): Industry = 27.9 (4.1), Non-Industry = 23.4 (4.1); p < 0.0005). The Wilcoxon-Mann-Whitney test confirmed this result (p<0.0005). Controlling for the year of publication and whether the paper was published before the CONSORT statement was released, in a GEE regression analysis, the direction and magnitude of effect were similar and statistically significant (p=0.035).
Of the individual criteria that prior research has associated with biases, industry funding was associated with greater reporting of intent-to-treat analysis (p=0.0158), but was not different from non-industry studies in reporting of treatment allocation and blinding. Conclusion Our findings suggest that efforts to improve reporting quality be directed at all obesity RCTs irrespective of funding source. PMID:18711388
Industry funding and the reporting quality of large long-term weight loss trials.
Thomas, O; Thabane, L; Douketis, J; Chu, R; Westfall, A O; Allison, D B
2008-10-01
Quality of reporting (QR) in industry-funded research is a concern of the scientific community. Greater scrutiny of industry-sponsored research reporting has been suggested, although differences in QR by sponsorship type have not been evaluated in weight loss interventions. To evaluate the association of funding source and QR of long-term obesity randomized clinical trials (RCT). We analysed papers that reported long-term weight loss trials. Articles were obtained through searches of Medline, HealthStar, and the Cochrane Controlled Trials Register between the years 1966 and 2003. QR scores were determined for each study based upon expanded criteria from the Consolidated Standards for Reporting Trials (CONSORT) checklist for a maximum score of 44 points. Studies were coded by category of industry support (0=no industry support, 1=industry support, 2=in-kind contribution from industry and 3=duality of interest reported). Individual CONSORT reporting criteria were tabulated by funding type. An independent samples t-test compared the differences in QR scores by funding source, and the Wilcoxon-Mann-Whitney test and generalised estimating equations (GEE) were used for sensitivity analyses. Of the 63 RCTs evaluated, 67% were industry-supported trials. Industry funding was associated with higher QR score in long-term weight loss trials compared with nonindustry-funded studies (mean QR (s.d.): industry=27.9 (4.1), nonindustry=23.4 (4.1); P<0.0005). The Wilcoxon-Mann-Whitney test confirmed this result (P<0.0005). Controlling for the year of publication and whether the paper was published before the CONSORT statement was released in the GEE regression analysis, the direction and magnitude of effect were similar and statistically significant (P=0.035).
Of the individual criteria that prior research has associated with biases, industry funding was associated with greater reporting of intent-to-treat analysis (P=0.0158), but was not different from nonindustry studies in reporting of treatment allocation and blinding. Our findings suggest that the efforts to improve reporting quality be directed to all obesity RCTs, irrespective of funding source.
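The funding-source comparison in the record above rests on an independent-samples t-test of QR scores. The following is a minimal sketch of that test statistic using hypothetical QR scores on the 0-44 scale; the function name and all numbers are illustrative assumptions, not the study's data.

```python
import math
import statistics

def pooled_t_statistic(a, b):
    """Independent-samples t statistic with pooled variance."""
    na, nb = len(a), len(b)
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2  # statistic and degrees of freedom

# Hypothetical QR scores (0-44 CONSORT-based scale), NOT the study's data.
industry = [28, 27, 30, 26, 29, 25, 31, 27]
non_industry = [23, 24, 22, 25, 21, 24]
t, df = pooled_t_statistic(industry, non_industry)
```

With about 12 degrees of freedom, |t| above roughly 2.18 corresponds to a two-sided P < 0.05, which is the spirit of the comparison the abstract reports.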
Tanik, A
2000-01-01
The six main drinking water reservoirs of Istanbul are under the threat of pollution due to rapid population increase, unplanned urbanisation and insufficient infrastructure. In contrast to the present land use profile, the environmental evaluation of the catchment areas reveals that point sources of pollutants, especially of domestic origin, dominate over diffuse sources. The water quality studies also support these findings, emphasising that if no substantial precautions are taken, there will be no possibility of obtaining drinking water from them. In this paper, in light of the present status of the reservoirs, possible and probable short- and long-term protective measures are outlined for reducing the impact of point sources. Immediate precautions mostly depend on reducing the pollution arising from the existing settlements. Long-term measures mainly emphasise the preparation of new land use plans taking into consideration the protection of unoccupied lands. Recommendations on protection and control of the reservoirs are stated.
USDA-ARS?s Scientific Manuscript database
Phomopsis seed decay (PSD) causes poor soybean seed quality worldwide. The primary causal agent of PSD is Phomopsis longicolla (syn. Diaporthe longicolla). Breeding for PSD-resistance is the most effective long-term strategy to control this disease. To develop soybean lines with resistance to PSD, m...
USDA-ARS?s Scientific Manuscript database
Optimal utilization of animal manures as a plant nutrient source should also prevent adverse impacts on water quality. The objective of this study was to evaluate long-term poultry litter and N fertilizer application on nutrient cycling following establishment of an alley cropping system with easter...
Analyzing Student and Employer Satisfaction with Cooperative Education through Multiple Data Sources
ERIC Educational Resources Information Center
Jiang, Yuheng Helen; Lee, Sally Wai Yin; Golab, Lukasz
2015-01-01
This paper reports on the analysis of three years research of undergraduate cooperative work term postings and employer and employee evaluations. The objective of the analysis was to determine the factors affecting student and employer success and satisfaction with the work-integrated learning experience. It was found that students performed…
Classes of Legitimate Evidence for Identifying Effective Teaching.
ERIC Educational Resources Information Center
Wagner, Paul A.
A criterion for selecting sources of evidence to evaluate effective teaching is described. It is suggested that teaching effectiveness is not measured solely in terms of cognitive change in students but in the extent to which academics practice teaching in accordance with the moral dictates of the profession. In developing a teacher effectiveness…
The U.S. Environmental Protection Agency (EPA) initiated the national PM2.5 Chemical Speciation Monitoring Network (CSN) in 2000 to support evaluation of long-term trends and to better quantify the impact of sources on particulate matter (PM) concentrations in the size range belo...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-08
... and marketing evaluation and strategies; and outreach and implementation of the project results. The... devise strategies and means to efficiently harvest the redfish resource in the Gulf of Maine (GOM) while... in terms of their potential effects on results. Sources of variability include: Area fished; seasonal...
Accuracy-preserving source term quadrature for third-order edge-based discretization
NASA Astrophysics Data System (ADS)
Nishikawa, Hiroaki; Liu, Yi
2017-09-01
In this paper, we derive a family of source term quadrature formulas for preserving third-order accuracy of the node-centered edge-based discretization for conservation laws with source terms on arbitrary simplex grids. A three-parameter family of source term quadrature formulas is derived, and as a subset, a one-parameter family of economical formulas is identified that does not require second derivatives of the source term. Among the economical formulas, a unique formula is then derived that does not require gradients of the source term at neighbor nodes, thus leading to a significantly smaller discretization stencil for source terms. All the formulas derived in this paper do not require a boundary closure, and therefore can be directly applied at boundary nodes. Numerical results are presented to demonstrate third-order accuracy at interior and boundary nodes for one-dimensional grids and linear triangular/tetrahedral grids over straight and curved geometries.
Idowu, I A; Alkhaddar, R M; Atherton, W
2014-08-01
Mecoprop-p herbicide is often found in wells and water abstractions in many areas around Europe, including the UK. There is growing environmental and public health concern about mecoprop-p herbicide pollution in ground and surface water in England. Reviews suggest that extensive work has been carried out on the contribution of mecoprop-p herbicide from agricultural use, whilst more work needs to be carried out on the contribution of mecoprop-p herbicide from non-agricultural use. The study covers two landfill sites in the Weaver/Gowy catchment. Mecoprop-p herbicide concentrations in the leachate range between 0.06 and 290 microg l(-1) in cells. The high concentration of mecoprop-p herbicide in the leachate suggests that there is a possible source term in the waste stream. This paper addresses the gap by exploring possible source terms of mecoprop-p herbicide contamination on landfill sites and evaluates the impact of public purchase, use and disposal, alongside climate change, on seasonal variations in mecoprop-p concentrations. Mecoprop-p herbicide was found to exceed the EU drinking water quality standards at the unsaturated zone/aquifer, with observed average concentrations ranging between 0.005 and 7.96 microg l(-1). A route map for mecoprop-p herbicide source term contamination is essential for mitigation and pollution management, with emphasis on both consumer and producer responsibility towards the use of mecoprop-p products. In addition, improvements in data collection on mecoprop-p concentrations and detailed seasonal herbicide sales for non-agricultural purposes are needed to inform the analysis and decision process.
Physical/chemical closed-loop water-recycling for long-duration missions
NASA Technical Reports Server (NTRS)
Herrmann, Cal C.; Wydeven, Ted
1990-01-01
Water needs, water sources, and means for recycling water are examined in terms appropriate to the water quality requirements of a small crew and spacecraft intended for long-duration exploration missions. Inorganic, organic, and biological hazards are estimated for waste water sources. Sensitivities to these hazards for human uses are estimated. The water recycling processes considered are humidity condensation, carbon dioxide reduction, waste oxidation, distillation, reverse osmosis, pervaporation, electrodialysis, ion exchange, carbon sorption, and electrochemical oxidation. Limitations and applications of these processes are evaluated in terms of water quality objectives. Computerized simulation of some of these chemical processes is examined. Recommendations are made for development of new water recycling technology and improvement of existing technology for near-term application to life support systems for humans in space. The technological developments are equally applicable to water needs on earth, in regions where extensive water recycling is needed or where advanced water treatment is essential to meet EPA health standards.
Evaluation of a Consistent LES/PDF Method Using a Series of Experimental Spray Flames
NASA Astrophysics Data System (ADS)
Heye, Colin; Raman, Venkat
2012-11-01
A consistent method for the evolution of the joint-scalar probability density function (PDF) transport equation is proposed for application to large eddy simulation (LES) of turbulent reacting flows containing evaporating spray droplets. PDF transport equations provide the benefit of including the chemical source term in closed form; however, additional terms describing LES subfilter mixing must be modeled. The recent availability of detailed experimental measurements provides model validation data for a wide range of evaporation rates and combustion regimes, as is well known to occur in spray flames. In this work, the experimental data will be used to investigate the impact of droplet mass loading and evaporation rates on the subfilter scalar PDF shape in comparison with conventional flamelet models. In addition, existing model term closures in the PDF transport equations are evaluated with a focus on their validity in the presence of regime changes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul L. Wichlacz
2003-09-01
This source-term summary document is intended to describe the current understanding of contaminant source terms and the conceptual model for potential source-term release to the environment at the Idaho National Engineering and Environmental Laboratory (INEEL), as presented in published INEEL reports. The document presents a generalized conceptual model of the sources of contamination and describes the general categories of source terms, primary waste forms, and factors that affect the release of contaminants from the waste form into the vadose zone and Snake River Plain Aquifer. Where the information has previously been published and is readily available, summaries of the inventory of contaminants are also included. Uncertainties that affect the estimation of the source term release are also discussed where they have been identified by the Source Term Technical Advisory Group. Areas in which additional information is needed (i.e., research needs) are also identified.
NASA Astrophysics Data System (ADS)
Sakamoto, Hiroki; Yamamoto, Toshihiro
2017-09-01
This paper presents improvement and performance evaluation of the "perturbation source method", which is one of the Monte Carlo perturbation techniques. The formerly proposed perturbation source method was first-order accurate, although it is known that the method can be easily extended to an exact perturbation method. A transport equation for calculating an exact flux difference caused by a perturbation is solved. A perturbation particle representing a flux difference is explicitly transported in the perturbed system, instead of in the unperturbed system. The source term of the transport equation is defined by the unperturbed flux and the cross section (or optical parameter) changes. The unperturbed flux is provided by an "on-the-fly" technique during the course of the ordinary fixed source calculation for the unperturbed system. A set of perturbation particles is started at the collision point in the perturbed region and tracked until death. For a perturbation in a smaller portion of the whole domain, the efficiency of the perturbation source method can be improved by using a virtual scattering coefficient or cross section in the perturbed region, forcing collisions. Performance is evaluated by comparing the proposed method to other Monte Carlo perturbation methods. Numerical tests performed for particle transport in a two-dimensional geometry reveal that the perturbation source method is less effective than the correlated sampling method for a perturbation in a larger portion of the whole domain. However, for a perturbation in a smaller portion, the perturbation source method outperforms the correlated sampling method. The efficiency depends strongly on the adjustment of the new virtual scattering coefficient or cross section.
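For context on the comparison above: the correlated sampling baseline estimates a response difference by running the perturbed and unperturbed systems on the same random-number stream, so that most statistical noise cancels in the difference. The sketch below illustrates that idea on a toy problem (uncollided transmission through a unit slab); the geometry, cross sections, and seed are illustrative assumptions, not the paper's setup or method.

```python
import math
import random

def transmission(sigma, seed, n=20000):
    """MC estimate of uncollided transmission through a unit-thickness slab."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # distance to first collision, sampled from the exponential pdf
        if -math.log(rng.random()) / sigma > 1.0:
            hits += 1
    return hits / n

sigma, d_sigma = 1.0, 0.1
# correlated sampling: identical random-number stream for both systems
d_corr = transmission(sigma + d_sigma, seed=42) - transmission(sigma, seed=42)
exact = math.exp(-(sigma + d_sigma)) - math.exp(-sigma)  # analytic difference
```

Because both runs see the same histories, the estimated difference tracks the analytic value far more tightly than two independently seeded runs would.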
QCD sum rules study of meson-baryon sigma terms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erkol, Gueray; Oka, Makoto; Turan, Guersevil
2008-11-01
The pion-baryon sigma terms and the strange-quark condensates of the octet and the decuplet baryons are calculated by employing the method of QCD sum rules. We evaluate the vacuum-to-vacuum transition matrix elements of two baryon interpolating fields in an external isoscalar-scalar field and use a Monte Carlo-based approach to systematically analyze the sum rules and the uncertainties in the results. We extract the ratios of the sigma terms, which have rather high accuracy and minimal dependence on QCD parameters. We discuss the sources of uncertainties and comment on possible strangeness content of the nucleon and the Delta.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barry, Kenneth
The Nuclear Energy Institute (NEI) Small Modular Reactor (SMR) Licensing Task Force (TF) has been evaluating licensing issues unique and important to iPWRs, ranking these issues, and developing NEI position papers for submittal to the U.S. Nuclear Regulatory Commission (NRC) during the past three years. Papers have been developed and submitted to the NRC in a range of areas including: Price-Anderson Act, NRC annual fees, security, modularity, and staffing. In December 2012, NEI completed a draft position paper on SMR source terms and participated in an NRC public meeting presenting a summary of this paper, which was subsequently submitted to the NRC. One important conclusion of the source term paper was the evaluation and selection of high importance areas where additional research would have a significant impact on source terms. The highest ranked research area was iPWR containment aerosol natural deposition. The NRC accepts the use of existing aerosol deposition correlations in Regulatory Guide 1.183, but these were developed for large light water reactor (LWR) containments. Application of these correlations to an iPWR design has resulted in greater than a ten-fold reduction of containment airborne aerosol inventory as compared to large LWRs. Development and experimental justification of containment aerosol natural deposition correlations specifically for the unique iPWR containments is expected to result in a large reduction of design basis and beyond-design-basis accident source terms with concomitantly smaller dose to workers and the public. Therefore, NRC acceptance of iPWR containment aerosol natural deposition correlations will directly support the industry’s goal of reducing the Emergency Planning Zone (EPZ) for SMRs. Based on the results in this work, it is clear that thermophoresis is relatively unimportant for iPWRs. Gravitational settling is well understood, and may be the dominant process for a dry environment.
Diffusiophoresis and enhanced settling by particle growth are the dominant processes for determining DFs for expected conditions in an iPWR containment. These processes are dependent on the area-to-volume (A/V) ratio, which should benefit iPWR designs because these reactors have higher A/Vs compared to existing LWRs.
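The role of the A/V ratio can be illustrated with the standard well-mixed, first-order depletion model: a removal process with deposition velocity v_s acting on floor/wall area A in free volume V gives dC/dt = -(v_s A/V) C, so the decontamination factor is DF = exp(v_s (A/V) t). The numbers below are assumptions for illustration only, not design data from this work.

```python
import math

def settling_df(v_s, area_to_volume, t):
    """Decontamination factor for a well-mixed volume with first-order
    aerosol removal: dC/dt = -(v_s * A / V) * C  =>  DF = exp(v_s*(A/V)*t)."""
    return math.exp(v_s * area_to_volume * t)

# Hypothetical values, for illustration only.
v_s = 1.0e-4                          # effective deposition velocity, m/s
t = 24 * 3600.0                       # one day, s
df_ipwr = settling_df(v_s, 0.5, t)    # assumed higher A/V (compact containment)
df_lwr = settling_df(v_s, 0.1, t)     # assumed lower A/V (large containment)
```

With the same removal physics, the higher assumed A/V yields a much larger DF over the same period, which is the qualitative point the abstract makes about compact iPWR containments.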
MIXOPTIM: A tool for the evaluation and the optimization of the electricity mix in a territory
NASA Astrophysics Data System (ADS)
Bonin, Bernard; Safa, Henri; Laureau, Axel; Merle-Lucotte, Elsa; Miss, Joachim; Richet, Yann
2014-09-01
This article presents a method for calculating the generation cost of a mixture of electricity sources, by means of a Monte Carlo simulation of the production output taking into account the fluctuations of the demand and the stochastic nature of the availability of the various power sources that compose the mix. This evaluation shows that, for a given electricity mix, the cost has a non-linear dependence on the demand level. In the second part of the paper, we present some considerations on the management of intermittency, and develop a method based on spectral decomposition of the imposed power fluctuations to calculate the minimal amount of controlled power sources needed to follow these fluctuations. This can be converted into a viability criterion of the mix, included in the MIXOPTIM software. In the third part of the paper, the MIXOPTIM cost evaluation method is applied to the multi-criteria optimization of the mix according to three main criteria: the cost of the mix; its impact on climate in terms of CO2 production; and the security of supply.
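The Monte Carlo cost evaluation described above can be caricatured in a few lines: sample hourly demand, sample which plants are available, dispatch available capacity in merit order, and accumulate cost. This is an illustrative sketch under assumed capacities, costs, and availabilities; it is not the MIXOPTIM implementation.

```python
import random

def simulate_mix_cost(sources, mean_demand, n_hours=10000, seed=1):
    """Monte Carlo sketch: average generation cost of a mix with stochastic
    plant availability and fluctuating demand (illustrative, not MIXOPTIM)."""
    rng = random.Random(seed)
    total_cost = 0.0
    total_energy = 0.0
    shortfall_hours = 0
    for _ in range(n_hours):
        demand = rng.gauss(mean_demand, 0.1 * mean_demand)
        remaining = max(demand, 0.0)
        # dispatch the cheapest available capacity first (merit order)
        for cap, cost_per_mwh, availability in sorted(sources, key=lambda s: s[1]):
            if rng.random() < availability:
                used = min(cap, remaining)
                total_cost += used * cost_per_mwh
                total_energy += used
                remaining -= used
        if remaining > 1e-9:
            shortfall_hours += 1
    return total_cost / total_energy, shortfall_hours / n_hours

# (capacity MW, cost per MWh, hourly availability) -- hypothetical numbers
mix = [(1000, 40.0, 0.9), (500, 70.0, 0.95), (800, 25.0, 0.3)]
avg_cost, loss_of_load = simulate_mix_cost(mix, mean_demand=1500.0)
```

The shortfall fraction plays the role of a (crude) security-of-supply indicator, and the non-linear dependence of cost on demand emerges because higher demand pulls in progressively more expensive plants.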
C-arm based cone-beam CT using a two-concentric-arc source trajectory: system evaluation
NASA Astrophysics Data System (ADS)
Zambelli, Joseph; Zhuang, Tingliang; Nett, Brian E.; Riddell, Cyril; Belanger, Barry; Chen, Guang-Hong
2008-03-01
The current x-ray source trajectory for C-arm based cone-beam CT is a single arc. Reconstruction from data acquired with this trajectory yields cone-beam artifacts for regions other than the central slice. In this work we present the preliminary evaluation of reconstruction from a source trajectory of two concentric arcs using a flat-panel detector equipped C-arm gantry (GE Healthcare Innova 4100 system, Waukesha, Wisconsin). The reconstruction method employed is a summation of FDK-type reconstructions from the two individual arcs. For the angle between arcs studied here, 30°, this method offers a significant reduction in the visibility of cone-beam artifacts, with the additional advantages of simplicity and ease of implementation due to the fact that it is a direct extension of the reconstruction method currently implemented on commercial systems. Reconstructed images from data acquired from the two arc trajectory are compared to those reconstructed from a single arc trajectory and evaluated in terms of spatial resolution, low contrast resolution, noise, and artifact level.
C-arm based cone-beam CT using a two-concentric-arc source trajectory: system evaluation.
Zambelli, Joseph; Zhuang, Tingliang; Nett, Brian E; Riddell, Cyril; Belanger, Barry; Chen, Guang-Hong
2008-01-01
The current x-ray source trajectory for C-arm based cone-beam CT is a single arc. Reconstruction from data acquired with this trajectory yields cone-beam artifacts for regions other than the central slice. In this work we present the preliminary evaluation of reconstruction from a source trajectory of two concentric arcs using a flat-panel detector equipped C-arm gantry (GE Healthcare Innova 4100 system, Waukesha, Wisconsin). The reconstruction method employed is a summation of FDK-type reconstructions from the two individual arcs. For the angle between arcs studied here, 30°, this method offers a significant reduction in the visibility of cone-beam artifacts, with the additional advantages of simplicity and ease of implementation due to the fact that it is a direct extension of the reconstruction method currently implemented on commercial systems. Reconstructed images from data acquired from the two arc trajectory are compared to those reconstructed from a single arc trajectory and evaluated in terms of spatial resolution, low contrast resolution, noise, and artifact level.
NASA Technical Reports Server (NTRS)
Yee, Helen M. C.; Kotov, D. V.; Wang, Wei; Shu, Chi-Wang
2013-01-01
The goal of this paper is to relate the numerical dissipation inherent in high order shock-capturing schemes with the onset of wrong propagation speed of discontinuities. For pointwise evaluation of the source term, previous studies indicated that the phenomenon of wrong propagation speed of discontinuities is connected with the smearing of the discontinuity caused by the discretization of the advection term. The smearing introduces a nonequilibrium state into the calculation. Thus, as soon as a nonequilibrium value is introduced in this manner, the source term turns on and immediately restores equilibrium, while at the same time shifting the discontinuity to a cell boundary. The present study shows that the degree of wrong propagation speed of discontinuities is highly dependent on the accuracy of the numerical method. The manner in which the smearing of discontinuities is contained by the numerical method and the overall amount of numerical dissipation being employed play major roles. Moreover, employing finite time steps and grid spacings that are below the standard Courant-Friedrichs-Lewy (CFL) limit for shock-capturing methods on compressible Euler and Navier-Stokes equations containing stiff reacting source terms and discontinuities reveals surprising counter-intuitive results. Unlike non-reacting flows, for stiff reactions with discontinuities, employing a time step and grid spacing that are below the CFL limit (based on the homogeneous part or non-reacting part of the governing equations) does not guarantee a correct solution of the chosen governing equations. Instead, depending on the numerical method, time step and grid spacing, the numerical simulation may lead to (a) the correct solution (within the truncation error of the scheme), (b) a divergent solution, (c) a solution with the wrong propagation speed of discontinuities or (d) other spurious solutions that are solutions of the discretized counterparts but are not solutions of the governing equations.
The present investigation for three very different stiff system cases confirms some of the findings of Lafon & Yee (1996) and LeVeque & Yee (1990) for a model scalar PDE. The findings might shed some light on the reported difficulties in numerical combustion and problems with stiff nonlinear (homogeneous) source terms and discontinuities in general.
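The wrong-speed phenomenon is easy to reproduce with the scalar model problem of LeVeque & Yee, u_t + u_x = -mu u(u-1)(u-1/2), in the stiff limit. The sketch below uses first-order upwind advection with a split reaction step that snaps u to the nearest stable equilibrium (0 or 1); the scheme and parameters are illustrative choices, whereas the paper studies high-order shock-capturing schemes.

```python
# Stiff scalar model problem u_t + u_x = -mu*u*(u-1)*(u-0.5) (LeVeque & Yee).
# Operator splitting: first-order upwind advection, then a "stiff limit"
# reaction step that projects u onto the nearest stable equilibrium (0 or 1).
import numpy as np

nx, cfl = 100, 0.4
u = np.where(np.arange(nx) < 20, 1.0, 0.0)  # front initially at cell 20
front0 = int(np.argmin(u))                   # index of first cell with u == 0

for _ in range(50):
    u[1:] = u[1:] - cfl * (u[1:] - u[:-1])   # upwind advection (inflow u=1)
    u = np.where(u >= 0.5, 1.0, 0.0)         # stiff reaction: snap to 0 or 1

front = int(np.argmin(u))
# front stays at cell 20, while the exact front (speed 1) would reach cell 40
```

With CFL = 0.4 the advection step raises the first downstream cell only to 0.4, so the reaction step snaps it back to 0 every step and the numerical front never moves: a zero propagation speed instead of the exact speed 1. (With CFL above 0.5 the same scheme advances the front one cell per step, which is too fast.)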
NEXT GENERATION LEACHING TESTS FOR EVALUATING ...
In the U.S. as in other countries, there is increased interest in using industrial by-products as alternative or secondary materials, helping to conserve virgin or raw materials. The LEAF and associated test methods are being used to develop the source term for leaching of any inorganic constituents of potential concern (COPC) in determining what is environmentally acceptable. The leaching test methods include batch equilibrium, percolation column and semi-dynamic mass transport tests for monolithic and compacted granular materials. By testing over a range of values for pH, liquid/solid ratio, and physical form of the material, this approach allows one data set to be used to evaluate a range of management scenarios for a material, representing different environmental conditions (e.g., disposal or beneficial use). The results from these tests may be interpreted individually or integrated to identify a solid material’s characteristic leaching behavior. Furthermore, the LEAF approach provides the ability to make meaningful comparisons of leaching between similar and dissimilar materials of national and worldwide origins. The objective is to present EPA's research under SHC to implement validated leaching tests, referred to as the Leaching Environmental Assessment Framework (LEAF). The primary focus will be on the guidance for implementation of LEAF, describing three case studies for developing source terms for evaluating inorganic constituents.
NASA Astrophysics Data System (ADS)
Pandey, Arun; Bandyopadhyay, M.; Sudhir, Dass; Chakraborty, A.
2017-10-01
Helicon wave heated plasmas are much more efficient in terms of ionization per unit power consumed. A permanent magnet based compact helicon wave heated plasma source is developed in the Institute for Plasma Research, after carefully optimizing the geometry, the frequency of the RF power, and the magnetic field conditions. The HELicon Experiment for Negative ion-I source is the single driver helicon plasma source that is being studied for the development of a large sized, multi-driver negative hydrogen ion source. In this paper, the details about the single driver machine and the results from the characterization of the device are presented. A parametric study at different pressures and magnetic field values using a 13.56 MHz RF source has been carried out in argon plasma, as an initial step towards source characterization. A theoretical model is also presented for the particle and power balance in the plasma. The ambipolar diffusion process taking place in a magnetized helicon plasma is also discussed.
Pandey, Arun; Bandyopadhyay, M; Sudhir, Dass; Chakraborty, A
2017-10-01
Helicon wave heated plasmas are much more efficient in terms of ionization per unit power consumed. A permanent magnet based compact helicon wave heated plasma source is developed in the Institute for Plasma Research, after carefully optimizing the geometry, the frequency of the RF power, and the magnetic field conditions. The HELicon Experiment for Negative ion-I source is the single driver helicon plasma source that is being studied for the development of a large sized, multi-driver negative hydrogen ion source. In this paper, the details about the single driver machine and the results from the characterization of the device are presented. A parametric study at different pressures and magnetic field values using a 13.56 MHz RF source has been carried out in argon plasma, as an initial step towards source characterization. A theoretical model is also presented for the particle and power balance in the plasma. The ambipolar diffusion process taking place in a magnetized helicon plasma is also discussed.
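The ambipolar diffusion mentioned in the record above is governed, in the simplest unmagnetized two-species picture, by D_a = (mu_i D_e + mu_e D_i)/(mu_i + mu_e), which reduces to roughly D_i (1 + Te/Ti) since mu_e >> mu_i (via the Einstein relation D/mu = kT/e). The sketch below checks that reduction with hypothetical values; the numbers are assumptions for illustration, not measurements from this device.

```python
def ambipolar_diffusion(d_i, mu_i, d_e, mu_e):
    """Unmagnetized ambipolar diffusion coefficient:
    D_a = (mu_i*D_e + mu_e*D_i) / (mu_i + mu_e)."""
    return (mu_i * d_e + mu_e * d_i) / (mu_i + mu_e)

# Hypothetical argon-like numbers in consistent units; only ratios matter here.
te, ti = 3.0, 0.05            # electron / ion temperatures, eV
mu_i, mu_e = 1.0, 1000.0      # mobilities (assumed ratio)
d_i, d_e = mu_i * ti, mu_e * te   # Einstein relation D = mu * (kT/e)
d_a = ambipolar_diffusion(d_i, mu_i, d_e, mu_e)
approx = d_i * (1.0 + te / ti)    # common approximation for mu_e >> mu_i
```

The exact and approximate values agree to well under a percent for these parameters, and both show the ions diffusing much faster than their free rate because the electrons drag them along.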
Healthy Steps: a systematic review of a preventive practice-based model of pediatric care.
Piotrowski, Caroline C; Talavera, Gregory A; Mayer, Joni A
2009-02-01
The preventive role of anticipatory guidance in pediatric practice has gained increasing importance over the last two decades, resulting in the development of competing models of practice-based care. Our goal was to systematically evaluate and summarize the literature pertaining to the Healthy Steps Program for Young Children, a widely cited and utilized preventive model of care and anticipatory guidance. Medline and the bibliographies of review articles were searched for relevant studies using the keywords: Healthy Steps, preventive care, pediatric practice and others. Other sources included references of retrieved publications, review articles, and books; government documents; and Internet sources. Relevant sources were selected on the basis of their empirical evaluation of some component of care (e.g., child outcomes, parent outcomes, quality of care). From 21 identified articles, 13 met the inclusion criterion of empirical evaluation. These evaluations were summarized and compared. Results indicated that the Healthy Steps program has been rigorously evaluated and shown to be effective in preventing negative child and parent outcomes and enhancing positive outcomes. Despite limited information concerning cost effectiveness, the Healthy Steps Program provides clear benefit through early screening, family-centered care, and evidence-based anticipatory guidance. It is recommended that the Healthy Steps program be more widely disseminated to relevant stakeholders, and further enhanced by improved linguistic and cultural sensitivity and long-term evaluation of cost effectiveness.
Predicting vertically-nonsequential wetting patterns with a source-responsive model
Nimmo, John R.; Mitchell, Lara
2013-01-01
Water infiltrating into soil of natural structure often causes wetting patterns that do not develop in an orderly sequence. Because traditional unsaturated flow models represent a water advance that proceeds sequentially, they fail to predict irregular development of water distribution. In the source-responsive model, a diffuse domain (D) represents flow within soil matrix material following traditional formulations, and a source-responsive domain (S), characterized in terms of the capacity for preferential flow and its degree of activation, represents preferential flow as it responds to changing water-source conditions. In this paper we assume water undergoing rapid source-responsive transport at any particular time is of negligibly small volume; it becomes sensible at the time and depth where domain transfer occurs. A first-order transfer term represents abstraction from the S to the D domain which renders the water sensible. In tests with lab and field data, for some cases the model shows good quantitative agreement, and in all cases it captures the characteristic patterns of wetting that proceed nonsequentially in the vertical direction. In these tests we determined the values of the essential characterizing functions by inverse modeling. These functions relate directly to observable soil characteristics, rendering them amenable to evaluation and improvement through hydropedologic development.
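The abstract describes a first-order transfer term that abstracts water from the source-responsive (S) domain into the diffuse (D) domain, making it "sensible" at a given depth. The sketch below is our illustrative reading of that idea, not the authors' model: the exponential flux attenuation with depth and all parameter names are assumptions for the sake of a runnable example.

```python
import math

def source_responsive_transfer(q_top, alpha, dz, n_layers, dt):
    """Distribute a surface preferential-flow flux q_top [m/s] over
    n_layers of thickness dz [m], abstracting water from the S domain
    to the D domain at a first-order rate alpha [1/m] (hypothetical).
    Returns the water-content increment of the D domain per layer."""
    q = q_top                      # S-domain flux entering the top layer
    d_theta = []
    for _ in range(n_layers):
        q_out = q * math.exp(-alpha * dz)   # flux leaving the layer bottom
        d_theta.append((q - q_out) * dt / dz)  # water made sensible here
        q = q_out
    return d_theta
```

Because the transfer is first-order in the local flux, the wetting increments decay with depth while total mass is conserved between the flux entering the top and leaving the bottom.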
Numerical study of supersonic combustion using a finite rate chemistry model
NASA Technical Reports Server (NTRS)
Chitsomboon, T.; Tiwari, S. N.; Kumar, A.; Drummond, J. P.
1986-01-01
The governing equations of two-dimensional chemically reacting flows are presented together with a global two-step chemistry model for H2-air combustion. The explicit unsplit MacCormack finite difference algorithm is used to advance the discrete system of the governing equations in time until convergence is attained. The source terms in the species equations are evaluated implicitly to alleviate stiffness associated with fast reactions. With implicit source terms, the species equations give rise to a block-diagonal system which can be solved very efficiently on vector-processing computers. A supersonic reacting flow in an inlet-combustor configuration is calculated for the case where H2 is injected into the flow from the side walls and the strut. Results of the calculation are compared against the results obtained by using a complete reaction model.
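The implicit evaluation of the chemical source terms described above leads to small per-cell blocks rather than one large coupled system. The following is a generic point-implicit sketch (not the paper's MacCormack implementation): each cell solves (I - dt·dS/dU)·dU = dt·S(U), which is the block-diagonal structure that vectorizes well.

```python
import numpy as np

def point_implicit_update(u, source, jac, dt):
    """Advance stiff species source terms one step, cell by cell.

    u      : (n_cells, n_species) species state array
    source : callable, source(u_i) -> S(u_i), shape (n_species,)
    jac    : callable, jac(u_i) -> dS/dU, shape (n_species, n_species)
    dt     : time step
    """
    u_new = np.empty_like(u)
    n_sp = u.shape[1]
    for i, ui in enumerate(u):
        a = np.eye(n_sp) - dt * jac(ui)       # small per-cell block
        du = np.linalg.solve(a, dt * source(ui))
        u_new[i] = ui + du
    return u_new
```

For a linear decay source S = -k·u this reduces to the unconditionally stable backward-Euler update u/(1 + k·dt), which is why stiffness from fast reactions is alleviated.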
Managing multicentre clinical trials with open source.
Raptis, Dimitri Aristotle; Mettler, Tobias; Fischer, Michael Alexander; Patak, Michael; Lesurtel, Mickael; Eshmuminov, Dilmurodjon; de Rougemont, Olivier; Graf, Rolf; Clavien, Pierre-Alain; Breitenstein, Stefan
2014-03-01
Multicentre clinical trials are challenged by a high administrative burden, data management pitfalls, and costs. This reduces the enthusiasm and commitment of the physicians involved and thus leads to reluctance in conducting multicentre clinical trials. The purpose of this study was to develop a web-based open source platform to support a multicentre clinical trial. Using the design science research approach, we developed a web-based, multicentre clinical trial management system on Drupal, open source software distributed under the terms of the General Public License. The system was evaluated by user testing, has supported several completed and ongoing clinical trials well, and is available for free download. Open source clinical trial management systems are capable of supporting multicentre clinical trials by enhancing efficiency, quality of data management, and collaboration.
Convenience or Credibility? A Study of College Student Online Research Behaviors
ERIC Educational Resources Information Center
Biddix, J. Patrick; Chung, Chung Joo; Park, Han Woo
2011-01-01
The purpose of this study was to investigate where students turn for course-related assignments, whether an ordered pattern could be described in terms of which sources students turn to and how students evaluated the information they chose to use. Data were drawn from open-ended questionnaires (n = 282). Semantic network analysis was conducted…
ERIC Educational Resources Information Center
Alavi, Seyed Mohammad; Bordbar, Soodeh
2017-01-01
Differential Item Functioning (DIF) analysis is a key element in evaluating educational test fairness and validity. One of the frequently cited sources of construct-irrelevant variance is gender, which has an important role in the university entrance exam; it therefore causes bias and consequently undermines test validity. The present study aims…
ERIC Educational Resources Information Center
Kanyaprasith, Kamonwan; Finley, Fred N.; Phonphok, Nason
2015-01-01
This study evaluates a cross-cultural experience in science and mathematics teaching in Thailand--an internship program. In this study, qualitative data sources including semi-structured interviews, classroom observations, and pre-post questionnaire were collected from five groups of participants, which were: (a) administrators; (b) Thai…
ERIC Educational Resources Information Center
Jansen, Malte; Schroeders, Ulrich; Lüdtke, Oliver; Marsh, Herbert W.
2015-01-01
Students evaluate their achievement in a specific domain in relation to their achievement in other domains and form their self-concepts accordingly. These comparison processes have been termed "dimensional comparisons" and shown to be an important source of academic self-concepts in addition to social and temporal comparisons. Research…
The report gives results of an evaluation of the condition of and air emissions from old, phase-2-certified wood heaters installed in homes and used regularly for home heating since the 1992/1993 heating season or earlier. (NOTE: Wood stoves have been identified as a major source of ...
Photovoltaics as a terrestrial energy source. Volume 3: An overview
NASA Technical Reports Server (NTRS)
Smith, J. L.
1980-01-01
Photovoltaic (PV) systems were evaluated in terms of their potential for terrestrial application. A comprehensive overview of important issues bearing on PV systems development is presented. Studies of PV system costs, the societal implications of PV system development, and strategies in PV research and development in relation to current energy policies are summarized.
Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P
2016-10-01
An open issue still under investigation by several international entities working on safety and security for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the estimates available at this time are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from a broad information gathering. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.
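The screening step that reduces the parameter space can be illustrated with a crude one-at-a-time perturbation ranking. This is our generic sketch, not the statistical toolchain of the paper (which combines screening with formal sensitivity and uncertainty analysis); the function and parameter names are hypothetical.

```python
def one_at_a_time_screening(model, base, deltas):
    """Rank parameters by the magnitude of the response change when
    each is perturbed individually from a base point.

    model  : callable mapping a dict of parameters to a scalar output
    base   : dict of baseline parameter values
    deltas : dict of perturbation sizes per parameter
    """
    y0 = model(base)
    effects = {}
    for name, d in deltas.items():
        perturbed = dict(base)
        perturbed[name] = base[name] + d
        effects[name] = abs(model(perturbed) - y0)
    # most influential parameters first
    return sorted(effects, key=effects.get, reverse=True)
```

Parameters with small effects can then be frozen at nominal values before the more expensive uncertainty analysis.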
Jia, Zhiqing; Zhu, Yajuan; Liu, Liying
2012-01-01
Background In a semi-arid ecosystem, water is one of the most important factors that affect vegetation dynamics, such as shrub plantation. A water use strategy, including the main water source that a plant species utilizes and water use efficiency (WUE), plays an important role in plant survival and growth. The water use strategy of a shrub is one of the key factors in the evaluation of stability and sustainability of a plantation. Methodology/Principal Findings Caragana intermedia is a dominant shrub of sand-binding plantations on sand dunes in the Gonghe Basin in northeastern Tibet Plateau. Understanding the water use strategy of a shrub plantation can be used to evaluate its sustainability and long-term stability. We hypothesized that C. intermedia uses mainly deep soil water and its WUE increases with plantation age. Stable isotopes of hydrogen and oxygen were used to determine the main water source and leaf carbon isotope discrimination was used to estimate long-term WUE. The root system was investigated to determine the depth of the main distribution. The results showed that a 5-year-old C. intermedia plantation used soil water mainly at a depth of 0–30 cm, which was coincident with the distribution of its fine roots. However, 9- or 25-year-old C. intermedia plantations used mainly 0–50 cm soil depth water and the fine root system was distributed primarily at soil depths of 0–50 cm and 0–60 cm, respectively. These sources of soil water are recharged directly by rainfall. Moreover, the long-term WUE of adult plantations was greater than that of juvenile plantations. Conclusions The C. intermedia plantation can change its water use strategy over time as an adaptation to a semi-arid environment, including increasing the depth of soil water used for root growth, and increasing long-term WUE. PMID:23029303
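The abstract states that stable isotopes of hydrogen and oxygen were used to determine the main water source, but not the mixing model employed. A standard two-end-member isotope mass balance (an assumption here, with hypothetical function names) is a minimal sketch of that kind of source partitioning:

```python
def two_source_fraction(delta_plant, delta_shallow, delta_deep):
    """Fraction of plant stem water drawn from the shallow source,
    from a two-end-member mass balance:
        delta_plant = f * delta_shallow + (1 - f) * delta_deep
    where delta values are isotope ratios (e.g., delta-18O in permil).
    """
    if delta_shallow == delta_deep:
        raise ValueError("end-member signatures must differ")
    return (delta_plant - delta_deep) / (delta_shallow - delta_deep)
```

For example, a stem value midway between the shallow and deep soil-water signatures implies roughly half of the uptake from each depth; real studies with more than two sources typically use multi-source mixing models instead.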
NASA Astrophysics Data System (ADS)
Kwon, Hyeokjun; Kang, Yoojin; Jang, Junwoo
2017-09-01
Color fidelity has been used as one of the indices to evaluate the performance of light sources. Since the Color Rendering Index (CRI) was proposed by the CIE, many color fidelity metrics have been proposed to increase the accuracy of the metric. This paper focuses on a comparison of color fidelity metrics in terms of their agreement with human visual assessments. To visually evaluate the color fidelity of light sources, we built a simulator that reproduces color samples under given lighting conditions. Eighteen color samples of the Macbeth color checker under each test light source and its reference illuminant are simulated and displayed on a well-characterized monitor. With only the spectrum of the test light source and reference illuminant, color samples under any lighting condition can be reproduced. The spectra of two LED and two OLED light sources that have similar CRI values are used for the visual assessment. In addition, the results of the visual assessment are compared with two color fidelity metrics: CRI and IES TM-30-15 (Rf), proposed by the Illuminating Engineering Society (IES) in 2015. Experimental results indicate that Rf outperforms CRI in terms of correlation with visual assessment.
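The comparison of metrics against visual assessment comes down to a correlation between metric scores and observer ratings. As a generic illustration (the paper does not specify its correlation statistic, so Pearson's r is our assumption):

```python
def pearson_r(xs, ys):
    """Pearson correlation between a fidelity metric's scores for a set
    of light sources and the mean visual-assessment ratings for the
    same sources."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

A higher |r| for Rf than for CRI across the four test sources would correspond to the paper's conclusion that Rf tracks visual assessment more closely.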
Free-electron laser emission architecture impact on extreme ultraviolet lithography
NASA Astrophysics Data System (ADS)
Hosler, Erik R.; Wood, Obert R.; Barletta, William A.
2017-10-01
Laser-produced plasma (LPP) EUV sources have demonstrated ~125 W at customer sites, establishing confidence in EUV lithography (EUVL) as a viable manufacturing technology. However, for extension to the 3-nm technology node and beyond, existing scanner/source technology must enable higher-NA imaging systems (requiring increased resist dose and providing half-field exposures) and/or EUV multipatterning (requiring increased wafer throughput proportional to the number of exposure passes). Both development paths will require a substantial increase in EUV source power to maintain the economic viability of the technology, creating an opportunity for free-electron laser (FEL) EUV sources. FEL-based EUV sources offer an economic, high-power/single-source alternative to LPP EUV sources. Should FELs become the preferred next-generation EUV source, the choice of FEL emission architecture will greatly affect its operational stability and overall capability. A near-term industrialized FEL is expected to utilize one of the following three existing emission architectures: (1) self-amplified spontaneous emission, (2) regenerative amplifier, or (3) self-seeding. Model accelerator parameters are put forward to evaluate the impact of emission architecture on FEL output. Then, variations in the parameter space are applied to assess the potential impact to lithography operations, thereby establishing component sensitivity. The operating range of various accelerator components is discussed based on current accelerator performance demonstrated at various scientific user facilities. Finally, comparison of the performance between the model accelerator parameters and the variation in parameter space provides a means to evaluate the potential emission architectures. A scorecard is presented to facilitate this evaluation and provides a framework for future FEL design and enablement for EUVL applications.
Deneulin, Pascale; Reverdy, Caroline; Rébénaque, Pierrick; Danthe, Eve; Mulhauser, Blaise
2018-04-01
Honey is a natural product with very diverse sensory attributes that are influenced by the flower source, the bee species, the geographic origin, the treatments and conditions during storage. This study aimed at describing 50 honeys from diverse flower sources in different continents and islands, stored under various conditions. Many articles have been published on the sensory characterization of honeys, thus a common list of attributes has been established, but it appeared to be poorly suited to describe a large number of honeys from around the world. This is why the novel and rapid sensory evaluation method, the Pivot Profile©, was tested, with the participation of 15 panelists during five sessions. The first objective was to obtain a sensory description of the 50 honeys that were tested. From 1152 distinct terms, a list of 29 sensory attributes was established and the attributes divided into three categories: color/texture (8 terms), aroma (16 terms), and taste (5 terms). At first, the honeys have been ranked according to their level of crystallization from fluid/liquid to viscous/hard. Then color was the second assessment factor of the variability. In terms of aroma, honeys from Africa were characterized by smoky, resin, caramel and dried fruit as opposed to floral and fruity, mainly for honeys from South America and Europe. Finally, the honeys were ranked according to their sweetness. The second objective of this study was to test the new sensory method, called Pivot Profile© which is used to describe a large number of products with interpretable results. Copyright © 2017 Elsevier Ltd. All rights reserved.
Medrano-Félix, Andrés; Estrada-Acosta, Mitzi; Peraza-Garay, Felipe; Castro-Del Campo, Nohelia; Martínez-Urtaza, Jaime; Chaidez, Cristóbal
2017-08-01
Long-term exposure to river water by non-indigenous micro-organisms such as Salmonella may affect metabolic adaptation to carbon sources. This study was conducted to determine differences in carbon source utilization between Salmonella Oranienburg and Salmonella Saintpaul (isolated from tropical river water) and the control strain Salmonella Typhimurium under laboratory, river water, and host cell (Hep-2 cell line) growth conditions. Results showed that S. Oranienburg and S. Saintpaul had a better ability to utilize carbon sources under the three growth conditions evaluated; however, S. Oranienburg showed the fastest and highest utilization of different carbon sources, including D-Glucosaminic acid, N-acetyl-D-Glucosamine, Glucose-1-phosphate, and D-Galactonic acid, while S. Saintpaul and S. Typhimurium showed limited utilization of carbon sources. In conclusion, this study suggests that environmental Salmonella strains have better survival and preconditioning abilities in external environments than the control strain, based on their plasticity in the use of diverse carbon sources.
Playdon, Mary; Ferrucci, Leah M; McCorkle, Ruth; Stein, Kevin D; Cannady, Rachel; Sanft, Tara; Cartmel, Brenda
2016-08-01
Survivorship care plans (SCPs) provide cancer patients and health care providers with a treatment summary and outline of recommended medical follow-up. Few studies have investigated the information needs and preferred sources among long-term cancer survivors. Cancer survivors of the ten most common cancers enrolled in the longitudinal Study of Cancer Survivors-I (SCS-I) completed a survey 9 years post-diagnosis (n = 3138); at time of diagnosis of the SCS-I cohort, SCPs were not considered usual care. We assessed participants' current desire and preferred sources for information across ten SCP items and evaluated factors associated with information need 9 years after diagnosis. The proportion of long-term cancer survivors endorsing a need for cancer and health information 9 years post-diagnosis ranged from 43 % (cancer screening) to 9 % (consequences of cancer on ability to work). Print media and personalized reading materials were the most preferred information sources. Younger age, higher education, race other than non-Hispanic white, later cancer stage, having breast cancer, having ≥2 comorbidities, and self-reporting poor health were associated with greater informational need (p < 0.05). Long-term cancer survivors continue to report health information needs for most SCP items and would prefer a print format; however, level of need differs by socio-demographic and cancer characteristics. Cancer survivors who did not previously receive a SCP may still benefit from receiving SCP content, and strategies for enabling dissemination to long-term survivors warrant further investigation.
NASA Astrophysics Data System (ADS)
Bieringer, Paul E.; Rodriguez, Luna M.; Vandenberghe, Francois; Hurst, Jonathan G.; Bieberbach, George; Sykes, Ian; Hannan, John R.; Zaragoza, Jake; Fry, Richard N.
2015-12-01
Accurate simulations of the atmospheric transport and dispersion (AT&D) of hazardous airborne materials rely heavily on the source term parameters necessary to characterize the initial release and on the meteorological conditions that drive the downwind dispersion. In many cases the source parameters are not known and are consequently based on rudimentary assumptions. This is particularly true of accidental releases and of the intentional releases associated with terrorist incidents. When available, meteorological observations are often not representative of the conditions at the location of the release, and the use of these non-representative conditions can result in significant errors in the hazard assessments downwind of the sensors, even when the other source parameters are accurately characterized. Here, we describe a computationally efficient methodology to characterize both the release source parameters and the low-level winds (e.g., winds near the surface) required to produce a refined downwind hazard. This methodology, known as the Variational Iterative Refinement Source Term Estimation (STE) Algorithm (VIRSA), consists of a combination of modeling systems: a back-trajectory based source inversion method; a forward Gaussian puff dispersion model; a variational refinement algorithm that uses a simple forward AT&D model as a surrogate for the more complex Gaussian puff model; and a formal adjoint of this surrogate model. The back-trajectory based method is used to calculate a "first guess" source estimate based on the available observations of the airborne contaminant plume and atmospheric conditions. The variational refinement algorithm is then used to iteratively refine the first-guess STE parameters and meteorological variables. The algorithm has been evaluated across a wide range of scenarios of varying complexity.
It has been shown to improve the estimated source location by several hundred percent (normalized by the distance from the source to the closest sampler) and to improve mass estimates by several orders of magnitude. Furthermore, it can operate in scenarios with inconsistencies between the wind and airborne contaminant sensor observations, adjusting the wind to provide a better match between the hazard prediction and the observations.
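The "first guess, then refine" structure of a source term estimation scheme can be sketched with a toy forward model that is linear in the release rate, so the rate can be refined in closed form from sensor readings. This is our simplification of the idea, not VIRSA itself: the dispersion coefficients, function names, and single-parameter refinement are all assumptions.

```python
import math

def gaussian_plume(q, x, y, u):
    """Ground-level concentration at downwind distance x [m] and
    crosswind offset y [m] for release rate q [kg/s] and wind speed
    u [m/s], with simple linear dispersion coefficients (illustrative)."""
    sy, sz = 0.08 * x, 0.06 * x
    return q / (math.pi * u * sy * sz) * math.exp(-0.5 * (y / sy) ** 2)

def refine_release_rate(obs, unit_model):
    """Least-squares release rate Q given sensor observations and the
    forward model evaluated at unit rate (Q = 1):
        minimize sum_i (obs_i - Q * m_i)^2  =>  Q = sum(o*m) / sum(m*m)
    """
    num = sum(o * m for o, m in zip(obs, unit_model))
    den = sum(m * m for m in unit_model)
    return num / den
```

In a full STE system the refinement is iterative and also adjusts location, timing, and the low-level winds, using an adjoint of the surrogate model rather than a closed-form solve.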
Campbell, W.H.
1986-01-01
Electric currents in long pipelines can contribute to corrosion effects that limit the pipe's lifetime. One cause of such electric currents is the geomagnetic field variations that have sources in the Earth's upper atmosphere. Knowledge of the general behavior of the sources allows a prediction of the occurrence times, favorable locations for the pipeline effects, and long-term projections of corrosion contributions. The source spectral characteristics, the Earth's conductivity profile, and a corrosion-frequency dependence limit the period range of the natural field changes that affect the pipe. The corrosion contribution by induced currents from geomagnetic sources should be evaluated for pipelines that are located at high and at equatorial latitudes. At midlatitude locations, the times of these natural current maxima should be avoided for the necessary accurate monitoring of the pipe-to-soil potential. © 1986 D. Reidel Publishing Company.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pawloski, G A; Tompson, A F B; Carle, S F
The objectives of this report are to develop, summarize, and interpret a series of detailed unclassified simulations that forecast the nature and extent of radionuclide release and near-field migration in groundwater away from the CHESHIRE underground nuclear test at Pahute Mesa at the NTS over 1000 yrs. Collectively, these results are called the CHESHIRE Hydrologic Source Term (HST). The CHESHIRE underground nuclear test was one of 76 underground nuclear tests that were fired below or within 100 m of the water table between 1965 and 1992 in Areas 19 and 20 of the NTS. These areas now comprise the Pahute Mesa Corrective Action Unit (CAU) for which a separate subregional scale flow and transport model is being developed by the UGTA Project to forecast the larger-scale migration of radionuclides from underground tests on Pahute Mesa. The current simulations are being developed, on one hand, to more fully understand the complex coupled processes involved in radionuclide migration, with a specific focus on the CHESHIRE test. While remaining unclassified, they are as site specific as possible and involve a level of modeling detail that is commensurate with the most fundamental processes, conservative assumptions, and representative data sets available. However, the simulation results are also being developed so that they may be simplified and interpreted for use as a source term boundary condition at the CHESHIRE location in the Pahute Mesa CAU model. In addition, the processes of simplification and interpretation will provide generalized insight as to how the source term behavior at other tests may be considered or otherwise represented in the Pahute Mesa CAU model.
Nitrogen enrichment regulates calcium sources in forests
Hynicka, Justin D.; Pett-Ridge, Julie C.; Perakis, Steven
2016-01-01
Nitrogen (N) is a key nutrient that shapes cycles of other essential elements in forests, including calcium (Ca). When N availability exceeds ecosystem demands, excess N can stimulate Ca leaching and deplete Ca from soils. Over the long term, these processes may alter the proportion of available Ca that is derived from atmospheric deposition vs. bedrock weathering, which has fundamental consequences for ecosystem properties and nutrient supply. We evaluated how landscape variation in soil N, reflecting long-term legacies of biological N fixation, influenced plant and soil Ca availability and ecosystem Ca sources across 22 temperate forests in Oregon. We also examined interactions between soil N and bedrock Ca using soil N gradients on contrasting basaltic vs. sedimentary bedrock that differed 17-fold in underlying Ca content. We found that low-N forests on Ca-rich basaltic bedrock relied strongly on Ca from weathering, but that soil N enrichment depleted readily weatherable mineral Ca and shifted forest reliance toward atmospheric Ca. Forests on Ca-poor sedimentary bedrock relied more consistently on atmospheric Ca across all levels of soil N enrichment. The broad importance of atmospheric Ca was unexpected given active regional uplift and erosion that are thought to rejuvenate weathering supply of soil minerals. Despite different Ca sources to forests on basaltic vs. sedimentary bedrock, we observed consistent declines in plant and soil Ca availability with increasing N, regardless of the Ca content of underlying bedrock. Thus, traditional measures of Ca availability in foliage and soil exchangeable pools may poorly reflect long-term Ca sources that sustain soil fertility. We conclude that long-term soil N enrichment can deplete available Ca and cause forests to rely increasingly on Ca from atmospheric deposition, which may limit ecosystem Ca supply in an increasingly N-rich world.
High-Order Residual-Distribution Hyperbolic Advection-Diffusion Schemes: 3rd-, 4th-, and 6th-Order
NASA Technical Reports Server (NTRS)
Mazaheri, Alireza R.; Nishikawa, Hiroaki
2014-01-01
In this paper, spatially high-order Residual-Distribution (RD) schemes using the first-order hyperbolic system method are proposed for general time-dependent advection-diffusion problems. The corresponding second-order time-dependent hyperbolic advection-diffusion scheme was first introduced in [NASA/TM-2014-218175, 2014], where rapid convergence over each physical time step, with typically fewer than five Newton iterations, was shown. In that method, the time-dependent hyperbolic advection-diffusion system (linear and nonlinear) was discretized by the second-order upwind RD scheme in a unified manner, and the system of implicit residual equations was solved efficiently by Newton's method over every physical time step. In this paper, two techniques for the source term discretization are proposed: 1) reformulation of the source terms in their divergence forms, and 2) a correction to the trapezoidal rule for the source term discretization. Third-, fourth-, and sixth-order RD schemes are then proposed with the above techniques that, relative to the second-order RD scheme, only cost the evaluation of either the first derivative or both the first and second derivatives of the source terms. A special fourth-order RD scheme is also proposed that is even less computationally expensive than the third-order RD schemes. The second-order Jacobian formulation was used for all the proposed high-order schemes. Numerical results are then presented for both steady and time-dependent, linear and nonlinear advection-diffusion problems. It is shown that these newly developed high-order RD schemes are remarkably efficient and capable of producing the solutions and the gradients to the same order of accuracy, with rapid convergence over each physical time step, typically fewer than ten Newton iterations.
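A derivative-based correction to the trapezoidal rule is a standard way to raise the order of a source term quadrature at the cost of endpoint derivatives, which matches the flavor of the correction mentioned above. The generic Euler-Maclaurin form below is our illustration; the paper's exact scheme may differ.

```python
def corrected_trapezoid(f, fp, a, b):
    """Trapezoidal rule with an endpoint-derivative correction:
        int_a^b f dx ~= h/2 * (f(a) + f(b)) - h^2/12 * (f'(b) - f'(a))
    where h = b - a. The correction raises the local accuracy from
    O(h^3) to O(h^5); the rule is exact for cubic polynomials.
    """
    h = b - a
    return h / 2 * (f(a) + f(b)) - h * h / 12 * (fp(b) - fp(a))
```

Note the trade-off the abstract highlights: the higher order costs only an evaluation of the source term's first derivative at the endpoints.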
Related Studies in Long Term Lithium Battery Stability
NASA Technical Reports Server (NTRS)
Horning, R. J.; Chua, D. L.
1984-01-01
The continuing growth of the use of lithium electrochemical systems in a wide variety of both military and industrial applications is primarily a result of the significant benefits associated with the technology such as high energy density, wide temperature operation and long term stability. The stability or long term storage capability of a battery is a function of several factors, each important to the overall storage life and, therefore, each potentially a problem area if not addressed during the design, development and evaluation phases of the product cycle. Design (e.g., reserve vs active), inherent material thermal stability, material compatibility and self-discharge characteristics are examples of factors key to the storability of a power source.
Scarton, Lou Ann; Del Fiol, Guilherme; Oakley-Girvan, Ingrid; Gibson, Bryan; Logan, Robert; Workman, T Elizabeth
2018-01-01
The research examined complementary and alternative medicine (CAM) information-seeking behaviors and preferences from short- to long-term cancer survival, including goals, motivations, and information sources. A mixed-methods approach was used with cancer survivors from the "Assessment of Patients' Experience with Cancer Care" 2004 cohort. Data collection included a mail survey and phone interviews using the critical incident technique (CIT). Seventy survivors from the 2004 study responded to the survey, and eight participated in the CIT interviews. Quantitative results showed that CAM usage did not change significantly between 2004 and 2015. The following themes emerged from the CIT: families' and friends' provision of the initial introduction to a CAM, use of CAM to manage the emotional and psychological impact of cancer, utilization of trained CAM practitioners, and online resources as a prominent source for CAM information. The majority of participants expressed an interest in an online information-sharing portal for CAM. Patients continue to use CAM well into long-term cancer survivorship. Finding trustworthy sources for information on CAM presents many challenges such as reliability of source, conflicting information on efficacy, and unknown interactions with conventional medications. Study participants expressed interest in an online portal to meet these needs through patient testimonials and linkage of claims to the scientific literature. Such a portal could also aid medical librarians and clinicians in locating and evaluating CAM information on behalf of patients.
Observed ground-motion variabilities and implication for source properties
NASA Astrophysics Data System (ADS)
Cotton, F.; Bora, S. S.; Bindi, D.; Specht, S.; Drouet, S.; Derras, B.; Pina-Valdes, J.
2016-12-01
One of the key challenges of seismology is to calibrate and analyse the physical factors that control earthquake and ground-motion variabilities. Within the framework of empirical ground-motion prediction equation (GMPE) development, ground-motion residuals (differences between recorded ground motions and the values predicted by a GMPE) are computed. The exponential growth of seismological near-field records and modern regression algorithms allow these residuals to be decomposed into between-event and within-event components. The between-event term quantifies all residual effects of the source (e.g. stress drops) that are not accounted for by the magnitude term, the only source parameter of the model. Between-event residuals provide a new and rather robust way to analyse the physical factors that control earthquake source properties and the associated variabilities. We will first show the correlation between classical stress drops and between-event residuals. We will also explain why between-event residuals may be a more robust way (compared to classical stress-drop analysis) to analyse earthquake source properties. We will then calibrate between-event variabilities using recent high-quality global accelerometric datasets (NGA-West 2, RESORCE) and datasets from recent earthquake sequences (Aquila, Iquique, Kumamoto). The obtained between-event variabilities will be used to evaluate the variability of earthquake stress drops, but also the variability of source properties that cannot be explained by classical Brune stress-drop variations. Finally, we will use the between-event residual analysis to discuss regional variations of source properties, differences between aftershocks and mainshocks, and potential magnitude dependencies of source characteristics.
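The between-event/within-event decomposition can be sketched in its simplest form as an event-mean split of the total residuals. In practice GMPE developers estimate these terms with mixed-effects regression, so the version below is only an illustrative approximation with hypothetical names.

```python
from collections import defaultdict

def decompose_residuals(residuals, event_ids):
    """Split total residuals r_ij (event i, record j) into a
    between-event term dB_i (the event average, capturing source
    effects such as stress drop) and within-event terms dW_ij."""
    groups = defaultdict(list)
    for r, e in zip(residuals, event_ids):
        groups[e].append(r)
    between = {e: sum(v) / len(v) for e, v in groups.items()}
    within = [r - between[e] for r, e in zip(residuals, event_ids)]
    return between, within
```

By construction the within-event terms average to zero for each event, so all systematic per-event (source) deviation is carried by dB_i, which is what gets correlated with stress-drop estimates.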
NASA Technical Reports Server (NTRS)
Hughes, Eric J.; Krotkov, Nickolay; da Silva, Arlindo; Colarco, Peter
2015-01-01
Simulation of volcanic emissions in climate models requires information that describes the eruption of the emissions into the atmosphere. While the total amount of gases and aerosols released from a volcanic eruption can be readily estimated from satellite observations, source parameters such as injection altitude, eruption time, and duration are often not directly known. The AeroCOM volcanic emissions inventory provides estimates of eruption source parameters and has been used to initialize volcanic emissions in reanalysis projects such as MERRA. The AeroCOM inventory provides an eruption's daily SO2 flux and plume-top altitude, yet an eruption can be very short lived, lasting only a few hours, and can emit clouds at multiple altitudes. Case studies comparing the satellite-observed dispersal of volcanic SO2 clouds to simulations in MERRA have shown mixed results. Some cases, such as Okmok (2008), show good agreement with observations; for other eruptions, such as Sierra Negra (2005), the observed initial SO2 mass is half of that in the simulations; in still other cases, such as Soufriere Hills (2006), the initial SO2 amount agrees with the observations but the dispersal rates differ markedly. In the aviation hazards community, deriving accurate source terms is crucial for monitoring and short-term (24-h) forecasting of volcanic clouds. Back-trajectory methods have been developed that use satellite observations and transport models to estimate the injection altitude, eruption time, and eruption duration of observed volcanic clouds. These methods can provide eruption timing estimates at 2-hour temporal resolution and estimate the altitude and depth of a volcanic cloud. To better understand the differences between MERRA simulations and volcanic SO2 observations, back-trajectory methods are used to estimate the source term parameters for several volcanic eruptions, which are then compared to their corresponding entries in the AeroCOM volcanic emission inventory.
The nature of these mixed results is discussed with respect to the source term estimates.
Study of nonpoint source nutrient loading in the Patuxent River basin, Maryland
Preston, S.D.
1997-01-01
Study of nonpoint-source (NPS) nutrient loading in Maryland has focused on the Patuxent watershed because of its importance and its representativeness of conditions in the State. Evaluation of NPS nutrient loading has been comprehensive and has included long-term monitoring, detailed watershed modeling, and synoptic sampling studies. A large amount of information has been compiled for the watershed, and that information is being used to identify primary controls and efficient management strategies for NPS nutrient loading. Results of the Patuxent NPS study have identified spatial trends in water quality that appear to be related to basin characteristics such as land use, physiography, and geology. Evaluation of the data compiled by the study components is continuing and is expected to provide more detailed assessments of the reasons for spatial trends. In particular, ongoing evaluation of the watershed model output is expected to provide detailed information on the relative importance of nutrient sources and transport pathways across the entire watershed. Planned future directions of NPS evaluation in the State of Maryland include continued study of water quality in the Patuxent watershed and a shift in emphasis to a statewide approach. Eventually, the statewide approach will become the primary approach used by the State to evaluate NPS loading. The information gained in the Patuxent study and the tools developed will represent valuable assets in developing the statewide NPS assessment program.
77 FR 19740 - Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant Accident
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
... NUCLEAR REGULATORY COMMISSION [NRC-2010-0249] Water Sources for Long-Term Recirculation Cooling... Regulatory Guide (RG) 1.82, ``Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant... regarding the sumps and suppression pools that provide water sources for emergency core cooling, containment...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-06-01
The bibliography contains citations concerning standards and standard tests for water quality in drinking water sources, reservoirs, and distribution systems. Standards from domestic and international sources are presented. Glossaries and vocabularies that concern water quality analysis, testing, and evaluation are included. Standard test methods for individual elements, selected chemicals, sensory properties, radioactivity, and other chemical and physical properties are described. Discussions for proposed standards on new pollutant materials are briefly considered. (Contains a minimum of 203 citations and includes a subject term index and title list.)
Photovoltaics as a terrestrial energy source. Volume 1: An introduction
NASA Technical Reports Server (NTRS)
Smith, J. L.
1980-01-01
Photovoltaic (PV) systems were examined for their potential for terrestrial application and future development. Photovoltaic technology, existing and potential photovoltaic applications, and the National Photovoltaics Program are reviewed. The competitive environment for this electrical source, affected by the presence or absence of utility-supplied power, is evaluated in terms of system prices. The roles of technological breakthroughs, directed research and technology development, learning curves, and commercial demonstrations in the National Program are discussed. The potential for photovoltaics to displace oil consumption is examined, as are the potential benefits of employing PV in either central-station or non-utility-owned, small, distributed systems.
NASA Astrophysics Data System (ADS)
Fomina, E. V.; Lesovik, V. S.; Fomin, A. E.; Kozhukhova, N. I.; Lebedev, M. S.
2018-03-01
Argillite is a carbonaceous industrial by-product that is a potential resource for an environmentally friendly, resource-saving construction industry. In this research, the chemical and mineral composition as well as the particle size distribution of argillite were studied and used to develop an autoclaved aerated concrete in which argillite partially substitutes for quartz sand. The effect of argillite as a mineral admixture in autoclaved aerated concrete was investigated in terms of compressive and tensile strength, density, heat conductivity, etc. The obtained results demonstrated the efficiency of argillite as an energy-saving material in autoclaved construction composites.
Conti, Andrea A
2008-11-01
The study of the use of English for medicine has become a continual source of enquiry. The aim of this survey was the systematic evaluation of the qualitative and quantitative perception, translation, and current use of English terms on the part of Italian health operators. Eight English terms directly connected with the health scenario or related to it ("compliance", "imaging", "likelihood", "odds ratio", "outcome", "stent", "test", "trial") were selected and, by means of a paper registration form, administered to forty Italian health professionals (non-physicians), already active in the health sector and attending specialised health degree courses. The participants were asked to furnish up to two translational proposals for every single English term, and after the written registration there followed a structured oral discussion of the translation, perception, and everyday use of the English terms in the working reality of the participants. This survey provides a scientific "real world" experience, and its qualitative and quantitative findings are of use in evaluating the level of correctness in the adoption of the English language on the part of health operators.
An Improved Elastic and Nonelastic Neutron Transport Algorithm for Space Radiation
NASA Technical Reports Server (NTRS)
Clowdsley, Martha S.; Wilson, John W.; Heinbockel, John H.; Tripathi, R. K.; Singleterry, Robert C., Jr.; Shinn, Judy L.
2000-01-01
A neutron transport algorithm including both elastic and nonelastic particle interaction processes, for use in space radiation protection with arbitrary shield materials, is developed. The algorithm is based upon multiple energy grouping and analysis of the straight-ahead Boltzmann equation using a mean value theorem for integrals. The algorithm is then coupled to the Langley HZETRN code through a bidirectional neutron evaporation source term. The neutron fluence generated by the solar particle event of February 23, 1956, for an aluminum-water shield-target configuration is then compared with MCNPX and LAHET Monte Carlo calculations for the same configuration. With the Monte Carlo calculation as a benchmark, the algorithm developed in this paper showed a great improvement in results over the unmodified HZETRN solution. In addition, a high-energy bidirectional neutron source based on a formula by Ranft showed even further improvement of the fluence results near the front of the water target, where diffusion out of the front surface is important. Effects of improved interaction cross sections are modest compared with the addition of the high-energy bidirectional source terms.
NASA Technical Reports Server (NTRS)
Spalvins, T.
1979-01-01
Ion plating is a plasma deposition technique where ions of the gas and the evaporant have a decisive role in the formation of a coating in terms of adherence, coherence, and morphological growth. The range of materials that can be ion plated is predominantly determined by the selection of the evaporation source. Based on the type of evaporation source, gaseous media and mode of transport, the following will be discussed: resistance, electron beam sputtering, reactive and ion beam evaporation. Ionization efficiencies and ion energies in the glow discharge determine the percentage of atoms which are ionized under typical ion plating conditions. The plating flux consists of a small number of energetic ions and a large number of energetic neutrals. The energy distribution ranges from thermal energies up to a maximum energy of the discharge. The various reaction mechanisms which contribute to the exceptionally strong adherence - formation of a graded substrate/coating interface are not fully understood, however the controlling factors are evaluated. The influence of process variables on the nucleation and growth characteristics are illustrated in terms of morphological changes which affect the mechanical and tribological properties of the coating.
Representing Thoughts, Words, and Things in the UMLS
Campbell, Keith E.; Oliver, Diane E.; Spackman, Kent A.; Shortliffe, Edward H.
1998-01-01
The authors describe a framework, based on the Ogden-Richards semiotic triangle, for understanding the relationship between the Unified Medical Language System (UMLS) and the source terminologies from which the UMLS derives its content. They pay particular attention to UMLS's Concept Unique Identifier (CUI) and the sense of “meaning” it represents as contrasted with the sense of “meaning” represented by the source terminologies. The CUI takes on emergent meaning through linkage to terms in different terminology systems. In some cases, a CUI's emergent meaning can differ significantly from the original sources' intended meanings of terms linked by that CUI. Identification of these different senses of meaning within the UMLS is consistent with historical themes of semantic interpretation of language. Examination of the UMLS within such a historical framework makes it possible to better understand the strengths and limitations of the UMLS approach for integrating disparate terminologic systems and to provide a model, or theoretic foundation, for evaluating the UMLS as a Possible World—that is, as a mathematical formalism that represents propositions about some perspective or interpretation of the physical world. PMID:9760390
Aerosol Microphysics and Radiation Integration
2007-09-30
http://www.nrlmry.navy.mil/flambe/ LONG-TERM GOALS: This project works toward the development and support of real-time global prognostic aerosol... Burning Emissions (FLAMBE) project were transitioned to the Fleet Numerical Meteorology and Oceanography Center (FNMOC), Monterey, in FY07. Meteorological guidance... Hyer, E. J. and J. S. Reid (2006), Evaluating the impact of improvements to the FLAMBE smoke source model on forecasts of aerosol distribution
ERIC Educational Resources Information Center
Kanof, Marjorie E.
The most widely used school-based substance abuse prevention program in the United States is the Drug Abuse Resistance Education (DARE) program, which is funded by a variety of sources, including private, federal, and other public entities. DARE's primary mission is to provide children with the information and skills they need to live drug- and…
NASA Astrophysics Data System (ADS)
Neville, J.; Emanuel, R. E.
2017-12-01
In 2016, Hurricane Matthew brought immense flooding and devastation to the Lumbee (aka Lumber) River basin. Some impacts are obvious, such as deserted homes and businesses, but other impacts, including long-term environmental effects, remain uncertain. Extreme flooding throughout the basin established temporary hydrologic connectivity between aquatic environments and upland sources of nutrients and other pollutants. Though 27% of the basin is covered by wetlands, hurricane-induced flooding was so intense that wetlands may have had no opportunity to mitigate delivery of nutrients into surface waters. As a result, how Hurricane Matthew affected nitrate retention and uptake in the Lumbee River remains uncertain. The unknown magnitude of nitrate transported into the Lumbee River from surrounding sources may have lingering impacts on nitrogen cycling in this stream. With these potential impacts in mind, we conducted a Lagrangian water quality sampling campaign to assess the ability of the Lumbee River to retain and process nitrogen following Hurricane Matthew. We collected samples before and after flooding and compared first-order nitrogen uptake kinetics between the two periods. The analysis and comparisons allow us to evaluate the long-term impacts of Hurricane Matthew on nitrogen cycling after floodwaters recede.
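The first-order uptake comparison described above can be sketched as follows; the concentrations and travel time are hypothetical illustrations, not Lumbee River data:

```python
import math

def first_order_uptake_rate(c_up, c_down, travel_time_h):
    """First-order uptake coefficient k (1/h) for a Lagrangian parcel,
    assuming C_down = C_up * exp(-k * t) between the two sampling points."""
    return math.log(c_up / c_down) / travel_time_h

# Hypothetical nitrate-N concentrations (mg/L) following one water parcel
# over a 12-hour reach, before and after the flood.
k_pre  = first_order_uptake_rate(0.80, 0.60, 12.0)  # pre-flood period
k_post = first_order_uptake_rate(0.80, 0.75, 12.0)  # post-flood period
print(k_pre, k_post)
```

A smaller post-flood k would indicate reduced in-stream nitrogen uptake after the storm, which is the kind of comparison the campaign was designed to make.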
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rood, Arthur S.; Sondrup, A. Jeffrey; Ritter, Paul D.
2016-04-01
A methodology to quantify the performance of an air monitoring network in terms of frequency of detection has been developed. The methodology uses an atmospheric transport model to predict air concentrations of radionuclides at the samplers for a given release time and duration. Frequency of detection is defined as the fraction of “events” that result in a detection at either a single sampler or a network of samplers. An “event” is defined as a release of finite duration that begins on a given day and hour of the year from a facility with the potential to emit airborne radionuclides. Another metric of interest is the network intensity, defined as the fraction of samplers in the network that have a positive detection for a given event. The frequency-of-detection methodology allows for evaluation of short-term releases, including the effects of short-term variability in meteorological conditions. The methodology was tested using the U.S. Department of Energy Idaho National Laboratory (INL) Site ambient air monitoring network, which consists of 37 low-volume air samplers at 31 locations covering a 17,630 km² region. Releases from six major INL facilities distributed over an area of 1,435 km² were modeled, including three stack sources and eight ground-level sources. A Lagrangian puff air dispersion model (CALPUFF) was used to model atmospheric transport. The model was validated using historical 125Sb releases and measurements. Relevant one-week release quantities from each emission source were calculated based on a dose of 1.9 × 10⁻⁴ mSv at a public receptor (0.01 mSv assuming the release persists over a year). Important radionuclides considered include 241Am, 137Cs, 238Pu, 239Pu, 90Sr, and tritium. Results show the detection frequency is over 97.5% for the entire network considering all sources and radionuclides. Network intensities ranged from 3.75% to 62.7%.
Evaluation of individual samplers indicated that some samplers were poorly situated and add little to the overall effectiveness of the network. Using the frequency-of-detection methods, optimum sampler placements were simulated that could substantially improve the performance and efficiency of the network.
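The two network metrics defined above reduce to simple fractions over a boolean event-by-sampler matrix. The matrix below is a toy illustration, not INL model output:

```python
import numpy as np

# Rows = release "events" (each a distinct start day/hour), columns =
# samplers; an entry is True when the modeled air concentration at that
# sampler exceeds its detection limit for that event.
detected = np.array([[True,  False, False],
                     [False, False, False],
                     [True,  True,  False],
                     [True,  True,  True]])

# Frequency of detection: fraction of events caught by at least one
# sampler in the network.
freq_network = detected.any(axis=1).mean()

# Network intensity: for each event, the fraction of samplers detecting it.
intensity = detected.mean(axis=1)

print(freq_network)   # 0.75: three of the four toy events are detected
print(intensity)
```

Per-sampler detection frequencies (`detected.mean(axis=0)`) would support the sampler-placement evaluation mentioned in the abstract.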
Renewable energies in electricity generation for reduction of greenhouse gases in Mexico 2025.
Islas, Jorge; Manzini, Fabio; Martínez, Manuel
2002-02-01
This study presents 4 scenarios relating to the environmental futures of electricity generation in Mexico up to the year 2025. The first scenario emphasizes the use of oil products, particularly fuel oil, and represents the historic path of Mexico's energy policy. The second scenario prioritizes the use of natural gas, reflecting the energy consumption pattern that arose in the mid-1990s as a result of reforms in the energy sector. In the third scenario, the high participation of renewable sources of energy is considered feasible from a technical and economic point of view. The fourth scenario takes into account the present- and medium-term use of natural-gas technologies that the energy reform has produced, but after 2007 a high and feasible participation of renewable sources of energy is considered. The 4 scenarios are evaluated up to the year 2025 in terms of greenhouse gases (GHG) and acid rain precursor gases (ARPG).
Computational study of radiation doses at UNLV accelerator facility
NASA Astrophysics Data System (ADS)
Hodges, Matthew; Barzilov, Alexander; Chen, Yi-Tung; Lowe, Daniel
2017-09-01
A Varian K15 electron linear accelerator (linac) has been considered for installation at University of Nevada, Las Vegas (UNLV). Before experiments can be performed, it is necessary to evaluate the photon and neutron spectra as generated by the linac, as well as the resulting dose rates within the accelerator facility. A computational study using MCNPX was performed to characterize the source terms for the bremsstrahlung converter. The 15 MeV electron beam available in the linac is above the photoneutron threshold energy for several materials in the linac assembly, and as a result, neutrons must be accounted for. The angular and energy distributions for bremsstrahlung flux generated by the interaction of the 15 MeV electron beam with the linac target were determined. This source term was used in conjunction with the K15 collimators to determine the dose rates within the facility.
Li, Lei; Wang, Tie-yu; Wang, Xiaojun; Xiao, Rong-bo; Li, Qi-feng; Peng, Chi; Han, Cun-liang
2016-04-15
Based on comprehensive consideration of soil environmental quality, river pollution status, environmental vulnerability, and the stress of pollution sources, a technical method was established for classifying priority areas of soil environmental protection around river-style water sources. The Shunde channel, an important drinking water source for Foshan City, Guangdong Province, was studied as a case, for which a classification evaluation system was set up. Several evaluation factors were selected according to the local natural, social, and economic conditions, including the pollution degree of heavy metals in soil and sediment, soil characteristics, groundwater sensitivity, vegetation coverage, and the type and location of pollution sources. Data were obtained mainly by field survey, sampling analysis, and remote sensing interpretation. The Analytic Hierarchy Process (AHP) was then adopted to decide the weight of each factor. Basic spatial data layers were set up and overlaid using a weighted-summation assessment model in a Geographical Information System (GIS), resulting in a classification map of soil environmental protection levels in the priority area of the Shunde channel. Accordingly, the area was classified into three levels, named the polluted zone, risky zone, and safe zone, which accounted for 6.37%, 60.90%, and 32.73% of the whole study area, respectively. The polluted and risky zones were mainly distributed in Lecong, Longjiang, and Leliu towns, with pollution mainly resulting from the long-term development of aquaculture and from industries including furniture, plastic construction materials, and textiles and clothing. In accordance with the main pollution sources of soil, targeted and differentiated strategies were put forward. The newly established evaluation method can serve as a reference for the protection and sustainable utilization of the soil environment around water sources.
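The weighted-summation overlay and three-level classification can be sketched as below; the factor rasters, AHP weights, and thresholds are illustrative placeholders, not the Shunde channel values:

```python
import numpy as np

# Two toy 2x3 factor rasters, each normalized to [0, 1] (higher = more
# environmental stress).  A real application would use one layer per
# evaluation factor and AHP-derived weights summing to 1.
soil_metals = np.array([[0.9, 0.4, 0.1],
                        [0.7, 0.3, 0.2]])
gw_sensitivity = np.array([[0.8, 0.5, 0.2],
                           [0.6, 0.2, 0.1]])
weights = (0.6, 0.4)   # hypothetical AHP weights

# Weighted-summation assessment model: cell-by-cell overlay.
score = weights[0] * soil_metals + weights[1] * gw_sensitivity

# Threshold the composite score into the three protection levels.
levels = np.select([score >= 0.6, score >= 0.3],
                   ["polluted", "risky"], default="safe")
print(levels)
```

In a GIS workflow the same overlay would run on raster layers; NumPy arrays stand in for them here.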
Framework for Evaluating Water Quality of the New England Crystalline Rock Aquifers
Harte, Philip T.; Robinson, Gilpin R.; Ayotte, Joseph D.; Flanagan, Sarah M.
2008-01-01
Little information exists on regional ground-water-quality patterns for the New England crystalline rock aquifers (NECRA). A systematic approach to facilitate regional evaluation is needed for several reasons. First, the NECRA are vulnerable to anthropogenic and natural contaminants such as methyl tert-butyl ether (MTBE), arsenic, and radon gas. Second, the physical characteristics of the aquifers, termed 'intrinsic susceptibility', can lead to variable and degraded water quality. A framework approach for characterizing the aquifer region into areas of similar hydrogeology is described in this report and is based on hypothesized relevant physical features and chemical conditions (collectively termed 'variables') that affect regional patterns of ground-water quality. A framework for comparison of water quality across the NECRA consists of a group of spatial variables related to aquifer properties, hydrologic conditions, and contaminant sources. These spatial variables are grouped under four general categories (features) that can be mapped across the aquifers: (1) geologic, (2) hydrophysiographic, (3) land-use land-cover, and (4) geochemical. On a regional scale, these variables represent indicators of natural and anthropogenic sources of contaminants, as well as generalized physical and chemical characteristics of the aquifer system that influence ground-water chemistry and flow. These variables can be used in varying combinations (depending on the contaminant) to categorize the aquifer into areas of similar hydrogeologic characteristics to evaluate variation in regional water quality through statistical testing.
An investigation on nuclear energy policy in Turkey and public perception
NASA Astrophysics Data System (ADS)
Coskun, Mehmet Burhanettin; Tanriover, Banu
2016-11-01
Turkey, which meets nearly 70 per cent of its energy demand through imports, is facing problems of energy security and a current account deficit as a result of its dependence on foreign sources of energy inputs. Turkey is also experiencing environmental problems due to increases in CO2 emissions. Considering these problems in the Turkish economy, where energy inputs are widely used, it is necessary to use energy sources efficiently and to provide alternative energy sources. Because renewable sources depend on meteorological conditions (the absence of sufficient sun, wind, and water), energy generation from these sources cannot be provided efficiently and continuously. At this point, nuclear energy as an alternative maintains its importance as a sustainable source that provides energy 24 hours a day, 7 days a week. The main purpose of this study is to evaluate nuclear energy within the context of the negative public perceptions that emerged after the Chernobyl (1986) and Fukushima (2011) disasters and to investigate it within an economic framework.
Radiological analysis of plutonium glass batches with natural/enriched boron
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rainisch, R.
2000-06-22
The disposition of surplus plutonium inventories by the US Department of Energy (DOE) includes the immobilization of certain plutonium materials in a borosilicate glass matrix, also referred to as vitrification. This paper addresses source terms of plutonium masses immobilized in a borosilicate glass matrix where the glass components include either natural boron or enriched boron. The calculated source terms pertain to neutron and gamma source strength (particles per second) and source spectrum changes. The source terms corresponding to natural boron and enriched boron are compared to determine the benefit (decrease in radiation source terms) of using enriched boron. The analysis of plutonium glass source terms shows that a large component of the neutron source term is due to (α, n) reactions. The americium-241 and plutonium present in the glass emit alpha particles (α). These alpha particles interact with low-Z nuclides such as B-11, B-10, and O-17 in the glass to produce neutrons; the low-Z nuclides are referred to as target particles. The reference glass contains 9.4 wt percent B2O3. Boron-11 was found to strongly support the (α, n) reactions in the glass matrix; B-11 has a natural abundance of over 80 percent. The (α, n) reaction rates for B-10 are lower than for B-11, and the analysis shows that the plutonium glass neutron source terms can be reduced by artificially enriching natural boron with B-10. The natural abundance of B-10 is 19.9 percent, and boron enriched to 96 wt percent B-10 or above can be obtained commercially. Since lower source terms imply lower dose rates to radiation workers handling the plutonium glass materials, it is important to know the achievable decrease in source terms as a result of boron enrichment. Plutonium materials are normally handled in glove boxes with shielded glass windows, and the work entails both extremity and whole-body exposures.
Lowering the source terms of the plutonium batches will make the handling of these materials less difficult and will reduce radiation exposure to operating workers.
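The abundance-weighted argument above can be illustrated with a toy calculation. The per-isotope (α, n) yields below are hypothetical placeholders (the paper reports only that B-11 dominates); the isotopic abundances are physical constants:

```python
def boron_alpha_n(f_b10, y_b10=1.0, y_b11=4.0):
    """Relative neutron production from boron for a B-10 atom fraction
    f_b10, modeled as an abundance-weighted sum of per-isotope yields.
    y_b10 and y_b11 are hypothetical relative (alpha, n) yields chosen
    only to satisfy y_b11 > y_b10."""
    return f_b10 * y_b10 + (1.0 - f_b10) * y_b11

natural = boron_alpha_n(0.199)   # natural boron: 19.9% B-10, 80.1% B-11
enriched = boron_alpha_n(0.96)   # commercially enriched: 96% B-10
print(enriched / natural)        # < 1: enrichment lowers the source term
```

The ratio printed is only as good as the placeholder yields; the paper's MCNP-style analysis computes the actual reduction from evaluated cross sections.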
Neubauer, Georg; Feychting, Maria; Hamnerius, Yngve; Kheifets, Leeka; Kuster, Niels; Ruiz, Ignacio; Schüz, Joachim; Uberbacher, Richard; Wiart, Joe; Röösli, Martin
2007-04-01
The increasing deployment of mobile communication base stations has led to an increasing demand for epidemiological studies on possible health effects of radio frequency emissions. The methodological challenges of such studies have been critically evaluated by a panel of scientists in the fields of radiofrequency engineering/dosimetry and epidemiology. Strengths and weaknesses of previous studies have been identified. Dosimetric concepts and crucial aspects of exposure assessment were evaluated in terms of epidemiological studies on different types of outcomes. We conclude that base station epidemiological studies are feasible in principle. However, the exposure contributions from all relevant radio frequency sources have to be taken into account. The applied exposure assessment method should be piloted and validated. Short- to medium-term effects on physiology or health-related quality of life are best investigated by cohort studies. For long-term effects, groups with a potential for high exposure first need to be identified; for immediate effects, human laboratory studies are the preferred approach. (c) 2006 Wiley-Liss, Inc.
NASA Astrophysics Data System (ADS)
Civerolo, Kevin; Hogrefe, Christian; Zalewsky, Eric; Hao, Winston; Sistla, Gopal; Lynn, Barry; Rosenzweig, Cynthia; Kinney, Patrick L.
2010-10-01
This paper compares spatial and seasonal variations and temporal trends in modeled and measured concentrations of sulfur and nitrogen compounds in wet and dry deposition over an 18-year period (1988-2005) over a portion of the northeastern United States. Substantial emissions reduction programs occurred over this time period, including Title IV of the Clean Air Act Amendments of 1990, which primarily resulted in large decreases in sulfur dioxide (SO₂) emissions by 1995, and nitrogen oxide (NOₓ) trading programs, which resulted in large decreases in warm-season NOₓ emissions by 2004. Additionally, NOₓ emissions from mobile sources declined more gradually over this period. The results presented here illustrate the use of both operational and dynamic model evaluation and suggest that the modeling system largely captures the seasonal and long-term changes in sulfur compounds. The modeling system generally captures the long-term trends in nitrogen compounds, but does not reproduce the average seasonal variation or spatial patterns in nitrate.
Cockpit display of hazardous weather information
NASA Technical Reports Server (NTRS)
Hansman, R. John, Jr.; Wanke, Craig
1990-01-01
Information transfer and display issues associated with the dissemination of hazardous weather warnings are studied in the context of windshear alerts. Operational and developmental windshear detection systems are briefly reviewed. The July 11, 1988 microburst events observed as part of the Denver Terminal Doppler Weather Radar (TDWR) operational evaluation are analyzed in terms of information transfer and the effectiveness of the microburst alerts. Information transfer, message content and display issues associated with microburst alerts generated from ground based sources are evaluated by means of pilot opinion surveys and part task simulator studies.
Cockpit display of hazardous weather information
NASA Technical Reports Server (NTRS)
Hansman, R. John, Jr.; Wanke, Craig
1989-01-01
Information transfer and display issues associated with the dissemination of hazardous-weather warnings are studied in the context of wind-shear alerts. Operational and developmental wind-shear detection systems are briefly reviewed. The July 11, 1988 microburst events observed as part of the Denver TDWR operational evaluation are analyzed in terms of information transfer and the effectiveness of the microburst alerts. Information transfer, message content, and display issues associated with microburst alerts generated from ground-based sources (Doppler radars, LLWAS, and PIREPS) are evaluated by means of pilot opinion surveys and part-task simulator studies.
Source term evaluation for accident transients in the experimental fusion facility ITER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Virot, F.; Barrachin, M.; Cousin, F.
2015-03-15
We have studied, with the ASTEC code, the transport and chemical speciation of radiotoxic and toxic species for a water-ingress event in the vacuum vessel of the experimental fusion facility ITER. In particular, our evaluation takes into account assessed thermodynamic data for the gaseous beryllium species. This study shows that deposited beryllium dusts of atomic Be and Be(OH)2 are formed. It also shows that Be(OT)2 could exist in some conditions in the drain tank. (authors)
Natural and Induced Environment in Low Earth Orbit
NASA Technical Reports Server (NTRS)
Wilson, John W.; Badavi, Francis F.; Kim, Myung-Hee Y.; Clowdsley, Martha S.; Heinbockel, John H.; Cucinotta, Francis A.; Badhwar, Gautam D.; Atwell, William; Huston, Stuart L.
2002-01-01
The long-term exposure of astronauts on the developing International Space Station (ISS) requires an accurate knowledge of the internal exposure environment for human risk assessment and other onboard processes. The natural environment is moderated by the solar wind which varies over the solar cycle. The neutron environment within the Shuttle in low Earth orbit has two sources. A time dependent model for the ambient environment is used to evaluate the natural and induced environment. The induced neutron environment is evaluated using measurements on STS-31 and STS-36 near the 1990 solar maximum.
Assessment Methods of Groundwater Overdraft Area and Its Application
NASA Astrophysics Data System (ADS)
Dong, Yanan; Xing, Liting; Zhang, Xinhui; Cao, Qianqian; Lan, Xiaoxun
2018-05-01
Groundwater is an important source of water, and long-term heavy demand has led to its over-exploitation, which causes many environmental and geological problems. This paper explores the concept of the over-exploitation area, summarizes its natural and social attributes, and expounds its evaluation methods, including single-factor evaluation, multi-factor system analysis, and numerical methods. The different methods are compared and analyzed, and the practicality of the appraisal methods is then illustrated with Northern Weifang as an example.
Analysis and Modeling of Parallel Photovoltaic Systems under Partial Shading Conditions
NASA Astrophysics Data System (ADS)
Buddala, Santhoshi Snigdha
Since the industrial revolution, fossil fuels such as petroleum, coal, and natural gas, along with other non-renewable energy sources, have served as the primary energy source. The consumption of fossil fuels releases various harmful gases into the atmosphere as byproducts, which deplete the protective atmospheric layers and disturb the overall environmental balance. Fossil fuels are also finite resources, and their rapid depletion has prompted the investigation of alternative sources of energy, called renewable energy. One promising renewable source is solar/photovoltaic energy. This work focuses on investigating a new solar array architecture with solar cells connected in a parallel configuration. While retaining the structural simplicity of the parallel architecture, a theoretical small-signal model of the solar cell is proposed and used to analyze variations in module parameters under partial shading conditions. Simulations were run in SPICE to validate the model implemented in Matlab. The voltage limitations of the proposed architecture are addressed by adopting a simple dc-dc boost converter and evaluating the performance of the architecture in terms of efficiency against traditional architectures. SPICE simulations are used to compare the architectures and identify the best one in terms of power conversion efficiency under partial shading conditions.
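The behavior of parallel-connected cells under partial shading can be sketched with the standard single-diode cell model. The sketch below (Python rather than the SPICE/Matlab setup used in the work, and with hypothetical parameter values) builds each cell's I-V curve parametrically over the diode voltage and sums branch currents at a common terminal voltage:

```python
import numpy as np

K_B, Q = 1.380649e-23, 1.602176634e-19  # Boltzmann constant, electron charge

def iv_curve(i_ph, i_0=1e-9, n=1.3, temp=298.15, r_s=0.01, r_sh=100.0, npts=500):
    """Single-diode cell I-V curve, swept parametrically over the diode voltage."""
    v_t = K_B * temp / Q                                   # thermal voltage kT/q
    v_d = np.linspace(0.0, n * v_t * np.log(i_ph / i_0 + 1.0), npts)
    i = i_ph - i_0 * np.expm1(v_d / (n * v_t)) - v_d / r_sh
    v = v_d - i * r_s                                      # terminal voltage after series drop
    return v, i

def parallel_current(v_grid, photocurrents):
    """Cells in parallel share the terminal voltage; total current is the branch sum."""
    total = np.zeros_like(v_grid)
    for i_ph in photocurrents:
        v, i = iv_curve(i_ph)
        total += np.interp(v_grid, v, i)
    return total
```

For example, a fully lit cell (i_ph = 3 A) and a half-shaded one (i_ph = 1.5 A) in parallel deliver roughly 4.5 A near short circuit, because the shaded cell still contributes its reduced photocurrent instead of throttling the whole string as it would in series.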
Madjidi, Faramarz; Behroozy, Ali
2014-01-01
Exposure to visible light and near-infrared (NIR) radiation in the wavelength region of 380 to 1400 nm may cause thermal retinal injury. In this analysis, the effective spectral radiance of a hot source is replaced by its temperature in the exposure limit values for the 380-1400 nm region. This article describes the development and implementation of a computer code to predict the temperatures corresponding to the exposure limits proposed by the American Conference of Governmental Industrial Hygienists (ACGIH). Viewing duration and the apparent diameter of the source were inputs to the computer code. In the first stage, an infinite series was derived for calculating spectral radiance by integrating Planck's law. In the second stage, to calculate effective spectral radiance, the initial terms of this infinite series were multiplied by a weighting factor R(λ) and integrated over the wavelength region 380-1400 nm. In the third stage, using the computer code, the source temperature that emits the same effective spectral radiance was found. As a result, based only on measuring the source temperature and accounting for the exposure time and the apparent diameter of the source, it is possible to decide whether exposure to visible light and NIR in any 8-hr workday is permissible. The substitution of source temperature for effective spectral radiance provides a convenient way to evaluate exposure to visible light and NIR.
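The first two stages — evaluating Planck's law and weighting it by R(λ) over 380-1400 nm — can be sketched numerically. The flat R(λ) below is a placeholder assumption; the actual ACGIH retinal thermal hazard weighting table must be substituted for any real evaluation:

```python
import numpy as np

H, C, K_B = 6.62607015e-34, 2.99792458e8, 1.380649e-23  # SI constants

def planck_radiance(lam, temp):
    """Blackbody spectral radiance B(lambda, T), W / (m^2 sr m)."""
    return 2.0 * H * C**2 / lam**5 / np.expm1(H * C / (lam * K_B * temp))

def effective_radiance(temp, weight=None, npts=2000):
    """Trapezoid-rule integral of B(lambda, T) * R(lambda) over 380-1400 nm.
    weight defaults to a flat placeholder; pass the real ACGIH R(lambda)."""
    lam = np.linspace(380e-9, 1400e-9, npts)
    r = np.ones_like(lam) if weight is None else weight(lam)
    f = planck_radiance(lam, temp) * r
    return np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(lam))
```

Inverting this relation — scanning the temperature until the effective radiance matches the exposure limit for a given viewing duration and apparent diameter — reproduces the third stage described above.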
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-07-01
To effectively evaluate the cumulative impact of releases from multiple sources of contamination, a structured approach has been adopted for the Oak Ridge Reservation (ORR) based on studies of the groundwater and surface water separate from studies of the sources. Based on the complexity of the hydrogeologic regime of the ORR, together with the fact that numerous sources contribute to groundwater contamination within a geographical area, it was agreed that more timely investigations, at perhaps less cost, could be achieved by separating the sources of contamination from the groundwater and surface water for investigation and remediation. The result will be more immediate attention [Records of Decision (RODs) for interim measures or removal actions] for the source Operable Units (OUs), while longer-term remediation investigations continue for the hydrogeologic regimes, which are labeled as integrator OUs. This remedial investigation work plan contains summaries of geographical, historical, operational, geological, and hydrological information specific to the unit. Taking advantage of the historical database and ongoing monitoring activities, and applying the observational approach to focus data-gathering activities, will allow the feasibility study to evaluate all probable or likely alternatives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-09-01
To effectively evaluate the cumulative impact of releases from multiple sources of contamination, a structured approach has been adopted for the Oak Ridge Reservation (ORR) based on studies of the groundwater and surface water separate from studies of the sources. Based on the complexity of the hydrogeologic regime of the ORR, together with the fact that numerous sources contribute to groundwater contamination within a geographical area, it was agreed that more timely investigations, at perhaps less cost, could be achieved by separating the sources of contamination from the groundwater and surface water for investigation and remediation. The result will be more immediate attention [Records of Decision (RODs) for interim measures or removal actions] for the source Operable Units (OUs), while longer-term remediation investigations continue for the hydrogeologic regimes, which are labeled as integrator OUs. This Remedial Investigation work plan contains summaries of geographical, historical, operational, geological, and hydrological information specific to the unit. Taking advantage of the historical database and ongoing monitoring activities, and applying the observational approach to focus data-gathering activities, will allow the Feasibility Study to evaluate all probable or likely alternatives.
NASA Technical Reports Server (NTRS)
Fink, P. W.; Khayat, M. A.; Wilton, D. R.
2005-01-01
It is known that higher-order modeling of the sources and the geometry in Boundary Element Modeling (BEM) formulations is essential to highly efficient computational electromagnetics. However, in order to achieve the benefits of higher-order basis and geometry modeling, the singular and near-singular terms arising in BEM formulations must be integrated accurately. In particular, the accurate integration of near-singular terms, which occur when observation points are near but not on source regions of the scattering object, has been considered one of the remaining limitations on the computational efficiency of integral equation methods. The method of singularity subtraction has been used extensively for the evaluation of singular and near-singular terms. Piecewise integration of the source terms in this manner, while manageable for bases of constant and linear orders, becomes unwieldy and prone to error for bases of higher order. Furthermore, we find that the singularity subtraction method is not conducive to object-oriented programming practices, particularly in the context of multiple operators. To extend the capabilities, accuracy, and maintainability of general-purpose codes, the subtraction method is being replaced by purely numerical quadrature schemes. These schemes employ singularity cancellation methods in which a change of variables is chosen such that the Jacobian of the transformation cancels the singularity. An example of the singularity cancellation approach is the Duffy method, which has two major drawbacks: 1) in the resulting integrand, it produces an angular variation about the singular point that becomes nearly singular for observation points close to an edge of the parent element, and 2) it appears not to work well when applied to nearly singular integrals.
Recently, the authors have introduced the transformation u(x′) = sinh⁻¹(x′/√(y′² + z²)) for integrating functions of the form I = ∫ Λ(r′) e^(−jkR)/(4πR) dD, where Λ(r′) is a vector or scalar basis function and R = √(x′² + y′² + z²) is the distance between source and observation points. This scheme has all of the advantages of the Duffy method while avoiding the disadvantages listed above. In this presentation we will survey similar approaches for handling singular and near-singular terms for kernels with 1/R² type behavior, addressing potential pitfalls and offering techniques to efficiently handle special cases.
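The effect of this change of variables can be illustrated on a 1-D analogue: for the near-singular integral ∫ e^(−jkR)/(4πR) dx′ with R = √(x′² + z²), substituting x′ = z·sinh(u) gives R = z·cosh(u) and dx′/R = du, so the Jacobian cancels the 1/R factor exactly. The sketch below (parameter values are illustrative, not from the presentation) compares plain Gauss-Legendre quadrature against the same rule applied after the transformation:

```python
import numpy as np

def direct_quad(a, z, k, n):
    """n-point Gauss-Legendre on [-a, a] in the original variable x'."""
    x, w = np.polynomial.legendre.leggauss(n)
    xp = a * x                                   # map [-1, 1] -> [-a, a]
    r = np.sqrt(xp**2 + z**2)
    return np.sum(w * a * np.exp(-1j * k * r) / (4.0 * np.pi * r))

def sinh_quad(a, z, k, n):
    """Same integral after u = asinh(x'/z): the 1/R singularity cancels."""
    u_max = np.arcsinh(a / z)
    u, w = np.polynomial.legendre.leggauss(n)
    r = z * np.cosh(u_max * u)                   # x' = z sinh(u)  =>  R = z cosh(u)
    return np.sum(w * u_max * np.exp(-1j * k * r) / (4.0 * np.pi))
```

For an observation point very close to the source line (z = a/1000), a 20-point rule in u is already essentially converged, while the untransformed 20-point rule misses badly because the integrand peaks sharply near x′ = 0.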
NASA Astrophysics Data System (ADS)
Perez, Pedro B.; Hamawi, John N.
2017-09-01
Nuclear power plant radiation protection design features are based on radionuclide source terms derived from conservative assumptions that envelope expected operating experience. Two parameters that significantly affect the radionuclide concentrations in the source term are the failed fuel fraction and the effective fission product appearance rate coefficients. The failed fuel fraction may be a regulatory-based assumption, as in the U.S. Appearance rate coefficients are not specified in regulatory requirements, but have been referenced to experimental data that are over 50 years old. No doubt the source terms are conservative, as demonstrated by operating experience that has included failed fuel, but they may be too conservative, leading, for example, to over-designed shielding for normal operations. Design basis source term methodologies for normal operations had not advanced until 2015, when EPRI published an updated ANSI/ANS 18.1 source term basis document. Our paper revisits the fission product appearance rate coefficients as applied in the derivation of source terms following the original U.S. NRC NUREG-0017 methodology. New coefficients have been calculated based on recent EPRI results, which demonstrate the conservatism in nuclear power plant shielding design.
Vezzaro, L; Sharma, A K; Ledin, A; Mikkelsen, P S
2015-03-15
The estimation of micropollutant (MP) fluxes in stormwater systems is a fundamental prerequisite when preparing strategies to reduce stormwater MP discharges to natural waters. Dynamic integrated models can be important tools in this step, as they can be used to integrate the limited data provided by monitoring campaigns and to evaluate the performance of different strategies based on model simulation results. This study presents an example in which six different control strategies, including both source control and end-of-pipe treatment, were compared. The comparison focused on fluxes of heavy metals (copper, zinc) and organic compounds (fluoranthene). MP fluxes were estimated by using an integrated dynamic model in combination with stormwater quality measurements. MP sources were identified by using GIS land-usage data, runoff quality was simulated by using a conceptual accumulation/washoff model, and a stormwater retention pond was simulated by using a dynamic treatment model based on MP inherent properties. Uncertainty in the results was estimated with a pseudo-Bayesian method. Despite the great uncertainty in the MP fluxes estimated by the runoff quality model, it was possible to compare the six scenarios in terms of discharged MP fluxes, compliance with water quality criteria, and sediment accumulation. Source-control strategies obtained better results in terms of reduction of MP emissions, but all the simulated strategies failed to fulfill the criteria based on emission limit values. The results presented in this study show how the efficiency of MP pollution control strategies can be quantified by combining advanced modeling tools (integrated stormwater quality model, uncertainty calibration).
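A conceptual accumulation/washoff component of the kind mentioned above can be sketched as a one-state mass balance: linear pollutant build-up on the catchment surface during dry weather and first-order washoff proportional to rainfall intensity during events. The parameter values below are hypothetical, not calibrated to the study's sites:

```python
import numpy as np

def buildup_washoff(rain, dt=1.0, k_acc=0.5, k_wash=0.3, m0=0.0):
    """Surface pollutant mass M and washoff flux per time step.
    dM/dt = k_acc - k_wash * i(t) * M;  washoff flux = k_wash * i * M."""
    m, flux = m0, []
    for intensity in rain:               # rainfall intensity series
        wash = k_wash * intensity * m    # first-order washoff during rain
        m = max(m + dt * (k_acc - wash), 0.0)
        flux.append(wash)
    return np.array(flux), m
```

Calibrating k_acc and k_wash against monitored event loads, and propagating their uncertainty, is what allows such a model to rank control strategies despite wide uncertainty in the absolute fluxes.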
Ancient Glass: A Literature Search and its Role in Waste Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strachan, Denis M.; Pierce, Eric M.
2010-07-01
When developing a performance assessment model for the long-term disposal of immobilized low-activity waste (ILAW) glass, it is desirable to determine the durability of glass forms over very long periods of time. However, testing is limited to short time spans, so experiments are performed under conditions that accelerate the key geochemical processes that control weathering. Verification that the models currently being used can reliably calculate the long-term behavior of ILAW glass is a key component of the overall PA strategy. Therefore, Pacific Northwest National Laboratory was contracted by Washington River Protection Solutions, LLC to evaluate alternative strategies that can be used for PA source term model validation. One viable alternative strategy is the use of independent experimental data from archaeological studies of ancient or natural glass contained in the literature. These materials represent potential independent experiments dating back approximately 3600 years (1600 before the current era, BCE) in the case of ancient glass, and 10^6 years or more in the case of natural glass. The results of this literature review suggest that additional experimental data may be needed before the results of archaeological studies can be used as a tool for model validation of glass weathering and, more specifically, disposal facility performance. This is largely because none of the existing data sets contains all of the information required to conduct PA source term calculations. For example, in many cases the sediments surrounding the glass were not collected and analyzed; the data required to compare computer simulations of concentration flux are therefore unavailable. This type of information is important to understanding the element release profile from the glass to the surrounding environment and provides a metric that can be used to calibrate source term models.
Although useful, the available literature sources do not contain the information required to simulate the long-term performance of nuclear waste glasses in near-surface or deep geologic repositories. The required information includes 1) experimental measurements to quantify the model parameters, 2) detailed analyses of altered glass samples, and 3) detailed analyses of the sediment surrounding the ancient glass samples.
POI Summarization by Aesthetics Evaluation From Crowd Source Social Media.
Qian, Xueming; Li, Cheng; Lan, Ke; Hou, Xingsong; Li, Zhetao; Han, Junwei
2018-03-01
Place-of-Interest (POI) summarization by aesthetics evaluation can recommend a set of POI images to the user and is significant in image retrieval. In this paper, we propose a system that summarizes a collection of POI images with regard to both aesthetics and the diversity of the distribution of cameras. First, we generate visual albums by a coarse-to-fine POI clustering approach and then generate 3D models for each album from the images collected from social media. Second, based on the 3D-to-2D projection relationship, we select candidate photos in terms of the proposed crowd-sourced saliency model. Third, in order to improve the performance of the aesthetic measurement model, we propose a crowd-sourced saliency detection approach that explores the distribution of salient regions in the 3D model. We then measure the compositional aesthetics of each image and explore crowd-sourced salient features to yield a saliency map, based on which we propose an adaptive image adoption approach. Finally, we combine diversity and aesthetics to recommend aesthetic pictures. Experimental results show that the proposed POI summarization approach can return images with diverse camera distributions and aesthetics.
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
Economic evaluation of vaccines in Canada: A systematic review.
Chit, Ayman; Lee, Jason K H; Shim, Minsup; Nguyen, Van Hai; Grootendorst, Paul; Wu, Jianhong; Van Exan, Robert; Langley, Joanne M
2016-05-03
Economic evaluations should form part of the basis for public health decision making on new vaccine programs. While Canada's national immunization advisory committee does not systematically include economic evaluations in immunization decision making, there is increasing interest in adopting them. We therefore sought to examine the extent and quality of economic evaluations of vaccines in Canada. We conducted a systematic review of economic evaluations of vaccines in Canada to determine and summarize: comprehensiveness across jurisdictions, studied vaccines, funding sources, study designs, research quality, and changes over time. Searches in multiple databases were conducted using the terms "vaccine," "economics" and "Canada." Descriptive data from eligible manuscripts were abstracted, and three authors independently evaluated manuscript quality using a 7-point Likert-type scale scoring tool based on criteria from the International Society for Pharmacoeconomics and Outcomes Research (ISPOR). 42/175 articles met the search criteria. Of these, Canada-wide studies were most common (25/42), while provincial studies largely focused on the three populous provinces of Ontario, Quebec and British Columbia. The most common funding source was industry (17/42), followed by government (7/42). 38 studies used mathematical models estimating expected economic benefit, while 4 studies examined post-hoc data on established programs. Studies covered 10 diseases, with 28/42 addressing pediatric vaccines. Many studies considered cost-utility (22/42), and the majority of these reported favorable economic results (16/22). The mean quality score was 5.9/7 and was consistent across publication dates, funding sources, and disease areas. We observed diverse approaches to evaluating vaccine economics in Canada.
Given the increased complexity of economic studies evaluating vaccines and the impact of results on public health practice, Canada needs improved, transparent and consistent processes to review and assess the findings of the economic evaluations of vaccines.
Evaluating the Safety Profile of Non-Active Implantable Medical Devices Compared with Medicines.
Pane, Josep; Coloma, Preciosa M; Verhamme, Katia M C; Sturkenboom, Miriam C J M; Rebollo, Irene
2017-01-01
Recent safety issues involving non-active implantable medical devices (NAIMDs) have highlighted the need for better pre-market and post-market evaluation. Some stakeholders have argued that certain features of medicine safety evaluation should also be applied to medical devices. Our objectives were to compare the current processes and methodologies for the assessment of NAIMD safety profiles with those for medicines, identify potential gaps, and make recommendations for the adoption of new methodologies for the ongoing benefit-risk monitoring of these devices throughout their entire life cycle. A literature review served to examine the current tools for the safety evaluation of NAIMDs and those for medicines. We searched MEDLINE using these two categories. We supplemented this search with Google searches using the same key terms used in the MEDLINE search. Using a comparative approach, we summarized the new product design, development cycle (preclinical and clinical phases), and post-market phases for NAIMDs and drugs. We also evaluated and compared the respective processes to integrate and assess safety data during the life cycle of the products, including signal detection, signal management, and subsequent potential regulatory actions. The search identified a gap in NAIMD safety signal generation: no global program exists that collects and analyzes adverse events and product quality issues. Data sources in real-world settings, such as electronic health records, need to be effectively identified and explored as additional sources of safety information, particularly in some areas such as the EU and USA where there are plans to implement the unique device identifier (UDI). The UDI and other initiatives will enable more robust follow-up and assessment of long-term patient outcomes. The safety evaluation system for NAIMDs differs in many ways from those for drugs, but both systems face analogous challenges with respect to monitoring real-world usage. 
Certain features of the drug safety evaluation process could, if adopted and adapted for NAIMDs, lead to better and more systematic evaluations of the latter.
ERIC Educational Resources Information Center
Florida State Advisory Council on Vocational and Technical Education, Tallahassee.
A study of 13 vocational and technical education programs in Florida was conducted which represented an attempt to identify valid and reliable sources of data whereby some vocational programs might be evaluated in specified terms of effectiveness. The programs selected for study were among those which require graduating students to pass licensing…
Kaelin M. Cawley; John Campbell; Melissa Zwilling; Rudolf. Jaffé
2014-01-01
Dissolved organic matter (DOM) source and composition are critical drivers of its reactivity, impact microbial food webs and influence ecosystem functions. It is believed that DOM composition and abundance represent an integrated signal derived from the surrounding watershed. Recent studies have shown that land-use may have a long-term effect on DOM composition....
Long-Term Patency of Twisted Vascular Pedicles in Perforator-Based Propeller Flaps.
Jakubietz, Rafael G; Nickel, Aljoscha; Neshkova, Iva; Schmidt, Karsten; Gilbert, Fabian; Meffert, Rainer H; Jakubietz, Michael G
2017-10-01
Propeller flaps require torsion of the vascular pedicle of up to 180 degrees. In contrast to free flaps, where the relevance of an intact vascular pedicle has been documented, little is known regarding the twisted pedicles of propeller flaps. As secondary surgeries requiring undermining of the flap are common in the extremities, knowledge regarding the necessity of protecting the pedicle is relevant. The aim of this study was a long-term evaluation of the patency of the vascular pedicle of propeller flaps. In a retrospective clinical study, 22 patients who underwent soft-tissue reconstruction with a propeller flap were evaluated after 43 months. A Doppler probe was used to locate and evaluate the patency of the vascular pedicle of the flap. The flaps were used in the lower extremity in 19 cases and on the trunk in 3 cases. All flaps had healed. In all patients, an intact vascular pedicle could be found. Flap size, source vessel, and infection could therefore not be linked to an increased risk of pedicle loss. The vascular pedicle of propeller flaps remains patent in the long term. This allows re-elevation and undermining of the flap. We therefore recommend protecting the pedicle in all secondary cases to prevent later flap loss.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perrin, Tess E.; Davis, Robert G.; Wilkerson, Andrea M.
This GATEWAY project evaluated four field installations to better understand the long-term performance of a number of LED products, with the aim of stimulating improvements in designing, manufacturing, specifying, procuring, and installing LED products. Field studies provide the opportunity to discover and investigate issues that cannot be simulated or uncovered in a laboratory, but the installed performance over time of commercially available LED products has not been well documented. Improving long-term performance can provide both direct energy savings, by reducing the need to over-light to account for light loss, and indirect energy savings through better market penetration due to SSL's competitive advantages over less-efficient light source technologies. The projects evaluated for this report illustrate that SSL use is often motivated by advantages other than energy savings, including maintenance savings, easier integration with control systems, and improved lighting quality.
An Empirical Temperature Variance Source Model in Heated Jets
NASA Technical Reports Server (NTRS)
Khavaran, Abbas; Bridges, James
2012-01-01
An acoustic analogy approach is implemented that models the sources of jet noise in heated jets. The equivalent sources of turbulent mixing noise are recognized as the differences between the fluctuating and Favre-averaged Reynolds stresses and enthalpy fluxes. While in a conventional acoustic analogy only Reynolds stress components are scrutinized for their noise generation properties, it is now accepted that a comprehensive source model should include the additional entropy source term. Following Goldstein's generalized acoustic analogy, the set of Euler equations is divided into two sets of equations that govern a non-radiating base flow plus its residual components. When the base flow is considered as a locally parallel mean flow, the residual equations may be rearranged to form an inhomogeneous third-order wave equation. A general solution is written subsequently using a Green's function method, while all non-linear terms are treated as the equivalent sources of aerodynamic sound and are modeled accordingly. In a previous study, a specialized Reynolds-averaged Navier-Stokes (RANS) solver was implemented to compute the variance of thermal fluctuations that determines the enthalpy flux source strength. The main objective here is to present an empirical model capable of providing a reasonable estimate of the stagnation temperature variance in a jet. Such a model is parameterized as a function of the mean stagnation temperature gradient in the jet and is evaluated using commonly available RANS solvers. The ensuing thermal source distribution is compared with measurements as well as computational results from a dedicated RANS solver that employs an enthalpy variance and dissipation rate model. Turbulent mixing noise predictions are presented for a wide range of jet temperature ratios from 1.0 to 3.20.
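A minimal form of such an empirical parameterization — assuming, as in common gradient-closure modeling, that the temperature variance scales with the square of the local mean stagnation temperature gradient times a turbulence length scale — might look like the following. The coefficient and the scaling itself are illustrative assumptions, not the paper's calibrated model:

```python
import numpy as np

def stagnation_temp_variance(dT0_dr, length_scale, coeff=0.3):
    """Gradient-closure sketch: <T0'^2> ~ (C * l_t * dT0/dr)^2.
    dT0_dr: mean stagnation temperature gradient (K/m), from a RANS solution;
    length_scale: turbulence length scale (m); coeff: hypothetical constant."""
    return (coeff * length_scale * np.asarray(dT0_dr)) ** 2
```

The appeal of this form is that every input is available from a standard RANS solution, so the thermal source strength can be estimated without a dedicated enthalpy-variance transport model.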
Independent evaluation of point source fossil fuel CO2 emissions to better than 10%
Turnbull, Jocelyn Christine; Keller, Elizabeth D.; Norris, Margaret W.; Wiltshire, Rachael M.
2016-01-01
Independent estimates of fossil fuel CO2 (CO2ff) emissions are key to ensuring that emission reductions and regulations are effective and provide needed transparency and trust. Point source emissions are a key target because a small number of power plants represent a large portion of total global emissions. Currently, emission rates are known only from self-reported data. Atmospheric observations have the potential to meet the need for independent evaluation, but useful results from this method have been elusive, due to challenges in distinguishing CO2ff emissions from the large and varying CO2 background and in relating atmospheric observations to emission flux rates with high accuracy. Here we use time-integrated observations of the radiocarbon content of CO2 (14CO2) to quantify the recently added CO2ff mole fraction at surface sites surrounding a point source. We demonstrate that both fast-growing plant material (grass) and CO2 collected by absorption into sodium hydroxide solution provide excellent time-integrated records of atmospheric 14CO2. These time-integrated samples allow us to evaluate emissions over a period of days to weeks with only a modest number of measurements. Applying the same time integration in an atmospheric transport model eliminates the need to resolve highly variable short-term turbulence. Together these techniques allow us to independently evaluate point source CO2ff emission rates from atmospheric observations with uncertainties of better than 10%. This uncertainty represents an improvement by a factor of 2 over current bottom-up inventory estimates and previous atmospheric observation estimates and allows reliable independent evaluation of emissions. PMID:27573818
Bayesian estimation of a source term of radiation release with approximately known nuclide ratios
NASA Astrophysics Data System (ADS)
Tichý, Ondřej; Šmídl, Václav; Hofman, Radek
2016-04-01
We are concerned with estimation of a source term in the case of an accidental release from a known location, e.g. a power plant. Usually, the source term of an accidental release of radiation comprises a mixture of nuclides. The gamma dose rate measurements do not provide direct information on the source term composition. However, physical properties of the respective nuclides (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g. from the known reactor inventory. The proposed method is based on a linear inverse model where the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned and regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to uncertainty in the ratios, the diagonal elements of the covariance matrix are considered to be unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following the Bayesian approach, we estimate all parameters of the model from the data, so that y, M, and the known ratios are the only inputs of the method. Since the inference of the model is intractable, we follow the Variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on a simulated 6-hour power plant release where 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with a method assuming unknown nuclide ratios is given to demonstrate the usefulness of the proposed approach.
This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
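A minimal numerical sketch of the regularized inverse problem y = Mx described above, assuming a diagonal prior covariance built from approximate nuclide ratios. The full method's Variational Bayes inference and multivariate truncated Gaussian posterior are replaced here by a MAP estimate with a crude non-negativity clip; all numbers are invented for illustration.

```python
import numpy as np

# MAP estimate for y = M x with Gaussian noise and a zero-mean Gaussian
# prior whose standard deviations are proportional to the approximately
# known nuclide ratios (larger expected release -> weaker shrinkage).
def map_source_term(M, y, ratios, sigma_obs=1.0, prior_scale=1.0):
    prior_var = (prior_scale * np.asarray(ratios)) ** 2
    A = M.T @ M / sigma_obs**2 + np.diag(1.0 / prior_var)
    x = np.linalg.solve(A, M.T @ y / sigma_obs**2)
    return np.clip(x, 0.0, None)   # crude stand-in for the truncation
```

With a nearly flat prior (large `prior_scale`) and a well-conditioned M, the estimate reduces to ordinary least squares; the prior matters precisely when M is ill-conditioned, as in real SRS matrices.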
Part 2 of a Computational Study of a Drop-Laden Mixing Layer
NASA Technical Reports Server (NTRS)
Okongo, Nora; Bellan, Josette
2004-01-01
This second of three reports on a computational study of a mixing layer laden with evaporating liquid drops presents the evaluation of Large Eddy Simulation (LES) models. The LES models were evaluated on an existing database that had been generated using Direct Numerical Simulation (DNS). The DNS method and the database are described in the first report of this series, Part 1 of a Computational Study of a Drop-Laden Mixing Layer (NPO-30719), NASA Tech Briefs, Vol. 28, No. 7 (July 2004), page 59. The LES equations, which are derived by applying a spatial filter to the DNS set, govern the evolution of the larger scales of the flow and can therefore be solved on a coarser grid. Consistent with the reduction in grid points, the DNS drops would be represented by fewer drops, called computational drops in the LES context. The LES equations contain terms that cannot be directly computed on the coarser grid and that must instead be modeled. Two types of models are necessary: (1) those for the filtered source terms representing the effects of drops on the filtered flow field and (2) those for the sub-grid scale (SGS) fluxes arising from filtering the convective terms in the DNS equations. All of the filtered-source-term models that were developed were found to overestimate the filtered source terms. For modeling the SGS fluxes, constant-coefficient Smagorinsky, gradient, and scale-similarity models were assessed and calibrated on the DNS database. The Smagorinsky model correlated poorly with the SGS fluxes, whereas the gradient and scale-similarity models were well correlated with the SGS quantities that they represented.
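The constant-coefficient Smagorinsky model assessed above can be sketched in one dimension: the SGS eddy viscosity is nu_t = (Cs * Delta)^2 * |S|, with S the resolved strain rate and Delta the filter width. The value Cs = 0.17 is a commonly quoted constant, not one taken from the report.

```python
import numpy as np

# Constant-coefficient Smagorinsky eddy viscosity for a 1-D shear
# profile u(y): nu_t = (Cs * Delta)^2 * |du/dy|. In a full LES this is
# evaluated from the magnitude of the resolved strain-rate tensor.
def smagorinsky_nu_t(u, y, delta, Cs=0.17):
    S = np.gradient(u, y)                # resolved strain rate du/dy
    return (Cs * delta) ** 2 * np.abs(S)
```

The report's finding that this model correlates poorly with the exact SGS fluxes is a known limitation: nu_t depends only on local resolved gradients, unlike the gradient and scale-similarity models, which reuse structural information from the resolved field.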
NASA Astrophysics Data System (ADS)
Poupardin, A.; Heinrich, P.; Hébert, H.; Schindelé, F.; Jamelot, A.; Reymond, D.; Sugioka, H.
2018-05-01
This paper evaluates the importance of frequency dispersion in the propagation of recent trans-Pacific tsunamis. Frequency dispersion induces a time delay for the most energetic waves, which increases for long propagation distances and short source dimensions. To calculate this time delay, propagation of tsunamis is simulated and analyzed from spectrograms of time series at specific gauges in the Pacific Ocean. One- and two-dimensional simulations are performed by solving either shallow water or Boussinesq equations and by considering realistic seismic sources. One-dimensional sensitivity tests are first performed in a constant-depth channel to study the influence of the source width. Two-dimensional tests are then performed in a simulated Pacific Ocean with a 4000-m constant depth and by considering tectonic sources of the 2010 and 2015 Chilean earthquakes. For these sources, both the azimuth and the distance play a major role in the frequency dispersion of tsunamis. Finally, simulations are performed considering the real bathymetry of the Pacific Ocean. Multiple reflections, refractions, as well as shoaling of waves result in much more complex time series for which the effects of the frequency dispersion are hardly discernible. The main point of this study is to evaluate frequency dispersion in terms of traveltime delays by calculating spectrograms for a time window of 6 hours after the arrival of the first wave. Results of the spectral analysis show that the wave packets recorded by pressure and tide sensors in the Pacific Ocean seem to be better reproduced by the Boussinesq model than the shallow water model and approximately follow the theoretical dispersion relationship linking wave arrival times and frequencies. Additionally, a traveltime delay is determined above which effects of frequency dispersion are considered to be significant in terms of maximum surface elevations.
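The theoretical dispersion relationship mentioned above is the linear relation w^2 = g*k*tanh(k*h). A sketch of the frequency-dependent traveltime delay it implies, relative to the non-dispersive shallow-water speed sqrt(g*h), for the paper's 4000-m constant-depth setting:

```python
import math

# Solve the linear dispersion relation w^2 = g*k*tanh(k*h) for the
# wavenumber k by bisection (f(k) = g*k*tanh(k*h) is monotonic in k).
def wavenumber(period, h, g=9.81):
    w = 2 * math.pi / period
    lo, hi = 1e-10, 1.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g * mid * math.tanh(mid * h) < w**2:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Delay of a wave component of given period over a propagation distance,
# comparing its group velocity with the shallow-water speed sqrt(g*h).
def traveltime_delay(period, h, distance, g=9.81):
    w = 2 * math.pi / period
    k = wavenumber(period, h, g)
    cg = (w / (2 * k)) * (1 + 2 * k * h / math.sinh(2 * k * h))
    c0 = math.sqrt(g * h)      # non-dispersive shallow-water speed
    return distance / cg - distance / c0
```

Short-period components travel appreciably slower than the shallow-water front, which is why the delay grows for short source dimensions and long propagation distances.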
Shigaki, Francirose; Sharpley, Andrew; Prochnow, Luis Ignacio
2007-02-01
Phosphorus runoff from agricultural fields amended with mineral fertilizers and manures has been linked to freshwater eutrophication. A rainfall simulation study was conducted to evaluate the effects of different rainfall intensities and P sources differing in water soluble P (WSP) concentration on P transport in runoff from soil trays packed with a Berks loam and grassed with annual ryegrass (Lolium multiflorum Lam.). Triple superphosphate (TSP; 79% WSP), low-grade super single phosphate (LGSSP; 50% WSP), North Carolina rock phosphate (NCRP; 0.5% WSP) and swine manure (SM; 70% WSP), were broadcast (100 kg total P ha(-1)) and rainfall applied at 25, 50 and 75 mm h(-1) at 1, 7, 21, and 56 days after P source application. The concentration of dissolved reactive (DRP), particulate (PP), and total P (TP) was significantly (P<0.01) greater in runoff with a rainfall intensity of 75 than 25 mm h(-1) for all P sources. Further, runoff DRP increased as P source WSP increased, with runoff from a 50 mm h(-1) rain 1 day after source application having a DRP concentration of 0.25 mg L(-1) for NCRP and 28.21 mg L(-1) for TSP. In contrast, the proportion of runoff TP as PP was greater with low (39% PP for NCRP) than high WSP sources (4% PP for TSP) averaged for all rainfall intensities. The increased PP transport is attributed to the detachment and transport of undissolved P source particles during runoff. These results show that P source water solubility and rainfall intensity can influence P transport in runoff, which is important in evaluating the long-term risks of P source application on P transport in surface runoff.
Evaluating bacterial gene-finding HMM structures as probabilistic logic programs.
Mørk, Søren; Holmes, Ian
2012-03-01
Probabilistic logic programming offers a powerful way to describe and evaluate structured statistical models. To investigate the practicality of probabilistic logic programming for structure learning in bioinformatics, we undertook a simplified bacterial gene-finding benchmark in PRISM, a probabilistic dialect of Prolog. We evaluate Hidden Markov Model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders and two novel model structures. We test standard versions as well as ADPH length modeling and three-state versions of the five model structures. The models are all represented as probabilistic logic programs and evaluated using the PRISM machine learning system in terms of statistical information criteria and gene-finding prediction accuracy, in two bacterial genomes. Neither of our implementations of the two currently most used model structures is best performing in terms of statistical information criteria or prediction performance, suggesting that better-fitting models might be achievable. The source code of all PRISM models, data and additional scripts are freely available for download at: http://github.com/somork/codonhmm. Supplementary data are available at Bioinformatics online.
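The HMM structures benchmarked above are decoded with standard dynamic programming. A toy two-state (coding / non-coding) HMM with Viterbi decoding illustrates the principle; the probabilities are invented, and the paper's models are expressed in PRISM, not Python.

```python
import math

# Viterbi decoding of the most probable state path through an HMM,
# worked in log space to avoid underflow on long sequences.
def viterbi(obs, states, start_p, trans_p, emit_p):
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]])
          for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({}); back.append({})
        for s in states:
            prev = max(states, key=lambda p: V[t-1][p] + math.log(trans_p[p][s]))
            V[t][s] = (V[t-1][prev] + math.log(trans_p[prev][s])
                       + math.log(emit_p[s][obs[t]]))
            back[t][s] = prev
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]
```

With a GC-rich coding state and an AT-rich non-coding state, sticky self-transitions make the decoder label contiguous runs rather than flipping on every base.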
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruskauff, Greg; Marutzky, Sam
Model evaluation focused solely on the PIN STRIPE and MILK SHAKE underground nuclear tests’ contaminant boundaries (CBs) because they had the largest extent, uncertainty, and potential consequences. The CAMBRIC radionuclide migration experiment also had a relatively large CB, but because it was constrained by transport data (notably Well UE-5n), there was little uncertainty, and radioactive decay reduced concentrations before much migration could occur. Each evaluation target and the associated data-collection activity were assessed in turn to determine whether the new data support, or demonstrate conservatism of, the CB forecasts. The modeling team—in this case, the same team that developed the Frenchman Flat geologic, source term, and groundwater flow and transport models—analyzed the new data and presented the results to a PER committee. Existing site understanding and its representation in numerical groundwater flow and transport models were evaluated in light of the new data and the ability to proceed to the CR stage of long-term monitoring and institutional control.
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2014 CFR
2014-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2014-01-01 2014-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2012 CFR
2012-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2012-01-01 2012-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2010 CFR
2010-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2010-01-01 2010-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2013 CFR
2013-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2013-01-01 2013-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2011 CFR
2011-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2011-01-01 2011-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
Toward multimodal signal detection of adverse drug reactions.
Harpaz, Rave; DuMouchel, William; Schuemie, Martijn; Bodenreider, Olivier; Friedman, Carol; Horvitz, Eric; Ripple, Anna; Sorbello, Alfred; White, Ryen W; Winnenburg, Rainer; Shah, Nigam H
2017-12-01
Improving mechanisms to detect adverse drug reactions (ADRs) is key to strengthening post-marketing drug safety surveillance. Signal detection is presently unimodal, relying on a single information source. Multimodal signal detection is based on jointly analyzing multiple information sources. Building on, and expanding, the work done in prior studies, the aim of the article is to further research on multimodal signal detection, explore its potential benefits, and propose methods for its construction and evaluation. Four data sources are investigated: FDA's adverse event reporting system, insurance claims, the MEDLINE citation database, and the logs of major Web search engines. Published methods are used to generate and combine signals from each data source. Two distinct reference benchmarks, corresponding to well-established and recently labeled ADRs respectively, are used to evaluate the performance of multimodal signal detection in terms of area under the ROC curve (AUC) and lead-time-to-detection, with the latter relative to labeling revision dates. Limited to our reference benchmarks, multimodal signal detection provides AUC improvements ranging from 0.04 to 0.09 based on a widely used evaluation benchmark, and a comparative added lead-time of 7-22 months relative to labeling revision dates from a time-indexed benchmark. The results support the notion that utilizing and jointly analyzing multiple data sources may lead to improved signal detection. Given certain data and benchmark limitations, the early stage of development, and the complexity of ADRs, it is currently not possible to make definitive statements about the ultimate utility of the concept. Continued development of multimodal signal detection requires a deeper understanding of the data sources used, additional benchmarks, and further research on methods to generate and synthesize signals.
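The combination step can be illustrated very simply. The sketch below merely averages the per-source scores available for each drug-event pair; the paper combines published per-source methods in more principled ways, so this is an assumed stand-in, not the authors' scheme, and the scores are invented.

```python
# Combine per-source ADR signal scores into one multimodal score by
# averaging whichever sources report a score for each drug-event pair.
def combine_scores(score_lists):
    """score_lists: list of dicts mapping (drug, event) pairs to scores."""
    pairs = set().union(*score_lists)
    combined = {}
    for p in pairs:
        vals = [s[p] for s in score_lists if p in s]
        combined[p] = sum(vals) / len(vals)
    return combined
```

Ranking candidate pairs by the combined score, then comparing against a reference benchmark of known ADRs, yields the AUC figures the abstract reports.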
Integrating diverse forage sources reduces feed gaps on mixed crop-livestock farms.
Bell, L W; Moore, A D; Thomas, D T
2017-12-04
Highly variable climates induce large variability in the supply of forage for livestock, and so farmers must manage their livestock systems to reduce the risk of feed gaps (i.e. periods when livestock feed demand exceeds forage supply). However, mixed crop-livestock farmers can utilise a range of feed sources on their farms to help mitigate these risks. This paper reports on the development and application of a simple whole-farm feed-energy balance calculator, which is used to evaluate the frequency and magnitude of feed gaps. The calculator matches long-term simulations of variation in forage and metabolisable energy supply from diverse sources against energy demand for different livestock enterprises. Scenarios of increasing the diversity of forage sources in livestock systems are investigated for six locations selected to span Australia's crop-livestock zone. We found that systems relying on only one feed source were prone to higher risk of feed gaps, and hence would often have to reduce stocking rates to mitigate these risks or use supplementary feed. At all sites, adding more feed sources to the farm feedbase improved the continuity of supply of both fresh and carry-over forage, reducing the frequency and magnitude of feed deficits. However, there were diminishing returns from making the feedbase more complex, with combinations of two to three feed sources typically achieving the maximum benefits in terms of reducing the risk of feed gaps. Higher stocking rates could be maintained while limiting risk when combinations of other feed sources were introduced into the feedbase. For the same level of risk, a feedbase relying on a diversity of forage sources could support stocking rates 1.4 to 3 times higher than if a single pasture source were used.
This suggests that there is significant capacity to mitigate the risk of feed gaps while increasing 'safe' stocking rates through better integration of feed sources on mixed crop-livestock farms across diverse regions and climates.
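The core of a feed-energy balance calculator of the kind described above can be sketched in a few lines: sum the monthly metabolisable energy (ME) supply over all feed sources and flag the months where herd demand exceeds it. All numbers in the example are illustrative, not from the paper.

```python
# Whole-farm feed-energy balance: months where total ME supply across
# all feed sources falls short of livestock demand are "feed gaps".
def feed_gaps(supply_by_source, demand):
    """supply_by_source: {source: [12 monthly ME values]}; demand: [12 values]."""
    total = [sum(vals) for vals in zip(*supply_by_source.values())]
    return [month for month, (s, d) in enumerate(zip(total, demand))
            if s < d]
```

Running this over long-term simulated forage series, rather than a single year, gives the frequency and magnitude statistics the calculator reports; the test below shows how a complementary second source closes the gaps a single pasture leaves.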
NASA Astrophysics Data System (ADS)
Khan, T.; Perlinger, J. A.; Urban, N. R.
2017-12-01
Certain toxic, persistent, bioaccumulative, and semivolatile compounds known as atmosphere-surface exchangeable pollutants or ASEPs are emitted into the environment by primary sources, are transported, deposited to water surfaces, and can later be re-emitted, causing the water to act as a secondary source. Polychlorinated biphenyl (PCB) compounds, a class of ASEPs, are of major concern in the Laurentian Great Lakes because of their historical use primarily as additives to oils and industrial fluids, and discharge from industrial sources. Following the ban on production in the U.S. in 1979, atmospheric concentrations of PCBs in the Lake Superior region decreased rapidly. Subsequently, PCB concentrations in the lake surface water also reached near equilibrium as the atmospheric levels of PCBs declined. However, previous studies on long-term PCB levels and trends in lake trout and walleye suggested that the initial rate of decline of PCB concentrations in fish has leveled off in Lake Superior. In this study, a dynamic multimedia flux model was developed to investigate the observed levelling off of PCB concentrations in Lake Superior fish. The model structure consists of two water layers (the epilimnion and the hypolimnion) and the surface mixed sediment layer, while atmospheric deposition is the primary external pathway of PCB inputs to the lake. The model was applied for different PCB congeners having a range of hydrophobicity and volatility. Using this model, we compare the long-term trends in predicted PCB concentrations in different environmental media with relevant available measurements for Lake Superior. We examine the seasonal depositional and exchange patterns, the relative importance of different process terms, and provide the most probable source of the current observed PCB levels in Lake Superior fish.
In addition, we evaluate the role of current atmospheric PCB levels in sustaining the observed fish concentrations and appraise the need for continuous atmospheric PCB monitoring by the Great Lakes Integrated Atmospheric Deposition Network. By combining the modeled lake and biota response times resulting from atmospheric PCB inputs, we predict the time scale for safe fish consumption in Lake Superior.
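A highly simplified two-compartment (water, sediment) mass balance illustrates how a sediment reservoir can sustain water-column concentrations after atmospheric inputs stop, the mechanism behind the levelling off discussed above. The rate constants and deposition forcing are invented; the actual model has two water layers and congener-specific parameters.

```python
# Forward-Euler integration of a linear two-box model:
#   dCw/dt = dep - (k_loss + k_ws) * Cw + k_sw * Cs   (water column)
#   dCs/dt = k_ws * Cw - k_sw * Cs                    (mixed sediment)
# k_loss lumps volatilization and outflow; k_ws/k_sw are settling and
# resuspension/diffusion exchange rates (all invented values).
def simulate_pcb(dep, k_loss=0.5, k_ws=0.2, k_sw=0.05, dt=1.0):
    """dep: yearly atmospheric deposition inputs. Returns water, sediment series."""
    Cw, Cs = 0.0, 0.0
    water, sed = [], []
    for d in dep:
        dCw = d - (k_loss + k_ws) * Cw + k_sw * Cs
        dCs = k_ws * Cw - k_sw * Cs
        Cw += dt * dCw
        Cs += dt * dCs
        water.append(Cw)
        sed.append(Cs)
    return water, sed
```

Because the sediment pool turns over much more slowly than the water column, cutting deposition to zero produces a fast initial decline in water concentration followed by a long tail fed by sediment release, qualitatively matching the slowed decline seen in fish.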
Aerosol Source Attributions and Source-Receptor Relationships Across the Northern Hemisphere
NASA Technical Reports Server (NTRS)
Bian, Huisheng; Chin, Mian; Kucsera, Tom; Pan, Xiaohua; Darmenov, Anton; Colarco, Peter; Torres, Omar; Shults, Michael
2014-01-01
Emissions and long-range transport of air pollution pose major concerns for air quality and climate change. To better assess the impact of intercontinental transport of air pollution on regional and global air quality, ecosystems, and near-term climate change, the UN Task Force on Hemispheric Transport of Air Pollution (HTAP) is organizing a phase II activity (HTAP2) that includes global and regional model experiments and data analysis, focusing on ozone and aerosols. This study presents the initial results of HTAP2 global aerosol modeling experiments. We will (a) evaluate the model results with surface and aircraft measurements, (b) examine the relative contributions of regional emissions and extra-regional sources to surface PM concentrations and column aerosol optical depth (AOD) over several NH pollution and dust source regions and the Arctic, and (c) quantify the source-receptor relationships in the pollution regions that reflect the sensitivity of regional aerosol amount to the regional and extra-regional emission reductions.
User data dissemination concepts for earth resources
NASA Technical Reports Server (NTRS)
Davies, R.; Scott, M.; Mitchell, C.; Torbett, A.
1976-01-01
Domestic data dissemination networks for earth-resources data in the 1985-1995 time frame were evaluated. The following topics were addressed: (1) earth-resources data sources and expected data volumes, (2) future user demand in terms of data volume and timeliness, (3) space-to-space and earth point-to-point transmission link requirements and implementation, (4) preprocessing requirements and implementation, (5) network costs, and (6) technological development to support this implementation. This study was parametric in that the data input (supply) was varied by a factor of about fifteen while the user request (demand) was varied by a factor of about nineteen. Correspondingly, the time from observation to delivery to the user was varied. This parametric evaluation was performed by a computer simulation that was based on network alternatives and resulted in preliminary transmission and preprocessing requirements. The earth-resource data sources considered were: shuttle sorties, synchronous satellites (e.g., SEOS), aircraft, and satellites in polar orbits.
Evaluation of STD/AIDS prevention programs: a review of approaches and methodologies.
da Cruz, Marly Marques; dos Santos, Elizabeth Moreira; Monteiro, Simone
2007-05-01
The article presents a review of approaches and methodologies in the evaluation of STD/AIDS prevention programs, searching for theoretical and methodological support for the institutionalization of evaluation and decision-making. The review included the MEDLINE, SciELO, and ISI Web of Science databases and other sources like textbooks and congress abstracts from 1990 to 2005, with the key words: "evaluation", "programs", "prevention", "STD/AIDS", and similar terms. The papers showed a predominance of quantitative outcome or impact evaluative studies with an experimental or quasi-experimental design. The main use of evaluation is accountability, although knowledge output and program improvement were also identified in the studies. Only a few evaluative studies contemplate process evaluation and its relationship to the contexts. The review aimed to contribute to the debate on STD/AIDS, which requires more effective, consistent, and sustainable decisions in the field of prevention.
NASA Astrophysics Data System (ADS)
Musolff, Andreas; Selle, Benny; Fleckenstein, Jan H.; Oosterwoud, Marieke R.; Tittel, Jörg
2016-04-01
The instream concentrations of dissolved organic carbon (DOC) are rising in many catchments of the northern hemisphere. Elevated concentrations of DOC, mainly in the form of colored humic components, increase the effort and cost of drinking water purification. In this study, we evaluated a long-term dataset of 110 catchments draining into German drinking water reservoirs in order to assess sources of DOC and drivers of a potential long-term change. The average DOC concentrations across the wide range of different catchments were found to be well explained by the catchment's topographic wetness index. Higher wetness indices were connected to higher average DOC concentrations, which implies that catchments with shallow topography and pronounced riparian wetlands mobilize more DOC. Overall, 37% of the investigated catchments showed a significant long-term increase in DOC concentrations, while 22% exhibited significant negative trends. Moreover, we found that increasing trends in DOC were positively correlated to trends in dissolved iron concentrations at pH≤6 due to remobilization of DOC previously sorbed to iron minerals. Both the increasing trends in DOC and in dissolved iron were found to be connected to decreasing trends and low concentrations of nitrate (below ~6 mg/L). This was especially observed in forested catchments where atmospheric N-depositions were the major source of nitrate availability. In these catchments, we also found long-term increases of phosphate concentrations. Therefore, we argue that dissolved iron, DOC and phosphate were jointly released under iron-reducing conditions when nitrate, as a competing electron acceptor, was too low in concentration to prevent microbial iron reduction. In contrast, we could not explain the observed increasing trends in DOC, iron and phosphate concentrations by the long-term trends of pH, sulfate or precipitation.
Altogether, this study gives strong evidence that both the sources and the long-term increases in DOC are primarily controlled by riparian wetland soils within the catchments. Here, the achievement of a long-term reduction in nitrogen deposition may in turn lead to a more pronounced iron reduction and a subsequent release of DOC and other iron-bound substances such as phosphate.
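Trend significance in long-term water-quality series of this kind is often screened with the non-parametric Mann-Kendall test; the abstract does not state which test was used, so the sketch below is a generic illustration, not the study's method.

```python
# Mann-Kendall S statistic: count concordant minus discordant pairs.
# Strongly positive S suggests a monotonic increase, strongly negative
# a decrease; significance testing would compare S with its variance.
def mann_kendall_s(series):
    s = 0
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    return s
```

Being rank-based, the statistic is robust to the skewed, non-normal distributions typical of DOC, iron, and nitrate concentration records.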
Nordheim, Lena V; Gundersen, Malene W; Espehaug, Birgitte; Guttersrud, Øystein; Flottorp, Signe
2016-01-01
Adolescents are frequent media users who access health claims from various sources. The plethora of conflicting, pseudo-scientific, and often misleading health claims in popular media makes critical appraisal of health claims an essential ability. Schools play an important role in educating youth to critically appraise health claims. The objective of this systematic review was to evaluate the effects of school-based educational interventions for enhancing adolescents' abilities in critically appraising health claims. We searched MEDLINE, Embase, PsycINFO, AMED, Cinahl, Teachers Reference Centre, LISTA, ERIC, Sociological Abstracts, Social Services Abstracts, The Cochrane Library, Science Citation Index Expanded, Social Sciences Citation Index, and sources of grey literature. Studies that evaluated school-based educational interventions to improve adolescents' critical appraisal ability for health claims through advancing the students' knowledge about science were included. Eligible study designs were randomised and non-randomised controlled trials, and interrupted time series. Two authors independently selected studies, extracted data, and assessed risk of bias in included studies. Due to heterogeneity in interventions and inadequate reporting of results, we performed a descriptive synthesis of studies. We used GRADE (Grading of Recommendations, Assessment, Development, and Evaluation) to assess the certainty of the evidence. Eight studies were included: two compared different teaching modalities, while the others compared educational interventions to instruction as usual. Studies mostly reported positive short-term effects on critical appraisal-related knowledge and skills in favour of the educational interventions. However, the certainty of the evidence for all comparisons and outcomes was very low. Educational interventions in schools may have beneficial short-term effects on knowledge and skills relevant to the critical appraisal of health claims. 
The small number of studies, their heterogeneity, and the predominantly high risk of bias inhibit any firm conclusions about their effects. None of the studies evaluated any long-term effects of interventions. Future intervention studies should adhere to high methodological standards, target a wider variety of school-based settings, and include a process evaluation. PROSPERO no. CRD42015017936.
Piecewise synonyms for enhanced UMLS source terminology integration.
Huang, Kuo-Chuan; Geller, James; Halper, Michael; Cimino, James J
2007-10-11
The UMLS contains more than 100 source vocabularies and is growing via the integration of others. When integrating a new source, the source terms already in the UMLS must first be found. The easiest approach to this is simple string matching. However, string matching usually does not find all concepts that should be found. A new methodology, based on the notion of piecewise synonyms, for enhancing the process of concept discovery in the UMLS is presented. This methodology is supported by first creating a general synonym dictionary based on the UMLS. Each multi-word source term is decomposed into its component words, allowing for the generation of separate synonyms for each word from the general synonym dictionary. The recombination of these synonyms into new terms creates an expanded pool of matching candidates for terms from the source. The methodology is demonstrated with respect to an existing UMLS source. It shows a 34% improvement over simple string matching.
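The decompose-expand-recombine step described above can be sketched in a few lines; the toy synonym dictionary and example term below are illustrative assumptions, not UMLS content:

```python
from itertools import product

# Toy general synonym dictionary; in the methodology above this would be
# derived from UMLS synonym sets (all entries here are illustrative).
SYNONYMS = {
    "kidney": {"kidney", "renal"},
    "failure": {"failure", "insufficiency"},
}

def piecewise_candidates(term):
    """Decompose a multi-word term into words, expand each word with its
    synonyms, and recombine into an expanded pool of matching candidates."""
    words = term.lower().split()
    options = [sorted(SYNONYMS.get(w, {w})) for w in words]
    return {" ".join(combo) for combo in product(*options)}
```

For example, `piecewise_candidates("kidney failure")` yields four candidates, including "renal insufficiency", which simple string matching against the original term would miss.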
Inverse modelling of radionuclide release rates using gamma dose rate observations
NASA Astrophysics Data System (ADS)
Hamburger, Thomas; Stohl, Andreas; von Haustein, Christoph; Thummerer, Severin; Wallner, Christian
2014-05-01
Severe accidents in nuclear power plants, such as the historical accident in Chernobyl in 1986 or the more recent disaster at the Fukushima Dai-ichi nuclear power plant in 2011, have drastic impacts on the population and the environment, with hazardous consequences extending to national and continental scales. Environmental measurements and methods to model the transport and dispersion of the released radionuclides serve as a platform to assess the regional impact of nuclear accidents - both for research purposes and, more importantly, to determine the immediate threat to the population. However, assessments of regional radionuclide activity concentrations and of individual radiation doses are subject to several uncertainties, for example in the accurate model representation of wet and dry deposition. One of the most significant uncertainties, however, results from the estimation of the source term, that is, the time-dependent quantification of the spectrum of radionuclides released during the course of the nuclear accident. The source terms of severe nuclear accidents may either remain uncertain (e.g. Chernobyl, Devell et al., 1995) or rely on rather rough estimates of released key radionuclides given by the operators. Precise measurements are mostly missing due to practical limitations during the accident. Inverse modelling can be used to obtain a feasible estimate of the source term (Davoine and Bocquet, 2007). Existing point measurements of radionuclide activity concentrations are combined with atmospheric transport models, and the release rates of radionuclides at the accident site are obtained by improving the agreement between the modelled and observed concentrations (Stohl et al., 2012). The accuracy of the method, and hence of the resulting source term, depends among other factors on the availability, reliability, and spatio-temporal resolution of the observations.
Radionuclide activity concentrations are observed on a relatively sparse grid, and the temporal resolution of available data may be low, on the order of hours or a day. Gamma dose rates, on the other hand, are observed routinely on a much denser grid and at higher temporal resolution. Gamma dose rate measurements contain no explicit information on the observed spectrum of radionuclides and have to be interpreted carefully. Nevertheless, their availability makes them a valuable source of information for the inverse evaluation of the source term (Saunier et al., 2013). We present a new inversion approach combining an atmospheric dispersion model with observations of radionuclide activity concentrations and gamma dose rates to obtain the source term of radionuclides. We use the Lagrangian particle dispersion model FLEXPART (Stohl et al., 1998; Stohl et al., 2005) to model the atmospheric transport of the released radionuclides. The gamma dose rates are calculated from the modelled activity concentrations. The inversion method uses a Bayesian formulation considering uncertainties for the a priori source term and the observations (Eckhardt et al., 2008). The a priori information on the source term is a first guess; the gamma dose rate observations are used with inverse modelling to improve this first guess and to retrieve a reliable source term. The details of this method will be presented at the conference. This work is funded by the Bundesamt für Strahlenschutz BfS, Forschungsvorhaben 3612S60026. References: Davoine, X. and Bocquet, M., Atmos. Chem. Phys., 7, 1549-1564, 2007. Devell, L., et al., OCDE/GD(96)12, 1995. Eckhardt, S., et al., Atmos. Chem. Phys., 8, 3881-3897, 2008. Saunier, O., et al., Atmos. Chem. Phys., 13, 11403-11421, 2013. Stohl, A., et al., Atmos. Environ., 32, 4245-4264, 1998. Stohl, A., et al., Atmos. Chem. Phys., 5, 2461-2474, 2005. Stohl, A., et al., Atmos. Chem. Phys., 12, 2313-2343, 2012.
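The core of such an inversion can be sketched as a Gaussian (least-squares) update of a prior source term. In the sketch below, the matrix sizes, noise levels, and synthetic "true" release rates are illustrative assumptions, not FLEXPART output:

```python
import numpy as np

def invert_source(M, y, x_prior, sigma_obs, sigma_prior):
    """Posterior-mode release rates minimizing the Bayesian cost
    J(x) = |y - Mx|^2 / sigma_obs^2 + |x - x_prior|^2 / sigma_prior^2."""
    n = len(x_prior)
    A = M.T @ M / sigma_obs**2 + np.eye(n) / sigma_prior**2
    b = M.T @ y / sigma_obs**2 + x_prior / sigma_prior**2
    return np.linalg.solve(A, b)

# Synthetic example: 50 observations, 8 release intervals (all values assumed).
rng = np.random.default_rng(0)
M = rng.random((50, 8))                      # source-receptor sensitivities
x_true = np.array([0.0, 1.0, 4.0, 2.0, 0.5, 0.0, 0.0, 0.0])
y = M @ x_true + 0.01 * rng.standard_normal(50)
x_hat = invert_source(M, y, x_prior=np.ones(8), sigma_obs=0.01, sigma_prior=10.0)
```

With a weak prior and dense observations, the first guess is pulled onto the true release history; in practice the sensitivity matrix comes from dispersion-model runs and the uncertainties are specified per observation.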
Mosayebi, Z; Movahedian, A H; Soori, T
2011-07-01
Outbreaks of sepsis due to water or contaminated equipment can cause significant mortality and morbidity in neonatal intensive care units. We studied an outbreak among neonates caused by Flavobacterium and investigated the characteristics of the infected neonates, antimicrobial susceptibilities, and the source of the outbreak. Forty-five neonates with documented Flavobacterium sepsis were evaluated in this descriptive study. Data including sex, vaginal delivery or caesarean, preterm or term, birth weight, results of blood cultures and antibiograms were recorded and cases followed up until death or recovery. Environmental sampling for detecting the source of contamination was performed. Among the 45 patients, 28 (62.2%) were male and 17 (37.8%) female (P<0.001). The commonest clinical manifestation was respiratory distress (60%). Eighteen neonates (40%) were low birth weight. Thirty-seven neonates (82.2%) were born via caesarean section. Twenty (44.4%) of them were preterm whereas 25 (55.6%) were term (P<0.001). Mortality was 17.7%. All strains were resistant to ampicillin, and susceptible to amikacin. The source of the outbreak was contaminated distilled water. Copyright © 2010 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
Probing Atom-Surface Interactions by Diffraction of Bose-Einstein Condensates
NASA Astrophysics Data System (ADS)
Bender, Helmar; Stehle, Christian; Zimmermann, Claus; Slama, Sebastian; Fiedler, Johannes; Scheel, Stefan; Buhmann, Stefan Yoshi; Marachevsky, Valery N.
2014-01-01
In this article, we analyze the Casimir-Polder interaction of atoms with a solid grating and the repulsive interaction between the atoms and the grating in the presence of an external laser source. The Casimir-Polder potential is evaluated exactly in terms of Rayleigh reflection coefficients and via an approximate Hamaker approach. The laser-tuned repulsive interaction is given in terms of Rayleigh transmission coefficients. The combined potential landscape above the solid grating is probed locally by diffraction of Bose-Einstein condensates. Measured diffraction efficiencies reveal information about the shape of the potential landscape in agreement with the theory based on Rayleigh decompositions.
Jet Fuel from Shale Oil - 1981 Technology Review,
1981-12-01
the programs just described by Mr Jackson in the previous paper. F. N. Hodgson of the Monsanto Research Center provided mass spectrometric... research and development efforts at alleviating the magnitude of the problem and its impact on national security by evaluating the potential of...with Exxon Research and Engineering, domestic oil shale was determined to be the most viable near-term alternative source of syncrude available for
Deterrence Impact Modeling Environment (DIME) Proof-of-Concept Test Evaluations and Findings
2016-06-01
sources of this caution: financial, technical, legal, and ethical. Several current Coast Guard policies complicate ongoing engagement with and assessment...and ethical. There is evidence that several of the stakeholder communities most important to the Coast Guard have not been early adopters of the...self-organization) or longer-term outcomes (such as over-harvesting, regeneration of biodiversity, resilience of an ecological system to human nature
Remote detection of chem/bio hazards via coherent anti-Stokes Raman spectroscopy
2017-09-12
hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and... time remote detection of hazardous microparticles in atmosphere and to evaluate the range of distances for typical species and the parameters of laser...detectable photons from a prototype molecule at a distance. 15. SUBJECT TERMS Stimulated Raman scattering, Remote detection, biochemical agents, explosives
Application of the DG-1199 methodology to the ESBWR and ABWR.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalinich, Donald A.; Gauntt, Randall O.; Walton, Fotini
2010-09-01
Appendix A-5 of Draft Regulatory Guide DG-1199 'Alternative Radiological Source Term for Evaluating Design Basis Accidents at Nuclear Power Reactors' provides guidance - applicable to RADTRAD MSIV leakage models - for scaling containment aerosol concentration to the expected steam dome concentration in order to preserve the simplified use of the Accident Source Term (AST) in assessing containment performance under assumed design basis accident (DBA) conditions. In this study Economic and Safe Boiling Water Reactor (ESBWR) and Advanced Boiling Water Reactor (ABWR) RADTRAD models are developed using the DG-1199, Appendix A-5 guidance. The models were run using RADTRAD v3.03. Low Population Zone (LPZ), control room (CR), and worst-case 2-hr Exclusion Area Boundary (EAB) doses were calculated and compared to the relevant accident dose criteria in 10 CFR 50.67. For the ESBWR, the dose results were all lower than the MSIV leakage doses calculated by General Electric/Hitachi (GEH) in their licensing technical report. There are no comparable ABWR MSIV leakage doses; however, it should be noted that the ABWR doses are lower than the ESBWR doses. In addition, sensitivity cases were evaluated to ascertain the influence/importance of key input parameters/features of the models.
Cesca, S.; Battaglia, J.; Dahm, T.; Tessmer, E.; Heimann, S.; Okubo, P.
2008-01-01
The main goal of this study is to improve the modelling of the source mechanism associated with the generation of long period (LP) signals in volcanic areas. Our intent is to evaluate the effects that detailed structural features of the volcanic models play in the generation of the LP signal and the consequent retrieval of LP source characteristics. In particular, effects associated with the presence of topography and crustal heterogeneities are here studied in detail. We focus our study on an LP event observed at Kilauea volcano, Hawaii, in 2001 May. A detailed analysis of this event and its source modelling is accompanied by a set of synthetic tests, which aim to evaluate the effects of topography and the presence of low velocity shallow layers in the source region. The forward problem of Green's function generation is solved numerically following a pseudo-spectral approach, assuming different 3-D models. The inversion is done in the frequency domain and the resulting source mechanism is represented by the sum of two time-dependent terms: a full moment tensor and a single force. Synthetic tests show how characteristic velocity structures, associated with shallow sources, may be partially responsible for the generation of the observed long-lasting ringing waveforms. When applying the inversion technique to the Kilauea LP data set, inversions carried out for different crustal models led to very similar source geometries, indicating a subhorizontal crack. On the other hand, the source time function and its duration are significantly different for different models. These results support the indication of a strong influence of crustal layering on the generation of the LP signal, while the assumption of a homogeneous velocity model may lead to misleading results. © 2008 The Authors. Journal compilation © 2008 RAS.
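The frequency-domain inversion step reduces, per frequency, to a linear least-squares problem in the source spectra. The sketch below uses synthetic spectra (12 traces and 9 source parameters standing in for six moment-tensor components plus three force components), not the Kilauea data:

```python
import numpy as np

def invert_frequency_domain(G, d):
    """Least-squares source spectra m(w) from d(w) = G(w) m(w), solved
    independently at each frequency."""
    n_freq, n_obs, n_src = G.shape
    m = np.empty((n_freq, n_src), dtype=complex)
    for k in range(n_freq):
        m[k], *_ = np.linalg.lstsq(G[k], d[k], rcond=None)
    return m

# Synthetic Green's-function and data spectra (all values assumed).
rng = np.random.default_rng(1)
G = rng.standard_normal((16, 12, 9)) + 1j * rng.standard_normal((16, 12, 9))
m_true = rng.standard_normal((16, 9)) + 1j * rng.standard_normal((16, 9))
d = np.einsum("fij,fj->fi", G, m_true)   # forward model d = G m
m_est = invert_frequency_domain(G, d)
```

An inverse transform of the recovered spectra then gives the time-dependent moment tensor and single force.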
NASA Astrophysics Data System (ADS)
Liu, Xiaoyu; Mason, Mark A.; Guo, Zhishi; Krebs, Kenneth A.; Roache, Nancy F.
2015-12-01
This paper describes the measurement and model evaluation of formaldehyde source emissions from composite and solid wood furniture in a full-scale chamber at different ventilation rates for up to 4000 h using ASTM D 6670-01 (2007). Tests were performed on four types of furniture constructed of different materials and from different manufacturers. The data were used to evaluate two empirical emission models, i.e., a first-order and power-law decay model. The experimental results showed that some furniture tested in this study, made only of solid wood and with less surface area, had low formaldehyde source emissions. The effect of ventilation rate on formaldehyde emissions was also examined. Model simulation results indicated that the power-law decay model showed better agreement than the first-order decay model for the data collected from the tests, especially for long-term emissions. This research was limited to a laboratory study with only four types of furniture products tested. It was not intended to comprehensively test or compare the large number of furniture products available in the market place. Therefore, care should be taken when applying the test results to real-world scenarios. Also, it was beyond the scope of this study to link the emissions to human exposure and potential health risks.
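The two empirical models compared above can be written and fitted directly; the synthetic 4000-h emission series below is an assumption standing in for the chamber data:

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, E0, k):
    return E0 * np.exp(-k * t)       # first-order decay model

def power_law(t, a, b):
    return a * t**(-b)               # power-law decay model

# Synthetic long-term emission series (illustrative, not the measured data).
t = np.linspace(10.0, 4000.0, 200)   # elapsed time, h
obs = 2.0 * t**(-0.4)                # "measured" emission factor

p_fo, _ = curve_fit(first_order, t, obs, p0=[1.0, 1e-3])
p_pl, _ = curve_fit(power_law, t, obs, p0=[1.0, 0.5])

def rmse(model, p):
    return float(np.sqrt(np.mean((model(t, *p) - obs) ** 2)))
```

For slowly decaying long-term emissions of this kind, the power-law fit is markedly better than the first-order fit, mirroring the study's conclusion.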
Low birth weight and air pollution in California: Which sources and components drive the risk?
Laurent, Olivier; Hu, Jianlin; Li, Lianfa; Kleeman, Michael J; Bartell, Scott M; Cockburn, Myles; Escobedo, Loraine; Wu, Jun
2016-01-01
Intrauterine growth restriction has been associated with exposure to air pollution, but there is a need to clarify which sources and components are most likely responsible. This study investigated the associations between low birth weight (LBW, <2500g) in term born infants (≥37 gestational weeks) and air pollution by source and composition in California, over the period 2001-2008. Complementary exposure models were used: an empirical Bayesian kriging model for the interpolation of ambient pollutant measurements, a source-oriented chemical transport model (using California emission inventories) that estimated fine and ultrafine particulate matter (PM2.5 and PM0.1, respectively) mass concentrations (4km×4km) by source and composition, a line-source roadway dispersion model at fine resolution, and traffic index estimates. Birth weight was obtained from California birth certificate records. A case-cohort design was used. Five controls per term LBW case were randomly selected (without covariate matching or stratification) from among term births. The resulting datasets were analyzed by logistic regression with a random effect by hospital, using generalized additive mixed models adjusted for race/ethnicity, education, maternal age and household income. In total 72,632 singleton term LBW cases were included. Term LBW was positively and significantly associated with interpolated measurements of ozone but not total fine PM or nitrogen dioxide. No significant association was observed between term LBW and primary PM from all sources grouped together. A positive significant association was observed for secondary organic aerosols. Exposure to elemental carbon (EC), nitrates and ammonium were also positively and significantly associated with term LBW, but only for exposure during the third trimester of pregnancy. Significant positive associations were observed between term LBW risk and primary PM emitted by on-road gasoline and diesel or by commercial meat cooking sources. 
Primary PM from wood burning was inversely associated with term LBW. Significant positive associations were also observed between term LBW and ultrafine particle numbers modeled with the line-source roadway dispersion model, traffic density and proximity to roadways. This large study based on complementary exposure metrics suggests that not only primary pollution sources (traffic and commercial meat cooking) but also EC and secondary pollutants are risk factors for term LBW. Copyright © 2016 Elsevier Ltd. All rights reserved.
Cockpit display of hazardous weather information
NASA Technical Reports Server (NTRS)
Hansman, R. John, Jr.; Wanke, Craig
1991-01-01
Information transfer and display issues associated with the dissemination of hazardous weather warnings are studied in the context of wind shear alerts. Operational and developmental wind shear detection systems are briefly reviewed. The July 11, 1988 microburst events observed as part of the Denver Terminal Doppler Weather Radar (TDWR) operational evaluation are analyzed in terms of information transfer and the effectiveness of the microburst alerts. Information transfer, message content, and display issues associated with microburst alerts generated from ground-based sources (Doppler Radar, Low Level Wind Shear Alert System, and Pilot Reports) are evaluated by means of pilot opinion surveys and part-task simulator studies.
NASA Astrophysics Data System (ADS)
Ehrnsperger, Laura; Wunder, Tobias; Thomas, Christoph
2017-04-01
Forests are one of the dominant vegetation types on Earth and are an important sink for carbon on our planet. Forests are special ecosystems due to their great canopy height and complex architecture consisting of a subcanopy and a canopy layer, which changes the mechanisms of turbulent exchange within the plant canopy. To date, the sinks and sources of turbulence in forest canopies are not completely understood; in particular, the role of pressure transport remains unclear. The INTRAMIX experiment was conducted in a mountainous Norway spruce (Picea abies) forest at the Fluxnet Waldstein site (DE-Bay) in Bavaria, Germany, for a period of 10 weeks in order to experimentally evaluate the significance of the pressure transport to the TKE budget for the first time. The INTRAMIX data from the dense mountain forest were compared to observations from a sparse Ponderosa pine (Pinus ponderosa) stand in Oregon, USA, to study the influence of forest architecture. We hypothesized that the pressure transport is more important in dense forest canopies, as the crown decouples the subcanopy from the buoyancy- and shear-driven flow above the canopy. It is also investigated how atmospheric stability influences the TKE budget. Based upon model results from the literature, we expect the pressure transport to act as a source for TKE, especially under free convective and unstable dynamic stability. Results to date indicate that pressure transport is most important in the subcanopy, with decreasing magnitude with increasing height. Nevertheless, pressure transport is a continuous source of TKE above the canopy, while in the canopy and subcanopy layer pressure transport acts both as a sink and source term for TKE. In the tree crown layer pressure transport is a source in the morning and afternoon hours and acts as a sink during the evening, while in the subcanopy pressure transport is a source around noon and during the night and acts as a sink in the early morning and afternoon hours.
This complementary pattern suggests that the pressure transport is an important means for exchanging TKE across canopy layers.
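In conventional notation (not quoted from the abstract), the single-point TKE budget being evaluated can be written as below; the pressure-transport term is the vertical divergence of the pressure-velocity covariance:

```latex
\frac{\partial \overline{e}}{\partial t}
= \underbrace{-\,\overline{u'w'}\,\frac{\partial \overline{u}}{\partial z}}_{\text{shear production}}
+ \underbrace{\frac{g}{\overline{\theta_v}}\,\overline{w'\theta_v'}}_{\text{buoyancy}}
- \underbrace{\frac{\partial \overline{w'e'}}{\partial z}}_{\text{turbulent transport}}
- \underbrace{\frac{1}{\overline{\rho}}\,\frac{\partial \overline{w'p'}}{\partial z}}_{\text{pressure transport}}
- \varepsilon
```

The sign of the pressure-transport term at a given height determines whether it acts as a local source or sink, which is the quantity the experiment evaluates layer by layer.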
Solute source depletion control of forward and back diffusion through low-permeability zones
NASA Astrophysics Data System (ADS)
Yang, Minjune; Annable, Michael D.; Jawitz, James W.
2016-10-01
Solute diffusive exchange between low-permeability aquitards and high-permeability aquifers acts as a significant mediator of long-term contaminant fate. Aquifer contaminants diffuse into aquitards, but as contaminant sources are depleted, aquifer concentrations decline, triggering back diffusion from aquitards. The dynamics of the contaminant source depletion, or the source strength function, controls the timing of the transition of aquitards from sinks to sources. Here, we experimentally evaluate three archetypical transient source depletion models (step-change, linear, and exponential), and we use novel analytical solutions to accurately account for dynamic aquitard-aquifer diffusive transfer. Laboratory diffusion experiments were conducted using a well-controlled flow chamber to assess solute exchange between sand aquifer and kaolinite aquitard layers. Solute concentration profiles in the aquitard were measured in situ using electrical conductivity. Back diffusion was shown to begin earlier and produce larger mass flux for rapidly depleting sources. The analytical models showed very good correspondence with measured aquifer breakthrough curves and aquitard concentration profiles. The modeling approach links source dissolution and back diffusion, enabling assessment of human exposure risk and calculation of the back diffusion initiation time, as well as the resulting plume persistence.
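For the step-change archetype, the aquitard response has a closed form by superposition of complementary error functions. The sketch below (with assumed parameter values, not the experimental ones) shows storage during loading and the behaviour after source depletion that drives back diffusion:

```python
import numpy as np
from scipy.special import erfc

def aquitard_conc(z, t, D, C0, t_off):
    """Concentration in a semi-infinite aquitard: linear superposition of a
    loading step (boundary held at C0 from t = 0) and an unloading step
    (boundary dropped to 0 at t = t_off)."""
    c = C0 * erfc(z / (2.0 * np.sqrt(D * t)))
    if t > t_off:
        c -= C0 * erfc(z / (2.0 * np.sqrt(D * (t - t_off))))
    return c

# Assumed parameters: D in m^2/s, relative concentration, depletion time in s.
D, C0, t_off = 1e-9, 1.0, 5.0e7
z = 0.05                                   # 5 cm into the aquitard
loading = aquitard_conc(z, 4.0e7, D, C0, t_off)          # before depletion
back_diffusion = aquitard_conc(z, 8.0e7, D, C0, t_off)   # after depletion
```

After depletion, the interface concentration drops to zero while the interior still holds stored mass, so the concentration gradient at the interface reverses and the aquitard releases solute back to the aquifer.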
NASA Astrophysics Data System (ADS)
Mukhopadhyay, Saumyadip; Abraham, John
2012-07-01
The unsteady flamelet progress variable (UFPV) model has been proposed by Pitsch and Ihme ["An unsteady/flamelet progress variable method for LES of nonpremixed turbulent combustion," AIAA Paper No. 2005-557, 2005] for modeling the averaged/filtered chemistry source terms in Reynolds averaged simulations and large eddy simulations of reacting non-premixed combustion. In the UFPV model, a look-up table of source terms is generated as a function of mixture fraction Z, scalar dissipation rate χ, and progress variable C by solving the unsteady flamelet equations. The assumption is that the unsteady flamelet represents the evolution of the reacting mixing layer in the non-premixed flame. We assess the accuracy of the model in predicting autoignition and flame development in compositionally stratified n-heptane/air mixtures using direct numerical simulations (DNS). The focus in this work is primarily on the assessment of the accuracy of the probability density functions (PDFs) employed for obtaining averaged source terms. The performance of commonly employed presumed functions, such as the Dirac delta distribution function, the β distribution function, and the statistically most likely distribution (SMLD) approach, in approximating the shapes of the PDFs of the reactive and the conserved scalars is evaluated. For unimodal distributions, it is observed that functions that need two-moment information, e.g., the β distribution function and the SMLD approach with two-moment closure, are able to reasonably approximate the actual PDF. As the distribution becomes multimodal, higher-moment information is required. Differences are observed between the ignition trends obtained from DNS and those predicted by the look-up table, especially for smaller gradients where the flamelet assumption becomes less applicable. The formulation assumes that the shape of the χ(Z) profile can be modeled by an error function which remains unchanged in the presence of heat release.
We show that this assumption is not accurate.
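The presumed-PDF averaging assessed above can be illustrated with a β distribution parameterized by the mean and variance of mixture fraction; the source function below is a smooth stand-in for illustration, not n-heptane chemistry:

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import beta as beta_dist

def beta_params(mean, var):
    """Shape parameters of a beta PDF with the given mean and variance."""
    g = mean * (1.0 - mean) / var - 1.0
    return mean * g, (1.0 - mean) * g

def averaged_source(source, z_mean, z_var, n=2000):
    """Presumed-PDF average: integrate the instantaneous source against a
    beta PDF in mixture fraction Z built from two-moment information."""
    a, b = beta_params(z_mean, z_var)
    z = np.linspace(1e-6, 1.0 - 1e-6, n)
    return trapezoid(source(z) * beta_dist.pdf(z, a, b), z)

source = lambda z: z * (1.0 - z)     # smooth stand-in for a chemistry source
mean_val = averaged_source(source, z_mean=0.3, z_var=0.01)
```

For this toy source the exact average is mu - (sigma^2 + mu^2) = 0.20, which the quadrature reproduces; multimodal scalar PDFs, as the DNS comparison above indicates, need more than these two moments.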
High-order scheme for the source-sink term in a one-dimensional water temperature model
Jing, Zheng; Kang, Ling
2017-01-01
The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method was assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data. PMID:28264005
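The two-step splitting can be sketched as follows; the grid, eddy diffusivity, and decaying radiative source are illustrative assumptions, and the simple explicit source update stands in for the paper's higher-order undetermined-coefficients scheme:

```python
import numpy as np

def step(T, dt, dz, kappa, S):
    """One operator-splitting step for T_t = kappa * T_zz + S(z)."""
    # Step 1: source-sink update (explicit here; the paper uses a
    # high-order scheme built by the undetermined coefficient method).
    T = T + dt * S
    # Step 2: Crank-Nicolson diffusion with zero-flux (mirror) boundaries.
    n = len(T)
    r = kappa * dt / (2.0 * dz**2)
    A = np.diag((1 + 2 * r) * np.ones(n)) + np.diag(-r * np.ones(n - 1), 1) \
        + np.diag(-r * np.ones(n - 1), -1)
    B = np.diag((1 - 2 * r) * np.ones(n)) + np.diag(r * np.ones(n - 1), 1) \
        + np.diag(r * np.ones(n - 1), -1)
    A[0, 1] = A[-1, -2] = -2 * r
    B[0, 1] = B[-1, -2] = 2 * r
    return np.linalg.solve(A, B @ T)

z = np.linspace(0.0, 10.0, 101)            # depth, m (assumed grid)
T = np.full_like(z, 15.0)                  # initial temperature, deg C
S = 1e-4 * np.exp(-0.5 * z)                # decaying radiative source, K/s
for _ in range(100):
    T = step(T, dt=10.0, dz=0.1, kappa=1e-4, S=S)
```

After 1000 s of simulated heating, the absorbed radiation warms the near-surface layers most, as expected for a depth-decaying source.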
On the gravitational potential and field anomalies due to thin mass layers
NASA Technical Reports Server (NTRS)
Ockendon, J. R.; Turcotte, D. L.
1977-01-01
The gravitational potential and field anomalies for thin mass layers are derived using the technique of matched asymptotic expansions. An inner solution is obtained using an expansion in powers of the thickness and it is shown that the outer solution is given by a surface distribution of mass sources and dipoles. Coefficients are evaluated by matching the inner expansion of the outer solution with the outer expansion of the inner solution. The leading term in the inner expansion for the normal gravitational field gives the Bouguer formula. The leading term in the expansion for the gravitational potential gives an expression for the perturbation to the geoid. The predictions given by this term are compared with measurements by satellite altimetry. The second-order terms in the expansion for the gravitational field are required to predict the gravity anomaly at a continental margin. The results are compared with observations.
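For reference, the leading-order result noted above is the classical Bouguer relation; in standard notation (not quoted from the paper), a thin layer of density rho and thickness h, i.e. surface density sigma = rho h, perturbs the normal gravitational field by

```latex
\Delta g \;=\; 2\pi G \sigma \;=\; 2\pi G \rho h ,
```

with G the gravitational constant; the second-order terms of the expansion supply the corrections needed, for example, at a continental margin.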
Economic evaluation of enhanced asthma management: a systematic review
Yong, Yee V.; Shafie, Asrul A.
2014-01-01
Objectives: To evaluate and compare full economic evaluation studies on the cost-effectiveness of enhanced asthma management (either as an adjunct to usual care or alone) vs. usual care alone. Methods: Online databases were searched for journal articles published in English from 1990 to 2012, using the search terms ‘“asthma” AND (“intervene” OR “manage”) AND (“pharmacoeconomics” OR “economic evaluation” OR “cost effectiveness” OR “cost benefit” OR “cost utility”)’. A hand search was done for local publications. Only studies with a full economic evaluation of enhanced management were included (cost consequences (CC), cost effectiveness (CE), cost benefit (CB), or cost utility (CU) analysis). Data were extracted, and each study was assessed for the quality of its economic evaluation design and evidence sources. Results: A total of 49 studies were included. There were 3 types of intervention for enhanced asthma management: education, environmental control, and self-management. The most cost-effective enhanced management was a mixture of education and self-management by an integrated team of healthcare and allied healthcare professionals. In general, the studies had a fair quality of economic evaluation, with a mean QHES score of 73.7 (SD=9.7), and had good quality of evidence sources. Conclusion: Despite the overall fair quality of economic evaluations but good quality of evidence sources for all data components, this review showed that the delivered enhanced asthma management interventions, whether single or mixed modes, were overall effective and cost-reducing. Whilst availability and accessibility are equally important factors to consider, the sustainability of the cost-effective management has to be further investigated using a longer time horizon, especially for chronic diseases such as asthma. PMID:25580173
Short-Term Memory Stages in Sign vs. Speech: The Source of the Serial Span Discrepancy
Hall, Matthew L.
2011-01-01
Speakers generally outperform signers when asked to recall a list of unrelated verbal items. This phenomenon is well established, but its source has remained unclear. In this study, we evaluate the relative contribution of the three main processing stages of short-term memory – perception, encoding, and recall – in this effect. The present study factorially manipulates whether American Sign Language (ASL) or English was used for perception, memory encoding, and recall in hearing ASL-English bilinguals. Results indicate that using ASL during both perception and encoding contributes to the serial span discrepancy. Interestingly, performing recall in ASL slightly increased span, ruling out the view that signing is in general a poor choice for short-term memory. These results suggest that despite the general equivalence of sign and speech in other memory domains, speech-based representations are better suited for the specific task of perception and memory encoding of a series of unrelated verbal items in serial order through the phonological loop. This work suggests that interpretation of performance on serial recall tasks in English may not translate straightforwardly to serial tasks in sign language. PMID:21450284
Investigation of a family of power conditioners integrated into a utility grid: Category 1
NASA Astrophysics Data System (ADS)
Wood, P.; Putkovich, R. P.
1981-07-01
Technical issues regarding ac and dc interface requirements were studied. A baseline design was selected to be a good example of existing technology which would not need significant development effort for its implementation in residential solar photovoltaic systems. Alternative technologies are evaluated to determine which meet the baseline specification, and their costs and losses are evaluated. Areas in which cost improvements can be obtained are studied, and the three best candidate technologies--the current sourced converter, the HF front end converter, and the programmed wave converter--are compared. It is concluded that the designs investigated will meet, or with slight improvement could meet, short term efficiency goals. Long term efficiency goals could be met if an isolation transformer were not required in the power conditioning equipment. None of the technologies studied can meet cost goals unless further improvements are possible.
NASA Astrophysics Data System (ADS)
Phelan, Thomas J.; Abriola, Linda M.; Gibson, Jenny L.; Smits, Kathleen M.; Christ, John A.
2015-12-01
In-situ bioremediation, a widely applied treatment technology for source zones contaminated with dense non-aqueous phase liquids (DNAPLs), has proven economical and reasonably efficient for long-term management of contaminated sites. Successful application of this remedial technology, however, requires an understanding of the complex interaction of transport, mass transfer, and biotransformation processes. The bioenhancement factor, which represents the ratio of DNAPL mass transfer under microbially active conditions to that which would occur under abiotic conditions, is commonly used to quantify the effectiveness of a particular bioremediation remedy. To date, little research has been directed towards the development and validation of methods to predict bioenhancement factors under conditions representative of real sites. This work extends an existing, first-order, bioenhancement factor expression to systems with zero-order and Monod kinetics, representative of many source-zone scenarios. The utility of this model for predicting the bioenhancement factor for previously published laboratory and field experiments is evaluated. This evaluation demonstrates the applicability of these simple bioenhancement factors for preliminary experimental design and analysis, and for assessment of dissolution enhancement in ganglia-contaminated source zones. For ease of application, a set of nomographs is presented that graphically depicts the dependence of bioenhancement factor on physicochemical properties. Application of these nomographs is illustrated using data from a well-documented field site. Results suggest that this approach can successfully capture field-scale, as well as column-scale, behavior. Sensitivity analyses reveal that bioenhanced dissolution will critically depend on in-situ biomass concentrations.
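The bioenhancement factor described above can be illustrated with a classical film-theory expression for reaction-enhanced mass transfer (a textbook stand-in, not necessarily the expression extended in this work): E = Ha/tanh(Ha), where the Hatta number Ha = sqrt(k1·D)/kL compares first-order degradation to boundary-layer mass transfer. All parameter values below are invented for illustration.

```python
import math

# Classical film-theory enhancement factor for a first-order reaction in the
# aqueous boundary layer: E = Ha / tanh(Ha), Ha = sqrt(k1 * D) / kL.
# k1 [1/s], D [m^2/s], kL [m/s] below are illustrative values only.

def bioenhancement(k1, D, kL):
    Ha = math.sqrt(k1 * D) / kL
    return Ha / math.tanh(Ha) if Ha > 1e-12 else 1.0

slow = bioenhancement(k1=1e-5, D=1e-9, kL=1e-5)   # weak biodegradation
fast = bioenhancement(k1=10.0, D=1e-9, kL=1e-5)   # strong biodegradation
print(slow, fast)   # E -> 1 for slow kinetics, E ~ Ha >> 1 for fast kinetics
```

In the slow-kinetics limit dissolution is unenhanced (E → 1), while for fast kinetics E grows linearly with Ha, which is the regime where bioenhanced dissolution critically depends on in-situ biomass concentration.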
Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Waithe, Kenrick A.
2004-01-01
A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
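The core of the approach can be sketched as a body force added to the discrete momentum equation of the cell containing the vane, with the corresponding work term added to the energy equation. The lift-style closure for the force magnitude below is a hypothetical stand-in, not the calibrated OVERFLOW model.

```python
import numpy as np

# Sketch of the vane source-term idea: rather than gridding the vane, add a
# side force f to the momentum equation of the cell that contains it, and
# f.u to the energy equation so the work done by the force is accounted for.
# The dynamic-pressure closure for f is a hypothetical illustration.

def vane_source_update(U, normal, area, volume, dt, cl=1.0):
    rho, mom_x, mom_y, E = U
    u = np.array([mom_x, mom_y]) / rho
    q = 0.5 * rho * np.dot(u, u)                      # local dynamic pressure
    f = cl * q * area / volume * np.asarray(normal)   # side force per unit volume
    mom = np.array([mom_x, mom_y]) + dt * f
    E_new = E + dt * np.dot(f, u)                     # work done by the side force
    return np.array([rho, mom[0], mom[1], E_new])

U0 = np.array([1.2, 60.0, 0.0, 1.0e5])   # rho, rho*u, rho*v, total energy
U1 = vane_source_update(U0, normal=(0.0, 1.0), area=0.01, volume=0.1, dt=1e-3)
print(U1[2] > 0.0, U1[0] == U0[0])       # side momentum added, mass unchanged
```

Because the force depends on the local cell state, its strength adjusts automatically with the flow, which is the property the abstract highlights.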
Quantifying Transmission of Clostridium difficile within and outside Healthcare Settings
Olsen, Margaret A.; Dubberke, Erik R.; Galvani, Alison P.; Townsend, Jeffrey P.
2016-01-01
To quantify the effect of hospital and community-based transmission and control measures on Clostridium difficile infection (CDI), we constructed a transmission model within and between hospital, community, and long-term care-facility settings. By parameterizing the model from national databases and calibrating it to C. difficile prevalence and CDI incidence, we found that hospitalized patients with CDI transmit C. difficile at a rate 15 (95% CI 7.2–32) times that of asymptomatic patients. Long-term care facility residents transmit at a rate of 27% (95% CI 13%–51%) that of hospitalized patients, and persons in the community at a rate of 0.1% (95% CI 0.062%–0.2%) that of hospitalized patients. Despite lower transmission rates for asymptomatic carriers and community sources, these transmission routes have a substantial effect on hospital-onset CDI because of the larger reservoir of hospitalized carriers and persons in the community. Asymptomatic carriers and community sources should be accounted for when designing and evaluating control interventions. PMID:26982504
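The reservoir-size argument in the abstract can be made concrete: each route's contribution to the force of infection scales as (per-capita transmission rate) x (reservoir size). The relative rates below are those reported in the abstract (hospitalized CDI patients = 1.0); the reservoir sizes are purely hypothetical round numbers for illustration.

```python
# Why low-rate routes still matter for hospital-onset CDI.
rel_rate = {                        # per-capita rates relative to hospitalized CDI
    "hospitalized CDI": 1.0,
    "hospitalized asymptomatic": 1.0 / 15,
    "long-term care": 0.27,
    "community": 0.001,
}
reservoir = {                       # hypothetical population sizes
    "hospitalized CDI": 1_000,
    "hospitalized asymptomatic": 20_000,
    "long-term care": 5_000,
    "community": 1_000_000,
}
contribution = {k: rel_rate[k] * reservoir[k] for k in rel_rate}
print(contribution)
# Asymptomatic and community routes rival the symptomatic route despite far
# lower per-capita rates, because their reservoirs are much larger.
```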
JAMSS: proteomics mass spectrometry simulation in Java.
Smith, Rob; Prince, John T
2015-03-01
Countless proteomics data processing algorithms have been proposed, yet few have been critically evaluated due to lack of labeled data (data with known identities and quantities). Although labeling techniques exist, they are limited in terms of confidence and accuracy. In silico simulators have recently been used to create complex data with known identities and quantities. We propose Java Mass Spectrometry Simulator (JAMSS): a fast, self-contained in silico simulator capable of generating simulated MS and LC-MS runs while providing meta information on the provenance of each generated signal. JAMSS improves upon previous in silico simulators in terms of ease of installation, minimal parameters, graphical user interface, multithreading capability, retention time shift model and reproducibility. The simulator outputs mzML 1.1.0. It is open source software licensed under the GPLv3. The software and source are available at https://github.com/optimusmoose/JAMSS.
Gopalakrishnan, V; Baskaran, R; Venkatraman, B
2016-08-01
A decision support system (DSS) is implemented in Radiological Safety Division, Indira Gandhi Centre for Atomic Research for providing guidance for emergency decision making in case of an inadvertent nuclear accident. Real time gamma dose rate measurement around the stack is used for estimating the radioactive release rate (source term) by using inverse calculation. Wireless gamma dose logging network is designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate and the details are presented in the paper. The network uses XBee-Pro wireless modules and PSoC controller for wireless interfacing, and the data are logged at the base station. A LabView based program is developed to receive the data, display it on the Google Map, plot the data over the time scale, and register the data in a file to share with DSS software. The DSS at the base station evaluates the real time source term to assess radiation impact.
IMPROVEMENTS IN THE THERMAL NEUTRON CALIBRATION UNIT, TNF2, AT LNMRI/IRD.
Astuto, A; Fernandes, S S; Patrão, K C S; Fonseca, E S; Pereira, W W; Lopes, R T
2018-02-21
The standard thermal neutron flux unit, TNF2, at the Brazilian National Ionizing Radiation Metrology Laboratory was rebuilt. Fluence is still achieved by moderating four 241Am-Be sources of 0.6 TBq each. The facility was re-simulated and redesigned with a graphite core surrounded by blocks of paraffin-loaded graphite. Simulations of different geometric arrangements of moderator materials and neutron sources were performed using the MCNPX code, and the resulting neutron fluence quality was evaluated in terms of intensity, spectrum and cadmium ratio. The system was then assembled based on the simulation results, and measurements were performed both with existing LNMRI/IRD equipment and by simulation. This work focuses on the characterization of a central chamber point and external points around the TNF2 in terms of neutron spectrum, fluence and ambient dose equivalent, H*(10). The system was validated with measurements of spectra, fluence and H*(10) to ensure traceability.
A Well-Balanced Path-Integral f-Wave Method for Hyperbolic Problems with Source Terms
2014-01-01
Systems of hyperbolic partial differential equations with source terms (balance laws) arise in many applications where it is important to compute accurate time-dependent solutions modeling small perturbations of equilibrium solutions in which the source terms balance the hyperbolic part. The f-wave version of the wave-propagation algorithm is one approach, but requires the use of a particular averaged value of the source terms at each cell interface in order to be “well balanced” and exactly maintain steady states. A general approach to choosing this average is developed using the theory of path conservative methods. A scalar advection equation with a decay or growth term is introduced as a model problem for numerical experiments. PMID:24563581
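The model problem from the abstract admits a compact illustration of the f-wave idea. Below, the interface source average uses the logarithmic mean, one choice that makes the exact steady state u(x) = u0·exp(-lam·x/a) a discrete equilibrium; it is an illustrative choice, not necessarily the path-conservative average derived in the paper.

```python
import numpy as np

# First-order f-wave update for u_t + a u_x = -lam * u (advection with decay).
# The f-wave at each interface is the flux jump minus the integrated source;
# a well-balanced source average makes it vanish at steady state.

def fwave_step(q, a, lam, dx, dt):
    qL, qR = q[:-1], q[1:]
    dq = qR - qL
    with np.errstate(divide="ignore", invalid="ignore"):
        psi = np.where(np.isclose(dq, 0.0), -lam * qL,
                       -lam * dq / np.log(qR / qL))   # log-mean source average
    z = a * dq - dx * psi          # f-wave: flux jump minus integrated source
    qnew = q.copy()
    qnew[1:] -= dt / dx * z        # a > 0: every f-wave moves to the right
    return qnew

a, lam, dx, dt = 1.0, 0.5, 0.1, 0.05
x = np.arange(0.0, 2.0, dx)
q = np.exp(-lam * x / a)           # exact steady profile
q1 = fwave_step(q, a, lam, dx, dt)
print(np.max(np.abs(q1 - q)))      # ~1e-16: the equilibrium is preserved
```

With a naive arithmetic source average the same profile would drift at truncation-error level, which is exactly the defect well-balanced averaging removes.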
The quality and readability of internet information regarding clavicle fractures.
Zhang, Dafang; Schumacher, Charles; Harris, Mitchel Byron
2016-03-01
The internet has become a major source of health information for patients. However, there has been little scrutiny of health information available on the internet to the public. Our objectives were to evaluate the quality and readability of information available on the internet regarding clavicle fractures and whether they changed with academic affiliation of the website or with complexity of the search term. Through a prospective evaluation of 3 search engines using 3 different search terms of varying complexity ("broken collarbone," "collarbone fracture," and "clavicle fracture"), we evaluated 91 website hits for quality and readability. Websites were specifically analyzed by search term and by website type. Information quality was evaluated on a four-point scale, and information readability was assessed using the Flesch-Kincaid score for reading grade level. The average quality score for our website hits was low, and the average reading grade level was far above the recommended level. Academic websites offered significantly higher quality information, whereas commercial websites offered significantly lower quality information. The use of more complex search terms yielded information of higher reading grade level but not higher quality. Current internet information regarding clavicle fractures is of low quality and low readability. Higher quality information utilizing more accessible language on clavicle fractures is needed on the internet. It is important to be aware of the information accessible to patients prior to their presentation to our clinics. Patients should be advised to visit websites with academic affiliations and to avoid commercial websites.
26 CFR 1.737-1 - Recognition of precontribution gain.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Property A1 and Property A2 is long-term, U.S.-source capital gain or loss. The character of gain on Property A3 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real... long-term, U.S.-source capital gain ($10,000 gain on Property A1 and $8,000 loss on Property A2) and $1...
Groundwater vulnerability and risk mapping using GIS, modeling and a fuzzy logic tool.
Nobre, R C M; Rotunno Filho, O C; Mansur, W J; Nobre, M M M; Cosenza, C A N
2007-12-07
A groundwater vulnerability and risk mapping assessment, based on a source-pathway-receptor approach, is presented for an urban coastal aquifer in northeastern Brazil. A modified version of the DRASTIC methodology was used to map the intrinsic and specific groundwater vulnerability of a 292 km(2) study area. A fuzzy hierarchy methodology was adopted to evaluate the potential contaminant source index, including diffuse and point sources. Numerical modeling was performed for delineation of well capture zones, using MODFLOW and MODPATH. The integration of these elements provided the mechanism to assess groundwater pollution risks and identify areas that must be prioritized in terms of groundwater monitoring and restriction on use. A groundwater quality index based on nitrate and chloride concentrations was calculated, which had a positive correlation with the specific vulnerability index.
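The DRASTIC index underlying the intrinsic vulnerability mapping is a weighted sum of seven rated hydrogeologic factors. A minimal sketch, using the standard DRASTIC factor weights but made-up ratings for a single illustrative grid cell:

```python
# DRASTIC intrinsic vulnerability index for one cell: sum of weight * rating
# over the seven factors (Depth to water, net Recharge, Aquifer media, Soil
# media, Topography, Impact of vadose zone, hydraulic Conductivity).
weights = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}  # standard weights
ratings = {"D": 9, "R": 6, "A": 8, "S": 5, "T": 10, "I": 8, "C": 6} # hypothetical cell
index = sum(weights[k] * ratings[k] for k in weights)
print(index)  # -> 171 (higher index = more vulnerable)
```

A specific-vulnerability variant, as in the modified methodology above, would add land-use or contaminant-loading terms to the same weighted-sum structure.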
Safari, Ameneh; Safari, Yahya
2018-08-01
Evidence-based medicine (EBM) is the proper and judicious use of the best available evidence in clinical decision-making for patient care. This study was conducted to evaluate the health information system for decision-making with an EBM approach in the teaching hospitals of Kermanshah city. The statistical population included all specialists and subspecialists, as well as the head nurses, of the teaching hospitals in Kermanshah city. Data were collected with a researcher-made questionnaire. The content validity of the questionnaire was confirmed by experts, and its reliability was evaluated using Cronbach's alpha coefficient. The results showed that access to internet sources is at a desirable level. There was a significant difference in at least one group in the adequacy of the hospital information system for EBM establishment in terms of access to internet-based data, according to academic major (P = 0.021). The adequacy of the hospital information system for EBM establishment in terms of the knowledge necessary to implement it also showed a statistically significant difference in at least one group according to academic major (P = 0.001). Kermanshah's hospitals are in a desirable condition in terms of access to internet sources and knowledge of EBM and its implementation, which indicates a desirable platform for decision-making with the EBM approach. Nevertheless, regular educational courses for doctors and nurses would help achieve practical implementation of the EBM approach.
Toward a common language for biobanking.
Fransson, Martin N; Rial-Sebbag, Emmanuelle; Brochhausen, Mathias; Litton, Jan-Eric
2015-01-01
To encourage the process of harmonization, the biobank community should support and use a common terminology. Relevant terms may be found in general thesauri for medicine, in legal instruments or in specific glossaries for biobanking. A comparison of the use of these sources has so far not been conducted and would be a useful instrument to further promote harmonization and data sharing. Thus, the purpose of the present study was to investigate the preferred definitions of terms important for sharing biological samples and data. Definitions for 10 terms ([human] biobank, sample/specimen, sample collection, study, aliquot, coded, identifying information, anonymised, personal data and informed consent) were collected from several sources. A web-based questionnaire was sent to 560 European individuals working with biobanks, asking them to select their preferred definition for each term. A total of 123 people participated in the survey, giving a response rate of 23%. The results were evaluated from four aspects: scope of definitions, potential regional differences, differences in semantics, and definitions in the context of ontologies, guided by comments from responders. Indicative from the survey is the risk of focusing only on the research aspect of biobanking in definitions. Hence, it is recommended that important terms be formulated in such a way that all areas of biobanking are covered, to improve the bridges between research and clinical application. Since several of the terms investigated herein can also be found in a legal context, which may differ between countries, establishing how a proposed definition adheres to the law is also crucial.
Modeling the contribution of point sources and non-point sources to Thachin River water pollution.
Schaffner, Monika; Bader, Hans-Peter; Scheidegger, Ruth
2009-08-15
Major rivers in developing and emerging countries increasingly suffer severe degradation of water quality. The current study uses a mathematical Material Flow Analysis (MMFA) as a complementary approach to address the degradation of river water quality due to nutrient pollution in the Thachin River Basin in Central Thailand. This paper gives an overview of the origins and flow paths of the various point- and non-point pollution sources in the Thachin River Basin (in terms of nitrogen and phosphorus) and quantifies their relative importance within the system. The key parameters influencing the main nutrient flows are determined and possible mitigation measures discussed. The results show that aquaculture (as a point source) and rice farming (as a non-point source) are the key nutrient sources in the Thachin River Basin. Other point sources such as pig farms, households and industries, which were previously cited as the most relevant pollution sources in terms of organic pollution, play less significant roles in comparison. This order of importance shifts when considering the model results at the provincial level. Crosschecks with secondary data and field studies confirm the plausibility of our simulations. Specific nutrient loads for the pollution sources are derived; these can be used for a first broad quantification of nutrient pollution in comparable river basins. Based on an identification of the sensitive model parameters, possible mitigation scenarios are determined and their potential to reduce the nutrient load evaluated. A comparison of simulated nutrient loads with measured nutrient concentrations shows that nutrient retention in the river system may be significant. Sedimentation in the slow flowing surface water network as well as nitrogen emission to the air from the warm oxygen deficient waters are certainly partly responsible, but also wetlands along the river banks could play an important role as nutrient sinks.
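The "first broad quantification" via specific loads mentioned above amounts to simple bookkeeping: the load from each source category is its activity level times its specific (per-unit) load. All numbers below are invented placeholders, not values from the study.

```python
# Nutrient-load bookkeeping in the spirit of a material flow analysis.
specific_load = {                  # kg N per unit activity per year (hypothetical)
    "aquaculture (per ha pond)": 250.0,
    "rice farming (per ha field)": 15.0,
    "pig farms (per 1000 head)": 90.0,
    "households (per 1000 people)": 40.0,
}
activity = {                       # basin-wide activity levels (hypothetical)
    "aquaculture (per ha pond)": 2_000,
    "rice farming (per ha field)": 100_000,
    "pig farms (per 1000 head)": 500,
    "households (per 1000 people)": 2_000,
}
load = {k: specific_load[k] * activity[k] for k in activity}
total = sum(load.values())
share = {k: round(100 * v / total, 1) for k, v in load.items()}
print(share)   # aquaculture and rice dominate in this illustrative setup
```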
Brusseau, M. L.; Hatton, J.; DiGuiseppi, W.
2011-01-01
The long-term impact of source-zone remediation efforts was assessed for a large site contaminated by trichloroethene. The impact of the remediation efforts (soil vapor extraction and in-situ chemical oxidation) was assessed through analysis of plume-scale contaminant mass discharge, which was measured using a high-resolution data set obtained from 23 years of operation of a large pump-and-treat system. The initial contaminant mass discharge peaked at approximately 7 kg/d, and then declined to approximately 2 kg/d. This latter value was sustained for several years prior to the initiation of source-zone remediation efforts. The contaminant mass discharge in 2010, measured several years after completion of the two source-zone remediation actions, was approximately 0.2 kg/d, which is ten times lower than the value prior to source-zone remediation. The time-continuous contaminant mass discharge data can be used to evaluate the impact of the source-zone remediation efforts on reducing the time required to operate the pump-and-treat system, and to estimate the cost savings associated with the decreased operational period. While significant reductions have been achieved, it is evident that the remediation efforts have not completely eliminated contaminant mass discharge and associated risk. Remaining contaminant mass contributing to the current mass discharge is hypothesized to comprise poorly-accessible mass in the source zones, as well as aqueous (and sorbed) mass present in the extensive lower-permeability units located within and adjacent to the contaminant plume. The fate of these sources is an issue of critical import to the remediation of chlorinated-solvent contaminated sites, and development of methods to address these sources will be required to achieve successful long-term management of such sites and to ultimately transition them to closure. PMID:22115080
Bayesian source term determination with unknown covariance of measurements
NASA Astrophysics Data System (ADS)
Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav
2017-04-01
Determination of the source term of a release of hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimation of the source term in the conventional linear inverse problem, y = Mx, where the relationship between the vector of observations y and the unknown source term x is described using the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as the optimization problem min_x (y - Mx)^T R^-1 (y - Mx) + x^T B^-1 x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization takes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix of the likelihood, R, is also unknown. We consider two potential choices of the structure of the matrix R: first, a diagonal matrix, and second, a locally correlated structure using information on the topology of the measuring network. Since inference in the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated by applying the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014, Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
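With R and B held fixed, the objective above has a closed-form minimizer, x = (M^T R^-1 M + B^-1)^-1 M^T R^-1 y; the variational Bayes treatment then iterates over R and B as well. A minimal sketch of the fixed-covariance solve, with an entirely synthetic SRS matrix and observations:

```python
import numpy as np

# Regularized inversion: minimize (y - Mx)^T R^-1 (y - Mx) + x^T B^-1 x.
# Tikhonov regularization is the special case B = (1/alpha) * I.

def regularized_source(M, y, R, B):
    Rinv = np.linalg.inv(R)
    Binv = np.linalg.inv(B)
    A = M.T @ Rinv @ M + Binv
    return np.linalg.solve(A, M.T @ Rinv @ y)

rng = np.random.default_rng(0)
n_obs, n_src = 40, 10
M = rng.standard_normal((n_obs, n_src))          # synthetic SRS matrix
x_true = np.abs(rng.standard_normal(n_src))      # releases are non-negative
y = M @ x_true + 0.01 * rng.standard_normal(n_obs)

R = 0.01**2 * np.eye(n_obs)                      # diagonal measurement covariance
B = 10.0 * np.eye(n_src)                         # weak prior on the source term
x_hat = regularized_source(M, y, R, B)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))  # small relative error
```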
Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Waithe, Kenrick A.
2005-01-01
A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparing with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared to the two-dimensional plate using a steady mass flow boundary condition, which was used to simulate a steady micro jet. The model was also compared to two three-dimensional flat plate cases using a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
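The essence of the micro-jet source term can be sketched in one dimension: the jet's mass flow and the momentum it carries are added directly to the conservative update of the cell containing the jet, instead of resolving the orifice in the grid. Jet parameters below are invented for illustration.

```python
import numpy as np

# Steady-jet source term applied to one cell of a 1D conservative update.

def apply_jet_source(U, mdot, v_jet, volume, dt):
    rho, mom = U
    s_mass = mdot / volume            # mass added per unit volume and time
    s_mom = mdot * v_jet / volume     # momentum carried by the injected mass
    return np.array([rho + dt * s_mass, mom + dt * s_mom])

U0 = np.array([1.2, 0.0])             # quiescent cell: rho, rho*u
U1 = apply_jet_source(U0, mdot=1e-4, v_jet=50.0, volume=1e-3, dt=1e-3)
print(U1)                             # both density and momentum increase
```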
Study of SPM tolerances of electronically compensated DML based systems.
Papagiannakis, I; Klonidis, D; Birbas, Alexios N; Kikidis, J; Tomkos, I
2009-05-25
This paper experimentally investigates the effectiveness of electronic dispersion compensation (EDC) for signals limited by self-phase modulation (SPM) and various dispersion levels. The sources considered are low-cost conventional directly modulated lasers (DMLs), fabricated for operation at 2.5 Gb/s but modulated at 10 Gb/s. Performance improvement is achieved by means of electronic feed-forward and decision-feedback equalization (FFE/DFE) at the receiver end. Experimental studies consider both transient- and adiabatic-chirp-dominated DML sources. The improvement is evaluated in terms of required optical signal-to-noise ratio (ROSNR) for bit-error-rate (BER) values of 10^-3 versus launch power over uncompensated links of standard single-mode fiber (SSMF).
Rafael Santiago; Tom Gallagher; Matthew Smidt; Dana Mitchell
2016-01-01
Renewable fuels are being tested as an alternative for fossil fuels. For the Southeast U.S., the use of woody biomass has proven to be an excellent source of renewable energy in terms of cost benefit and availability. Short rotation woody crops (SRWC) are timber plantations with exclusive characteristics that can meet the intensive demand for wood due to their fast...
2013-09-30
Circulation (HC) in terms of the meridional streamfunction. The interannual variability of the Atlantic HC in boreal summer was examined using the EOF... large-scale circulations in the NAVGEM model and the source of predictability for the seasonal variation of the Atlantic TCs. [Figure caption: EOF analysis of the meridional circulation (JAS); (a) the leading mode (M1); (b) variance explained by the first 10 modes.]
Battlespace Awareness: Heterogeneous Sensor Maps of Large Scale, Complex Environments
2017-06-13
reference frames enable a system designer to describe the position of any sensor or platform at any point in time. This section introduces the... analysis to evaluate the quality of reconstructions created by our algorithms. CloudCompare is an open-source tool designed for this purpose [65]. In... structure of the data. The data term seeks to keep the proposed solution (u) similar to the originally observed values (f). A systems designer must
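The data term mentioned in the fragment above is the fidelity half of a standard variational formulation: minimize ||u - f||^2 plus a smoothness penalty on u. A minimal 1D sketch (the quadratic smoothness term and all signal values are illustrative choices, not taken from the report):

```python
import numpy as np

# Variational reconstruction: minimize ||u - f||^2 + lam * ||D u||^2,
# where D is a first-difference operator. The normal equations give the
# closed-form solution (I + lam * D^T D) u = f.
n = 50
rng = np.random.default_rng(1)
f = np.sign(np.sin(np.linspace(0, 3 * np.pi, n))) + 0.2 * rng.standard_normal(n)
D = np.diff(np.eye(n), axis=0)                 # first-difference operator
lam = 2.0
u = np.linalg.solve(np.eye(n) + lam * D.T @ D, f)
# u stays close to the observed f (data term) while being smoother than f.
```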
Survey on the Performance of Source Localization Algorithms.
Fresno, José Manuel; Robles, Guillermo; Martínez-Tarifa, Juan Manuel; Stewart, Brian G
2017-11-18
The localization of emitters using an array of sensors or antennas is a prevalent issue approached in several applications. There exist different techniques for source localization, which can be classified into multilateration, received signal strength (RSS) and proximity methods. The performance of multilateration techniques relies on measured time variables: the time of flight (ToF) of the emission from the emitter to the sensor, the time differences of arrival (TDoA) of the emission between sensors and the pseudo-time of flight (pToF) of the emission to the sensors. The multilateration algorithms presented and compared in this paper can be classified as iterative and non-iterative methods. Both standard least squares (SLS) and hyperbolic least squares (HLS) are iterative and based on the Newton-Raphson technique to solve the non-linear equation system. The metaheuristic technique particle swarm optimization (PSO) used for source localisation is also studied. This optimization technique estimates the source position as the optimum of an objective function based on HLS and is also iterative in nature. Three non-iterative algorithms, namely the hyperbolic positioning algorithms (HPA), the maximum likelihood estimator (MLE) and Bancroft algorithm, are also presented. A non-iterative combined algorithm, MLE-HLS, based on MLE and HLS, is further proposed in this paper. The performance of all algorithms is analysed and compared in terms of accuracy in the localization of the position of the emitter and in terms of computational time. The analysis is also undertaken with three different sensor layouts since the positions of the sensors affect the localization; several source positions are also evaluated to make the comparison more robust. The analysis is carried out using theoretical time differences, as well as including errors due to the effect of digital sampling of the time variables. 
It is shown that the most balanced algorithm, yielding better results than the other algorithms in terms of accuracy and short computational time, is the combined MLE-HLS algorithm.
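The hyperbolic least squares (HLS) formulation at the heart of several of the surveyed methods can be sketched compactly: the residual at each sensor is the range difference implied by the candidate position minus the range difference implied by the measured TDoA, minimized here with a few Gauss-Newton iterations (a simplified stand-in for the Newton-Raphson solvers in the survey). The sensor layout and source position are made up.

```python
import numpy as np

# Hyperbolic least squares for 2D TDoA localization via Gauss-Newton.

def hls_locate(sensors, tdoa, c=1.0, x0=(0.0, 0.0), iters=20):
    x = np.asarray(x0, float)
    for _ in range(iters):
        d = np.linalg.norm(sensors - x, axis=1)      # distances to all sensors
        r = (d[1:] - d[0]) - c * tdoa                # hyperbolic residuals
        g = (x - sensors) / d[:, None]               # gradient of each distance
        J = g[1:] - g[0]                             # Jacobian of the residuals
        x = x - np.linalg.lstsq(J, r, rcond=None)[0] # Gauss-Newton step
    return x

sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
src = np.array([3.0, 4.0])
d = np.linalg.norm(sensors - src, axis=1)
tdoa = d[1:] - d[0]                                  # noise-free TDoA, c = 1
est = hls_locate(sensors, tdoa, x0=(5.0, 5.0))
print(est)                                           # recovers the source position
```

With noisy time differences the same residuals feed the MLE-HLS combination the paper proposes; here the noise-free case simply verifies the geometry.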
Survey on the Performance of Source Localization Algorithms
2017-01-01
The localization of emitters using an array of sensors or antennas is a prevalent issue approached in several applications. There exist different techniques for source localization, which can be classified into multilateration, received signal strength (RSS) and proximity methods. The performance of multilateration techniques relies on measured time variables: the time of flight (ToF) of the emission from the emitter to the sensor, the time differences of arrival (TDoA) of the emission between sensors and the pseudo-time of flight (pToF) of the emission to the sensors. The multilateration algorithms presented and compared in this paper can be classified as iterative and non-iterative methods. Both standard least squares (SLS) and hyperbolic least squares (HLS) are iterative and based on the Newton–Raphson technique to solve the non-linear equation system. The metaheuristic technique particle swarm optimization (PSO) used for source localisation is also studied. This optimization technique estimates the source position as the optimum of an objective function based on HLS and is also iterative in nature. Three non-iterative algorithms, namely the hyperbolic positioning algorithms (HPA), the maximum likelihood estimator (MLE) and Bancroft algorithm, are also presented. A non-iterative combined algorithm, MLE-HLS, based on MLE and HLS, is further proposed in this paper. The performance of all algorithms is analysed and compared in terms of accuracy in the localization of the position of the emitter and in terms of computational time. The analysis is also undertaken with three different sensor layouts since the positions of the sensors affect the localization; several source positions are also evaluated to make the comparison more robust. The analysis is carried out using theoretical time differences, as well as including errors due to the effect of digital sampling of the time variables. 
It is shown that the most balanced algorithm, yielding better results than the other algorithms in terms of accuracy and short computational time, is the combined MLE-HLS algorithm. PMID:29156565
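The hyperbolic least squares (HLS) method compared above solves the non-linear TDoA equation system with a Newton-Raphson-type iteration. The sketch below is an illustrative reconstruction using Gauss-Newton on the hyperbolic residuals, not the paper's implementation; the sensor layout, speed of propagation, and function names are assumptions for the example.

```python
import numpy as np

def hls_tdoa(sensors, tdoas, c=343.0, x0=None, iters=50):
    """Estimate a 2-D source position from TDoA measurements taken
    relative to sensor 0, via Gauss-Newton iterations on the
    hyperbolic least-squares residuals."""
    s = np.asarray(sensors, dtype=float)
    d = c * np.asarray(tdoas, dtype=float)      # range differences vs sensor 0
    x = s.mean(axis=0) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(iters):
        r0 = np.linalg.norm(x - s[0])
        ri = np.linalg.norm(x - s[1:], axis=1)
        res = (ri - r0) - d                     # hyperbolic residuals
        # Jacobian of each residual with respect to the position estimate
        J = (x - s[1:]) / ri[:, None] - (x - s[0]) / r0
        step, *_ = np.linalg.lstsq(J, -res, rcond=None)
        x = x + step
        if np.linalg.norm(step) < 1e-10:
            break
    return x
```

With exact (noise-free) time differences the iteration recovers the emitter position to numerical precision; with sampled time variables, as studied in the paper, the residuals no longer vanish and the least-squares solution balances the errors.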
Dy, Christopher J; Taylor, Samuel A; Patel, Ronak M; Kitay, Alison; Roberts, Timothy R; Daluiski, Aaron
2012-09-01
Recent emphasis on shared decision making and patient-centered research has increased the importance of patient education and health literacy. The internet is rapidly growing as a source of self-education for patients. However, concern exists over the quality, accuracy, and readability of the information. Our objective was to determine whether the quality, accuracy, and readability of information online about distal radius fractures vary with the search term. This was a prospective evaluation of 3 search engines using 3 different search terms of varying sophistication ("distal radius fracture," "wrist fracture," and "broken wrist"). We evaluated 70 unique Web sites for quality, accuracy, and readability. We used comparative statistics to determine whether the search term affected the quality, accuracy, and readability of the Web sites found. Three orthopedic surgeons independently gauged quality and accuracy of information using a set of predetermined scoring criteria. We evaluated the readability of each Web site using the Flesch-Kincaid score for reading grade level. There were significant differences in the quality, accuracy, and readability of information found, depending on the search term. We found that higher quality and accuracy resulted from the search term "distal radius fracture," particularly compared with Web sites resulting from the term "broken wrist." The reading level was higher than recommended in 65 of the 70 Web sites and was significantly higher when searching with "distal radius fracture" than "wrist fracture" or "broken wrist." There was no correlation between Web site reading level and quality or accuracy. The readability of information about distal radius fractures in most Web sites was higher than the recommended reading level for the general public. The quality and accuracy of the information found significantly varied with the sophistication of the search term used. 
Physicians, professional societies, and search engines should consider efforts to improve internet access to high-quality information at an understandable level. Copyright © 2012 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
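The Flesch-Kincaid grade level used in the study above has a published formula based on words per sentence and syllables per word. The sketch below applies it with a crude vowel-group syllable counter; production readability tools use dictionaries or better heuristics, so this is an approximation, not the scoring pipeline the authors used.

```python
import re

def flesch_kincaid_grade(text):
    """Flesch-Kincaid reading grade level:
    0.39*(words/sentences) + 11.8*(syllables/words) - 15.59,
    with a simple vowel-group syllable heuristic."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)

    def syllables(word):
        groups = re.findall(r"[aeiouy]+", word.lower())
        n = len(groups)
        if word.lower().endswith("e") and n > 1:
            n -= 1                      # crude silent-e rule
        return max(n, 1)

    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / len(sentences) + 11.8 * syl / len(words) - 15.59
```

Short, monosyllabic sentences score near (or below) grade 0, while the long clause-heavy sentences typical of medical Web sites push the grade well above the commonly recommended sixth-to-eighth-grade level.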
An integrated system for dynamic control of auditory perspective in a multichannel sound field
NASA Astrophysics Data System (ADS)
Corey, Jason Andrew
An integrated system providing dynamic control of sound source azimuth, distance and proximity to a room boundary within a simulated acoustic space is proposed for use in multichannel music and film sound production. The system has been investigated, implemented, and psychoacoustically tested within the ITU-R BS.775 recommended five-channel (3/2) loudspeaker layout. The work brings together physical and perceptual models of room simulation to allow dynamic placement of virtual sound sources at any location of a simulated space within the horizontal plane. The control system incorporates a number of modules including simulated room modes, "fuzzy" sources, and tracking early reflections, whose parameters are dynamically changed according to sound source location within the simulated space. The control functions of the basic elements, derived from theories of perception of a source in a real room, have been carefully tuned to provide efficient, effective, and intuitive control of a sound source's perceived location. Seven formal listening tests were conducted to evaluate the effectiveness of the algorithm design choices. The tests evaluated: (1) loudness calibration of multichannel sound images; (2) the effectiveness of distance control; (3) the resolution of distance control provided by the system; (4) the effectiveness of the proposed system when compared to a commercially available multichannel room simulation system in terms of control of source distance and proximity to a room boundary; (5) the role of tracking early reflection patterns on the perception of sound source distance; (6) the role of tracking early reflection patterns on the perception of lateral phantom images. The listening tests confirm the effectiveness of the system for control of perceived sound source distance, proximity to room boundaries, and azimuth, through fine, dynamic adjustment of parameters according to source location. 
All of the parameters are grouped and controlled together to create a perceptually strong impression of source location and movement within a simulated space.
Tamura, Hirosumi; Higa, Arisa; Hoshi, Hirotaka; Hiyama, Gen; Takahashi, Nobuhiko; Ryufuku, Masae; Morisawa, Gaku; Yanagisawa, Yuka; Ito, Emi; Imai, Jun-Ichi; Dobashi, Yuu; Katahira, Kiyoaki; Soeda, Shu; Watanabe, Takafumi; Fujimori, Keiya; Watanabe, Shinya; Takagi, Motoki
2018-06-18
Patient-derived tumor xenograft models represent a promising preclinical cancer model that better replicates disease, compared with traditional cell culture; however, their use is low-throughput and costly. To overcome this limitation, patient-derived tumor organoids (PDOs) were established from human lung, ovarian and uterine tumor tissues, among others, to accurately and efficiently recapitulate the tissue architecture and function. PDOs were able to be cultured for >6 months, and formed cell clusters with similar morphologies to their source tumors. Comparative histological and comprehensive gene expression analyses proved that the characteristics of PDOs were similar to those of their source tumors, even following long-term expansion in culture. At present, 53 PDOs have been established by the Fukushima Translational Research Project, and were designated as Fukushima PDOs (F‑PDOs). In addition, the in vivo tumorigenesis of certain F‑PDOs was confirmed using a xenograft model. The present study represents a detailed analysis of three F‑PDOs (termed REME9, 11 and 16) established from endometrial cancer tissues. These were used for cell growth inhibition experiments using anticancer agents. A suitable high-throughput assay system, with 96- or 384‑well plates, was designed for each F‑PDO, and the efficacy of the anticancer agents was subsequently evaluated. REME9 and 11 exhibited distinct responses and increased resistance to the drugs, as compared with conventional cancer cell lines (AN3 CA and RL95-2). REME9 and 11, which were established from tumors that originated in patients who did not respond to paclitaxel and carboplatin (the standard chemotherapy for endometrial cancer), exhibited high resistance (half-maximal inhibitory concentration >10 µM) to the two agents. 
Therefore, assay systems using F‑PDOs may be utilized to evaluate anticancer agents using conditions that better reflect clinical conditions, compared with conventional methods using cancer cell lines, and to discover markers that identify the pharmacological effects of anticancer agents.
NASA Astrophysics Data System (ADS)
Bonhoff, H. A.; Petersson, B. A. T.
2010-08-01
For the characterization of structure-borne sound sources with multi-point or continuous interfaces, substantial simplifications and physical insight can be obtained by incorporating the concept of interface mobilities. The applicability of interface mobilities, however, relies upon the admissibility of neglecting the so-called cross-order terms. Hence, the objective of the present paper is to clarify the importance and significance of cross-order terms for the characterization of vibrational sources. From previous studies, four conditions have been identified under which the cross-order terms can become more influential: non-circular interface geometries, structures with distinctly differing transfer paths, suppression of the zero-order motion, and cases where the contact forces are either in phase or out of phase. In a theoretical study, these four conditions are investigated with regard to the frequency range and magnitude of a possible strengthening of the cross-order terms. For an experimental analysis, two source-receiver installations are selected, suitably designed to obtain strong cross-order terms. The transmitted power and the source descriptors are predicted by the approximations of the interface mobility approach and compared with the complete calculations. Neglecting the cross-order terms can result in large misinterpretations at certain frequencies. On average, however, the cross-order terms are found to be insignificant and can be neglected with good approximation. The general applicability of interface mobilities for structure-borne sound source characterization, and thereby for the description of the transmission process, is confirmed.
Predicting Near-Term Water Quality from Satellite Observations of Watershed Conditions
NASA Astrophysics Data System (ADS)
Weiss, W. J.; Wang, L.; Hoffman, K.; West, D.; Mehta, A. V.; Lee, C.
2017-12-01
Despite the strong influence of watershed conditions on source water quality, most water utilities and water resource agencies do not currently have the capability to monitor watershed sources of contamination with great temporal or spatial detail. Typically, knowledge of source water quality is limited to periodic grab sampling; automated monitoring of a limited number of parameters at a few select locations; and/or monitoring relevant constituents at a treatment plant intake. While important, such observations are not sufficient to inform proactive watershed or source water management at a monthly or seasonal scale. Satellite remote sensing data on the other hand can provide a snapshot of an entire watershed at regular, sub-monthly intervals, helping analysts characterize watershed conditions and identify trends that could signal changes in source water quality. Accordingly, the authors are investigating correlations between satellite remote sensing observations of watersheds and source water quality, at a variety of spatial and temporal scales and lags. While correlations between remote sensing observations and direct in situ measurements of water quality have been well described in the literature, there are few studies that link remote sensing observations across a watershed with near-term predictions of water quality. In this presentation, the authors will describe results of statistical analyses and discuss how these results are being used to inform development of a desktop decision support tool to support predictive application of remote sensing data. Predictor variables under evaluation include parameters that describe vegetative conditions; parameters that describe climate/weather conditions; and non-remote sensing, in situ measurements. Water quality parameters under investigation include nitrogen, phosphorus, organic carbon, chlorophyll-a, and turbidity.
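The lag screening described above, correlating a remotely sensed watershed predictor with later water-quality observations, can be sketched as follows. The use of plain Pearson correlation and the synthetic series are illustrative assumptions; the authors' actual statistical analyses are not specified at this level of detail.

```python
import numpy as np

def best_leading_lag(predictor, response, max_lag):
    """Scan lags 0..max_lag at which the predictor series leads the
    response, returning (lag, r) with the largest |Pearson r|."""
    x = np.asarray(predictor, dtype=float)
    y = np.asarray(response, dtype=float)
    best_lag, best_r = 0, 0.0
    for lag in range(max_lag + 1):
        xs = x if lag == 0 else x[:-lag]   # predictor leads by `lag` steps
        ys = y[lag:]
        r = np.corrcoef(xs, ys)[0, 1]
        if abs(r) > abs(best_r):
            best_lag, best_r = lag, r
    return best_lag, best_r
```

A strong correlation at a positive lag is exactly what makes the predictor useful for near-term prediction: the watershed signal is observed before the water-quality response arrives at the intake.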
Brandt, Justin; Catchings, Rufus D.; Christensen, Allen H.; Flint, Alan L.; Gandhok, Gini; Goldman, Mark R.; Halford, Keith J.; Langenheim, V.E.; Martin, Peter; Rymer, Michael J.; Schroeder, Roy A.; Smith, Gregory A.; Sneed, Michelle; Martin, Peter
2011-01-01
Agua Caliente Spring, in downtown Palm Springs, California, has been used for recreation and medicinal therapy for hundreds of years and currently (2008) is the source of hot water for the Spa Resort owned by the Agua Caliente Band of the Cahuilla Indians. The Agua Caliente Spring is located about 1,500 feet east of the eastern front of the San Jacinto Mountains on the southeast-sloping alluvial plain of the Coachella Valley. The objectives of this study were to (1) define the geologic structure associated with the Agua Caliente Spring; (2) define the source(s), and possibly the age(s), of water discharged by the spring; (3) ascertain the seasonal and longer-term variability of the natural discharge, water temperature, and chemical characteristics of the spring water; (4) evaluate whether water-level declines in the regional aquifer will influence the temperature of the spring discharge; and, (5) estimate the quantity of spring water that leaks out of the water-collector tank at the spring orifice.
Medical Subject Headings (MeSH) for indexing and retrieving open-source healthcare data.
Marc, David T; Khairat, Saif S
2014-01-01
The US federal government initiated the Open Government Directive where federal agencies are required to publish high value datasets so that they are available to the public. Data.gov and the community site Healthdata.gov were initiated to disperse such datasets. However, data searches and retrieval for these sites are keyword driven and severely limited in performance. The purpose of this paper is to address the issue of extracting relevant open-source data by proposing a method of adopting the MeSH framework for indexing and data retrieval. A pilot study was conducted to compare the performance of traditional keywords to MeSH terms for retrieving relevant open-source datasets related to "mortality". The MeSH framework resulted in greater sensitivity with comparable specificity to the keyword search. MeSH showed promise as a method for indexing and retrieving data, yet future research should conduct a larger scale evaluation of the performance of the MeSH framework for retrieving relevant open-source healthcare datasets.
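The sensitivity/specificity comparison reported above can be illustrated with set arithmetic over retrieved and relevant dataset identifiers. This sketch assumes simple ID sets and is only a model of the evaluation, not the paper's actual Healthdata.gov experiment.

```python
def retrieval_sensitivity_specificity(retrieved, relevant, corpus):
    """Sensitivity = recall over relevant datasets; specificity =
    fraction of irrelevant datasets correctly not retrieved."""
    retrieved, relevant, corpus = set(retrieved), set(relevant), set(corpus)
    tp = len(retrieved & relevant)
    fn = len(relevant - retrieved)
    irrelevant = corpus - relevant
    fp = len(irrelevant & retrieved)
    tn = len(irrelevant - retrieved)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return sensitivity, specificity
```

In the paper's terms, MeSH-based indexing raised sensitivity (more of the truly "mortality"-related datasets retrieved) while keeping specificity comparable to the keyword search.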
Reduced mercury deposition in New Hampshire from 1996 to 2002 due to changes in local sources.
Han, Young-Ji; Holsen, Thomas M; Evers, David C; Driscoll, Charles T
2008-12-01
Changes in deposition of gaseous divalent mercury (Hg(II)) and particulate mercury (Hg(p)) in New Hampshire due to changes in local sources from 1996 to 2002 were assessed using the Industrial Source Complex Short Term (ISCST3) model (regional and global sources and Hg atmospheric reactions were not considered). Mercury (Hg) emissions in New Hampshire and adjacent areas decreased significantly (from 1540 to 880 kg yr(-1)) during this period, and the average annual modeled deposition of total Hg also declined from 17 to 7.0 microg m(-2) yr(-1) for the same period. In 2002, the maximum amount of Hg deposition was modeled to be in southern New Hampshire, while for 1996 the maximum deposition occurred farther north and east. The ISCST3 was also used to evaluate two future scenarios. The average percent difference in deposition across all cells was 5% for the 50% reduction scenario and 9% for the 90% reduction scenario.
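ISCST3 is built on a steady-state Gaussian plume kernel. A minimal sketch of that kernel is shown below, with user-supplied dispersion parameters standing in for the model's stability-class dispersion curves and its deposition algorithms; this is an illustration of the underlying formula, not the ISCST3 code itself.

```python
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Concentration (mass per volume) at crosswind offset y and
    height z from a point source of strength Q (mass per time) at
    effective stack height H, wind speed u, with dispersion
    parameters sigma_y, sigma_z evaluated at the downwind distance.
    The second vertical term reflects the plume off the ground."""
    lateral = math.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2.0 * sigma_z**2)))
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

Multiplying such concentrations by deposition velocities (for Hg(II) and Hg(p)) and summing over sources is the kind of local-deposition accounting the study performed, with regional and global sources deliberately excluded.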
Economic evaluation of vaccines in Canada: A systematic review
Chit, Ayman; Lee, Jason K. H.; Shim, Minsup; Nguyen, Van Hai; Grootendorst, Paul; Wu, Jianhong; Van Exan, Robert; Langley, Joanne M.
2016-01-01
ABSTRACT Background: Economic evaluations should form part of the basis for public health decision making on new vaccine programs. While Canada's national immunization advisory committee does not systematically include economic evaluations in immunization decision making, there is increasing interest in adopting them. We therefore sought to examine the extent and quality of economic evaluations of vaccines in Canada. Objective: We conducted a systematic review of economic evaluations of vaccines in Canada to determine and summarize: comprehensiveness across jurisdictions, studied vaccines, funding sources, study designs, research quality, and changes over time. Methods: Searches in multiple databases were conducted using the terms “vaccine,” “economics” and “Canada.” Descriptive data from eligible manuscripts was abstracted and three authors independently evaluated manuscript quality using a 7-point Likert-type scale scoring tool based on criteria from the International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Results: 42/175 articles met the search criteria. Of these, Canada-wide studies were most common (25/42), while provincial studies largely focused on the three populous provinces of Ontario, Quebec and British Columbia. The most common funding source was industry (17/42), followed by government (7/42). 38 studies used mathematical models estimating expected economic benefit while 4 studies examined post-hoc data on established programs. Studies covered 10 diseases, with 28/42 addressing pediatric vaccines. Many studies considered cost-utility (22/42) and the majority of these studies reported favorable economic results (16/22). The mean quality score was 5.9/7 and was consistent over publication date, funding sources, and disease areas. Conclusions: We observed diverse approaches to evaluate vaccine economics in Canada. 
Given the increased complexity of economic studies evaluating vaccines and the impact of results on public health practice, Canada needs improved, transparent and consistent processes to review and assess the findings of the economic evaluations of vaccines. PMID:26890128
Habilomatis, George; Chaloulakou, Archontoula
2013-10-01
Recently, a branch of particulate matter research has focused on ultrafine particles in the urban environment, which originate, to a significant extent, from traffic sources. In urban street canyons, the dispersion of ultrafine particles affects both pedestrians' short-term exposure and residents' long-term exposure. The aim of the present work is the development and evaluation of a composite lattice Boltzmann model to study the dispersion of ultrafine particles in the urban street canyon microenvironment. The proposed model has the potential to penetrate into the physics of this complex system. To evaluate the model's performance against suitable experimental data, ultrafine particle levels were monitored on an hourly basis for a period of 35 days in a street canyon in the Athens area. The results of the comparative analysis are quite satisfactory. Furthermore, our modeled results are in good agreement with the results of other computational and experimental studies. This work is a first attempt to study the dispersion of an air pollutant by application of the lattice Boltzmann method. Copyright © 2013 Elsevier B.V. All rights reserved.
Defining the concept of ‘tick repellency’ in veterinary medicine
HALOS, L.; BANETH, G.; BEUGNET, F.; BOWMAN, A. S.; CHOMEL, B.; FARKAS, R.; FRANC, M.; GUILLOT, J.; INOKUMA, H.; KAUFMAN, R.; JONGEJAN, F.; JOACHIM, A.; OTRANTO, D.; PFISTER, K.; POLLMEIER, M.; SAINZ, A.; WALL, R.
2012-01-01
SUMMARY Although widely used, the term repellency needs to be employed with care when applied to ticks and other periodic or permanent ectoparasites. Repellency has classically been used to describe the effects of a substance that causes a flying arthropod to make oriented movements away from its source. However, for crawling arthropods such as ticks, the term commonly subsumes a range of effects that include arthropod irritation and consequent avoiding or leaving the host, failing to attach, to bite, or to feed. The objective of the present article is to highlight the need for clarity, to propose consensus descriptions and methods for the evaluation of various effects on ticks caused by chemical substances. PMID:22216951
The concentration-discharge slope as a tool for water quality management.
Bieroza, M Z; Heathwaite, A L; Bechmann, M; Kyllmar, K; Jordan, P
2018-07-15
Recent technological breakthroughs of optical sensors and analysers have enabled matching the water quality measurement interval to the time scales of stream flow changes and led to an improved understanding of spatially and temporally heterogeneous sources and delivery pathways for many solutes and particulates. This new ability to match the chemograph with the hydrograph has promoted renewed interest in the concentration-discharge (c-q) relationship and its value in characterizing catchment storage, time lags and legacy effects for both weathering products and anthropogenic pollutants. In this paper we evaluated the stream c-q relationships for a number of water quality determinands (phosphorus, suspended sediments, nitrogen) in intensively managed agricultural catchments based on both high-frequency (sub-hourly) and long-term low-frequency (fortnightly-monthly) routine monitoring data. We used resampled high-frequency data to test the uncertainty in water quality parameters (e.g. mean, 95th percentile and load) derived from low-frequency sub-datasets. We showed that the uncertainty in water quality parameters increases with reduced sampling frequency as a function of the c-q slope. We also showed that different sources and delivery pathways control c-q relationship for different solutes and particulates. Secondly, we evaluated the variation in c-q slopes derived from the long-term low-frequency data for different determinands and catchments and showed strong chemostatic behaviour for phosphorus and nitrogen due to saturation and agricultural legacy effects. The c-q slope analysis can provide an effective tool to evaluate the current monitoring networks and the effectiveness of water management interventions. 
This research highlights how improved understanding of solute and particulate dynamics obtained with optical sensors and analysers can be used to understand patterns in long-term water quality time series, reduce the uncertainty in the monitoring data and to manage eutrophication in agricultural catchments. Copyright © 2018 Elsevier B.V. All rights reserved.
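The c-q slope at the heart of the analysis above is the exponent b of the power law c = a·q^b, fitted in log-log space; |b| near zero indicates the chemostatic behaviour the authors report for phosphorus and nitrogen. A minimal sketch (not the authors' code) is:

```python
import numpy as np

def cq_slope(concentration, discharge):
    """Fit log(c) = log(a) + b*log(q) and return (b, a).
    b < 0 suggests dilution, b > 0 mobilization, and |b| near 0
    chemostatic behaviour."""
    logq = np.log(np.asarray(discharge, dtype=float))
    logc = np.log(np.asarray(concentration, dtype=float))
    b, loga = np.polyfit(logq, logc, 1)
    return b, np.exp(loga)
```

Because the uncertainty of low-frequency load estimates grows with |b|, the fitted slope also indicates which determinands and catchments most need high-frequency monitoring.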
Henry, S B; Holzemer, W L; Reilly, C A; Campbell, K E
1994-01-01
OBJECTIVE: To analyze the terms used by nurses in a variety of data sources and to test the feasibility of using SNOMED III to represent nursing terms. DESIGN: Prospective research design with manual matching of terms to the SNOMED III vocabulary. MEASUREMENTS: The terms used by nurses to describe patient problems during 485 episodes of care for 201 patients hospitalized for Pneumocystis carinii pneumonia were identified. Problems from four data sources (nurse interview, intershift report, nursing care plan, and nurse progress note/flowsheet) were classified based on the substantive area of the problem and on the terminology used to describe the problem. A test subset of the 25 most frequently used terms from the two written data sources (nursing care plan and nurse progress note/flowsheet) were manually matched to SNOMED III terms to test the feasibility of using that existing vocabulary to represent nursing terms. RESULTS: Nurses most frequently described patient problems as signs/symptoms in the verbal nurse interview and intershift report. In the written data sources, problems were recorded as North American Nursing Diagnosis Association (NANDA) terms and signs/symptoms with similar frequencies. Of the nursing terms in the test subset, 69% were represented using one or more SNOMED III terms. PMID:7719788
A Semi-implicit Treatment of Porous Media in Steady-State CFD.
Domaingo, Andreas; Langmayr, Daniel; Somogyi, Bence; Almbauer, Raimund
There are many situations in computational fluid dynamics which require the definition of source terms in the Navier-Stokes equations. These source terms not only make it possible to model the physics of interest but also have a strong impact on the reliability, stability, and convergence of the numerics involved. Therefore, sophisticated numerical approaches exist for the description of such source terms. In this paper, we focus on the source terms present in the Navier-Stokes or Euler equations due to porous media, in particular the Darcy-Forchheimer equation. We introduce a method for the numerical treatment of the source term which is independent of the spatial discretization and based on linearization. In this description, the source term is treated in a fully implicit way, whereas the other flow variables can be computed in an implicit or explicit manner. This leads to a more robust description in comparison with a fully explicit approach. The method is well suited to be combined with coarse-grid CFD on Cartesian grids, which makes it especially favorable for the accelerated solution of coupled 1D-3D problems. To demonstrate the applicability and robustness of the proposed method, a proof-of-concept example in 1D, as well as more complex examples in 2D and 3D, is presented.
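The linearized implicit treatment described above can be illustrated on a single cell: the non-linear Forchheimer coefficient is frozen at the previous velocity magnitude, so each update remains linear in the new velocity while the stiff Darcy term is taken fully implicitly. This is a one-dimensional sketch under assumed parameters, not the paper's coupled CFD implementation.

```python
def darcy_forchheimer_semi_implicit(f, mu, K, rho, cF,
                                    dt=1e-3, steps=20000, tol=1e-12):
    """March a single cell to steady state under a constant driving
    force f (per unit volume) against the Darcy-Forchheimer source
    S(u) = (mu/K)*u + (rho*cF/sqrt(K))*|u|*u.
    Both source contributions are taken implicitly, with |u| frozen
    at the old iterate, which keeps the update stable for stiff
    porous-media coefficients."""
    u = 0.0
    for _ in range(steps):
        a = rho / dt + mu / K + rho * cF / K**0.5 * abs(u)  # implicit coefficient
        u_new = (rho / dt * u + f) / a
        if abs(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    return u
```

At convergence the velocity satisfies the steady Darcy-Forchheimer balance f = (mu/K)u + (rho·cF/√K)|u|u exactly, whereas a fully explicit treatment of the same source would require a much smaller pseudo-time step to remain stable.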
NASA Technical Reports Server (NTRS)
Bomani, Bilal Mark McDowell; Link, Dirk; Kail, Brian; Morreale, Bryan; Lee, Eric S.; Gigante, Bethany M.; Hendricks, Robert C.
2014-01-01
Finding a viable and sustainable source of renewable energy is a global task. Biofuels as a renewable energy source can potentially be a viable option for sustaining long-term energy needs. Biodiesel from halophytes shows great promise due to their ability to serve not only as a fuel source, but a food source as well. Halophytes are one of the few biomass plant species that can tolerate a wide range of saline conditions. We investigate the feasibility of using the halophyte, Salicornia virginica as a biofuel source by conducting a series of experiments utilizing various growth and salinity conditions. The goal is to determine if the saline content of Salicornia virginica in our indoor growth vs outdoor growth conditions has an influence on lipid recovery and total biomass composition. We focused on using standard lipid extraction protocols and characterization methods to evaluate twelve Salicornia virginica samples under six saline values ranging from freshwater to seawater and two growth conditions. The overall goal is to develop an optimal lipid extraction protocol for Salicornia virginica and potentially apply this protocol to halophytes in general.
EN FACE IMAGING OF RETINAL ARTERY MACROANEURYSMS USING SWEPT-SOURCE OPTICAL COHERENCE TOMOGRAPHY.
Hanhart, Joel; Strassman, Israel; Rozenman, Yaakov
2017-01-01
To describe the advantages of en face view with swept-source optical coherence tomography in assessing the morphologic features of retinal arterial macroaneurysms, their consequences on adjacent retina, planning laser treatment, and evaluating its effects. Three eyes were treated for retinal arterial macroaneurysms and followed by swept-source optical coherence tomography in 2014-2015. En face images of the retina and choroid were obtained by EnView, a swept-source optical coherence tomography program. Retinal arterial macroaneurysms have a typical optical coherence tomography appearance. En face view allows delineation of the macroaneurysm wall, thrombotic components within the dilation, and lumen measurement. Hemorrhage, lipids, and fluids can be precisely described in terms of amount and extent over the macula and depth. This technique is also practical for planning focal laser treatment and determining its effects. En face swept-source optical coherence tomography is a rapid, noninvasive, high-resolution, promising technology, which allows excellent visualization of retinal arterial macroaneurysms and their consequences on surrounding tissues. It could make angiography with intravenous injection redundant in planning and assessing therapy.
NASA Technical Reports Server (NTRS)
Gyorgak, C. A.
1975-01-01
An evaluation was made of five braze filler metals for joining an aluminum-containing oxide dispersion-strengthened (ODS) alloy, TD-NiCrAl. All five braze filler metals evaluated are considered suitable for joining TD-NiCrAl in terms of wettability and flow. Also, the braze alloys appear to be tolerant of slight variations in brazing procedures since joints prepared by three sources using three of the braze filler metals exhibited similar brazing characteristics and essentially equivalent 1100 C stress-rupture properties in a brazed butt-joint configuration. Recommendations are provided for brazing the aluminum-containing ODS alloys.
Land surface Verification Toolkit (LVT)
NASA Technical Reports Server (NTRS)
Kumar, Sujay V.
2017-01-01
LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.
Re-Evaluating Satellite Solar Power Systems for Earth
NASA Technical Reports Server (NTRS)
Landis, Geoffrey A.
2006-01-01
The Solar Power Satellite System is a concept to collect solar power in space, and then transport it to the surface of the Earth by microwave (or possibly laser) beam, where it is converted into electrical power for terrestrial use. The recent increase in energy costs, predictions of the near-term exhaustion of oil, and the prominence of possible climate change due to the "greenhouse effect" from the burning of fossil fuels have again brought alternative energy sources to public attention, and the time is certainly appropriate to reexamine the economics of space-based power. Several new concepts for Satellite Power System designs were evaluated to make the concept more economically feasible.
Pan, Po-Lin; Meng, Juan
2015-01-01
This study examined how major TV news networks covered two flu pandemics in 1976 and 2009 in terms of news frames, mortality exemplars, mortality subject attributes, vaccination, evaluation approaches, and news sources. Results showed that the first pandemic was frequently framed with the medical/scientific and political/legal issues, while the second pandemic was emphasized with the health risk issue in TV news. Both flu pandemics were regularly reported with mortality exemplars, but the focus in the first pandemic was on the flu virus threat and vaccination side effects, while the vaccination shortage was frequently revealed in the second outbreak.
Runge, Michael C.; LaGory, Kirk E.; Russell, Kendra; Balsom, Janet R.; Butler, R. Alan; Coggins,, Lewis G.; Grantz, Katrina A.; Hayse, John; Hlohowskyj, Ihor; Korman, Josh; May, James E.; O'Rourke, Daniel J.; Poch, Leslie A.; Prairie, James R.; VanKuiken, Jack C.; Van Lonkhuyzen, Robert A.; Varyu, David R.; Verhaaren, Bruce T.; Veselka, Thomas D.; Williams, Nicholas T.; Wuthrich, Kelsey K.; Yackulic, Charles B.; Billerbeck, Robert P.; Knowles, Glen W.
2016-01-07
The results of the decision analysis are meant to serve as only one of many sources of information that can be used to evaluate the alternatives proposed in the Environmental Impact Statement. These results focus only on those resource goals for which quantitative performance metrics could be formulated and evaluated; there are other important aspects of the resource goals that also need to be considered. Not all the stakeholders who were invited to participate in the decision analysis chose to do so; thus, the Bureau of Reclamation, National Park Service, and U.S. Department of the Interior may want to consider other input.
Preliminary investigation of processes that affect source term identification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wickliff, D.S.; Solomon, D.K.; Farrow, N.D.
Solid Waste Storage Area (SWSA) 5 is known to be a significant source of contaminants, especially tritium ({sup 3}H), to the White Oak Creek (WOC) watershed. For example, Solomon et al. (1991) estimated the total {sup 3}H discharge in Melton Branch (most of which originates in SWSA 5) for the 1988 water year to be 1210 Ci. A critical issue for making decisions concerning remedial actions at SWSA 5 is knowing whether the annual contaminant discharge is increasing or decreasing. Because (1) the magnitude of the annual contaminant discharge is highly correlated to the amount of annual precipitation (Solomon et al., 1991) and (2) a significant lag may exist between the time of peak contaminant release from primary sources (i.e., waste trenches) and the time of peak discharge into streams, short-term stream monitoring by itself is not sufficient for predicting future contaminant discharges. In this study we use {sup 3}H to examine the link between contaminant release from primary waste sources and contaminant discharge into streams. By understanding and quantifying subsurface transport processes, realistic predictions of future contaminant discharge, along with an evaluation of the effectiveness of remedial action alternatives, will be possible. The objectives of this study are (1) to characterize the subsurface movement of contaminants (primarily {sup 3}H) with an emphasis on the effects of matrix diffusion; (2) to determine the relative strength of primary vs secondary sources; and (3) to establish a methodology capable of determining whether the {sup 3}H discharge from SWSA 5 to streams is increasing or decreasing.
NASA Astrophysics Data System (ADS)
Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne
2014-01-01
Inverse modelling techniques can be used to estimate the amount of radionuclides and the temporal profile of the source term released in the atmosphere during the accident of the Fukushima Daiichi nuclear power plant in March 2011. In Winiarek et al. (2012b), the lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques, using activity concentration measurements. The importance of an objective assessment of prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such critical context where the meteorological conditions can make the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, the retrieved source term being very sensitive to this estimation. We propose to extend the use of these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to compute an estimate of the caesium-137 source term jointly using all available data about this radionuclide, such as activity concentrations in the air, but also daily fallout measurements and total cumulated fallout measurements. It is crucial to properly and simultaneously estimate the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition to reliably estimate the a posteriori uncertainty of the estimated source term. Using such techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6-19.3 PBq with an estimated standard deviation range of 15-20% depending on the method and the data sets. The “blind” time intervals of the source term have also been strongly mitigated compared to the first estimations with only activity concentration data.
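The kind of linear inversion described above can be illustrated with a minimal Gaussian sketch. This is a generic least-squares source-term retrieval under assumed observation error `sigma_obs` and background error `sigma_b` (the matrices, noise levels, and the `linear_inversion` helper are all hypothetical; the paper's actual contribution is the objective, joint estimation of those prior errors across data sets):

```python
import numpy as np

def linear_inversion(H, y, sigma_obs, sigma_b):
    """Gaussian linear inversion of a release-rate vector x from
    observations y ≈ H @ x, minimizing
    ||y - H x||^2 / sigma_obs^2 + ||x||^2 / sigma_b^2."""
    n = H.shape[1]
    A = H.T @ H / sigma_obs**2 + np.eye(n) / sigma_b**2
    b = H.T @ y / sigma_obs**2
    return np.linalg.solve(A, b)

# Toy source-receptor matrix: 50 observations, 10 release-time segments.
rng = np.random.default_rng(0)
H = rng.uniform(0.0, 1.0, (50, 10))
x_true = np.zeros(10)
x_true[3:6] = 5.0                       # pulse-like release profile
y = H @ x_true + rng.normal(0.0, 0.1, 50)   # noisy synthetic observations
x_hat = linear_inversion(H, y, sigma_obs=0.1, sigma_b=10.0)
```

Because the retrieved profile is sensitive to `sigma_obs` and `sigma_b`, misjudging either prior error directly biases the total released quantity, which is why the paper stresses estimating them objectively.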
Mendez, Monica O.; Maier, Raina M.
2008-01-01
Objective Unreclaimed mine tailings sites are a worldwide problem, with thousands of unvegetated, exposed tailings piles presenting a source of contamination for nearby communities. Tailings disposal sites in arid and semiarid environments are especially subject to eolian dispersion and water erosion. Phytostabilization, the use of plants for in situ stabilization of tailings and metal contaminants, is a feasible alternative to costly remediation practices. In this review we emphasize considerations for phytostabilization of mine tailings in arid and semiarid environments, as well as issues impeding its long-term success. Data sources We reviewed literature addressing mine closures and revegetation of mine tailings, along with publications evaluating plant ecology, microbial ecology, and soil properties of mine tailings. Data extraction Data were extracted from peer-reviewed articles and books identified in Web of Science and Agricola databases, and publications available through the U.S. Department of Agriculture, U.S. Environmental Protection Agency, and the United Nations Environment Programme. Data synthesis Harsh climatic conditions in arid and semiarid environments along with the innate properties of mine tailings require specific considerations. Plants suitable for phytostabilization must be native, be drought-, salt-, and metal-tolerant, and should limit shoot metal accumulation. Factors for evaluating metal accumulation and toxicity issues are presented. Also reviewed are aspects of implementing phytostabilization, including plant growth stage, amendments, irrigation, and evaluation. Conclusions Phytostabilization of mine tailings is a promising remedial technology but requires further research to identify factors affecting its long-term success by expanding knowledge of suitable plant species and mine tailings chemistry in ongoing field trials. PMID:18335091
Comparison of Phase-Based 3D Near-Field Source Localization Techniques for UHF RFID.
Parr, Andreas; Miesen, Robert; Vossiek, Martin
2016-06-25
In this paper, we present multiple techniques for phase-based narrowband backscatter tag localization in three-dimensional space with planar antenna arrays or synthetic apertures. Beamformer and MUSIC localization algorithms, known from near-field source localization and direction-of-arrival estimation, are applied to the 3D backscatter scenario and their performance in terms of localization accuracy is evaluated. We discuss the impact of different transceiver modes known from the literature, which evaluate different send and receive antenna path combinations for a single localization, as in multiple input multiple output (MIMO) systems. Furthermore, we propose a new Single-dimensional MIMO (S-MIMO) transceiver mode, which is especially suited for use with mobile robot systems. Monte-Carlo simulations based on a realistic multipath error model ensure spatial correlation of the simulated signals, and serve to critically appraise the accuracies of the different localization approaches. A synthetic uniform rectangular array created by a robotic arm is used to evaluate selected localization techniques. We use an Ultra High Frequency (UHF) Radiofrequency Identification (RFID) setup to compare measurements with theory and simulation. The results show how a mean localization accuracy of less than 30 cm can be reached in an indoor environment. Further simulations demonstrate how the distance between aperture and tag affects the localization accuracy, and how the size and grid spacing of the rectangular array need to be adapted to improve the localization accuracy down to the centimeter range and to maximize array efficiency in terms of localization accuracy per number of elements.
Prioritizing environmental justice and equality: diesel emissions in southern California.
Marshall, Julian D; Swor, Kathryn R; Nguyen, Nam P
2014-04-01
Existing environmental policies aim to reduce emissions but lack standards for addressing environmental justice. Environmental justice research documents disparities in exposure to air pollution; however, little guidance currently exists on how to make improvements or on how specific emission-reduction scenarios would improve or deteriorate environmental justice conditions. Here, we quantify how emission reductions from specific sources would change various measures of environmental equality and justice. We evaluate potential emission reductions for fine diesel particulate matter (DPM) in Southern California for five sources: on-road mobile, off-road mobile, ships, trains, and stationary. Our approach employs state-of-the-science dispersion and exposure models. We compare four environmental goals: impact, efficiency, equality, and justice. Results indicate potential trade-offs among those goals. For example, reductions in train emissions produce the greatest improvements in terms of efficiency, equality, and justice, whereas off-road mobile source reductions can have the greatest total impact. Reductions in on-road emissions produce improvements in impact, equality, and justice, whereas emission reductions from ships would widen existing population inequalities. Results are similar for complex versus simplified exposure analyses. The approach employed here could usefully be applied elsewhere to evaluate opportunities for improving environmental equality and justice in other locations.
The meaning of city noises: Investigating sound quality in Paris (France)
NASA Astrophysics Data System (ADS)
Dubois, Daniele; Guastavino, Catherine; Maffiolo, Valerie
2004-05-01
The sound quality of Paris (France) was investigated by using field inquiries in actual environments (open questionnaires) and using recordings under laboratory conditions (free-sorting tasks). Cognitive categories of soundscapes were inferred by means of psycholinguistic analyses of verbal data and of mathematical analyses of similarity judgments. Results show that auditory judgments mainly rely on source identification. The appraisal of urban noise therefore depends on the qualitative evaluation of noise sources. The salience of human sounds in public spaces has been demonstrated, in relation to pleasantness judgments: soundscapes with human presence tend to be perceived as more pleasant than soundscapes consisting solely of mechanical sounds. Furthermore, human sounds are qualitatively processed as indicators of human outdoor activities, such as open markets, pedestrian areas, and sidewalk cafe districts that reflect city life. In contrast, mechanical noises (mainly traffic noise) are commonly described in terms of physical properties (temporal structure, intensity) of a permanent background noise that also characterizes urban areas. This entails considering both quantitative and qualitative descriptions to account for the diversity of cognitive interpretations of urban soundscapes, since subjective evaluations depend both on the meaning attributed to noise sources and on inherent properties of the acoustic signal.
NASA Astrophysics Data System (ADS)
Fu, Shihang; Zhang, Li; Hu, Yao; Ding, Xiang
2018-01-01
Confocal Raman Microscopy (CRM) has matured to become one of the most powerful instruments in analytical science because of its molecular sensitivity and high spatial resolution. Compared with conventional Raman microscopy, CRM can perform three-dimensional mapping of tiny samples and achieves high spatial resolution thanks to its unique pinhole. With the wide application of the instrument, there is a growing requirement for evaluating the imaging performance of the system. The point-spread function (PSF) is an important approach to evaluating the imaging capability of an optical instrument. Among the various methods for measuring the PSF, the point source method has been widely used because it is easy to operate and its results approximate the true PSF. In the point source method, the point source size has a significant impact on the final measurement accuracy. In this paper, the influence of point source size on the measurement accuracy of the PSF is analyzed and verified experimentally. A theoretical model of the lateral PSF for CRM is established and the effect of point source size on the full-width at half maximum of the lateral PSF is simulated. For long-term preservation and measurement convenience, a PSF measurement phantom of polydimethylsiloxane resin doped with different sizes of polystyrene microspheres is designed. The PSFs of the CRM are measured with the different microsphere sizes and the results are compared with the simulations. The results provide a guide for measuring the PSF of the CRM.
Evaluation of realistic layouts for next generation on-scalp MEG: spatial information density maps.
Riaz, Bushra; Pfeiffer, Christoph; Schneiderman, Justin F
2017-08-01
While commercial magnetoencephalography (MEG) systems are the functional neuroimaging state-of-the-art in terms of spatio-temporal resolution, MEG sensors have not changed significantly since the 1990s. Interest in newer sensors that operate at less extreme temperatures, e.g., high critical temperature (high-Tc) SQUIDs, optically-pumped magnetometers, etc., is growing because they enable significant reductions in head-to-sensor standoff (on-scalp MEG). Various metrics quantify the advantages of on-scalp MEG, but a single straightforward one is lacking. Previous works have furthermore been limited to arbitrary and/or unrealistic sensor layouts. We introduce spatial information density (SID) maps for quantitative and qualitative evaluations of sensor arrays. SID-maps present the spatial distribution of information a sensor array extracts from a source space while accounting for relevant source and sensor parameters. We use it in a systematic comparison of three practical on-scalp MEG sensor array layouts (based on high-Tc SQUIDs) and the standard Elekta Neuromag TRIUX magnetometer array. Results strengthen the case for on-scalp and specifically high-Tc SQUID-based MEG while providing a path for the practical design of future MEG systems. SID-maps are furthermore general to arbitrary magnetic sensor technologies and source spaces and can thus be used for design and evaluation of sensor arrays for magnetocardiography, magnetic particle imaging, etc.
Long Term 2 Second Round Source Water Monitoring and Bin Placement Memo
The Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) applies to all public water systems served by a surface water source or public water systems served by a ground water source under the direct influence of surface water.
Design requirements for a stand alone EUV interferometer
NASA Astrophysics Data System (ADS)
Michallon, Ph.; Constancias, C.; Lagrange, A.; Dalzotto, B.
2008-03-01
EUV lithography is expected to be introduced at the 32/22 nm nodes, with possible extension below. EUV resist availability remains one of the main issues to be resolved. There is an urgent need to provide suitable tools to accelerate resist development and to achieve resolution, LER and sensitivity specifications simultaneously. An interference lithography tool offers advantages over a conventional EUV exposure tool: it allows the evaluation of resists free from the deficiencies of the optics and mask that limit the achievable resolution. Traditionally, a dedicated beam line from a synchrotron, with limited access, is used as the light source in EUV interference lithography. This paper identifies the technological barriers to developing a stand-alone EUV interferometer using a compact EUV source. It describes the theoretical solutions adopted and examines their feasibility with available technologies. EUV sources available on the market have been evaluated in terms of power level, source size, spatial coherence, dose uniformity, accuracy, stability and reproducibility. According to the EUV source characteristics, several optical designs were studied (simple or double gratings). For each of these solutions, the source and collimation-optic specifications have been determined. To reduce the exposure time, a new grating technology is also presented that significantly increases the transmission efficiency of the system. The grating designs were studied to allow multi-pitch resolution printing in the same exposure without any focus adjustment. Finally, the micromechanical system supporting the gratings was studied, addressing the issues of vacuum environment, alignment capability, motion precision, automation and metrology needed to ensure placement control between gratings and wafer. A similar study was carried out for the collimation-optics mechanical support, which depends on the source characteristics.
An open-source framework for stress-testing non-invasive foetal ECG extraction algorithms.
Andreotti, Fernando; Behar, Joachim; Zaunseder, Sebastian; Oster, Julien; Clifford, Gari D
2016-05-01
Over the past decades, many studies have been published on the extraction of non-invasive foetal electrocardiogram (NI-FECG) from abdominal recordings. Most of these contributions claim to obtain excellent results in detecting foetal QRS (FQRS) complexes in terms of location. A small subset of authors have investigated the extraction of morphological features from the NI-FECG. However, due to the shortage of available public databases, the large variety of performance measures employed and the lack of open-source reference algorithms, most contributions cannot be meaningfully assessed. This article attempts to address these issues by presenting a standardised methodology for stress testing NI-FECG algorithms, including absolute data, as well as extraction and evaluation routines. To that end, a large database of realistic artificial signals was created, totaling 145.8 h of multichannel data and over one million FQRS complexes. An important characteristic of this dataset is the inclusion of several non-stationary events (e.g. foetal movements, uterine contractions and heart rate fluctuations) that are critical for evaluating extraction routines. To demonstrate our testing methodology, three classes of NI-FECG extraction algorithms were evaluated: blind source separation (BSS), template subtraction (TS) and adaptive methods (AM). Experiments were conducted to benchmark the performance of eight NI-FECG extraction algorithms on the artificial database focusing on: FQRS detection and morphological analysis (foetal QT and T/QRS ratio). The overall median FQRS detection accuracies (i.e. considering all non-stationary events) for the best performing methods in each group were 99.9% for BSS, 97.9% for AM and 96.0% for TS. Both FQRS detections and morphological parameters were shown to heavily depend on the extraction techniques and signal-to-noise ratio. 
In particular, it is shown that their evaluation in the source domain, obtained after using a BSS technique, should be avoided. Data, extraction algorithms and evaluation routines were released as part of the fecgsyn toolbox on Physionet under a GNU GPL open-source license. This contribution provides a standard framework for benchmarking and regulatory testing of NI-FECG extraction algorithms.
A novel method for fast imaging of brain function, non-invasively, with light
NASA Astrophysics Data System (ADS)
Chance, Britton; Anday, Endla; Nioka, Shoko; Zhou, Shuoming; Hong, Long; Worden, Katherine; Li, C.; Murray, T.; Ovetsky, Y.; Pidikiti, D.; Thomas, R.
1998-05-01
Imaging of the human body by any non-invasive technique has been an appropriate goal of physics and medicine, and great success has been obtained with both Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET) in brain imaging. Non-imaging responses to functional activation using near infrared spectroscopy of brain (fNIR) obtained in 1993 (Chance, et al. [1]) and in 1994 (Tamura, et al. [2]) are now complemented with images of pre-frontal and parietal stimulation in adults and pre-term neonates in this communication (see also [3]). Prior studies used continuous [4], pulsed [3] or modulated [5] light. The amplitude and phase cancellation of optical patterns as demonstrated for single source detector pairs affords remarkable sensitivity of small object detection in model systems [6]. The methods have now been elaborated with multiple source detector combinations (nine sources, four detectors). Using simple back projection algorithms it is now possible to image sensorimotor and cognitive activation of adult and pre- and full-term neonate human brain function in times < 30 sec and with two dimensional resolutions of < 1 cm in two dimensional displays. The method can be used in evaluation of adult and neonatal cerebral dysfunction in a simple, portable and affordable method that does not require immobilization, as contrasted to MRI and PET.
Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C
2018-01-01
This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
Evaluation of ground-water quality in the Santa Maria Valley, California
Hughes, Jerry L.
1977-01-01
The quality and quantity of recharge to the Santa Maria Valley, Calif., ground-water basin from natural sources, point sources, and agriculture are expressed in terms of a hydrologic budget, a solute balance, and maps showing the distribution of select chemical constituents. Point sources include a sugar-beet refinery, oil refineries, stockyards, golf courses, poultry farms, solid-waste landfills, and municipal and industrial wastewater-treatment facilities. Pumpage has exceeded recharge by about 10,000 acre-feet per year. The result is a declining potentiometric surface with an accumulation of solutes and an increase in nitrogen in ground water. Nitrogen concentrations have reached as much as 50 milligrams per liter. In comparison to the solutes from irrigation return, natural recharge, and rain, discharge of wastewater from municipal and industrial wastewater-treatment facilities contributes less than 10 percent. The quality of treated wastewater is often lower in select chemical constituents than the receiving water. (Woodard-USGS)
Kim, Hyun Suk; Choi, Hong Yeop; Lee, Gyemin; Ye, Sung-Joon; Smith, Martin B; Kim, Geehyun
2018-03-01
The aim of this work is to develop a gamma-ray/neutron dual-particle imager, based on rotational modulation collimators (RMCs) and pulse shape discrimination (PSD)-capable scintillators, for possible applications for radioactivity monitoring as well as nuclear security and safeguards. A Monte Carlo simulation study was performed to design an RMC system for the dual-particle imaging, and modulation patterns were obtained for gamma-ray and neutron sources in various configurations. We applied an image reconstruction algorithm utilizing the maximum-likelihood expectation-maximization method based on the analytical modeling of source-detector configurations, to the Monte Carlo simulation results. Both gamma-ray and neutron source distributions were reconstructed and evaluated in terms of signal-to-noise ratio, showing the viability of developing an RMC-based gamma-ray/neutron dual-particle imager using PSD-capable scintillators.
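The maximum-likelihood expectation-maximization (MLEM) reconstruction named above can be sketched in a few lines. This is a generic MLEM update for a nonnegative source distribution; the toy system matrix below stands in for the analytical RMC source-detector model in the paper (the 8-rotation, 4-pixel geometry and the `mlem` helper are hypothetical):

```python
import numpy as np

def mlem(A, y, n_iter=2000):
    """MLEM iterations for measurements y ≈ A @ lam, where lam is a
    nonnegative source distribution and A maps source pixels to
    detector counts (rows: measurements, columns: pixels)."""
    sens = A.sum(axis=0)                      # per-pixel sensitivity
    lam = np.ones(A.shape[1])                 # flat nonnegative start
    for _ in range(n_iter):
        proj = A @ lam                        # forward projection
        ratio = y / np.maximum(proj, 1e-12)   # measured / modeled counts
        lam *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return lam

# Toy modulation system: 8 collimator positions observing 4 source pixels.
rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(8, 4))
lam_true = np.array([0.0, 2.0, 0.0, 1.0])    # two point-like sources
y = A @ lam_true                              # noise-free modulation pattern
lam_hat = mlem(A, y)
```

The multiplicative update preserves nonnegativity automatically, which is one reason MLEM is a common choice for emission-imaging problems like this one.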
Experiments with Lasers and Frequency Doublers
NASA Technical Reports Server (NTRS)
Bachor, H.-A.; Taubman, M.; White, A. G.; Ralph, T.; McClelland, D. E.
1996-01-01
Solid state laser sources, such as diode-pumped Nd:YAG lasers, have given us CW laser light of high power with unprecedented stability and low noise performance. In these lasers most of the technical sources of noise can be eliminated allowing them to be operated close to the theoretical noise limit set by the quantum properties of light. The next step of reducing the noise below the standard limit is known as squeezing. We present experimental progress in generating reliably squeezed light using the process of frequency doubling. We emphasize the long term stability that makes this a truly practical source of squeezed light. Our experimental results match noise spectra calculated with our recently developed models of coupled systems which include the noise generated inside the laser and its interaction with the frequency doubler. We conclude with some observations on evaluating quadrature squeezed states of light.
Uncertainty, variability, and earthquake physics in ground‐motion prediction equations
Baltay, Annemarie S.; Hanks, Thomas C.; Abrahamson, Norm A.
2017-01-01
Residuals between ground‐motion data and ground‐motion prediction equations (GMPEs) can be decomposed into terms representing earthquake source, path, and site effects. These terms can be cast in terms of repeatable (epistemic) residuals and the random (aleatory) components. Identifying the repeatable residuals leads to a GMPE with reduced uncertainty for a specific source, site, or path location, which in turn can yield a lower hazard level at small probabilities of exceedance. We illustrate a schematic framework for this residual partitioning with a dataset from the ANZA network, which straddles the central San Jacinto fault in southern California. The dataset consists of more than 3200 1.15≤M≤3 earthquakes and their peak ground accelerations (PGAs), recorded at close distances (R≤20 km). We construct a small‐magnitude GMPE for these PGA data, incorporating VS30 site conditions and geometrical spreading. Identification and removal of the repeatable source, path, and site terms yield an overall reduction in the standard deviation from 0.97 (in ln units) to 0.44, for a nonergodic assumption, that is, for a single‐source location, single site, and single path. We give examples of relationships between independent seismological observables and the repeatable terms. We find a correlation between location‐based source terms and stress drops in the San Jacinto fault zone region; an explanation of the site term as a function of kappa, the near‐site attenuation parameter; and a suggestion that the path component can be related directly to elastic structure. These correlations allow the repeatable source location, site, and path terms to be determined a priori using independent geophysical relationships. Those terms could be incorporated into location‐specific GMPEs for more accurate and precise ground‐motion prediction.
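The residual partitioning described above can be sketched numerically. The snippet uses synthetic event and site terms (not the ANZA data) and a simple means-based fit in place of the mixed-effects regressions typically used, but it shows the key effect: removing repeatable terms shrinks the apparent aleatory standard deviation:

```python
import numpy as np

# Synthetic total residuals: event term + site term + random component.
rng = np.random.default_rng(1)
n_events, n_sites = 200, 30
event_term = rng.normal(0.0, 0.5, n_events)      # repeatable source terms
site_term = rng.normal(0.0, 0.5, n_sites)        # repeatable site terms
eps = rng.normal(0.0, 0.4, (n_events, n_sites))  # aleatory part
resid = event_term[:, None] + site_term[None, :] + eps

# Estimate the repeatable terms from means, then remove them.
ev_hat = resid.mean(axis=1)                      # per-event average residual
site_hat = (resid - ev_hat[:, None]).mean(axis=0)
nonergodic = resid - ev_hat[:, None] - site_hat[None, :]

sigma_total = resid.std()          # ergodic sigma: all terms mixed in
sigma_nonergodic = nonergodic.std()  # smaller after removing repeatables
```

The same mechanism underlies the paper's reduction from 0.97 to 0.44 ln units: once source, site, and path terms are treated as repeatable for a fixed location, only the remaining scatter counts as aleatory.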
Evaluating sources and processing of nonpoint source nitrate in a small suburban watershed in China
NASA Astrophysics Data System (ADS)
Han, Li; Huang, Minsheng; Ma, Minghai; Wei, Jinbao; Hu, Wei; Chouhan, Seema
2018-04-01
Identifying nonpoint sources of nitrate has been a long-term challenge in mixed land-use watersheds. In the present study, we combine dual nitrate isotopes with runoff and stream water monitoring to elucidate the nonpoint nitrate sources across land use, and determine the relative importance of biogeochemical processes for nitrate export in a small suburban watershed, Longhongjian watershed, China. Our study suggested that NH4+ fertilizer, soil NH4+, litter fall and groundwater were the main nitrate sources in Longhongjian Stream. There were large changes in nitrate sources in response to season and land use. Runoff analysis illustrated that the tea plantation and forest areas contributed a dominant proportion of the TN export. Spatial analysis illustrated that NO3- concentration was high in the tea plantation and forest areas, and δ15N-NO3 and δ18O-NO3 were enriched in the step ponds. Temporal analysis showed high NO3- levels in spring, and nitrate isotopes were enriched in summer. The study also showed that the step ponds played an important role in mitigating nitrate pollution. Nitrification and plant uptake were the significant biogeochemical processes contributing to the nitrogen transformation, and denitrification hardly occurred in the stream.
Health data in Ontario: taking stock and moving ahead.
Iron, Karey
2006-01-01
Ontario has been a leader in performance-reporting in clinical areas such as surgery, cardiac care and drug use in the elderly. Data used to report on these areas are readily available for performance evaluation and are of reasonable quality. But other key areas like managing chronic disease and preventive care cannot be fully evaluated because relevant data are either unavailable or of poor quality. A focus on timely access to good quality demographic and vital statistics data would enhance our ability to evaluate components of the Ontario health system. New comprehensive primary care, laboratory services and drug prescriptions data sources are also necessary for health-system evaluation and planning. In the short term, a dedicated, centralized agency with legislative authority is proposed to move Ontario's health information agenda forward in a holistic, strategic and timely manner.
van Griensven, Johan; De Weiggheleire, Anja; Delamou, Alexandre; Smith, Peter G.; Edwards, Tansy; Vandekerckhove, Philippe; Bah, Elhadj Ibrahima; Colebunders, Robert; Herve, Isola; Lazaygues, Catherine; Haba, Nyankoye; Lynen, Lutgarde
2016-01-01
The clinical evaluation of convalescent plasma (CP) for the treatment of Ebola virus disease (EVD) in the current outbreak, predominantly affecting Guinea, Sierra Leone, and Liberia, was prioritized by the World Health Organization in September 2014. In each of these countries, nonrandomized comparative clinical trials were initiated. The Ebola-Tx trial in Conakry, Guinea, enrolled 102 patients by 7 July 2015; no severe adverse reactions were noted. The Ebola-CP trial in Sierra Leone and the EVD001 trial in Liberia have included few patients. Although no efficacy data are available yet, current field experience supports the safety, acceptability, and feasibility of CP as EVD treatment. Longer-term follow-up as well as data from nontrial settings and evidence on the scalability of the intervention are required. CP sourced from within the outbreak is the most readily available source of anti-EVD antibodies. Until the advent of effective antivirals or monoclonal antibodies, CP merits further evaluation. PMID:26261205
Evaluation of compliance with national legislation on emissions in Portugal.
Gomes, João F P
2005-04-01
More than 13 years after publication of the first air quality laws in Portugal and more than 10 years after the publication of the respective emission limits, it seems appropriate to analyze the degree of compliance by the Portuguese manufacturing industry. Using the data from emission measurements made regularly by the Instituto de Soldadura e Qualidade, the only officially accredited laboratory according to standard ISO 17025, I analyzed a set of approximately 400 sources in terms of compliance with the emission limits regarding total suspended particulates, sulfur dioxide, nitrogen oxides, and volatile organic compounds. I evaluated compliance through a nondimensional parameter and plotted it versus the emission flow rate to derive conclusions: the results indicate that emission limits are generally met for sulfur dioxide and nitrogen oxides but not for the other pollutants considered in this study. However, noncompliance occurs mainly at very low emission flow rates, which suggests some alterations to the emission limits, currently under revision. These alterations will include exempting minor sources from measurement.
NASA Technical Reports Server (NTRS)
Yee, H. C.; Shinn, J. L.
1986-01-01
Some numerical aspects of finite-difference algorithms for nonlinear multidimensional hyperbolic conservation laws with stiff nonhomogeneous (source) terms are discussed. If the stiffness is entirely dominated by the source term, a semi-implicit shock-capturing method is proposed, provided that the Jacobian of the source terms possesses certain properties. The proposed semi-implicit method can be viewed as a variant of the Bussing and Murman point-implicit scheme with a more appropriate numerical dissipation for the computation of strong shock waves. However, if the stiffness is not solely dominated by the source terms, a fully implicit method would be a better choice. The situation is complicated in problems of more than one dimension, and the presence of stiff source terms further complicates the solution procedures for alternating direction implicit (ADI) methods. Several alternatives are discussed. The primary motivation for constructing these schemes was to address thermally and chemically nonequilibrium flows in the hypersonic regime. Due to the unique structure of the eigenvalues and eigenvectors for fluid flows of this type, the computation can be simplified, thus providing a more efficient solution procedure than one might have anticipated.
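The point-implicit idea summarized above can be illustrated with a minimal sketch: a scalar stiff relaxation source S(u) = -k(u - u_eq) is advanced by solving (1 - dt dS/du) du = dt S(u^n) each step. This is a toy model under assumed values of k and dt, not the Bussing-Murman scheme or the proposed shock-capturing method itself.

```python
# Point-implicit update for a stiff relaxation source S(u) = -k*(u - u_eq):
# solve (1 - dt*dS/du) * du = dt*S(u^n) instead of taking an explicit
# Euler step. k, dt, and u_eq are illustrative values, not from the paper.

def point_implicit_step(u, dt, k=1.0e4, u_eq=1.0):
    S = -k * (u - u_eq)          # stiff source term at the old state
    dSdu = -k                    # source Jacobian (a scalar here)
    du = dt * S / (1.0 - dt * dSdu)
    return u + du

u = 0.0
dt = 1.0e-3                      # about 10x the explicit stability limit 1/k
for _ in range(100):
    u = point_implicit_step(u, dt)
print(u)                         # relaxes stably toward u_eq = 1.0
```

An explicit step at the same dt (dt*k = 10) would diverge; the implicit factor 1/(1 + dt*k) keeps the update stable, which is the essence of point-implicit treatment of stiff sources.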
Common Calibration Source for Monitoring Long-term Ozone Trends
NASA Technical Reports Server (NTRS)
Kowalewski, Matthew
2004-01-01
Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and the limited lifetimes of satellite monitoring instruments demand that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Monitoring Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.
Watershed nitrogen and phosphorus balance: The upper Potomac River basin
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jaworski, N.A.; Groffman, P.M.; Keller, A.A.
1992-01-01
Nitrogen and phosphorus mass balances were estimated for the portion of the Potomac River basin watershed located above Washington, D.C. The total nitrogen (N) balance included seven input source terms, six sinks, and one 'change-in-storage' term, but was simplified to five input terms and three output terms. The phosphorus (P) balance had four input and three output terms. The estimated balances are based on watershed data from seven information sources. Major sources of nitrogen are animal waste and atmospheric deposition. The major sources of phosphorus are animal waste and fertilizer. The major sink for nitrogen is combined denitrification, volatilization, and change-in-storage. The major sink for phosphorus is change-in-storage. River exports of N and P were 17% and 8%, respectively, of the total N and P inputs. Over 60% of the N and P were volatilized or stored. The major input and output terms on the budget are estimated from direct measurements, but the change-in-storage term is calculated by difference. The factors regulating retention and storage processes are discussed and research needs are identified.
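The budgeting logic described above (sum the inputs, subtract the measured outputs, attribute the residual to denitrification, volatilization, and change-in-storage) can be sketched in a few lines. All quantities below are hypothetical placeholders, chosen only so the river-export share comes out near the reported 17% for N; they are not the basin's data.

```python
# Illustrative nutrient mass balance: change-in-storage (plus other unmeasured
# sinks) is computed by difference between summed inputs and measured outputs.
# All numbers are hypothetical, not taken from the Potomac study.

inputs = {"animal_waste": 40.0, "atmospheric_deposition": 30.0,
          "fertilizer": 20.0, "fixation": 7.0, "point_sources": 3.0}  # kt N/yr
outputs = {"river_export": 17.0, "crop_harvest": 20.0}                # kt N/yr

total_in = sum(inputs.values())
total_out = sum(outputs.values())
residual = total_in - total_out   # denitrification + volatilization + storage
export_fraction = outputs["river_export"] / total_in

print(total_in, residual, export_fraction)  # 100.0 63.0 0.17
```

The residual term carries all the uncertainty of the directly measured terms, which is why the abstract flags change-in-storage as calculated by difference rather than observed.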
Arques-Orobon, Francisco Jose; Nuñez, Neftali; Vazquez, Manuel; Gonzalez-Posadas, Vicente
2016-01-01
This work analyzes the long-term functionality of HP (High-power) UV-LEDs (Ultraviolet Light Emitting Diodes) as the exciting light source in non-contact, continuous 24/7 real-time fluoro-sensing pollutant identification in inland water. Fluorescence is an effective alternative in the detection and identification of hydrocarbons. The HP UV-LEDs are more advantageous than classical light sources (xenon and mercury lamps) and help in the development of a low-cost, non-contact, and compact system for continuous real-time fieldwork. This work analyzes the wavelength, output optical power, and the effects of viscosity and temperature of the water pollutants, and the functional consistency for long-term HP UV-LED operation. To accomplish the latter, an analysis of the degradation of two types of 365 nm HP UV-LEDs under two continuous real-system working mode conditions was done by temperature Accelerated Life Tests (ALTs). These tests estimate a mean life of 6200 h under continuous working conditions and of 66,000 h under cycled working conditions (30 s ON & 30 s OFF), over 7 years of 24/7 operating life for hydrocarbon pollution monitoring. In addition, the durability against internal and external system parameter variations is evaluated. PMID:26927113
Wang, Xiaoli; Wang, Wei; Wang, Peng
2017-02-01
Schistosomiasis is a neglected tropical parasitic disease of great public health significance worldwide. Currently, mass drug administration with praziquantel remains the major strategy for global schistosomiasis control programs. Since 2005, an integrated strategy with emphasis on infectious source control was implemented for the control of schistosomiasis japonica, a major public health concern in China, and pilot studies have demonstrated that such a strategy is effective in reducing the prevalence of Schistosoma japonicum infection in both humans and bovines. However, there is little knowledge on the long-term effectiveness of this integrated strategy for controlling schistosomiasis japonica. The aim of this study was to evaluate the long-term effectiveness of the integrated strategy for schistosomiasis control following its 10-year implementation, based on data from the national schistosomiasis control program released by the Ministry of Health, People's Republic of China. In 2014, there were 5 counties in which the transmission of schistosomiasis japonica had not been interrupted, a 95.2% reduction from 2005 (105 counties). The numbers of schistosomiasis patients and acute cases decreased by 85.5% and 99.7%, respectively, from 2005 (798,762 cases and 564 cases) to 2014 (115,614 cases and 2 cases), and the numbers of bovines and S. japonicum-infected bovines decreased by 47.9% and 98% from 2005 (1,764,472 bovines and 33,736 infected bovines) to 2014 (919,579 bovines and 666 infected bovines). During the 10-year implementation of the integrated strategy, however, there was only a minor fluctuation in the area of Oncomelania hupensis snail habitats, with a mere 5.6% reduction in the area of snail habitats in 2014 relative to 2005.
The results of the current study demonstrate that the 10-year implementation of the integrated strategy with emphasis on infectious source control has greatly reduced schistosomiasis-related morbidity in humans and bovines. It is concluded that the new integrated strategy has remarkable long-term effectiveness against the transmission of schistosomiasis japonica in China, which facilitates the shift of the national schistosomiasis control program from transmission control to transmission interruption and elimination. However, such a strategy seems to have little effect on shrinking the area of snail habitats.
Addison, P F E; Flander, L B; Cook, C N
2015-02-01
Protected area management agencies are increasingly using management effectiveness evaluation (MEE) to better understand, learn from and improve conservation efforts around the globe. Outcome assessment is the final stage of MEE, where conservation outcomes are measured to determine whether management objectives are being achieved. When quantitative monitoring data are available, best-practice examples of outcome assessments demonstrate that data should be assessed against quantitative condition categories. Such assessments enable more transparent and repeatable integration of monitoring data into MEE, which can promote evidence-based management and improve public accountability and reporting. We interviewed key informants from marine protected area (MPA) management agencies to investigate how scientific data sources, especially long-term biological monitoring data, are currently informing conservation management. Our study revealed that even when long-term monitoring results are available, management agencies are not using them for quantitative condition assessment in MEE. Instead, many agencies conduct qualitative condition assessments, where monitoring results are interpreted using expert judgment only. Whilst we found substantial evidence for the use of long-term monitoring data in the evidence-based management of MPAs, MEE is rarely the sole mechanism that facilitates the knowledge transfer of scientific evidence to management action. This suggests that the first goal of MEE (to enable environmental accountability and reporting) is being achieved, but the second and arguably more important goal of facilitating evidence-based management is not. Given that many MEE approaches are in their infancy, recommendations are made to assist management agencies realize the full potential of long-term quantitative monitoring data for protected area evaluation and evidence-based management. Copyright © 2014 Elsevier Ltd. All rights reserved.
A structured approach to recording AIDS-defining illnesses in Kenya: A SNOMED CT based solution
Oluoch, Tom; de Keizer, Nicolette; Langat, Patrick; Alaska, Irene; Ochieng, Kenneth; Okeyo, Nicky; Kwaro, Daniel; Cornet, Ronald
2016-01-01
Introduction Several studies conducted in sub-Saharan Africa (SSA) have shown that routine clinical data in HIV clinics often have errors. Lack of structured and coded documentation of diagnoses of AIDS-defining illnesses (ADIs) can compromise data quality and decisions made on clinical care. Methods We used a structured framework to derive a reference set of concepts and terms used to describe ADIs. The four sources used were: (i) the CDC/Accenture list of opportunistic infections, (ii) SNOMED Clinical Terms (SNOMED CT), (iii) a Focus Group Discussion (FGD) among clinicians and nurses attending to patients at a referral provincial hospital in western Kenya, and (iv) chart abstraction from the Maternal Child Health (MCH) and HIV clinics at the same hospital. Using the January 2014 release of SNOMED CT, concepts were retrieved that matched terms abstracted from sources (iii) and (iv), and the content coverage was assessed. Post-coordination matching was applied when needed. Results The final reference set had 1054 unique ADI concepts, described by 1860 unique terms. Content coverage of SNOMED CT was high (99.9% with pre-coordinated concepts; 100% with post-coordination). The resulting reference set for ADIs was implemented as the interface terminology on OpenMRS data entry forms. Conclusion Different sources demonstrate complementarity in the collection of concepts and terms for an interface terminology. SNOMED CT provides high coverage in the domain of ADIs. Further work is needed to evaluate the effect of the interface terminology on data quality and quality of care. PMID:26184057
Near-term deployment of carbon capture and sequestration from biorefineries in the United States.
Sanchez, Daniel L; Johnson, Nils; McCoy, Sean T; Turner, Peter A; Mach, Katharine J
2018-05-08
Capture and permanent geologic sequestration of biogenic CO2 emissions may provide critical flexibility in ambitious climate change mitigation. However, most bioenergy with carbon capture and sequestration (BECCS) technologies are technically immature or commercially unavailable. Here, we evaluate low-cost, commercially ready CO2 capture opportunities for existing ethanol biorefineries in the United States. The analysis combines process engineering, spatial optimization, and lifecycle assessment to consider the technical, economic, and institutional feasibility of near-term carbon capture and sequestration (CCS). Our modeling framework evaluates least-cost source-sink relationships and aggregation opportunities for pipeline transport, which can cost-effectively transport small CO2 volumes to suitable sequestration sites; 216 existing US biorefineries emit 45 Mt CO2 annually from fermentation, of which 60% could be captured and compressed for pipeline transport for under $25/tCO2. A sequestration credit, analogous to existing CCS tax credits, of $60/tCO2 could incent 30 Mt of sequestration and 6,900 km of pipeline infrastructure across the United States. Similarly, a carbon abatement credit, analogous to existing tradeable CO2 credits, of $90/tCO2 can incent 38 Mt of abatement. Aggregation of CO2 sources enables cost-effective long-distance pipeline transport to distant sequestration sites. Financial incentives under the low-carbon fuel standard in California and recent revisions to existing federal tax credits suggest a substantial near-term opportunity to permanently sequester biogenic CO2. This financial opportunity could catalyze the growth of carbon capture, transport, and sequestration; improve the lifecycle impacts of conventional biofuels; support development of carbon-negative fuels; and help fulfill the mandates of low-carbon fuel policies across the United States. Copyright © 2018 the Author(s). Published by PNAS.
Public health interventions and behaviour change: reviewing the grey literature.
Franks, H; Hardiker, N R; McGrath, M; McQuarrie, C
2012-01-01
This study identified and reviewed grey literature relating to factors facilitating and inhibiting effective interventions in three areas: the promotion of mental health and well-being, the improvement of food and nutrition, and interventions seeking to increase engagement in physical activity. Sourcing, reviewing and analysis of relevant grey literature. Evidence was collected from a variety of non-traditional sources. Thirty-six pieces of documentary evidence across the three areas were selected for in-depth appraisal and review. A variety of approaches, often short-term, were used both as interventions and outcome measures. Interventions tended to have common outcomes, enabling the identification of themes. These included improvements in participant well-being as well as identification of barriers to, and promoters of, success. Most interventions demonstrated some positive impact, although some did not. This was particularly the case for more objective measures of change, such as physiological measurements, particularly when used to evaluate short-term interventions. Objective health measurement as part of an intervention may act as a catalyst for future behaviour change. Time is an important factor that could either promote or impede the success of interventions for both participants and facilitators. Likewise, the importance of involving all stakeholders, including participants, when planning health promoting interventions was established as an important indicator of success. Despite its limited scope, this review suggests that interventions can be more efficient and effective. For example, larger-scale, longer-term interventions could be more efficient, whilst outcomes relating to the implementation and beyond could provide a clearer picture of effectiveness. Additionally, interventions and evaluations must be flexible, evolve in partnership with local communities, and reflect local need and context. Copyright © 2011 The Royal Society for Public Health. 
Published by Elsevier Ltd. All rights reserved.
Development and Characterization of a Laser-Induced Acoustic Desorption Source.
Huang, Zhipeng; Ossenbrüggen, Tim; Rubinsky, Igor; Schust, Matthias; Horke, Daniel A; Küpper, Jochen
2018-03-20
A laser-induced acoustic desorption source, developed for use at central facilities such as free-electron lasers, is presented. It features prolonged measurement times and a fixed interaction point. A novel sample deposition method using aerosol spraying provides uniform sample coverage and hence stable signal intensity. Utilizing strong-field ionization as a universal detection scheme, the produced molecular plume is characterized in terms of number density, spatial extent, fragmentation, temporal distribution, translational velocity, and translational temperature. The effect of desorption laser intensity on these plume properties is evaluated. While the translational velocity is invariant for different desorption laser intensities, pointing to a nonthermal desorption mechanism, the translational temperature increases significantly and higher fragmentation is observed with increased desorption laser fluence.
Thermal maturity of type II kerogen from the New Albany Shale assessed by 13C CP/MAS NMR
Werner-Zwanziger, U.; Lis, G.; Mastalerz, Maria; Schimmelmann, A.
2005-01-01
Thermal maturity of oil and gas source rocks is typically quantified in terms of vitrinite reflectance, which is based on optical properties of terrestrial woody remains. This study evaluates 13C CP/MAS NMR parameters in kerogen (i.e., the insoluble fraction of organic matter in sediments and sedimentary rocks) as proxies for thermal maturity in marine-derived source rocks where terrestrially derived vitrinite is often absent or sparse. In a suite of samples from the New Albany Shale (Middle Devonian to the Early Mississippian, Illinois Basin), the abundance of aromatic carbon in kerogen determined by 13C CP/MAS NMR correlates linearly with vitrinite reflectance. © 2004 Elsevier Inc. All rights reserved.
The myths of 'big data' in health care.
Jacofsky, D J
2017-12-01
'Big data' is a term for data sets that are so large or complex that traditional data processing applications are inadequate. Billions of dollars have been spent on attempts to build predictive tools from large sets of poorly controlled healthcare metadata. Companies often sell reports at a physician or facility level based on various flawed data sources, and comparative websites of 'publicly reported data' purport to educate the public. Physicians should be aware of concerns and pitfalls seen in such data definitions, data clarity, data relevance, data sources and data cleaning when evaluating analytic reports from metadata in health care. Cite this article: Bone Joint J 2017;99-B:1571-6. ©2017 The British Editorial Society of Bone & Joint Surgery.
A dosimetric comparison of 169Yb versus 192Ir for HDR prostate brachytherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lymperopoulou, G.; Papagiannis, P.; Sakelliou, L.
2005-12-15
For the purpose of evaluating the use of 169Yb for prostate High Dose Rate (HDR) brachytherapy, a hypothetical 169Yb source is assumed with the exact same design as the new microSelectron source, replacing the 192Ir active core with pure 169Yb metal. Monte Carlo simulation is employed for the full dosimetric characterization of both sources, and results are compared following the AAPM TG-43 dosimetric formalism. Monte Carlo calculated dosimetry results are incorporated in a commercially available treatment planning system (SWIFT(TM)), which features an inverse treatment planning option based on a multiobjective dose optimization engine. The quality of prostate HDR brachytherapy using the real 192Ir and hypothetical 169Yb source is compared in a comprehensive analysis of different prostate implants in terms of the multiobjective dose optimization solutions as well as treatment quality indices such as Dose Volume Histograms (DVH) and the Conformal Index (COIN). Given that scattering overcompensates for absorption at intermediate photon energies and distances in the range of interest to prostate HDR brachytherapy, 169Yb proves at least equivalent to 192Ir irrespective of prostate volume. This has to be evaluated in view of the shielding requirements for the 169Yb energies, which are minimal relative to those for 192Ir.
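The TG-43 formalism mentioned above reduces, in its point-source approximation, to a product of air-kerma strength, dose-rate constant, an inverse-square geometry factor, a radial dose function, and an anisotropy factor. The sketch below shows that structure only; the numerical values (S_K, Lambda, and the polynomial radial dose function) are made-up illustrative inputs, not published 169Yb or 192Ir data.

```python
import numpy as np

# TG-43 point-source approximation for dose rate around a brachytherapy
# source: D(r) = S_K * Lambda * (r0/r)**2 * g(r) * phi_an(r).
# S_K, Lambda, and the g(r) coefficients are hypothetical placeholders.

def dose_rate(r_cm, S_K=40000.0, Lam=1.12, g_coeffs=(1.0, 0.05, -0.01)):
    r0 = 1.0                                     # TG-43 reference distance, cm
    g = np.polyval(g_coeffs[::-1], r_cm - r0)    # toy radial dose function, g(r0)=1
    phi_an = 1.0                                 # anisotropy factor (isotropic here)
    return S_K * Lam * (r0 / r_cm) ** 2 * g * phi_an

print(dose_rate(1.0))   # at r0 this is just S_K * Lambda = 44800.0
```

Comparing two radionuclides in this framework amounts to swapping in each source's own dose-rate constant, radial dose function, and anisotropy data.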
NSRD-10: Leak Path Factor Guidance Using MELCOR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Humphries, Larry L.
Estimates of the source term from a U.S. Department of Energy (DOE) nuclear facility require that the analysts know how to apply the simulation tools used, such as the MELCOR code, particularly for a complicated facility that may include an air ventilation system and other active systems that can influence the environmental pathway of the materials released. DOE has designated MELCOR 1.8.5, an unsupported version, as a DOE ToolBox code in its Central Registry, which includes a leak-path-factor guidance report written in 2004 that did not include experimental validation data. To continue to use this MELCOR version requires additional verification and validation, which may not be feasible from a project cost standpoint. Instead, the recent MELCOR should be used. Without developer support and experimental data validation, it is difficult to convince regulators that the calculated source term from the DOE facility is accurate and defensible. This research replaces the obsolete version in the 2004 DOE leak path factor guidance report by using MELCOR 2.1 (the latest version of MELCOR, with continuing modeling development and user support) and by including applicable experimental data from the reactor safety arena and from DOE-HDBK-3010. This research provides best-practice values used in MELCOR 2.1 specifically for the leak path determination. With these enhancements, the revised leak-path-guidance report should provide confidence to the DOE safety analyst who would be using MELCOR as a source-term determination tool for mitigated accident evaluations.
NASA Astrophysics Data System (ADS)
Tichý, Ondřej; Šmídl, Václav; Hofman, Radek; Šindelářová, Kateřina; Hýža, Miroslav; Stohl, Andreas
2017-10-01
In the fall of 2011, iodine-131 (131I) was detected at several radionuclide monitoring stations in central Europe. After investigation, the International Atomic Energy Agency (IAEA) was informed by Hungarian authorities that 131I was released from the Institute of Isotopes Ltd. in Budapest, Hungary. It was reported that a total activity of 342 GBq of 131I was emitted between 8 September and 16 November 2011. In this study, we use the ambient concentration measurements of 131I to determine the location of the release as well as its magnitude and temporal variation. As the location of the release and an estimate of the source strength became eventually known, this accident represents a realistic test case for inversion models. For our source reconstruction, we use no prior knowledge. Instead, we estimate the source location and emission variation using only the available 131I measurements. Subsequently, we use the partial information about the source term available from the Hungarian authorities for validation of our results. For the source determination, we first perform backward runs of atmospheric transport models and obtain source-receptor sensitivity (SRS) matrices for each grid cell of our study domain. We use two dispersion models, FLEXPART and Hysplit, driven with meteorological analysis data from the global forecast system (GFS) and from European Centre for Medium-range Weather Forecasts (ECMWF) weather forecast models. Second, we use a recently developed inverse method, least-squares with adaptive prior covariance (LS-APC), to determine the 131I emissions and their temporal variation from the measurements and computed SRS matrices. For each grid cell of our simulation domain, we evaluate the probability that the release was generated in that cell using Bayesian model selection. The model selection procedure also provides information about the most suitable dispersion model for the source term reconstruction. 
Third, we select the most probable location of the release with its associated source term and perform a forward model simulation to study the consequences of the iodine release. Results of these procedures are compared with the known release location and reported information about its time variation. We find that our algorithm could successfully locate the actual release site. The estimated release period is also in agreement with the values reported by IAEA, and the reported total released activity of 342 GBq is within the 99% confidence interval of the posterior distribution of our most likely model.
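The linear relationship at the heart of such inversions, measurements y related to time-resolved emissions x through a source-receptor sensitivity (SRS) matrix M as y = Mx + noise, can be sketched with a toy regularized least-squares solve. The SRS matrix and release profile below are synthetic, and plain Tikhonov regularization stands in for the LS-APC method actually used in the study.

```python
import numpy as np

# Toy linear source-term inversion: recover a time-resolved emission vector x
# from measurements y = M x + noise, where M is a synthetic SRS matrix.
# Tikhonov regularization is a simple stand-in for LS-APC.

rng = np.random.default_rng(0)
n_obs, n_times = 50, 10
M = rng.uniform(0.0, 1.0, (n_obs, n_times))                # synthetic SRS matrix
x_true = np.zeros(n_times); x_true[3:6] = [2.0, 5.0, 1.0]  # release pulse
y = M @ x_true + rng.normal(0.0, 0.01, n_obs)              # noisy measurements

lam = 1e-2                                 # regularization strength (assumed)
A = M.T @ M + lam * np.eye(n_times)
x_est = np.linalg.solve(A, M.T @ y)        # regularized least-squares estimate

print(np.round(x_est, 2))                  # pulse recovered near indices 3-5
```

In the real problem M comes from backward dispersion-model runs per candidate grid cell, and the Bayesian machinery additionally scores how well each candidate location explains the data.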
Efficient Development of High Fidelity Structured Volume Grids for Hypersonic Flow Simulations
NASA Technical Reports Server (NTRS)
Alter, Stephen J.
2003-01-01
A new technique for the control of grid line spacing and intersection angles of a structured volume grid, using elliptic partial differential equations (PDEs) is presented. Existing structured grid generation algorithms make use of source term hybridization to provide control of grid lines, imposing orthogonality implicitly at the boundary and explicitly on the interior of the domain. A bridging function between the two types of grid line control is typically used to blend the different orthogonality formulations. It is shown that utilizing such a bridging function with source term hybridization can result in the excessive use of computational resources and diminishes robustness. A new approach, Anisotropic Lagrange Based Trans-Finite Interpolation (ALBTFI), is offered as a replacement to source term hybridization. The ALBTFI technique captures the essence of the desired grid controls while improving the convergence rate of the elliptic PDEs when compared with source term hybridization. Grid generation on a blunt cone and a Shuttle Orbiter is used to demonstrate and assess the ALBTFI technique, which is shown to be as much as 50% faster, more robust, and produces higher quality grids than source term hybridization.
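The flavor of combining an interpolation-based initial grid with elliptic smoothing can be sketched in a few lines. This is generic transfinite interpolation followed by Jacobi-style Laplace smoothing over assumed boundary curves; it is not the ALBTFI algorithm or its grid-control source terms.

```python
import numpy as np

# Transfinite interpolation (TFI) of four boundary curves into a structured
# 2D grid, followed by a few Laplace smoothing sweeps on the interior.
# Boundary curves below are illustrative (a unit square with a bumped top).

def tfi(bottom, top, left, right):
    """Blend four boundary curves (arrays of (x, y) points) into a grid."""
    ni, nj = left.shape[0], bottom.shape[0]
    s = np.linspace(0, 1, ni)[:, None, None]   # parameter along left/right
    t = np.linspace(0, 1, nj)[None, :, None]   # parameter along bottom/top
    return ((1 - s) * bottom[None, :] + s * top[None, :]
            + (1 - t) * left[:, None] + t * right[:, None]
            - (1 - s) * (1 - t) * bottom[0] - (1 - s) * t * bottom[-1]
            - s * (1 - t) * top[0] - s * t * top[-1])

ni = nj = 11
u = np.linspace(0, 1, nj)
bottom = np.stack([u, np.zeros(nj)], axis=1)
top = np.stack([u, 1.0 + 0.1 * np.sin(np.pi * u)], axis=1)   # bumped top edge
left = np.stack([np.zeros(ni), np.linspace(0, 1, ni)], axis=1)
right = np.stack([np.ones(ni), np.linspace(0, 1, ni)], axis=1)

grid = tfi(bottom, top, left, right)
for _ in range(50):   # Laplace smoothing of interior points; boundaries fixed
    grid[1:-1, 1:-1] = 0.25 * (grid[2:, 1:-1] + grid[:-2, 1:-1]
                               + grid[1:-1, 2:] + grid[1:-1, :-2])
print(grid.shape)     # (11, 11, 2): an ni x nj grid of (x, y) points
```

Elliptic grid generators add source terms to these Laplace equations to control spacing and intersection angles near boundaries; the point of ALBTFI is to capture that control without the cost of hybridized source terms.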
Biochemical Changes after Short-term Oral Exposure of Jatropha curcas Seeds in Wistar Rats
Awasthy, Vijeyta; Vadlamudi, V. P.; Koley, K. M.; Awasthy, B. K.; Singh, P. K.
2010-01-01
Jatropha curcas (Euphorbiaceae) is a multipurpose shrub with varied medicinal uses and is of significant economic importance. In addition to being a source of biodiesel, its seeds are also considered highly nutritious and could be exploited as a rich and economical protein supplement in animal feeds. However, the inherent phytotoxins present in the seed are the hindrance. The toxic nature of the seeds of the local variety of J. curcas is not known. Therefore, investigations were undertaken to evaluate the short-term oral toxicity of the seeds of locally grown J. curcas. Short-term toxicity was assessed in rats by daily feeding of the basal diet (Group I), and of diets in which the crude protein requirement was supplemented at 25% (Group II) and 50% (Group III) levels through Jatropha seed powder. The adverse effects of Jatropha seed protein supplementation (JSPS) were evaluated by observing alterations in biochemical profiles. The biochemical profile of rats fed diets with JSPS at both levels revealed significant reduction in plasma glucose and total protein and increases in plasma creatinine, transaminases (plasma glutamic pyruvic transaminase and plasma glutamic oxaloacetic transaminase), and alkaline phosphatase. PMID:21170248
Antineutrino analysis for continuous monitoring of nuclear reactors: Sensitivity study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Christopher; Erickson, Anna
This paper explores the various contributors to uncertainty on predictions of the antineutrino source term, which is used for reactor antineutrino experiments and is proposed as a safeguard mechanism for future reactor installations. The errors introduced during simulation of the reactor burnup cycle by variation in nuclear reaction cross sections, operating power, and other factors are combined with those from experimental and predicted antineutrino yields from fission, and the results are evaluated and compared. The most significant contributor to uncertainty on the reactor antineutrino source term, when the reactor was modeled in 3D fidelity with assembly-level heterogeneity, was found to be the uncertainty on the antineutrino yields. Using the reactor simulation uncertainty data, the dedicated observation of a rigorously modeled small, fast reactor by a few-ton near-field detector was estimated to offer reduction of uncertainty on antineutrino yields in the 3.0-6.5 MeV range to a few percent for the primary power-producing fuel isotopes, even with zero prior knowledge of the yields.
NSRD-15: Computational Capability to Substantiate DOE-HDBK-3010 Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Bignell, John; Dingreville, Remi Philippe Michel
Safety basis analysts throughout the U.S. Department of Energy (DOE) complex rely heavily on the information provided in the DOE Handbook, DOE-HDBK-3010, Airborne Release Fractions/Rates and Respirable Fractions for Nonreactor Nuclear Facilities, to determine radionuclide source terms from postulated accident scenarios. In calculating source terms, analysts tend to use the DOE Handbook's bounding values on airborne release fractions (ARFs) and respirable fractions (RFs) for various categories of insults (representing potential accident release categories). This is typically due to both time constraints and the avoidance of regulatory critique. Unfortunately, these bounding ARFs/RFs represent extremely conservative values. Moreover, they were derived from very limited small-scale bench/laboratory experiments and/or from engineering judgment. Thus, the basis for the data may not be representative of the actual unique accident conditions and configurations being evaluated. The goal of this research is to develop a more accurate and defensible method to determine bounding values for the DOE Handbook using state-of-the-art multi-physics-based computer codes.
System alignment using the Talbot effect
NASA Astrophysics Data System (ADS)
Chevallier, Raymond; Le Falher, Eric; Heggarty, Kevin
1990-08-01
The Talbot effect is utilized to correct an alignment problem in a neural network used for image recognition, which required the alignment of a spatial light modulator (SLM) with the input module. A mathematical model employing Fresnel diffraction theory is presented to describe the method. The calculation of the diffracted amplitude describes the wavefront sphericity and the original object transmittance function in order to quantify the lateral shift of the Talbot image. Another explanation is set forth in terms of plane-wave illumination in the neural network. Using a Fourier series and by describing planes where all the harmonics are in phase, the reconstruction of Talbot images is explained. The alignment is effective when the lenslet array is aligned on the even Talbot images of the SLM pixels and the incident wave is a plane wave. The alignment is evaluated in terms of source and periodicity errors, tilt of the incident plane waves, and finite object dimensions. The effects of the error sources are concluded to be negligible, the lenslet array is shown to be successfully aligned with the SLM, and other alignment applications are shown to be possible.
PuLSE: Quality control and quantification of peptide sequences explored by phage display libraries.
Shave, Steven; Mann, Stefan; Koszela, Joanna; Kerr, Alastair; Auer, Manfred
2018-01-01
The design of highly diverse phage display libraries is based on the assumption that DNA bases are incorporated at similar rates within the randomized sequence. As library complexity increases and the expected copy numbers of unique sequences decrease, the exploration of library space becomes sparser and the presence of truly random sequences becomes critical. We present the program PuLSE (Phage Library Sequence Evaluation) as a tool for assessing the randomness, and therefore diversity, of phage display libraries. PuLSE runs on a collection of sequence reads in the fastq file format and generates tables profiling the library in terms of unique DNA sequence counts and positions, translated peptide sequences, and normalized 'expected' occurrences from base to residue codon frequencies. The output allows at-a-glance quantitative quality control of a phage library in terms of sequence coverage, both at the DNA base and translated protein residue level, which has been missing from toolsets and literature. The open source program PuLSE is available in two formats: a C++ source code package for compilation and integration into existing bioinformatics pipelines, and precompiled binaries for ease of use.
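The kind of randomness check PuLSE performs can be illustrated with a minimal sketch that tallies per-position base fractions over the randomized region of each read and compares them against the uniform 0.25 expected of an unbiased library (the function names and toy reads below are illustrative, not PuLSE's actual code):

```python
from collections import Counter

def base_frequencies(reads, start, length):
    """Per-position base counts over the randomized region of each read.

    reads  : iterable of DNA strings (sequence lines from a fastq file)
    start  : 0-based offset of the randomized region
    length : number of randomized positions
    """
    counts = [Counter() for _ in range(length)]
    for seq in reads:
        region = seq[start:start + length]
        if len(region) == length:
            for i, base in enumerate(region):
                counts[i][base] += 1
    return counts

def randomness_report(counts):
    """Observed base fractions per position; an unbiased library should
    show roughly 0.25 for each of A, C, G, T at every position."""
    report = []
    for c in counts:
        total = sum(c.values()) or 1
        report.append({b: c[b] / total for b in "ACGT"})
    return report

# toy reads: fixed "AAA" prefix followed by a 4-base randomized region
reads = ["AAACGTAC", "AAATGCAC", "AAAGGTTC", "AAACGCAC"]
counts = base_frequencies(reads, start=3, length=4)
rep = randomness_report(counts)
# position 0 of the randomized region holds C, T, G, C -> C appears twice
print(rep[0]["C"])  # 0.5
```

A real pipeline would read the sequence lines out of a fastq file and also translate the region to tally residue-level frequencies, as the abstract describes.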
Time-frequency approach to underdetermined blind source separation.
Xie, Shengli; Yang, Liu; Yang, Jun-Mei; Zhou, Guoxu; Xiang, Yong
2012-02-01
This paper presents a new time-frequency (TF) underdetermined blind source separation approach based on the Wigner-Ville distribution (WVD) and the Khatri-Rao product to separate N non-stationary sources from M (M < N) mixtures. First, an improved method is proposed for estimating the mixing matrix, in which the negative values of the auto WVDs of the sources are fully considered. Then, after extracting all the auto-term TF points, the auto WVD value of the sources at every auto-term TF point can be determined exactly with the proposed approach, regardless of how many active sources there are, as long as N ≤ 2M−1. Further discussion of the extraction of auto-term TF points is provided, and numerical simulation results are presented to show the superiority of the proposed algorithm over existing ones.
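A naive discrete Wigner-Ville distribution, the building block of such TF methods, can be sketched as follows (an O(N²) pseudo-WVD for an analytic input signal; this is a generic illustration, not the authors' algorithm):

```python
import numpy as np

def wigner_ville(x):
    """Naive discrete (pseudo) Wigner-Ville distribution.
    Row n holds the spectrum at time n; frequency bin m corresponds to
    normalized frequency m / (2 * N) because the WVD doubles frequencies."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        kmax = min(n, N - 1 - n)          # lags that stay inside the signal
        r = np.zeros(N, dtype=complex)    # instantaneous autocorrelation
        for k in range(-kmax, kmax + 1):
            r[k % N] = x[n + k] * np.conj(x[n - k])
        # r is conjugate-symmetric in the lag, so its FFT is real
        W[n] = np.fft.fft(r).real
    return W

# an analytic tone at normalized frequency 0.25 concentrates along one bin
t = np.arange(64)
W = wigner_ville(np.exp(2j * np.pi * 0.25 * t))
print(int(np.argmax(W[32])))  # 32, i.e. 32 / (2 * 64) = 0.25
```

For multicomponent signals this raw WVD also produces cross terms, which is exactly why the paper's auto-term TF point extraction step is needed.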
Some implementational issues of convection schemes for finite volume formulations
NASA Technical Reports Server (NTRS)
Thakur, Siddharth; Shyy, Wei
1993-01-01
Two higher-order upwind schemes - second-order upwind and QUICK - are examined in terms of their interpretation, implementation as well as performance for a recirculating flow in a lid-driven cavity, in the context of a control volume formulation using the SIMPLE algorithm. The present formulation of these schemes is based on a unified framework wherein the first-order upwind scheme is chosen as the basis, with the remaining terms being assigned to the source term. The performance of these schemes is contrasted with the first-order upwind and second-order central difference schemes. Also addressed in this study is the issue of boundary treatment associated with these higher-order upwind schemes. Two different boundary treatments - one that uses a two-point scheme consistently within a given control volume at the boundary, and the other that maintains consistency of flux across the interior face between the adjacent control volumes - are formulated and evaluated.
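The unified framework described above, with first-order upwind as the implicit basis and the higher-order remainder assigned to the source term (a deferred-correction split), can be sketched in 1D for a uniform grid and positive velocity (illustrative only; interior faces only, with the boundary treatment the paper discusses omitted):

```python
import numpy as np

def quick_face(phi, i):
    # QUICK interpolation at face i+1/2 for flow in the +x direction:
    # phi_f = 6/8 * phi_C + 3/8 * phi_D - 1/8 * phi_U
    return 0.75 * phi[i] + 0.375 * phi[i + 1] - 0.125 * phi[i - 1]

def upwind_face(phi, i):
    # first-order upwind (donor-cell) face value for u > 0
    return phi[i]

def deferred_correction_fluxes(phi):
    """Split each face value into the first-order upwind basis (treated
    implicitly in a SIMPLE-type solver) plus an explicit source holding
    the higher-order remainder. Boundary faces are left untreated here."""
    n = len(phi)
    basis, source = np.zeros(n - 1), np.zeros(n - 1)
    for i in range(1, n - 2):
        basis[i] = upwind_face(phi, i)
        source[i] = quick_face(phi, i) - basis[i]
    return basis, source

phi = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
basis, src = deferred_correction_fluxes(phi)
# on linear data QUICK is exact at the face midpoint: basis + source = 2.5
print(basis[2] + src[2])  # 2.5
```

The split leaves the implicit matrix diagonally dominant (only the upwind part enters it) while the converged solution still carries the higher-order accuracy.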
NASA Technical Reports Server (NTRS)
Byrne, K. P.; Marshall, S. E.
1983-01-01
A procedure for experimentally determining, in terms of the particle motions, the shapes of the low order acoustic modes in enclosures is described. The procedure is based on finding differentiable functions which approximate the shape functions of the low order acoustic modes when these modes are defined in terms of the acoustic pressure. The differentiable approximating functions are formed from polynomials which are fitted by a least squares procedure to experimentally determined values which define the shapes of the low order acoustic modes in terms of the acoustic pressure. These experimentally determined values are found by a conventional technique in which the transfer functions, which relate the acoustic pressures at an array of points in the enclosure to the volume velocity of a fixed point source, are measured. The gradient of the function which approximates the shape of a particular mode in terms of the acoustic pressure is evaluated to give the mode shape in terms of the particle motion. The procedure was tested by using it to experimentally determine the shapes of the low order acoustic modes in a small rectangular enclosure.
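The core of the procedure, a least-squares polynomial fit to measured pressures followed by differentiation to recover the particle-motion mode shape, can be sketched in one dimension (synthetic pressures stand in for measurements; this is not the authors' code):

```python
import numpy as np

# measurement positions and "measured" pressures for the first axial
# mode of a rigid-walled 1D duct (cosine pressure shape)
x = np.linspace(0.0, 1.0, 9)
p = np.cos(np.pi * x)

coeffs = np.polyfit(x, p, deg=4)   # least-squares polynomial approximation
dp_dx = np.polyder(coeffs)         # differentiate the fitted polynomial

# particle motion is proportional to -grad(p): it follows sin(pi*x),
# vanishing at the rigid ends and peaking at the duct centre
motion = -np.polyval(dp_dx, x)
print(motion[4])
```

At the duct centre the recovered motion shape is close to the analytic value π, illustrating how differentiating a smooth fit converts a pressure mode shape into a particle-motion mode shape.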
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 15 2010-04-01 2010-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 26 Internal Revenue 15 2011-04-01 2011-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 26 Internal Revenue 15 2012-04-01 2012-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 26 Internal Revenue 15 2014-04-01 2014-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 26 Internal Revenue 15 2013-04-01 2013-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
Asymptotic expansions of the kernel functions for line formation with continuous absorption
NASA Technical Reports Server (NTRS)
Hummer, D. G.
1991-01-01
Asymptotic expressions are obtained for the kernel functions M2(tau, a, beta) and K2(tau, a, beta) appearing in the theory of line formation with complete redistribution over a Voigt profile with damping parameter a, in the presence of a source of continuous opacity parameterized by beta. For a greater than 0, each coefficient in the asymptotic series is expressed as the product of analytic functions of a and beta. For Doppler broadening, only the leading term can be evaluated analytically.
Studies of the use of heat from high temperature nuclear sources for hydrogen production processes
NASA Technical Reports Server (NTRS)
Farbman, G. H.
1976-01-01
Future uses of hydrogen and hydrogen production processes that can meet the demand for hydrogen in the coming decades were considered. To do this, a projection was made of the market for hydrogen through the year 2000. Four hydrogen production processes were selected from among water electrolysis, fossil-based, and thermochemical water decomposition systems and evaluated, using a consistent set of ground rules, in terms of relative performance, economics, resource requirements, and technology status.
Evaluating Long-Term Impacts of Soil-Mixing Source-Zone Treatment using Cryogenic Core Collection
2017-06-01
Difficulties were encountered due to (a) coring equipment freezing downhole, (b) freezing or binding of the core sample in the barrel, and (c) running out of LN in the vicinity of sampling.
2010-07-01
the ground source heat pump system . During installation, construction equipment would remove vegetation from the surface and disturb soil to a depth...levels of 50 to 55 dBA or higher on a daily basis. Studies specifically conducted to determine noise effects on various human activities show that about...needs to be evaluated for its potential effects on a project site and adjacent land uses. The foremost factor affecting a proposed action in terms of
2014-03-01
sources. 15. SUBJECT TERMS Operation Tomodachi, Radiation Dose, Department of Defense, Japan, Fukushima , Earthquake, Tsunami, Cosmic Radiation 16...were reported along with data collected after the releases from the Fukushima Daiichi Nuclear Power Station (FDNPS) began contributing to the...Araki, S.; Ohta, Y.; Ikeuchi, Y.; 2012. “Changes of Radionuclides in the Environment in Chiba, Japan, after the Fukushima Nuclear Power Plant Accident
Validating the Usefulness of Combined Japanese GMS Data For Long-Term Global Change Studies
NASA Technical Reports Server (NTRS)
Simpson, James J.; Dodge, James C. (Technical Monitor)
2001-01-01
The primary objectives of the Geostationary Meteorological Satellite (GMS)-5 Pathfinder Project were the following: (1) to evaluate GMS-5 data for sources of error and develop methods for minimizing any such errors; (2) to prepare a GMS-5 Pathfinder data set for the GMS-5 Pathfinder Benchmark Period (1 July 95 - 30 June 96); and (3) to show the usefulness of the improved Pathfinder data set in at least one geophysical application. All objectives were met.
Development of axisymmetric lattice Boltzmann flux solver for complex multiphase flows
NASA Astrophysics Data System (ADS)
Wang, Yan; Shu, Chang; Yang, Li-Ming; Yuan, Hai-Zhuan
2018-05-01
This paper presents an axisymmetric lattice Boltzmann flux solver (LBFS) for simulating axisymmetric multiphase flows. In the solver, the two-dimensional (2D) multiphase LBFS is applied to reconstruct macroscopic fluxes excluding axisymmetric effects. Source terms accounting for axisymmetric effects are introduced directly into the governing equations. As compared to conventional axisymmetric multiphase lattice Boltzmann (LB) method, the present solver has the kinetic feature for flux evaluation and avoids complex derivations of external forcing terms. In addition, the present solver also saves considerable computational efforts in comparison with three-dimensional (3D) computations. The capability of the proposed solver in simulating complex multiphase flows is demonstrated by studying single bubble rising in a circular tube. The obtained results compare well with the published data.
12 CFR 201.4 - Availability and terms of credit.
Code of Federal Regulations, 2014 CFR
2014-01-01
... overnight, as a backup source of funding to a depository institution that is in generally sound financial... to a few weeks as a backup source of funding to a depository institution if, in the judgment of the... very short-term basis, usually overnight, as a backup source of funding to a depository institution...
A Systematic Review of Chronic Fatigue Syndrome: Don't Assume It's Depression
Griffith, James P.; Zarrouf, Fahd A.
2008-01-01
Objective: Chronic fatigue syndrome (CFS) is characterized by profound, debilitating fatigue and a combination of several other symptoms resulting in substantial reduction in occupational, personal, social, and educational status. CFS is often misdiagnosed as depression. The objective of this study was to evaluate and discuss different etiologies, approaches, and management strategies of CFS and to present ways to differentiate it from the fatigue symptom of depression. Data Sources: A MEDLINE search was conducted to identify existing information about CFS and depression using the headings chronic fatigue syndrome AND depression. The alternative terms major depressive disorder and mood disorder were also searched in conjunction with the term chronic fatigue syndrome. Additionally, MEDLINE was searched using the term chronic fatigue. All searches were limited to articles published within the last 10 years, in English. A total of 302 articles were identified by these searches. Also, the term chronic fatigue syndrome was searched by itself. This search was limited to articles published within the last 5 years, in English, and resulted in an additional 460 articles. Additional publications were identified by manually searching the reference lists of the articles from both searches. Study Selection and Data Extraction: CFS definitions, etiologies, differential diagnoses (especially depression) and management strategies were extracted, reviewed, and summarized to meet the objectives of this article. Data Synthesis: CFS is underdiagnosed in more than 80% of the people who have it; at the same time, it is often misdiagnosed as depression. Genetic, immunologic, infectious, metabolic, and neurologic etiologies were suggested to explain CFS. A biopsychosocial model was suggested for evaluating, managing, and differentiating CFS from depression. 
Conclusions: Evaluating and managing chronic fatigue is challenging for physicians, just as it is a difficult condition for patients. A biopsychosocial approach to evaluation and management is recommended. More studies of CFS manifestations, evaluation, and management are needed. PMID:18458765
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ford, S R; Dreger, D S; Phillips, W S
2008-07-16
Inversions for regional attenuation (1/Q) of Lg are performed in two different regions. The path attenuation component of the Lg spectrum is isolated using the coda-source normalization method, which corrects the Lg spectral amplitude for the source using the stable, coda-derived source spectra. Tomographic images of Northern California agree well with one-dimensional (1-D) Lg Q estimated from five different methods. We note there is some tendency for tomographic smoothing to increase Q relative to targeted 1-D methods. For example, in the San Francisco Bay Area, which has high attenuation relative to the rest of its region, Q is overestimated by approximately 30. Coda-source normalized attenuation tomography is also carried out for the Yellow Sea/Korean Peninsula (YSKP), where output parameters (site, source, and path terms) are compared with those from the amplitude tomography method of Phillips et al. (2005) as well as a new method that ties the source term to the MDAC formulation (Walter and Taylor, 2001). The source terms show similar scatter between the coda-source corrected and MDAC source perturbation methods, whereas the amplitude method has the greatest correlation with estimated true source magnitude. The coda-source better represents the source spectra compared to the estimated magnitude and could be the cause of the scatter. The similarity in the source terms between the coda-source and MDAC-linked methods shows that the latter method may approximate the effect of the former, and therefore could be useful in regions without coda-derived sources. The site terms from the MDAC-linked method correlate slightly with global Vs30 measurements. While the coda-source and amplitude-ratio methods do not correlate with Vs30 measurements, they do correlate with one another, which provides confidence that the two methods are consistent.
The path Q^-1 values are very similar between the coda-source and amplitude-ratio methods, except for small differences in the Daxing'anling Mountains in the northern YSKP. However, there is one large difference between the MDAC-linked method and the others in the region near stations TJN and INCN, which points to site effect as the cause of the difference.
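The coda-source normalization idea can be illustrated with a synthetic sketch: after dividing the Lg spectrum by a coda-derived source spectrum and a geometric spreading term, the log-spectral slope in frequency yields the path Q (all values below are invented for illustration, not the study's data):

```python
import numpy as np

# synthetic Lg spectrum built from a known source spectrum, geometric
# spreading, and a known path Q
f = np.linspace(0.5, 8.0, 16)          # frequency samples (Hz)
R, v, Q_true = 300e3, 3.5e3, 400.0     # path length (m), Lg velocity (m/s), Q
source = 1.0 / (1.0 + (f / 2.0) ** 2)  # coda-derived source spectrum (assumed)
spreading = R ** -0.5                  # simple geometric spreading factor
amp = source * spreading * np.exp(-np.pi * f * R / (Q_true * v))

# coda-source normalization: divide out the source and spreading terms;
# the log-spectrum is then linear in f with slope -pi * R / (Q * v)
y = np.log(amp / (source * spreading))
slope = np.polyfit(f, y, 1)[0]
Q_est = -np.pi * R / (slope * v)
print(round(Q_est))  # 400
```

In practice the amplitudes are noisy and many paths are inverted jointly in a tomographic system, but the per-path relationship is the one shown.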
The role of public and private transfers in the cost-benefit analysis of mental health programs.
Brent, Robert J
2004-11-01
This paper revisits the issue of whether to include maintenance costs in an economic evaluation in mental health. The source of these maintenance costs may be public or private transfers. The issue is discussed in terms of a formal cost-benefit criterion. It is shown that, when transfers have productivity effects, income distribution is important, and one recognizes that public transfers have tax implications, transfers can have real resource effects and cannot be ignored. The criterion is then applied to an evaluation of three case management programs in California that sought to reduce the intensive hospitalization of the severely mentally ill. 2004 John Wiley & Sons, Ltd.
Wildlife habitat evaluation demonstration project. [Michigan
NASA Technical Reports Server (NTRS)
Burgoyne, G. E., Jr.; Visser, L. G.
1981-01-01
To support the deer range improvement project in Michigan, the capability of LANDSAT data to assess deer habitat in terms of areas and mixes of species and age classes of vegetation is being examined to determine whether such data could substitute for traditional cover type information sources. A second goal of the demonstration project is to determine whether LANDSAT data can be used to supplement and improve the information normally used for making deer habitat management decisions, either by providing vegetative cover information for private land or by providing information about the interspersion and juxtaposition of valuable vegetative cover types. The procedure to be used for evaluating LANDSAT data of the Lake County test site is described.
10 CFR 40.41 - Terms and conditions of licenses.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Terms and conditions of licenses. 40.41 Section 40.41 Energy NUCLEAR REGULATORY COMMISSION DOMESTIC LICENSING OF SOURCE MATERIAL Licenses § 40.41 Terms and... the regulations in this part shall confine his possession and use of source or byproduct material to...
NASA Astrophysics Data System (ADS)
Saheer, Sahana; Pathak, Amey; Mathew, Roxy; Ghosh, Subimal
2016-04-01
Simulation of the Indian Summer Monsoon (ISM), with its seasonal and subseasonal characteristics, is crucial for predictions/projections supporting sustainable agricultural planning and water resources management. The Climate Forecast System version 2 (CFSv2), the state-of-the-art coupled climate model developed by the National Centers for Environmental Prediction (NCEP), is evaluated here for its simulations of the ISM. Even though CFSv2 is a fully coupled ocean-atmosphere-land model with advanced physics, increased resolution, and refined initialization, its ISM simulations/predictions/projections, in terms of seasonal mean and variability, are not satisfactory. Numerous studies have verified the CFSv2 forecasts in terms of the seasonal mean and its variability, active and break spells, and El Nino Southern Oscillation (ENSO)-monsoon interactions. Underestimation of JJAS precipitation over the Indian land mass is one of the major drawbacks of CFSv2. The ISM draws the moisture required to maintain its precipitation from different oceanic and land sources. In this work, we find the fraction of moisture supplied by different sources in the CFSv2 simulations and compare the findings with observed fractions. We also investigate the possible variations in the moisture contributions from these different sources. We suspect that the deviation in the relative moisture contribution from different sources to various sinks over the monsoon region has resulted in the observed dry bias. We also find that over the Arabian Sea region, which is the key moisture source of the ISM, there is a premature build-up of specific humidity during the month of May and a decline during the later months of JJAS. This is also one of the reasons for the underestimation of JJAS mean precipitation.
Santos, Sara A; da Silva, Pedro R; de Brito, Jorge
2017-08-04
This paper intends to evaluate the feasibility of reintroducing recycled concrete aggregates in the precast industry. The mechanical properties of self-compacting concrete (SCC) with incorporation of recycled aggregates (RA) (coarse recycled aggregates (CRA) and fine recycled aggregates (FRA)) from crushed precast elements were evaluated. The goal was to evaluate the ability of producing SCC with a minimum pre-established performance in terms of mechanical strength, incorporating variable ratios of RA (FRA/CRA%: 0/0%, 25/25%, 50/50%, 0/100% and 100/0%) produced from precast source concretes with similar target performances. This replication in SCC was made for two strength classes (45 MPa and 65 MPa), with the intention of obtaining as a final result concrete with recycled aggregates whose characteristics are compatible with those of a SCC with natural aggregates in terms of workability and mechanical strength. The results enabled conclusions to be established regarding the SCC produced with fine and coarse recycled aggregates from the precast industry, based on its mechanical properties. The properties studied are strongly affected by the type and content of recycled aggregates. The potential demonstrated, mainly in the hardened state, by the joint use of fine and coarse recycled aggregates is emphasized.
Gul, M; Kaynar, M
2017-03-01
Premature ejaculation is one of the most common male sexual dysfunctions; however, only a few patients with premature ejaculation seek professional help or advice. The Internet has become an important source of knowledge, and thus more patients are looking online for health information. To the best of our knowledge, no study has evaluated the content and quality of websites on premature ejaculation. We therefore aimed to evaluate the content and quality of currently available Internet-based information on premature ejaculation. A sample was obtained comprising the 50 top sites retrieved from the Google, Bing, and Yahoo search engines using the term 'premature ejaculation'. Each site was then reviewed against predefined evaluation criteria to determine the general quality, condition-specific content quality, popularity index, and ownership. The websites reviewed differed widely in quality and ownership. Only a few sites provided comprehensive and complete medical information on premature ejaculation. The online information available is often of uncertain calibre; men are therefore being exposed to information about premature ejaculation of highly variable quality. This fact should be considered by both health professionals and website owners, and better online resources should be provided for these patients. © 2016 Blackwell Verlag GmbH.
Thavorn, K; Coyle, D
2015-01-01
Background Liver fibrosis is characterized by a buildup of connective tissue due to chronic liver damage. Steatosis is the collection of excessive amounts of fat inside liver cells. Liver biopsy remains the gold standard for the diagnosis of liver fibrosis and steatosis, but its use as a diagnostic tool is limited by its invasive nature and high cost. Objectives To evaluate the cost-effectiveness and budget impact of transient elastography (TE) with and without controlled attenuation parameter (CAP) for the diagnosis of liver fibrosis or steatosis in patients with hepatitis B, hepatitis C, alcoholic liver disease, and nonalcoholic fatty liver disease. Data Sources An economic literature search was performed using computerized databases. For the primary economic and budget impact analyses, we obtained data from various sources, such as the Health Quality Ontario evidence-based analysis, published literature, and the Institute for Clinical Evaluative Sciences. Review Methods A systematic review of existing TE cost-effectiveness studies was conducted, and a primary economic evaluation was undertaken from the perspective of the Ontario Ministry of Health and Long-Term Care. Decision analytic models were used to compare the short-term costs and outcomes of TE with those of liver biopsy. Outcomes were expressed as incremental cost per correctly diagnosed case gained. A budget impact analysis was also conducted. Results We included 10 relevant studies that evaluated the cost-effectiveness of TE compared to other noninvasive tests and to liver biopsy; no cost-effectiveness studies of TE with CAP were identified. All studies showed that TE was less expensive but associated with a decrease in the number of correctly diagnosed cases. TE also improved quality-adjusted life-years in patients with hepatitis B and hepatitis C. Our primary economic analysis suggested that TE led to cost savings but was less effective than liver biopsy in the diagnosis of liver fibrosis.
TE became more economically attractive with a higher degree of liver fibrosis. TE with CAP was also less expensive and less accurate than liver biopsy. Limitations The model did not take into account long-term costs and consequences associated with TE and liver biopsy and did not include costs to patients and their families, or patient preferences related to diagnostic information. Conclusions TE showed potential cost savings compared to liver biopsy. Further investigation is needed to determine the long-term impacts of TE on morbidity and mortality in Canada and the optimal diagnostic modality for liver fibrosis and steatosis. PMID:26664666
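The outcome measure used in the primary analysis, incremental cost per correctly diagnosed case, reduces to a standard ICER calculation; a minimal sketch with hypothetical costs and accuracies (not the study's actual figures):

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit
    of effect (here, per additional correctly diagnosed case)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# hypothetical per-patient figures: biopsy costs more but diagnoses
# a larger fraction of cases correctly (values are invented)
cost_te, correct_te = 150.0, 0.85    # transient elastography
cost_bx, correct_bx = 1200.0, 0.97   # liver biopsy

print(round(icer(cost_bx, correct_bx, cost_te, correct_te), 2))  # 8750.0
```

In this toy comparison, switching from TE to biopsy buys each additional correct diagnosis at 8750 monetary units, which is the kind of trade-off the decision models in the study quantify.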
Does Leaders' Health (and Work-Related Experiences) Affect their Evaluation of Followers' Stress?
Giorgi, Gabriele; Mancuso, Serena; Fiz Perez, Francisco Javier; Montani, Francesco; Courcy, Francois; Arcangeli, Giulio
2015-09-01
Stressed workers suffer from severe health problems, which appear to have increased. Poor leadership in particular is considered a source of stress. Indeed, supervisors might perceive their subordinates to be similar to themselves as far as stress is concerned, and this might be more widespread in organizations than previously thought. The present research investigates the relationships between leaders' health, in terms of work-related stress, mental health, and workplace bullying, and their evaluation of subordinates' stress. Five regression models were formulated to test our hypothesis. This is a cross-sectional study among 261 Italian leaders, using supervisor self-assessment and leaders' assessments of their subordinates. Leaders' health was related to their evaluation of staff stress. Job demand, lack of job control, and lack of support by colleagues and supervisors, as evaluated in their subordinates, were particularly associated with the leaders' own health. Implications for developing healthy leaders are finally discussed.
Guidelines for conceptual design and evaluation of aquifer thermal energy storage
NASA Astrophysics Data System (ADS)
Meyer, C. F.; Hauz, W.
1980-10-01
Guidelines are presented for use as a tool by those considering application of aquifer thermal energy storage (ATES) technology. The guidelines assist utilities, municipalities, industries, and other entities in the conceptual design and evaluation of systems employing ATES. The potential benefits of ATES are described, an overview is presented of the technology and its applications, and rules of thumb are provided for quickly judging whether a proposed project has sufficient promise to warrant detailed conceptual design and evaluation. The characteristics of sources and end uses of heat and chill which are seasonally mismatched and may benefit from ATES are discussed. Storage and transport subsystems and their expected performance and cost are described. A methodology is presented for conceptual design of an ATES system and evaluation of its technical and economic feasibility in terms of energy conservation, cost savings, fuel substitution, improved dependability of supply, and abatement of pollution.
Does Leaders' Health (and Work-Related Experiences) Affect their Evaluation of Followers' Stress?
Giorgi, Gabriele; Mancuso, Serena; Fiz Perez, Francisco Javier; Montani, Francesco; Courcy, Francois; Arcangeli, Giulio
2015-01-01
Background Stressed workers suffer from severe health problems, which appear to be on the rise. Poor leadership in particular is considered a source of stress. Indeed, supervisors might perceive their subordinates to be similar to them as far as stress is concerned, and this might be more widespread in organizations than previously thought. Methods The present research investigates the relationships between leaders' health, in terms of work-related stress, mental health, and workplace bullying, and their evaluation of subordinates' stress. Five regression models were formulated to test our hypotheses. This is a cross-sectional study among 261 Italian leaders, using supervisor self-assessment and leaders' assessments of their subordinates. Results Leaders' health was related to their evaluation of staff stress. Job demand, lack of job control, and lack of support by colleagues and supervisors, as evaluated in their subordinates, were particularly associated with the leaders' own health. Conclusion Implications for developing healthy leaders are finally discussed. PMID:26929835
Schoenfeld, Elinor R; Hyman, Leslie; Simpson, Leslie Long; Michalowicz, Bryan; Reddy, Michael; Gelato, Marie; Hou, Wei; Engebretson, Steven P; Hytner, Catherine; Lenton, Pat
2014-01-01
Background Diabetes and its complications are a major United States public health concern. Methods The Diabetes and Periodontal Therapy Trial (DPTT) evaluated whether non-surgical treatment of periodontal disease influenced diabetes management among persons with Type 2 diabetes and periodontitis. The aim of this study was to evaluate DPTT's many recruitment strategies in terms of enrollment success. Results/Conclusion Targeted recruitment strategies were more effective in identifying individuals who met periodontal and diabetes eligibility criteria. Individuals eligible for a baseline visit/enrollment were more often male and Hispanic, less often African-American, and had a younger age at diabetes diagnosis and a longer diabetes duration. Tracking and evaluating recruitment sources during study enrollment optimized recruitment methods to enroll a diverse participant population based upon gender, race, and ethnicity. PMID:25574373
Enhancing GADRAS Source Term Inputs for Creation of Synthetic Spectra.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horne, Steven M.; Harding, Lee
The Gamma Detector Response and Analysis Software (GADRAS) team has enhanced the source term input for the creation of synthetic spectra. These enhancements include the following: allowing users to programmatically provide source information to GADRAS through memory, rather than through a string limited to 256 characters; allowing users to provide their own source decay database information; and updating the default GADRAS decay database to fix errors and include coincident gamma information.
Production and fuel characteristics of vegetable oil from oilseed crops in the Pacific Northwest
DOE Office of Scientific and Technical Information (OSTI.GOV)
Auld, D.L.; Bettis, B.L.; Peterson, C.L.
1982-01-01
The purpose of this research was to evaluate the potential yield and fuel quality of various oilseed crops adapted to the Pacific Northwest as a source of liquid fuel for diesel engines. The seed yield and oil production of three cultivars of winter rape (Brassica napus L.), two cultivars of safflower (Carthamus tinctorius L.), and two cultivars of sunflower (Helianthus annuus L.) were evaluated in replicated plots at Moscow. Additional trials were conducted at several locations in Idaho, Oregon, and Washington. Sunflower, oleic and linoleic safflower, and low and high erucic acid rapeseed were evaluated for fatty acid composition, energy content, viscosity, and engine performance in short-term tests. During 20-minute engine tests, power output, fuel economy, and thermal efficiency were compared to diesel fuel. Winter rape produced over twice as much farm-extractable oil as either safflower or sunflower. The winter rape cultivars Norde and Jet Neuf had oil yields which averaged 1740 and 1540 L/ha, respectively. Vegetable oils contained 94 to 95% of the kJ/L of diesel fuel, but were 11.1 to 17.6 times more viscous. Viscosity of the vegetable oils was closely related to fatty acid chain length and number of unsaturated bonds (R² = 0.99). During short-term engine tests, all vegetable oils produced power outputs equivalent to diesel, and had thermal efficiencies 1.8 to 2.8% higher than diesel. Based on these results, it appears that species and cultivars of oilseed crops to be utilized as a source of fuel should be selected on the basis of oil yield. 1 figure, 5 tables.
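The reported viscosity relationship is a multiple linear regression of viscosity on fatty acid chain length and number of unsaturated bonds. A minimal sketch of such a fit is shown below; the numbers are invented placeholders for illustration, not the study's measurements.

```python
import numpy as np

# Hypothetical oil data: mean fatty-acid chain length, number of unsaturated
# bonds, and kinematic viscosity (cSt). Illustrative values only.
chain = np.array([17.9, 18.0, 18.1, 19.6, 20.8])
unsat = np.array([1.9, 1.5, 1.2, 1.1, 1.0])
visc = np.array([33.1, 35.4, 37.6, 41.0, 45.2])

# Design matrix with intercept column; ordinary least squares fit
X = np.column_stack([np.ones_like(chain), chain, unsat])
beta, *_ = np.linalg.lstsq(X, visc, rcond=None)

# Coefficient of determination R^2
pred = X @ beta
r2 = 1 - np.sum((visc - pred) ** 2) / np.sum((visc - visc.mean()) ** 2)
```

With an intercept included, R² is bounded in [0, 1]; a value near 0.99, as reported, would indicate that chain length and unsaturation jointly explain almost all viscosity variation.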
Localization of sound sources in a room with one microphone
NASA Astrophysics Data System (ADS)
Peić Tukuljac, Helena; Lissek, Hervé; Vandergheynst, Pierre
2017-08-01
Estimation of the location of sound sources is usually done using microphone arrays. Such settings provide an environment where we know the difference between the received signals among different microphones in terms of phase or attenuation, which enables localization of the sound sources. In our solution we exploit the properties of the room transfer function in order to localize a sound source inside a room with only one microphone. The shape of the room and the position of the microphone are assumed to be known. The design guidelines and limitations of the sensing matrix are given. Implementation is based on sparsity in terms of the voxels in a room that are occupied by a source. What is especially interesting about our solution is that we provide localization of the sound sources not only in the horizontal plane, but in full 3D coordinates inside the room.
A theoretical prediction of the acoustic pressure generated by turbulence-flame front interactions
NASA Technical Reports Server (NTRS)
Huff, R. G.
1984-01-01
The equations of momentum and continuity are combined and linearized, yielding the one-dimensional nonhomogeneous acoustic wave equation. Three terms in the nonhomogeneous equation act as acoustic sources and are taken to be forcing functions acting on the homogeneous wave equation. The three source terms are: fluctuating entropy, turbulence gradients, and turbulence-flame interactions. Each source term is discussed. The turbulence-flame interaction source is used as the basis for computing the source acoustic pressure from the Fourier transformed wave equation. Pressure fluctuations created in turbopump gas generators and turbines may act as a forcing function for turbine and propellant tube vibrations in Earth to orbit space propulsion systems and could reduce their life expectancy. A preliminary assessment of the acoustic pressure fluctuations in such systems is presented.
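A schematic form of the decomposition described above can be written as follows; the notation is assumed for illustration, and the exact expressions for each source term in the paper's derivation may differ.

```latex
\frac{\partial^2 p'}{\partial t^2} - c^2\,\frac{\partial^2 p'}{\partial x^2}
  = S_{\text{entropy}} + S_{\text{turb}} + S_{\text{flame}}
```

Here \(p'\) is the acoustic pressure fluctuation, \(c\) the speed of sound, and each \(S\) term is one of the three forcing functions (fluctuating entropy, turbulence gradients, turbulence-flame interaction) driving the homogeneous wave operator on the left-hand side.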
Consider the source: persuasion of implicit evaluations is moderated by source credibility.
Smith, Colin Tucker; De Houwer, Jan; Nosek, Brian A
2013-02-01
The long history of persuasion research shows how to change explicit, self-reported evaluations through direct appeals. At the same time, research on how to change implicit evaluations has focused almost entirely on techniques of retraining existing evaluations or manipulating contexts. In five studies, we examined whether direct appeals can change implicit evaluations in the same way as they do explicit evaluations. Across the studies, both explicit and implicit evaluations showed greater evidence of persuasion following information presented by a highly credible source than by a source low in credibility. Whereas cognitive load did not alter the effect of source credibility on explicit evaluations, source credibility had an effect on the persuasion of implicit evaluations only when participants were encouraged and able to consider information about the source. Our findings reveal the relevance of persuasion research for changing implicit evaluations and provide new ideas about the processes underlying both types of evaluation.
Ictal and interictal electric source imaging in presurgical evaluation: a prospective study.
Sharma, Praveen; Scherg, Michael; Pinborg, Lars H; Fabricius, Martin; Rubboli, Guido; Pedersen, Birthe; Leffers, Anne-Mette; Uldall, Peter; Jespersen, Bo; Brennum, Jannick; Mølby Henriksen, Otto; Beniczky, Sándor
2018-05-11
Accurate localization of the epileptic focus is essential for surgical treatment of patients with drug-resistant epilepsy. EEG source imaging (ESI) is increasingly used in presurgical evaluation. However, most previous studies analysed interictal discharges. Prospective studies comparing feasibility and accuracy of interictal (II) and ictal (IC) ESI are lacking. We prospectively analysed long-term video EEG recordings (LTM) of patients admitted for presurgical evaluation. We performed ESI of II and IC signals, using two methods: equivalent current dipole (ECD) and distributed source model (DSM). LTM recordings employed the standard 25-electrode array (including inferior temporal electrodes). An age-matched template head-model was used for source analysis. Results were compared with intracranial recordings (ICR), conventional neuroimaging methods (MRI, PET, SPECT) and outcome one year after surgery. Eighty-seven consecutive patients were analysed. ECD gave a significantly higher proportion of patients with localized focal abnormalities (94%) compared to MRI (70%), PET (66%) and SPECT (64%). Agreement between the ESI methods and ICR was moderate to substantial (k=0.56-0.79). Fifty-four patients were operated on (47 of them more than one year ago), and 62% of them became seizure-free. Localization accuracy of II-ESI was 51% for DSM and 57% for ECD; for IC-ESI this was 51% (DSM) and 62% (ECD). The differences between the ESI methods were not significant. Differences in localization accuracy between ESI and MRI (55%), PET (33%) and SPECT (40%) were not significant. II and IC ESI of LTM data have high feasibility, and their localization accuracy is similar to that of the conventional neuroimaging methods. This article is protected by copyright. All rights reserved.
Li, Ji; Larregieu, Caroline A; Benet, Leslie Z
2016-12-01
Natural products (NPs) are compounds that are derived from natural sources such as plants, animals, and micro-organisms. Therapeutics has benefited from numerous drug classes derived from natural product sources. The Biopharmaceutics Drug Disposition Classification System (BDDCS) was proposed to serve as a basis for predicting the importance of transporters and enzymes in determining drug bioavailability and disposition. It categorizes drugs into one of four biopharmaceutical classes according to their water solubility and extent of metabolism. The present paper reviews 109 drugs from natural product sources: 29% belong to class 1 (high solubility, extensive metabolism), 22% to class 2 (low solubility, extensive metabolism), 40% to class 3 (high solubility, poor metabolism), and 9% to class 4 (low solubility, poor metabolism). Herein we evaluated the characteristics of NPs in terms of BDDCS class for all 109 drugs as well as for the subsets of NP drugs derived from plant sources and of antibiotics. Of the 109 NP drugs, 32 were derived from plants: 50% (16) in class 1, 22% (7) in class 2, and 28% (9) in class 3, with none in class 4. Among the antibiotics, 5 (16%) were in class 2, 22 (71%) in class 3, and 4 (13%) in class 4; none was in class 1. Based on this classification, we anticipate BDDCS to serve as a useful adjunct in evaluating the potential characteristics of new natural products. Copyright © 2016 China Pharmaceutical University. Published by Elsevier B.V. All rights reserved.
Impact of noise and air pollution on pregnancy outcomes.
Gehring, Ulrike; Tamburic, Lillian; Sbihi, Hind; Davies, Hugh W; Brauer, Michael
2014-05-01
Motorized traffic is an important source of both air pollution and community noise. While there is growing evidence for an adverse effect of ambient air pollution on reproductive health, little is known about the association between traffic noise and pregnancy outcomes. We evaluated the impact of residential noise exposure on small size for gestational age, preterm birth, term birth weight, and low birth weight at term in a population-based cohort study, for which we previously reported associations between air pollution and pregnancy outcomes. We also evaluated potential confounding of air pollution effects by noise and vice versa. Linked administrative health data sets were used to identify 68,238 singleton births (1999-2002) in Vancouver, British Columbia, Canada, with complete covariate data (sex, ethnicity, parity, birth month and year, income, and education) and maternal residential history. We estimated exposure to noise with a deterministic model (CadnaA) and exposure to air pollution using temporally adjusted land-use regression models and inverse distance weighting of stationary monitors for the entire pregnancy. Noise exposure was negatively associated with term birth weight (mean difference = -19 [95% confidence interval = -23 to -15] g per 6 dB(A)). In joint air pollution-noise models, associations between noise and term birth weight remained largely unchanged, whereas associations decreased for all air pollutants. Traffic may affect birth weight through exposure to both air pollution and noise.
Pilot/Controller Coordinated Decision Making in the Next Generation Air Transportation System
NASA Technical Reports Server (NTRS)
Bearman, Chris; Miller, Ronald C.; Orasanu, Judith M.
2011-01-01
Introduction: NextGen technologies promise to provide considerable benefits in terms of enhancing operations and improving safety. However, there needs to be a thorough human factors evaluation of how these systems will change the way in which pilots and controllers share information. The likely impact of these new technologies on pilot/controller coordinated decision making is considered in this paper using the "operational, informational and evaluative disconnect" framework. Method: Five participant focus groups were held. Participants were four experts in human factors, between x and x research students, and a technical expert. The focus groups evaluated five key NextGen technologies to identify issues that made different disconnects more or less likely. Results: The issues identified were: decision making will not necessarily improve because pilots and controllers possess the same information; having a common information source does not mean pilots and controllers are looking at the same information; high levels of automation may lead to disconnects between the technology and pilots/controllers; common information sources may become the definitive source for information; and overconfidence in the automation may lead to situations where appropriate breakdowns are not initiated. Discussion: The identified issues lead to recommendations that need to be considered in the development of NextGen technologies. The current state of development of these technologies provides a good opportunity to apply the recommendations at an early stage, so that NextGen technologies do not lead to difficulties in resolving breakdowns in coordinated decision making.
quanTLC, an online open-source solution for videodensitometric quantification.
Fichou, Dimitri; Morlock, Gertrud E
2018-07-27
The image is the key feature of planar chromatography. Videodensitometry by digital image conversion is the fastest way of its evaluation. Instead of scanning single sample tracks one after the other, only a few clicks are needed to convert all tracks at one go. A minimalistic software tool, termed quanTLC, was newly developed that allows the quantitative evaluation of samples in a few minutes. quanTLC is open-source, online, free of charge, intuitive to use, and tailored to planar chromatography; none of the nine existing software packages for image evaluation combined all of these aspects. quanTLC supports common image file formats for chromatogram upload. All necessary steps were included, i.e., videodensitogram extraction, preprocessing, automatic peak integration, calibration, statistical data analysis, reporting and data export. The default options for each step are suitable for most analyses while still being tunable, if needed. A one-minute video was recorded to serve as user manual. The software capabilities are shown on the example of a lipophilic dye mixture separation. The quantitative results were verified by comparison with those obtained by commercial videodensitometry software and opto-mechanical slit-scanning densitometry. The data can be exported at each step to be processed in further software, if required. The code was released open-source to be exploited even further. The software itself is online usable without installation and directly accessible at http://shinyapps.ernaehrung.uni-giessen.de/quanTLC. Copyright © 2018 Elsevier B.V. All rights reserved.
Public Exposure from Indoor Radiofrequency Radiation in the City of Hebron, West Bank-Palestine.
Lahham, Adnan; Sharabati, Afefeh; ALMasri, Hussien
2015-08-01
This work presents the results of measured indoor exposure levels to radiofrequency (RF) radiation emitting sources in one of the major cities in the West Bank-the city of Hebron. Investigated RF emitters include FM, TV broadcasting stations, mobile telephony base stations, cordless phones [Digital Enhanced Cordless Telecommunications (DECT)], and wireless local area networks (WLAN). Measurements of power density were conducted in 343 locations representing different site categories in the city. The maximum total power density found at any location was about 2.3 × 10 W m with a corresponding exposure quotient of about 0.01. This value is well below unity, indicating compliance with the guidelines of the International Commission on Non-ionizing Radiation Protection (ICNIRP). The average total exposure from all RF sources was 0.08 × 10 W m. The relative contributions from different sources to the total exposure in terms of exposure quotient were evaluated and found to be 46% from FM radio, 26% from GSM900, 15% from DECT phones, 9% from WLAN, 3% from unknown sources, and 1% from TV broadcasting. RF sources located outdoors contribute about 73% to the population exposure indoors.
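The compliance check described above sums per-band exposure quotients, i.e., each band's measured power density divided by its reference level, and compares the total to unity. A minimal sketch follows; the power densities and ICNIRP-style reference levels below are hypothetical placeholders, not the survey's measured values.

```python
# Hypothetical indoor power densities (W/m^2) per source band, and
# illustrative general-public reference levels (W/m^2) for each band.
# Both dictionaries are placeholders for demonstration only.
measured = {"FM": 1.0e-4, "GSM900": 6.0e-5, "DECT": 3.5e-5, "WLAN": 2.0e-5, "TV": 2.0e-6}
reference = {"FM": 2.0, "GSM900": 4.5, "DECT": 9.4, "WLAN": 10.0, "TV": 2.0}

# Per-band exposure quotient: measured density / reference level
quotients = {band: measured[band] / reference[band] for band in measured}

# Total exposure quotient; values below 1 indicate compliance
total_quotient = sum(quotients.values())
compliant = total_quotient < 1.0

# Relative contribution of each band to the total exposure
shares = {band: q / total_quotient for band, q in quotients.items()}
```

Because reference levels differ between frequency bands, the band with the highest raw power density is not automatically the dominant contributor to the exposure quotient.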
Influence of Iterative Reconstruction Algorithms on PET Image Resolution
NASA Astrophysics Data System (ADS)
Karpetas, G. E.; Michail, C. M.; Fountos, G. P.; Valais, I. G.; Nikolopoulos, D.; Kandarakis, I. S.; Panayiotakis, G. S.
2015-09-01
The aim of the present study was to assess image quality of PET scanners through a thin layer chromatography (TLC) plane source. The source was simulated using a previously validated Monte Carlo model. The model was developed by using the GATE MC package, and reconstructed images were obtained with the STIR software for tomographic image reconstruction. The simulated PET scanner was the GE DiscoveryST. A plane source consisting of a TLC plate was simulated as a layer of silica gel on an aluminum (Al) foil substrate, immersed in an 18F-FDG bath solution (1 MBq). Image quality was assessed in terms of the modulation transfer function (MTF). MTF curves were estimated from transverse reconstructed images of the plane source. Images were reconstructed by the maximum likelihood estimation (MLE) OSMAPOSL, the ordered subsets separable paraboloidal surrogate (OSSPS), the median root prior (MRP), and OSMAPOSL with quadratic prior algorithms. OSMAPOSL reconstruction was assessed by using fixed subsets and various iterations, as well as by using various beta (hyper) parameter values. MTF values were found to increase with increasing iterations. MTF also improved with lower beta values. The simulated PET evaluation method, based on the TLC plane source, can be useful in the resolution assessment of PET scanners.
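In one dimension, MTF estimation from a plane-source image reduces to Fourier-transforming a line spread function (LSF) and normalizing by the zero-frequency value. A minimal sketch of that computation is shown below, independent of GATE/STIR and using a synthetic Gaussian LSF as a stand-in for a measured profile.

```python
import numpy as np

def mtf_from_lsf(lsf, pixel_mm):
    """MTF as the normalized magnitude of the Fourier transform of an LSF."""
    lsf = np.asarray(lsf, dtype=float)
    lsf = lsf / lsf.sum()                          # unit-area LSF
    mtf = np.abs(np.fft.rfft(lsf))
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_mm)  # spatial frequency, cycles/mm
    return freqs, mtf / mtf[0]                     # normalize so MTF(0) = 1

# Synthetic example: a broader LSF (poorer resolution) yields a lower MTF
x = np.arange(-64, 64)
narrow = np.exp(-x**2 / (2 * 2.0**2))   # sigma = 2 pixels
broad = np.exp(-x**2 / (2 * 6.0**2))    # sigma = 6 pixels
f, mtf_n = mtf_from_lsf(narrow, pixel_mm=1.0)
_, mtf_b = mtf_from_lsf(broad, pixel_mm=1.0)
```

The same idea underlies the study's comparison of reconstruction algorithms: sharper reconstructed profiles of the plane source translate into higher MTF values at a given spatial frequency.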
Children facing a family member's acute illness: a review of intervention studies.
Spath, Mary L
2007-07-01
A review of psycho-educational intervention studies to benefit children adapting to a close (parent, sibling, or grandparent) family member's serious illness was conducted. The aims were to review the literature on studies addressing this topic, critique research methods, describe clinical outcomes, and make recommendations for future research efforts. Research citations from 1990 to 2005 from Medline, CINAHL, Health Source: Nursing/Academic Edition, PsycARTICLES, and PsycINFO databases were identified. Citations were reviewed and evaluated for sample, design, theoretical framework, intervention, threats to validity, and outcomes. Reviewed studies were limited to those that included statistical analysis to evaluate interventions and outcomes. Six studies were reviewed. Positive outcomes were reported for all of the interventional strategies used in the studies. Reviewed studies generally lacked a theoretical framework and a control group, were generally composed of small convenience samples, and primarily used non-tested investigator instruments. They were diverse in terms of intervention length and intensity, and measured short-term outcomes related to participant program satisfaction, rather than participant cognitive and behavioral change. The paucity of interventional studies and the lack of systematic empirical precision in evaluating intervention effectiveness necessitate future studies that are methodologically rigorous.
Lonati, Giovanni; Cernuschi, Stefano; Sidi, Shelina
2010-12-01
This work is intended to assess the impact on local air quality of atmospheric emissions from port area activities for a new port planned in the Mediterranean Sea. The sources of air pollutants in the harbour area are auxiliary engines used by ships at berth during loading/offloading operations. A fleet activity-based methodology is first applied to evaluate annual pollutant emissions (NO(X), SO(X), PM, CO and VOC) based on vessel traffic data, ship tonnage, and in-port hotelling time for loading/offloading operations. The three-dimensional CALPUFF transport and dispersion model is then applied for the subsequent assessment of the ground-level spatial distribution of atmospheric pollutants for both long-term and short-term averaging times. Compliance with current air quality standards in the port area is finally evaluated and indications for port operation are provided. Some methodological aspects of the impact assessment procedure, namely those concerning the steps of emission scenario definition and model simulation set-up at the project stage, are specifically addressed, suggesting a pragmatic approach for similar evaluations of small planned ports. Copyright © 2010 Elsevier B.V. All rights reserved.
Baslam, Marouane; Pascual, Inmaculada; Sánchez-Díaz, Manuel; Erro, Javier; García-Mina, José María; Goicoechea, Nieves
2011-10-26
The improvement of the nutritional quality of lettuce by its association with arbuscular mycorrhizal fungi (AMF) has been recently reported in a previous study. The aim of this research was to evaluate if the fertilization with three P sources differing in water solubility affects the effectiveness of AMF for improving lettuce growth and nutritional quality. The application of either water-soluble P sources (Hewitt's solution and single superphosphate) or the water-insoluble (WI) fraction of a "rhizosphere-controlled fertilizer" did not exert negative effects on the establishment of the mycorrhizal symbiosis. AMF improved lettuce growth and nutritional quality. Nevertheless, the effect was dependent on the source of P and cultivar. Batavia Rubia Munguía (green cultivar) benefited more than Maravilla de Verano (red cultivar) in terms of mineral nutrients, total soluble sugars, and ascorbate contents. The association of lettuce with AMF resulted in greater quantities of anthocyanins in plants fertilized with WI, carotenoids when plants received either Hewitt's solution or WI, and phenolics regardless of the P fertilizer applied.
Development Status of Ion Source at J-PARC Linac Test Stand
NASA Astrophysics Data System (ADS)
Yamazaki, S.; Takagi, A.; Ikegami, K.; Ohkoshi, K.; Ueno, A.; Koizumi, I.; Oguri, H.
The Japan Proton Accelerator Research Complex (J-PARC) linac power upgrade program is now in progress in parallel with user operation. To realize a nominal performance of 1 MW at the 3 GeV Rapid Cycling Synchrotron and 0.75 MW at the Main Ring synchrotron, we need to upgrade the peak beam current (50 mA) of the linac. For the upgrade program, we are testing a new front-end system, which comprises a cesiated RF-driven H- ion source and a new radio-frequency quadrupole linac (RFQ). The H- ion source was developed to satisfy the J-PARC upgrade requirements of an H- ion-beam current of 60 mA and a lifetime of more than 50 days. On February 6, 2014, the first 50 mA H- beams were accelerated by the RFQ during a beam test. To demonstrate the performance of the ion source before its installation in the summer of 2014, we tested the long-term stability through continuous beam operation, which included estimating the lifetime of the RF antenna and evaluating the cesium consumption.
Auditing the multiply-related concepts within the UMLS
Mougin, Fleur; Grabar, Natalia
2014-01-01
Objective This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. Methods We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. Results At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Discussion Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. PMID:24464853
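The audit described above, grouping relations per concept pair and flagging contradictory combinations, can be sketched as follows. The relation names, vocabulary identifiers, and the contradiction table are illustrative placeholders, not the UMLS's actual relation inventory.

```python
from collections import defaultdict

# Hypothetical sample of (concept1, concept2, relation, source_vocabulary) rows
rows = [
    ("C1", "C2", "broader", "VOC_A"),
    ("C1", "C2", "narrower", "VOC_B"),   # contradicts "broader" for the same pair
    ("C3", "C4", "broader", "VOC_A"),
    ("C3", "C4", "broader", "VOC_C"),    # same relation from two vocabularies
]

# Illustrative table of relation combinations considered contradictory
CONTRADICTORY = {frozenset(["broader", "narrower"])}

# Collect the set of distinct relations per concept pair
pairs = defaultdict(set)
for c1, c2, rel, _src in rows:
    pairs[(c1, c2)].add(rel)

# Multiply-related pairs carry more than one distinct relation
multiply_related = {p: r for p, r in pairs.items() if len(r) > 1}

# Flag pairs whose relation set contains a known contradictory combination
contradictory = {p for p, r in multiply_related.items()
                 if any(combo <= r for combo in CONTRADICTORY)}
```

Note that a pair receiving the same relation from several vocabularies (C3/C4 above) is not multiply-related in this sense; only distinct co-occurring relations trigger the audit.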
NASA Astrophysics Data System (ADS)
Vishnoi, Gargi; Hielscher, Andreas H.; Ramanujam, Nirmala; Chance, Britton
2000-04-01
In this work experimental tissue phantoms and numerical models were developed to estimate photon migration through the fetal head in utero. The tissue phantoms incorporate a fetal head within an amniotic fluid sac surrounded by a maternal tissue layer. A continuous wave, dual-wavelength ((lambda) equals 760 and 850 nm) spectrometer was employed to make near-infrared measurements on the tissue phantoms for various source-detector separations, fetal-head positions, and fetal-head optical properties. In addition, numerical simulations of photon propagation were performed with finite-difference algorithms that provide solutions to the equation of radiative transfer as well as the diffusion equation. The simulations were compared with measurements on tissue phantoms to determine the best numerical model to describe photon migration through the fetal head in utero. Evaluation of the results indicates that tissue phantoms in which the contact between fetal head and uterine wall is uniform best simulate the fetal head in utero for near-term pregnancies. Furthermore, we found that maximum sensitivity to the head can be achieved if the source of the probe is positioned directly above the fetal head. By optimizing the source-detector separation, the signal originating from photons that have traveled through the fetal head can be drastically increased.
Regional Scale Simulations of Nitrate Leaching through Agricultural Soils of California
NASA Astrophysics Data System (ADS)
Diamantopoulos, E.; Walkinshaw, M.; O'Geen, A. T.; Harter, T.
2016-12-01
Nitrate is recognized as one of California's most widespread groundwater contaminants. As opposed to point sources, which are relatively easy to identify as sources of contamination, non-point sources of nitrate are diffuse and linked with widespread use of fertilizers in agricultural soils. California's agricultural regions have an incredible diversity of soils that encompass a huge range of properties. This complicates studies dealing with nitrate risk assessment, since important biological and physicochemical processes occur in the first meters of the vadose zone. The objective of this study is to evaluate all agricultural soils in California according to their potential for nitrate leaching, based on numerical simulations using the Richards equation. We conducted simulations for 6000 unique soil profiles (over 22000 soil horizons), taking into account the effect of climate, crop type, irrigation, and fertilization management scenarios. The final goal of this study is to evaluate simple management methods in terms of reduced nitrate leaching. We estimated drainage rates of water under the root zone and nitrate concentrations in the drain water at the regional scale. We present maps for all agricultural soils in California which can be used for risk assessment studies. Finally, our results indicate that adoption of simple irrigation and fertilization methods may significantly reduce nitrate leaching in vulnerable regions.
DCCA analysis of renewable and conventional energy prices
NASA Astrophysics Data System (ADS)
Paiva, Aureliano Sancho Souza; Rivera-Castro, Miguel Angel; Andrade, Roberto Fernandes Silva
2018-01-01
Here we investigate the mutual influence of oil prices and renewable energy sources. The non-stationary time series are scrutinized within the Detrended Cross-Correlation Analysis (DCCA) framework, where the resulting DCCA coefficient provides a useful and reliable index to evaluate the cross-correlation between events at the same time instant as well as at suitably chosen time lags. The analysis is based on the quotient of successive daily closing oil prices and composite indices of renewable energy sources in the USA and Europe over the period 2006-2015, which was subject to several social and economic driving forces, such as increasing social pressure in favor of non-fossil energy sources and the worldwide economic crisis that started in 2008. The DCCA coefficient is evaluated for different window sizes, extracting information on short- and long-term correlation between the indices. In particular, strong correlation between the behavior of the two distinct economic sectors is observed over large time intervals during the worst period of the economic crisis (2008-2012), hinting at a very cautious behavior of the economic agents. Before and after this period, the behavior of the two economic sectors is overwhelmingly uncorrelated or very weakly correlated. The results reported here may be useful for selecting proper strategies in future similar scenarios.
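The DCCA coefficient referred to above can be sketched as follows. This is a minimal implementation of the standard estimator (detrended covariance over the geometric mean of detrended variances, computed over overlapping windows), not necessarily the exact windowing choices used by the authors.

```python
import numpy as np

def dcca_coefficient(x, y, s):
    """Detrended cross-correlation coefficient rho_DCCA for window size s.

    A minimal sketch: build the integrated profiles, detrend each
    overlapping window of length s+1 with a linear fit, then take the
    ratio of the detrended covariance to the geometric mean of the
    detrended variances. rho is bounded in [-1, 1].
    """
    X = np.cumsum(x - np.mean(x))   # integrated (profile) series
    Y = np.cumsum(y - np.mean(y))
    n_win = len(X) - s
    t = np.arange(s + 1)
    f2x = f2y = f2xy = 0.0
    for i in range(n_win):
        # Local linear detrending inside each overlapping window.
        px = np.polyval(np.polyfit(t, X[i:i + s + 1], 1), t)
        py = np.polyval(np.polyfit(t, Y[i:i + s + 1], 1), t)
        dx = X[i:i + s + 1] - px
        dy = Y[i:i + s + 1] - py
        f2x += np.mean(dx * dx)
        f2y += np.mean(dy * dy)
        f2xy += np.mean(dx * dy)
    return f2xy / np.sqrt(f2x * f2y)
```

Scanning `s` from days to months is what extracts the short- versus long-term correlation structure the abstract describes.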
Gravity Waves in the Southern Hemisphere Extratropical Winter in the 7-km GEOS-5 Nature Run
NASA Astrophysics Data System (ADS)
Holt, L. A.; Alexander, M. J.; Coy, L.; Putman, W.; Molod, A.; Pawson, S.
2016-12-01
This study investigates winter Southern Hemisphere extratropical gravity waves and their sources in a 7-km horizontal resolution global climate simulation, the GEOS-5 Nature Run (NR). Gravity waves are evaluated by comparing brightness temperature anomalies to those from the Atmospheric Infrared Sounder (AIRS). Gravity wave amplitudes, wavelengths, and propagation directions are also computed in the NR and AIRS. The NR shows good agreement with AIRS in terms of spatial patterns of gravity wave activity and propagation directions, but the NR amplitudes are smaller by about a factor of 5 and the wavelengths are about a factor of 2 longer than in AIRS. In addition to evaluating gravity wave characteristics, gravity wave sources in the NR are also investigated by relating diagnostics of tropospheric sources of gravity waves, such as precipitation, frontogenesis, and potential vorticity anomalies to absolute gravity wave momentum fluxes in the lower stratosphere. Strong precipitation events are the most strongly correlated with absolute momentum flux, supporting previous studies highlighting the importance of moist processes in the generation of Southern Hemisphere extratropical gravity waves. Additionally, gravity wave absolute momentum fluxes over land are compared to those over ocean, and the contribution of orographic and nonorographic gravity waves to the total absolute momentum flux is examined.
Comparison of ESI- and APCI-LC-MS/MS methods: A case study of levonorgestrel in human plasma.
Wang, Rulin; Zhang, Lin; Zhang, Zunjian; Tian, Yuan
2016-12-01
Electrospray ionization (ESI) and atmospheric pressure chemical ionization (APCI) techniques for the liquid chromatography-tandem mass spectrometry (LC-MS/MS) determination of levonorgestrel were evaluated. Given the difference in ionization mechanisms, the two ionization sources were compared in terms of LC conditions, MS parameters and method performance. The lower limit of quantification for levonorgestrel was 0.25 ng/mL with ESI, compared with 1 ng/mL with APCI. Matrix effects were evaluated for levonorgestrel and canrenone (internal standard, IS) in human plasma, and the results showed that the APCI source appeared to be slightly less liable to matrix effects than the ESI source. On balance, ESI was chosen as the better ionization technique for rapid and sensitive quantification of levonorgestrel. The optimized LC-ESI-MS/MS method was validated over a linear range of 0.25-50 ng/mL with a correlation coefficient ≥0.99. The intra- and inter-batch precision and accuracy were within 11.72% and 6.58%, respectively. The application of this method was demonstrated in a bioequivalence study following a single oral administration of 1.5 mg levonorgestrel tablets in 21 healthy Chinese female volunteers.
Ng, Ding-Quan; Lin, Yi-Pin
2016-01-01
In this pilot study, a modified sampling protocol was evaluated for the detection of lead contamination and locating the source of lead release in a simulated premise plumbing system with one-, three- and seven-day stagnation for a total period of 475 days. Copper pipes, stainless steel taps and brass fittings were used to assemble the “lead-free” system. Sequential sampling using 100 mL was used to detect lead contamination while that using 50 mL was used to locate the lead source. Elevated lead levels, far exceeding the World Health Organization (WHO) guideline value of 10 µg·L−1, persisted for as long as five months in the system. “Lead-free” brass fittings were identified as the source of lead contamination. Physical disturbances, such as renovation works, could cause short-term spikes in lead release. Orthophosphate was able to suppress total lead levels below 10 µg·L−1, but caused “blue water” problems. When orthophosphate addition was ceased, total lead levels began to spike within one week, implying that a continuous supply of orthophosphate was required to control total lead levels. Occasional total lead spikes were observed in one-day stagnation samples throughout the course of the experiments. PMID:26927154
The long-term problems of contaminated land: Sources, impacts and countermeasures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baes, C.F. III
1986-11-01
This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed ''broken arrows'').
Multisource Estimation of Long-term Global Terrestrial Surface Radiation
NASA Astrophysics Data System (ADS)
Peng, L.; Sheffield, J.
2017-12-01
Land surface net radiation is the essential energy source at the Earth's surface. It determines the surface energy budget and its partitioning, drives the hydrological cycle by providing available energy, and offers heat, light, and energy for biological processes. Individual components of net radiation have changed historically due to natural and anthropogenic climate change and land use change. Decadal variations in radiation, such as global dimming or brightening, have important implications for the hydrological and carbon cycles. In order to assess the trends and variability of net radiation and evapotranspiration, there is a need for accurate estimates of long-term terrestrial surface radiation. While large progress has been made in measuring the top-of-atmosphere energy budget, huge discrepancies exist among ground observations, satellite retrievals, and reanalysis fields of surface radiation, due to the lack of observational networks, the difficulty of measuring from space, and the uncertainty in algorithm parameters. To overcome the weaknesses of single-source datasets, we propose a multi-source merging approach to fully utilize and combine multiple datasets of radiation components separately, as they are complementary in space and time. First, we conduct diagnostic analysis of multiple satellite and reanalysis datasets based on in-situ measurements such as the Global Energy Balance Archive (GEBA), existing validation studies, and other information such as network density and consistency with other meteorological variables. Then, we calculate the optimal weighted average of multiple datasets by minimizing the variance of error between in-situ measurements and other observations. Finally, we quantify the uncertainties in the estimates of surface net radiation and employ physical constraints based on the surface energy balance to reduce these uncertainties.
The final dataset is evaluated in terms of the long-term variability and its attribution to changes in individual components. The goal of this study is to provide a merged observational benchmark for large-scale diagnostic analyses, remote sensing and land surface modeling.
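For independent, unbiased datasets, the minimum-variance weighted average in the merging step above reduces to inverse-variance weighting. The sketch below assumes exactly that; the study's actual weighting also draws on validation diagnostics and energy-balance constraints not shown here.

```python
import numpy as np

def merge_inverse_variance(estimates, variances):
    """Optimally average independent, unbiased estimates of one field.

    Minimizing the variance of the merged error gives weights
    proportional to 1/variance. Inputs are stacked along axis 0
    (one row per dataset); scalars or gridded fields both work.
    Returns the merged estimate and its (reduced) error variance.
    """
    e = np.asarray(estimates, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v
    merged = (w * e).sum(axis=0) / w.sum(axis=0)
    merged_var = 1.0 / w.sum(axis=0)   # never larger than the best input
    return merged, merged_var

# Two hypothetical surface-radiation estimates for one grid cell
# (W/m^2) with error variances 4 and 16: the merged value leans
# toward the more precise dataset.
m, v = merge_inverse_variance([100.0, 110.0], [4.0, 16.0])
```

A useful property of this combination is that the merged variance (here 3.2) is always below the smallest input variance, which is the formal sense in which merging "fully utilizes" complementary datasets.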
NASA Astrophysics Data System (ADS)
Perdigão, R. A. P.
2017-12-01
Predictability assessments are traditionally made on a case-by-case basis, often by running the particular model of interest with randomly perturbed initial/boundary conditions and parameters, producing computationally expensive ensembles. These approaches provide a lumped statistical view of uncertainty evolution, without eliciting the fundamental processes and interactions at play in the uncertainty dynamics. In order to address these limitations, we introduce a systematic dynamical framework for predictability assessment and forecast, by analytically deriving governing equations of predictability in terms of the fundamental architecture of dynamical systems, independent of any particular problem under consideration. The framework further relates multiple uncertainty sources along with their coevolutionary interplay, enabling a comprehensive and explicit treatment of uncertainty dynamics along time, without requiring the actual model to be run. In doing so, computational resources are freed, and a quick and effective a priori systematic evaluation can be made of predictability evolution and its challenges, including aspects of the model architecture and intervening variables that may require optimization ahead of initiating any model runs. It further brings out universal dynamic features in the error dynamics elusive to any case-specific treatment, ultimately shedding fundamental light on the challenging issue of predictability. The formulated approach, framed with broad mathematical-physics generality in mind, is then implemented in dynamic models of nonlinear geophysical systems with various degrees of complexity, in order to evaluate their limitations and provide informed assistance on how to optimize their design and improve their predictability in fundamental dynamical terms.
Survey of ion plating sources. [conferences
NASA Technical Reports Server (NTRS)
Spalvins, T.
1979-01-01
Based on the type of evaporation source, gaseous media and mode of transport, the following are discussed: resistance, electron beam, sputtering, reactive and ion beam evaporation. Ionization efficiencies and ion energies in the glow discharge determine the percentage of atoms that are ionized under typical ion plating conditions. The plating flux consists of a small number of energetic ions and a large number of energetic neutrals. The energy distribution ranges from thermal energies up to the maximum energy of the discharge. The various reaction mechanisms that contribute to the exceptionally strong adherence - formation of a graded substrate/coating interface - are not fully understood; however, the controlling factors are evaluated. The influence of process variables on the nucleation and growth characteristics is illustrated in terms of morphological changes that affect the mechanical and tribological properties of the coating.
García-Osogobio, Sandra; Remes-Troche, José María; Takahashi, Takeshi; Barreto Camilo, Juan; Uscanga, Luis
2002-01-01
Lower gastrointestinal bleeding is self-limiting in about 80% of cases; however, surgical treatment may be required in selected cases. Precise preoperative identification of the bleeding source is crucial for a successful outcome. The aim was to determine the most frequent diagnoses, as well as the short- and long-term results, in a series of patients who underwent a surgical procedure for lower gastrointestinal bleeding. We performed a retrospective analysis of 39 patients operated on for lower gastrointestinal bleeding from 1979 through 1997 in a referral center. Demographic data, history, physical examination, laboratory tests, resuscitative measures, preoperative work-up for identification of the bleeding source, definitive cause of bleeding, surgical procedure, operative morbidity and mortality, as well as long-term status and recurrence of bleeding, were recorded. Fifty-four percent of patients were women and 46% were men. Mean age was 56 years (range, 15-92). Most patients presented with hematochezia (69%). Colonoscopy was the most used diagnostic procedure (69%). The bleeding source was located in 90% of patients. Diverticular disease was the most frequent cause of bleeding. Segmental bowel resection was the treatment in 97% of cases. Morbidity was 23% and mortality was 18%. Bleeding recurred in 9% of survivors. Morbidity and mortality were high; patients who require a surgical operation should be carefully selected and evaluated with a complete work-up to determine the site and cause of bleeding.
A Unified Flash Flood Database across the United States
Gourley, Jonathan J.; Hong, Yang; Flamig, Zachary L.; Arthur, Ami; Clark, Robert; Calianno, Martin; Ruin, Isabelle; Ortel, Terry W.; Wieczorek, Michael; Kirstetter, Pierre-Emmanuel; Clark, Edward; Krajewski, Witold F.
2013-01-01
Despite flash flooding being one of the most deadly and costly weather-related natural hazards worldwide, individual datasets to characterize them in the United States are hampered by limited documentation and can be difficult to access. This study is the first of its kind to assemble, reprocess, describe, and disseminate a georeferenced U.S. database providing a long-term, detailed characterization of flash flooding in terms of spatiotemporal behavior and specificity of impacts. The database is composed of three primary sources: 1) the entire archive of automated discharge observations from the U.S. Geological Survey that has been reprocessed to describe individual flooding events, 2) flash-flooding reports collected by the National Weather Service from 2006 to the present, and 3) witness reports obtained directly from the public in the Severe Hazards Analysis and Verification Experiment during the summers 2008–10. Each observational data source has limitations; a major asset of the unified flash flood database is its collation of relevant information from a variety of sources that is now readily available to the community in common formats. It is anticipated that this database will be used for many diverse purposes, such as evaluating tools to predict flash flooding, characterizing seasonal and regional trends, and improving understanding of dominant flood-producing processes. We envision the initiation of this community database effort will attract and encompass future datasets.
NASA Astrophysics Data System (ADS)
Tichý, Ondřej; Šmídl, Václav; Hofman, Radek; Stohl, Andreas
2016-11-01
Estimation of pollutant releases into the atmosphere is an important problem in the environmental sciences. It is typically formalized as an inverse problem using a linear model that explains observable quantities (e.g., concentrations or deposition values) as the product of a source-receptor sensitivity (SRS) matrix, obtained from an atmospheric transport model, and the unknown source-term vector. Since this problem is typically ill-posed, current state-of-the-art methods are based on regularization and solution of the resulting optimization problem. This procedure depends on manual settings of uncertainties that are often very poorly quantified, effectively making them tuning parameters. We formulate a probabilistic model that has the same maximum likelihood solution as the conventional method using pre-specified uncertainties. Replacing the maximum likelihood solution with full Bayesian estimation also allows all tuning parameters to be estimated from the measurements. The estimation procedure is based on a variational Bayes approximation, which is evaluated by an iterative algorithm. The resulting method is thus very similar to the conventional approach, but with the possibility of also estimating all tuning parameters from the observations. The proposed algorithm is tested and compared with standard methods on data from the European Tracer Experiment (ETEX), where the advantages of the new method are demonstrated. A MATLAB implementation of the proposed algorithm is available for download.
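The conventional regularized optimization that the abstract contrasts with the Bayesian approach can be sketched as a Tikhonov-regularized least-squares problem. The SRS matrix, true source vector, and regularization weight below are synthetic stand-ins, not data from any real transport model.

```python
import numpy as np

# Illustrative ill-posed source-term inversion: y = M @ x_true + noise,
# where M stands in for a source-receptor sensitivity (SRS) matrix.
rng = np.random.default_rng(42)
n_obs, n_src = 40, 25
M = rng.random((n_obs, n_src)) ** 3        # synthetic sensitivities
x_true = np.zeros(n_src)
x_true[5:8] = [2.0, 5.0, 1.0]              # a localized release episode
y = M @ x_true + 0.01 * rng.standard_normal(n_obs)

# Conventional regularized maximum-likelihood solution:
#   x_hat = argmin ||y - M x||^2 + alpha * ||x||^2
# alpha plays the role of the manually set uncertainty (tuning)
# parameter that the variational Bayes approach instead estimates
# from the measurements.
alpha = 1e-3
x_hat = np.linalg.solve(M.T @ M + alpha * np.eye(n_src), M.T @ y)
```

The normal-equations solve above is the closed form of the penalized objective; the Bayesian variant replaces the fixed `alpha` (and the observation-error scale) with hyperparameters inferred iteratively from the data.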
Barkmeier-Kraemer, Julie M.; Clark, Heather M.
2017-01-01
Background Hyperkinetic dysarthria is characterized by abnormal involuntary movements affecting respiratory, phonatory, and articulatory structures impacting speech and deglutition. Speech–language pathologists (SLPs) play an important role in the evaluation and management of dysarthria and dysphagia. This review describes the standard clinical evaluation and treatment approaches by SLPs for addressing impaired speech and deglutition in specific hyperkinetic dysarthria populations. Methods A literature review was conducted using the data sources of PubMed, Cochrane Library, and Google Scholar. Search terms included 1) hyperkinetic dysarthria, essential voice tremor, voice tremor, vocal tremor, spasmodic dysphonia, spastic dysphonia, oromandibular dystonia, Meige syndrome, orofacial, cervical dystonia, dystonia, dyskinesia, chorea, Huntington’s Disease, myoclonus; and evaluation/treatment terms: 2) Speech–Language Pathology, Speech Pathology, Evaluation, Assessment, Dysphagia, Swallowing, Treatment, Management, and diagnosis. Results The standard SLP clinical speech and swallowing evaluation of chorea/Huntington’s disease, myoclonus, focal and segmental dystonia, and essential vocal tremor typically includes 1) case history; 2) examination of the tone, symmetry, and sensorimotor function of the speech structures during non-speech, speech and swallowing relevant activities (i.e., cranial nerve assessment); 3) evaluation of speech characteristics; and 4) patient self-report of the impact of their disorder on activities of daily living. SLP management of individuals with hyperkinetic dysarthria includes behavioral and compensatory strategies for addressing compromised speech and intelligibility. Swallowing disorders are managed based on individual symptoms and the underlying pathophysiology determined during evaluation. 
Discussion SLPs play an important role in contributing to the differential diagnosis and management of impaired speech and deglutition associated with hyperkinetic disorders. PMID:28983422
NASA Astrophysics Data System (ADS)
Bliss, Donald; Franzoni, Linda; Rouse, Jerry; Manning, Ben
2005-09-01
An analysis method for time-dependent broadband diffuse sound fields in enclosures is described. Beginning with a formulation utilizing time-dependent broadband intensity boundary sources, the strength of these wall sources is expanded in a series in powers of an absorption parameter, thereby giving a separate boundary integral problem for each power. The temporal behavior is characterized by a Taylor expansion in the delay time for a source to influence an evaluation point. The lowest-order problem has a uniform interior field proportional to the reciprocal of the absorption parameter, as expected, and exhibits relatively slow exponential decay. The next-order problem gives a mean-square pressure distribution that is independent of the absorption parameter and is primarily responsible for the spatial variation of the reverberant field. This problem, which is driven by input sources and the lowest-order reverberant field, depends on source location and the spatial distribution of absorption. Additional problems proceed at integer powers of the absorption parameter, but are essentially higher-order corrections to the spatial variation. Temporal behavior is expressed in terms of an eigenvalue problem, with boundary source strength distributions expressed as eigenmodes. Solutions exhibit rapid short-time spatial redistribution followed by long-time decay of a predominant spatial mode.
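The layered structure of the expansion described above can be written out in generic notation. The symbols below are illustrative, not the authors' own; they simply encode the stated scaling of each order with the absorption parameter.

```latex
% Sketch of the expansion structure, in generic notation
% (\epsilon denotes the small absorption parameter):
\[
  \langle p^2 \rangle
  \;=\; \frac{1}{\epsilon}\,P_{0}
  \;+\; P_{1}(\mathbf{x})
  \;+\; \epsilon\,P_{2}(\mathbf{x})
  \;+\; \cdots
\]
% P_0 is the spatially uniform lowest-order reverberant field
% (proportional to the reciprocal of the absorption parameter),
% P_1 is independent of \epsilon and carries the leading spatial
% variation, and the higher-order terms supply corrections at
% integer powers of \epsilon.
```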
Ammonium as sole N source improves grain quality in wheat.
Fuertes-Mendizábal, Teresa; González-Torralba, Jon; Arregui, Luis M; González-Murua, Carmen; González-Moro, M Begoña; Estavillo, José M
2013-07-01
The skilful handling of N fertilizer, including N source type and its timing, is necessary to obtain maximum profitability in wheat crops in terms of production and quality. Studies on grain yield and quality with ammonium as sole N source have not yet been conducted. The aim of this study was to evaluate the effect of N source management (nitrate vs. ammonium), and splitting it into two or three amendments during the wheat life cycle, on grain yield and quality under irrigated conditions. This experiment demonstrates that Cezanne wheat plants growing with ammonium as exclusive N source are able to achieve the same yield as plants growing with nitrate and that individual wheat plants grown in irrigated pots can efficiently use late N applied in GS37. Ammonium nutrition increased both types of grain reserve proteins (gliadins and glutenins) and also increased the ratio gli/glu with respect to nitrate nutrition. The splitting of the N rate enhanced the ammonium effect on grain protein composition. The application of ammonium N source, especially when split into three amendments, has an analogous effect on grain protein content and composition to applications at a higher N rate, leading to higher N use efficiency. © 2012 Society of Chemical Industry.
Biotic Nitrogen Enrichment Regulates Calcium Sources to Forests
NASA Astrophysics Data System (ADS)
Pett-Ridge, J. C.; Perakis, S. S.; Hynicka, J. D.
2015-12-01
Calcium is an essential nutrient in forest ecosystems that is susceptible to leaching loss and depletion. Calcium depletion can affect plant and animal productivity, soil acid buffering capacity, and fluxes of carbon and water. Excess nitrogen supply and associated soil acidification are often implicated in short-term calcium loss from soils, but the long-term role of nitrogen enrichment in calcium sources and resupply is unknown. Here we use strontium isotopes (87Sr/86Sr) as a proxy for calcium to investigate how soil nitrogen enrichment from biological nitrogen fixation interacts with bedrock calcium to regulate both short-term available supplies and the long-term sources of calcium in montane conifer forests. Our study examines 22 sites in western Oregon, spanning a 20-fold range of bedrock calcium on sedimentary and basaltic lithologies. In contrast to previous studies emphasizing abiotic control of weathering as a determinant of long-term ecosystem calcium dynamics and sources (via bedrock fertility, climate, or topographic/tectonic controls), we find instead that biotic nitrogen enrichment of soil can strongly regulate calcium sources and supplies in forest ecosystems. For forests on calcium-rich basaltic bedrock, increasing nitrogen enrichment causes calcium sources to shift from rock-weathering to atmospheric dominance, with minimal influence from other major soil-forming factors, despite regionally high rates of tectonic uplift and erosion that can rejuvenate the weathering supply of soil minerals. For forests on calcium-poor sedimentary bedrock, we find that atmospheric inputs dominate regardless of the degree of nitrogen enrichment. Short-term measures of soil and ecosystem calcium fertility are decoupled from calcium source sustainability, with fundamental implications for understanding nitrogen impacts, both in natural ecosystems and in the context of global change.
Our finding that long-term nitrogen enrichment increases forest reliance on atmospheric calcium helps explain reports of greater ecological calcium limitation in an increasingly nitrogen-rich world.
NASA Astrophysics Data System (ADS)
Kim, Y.; Kang, J. H.; Yeum, Y.; Han, K. J.; Kim, D. W.; Park, C. W.
2015-12-01
Nitric nitrogen (e.g., NO3−) can be a typical groundwater pollutant, entering through domestic sewage, livestock and agricultural wastewater. Resident microflora in aquifers are known to remove nitric nitrogen spontaneously through denitrification, with a carbon source (CS) as the reactant. However, the reaction proceeds very slowly when CS is lacking, and several studies have examined controlled addition of CS (Ref #1-3). The aim of this study was to prepare a controlled-release carbon source (CR-CS) tablet and to evaluate its in vitro release profile for in situ denitrification of groundwater. CR-CS tablets were manufactured by direct compression using a hydraulic laboratory press (Carver® 3850) with an 8 mm round concave punch and die. Seven CR-CS tablet formulations were prepared to examine the nature of the additives and their ratios, including sodium silicate, dicalcium phosphate, bentonite and sand #8. For each formulation, the LOD% and flowability of the pre-mixed powders and the hardness of the compressed tablets were analyzed. An in vitro release study was performed to obtain dissolution profiles following the USP Apparatus 2 method with 900 mL of distilled water at 20 °C. The lubricated powders were compared in terms of their ability to give an acceptable dry pre-mix for the tableting process. The hardness of the compressed tablets was acceptable for all formulations tested. The in vitro release study confirmed that the different CR-CS tablet formulations have distinct release patterns, reaching 100% release at 3 h, 6 h or 12 h. The in vitro dissolution profiles correlated well with the Higuchi release kinetic model. In conclusion, this study can serve as a background for the development and evaluation of controlled-release carbon source (CR-CS) tablets for the purification of groundwater by in situ denitrification.
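The Higuchi model referred to above describes cumulative release proportional to the square root of time, Q(t) = kH·√t. A minimal fitting sketch follows; the data points and the resulting release constant are illustrative, not the study's measurements.

```python
import numpy as np

def fit_higuchi(t_hours, released_pct):
    """Least-squares fit of the Higuchi model Q(t) = kH * sqrt(t).

    Returns the release constant kH (percent per sqrt-hour) and the
    coefficient of determination R^2 of the no-intercept fit.
    """
    t = np.asarray(t_hours, dtype=float)
    q = np.asarray(released_pct, dtype=float)
    root_t = np.sqrt(t)
    # Closed-form no-intercept least squares: kH = <rt,q> / <rt,rt>.
    kH = np.sum(root_t * q) / np.sum(root_t * root_t)
    resid = q - kH * root_t
    r2 = 1.0 - np.sum(resid ** 2) / np.sum((q - q.mean()) ** 2)
    return kH, r2

# Illustrative early-time dissolution profile (percent released
# versus time in hours, before the tablet is exhausted).
kH, r2 = fit_higuchi([0.25, 0.5, 1, 2, 3], [20, 29, 41, 58, 70])
```

With kH around 41 %·h^-1/2, the model would predict complete release near 6 h, matching one of the target patterns reported above; slower formulations correspond to smaller kH.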
Hu, Frank B.
2017-01-01
Sugar-sweetened beverages (SSBs) are the single largest source of added sugar and the top source of energy intake in the US diet. In this review, we evaluate whether there is sufficient scientific evidence that decreasing SSB consumption will reduce the prevalence of obesity and its related diseases. Since prospective cohort studies address dietary determinants of long-term weight gain and chronic diseases, whereas randomized controlled trials (RCTs) typically evaluate short-term effects of specific interventions on weight change, both types of evidence are critical in evaluating causality. Findings from well-powered prospective cohorts have consistently shown a significant association, established temporality, and demonstrated a direct dose-response relationship between SSB consumption and long-term weight gain and risk of type 2 diabetes (T2D). A recently published meta-analysis of RCTs commissioned by the World Health Organization (WHO) found that decreased intake of added sugars significantly reduced body weight (0.80 kg, 95% CI 0.39 to 1.21; P<0.001), whereas increased sugar intake led to a comparable weight increase (0.75 kg, 0.30 to 1.19; P=0.001). A parallel meta-analysis of cohort studies also found that higher intake of SSBs among children was associated with 55% (95% CI 32%-82%) higher risk of being overweight or obese compared to those with lower intake. Another meta-analysis of eight prospective cohort studies found that 1–2 servings/day of SSB intake was associated with a 26% (95% CI 12–41%) greater risk of developing T2D compared to occasional intake (< 1 serving/month). Recently, two large RCTs with a high degree of compliance provided convincing data that reducing consumption of SSBs significantly decreases weight gain and adiposity in children and adolescents. Taken together, the evidence that decreasing SSBs will decrease the risk of obesity and related diseases such as T2D is compelling. Several additional issues warrant further discussion. 
First, prevention of long-term weight gain through dietary changes such as limiting consumption of SSBs is more important than short-term weight loss in reducing the prevalence of obesity in the population. This is because once an individual becomes obese, it is difficult to lose weight and keep it off. Second, we should consider the totality of evidence rather than selective pieces of evidence (e.g., from short-term RCTs only). Finally, while recognizing that the evidence of harm on health against SSBs is strong, we should avoid the trap of waiting for absolute proof before allowing public health action to be taken. PMID:23763695
Peak fitting and integration uncertainties for the Aerodyne Aerosol Mass Spectrometer
NASA Astrophysics Data System (ADS)
Corbin, J. C.; Othman, A.; Haskins, J. D.; Allan, J. D.; Sierau, B.; Worsnop, D. R.; Lohmann, U.; Mensah, A. A.
2015-04-01
The errors inherent in the fitting and integration of the pseudo-Gaussian ion peaks in Aerodyne High-Resolution Aerosol Mass Spectrometers (HR-AMSs) have not previously been addressed as a source of imprecision for these instruments. This manuscript evaluates the significance of these uncertainties and proposes a method for their estimation in routine data analysis. Peak-fitting uncertainties, the most complex source of integration uncertainties, are found to be dominated by errors in m/z calibration. These calibration errors comprise significant amounts of both imprecision and bias, and vary in magnitude from ion to ion. The magnitude of these m/z calibration errors is estimated for an exemplary data set and used to construct a Monte Carlo model, which reproduced well the observed trends in fits to the real data. The empirically-constrained model is used to show that the imprecision in the fitted height of isolated peaks scales linearly with the peak height (i.e., as n^1), thus contributing a constant-relative-imprecision term to the overall uncertainty. This constant relative imprecision term dominates the Poisson counting imprecision term (which scales as n^0.5) at high signals. The previous HR-AMS uncertainty model therefore underestimates the overall fitting imprecision. The constant relative imprecision in fitted peak height for isolated peaks in the exemplary data set was estimated as ~4%, and the overall peak-integration imprecision was approximately 5%. We illustrate the importance of this constant relative imprecision term by performing Positive Matrix Factorization (PMF) on a synthetic HR-AMS data set with and without its inclusion. Finally, the ability of an empirically-constrained Monte Carlo approach to estimate the fitting imprecision for an arbitrary number of known overlapping peaks is demonstrated. Software is available upon request to estimate these error terms in new data sets.
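The revised uncertainty model above combines a Poisson counting term (scaling as n^0.5) with a constant-relative fitting term (scaling as n). A sketch of that quadrature combination follows, taking the ~4% relative term quoted for the exemplary data set as an assumed input; it is not the full Monte Carlo model.

```python
import numpy as np

def peak_height_imprecision(n, rel_fit_error=0.04):
    """Combined 1-sigma imprecision for a fitted isolated peak of n ions.

    Adds in quadrature the Poisson counting term, sqrt(n), and the
    constant-relative fitting term, rel_fit_error * n (the ~4% value
    is the figure estimated for the exemplary data set; treat it as
    an assumption for other instruments).
    """
    n = np.asarray(n, dtype=float)
    return np.sqrt(n + (rel_fit_error * n) ** 2)

# Poisson dominates at low signal; the fitting term dominates at high
# signal, so the relative imprecision plateaus near rel_fit_error.
counts = np.array([1e2, 1e4, 1e6])
sigma = peak_height_imprecision(counts)
rel = sigma / counts
```

Dropping the second term reproduces the previous HR-AMS uncertainty model, which is why that model underestimates the imprecision at high signals.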
ERIC Educational Resources Information Center
Andreassen, Rune; Bråten, Ivar
2013-01-01
Building on prior research and theory concerning source evaluation and the role of self-efficacy in the context of online learning, this study investigated the relationship between teachers' beliefs about their capability to evaluate the trustworthiness of sources and their reliance on relevant source features when judging the trustworthiness…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Honma, George
The establishment of a systematic process for the evaluation of historic technology information for use in advanced reactor licensing is described. Efforts are underway to recover and preserve Experimental Breeder Reactor II and Fast Flux Test Facility historical data. These efforts have generally emphasized preserving information from data-acquisition systems and hard-copy reports and entering it into modern electronic formats suitable for data retrieval and examination. The guidance contained in this document has been developed to facilitate consistent and systematic evaluation processes relating to quality attributes of historic technical information (with focus on sodium-cooled fast reactor (SFR) technology) that will be used eventually to support licensing of advanced reactor designs. The historical information may include, but is not limited to, design documents for SFRs, research-and-development (R&D) data and associated documents, test plans and associated protocols, operations and test data, international research data, technical reports, and information associated with past U.S. Nuclear Regulatory Commission (NRC) reviews of SFR designs. The evaluation process is prescribed in terms of SFR technology, but the process can be used to evaluate historical information for any type of advanced reactor technology. An appendix provides a discussion of typical issues that should be considered when evaluating and qualifying historical information for advanced reactor technology fuel and source terms, based on current light water reactor (LWR) requirements and recent experience gained from Next Generation Nuclear Plant (NGNP).
NASA Astrophysics Data System (ADS)
Smith, N.; Blewitt, D.; Hebert, L. B.
2015-12-01
In coordination with oil and gas operators, we developed a high resolution (< 1 min) simulation of temporal variability in well-pad oil and gas emissions over a year. We include routine emissions from condensate tanks, dehydrators, pneumatic devices, fugitive leaks and liquids unloading. We explore the variability in natural gas emissions from these individual well-pad sources, and find that routine short-term episodic emissions such as tank flashing and liquids unloading result in the appearance of a skewed, or 'fat-tail', distribution of emissions from an individual well-pad over time. Additionally, we explore the expected variability in emissions from multiple wells with different raw gas composition, gas/liquids production volumes and control equipment. Differences in well-level composition, production volume and control equipment translate into differences in well-level emissions, leading to a fat-tail distribution of emissions even in the absence of operational upsets. Our results have several implications for recent studies focusing on emissions from oil and gas sources. The time scale of emission estimates is important and has significant policy implications. Fat-tail distributions may not be entirely driven by avoidable mechanical failures, and are expected to occur under routine operational conditions from short-duration emissions (e.g., tank flashing, liquids unloading). An understanding of the expected distribution of emissions for a particular population of wells is necessary to evaluate whether an observed distribution is more skewed than expected. Temporal variability in well-pad emissions makes comparisons to annual average emissions inventories difficult and may complicate the interpretation of long-term ambient fenceline monitoring data. Sophisticated change detection algorithms will be necessary to distinguish true operational upsets from routine short-term emissions.
The mass-zero spin-two field and gravitational theory.
NASA Technical Reports Server (NTRS)
Coulter, C. A.
1972-01-01
Demonstration that the conventional theory of the mass-zero spin-two field with sources introduces extraneous nonspin-two field components in source regions and fails to be covariant under the full or restricted conformal group. A modified theory is given, expressed in terms of the physical components of mass-zero spin-two field rather than in terms of 'potentials,' which has no extraneous components inside or outside sources, and which is covariant under the full conformal group. For a proper choice of source term, this modified theory has the correct Newtonian limit and automatically implies that a symmetric second-rank source tensor has zero divergence. It is shown that possibly a generally covariant form of the spin-two theory derived here can be constructed to agree with general relativity in all currently accessible experimental situations.
Thermal electron heating rate: A derivation
NASA Technical Reports Server (NTRS)
Hoegy, W. R.
1983-01-01
The thermal electron heating rate is an important heat source term in the ionospheric electron energy balance equation, representing heating by photoelectrons or by precipitating higher energy electrons. A formula for the thermal electron heating rate is derived from the kinetic equation using the electron-electron collision operator as given by the unified theory of Kihara and Aono. This collision operator includes collective interactions to produce a finite collision operator with an exact Coulomb logarithm term. The derived heating rate O(e) is the sum of three terms, O(e) = O(p) + S + O(int), which are, respectively: (1) a primary electron production term giving the heating from newly created electrons that have not yet suffered collisions with the ambient electrons; (2) a heating term evaluated on the energy surface m(e)v^2/2 = E(T), the transition energy between Maxwellian and tail electrons; and (3) an integral term representing heating of Maxwellian electrons by energetic tail electrons at energies above E(T). Published ionospheric electron temperature studies used only the integral term O(int), with differing lower integration limits. Use of this incomplete heating rate could lead to erroneous conclusions regarding electron heat balance, since O(e) is greater than O(int) by as much as a factor of two.
NASA Astrophysics Data System (ADS)
Weiss, W. J.; Becker, W.; Schindler, S.
2012-12-01
The United States Environmental Protection Agency's 2006 Stage 2 Disinfectant / Disinfection Byproduct Rule (DBPR) for finished drinking waters is intended to reduce overall DBP levels by limiting the levels of total trihalomethanes (TTHM) and five of the haloacetic acids (HAA5). Under Stage 2, maximum contaminant levels (MCLs), 80 μg/L for TTHM and 60 μg/L for HAA5, are based on a locational running annual average for individual sites instead of as the system-wide quarterly running annual average of the Stage 1 DBPR. This means compliance will have to be met at sampling locations of peak TTHM and HAA5 concentrations rather than an average across the entire system. Compliance monitoring under the Stage 2 DBPR began on April 1, 2012. The New York City (NYC) Department of Environmental Protection (DEP) began evaluating potential impacts of the Stage 2 DBPR on NYC's unfiltered water supply in 2002 by monitoring TTHM and HAA5 levels at various locations throughout the distribution system. Initial monitoring indicated that HAA5 levels could be of concern in the future, with the potential to intermittently violate the Stage 2 DBPR at specific locations, particularly those with high water age. Because of the uncertainty regarding the long-term prospect for compliance, DEP evaluated alternatives to ensure compliance, including operational changes (reducing chlorine dose, changing flow configurations to minimize water age, altering pH, altering source water withdrawals); changing the residual disinfectant from free chlorine to chloramines; and engineered treatment alternatives. This paper will discuss the potential for using DEP's Operations Support Tool (OST) and enhanced reservoir monitoring to support optimization of source water withdrawals to minimize finished water DBP levels. The OST is a state-of-the-art decision support system (DSS) to provide computational and predictive support for water supply operations and planning. 
It incorporates a water supply system simulation model (OASIS, HydroLogics, Inc.), reservoir water quality models, a near real-time monitoring network, and hydrologic forecasts to provide analytical support for both long-term planning and near-term operations. The OST helps managers and operators balance multiple objectives, including water supply reliability, water quality, and environmental and community release objectives. This paper will describe the results of initial testing to evaluate the potential to reduce DBP levels by managing source water withdrawals to minimize the transport of natural organic matter (NOM) precursors from upper reservoirs. Operating rules were developed that take advantage of selective withdrawal capabilities at some upstate reservoirs and the inherent flexibility of the overall water supply system, seeking to minimize DBPs within the larger framework of water supply, water quality, environmental, and regulatory objectives. The results demonstrated that there is substantial flexibility within the system to manage DBP levels, in some cases providing the potential for reductions of DBP precursors of nearly 10%. Additional research is underway that seeks to better understand the sources of natural organic matter in the NYC watershed to provide guidance for on-line monitoring to be used with the OST to support real-time operation support for DBP control.
NASA Astrophysics Data System (ADS)
Dufour, Gaëlle; Albergel, Armand; Balkanski, Yves; Beekmann, Matthias; Cai, Zhaonan; Fortems-Cheiney, Audrey; Cuesta, Juan; Derognat, Claude; Eremenko, Maxim; Foret, Gilles; Hauglustaine, Didier; Lachatre, Matthieu; Laurent, Benoit; Liu, Yi; Meng, Fan; Siour, Guillaume; Tao, Shu; Velay-Lasry, Fanny; Zhang, Qijie; Zhang, Yuli
2017-04-01
The rapid economic development and urbanization of China during the last decades resulted in rising emissions, leading to some of the highest concentrations in the world of the major pollutants (ozone, PM2.5, and PM10). Robust monitoring and forecasting systems, associated with downstream services providing comprehensive risk indicators, are highly needed to establish efficient pollution mitigation strategies. In addition, a precise evaluation of the present and future impacts of Chinese pollutant emissions is of importance to quantify: first, the consequences of pollutant export on atmospheric composition and air quality all over the globe; second, the additional radiative forcing induced by the emitted and produced short-lived climate forcers (ozone and aerosols); third, the long-term health consequences of pollution exposure. To achieve this, a detailed understanding of East Asian pollution is necessary. The French PolEASIA project aims to address these different issues by providing a better quantification of major pollutant sources and distributions as well as of their recent and future evolution. The main objectives, methodologies and tools of this newly started 4-year project will be presented. An ambitious synergistic and multi-scale approach coupling innovative satellite observations, in situ measurements and chemical transport model simulations will be developed to characterize the spatial distribution, the interannual to daily variability and the trends of the major pollutants (ozone and aerosols) and their sources over East Asia, and to quantify the role of the different processes (emissions, transport, chemical transformation) driving the observed pollutant distributions. Particular attention will be paid to assessing the natural and anthropogenic contributions to East Asian pollution. 
Progress made in the understanding of pollutant sources, especially in modeling of pollution over East Asia and in advanced numerical approaches such as inverse modeling, will support the development of an efficient and marketable forecasting system for regional outdoor air pollution. The performance of this upgraded forecasting system will be evaluated and promoted to ensure good visibility for the French technology. In addition, the contribution of Chinese pollution to the regional and global atmospheric composition, as well as the resulting radiative forcing of short-lived species, will be determined using both satellite observations and model simulations. Health Impact Assessment (HIA) methods coupled with model simulations will be used to estimate the long-term impacts of exposure to pollutants (PM2.5 and ozone) on cardiovascular and respiratory mortality. First results obtained in this framework will be presented.
Choi, Moo Jin; Choi, Byung Tae; Shin, Hwa Kyoung; Shin, Byung Cheul; Han, Yoo Kyoung; Baek, Jin Ung
2015-01-01
The major objectives of this study were to provide a list of candidate antiaging medicinal herbs that have been widely utilized in Korean medicine and to organize preliminary data for the benefit of experimental and clinical researchers to develop new drug therapies by analyzing previous studies. “Dongeuibogam,” a representative source of the Korean medicine literature, was selected to investigate candidate antiaging medicinal herbs and to identify appropriate terms that describe the specific antiaging effects that these herbs are predicted to elicit. In addition, we aimed to review previous studies that referenced the selected candidate antiaging medicinal herbs. From our chosen source, “Dongeuibogam,” we were able to screen 102 terms describing antiaging effects, which were further classified into 11 subtypes. Ninety-seven candidate antiaging medicinal herbs were selected using the criterion that their antiaging effects were described using the same terms as those employed in “Dongeuibogam.” These candidates were classified into 11 subtypes. Of the 97 candidate antiaging medicinal herbs selected, 47 are widely used by Korean medical doctors in Korea and were selected for further analysis of their antiaging effects. Overall, we found an average of 7.7 previous studies per candidate herb that described their antiaging effects. PMID:25861371
ERIC Educational Resources Information Center
Mitchell, Karen J.; Raye, Carol L.; McGuire, Joseph T.; Frankel, Hillary; Greene, Erich J.; Johnson, Marcia K.
2008-01-01
A short-term source monitoring procedure with functional magnetic resonance imaging assessed neural activity when participants made judgments about the format of 1 of 4 studied items (picture, word), the encoding task performed (cost, place), or whether an item was old or new. The results support findings from long-term memory studies showing that…
Infrared and visible image fusion with spectral graph wavelet transform.
Yan, Xiang; Qin, Hanlin; Li, Jia; Zhou, Huixin; Zong, Jing-guo
2015-09-01
Infrared and visible image fusion is a popular topic in image analysis because it can integrate complementary information and obtain a reliable and accurate description of scenes. Multiscale transform theory, as a signal representation method, is widely used in image fusion. In this paper, a novel infrared and visible image fusion method is proposed based on spectral graph wavelet transform (SGWT) and the bilateral filter. The main novelty of this study is that SGWT is used for image fusion. On the one hand, source images are decomposed by SGWT in its transform domain; the proposed approach not only effectively preserves the details of the different source images but also represents their irregular areas well. On the other hand, a novel weighted average method based on the bilateral filter is proposed to fuse low- and high-frequency subbands by taking advantage of the spatial consistency of natural images. Experimental results demonstrate that the proposed method outperforms seven recently proposed image fusion methods in terms of both visual effect and objective evaluation metrics.
MICROBIAL LABORATORY GUIDANCE MANUAL FOR THE ...
The Long-Term 2 Enhanced Surface Water Treatment Rule Laboratory Instruction Manual will be a compilation of all information needed by laboratories and field personnel to collect, analyze, and report the microbiological data required under the rule. The manual will provide laboratories with a single source of information that currently is available from various sources including the latest versions of Methods 1622 and 1623, including all approved, equivalent modifications; the procedures for E.coli methods approved for use under the LT2ESWTR; lists of vendor sources; data recording forms; data reporting requirements; information on the Laboratory Quality Assurance Evaluation Program for the Analysis of Cryptosporidium in Water; and sample collection procedures. Although most of this information is available elsewhere, a single, comprehensive compendium containing this information is needed to aid utilities and laboratories performing the sampling and analysis activities required under the LT2 rule. This manual will serve as an instruction manual for laboratories to use when collecting data for Crypto, E. coli and turbidity.
Karlinger, M.R.; Troutman, B.M.
1985-01-01
An instantaneous unit hydrograph (iuh) based on the theory of topologically random networks (topological iuh) is evaluated in terms of sets of basin characteristics and hydraulic parameters. Hydrographs were computed using two linear routing methods for each of two drainage basins in the southeastern United States and are the basis of comparison for the topological iuh's. Elements in the sets of basin characteristics for the topological iuh's are either the number of first-order streams alone (N), or the number of sources together with the number of channel links in the topological diameter (N, D); the hydraulic parameters are values of the celerity and diffusivity constant. Sensitivity analyses indicate that the mean celerity of the internal links in the network is the critical hydraulic parameter for determining the shape of the topological iuh, while the diffusivity constant has minimal effect on the topological iuh. Asymptotic results (source-only) indicate the number of sources need not be large to approximate the topological iuh with the Weibull probability density function.
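The asymptotic source-only result can be sketched by evaluating the Weibull density directly; the shape and scale values below are hypothetical stand-ins for parameters that would, in the study, be tied to the basin characteristics (N) and the mean internal-link celerity.

```python
import math

def weibull_pdf(t, shape, scale):
    """Weibull probability density: f(t) = (k/c) * (t/c)**(k-1) * exp(-(t/c)**k)."""
    if t < 0:
        return 0.0
    return (shape / scale) * (t / scale) ** (shape - 1) * math.exp(-(t / scale) ** shape)

# Hypothetical parameters for illustration only: in practice, shape and
# scale would be estimated from the basin's source count and celerity.
iuh_approx = [weibull_pdf(t, shape=2.0, scale=3.0) for t in range(12)]
```

Like any iuh, the density integrates to one over time, so the ordinates above can be read as a unit-response curve sampled at hourly (or other) intervals.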
NASA Technical Reports Server (NTRS)
Clapp, J. L.
1973-01-01
Research objectives during 1972-73 were to: (1) Ascertain the extent to which special aerial photography can be operationally used in monitoring water pollution parameters. (2) Ascertain the effectiveness of remote sensing in the investigation of nearshore mixing and coastal entrapment in large water bodies. (3) Develop an explicit relationship of the extent of the mixing zone in terms of the outfall, effluent and water body characteristics. (4) Develop and demonstrate the use of the remote sensing method as an effective legal implement through which administrative agencies and courts can not only investigate possible pollution sources but also legally prove the source of water pollution. (5) Evaluate the field potential of remote sensing techniques in monitoring algal blooms and aquatic macrophytes, and the use of these as indicators of lake eutrophication level. (6) Develop a remote sensing technique for the determination of the location and extent of hydrologically active source areas in a watershed.
NASA Astrophysics Data System (ADS)
Isaac-Renton, Miriam; Montwé, David; Hamann, Andreas; Spiecker, Heinrich; Cherubini, Paolo; Treydte, Kerstin
2016-04-01
Choosing drought-tolerant seed sources for reforestation may help adapt forests to climate change. By combining dendroecological growth analysis with a long-term provenance trial, we assessed growth and drought tolerance of different populations of a wide-ranging conifer, lodgepole pine (Pinus contorta). This experimental design simulated a climate warming scenario through southward seed transfer, and an exceptional drought also occurred in 2002. We felled over 500 trees, representing 23 seed sources, which were grown for 32 years at three warm, dry sites in southern British Columbia, Canada. Northern populations showed poor growth and drought tolerance. These seed sources therefore appear to be especially at risk under climate change. Before recommending assisted migration of southern seeds towards the north, however, it is important to understand the physiological mechanisms underlying these responses. We combine functional wood anatomy with a dual-isotope approach to evaluate the mechanisms underlying drought response.
NASA thesaurus. Volume 3: Definitions
NASA Technical Reports Server (NTRS)
1988-01-01
Publication of NASA Thesaurus definitions began with Supplement 1 to the 1985 NASA Thesaurus. The definitions given here represent the complete file of over 3,200 definitions, complemented by nearly 1,000 use references. Definitions of more common or general scientific terms are given a NASA slant if one exists. Certain terms are not defined as a matter of policy: common names, chemical elements, specific models of computers, and nontechnical terms. The NASA Thesaurus predates by a number of years the systematic effort to define terms; therefore not all Thesaurus terms have been defined. Nevertheless, definitions of older terms are continually being added. The following data are provided for each entry: term in uppercase/lowercase form, definition, source, and year the term (not the definition) was added to the NASA Thesaurus. The NASA History Office is the authority for capitalization in satellite and spacecraft names. Definitions with no source given were constructed by lexicographers at the NASA Scientific and Technical Information (STI) Facility, who rely on the following sources for their information: experts in the field, literature searches from the NASA STI database, and specialized references.
Al-Khaza'leh, Ja'far Mansur; Reiber, Christoph; Al Baqain, Raid; Valle Zárate, Anne
2015-01-01
Goat production is an important agricultural activity in Jordan. The country is one of the most water-scarce countries in the world. Provision of a sufficient quantity of good-quality drinking water is important for goats to maintain feed intake and production. This study aimed to evaluate the seasonal availability and quality of goats' drinking water sources, accessibility, and utilization in different zones in the Karak Governorate in southern Jordan. Data collection methods comprised interviews with purposively selected farmers and quality assessment of water sources. The provision of drinking water was considered one of the major constraints for goat production, particularly during the dry season (DS). Long travel distances to the water sources, waiting time at watering points, and high fuel and labor costs were the key reasons associated with the problem. All values of the water quality (WQ) parameters were within acceptable limits of the guidelines for livestock drinking WQ, with the exception of iron, which showed slightly elevated concentration in one borehole source in the DS. These findings show that water shortage is an important problem leading to consequences for goat keepers. To alleviate the water shortage constraint, and in view of the depleted groundwater sources, alternative water sources at reasonable distance have to be tapped and monitored for water quality, and more efficient use of rainwater harvesting systems in the study area is recommended.
Echolocation versus echo suppression in humans
Wallmeier, Ludwig; Geßele, Nikodemus; Wiegrebe, Lutz
2013-01-01
Several studies have shown that blind humans can gather spatial information through echolocation. However, when localizing sound sources, the precedence effect suppresses spatial information of echoes, and thereby conflicts with effective echolocation. This study investigates the interaction of echolocation and echo suppression in terms of discrimination suppression in virtual acoustic space. In the ‘Listening’ experiment, sighted subjects discriminated between positions of a single sound source, the leading or the lagging of two sources, respectively. In the ‘Echolocation’ experiment, the sources were replaced by reflectors. Here, the same subjects evaluated echoes generated in real time from self-produced vocalizations and thereby discriminated between positions of a single reflector, the leading or the lagging of two reflectors, respectively. Two key results were observed. First, sighted subjects can learn to discriminate positions of reflective surfaces echo-acoustically with accuracy comparable to sound source discrimination. Second, in the Listening experiment, the presence of the leading source affected discrimination of lagging sources much more than vice versa. In the Echolocation experiment, however, the presence of both the lead and the lag strongly affected discrimination. These data show that the classically described asymmetry in the perception of leading and lagging sounds is strongly diminished in an echolocation task. Additional control experiments showed that the effect is owing to both the direct sound of the vocalization that precedes the echoes and owing to the fact that the subjects actively vocalize in the echolocation task. PMID:23986105
Retrieving definitional content for ontology development.
Smith, L; Wilbur, W J
2004-12-01
Ontology construction requires an understanding of the meaning and usage of its encoded concepts. While definitions found in dictionaries or glossaries may be adequate for many concepts, the actual usage in expert writing could be a better source of information for many others. The goal of this paper is to describe an automated procedure for finding definitional content in expert writing. The approach uses machine learning on phrasal features to learn when sentences in a book contain definitional content, as determined by their similarity to glossary definitions provided in the same book. The end result is not a concise definition of a given concept, but for each sentence, a predicted probability that it contains information relevant to a definition. The approach is evaluated automatically for terms with explicit definitions, and manually for terms with no available definition.
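A toy sketch of the idea follows. The cue phrases, weights, and logistic squashing below are invented for illustration; the paper itself learns a model from phrasal features of sentences matched against glossary definitions, rather than using a hand-written list.

```python
import math

# Hypothetical cue phrases and weights -- a stand-in for learned
# phrasal features, not the paper's actual feature set.
DEF_CUES = {
    "is defined as": 2.0,
    "refers to": 1.5,
    "is a term for": 1.5,
    "denotes": 1.0,
}

def definitional_score(sentence):
    """Crude stand-in for a learned classifier: sum cue-phrase weights,
    then squash to a (0, 1) pseudo-probability with a logistic."""
    s = sentence.lower()
    raw = sum(w for cue, w in DEF_CUES.items() if cue in s)
    return 1.0 / (1.0 + math.exp(-(raw - 1.0)))

high = definitional_score("An ontology is defined as a formal specification.")
low = definitional_score("We ran the experiment twice.")
```

As in the paper, the output is not a concise definition but a per-sentence score indicating how likely the sentence is to carry definitional content.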
A multi-scalar PDF approach for LES of turbulent spray combustion
NASA Astrophysics Data System (ADS)
Raman, Venkat; Heye, Colin
2011-11-01
A comprehensive joint-scalar probability density function (PDF) approach is proposed for large eddy simulation (LES) of turbulent spray combustion and tests are conducted to analyze the validity and modeling requirements. The PDF method has the advantage that the chemical source term appears closed but requires models for the small scale mixing process. A stable and consistent numerical algorithm for the LES/PDF approach is presented. To understand the modeling issues in the PDF method, direct numerical simulation of a spray flame at three different fuel droplet Stokes numbers and an equivalent gaseous flame are carried out. Assumptions in closing the subfilter conditional diffusion term in the filtered PDF transport equation are evaluated for various model forms. In addition, the validity of evaporation rate models in high Stokes number flows is analyzed.
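The closure advantage can be illustrated with a hypothetical Arrhenius-type source term: a PDF method effectively averages the exact rate over samples of the subfilter state, whereas evaluating the rate at the mean state (what a naive filtered approach would do without a model) gives a different answer because the rate is strongly nonlinear. All numbers below are invented for illustration.

```python
import math
import random

def rate(T):
    """Hypothetical Arrhenius-type chemical source term, S(T) = A * exp(-Ta / T)."""
    return 1.0e6 * math.exp(-8000.0 / T)

random.seed(0)
# Gaussian subfilter temperature samples standing in for the PDF
# method's notional particles (mean 1200 K, rms 150 K -- made-up values).
samples = [random.gauss(1200.0, 150.0) for _ in range(20000)]

# In a PDF method the filtered source term is just the sample average of
# the exact rate -- the chemistry needs no closure model:
closed_mean = sum(rate(T) for T in samples) / len(samples)

# Evaluating the rate at the mean temperature underpredicts, because
# the rate is convex in T over this range (Jensen's inequality):
naive = rate(sum(samples) / len(samples))
```

The gap between `closed_mean` and `naive` is exactly the quantity that conventional filtered approaches must model; in the PDF transport equation the modeling burden moves instead to the small-scale mixing term.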
The Role of Near-Earth Asteroids in Long-Term Platinum Supply
NASA Astrophysics Data System (ADS)
Blair, B. R.
2000-01-01
High-grade platinum-group metal concentrations have been identified in an abundant class of near-Earth asteroids known as LL Chondrites. The potential existence of a high-value asteroid-derived mineral product is examined from an economic perspective to assess the possible impacts on long-term precious metal supply. It is hypothesized that extraterrestrial sources of platinum group metals will become available in the global marketplace in a 20-year time frame, based on current trends of growth in technology and increasing levels of human activities in near-Earth space. Current and projected trends in platinum supply and demand are cited from the relevant literature to provide an economic context and provide an example for evaluating the economic potential of future asteroid-derived precious and strategic metals.
Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)
NASA Astrophysics Data System (ADS)
Jordan, T. H.
2010-12-01
The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. 
(a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and they need to convey the epistemic uncertainties in the operational forecasts. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. All operational procedures should be rigorously reviewed by experts in the creation, delivery, and utility of earthquake forecasts. (c) The quality of all operational models should be evaluated for reliability and skill by retrospective testing, and the models should be under continuous prospective testing in a CSEP-type environment against established long-term forecasts and a wide variety of alternative, time-dependent models. (d) Short-term models used in operational forecasting should be consistent with the long-term forecasts used in PSHA. (e) Alert procedures should be standardized to facilitate decisions at different levels of government and among the public, based in part on objective analysis of costs and benefits. (f) In establishing alert procedures, consideration should also be made of the less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. Authoritative statements of increased risk, even when the absolute probability is low, can provide a psychological benefit to the public by filling information vacuums that can lead to informal predictions and misinformation.
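The low-probability-environment arithmetic described above is simple; with hypothetical round numbers (the baseline and gain below are illustrative, not from any specific forecast):

```python
# Illustrative only: even a hundredfold probability gain over a
# long-term baseline leaves the short-term probability small.
baseline_weekly_prob = 1.0e-4   # long-term, PSHA-style weekly probability
gain = 100.0                    # nominal probability gain during a sequence
short_term_prob = baseline_weekly_prob * gain

# short_term_prob is 0.01, i.e. 1% -- a hundredfold elevation, yet
# still a "low-probability environment" for decision makers.
```

This is why OEF guidance emphasizes communicating epistemic uncertainty and psychological preparedness rather than treating elevated probabilities as predictions.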
On the inclusion of mass source terms in a single-relaxation-time lattice Boltzmann method
NASA Astrophysics Data System (ADS)
Aursjø, Olav; Jettestuen, Espen; Vinningland, Jan Ludvig; Hiorth, Aksel
2018-05-01
We present a lattice Boltzmann algorithm for incorporating a mass source in a fluid flow system. The proposed mass source/sink term, included in the lattice Boltzmann equation, maintains the Galilean invariance and the accuracy of the overall method, while introducing a mass source/sink term in the fluid dynamical equations. The method can, for instance, be used to inject or withdraw fluid from any preferred lattice node in a system. This suggests that injection and withdrawal of fluid do not have to be introduced through cumbersome, and sometimes less accurate, boundary conditions. The method also suggests that, through a chosen equation of state relating mass density to pressure, the proposed mass source term makes it possible to set a preferred pressure at any lattice node in a system. We demonstrate how this model handles injection and withdrawal of a fluid, and we show how it can be used to incorporate pressure boundaries. The accuracy of the algorithm is established through a Chapman-Enskog expansion of the model and supported by numerical simulations.
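The injection mechanism described above can be illustrated with a minimal sketch: a single-relaxation-time D1Q3 lattice in which a per-step mass source is distributed over the distributions by the lattice weights. The weights, the source-distribution rule, and all parameters here are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

# Minimal single-relaxation-time D1Q3 lattice with a mass source at one node.
# The per-step source q is spread over distributions by lattice weight; this
# distribution rule and all parameters are illustrative assumptions.
nx, tau, steps = 64, 0.8, 100
w = np.array([4/6, 1/6, 1/6])        # D1Q3 weights (rest, +1, -1)
c = np.array([0, 1, -1])             # lattice velocities
f = w[:, None] * np.ones((3, nx))    # uniform initial density rho = 1
src_node, q = nx // 2, 0.01          # inject mass q per step at the center

for _ in range(steps):
    rho = f.sum(axis=0)
    u = (c[:, None] * f).sum(axis=0) / rho
    feq = np.array([w[i] * rho * (1 + 3*c[i]*u + 4.5*(c[i]*u)**2 - 1.5*u**2)
                    for i in range(3)])
    f += -(f - feq) / tau            # BGK collision (mass conserving)
    f[:, src_node] += w * q          # mass source term at one node
    for i in range(3):               # streaming with periodic boundaries
        f[i] = np.roll(f[i], c[i])

print(round(float(f.sum() - nx), 6))  # injected mass = steps * q = 1.0
```

Because collision and streaming conserve mass exactly, total mass grows by exactly q per step, which is a quick sanity check on any source-term implementation.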
Extraction of antioxidants from Chlorella sp. using subcritical water treatment
NASA Astrophysics Data System (ADS)
Zakaria, S. M.; Mustapa Kamal, S. M.; Harun, M. R.; Omar, R.; Siajam, S. I.
2017-06-01
Chlorella sp. microalgae are one of the main sources of natural bioactive compounds used in the food and pharmaceutical industries. Subcritical water extraction is a technique that offers an efficient, non-toxic, and environmentally friendly method to obtain natural ingredients. In this work, extracts of Chlorella sp. microalgae were evaluated in terms of chemical composition, extraction (polysaccharide) yield, and antioxidant activity, using subcritical water extraction. Extractions were performed at temperatures ranging from 100°C to 300°C. The results show that, using subcritical water, the highest polysaccharide yield, 23.6, was obtained at 150°C. Analysis of the polysaccharide yields showed that the contents were highly influenced by the extraction temperature. The individual antioxidant activity was evaluated by in vitro assay using a free radical method. In general, the antioxidant activity of the extracts obtained at different water temperatures was high, with values of 31.08-54.29. The results indicated that extraction by subcritical water is effective and that Chlorella sp. can be a useful source of natural antioxidants.
Validation of the digital opacity compliance system under regulatory enforcement conditions.
McFarland, Michael J; Rasmussen, Steve L; Stone, Daniel A; Palmer, Glenn R; Wander, Joseph D
2006-09-01
The U.S. Environmental Protection Agency (EPA) Emission Measurement Center, in conjunction with EPA Regions VI and VIII, the state of Utah, and the U.S. Department of Defense, has conducted a series of long-term pilot and field tests to determine the accuracy and reliability of a visible opacity monitoring system consisting of a conventional digital camera and a separate computer software application for plume opacity determination. This technology, known as the Digital Opacity Compliance System (DOCS), has been successfully demonstrated at EPA-sponsored Method 9 "smoke schools", as well as at a number of government and commercially operated industrial facilities. Results from the current DOCS regulatory pilot study demonstrated that, under regulatory enforcement conditions, the average difference in opacity measurement between the DOCS technology and EPA Reference Method 9 (Method 9) was 1.12%. This opacity difference, which was computed from the evaluation of 241 regulated air sources, was found to be statistically significant at the 99% confidence level. In evaluating only those sources for which a nonzero visible opacity level was recorded, the
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diaz, Aaron A.; Baldwin, David L.; Cinson, Anthony D.
2014-08-06
This Technical Letter Report satisfies the M3AR-14PN2301022 milestone, and is focused on identifying and quantifying the mechanistic sources of sensor performance variation between individual 22-element, linear phased-array sensor prototypes, SN1 and SN2. This effort constitutes an iterative evolution that supports the longer term goal of producing and demonstrating a pre-manufacturing prototype ultrasonic probe that possesses the fundamental performance characteristics necessary to enable the development of a high-temperature sodium-cooled fast reactor inspection system. The scope of the work for this portion of the PNNL effort conducted in FY14 includes performing a comparative evaluation and assessment of the performance characteristics of the SN1 and SN2 22-element PA-UT probes manufactured at PNNL. Key transducer performance parameters, such as sound field dimensions, resolution capabilities, frequency response, and bandwidth, are used as a metric for the comparative evaluation and assessment of the SN1 and SN2 engineering test units.
Tomini, F; Prinzen, F; van Asselt, A D I
2016-12-01
Cardiac resynchronization therapy with a biventricular pacemaker (CRT-P) is an effective treatment for dyssynchronous heart failure (DHF). Adding an implantable cardioverter defibrillator (CRT-D) may further reduce the risk of sudden cardiac death (SCD). However, if the majority of patients do not require shock therapy, the cost-effectiveness ratio of CRT-D compared to CRT-P may be high. The objective of this study was to systematically review decision models evaluating the cost-effectiveness of CRT-D for patients with DHF, compare the structure and inputs of these models and identify the main factors influencing the ICERs for CRT-D. A comprehensive search strategy of Medline (Ovid), Embase (Ovid) and EconLit identified eight cost-effectiveness models evaluating CRT-D against optimal pharmacological therapy (OPT) and/or CRT-P. The selected economic studies differed in terms of model structure, treatment path, time horizons, and sources of efficacy data. CRT-D was found cost-effective when compared to OPT but its cost-effectiveness became questionable when compared to CRT-P. Cost-effectiveness of CRT-D may increase depending on improvement of all-cause mortality rates and HF mortality rates in patients who receive CRT-D, costs of the device, and battery life. In particular, future studies need to investigate longer-term mortality rates and identify CRT-P patients that will gain the most, in terms of life expectancy, from being treated with a CRT-D.
PASTE: patient-centered SMS text tagging in a medication management system.
Stenner, Shane P; Johnson, Kevin B; Denny, Joshua C
2012-01-01
To evaluate the performance of a system that extracts medication information and administration-related actions from patient short message service (SMS) messages. Mobile technologies provide a platform for electronic patient-centered medication management. MyMediHealth (MMH) is a medication management system that includes a medication scheduler, a medication administration record, and a reminder engine that sends text messages to cell phones. The object of this work was to extend MMH to allow two-way interaction using mobile phone-based SMS technology. Unprompted text-message communication with patients using natural language could engage patients in their healthcare, but presents unique natural language processing challenges. The authors developed a new functional component of MMH, the Patient-centered Automated SMS Tagging Engine (PASTE). The PASTE web service uses natural language processing methods, custom lexicons, and existing knowledge sources to extract and tag medication information from patient text messages. A pilot evaluation of PASTE was completed using 130 medication messages anonymously submitted by 16 volunteers via a website. System output was compared with manually tagged messages. Verified medication names, medication terms, and action terms reached high F-measures of 91.3%, 94.7%, and 90.4%, respectively. The overall medication name F-measure was 79.8%, and the medication action term F-measure was 90%. Other studies have demonstrated systems that successfully extract medication information from clinical documents using semantic tagging, regular expression-based approaches, or a combination of both approaches. This evaluation demonstrates the feasibility of extracting medication information from patient-generated medication messages.
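The F-measures reported above combine precision and recall of the extracted tags against the manually tagged gold standard. A minimal sketch of that computation on an invented message follows; the medication names and tag sets are made-up examples, not PASTE output.

```python
# F-measure of predicted tags against a manually tagged gold standard.
# The example tag sets below are invented for illustration.
def f_measure(predicted, gold):
    tp = len(predicted & gold)                 # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    return 2 * precision * recall / (precision + recall) if tp else 0.0

gold = {"lisinopril", "10 mg", "took"}         # manual annotation
predicted = {"lisinopril", "10 mg", "skipped"} # system output
print(round(f_measure(predicted, gold), 3))    # 0.667
```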
NASA Astrophysics Data System (ADS)
Madani, K.; Jess, T.; Mahlooji, M.; Ristic, B.
2015-12-01
The world's energy sector is experiencing a serious transition from reliance on fossil fuel energy sources to extensive reliance on renewable energies. Europe is leading the way in this transition to a low carbon economy in an attempt to keep climate change below 2°C. Member States have committed themselves to reducing greenhouse gas emissions by 20% and increasing the share of renewables in the EU's energy mix to 20% by 2020. The EU has now gone a step further with the objective of reducing greenhouse gas emissions by 80-95% by 2050. Nevertheless, the short-term focus of the European Commission is on "cost-efficient ways" to cut its greenhouse gas emissions, which overlooks the unintended impacts of a large expansion of low-carbon energy technologies on major natural resources such as water and land. This study uses the "System of Systems (SoS) Approach to Energy Sustainability Assessment" (Hadian and Madani, 2015) to evaluate the Relative Aggregate Footprint (RAF) of energy sources in different European Union (EU) member states. RAF reflects the overall resource-use efficiency of energy sources with respect to four criteria: carbon footprint, water footprint, land footprint, and economic cost. Weights are assigned to the four resource-use efficiency criteria based on each member state's varying natural and economic resources, to examine changes in the desirability of energy sources under regional resource availability conditions and to help evaluate the overall resource-use efficiency of the EU's energy portfolio. A longer-term strategy in Europe has been devised under the "Resource Efficient Europe" flagship initiative, intended to put the EU on course to using resources in a sustainable way. This study will highlight the resource efficiency of the EU's energy sector in order to assist in a sustainable transition to a low carbon economy in Europe.
Reference: Hadian S, Madani K (2015) A System of Systems Approach to Energy Sustainability Assessment: Are All Renewables Really Green? Ecological Indicators, 52, 194-206.
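The RAF aggregation step described above can be sketched as a weighted sum over the four criteria. The footprint values and weights below are invented placeholders for illustration, not figures from the study.

```python
# Sketch of a Relative Aggregate Footprint (RAF) style score: a weighted sum
# of normalized criteria per energy source. All numbers are invented.
footprints = {  # per unit energy: carbon, water, land, cost (normalized 0-1)
    "wind":  [0.1, 0.2, 0.6, 0.5],
    "solar": [0.2, 0.1, 0.4, 0.7],
    "coal":  [1.0, 0.8, 0.3, 0.3],
}
weights = [0.4, 0.3, 0.1, 0.2]  # could reflect a member state's resource scarcity

def raf(scores, weights):
    """Weighted aggregate footprint; lower means more resource-efficient."""
    return sum(s * w for s, w in zip(scores, weights))

ranked = sorted(footprints, key=lambda k: raf(footprints[k], weights))
print(ranked)  # ['wind', 'solar', 'coal'] under these example weights
```

In the study itself the weights vary by member state, so the same source can rank differently across countries; the point of the sketch is only the aggregation mechanics.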
NASA Astrophysics Data System (ADS)
Hayashi, I.; Kaneko, S.
2014-02-01
Pressure pulsations excited by centrifugal turbomachinery such as a compressor, fan, or pump at the blade passing frequency may cause severe noise and vibrations in a piping system. A practical method for evaluating such pressure pulsations is therefore strongly needed. In particular, the maximum pressure amplitude under resonant conditions should be appropriately evaluated. In this study, a one-dimensional excitation source model for a compressor or pump is introduced based on the equation of motion, so as to incorporate nonlinear damping proportional to velocity squared in the total piping system including the compressor or pump. The damping characteristics of the compressor or pump are investigated by using the semi-empirical model. It is shown that the resistance coefficient of the compressor or pump depends on the Reynolds number defined using the equivalent velocity of the pulsating flow. The frequency response of the pressure amplitude and the pressure distribution in the piping system can be evaluated by introducing the equivalent resistance of the compressor or pump and that of the piping system. In particular, the relation of the maximum pressure amplitude in the piping system to the location of the excitation source under resonant conditions can be evaluated. Finally, the reduction of the pressure pulsations by use of an orifice plate is discussed in terms of pulsation energy loss.
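As a hedged illustration of the damping model, the sketch below integrates a forced single-degree-of-freedom oscillator with damping proportional to velocity squared; the quadratic term limits the resonant amplitude that would otherwise grow without bound. All parameters are invented, and the simple semi-implicit Euler scheme is not the authors' semi-empirical model.

```python
import math

# Forced oscillator with velocity-squared damping, driven at resonance:
#   x'' + zeta2 * x' * |x'| + wn**2 * x = F * sin(wn * t)
# Parameters are illustrative only.
wn, zeta2, F = 2 * math.pi * 10, 0.5, 1.0   # natural freq, quadratic damping, forcing
dt, T = 1e-4, 5.0
x, v, amp, t = 0.0, 0.0, 0.0, 0.0
while t < T:
    a = F * math.sin(wn * t) - zeta2 * v * abs(v) - wn**2 * x
    v += a * dt                              # semi-implicit Euler step
    x += v * dt
    amp = max(amp, abs(x))
    t += dt
print(amp)  # finite resonant amplitude, limited by the quadratic damping
```

With linear damping removed entirely, a linear model at resonance grows without limit; here the amplitude saturates because dissipation scales with the cube of velocity amplitude.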
Directional Unfolded Source Term (DUST) for Compton Cameras.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Dean J.; Horne, Steven M.; O'Brien, Sean
2018-03-01
A Directional Unfolded Source Term (DUST) algorithm was developed to enable improved spectral analysis capabilities using data collected by Compton cameras. Achieving this objective required modification of the detector response function in the Gamma Detector Response and Analysis Software (GADRAS). Experimental data that were collected in support of this work include measurements of calibration sources at a range of separation distances and cylindrical depleted uranium castings.
NASA Astrophysics Data System (ADS)
Boutt, D. F.
2017-12-01
The isotopic composition of surface and groundwater is impacted by a multitude of hydrologic processes. The long-term response of these systems to hydrologic change is critical for appropriately interpreting isotopic information on streamflow generation, stream-aquifer coupling, sources of water to wells, and recharge processes. To evaluate the response time of stream-aquifer systems to extreme precipitation events, we use a long-term isotope dataset from Western Massachusetts with drainage areas ranging from 0.1 to > 800 km2. The year 2011 was the wettest calendar year on record, and August and September of 2011 were the wettest consecutive two-month period in the 123-year record. The stable isotopic composition of surface waters of catchments ranging from 1 to 1000 km2 shows an enrichment due to summertime and Tropical Storm precipitation. Enrichment in potential recharge water is shown to have a significant long-term impact (> 3 hydrologic years) on the isotopic composition of both surface and groundwater. This highlights the importance of groundwater sources of baseflow to streams and the transient storage and release mechanisms of shallow groundwater storage. The length of the isotopic recession of stream water is also a strong function of watershed area. It is concluded that the stream water isotopes are consistent with a large pulse of water being stored and released from enriched groundwater emplaced during this period of above-average precipitation. Ultimately, the results point to the importance of considering hydrological processes of streamflow generation and their role in hydrologic processes beyond traditional catchment response analysis.
Cost of care of haemophilia with inhibitors.
Di Minno, M N D; Di Minno, G; Di Capua, M; Cerbone, A M; Coppola, A
2010-01-01
In Western countries, the treatment of patients with inhibitors is presently the most challenging and serious issue in haemophilia management, with direct costs of clotting factor concentrates accounting for >98% of the economic burden absorbed for the healthcare of patients in this setting. Being designed to address questions of resource allocation and effectiveness, decision models are the gold standard for reliably assessing the overall economic implications of haemophilia with inhibitors in terms of mortality, bleeding-related morbidity, and severity of arthropathy. At present, however, most data analyses stem from retrospective short-term evaluations that only allow for the analysis of direct health costs. In the setting of chronic diseases, cost-utility analysis, which takes into account the beneficial effects of a given treatment/healthcare intervention in terms of health-related quality of life, is likely to be the most appropriate approach. To calculate net benefits, the quality-adjusted life year, which reflects such health gains, has to be compared with specific economic impacts. Differences in data sources, in medical practice, and/or in healthcare systems and costs imply that most current pharmacoeconomic analyses are confined to a narrow healthcare-payer perspective. Long-term/lifetime prospective or observational studies, devoted to a careful definition of when to start treatment, of regimens (dose and type of product) to employ, and of the inhibitor population (children/adults, low-responding/high-responding inhibitors) to study, are thus urgently needed to allow for newer insights, based on reliable data sources, into resource allocation, effectiveness, and cost-utility analysis in the treatment of haemophiliacs with inhibitors.
Hwang, Jee-In; Cimino, James J; Bakken, Suzanne
2003-01-01
The purposes of the study were (1) to evaluate the usefulness of the International Standards Organization (ISO) Reference Terminology Model for Nursing Diagnoses as a terminology model for defining nursing diagnostic concepts in the Medical Entities Dictionary (MED) and (2) to create the additional hierarchical structures required for integration of nursing diagnostic concepts into the MED. The authors dissected nursing diagnostic terms from two source terminologies (Home Health Care Classification and the Omaha System) into the semantic categories of the ISO model. Consistent with the ISO model, they selected Focus and Judgment as required semantic categories for creating intensional definitions of nursing diagnostic concepts in the MED. Because the MED does not include Focus and Judgment hierarchies, the authors developed them to define the nursing diagnostic concepts. The ISO model was sufficient for dissecting the source terminologies into atomic terms. The authors identified 162 unique focus concepts from the 266 nursing diagnosis terms for inclusion in the Focus hierarchy. For the Judgment hierarchy, the authors precoordinated Judgment and Potentiality instead of using Potentiality as a qualifier of Judgment as in the ISO model. Impairment and Alteration were the most frequently occurring judgments. Nursing care represents a large proportion of health care activities; thus, it is vital that terms used by nurses are integrated into concept-oriented terminologies that provide broad coverage for the domain of health care. This study supports the utility of the ISO Reference Terminology Model for Nursing Diagnoses as a facilitator for the integration process.
Erratum to Surface‐wave Green's tensors in the near field
Haney, Matthew M.; Nakahara, Hisashi
2016-01-01
Haney and Nakahara (2014) derived expressions for surface‐wave Green’s tensors that included near‐field behavior. Building on the result for a force source, Haney and Nakahara (2014) further derived expressions for a general point moment tensor source using the exact Green’s tensors. However, it has come to our attention that, although the Green’s tensors were correct, the resulting expressions for a general point moment tensor source were missing some terms. In this erratum, we provide updated expressions with these missing terms. The inclusion of the missing terms changes the example given in Haney and Nakahara (2014).
Flowsheets and source terms for radioactive waste projections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forsberg, C.W.
1985-03-01
Flowsheets and source terms used to generate radioactive waste projections in the Integrated Data Base (IDB) Program are given. Volumes of each waste type generated per unit product throughput have been determined for the following facilities: uranium mining, UF6 conversion, uranium enrichment, fuel fabrication, boiling-water reactors (BWRs), pressurized-water reactors (PWRs), and fuel reprocessing. Source terms for DOE/defense wastes have been developed. Expected wastes from typical decommissioning operations for each facility type have been determined. All wastes are also characterized by isotopic composition at time of generation and by general chemical composition. 70 references, 21 figures, 53 tables.
Overview of California's Efforts to Understand and Reduce Methane Sources
NASA Astrophysics Data System (ADS)
Croes, B. E.; Chen, Y.; Duren, R. M.; Falk, M.; Franco, G.; Herner, J.; Ingram, W.; Kuwayama, T.; McCarthy, R.; Scheehle, E.; Vijayan, A.
2016-12-01
Methane is an important short-lived climate pollutant (SLCP) and also has significant health implications as a tropospheric ozone precursor. As part of a comprehensive effort to reduce greenhouse gas (GHG) emissions overall by 40% from 1990 levels by 2030, California has proposed an SLCP Strategy that includes a 40% reduction of methane emissions from 2013 levels by 2030, with goals to reduce oil and gas related emissions and capture methane emissions from dairy operations and organic waste. A recent analysis of satellite data found a large methane "hot spot" over the Central Valley in California, likely the second largest over the entire U.S. In light of this finding, the California legislature passed Assembly Bill 1496 in 2015, which requires the California Air Resources Board (CARB) to undertake measurements to understand the sources of methane hot spots, evaluate life-cycle emissions from natural gas imported into California, and update relevant policies and programs. There is growing evidence in the recent scientific literature suggesting that a small fraction of methane sources within a category emit disproportionately higher emissions than their counterparts, usually referred to as "super emitters". As such, controlling these sources may provide a lower cost opportunity for methane reductions needed to meet near- and long-term climate goals. In order to achieve a comprehensive understanding of sources contributing to "hot spots", CARB, the California Energy Commission, and NASA's Jet Propulsion Laboratory are implementing a large-scale statewide methane survey using a tiered monitoring and measurement program, which will include airborne and ground-level measurements of the various regions and source sectors in the State. This presentation will discuss research and program implementation efforts to evaluate and mitigate methane super emitters and hot spots. 
These efforts are expected to improve our understanding of methane emission source distributions, improve the estimate of the overall magnitude of anthropogenic methane emissions in California, and inform and improve the effectiveness of methane reduction policies and programs.
Consequences of the Chernobyl accident for the natural and human environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dreicer, M.; Aarkog, A.; Alexakhin, R.
1996-07-01
In the ten years since the Chernobyl accident, an enormous amount of work has been done to assess the consequences to the natural and human environment. Although it is difficult to summarize such a large and varied field, some general conclusions can be drawn. This background paper includes the main findings concerning the direct impacts of radiation on the flora and fauna; the general advances of knowledge in the cycling of radionuclides in natural, seminatural and agricultural environments; some evaluation of countermeasures that were used; and a summary of the human radiation doses resulting from the environmental contamination. Although open questions still remain, it can be concluded that: (1) at high radiation levels, the natural environment has shown short-term impacts, but any significant long-term impacts remain to be seen; (2) effective countermeasures can be taken to reduce the transfer of contamination from the environment to humans, but these are highly site specific and must be evaluated in terms of practicality as well as population dose reduction; (3) the majority of the doses have already been received by the human population. If agricultural countermeasures are appropriately taken, the main source of future doses will be the gathering of food and recreational activities in natural and seminatural ecosystems.
Barlow, D J; Buriani, A; Ehrman, T; Bosisio, E; Eberini, I; Hylands, P J
2012-04-10
The available databases that catalogue information on traditional Chinese medicines are reviewed in terms of their content and utility for in-silico research on Chinese herbal medicines, as are the various protein database resources and the software available for use in such studies. The software available for bioinformatics and 'omics studies of Chinese herbal medicines is summarised, and a critical evaluation is given of the various in-silico methods applied in screening Chinese herbal medicines, including classification trees, neural networks, support vector machines, and docking and inverse docking algorithms. Recommendations are made regarding future in-silico studies of Chinese herbal medicines. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
González-Román, Loreto; Bagur-Calafat, Caritat; Urrútia-Cuchí, Gerard; Garrido-Pedrosa, Jèssica
2016-01-01
This systematic review aims to report the effectiveness of interventions based on exercise and/or the physical environment for reducing falls in cognitively impaired older adults living in long-term care facilities. In July 2014, a literature search was conducted using main databases and specialised sources. Randomised controlled trials assessing the effectiveness of fall prevention interventions that used exercise or the physical environment among elderly people with cognitive impairment living in long-term care facilities were selected. Two independent reviewers checked the eligibility of the studies and evaluated their methodological quality; where quality was adequate, data were gathered. Fourteen studies with 3,539 participants using exercise and/or the physical environment, alone or in combination, were included. The data gathered from studies that used both interventions showed a significant reduction in fall rate. Further research is needed to demonstrate the effectiveness of those interventions for preventing falls in the elderly with cognitive impairment living in long-term care establishments. Copyright © 2015 SEGG. Published by Elsevier Espana. All rights reserved.
Hyperboloidal evolution of test fields in three spatial dimensions
NASA Astrophysics Data System (ADS)
Zenginoǧlu, Anıl; Kidder, Lawrence E.
2010-06-01
We present the numerical implementation of a clean solution to the outer boundary and radiation extraction problems within the 3+1 formalism for hyperbolic partial differential equations on a given background. Our approach is based on compactification at null infinity in hyperboloidal scri fixing coordinates. We report numerical tests for the particular example of a scalar wave equation on Minkowski and Schwarzschild backgrounds. We address issues related to the implementation of the hyperboloidal approach for the Einstein equations, such as nonlinear source functions, matching, and evaluation of formally singular terms at null infinity.
Dictionary-Based Tensor Canonical Polyadic Decomposition
NASA Astrophysics Data System (ADS)
Cohen, Jeremy Emile; Gillis, Nicolas
2018-04-01
To ensure interpretability of extracted sources in tensor decomposition, we introduce in this paper a dictionary-based tensor canonical polyadic decomposition which enforces one factor to belong exactly to a known dictionary. A new formulation of sparse coding is proposed which enables dictionary-based canonical polyadic decomposition of high-dimensional tensors. The benefits of using a dictionary in tensor decomposition models are explored both in terms of parameter identifiability and estimation accuracy. The performance of the proposed algorithms is evaluated on the decomposition of simulated data and the unmixing of hyperspectral images.
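One common way to enforce a dictionary constraint of this kind is to project each column of an estimated factor onto its best-matching dictionary atom. The sketch below shows only that projection step on synthetic data; it is an illustrative assumption, not the authors' full decomposition algorithm.

```python
import numpy as np

# Projection step for a dictionary-constrained factor: replace each column of
# the estimate A by the dictionary atom with maximal normalized correlation.
# Synthetic data; not the paper's full CPD solver.
rng = np.random.default_rng(0)
D = rng.standard_normal((50, 20))                           # known dictionary
A = D[:, [3, 7, 11]] + 0.01 * rng.standard_normal((50, 3))  # noisy factor

Dn = D / np.linalg.norm(D, axis=0)          # normalize atoms
An = A / np.linalg.norm(A, axis=0)          # normalize factor columns
atoms = np.argmax(Dn.T @ An, axis=0)        # best atom index per column
A_constrained = D[:, atoms]                 # factor now lies in the dictionary
print(atoms.tolist())                       # recovers the generating atoms [3, 7, 11]
```

In a full solver this projection would alternate with least-squares updates of the unconstrained factors, which is one standard way to realize the "one factor belongs exactly to a dictionary" constraint.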
Translation lexicon acquisition from bilingual dictionaries
NASA Astrophysics Data System (ADS)
Doermann, David S.; Ma, Huanfeng; Karagol-Ayan, Burcu; Oard, Douglas W.
2001-12-01
Bilingual dictionaries hold great potential as a source of lexical resources for training automated systems for optical character recognition, machine translation, and cross-language information retrieval. In this work we describe a system for extracting term lexicons from printed copies of bilingual dictionaries. We describe our approach to page and definition segmentation and entry parsing. We have used the approach to parse a number of dictionaries, and we demonstrate the results for retrieval using a French-English dictionary to generate a translation lexicon, with a corpus of English queries applied to French documents to evaluate cross-language IR.
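Entry parsing of this kind can be sketched with a toy grammar: split a recognized line into headword, part of speech, and comma-separated glosses. The entry format and regex are assumptions for illustration only; real OCR output from scanned dictionaries is far noisier than this.

```python
import re

# Toy entry parser for a bilingual dictionary line of the assumed form
# "<headword> <pos>. <gloss>, <gloss>, ...". Format and regex are invented.
ENTRY = re.compile(r"^(?P<head>\w+)\s+(?P<pos>n|v|adj)\.\s+(?P<gloss>.+)$")

def parse(line):
    """Return (headword, part_of_speech, [glosses]) or None if unparseable."""
    m = ENTRY.match(line)
    if not m:
        return None
    return (m["head"], m["pos"], [g.strip() for g in m["gloss"].split(",")])

print(parse("chien n. dog, hound"))  # ('chien', 'n', ['dog', 'hound'])
```

A real system layers this on top of page segmentation and font/zone analysis, and must tolerate OCR errors in both the headword and gloss fields.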
Fluorescence from polystyrene - Photochemical processes in polymeric systems, 7
NASA Technical Reports Server (NTRS)
Gupta, M. C.; Gupta, A.
1983-01-01
Results are presented for measurements of the fluorescence spectra of polystyrene in dilute solution and in pure solid films. It is determined that a major potential source of experimental error is the concurrent photooxidative degradation in air which may obscure fluorescence emission from monomeric sites in solid films at 25 C. The fluorescence spectra of oriented films are evaluated in terms of the monomer to excimer fluorescence intensity ratio and the excimer 'red shift'. The monomer to excimer fluorescence intensity ratio is determined to be significantly higher in fluid solution than in solid film.
A Benchmark Study of Large Contract Supplier Monitoring Within DOD and Private Industry
1994-03-01
Topics include long-term supplier relationships, global sourcing, and refocusing on customer quality. Commercial initiatives identified (supplier monitoring and recognition, reduced number of suppliers, global sourcing, and long-term contractor relationships) were then compared to DCMC practice.
Antibiotic Use and Need for Antimicrobial Stewardship in Long-Term Care.
Wu, Lisa Dong-Ying; Walker, Sandra A N; Elligsen, Marion; Palmay, Lesley; Simor, Andrew; Daneman, Nick
2015-01-01
Antimicrobial stewardship may be important in long-term care facilities because of unnecessary or inappropriate antibiotic use observed in these residents, coupled with their increased vulnerability to health care-associated infections. To assess antibiotic use in a long-term care facility in order to identify potential antimicrobial stewardship needs. A retrospective descriptive study was conducted at the Veterans Centre, a long-term care facility at Sunnybrook Health Sciences Centre, Toronto, Ontario. All residents taking one or more antibiotics (n = 326) were included as participants. Antibiotic-use data for patients residing in the facility between April 1, 2011, and March 31, 2012, were collected and analyzed. Totals of 358 patient encounters, 835 antibiotic prescriptions, and 193 positive culture results were documented during the study period. For 36% (302/835) of antibiotic prescriptions, the duration was more than 7 days. Cephalosporins (30%; 251/835) and fluoroquinolones (28%; 235/835) were the most frequently prescribed antibiotic classes. Urine was the most common source of samples for culture (60%; 116/193). Characteristics of antimicrobial use at this long-term care facility that might benefit from further evaluation included potentially excessive use of fluoroquinolones and cephalosporins and potentially excessive duration of antibiotic use for individual patients.
Evaluation of Chemistry-Climate Model Results using Long-Term Satellite and Ground-Based Data
NASA Technical Reports Server (NTRS)
Stolarski, Richard S.
2005-01-01
Chemistry-climate models attempt to bring together our best knowledge of the key processes that govern the composition of the atmosphere and its response to changes in forcing. We test these models on a process-by-process basis by comparing model results to data from many sources. A more difficult task is testing the model response to changes. One way to do this is to use the natural and anthropogenic experiments that have been done on the atmosphere and are continuing to be done. These include the volcanic eruptions of El Chichon and Pinatubo, the solar cycle, and the injection of chlorine and bromine from CFCs and methyl bromide. The test of the model's response to these experiments is their ability to produce the long-term variations in ozone and the trace gases that affect ozone. We now have more than 25 years of satellite ozone data. We have more than 15 years of satellite and ground-based data of HCl, HNO3, and many other gases. I will discuss the testing of models using long-term satellite data sets, long-term measurements from the Network for Detection of Stratospheric Change (NDSC), and long-term ground-based measurements of ozone.
An Improved Neutron Transport Algorithm for Space Radiation
NASA Technical Reports Server (NTRS)
Heinbockel, John H.; Clowdsley, Martha S.; Wilson, John W.
2000-01-01
A low-energy neutron transport algorithm for use in space radiation protection is developed. The algorithm is based upon a multigroup analysis of the straight-ahead Boltzmann equation by using a mean value theorem for integrals. This analysis is accomplished by solving a realistic but simplified neutron transport test problem. The test problem is analyzed by using numerical and analytical procedures to obtain an accurate solution within specified error bounds. Results from the test problem are then used for determining mean values associated with rescattering terms that are associated with a multigroup solution of the straight-ahead Boltzmann equation. The algorithm is then coupled to the Langley HZETRN code through the evaporation source term. Evaluation of the neutron fluence generated by the solar particle event of February 23, 1956, for a water and an aluminum-water shield-target configuration is then compared with LAHET and MCNPX Monte Carlo code calculations for the same shield-target configuration. The algorithm developed showed a great improvement in results over the unmodified HZETRN solution. In addition, a two-directional solution of the evaporation source showed even further improvement of the fluence near the front of the water target where diffusion from the front surface is important.
Hostettler, F.D.; Wang, Y.; Huang, Y.; Cao, W.; Bekins, B.A.; Rostad, C.E.; Kulpa, C.F.; Laursen, Andrew E.
2007-01-01
In recent decades forensic fingerprinting of oil-spill hydrocarbons has emerged as an important tool for correlating oils and for evaluating their source and character. Two long-term hydrocarbon spills, an off-road diesel spill (Mandan, ND) and a crude oil spill (Bemidji, MN) experiencing methanogenic biodegradation were previously shown to be undergoing an unexpected progression of homologous n-alkane and n-alkylated cyclohexane loss. Both exhibited degradative losses proceeding from the high-molecular-weight end of the distributions, along with transitory concentration increases of lower-molecular-weight homologs. Particularly in the case of the diesel fuel spill, these methanogenic degradative patterns can result in series distributions that mimic lower cut refinery fuels or admixture with lower cut fuels. Forensic fingerprinting in this long-term spill must therefore rely on more recalcitrant series, such as polycyclic aromatic hydrocarbon or drimane sesquiterpane profiles, to establish whether the spilled oil is single-sourced or whether there is verifiable admixture with other extraneous refinery fuels. Degradation processes impacting n-alkanes and n-alkylated ring compounds, which make these compounds unsuitable for fingerprinting, nevertheless are of interest in understanding methanogenic biodegradation. Copyright © Taylor & Francis Group, LLC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mark DeHart; William Skerjanc; Sean Morrell
2012-06-01
Analysis of the performance of the ATR with a LEU fuel design shows promise in terms of a core design that will yield the same neutron sources in target locations. A proposed integral cladding burnable absorber design appears to meet power profile requirements that will satisfy power distributions for safety limits. Evaluation of this fuel design is ongoing; the current work is the initial evaluation of the core performance of this fuel design with increasing burnup. Results show that LEU fuel may have a longer lifetime than HEU fuel; however, such limits may be set by the mechanical performance of the fuel rather than by available reactivity. Changes seen in the radial fuel power distribution with burnup in LEU fuel will require further study to ascertain the impact on neutron fluxes in target locations. Source terms for discharged fuel have also been studied. By its very nature, LEU fuel produces much more plutonium than is present in HEU fuel at discharge. However, the plutonium inventory appears to have little effect on radiotoxicity or decay heat in the fuel.
Upgrade of BPM Electronics for the SPring-8 Storage Ring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sasaki, Shigeki; Fujita, Takahiro; Shoji, Masazumi
2006-11-20
SPring-8, a 3rd generation synchrotron light source, has operated since 1997. Improvement of BPM performance is required as a part of upgrading activities of the storage ring as a light source. We have developed new electronics circuits for signal processing of the storage ring BPM, with target performance of sub-μm range resolution with sufficiently fast measurement speed and good long-term stability. A set of the new circuits consists of multiplexers, an RF amplifier, a mixer, an IF amplifier, and a local oscillator for analog signal processing. The IF amplifier outputs are sampled with a 16-bit 2-MSPS ADC on ADC boards, and the data are sent to a DSP board. The sampled data are processed and converted to position information in the DSP. A multiplexing method was employed to achieve better stability through cancellation of variation common to each channel. Evaluation of the performance using a prototype shows that position resolution well into the sub-μm range has been achieved with a bandwidth of 1 kHz, and long-term stability of within 1 μm has also been achieved.
Point focusing using loudspeaker arrays from the perspective of optimal beamforming.
Bai, Mingsian R; Hsieh, Yu-Hao
2015-06-01
Sound focusing creates a concentrated acoustic field within a region surrounded by a loudspeaker array. This problem was tackled in previous research via the Helmholtz integral approach, brightness control, acoustic contrast control, etc. In this paper, the same problem is revisited from the perspective of beamforming. A source array model is reformulated in terms of the steering matrix between the source and the field points, which lends itself to the use of beamforming algorithms such as minimum variance distortionless response (MVDR) and linearly constrained minimum variance (LCMV), originally intended for sensor arrays. The beamforming methods are compared with the conventional methods in terms of beam pattern, directivity index, and control effort. Objective tests are conducted to assess the audio quality by using perceptual evaluation of audio quality (PEAQ). Experiments of produced sound field and listening tests are conducted in a listening room, with results processed using analysis of variance and regression analysis. In contrast to the conventional energy-based methods, the results have shown that the proposed methods are phase-sensitive in light of the distortionless constraint in formulating the array filters, which helps enhance audio quality and focusing performance.
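The distortionless-constraint idea described above can be sketched numerically. The following is a minimal illustration, not the paper's formulation: it assumes a free-field monopole transfer model, an invented 16-element circular array, and ad hoc "dark" control points, and computes MVDR-style weights that enforce unit gain at the focal point while minimizing energy elsewhere.

```python
import numpy as np

def transfer(src_pos, pt, k):
    """Free-field monopole transfer functions from each source to one point
    (an assumed propagation model, chosen for simplicity)."""
    r = np.linalg.norm(src_pos - pt, axis=1)
    return np.exp(-1j * k * r) / (4 * np.pi * r)

def mvdr_weights(G_dark, d_focus, reg=1e-6):
    """Minimize energy at 'dark' points subject to unit gain at the focus."""
    R = G_dark.conj().T @ G_dark + reg * np.eye(G_dark.shape[1])
    Rinv_d = np.linalg.solve(R, d_focus)
    return Rinv_d / np.vdot(d_focus, Rinv_d)   # distortionless constraint

k = 2 * np.pi * 1000 / 343.0                    # wavenumber at 1 kHz
theta = np.linspace(0, 2 * np.pi, 16, endpoint=False)
sources = np.stack([np.cos(theta), np.sin(theta)], axis=1)  # unit-circle array
focus = np.array([0.2, 0.0])
dark_pts = np.array([[-0.4, 0.3], [0.1, -0.5], [-0.2, -0.2]])

d = transfer(sources, focus, k)
G = np.stack([transfer(sources, p, k) for p in dark_pts])
w = mvdr_weights(G, d)
p_focus = np.vdot(d, w)                         # complex pressure at the focus
```

Because the weights are complex, the method is inherently phase-sensitive, which is the property the abstract credits for the improved focusing.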
An efficient soil water balance model based on hybrid numerical and statistical methods
NASA Astrophysics Data System (ADS)
Mao, Wei; Yang, Jinzhong; Zhu, Yan; Ye, Ming; Liu, Zhao; Wu, Jingwei
2018-04-01
Most soil water balance models only consider downward soil water movement driven by gravitational potential, and thus cannot simulate upward soil water movement driven by evapotranspiration, especially in agricultural areas. In addition, the models cannot be used for simulating soil water movement in heterogeneous soils, and usually require many empirical parameters. To resolve these problems, this study derives a new one-dimensional water balance model for simulating both downward and upward soil water movement in heterogeneous unsaturated zones. The new model is based on a hybrid of numerical and statistical methods, and only requires four physical parameters. The model uses three governing equations to consider three terms that impact soil water movement, including the advective term driven by gravitational potential, the source/sink term driven by external forces (e.g., evapotranspiration), and the diffusive term driven by matric potential. The three governing equations are solved separately by using the hybrid numerical and statistical methods (e.g., linear regression method) that consider soil heterogeneity. The four soil hydraulic parameters required by the new model are as follows: saturated hydraulic conductivity, saturated water content, field capacity, and residual water content. The strengths and weaknesses of the new model are evaluated by using two published studies, three hypothetical examples, and a real-world application. The evaluation is performed by comparing the simulation results of the new model with corresponding results presented in the published studies, obtained using HYDRUS-1D and observation data. The evaluation indicates that the new model is accurate and efficient for simulating upward soil water flow in heterogeneous soils with complex boundary conditions. The new model is used for evaluating different drainage functions, and the square drainage function and the power drainage function are recommended.
Computational efficiency of the new model makes it particularly suitable for large-scale simulation of soil water movement, because the new model can be used with coarse discretization in space and time.
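The separation into advective, source/sink, and diffusive terms can be illustrated with a layered "bucket" sketch. This is not the paper's model: the exponent of the drainage function, the layer geometry, and all parameter values below are invented, and only the gravity-drainage and ET-sink terms are shown. It does use the same four physical parameters the abstract names (Ks, saturated content, field capacity, residual content).

```python
import numpy as np

def step(theta, Ks, theta_s, theta_fc, theta_r, dz, dt, infil, et):
    """One time step of a toy layered water balance (illustrative only)."""
    theta = theta.copy()
    theta[0] += infil * dt / dz                       # infiltration source
    for i in range(len(theta)):                       # downward advection
        excess = max(theta[i] - theta_fc, 0.0)
        # power drainage function above field capacity (assumed exponent 2)
        drain = min(Ks * (excess / (theta_s - theta_fc)) ** 2 * dt / dz, excess)
        theta[i] -= drain
        if i + 1 < len(theta):
            theta[i + 1] += drain                     # else: bottom outflow
    theta[0] = max(theta[0] - et * dt / dz, theta_r)  # ET sink, bounded below
    return np.clip(theta, theta_r, theta_s)

theta = np.full(5, 0.30)                              # initial water content
out = step(theta, Ks=0.5, theta_s=0.45, theta_fc=0.25,
           theta_r=0.05, dz=0.1, dt=0.01, infil=0.0, et=0.0)
```

Each term is applied in its own pass, mirroring the model's strategy of solving the three governing equations separately.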
ERIC Educational Resources Information Center
Littlejohn, Emily
2018-01-01
"Adaptation" originally began as a scientific term, but from 1860 to today it most often refers to an altered version of a text, film, or other literary source. When this term was first analyzed, humanities scholars often measured adaptations against their source texts, frequently privileging "original" texts. However, this…
40 CFR 401.11 - General definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Environmental Protection Agency. (d) The term point source means any discernible, confined and discrete conveyance, including but not limited to any pipe, ditch, channel, tunnel, conduit, well, discrete fissure... which pollutants are or may be discharged. (e) The term new source means any building, structure...
Assessing modelled spatial distributions of ice water path using satellite data
NASA Astrophysics Data System (ADS)
Eliasson, S.; Buehler, S. A.; Milz, M.; Eriksson, P.; John, V. O.
2010-05-01
The climate models used in the IPCC AR4 show large differences in monthly mean cloud ice. The most valuable source of information that can be used to potentially constrain the models is global satellite data. For this, the data sets must be long enough to capture the inter-annual variability of Ice Water Path (IWP). PATMOS-x was used together with ISCCP for the annual cycle evaluation in Fig. 7 while ECHAM-5 was used for the correlation with other models in Table 3. A clear distinction between ice categories in satellite retrievals, as desired from a model point of view, is currently impossible. However, long-term satellite data sets may still be used to indicate the climatology of IWP spatial distribution. We evaluated satellite data sets from CloudSat, PATMOS-x, ISCCP, MODIS and MSPPS in terms of monthly mean IWP, to determine which data sets can be used to evaluate the climate models. IWP data from CloudSat cloud profiling radar provides the most advanced data set on clouds. As CloudSat data are too short to evaluate the model data directly, it was mainly used here to evaluate IWP from the other satellite data sets. ISCCP and MSPPS were shown to have comparatively low IWP values. ISCCP shows particularly low values in the tropics, while MSPPS has particularly low values outside the tropics. MODIS and PATMOS-x were in closest agreement with CloudSat in terms of magnitude and spatial distribution, with MODIS being the best of the two. As PATMOS-x extends over more than 25 years and is in fairly close agreement with CloudSat, it was chosen as the reference data set for the model evaluation. In general there are large discrepancies between the individual climate models, and all of the models show problems in reproducing the observed spatial distribution of cloud-ice. Comparisons consistently showed that ECHAM-5 is the GCM from IPCC AR4 closest to satellite observations.
Qenam, Basel; Kim, Tae Youn; Carroll, Mark J; Hogarth, Michael
2017-12-18
Radiology reporting is a clinically oriented form of documentation that reflects critical information for patients about their health care processes. Realizing its importance, many medical institutions have started providing radiology reports in patient portals. The gain, however, can be limited because of medical language barriers, which require a way for customizing these reports for patients. The open-access, collaborative consumer health vocabulary (CHV) is a terminology system created for such purposes and can be the basis of lexical simplification processes for clinical notes. The aim of this study was to examine the comprehensibility and suitability of CHV in simplifying radiology reports for consumers. This was done by characterizing the content coverage and the lexical similarity between the terms in the reports and the CHV-preferred terms. The overall procedure was divided into the following two main stages: (1) translation and (2) evaluation. The translation process involved using MetaMap to link terms in the reports to CHV concepts. This is followed by replacing the terms with CHV-preferred terms using the concept names and sources table (MRCONSO) in the Unified Medical Language System (UMLS) Metathesaurus. In the second stage, medical terms in the reports and general terms that are used to describe medical phenomena were selected and evaluated by comparing the words in the original reports with the translated ones. The evaluation includes measuring the content coverage, investigating lexical similarity, and finding trends in missing concepts. Of the 792 terms selected from the radiology reports, 695 of them could be mapped directly to CHV concepts, indicating a content coverage of 88.5%. A total of 51 of the concepts (53%, 51/97) that could not be mapped are names of human anatomical structures and regions, followed by 28 anatomical descriptions and pathological variations (29%, 28/97). 
In addition, 12 radiology techniques and projections represented 12% of the unmapped concepts, whereas the remaining six concepts (6%, 6/97) were physiological descriptions. The rate of lexical similarity between the CHV-preferred terms and the terms in the radiology reports was approximately 72.6%. The CHV covered a high percentage of concepts found in the radiology reports, but unmapped concepts are associated with areas that are commonly found in radiology reporting. CHV terms also showed a high percentage of lexical similarity with terms in the reports, which contain a myriad of medical jargon. This suggests that many CHV terms might not be suitable for lay consumers who would not be facile with radiology-specific vocabulary. Therefore, further patient-centered content changes to the CHV are needed to increase its usefulness and facilitate its integration into consumer-oriented applications. ©Basel Qenam, Tae Youn Kim, Mark J Carroll, Michael Hogarth. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 18.12.2017.
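The two summary measures reported above can be sketched in a few lines. This toy illustration is not the study's pipeline: the real work used MetaMap and the UMLS MRCONSO table, whereas the dictionary, terms, and token-overlap similarity below are invented stand-ins.

```python
# Hypothetical term-to-CHV-preferred-term mapping (invented for the sketch)
chv_preferred = {
    "edema": "swelling",
    "pleural effusion": "pleural effusion",
    "cardiomegaly": "enlarged heart",
}

def content_coverage(report_terms):
    """Fraction of report terms that map to some CHV concept."""
    mapped = [t for t in report_terms if t in chv_preferred]
    return len(mapped) / len(report_terms)

def lexical_similarity(term):
    """Token-overlap (Jaccard) similarity between a term and its
    CHV-preferred name -- a simple stand-in for the paper's measure."""
    preferred = chv_preferred[term]
    a, b = set(term.split()), set(preferred.split())
    return len(a & b) / len(a | b)

terms = ["edema", "pleural effusion", "cardiomegaly", "costophrenic angle"]
coverage = content_coverage(terms)   # 3 of the 4 toy terms map -> 0.75
```

A term like "pleural effusion" whose preferred name is identical scores 1.0, while "edema" vs. "swelling" scores 0.0, showing why high coverage does not guarantee consumer-friendly wording.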
Uncertainty quantification in (α,n) neutron source calculations for an oxide matrix
Pigni, M. T.; Croft, S.; Gauld, I. C.
2016-04-25
Here we present a methodology to propagate nuclear data covariance information in neutron source calculations from (α,n) reactions. The approach is applied to estimate the uncertainty in the neutron generation rates for uranium oxide fuel types due to uncertainties in 1) 17,18O(α,n) reaction cross sections and 2) uranium and oxygen stopping power cross sections. The procedure to generate reaction cross section covariance information is based on the Bayesian fitting method implemented in the R-matrix SAMMY code. The evaluation methodology uses the Reich-Moore approximation to fit the 17,18O(α,n) reaction cross sections in order to derive a set of resonance parameters and a related covariance matrix that is then used to calculate the energy-dependent cross section covariance matrix. The stopping power cross sections and related covariance information for uranium and oxygen were obtained by fitting stopping power data in the energy range of 1 keV up to 12 MeV. Cross section perturbation factors based on the covariance information relative to the evaluated 17,18O(α,n) reaction cross sections, as well as uranium and oxygen stopping power cross sections, were used to generate a varied set of nuclear data libraries used in SOURCES4C and ORIGEN for inventory and source term calculations. The set of randomly perturbed output (α,n) source responses provides the mean values and standard deviations of the calculated responses, reflecting the uncertainties in the nuclear data used in the calculations. Lastly, the results and related uncertainties are compared with experimental thick-target (α,n) yields for uranium oxide.
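The covariance-sampling step described above can be sketched as follows. Everything numerical here is invented for illustration: the cross section, stopping power, 5% correlated uncertainty, and yield formula are toy stand-ins for the SAMMY/SOURCES4C evaluation, but the mechanics (draw correlated perturbation factors, push each set through the yield calculation, take mean and standard deviation) follow the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
E = np.linspace(1.0, 12.0, 50)                 # alpha energy grid (MeV)
dE = E[1] - E[0]
sigma = 1e-3 * E**2                            # toy (alpha,n) cross section
S = 100.0 + 5.0 * E                            # toy stopping power
# Gaussian-kernel covariance: 5% relative uncertainty, ~2 MeV correlation
cov = 0.05**2 * np.exp(-np.subtract.outer(E, E)**2 / 4.0)
cov += 1e-12 * np.eye(len(E))                  # jitter for numerical stability

base = np.sum(sigma / S) * dE                  # unperturbed thick-target yield
yields = []
for _ in range(500):
    f = 1.0 + rng.multivariate_normal(np.zeros_like(E), cov)
    yields.append(np.sum(f * sigma / S) * dE)  # perturbed yield sample
yields = np.array(yields)
rel_unc = yields.std() / yields.mean()         # propagated relative uncertainty
```

Because the perturbations are correlated across energy, the propagated uncertainty stays close to the input 5% rather than averaging away as it would for independent bins.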
Infovigilance: reporting errors in official drug information sources.
Fusier, Isabelle; Tollier, Corinne; Husson, Marie-Caroline
2005-06-01
The French drug database Thériaque (http://www.theriaque.org), developed by the Centre National Hospitalier d'Information sur le Médicament (CNHIM), is responsible for the dissemination of independent information about all drugs available in France. Each month the CNHIM pharmacists report problems due to inaccuracies in these sources to the French drug agency. In daily practice we devised the term "infovigilance": the activity of notifying errors or inaccuracies in information sources which could be responsible for medication errors. The aim of this study was to evaluate the impact of CNHIM infovigilance on the contents of the Summaries of Product Characteristics (SPCs). The study was a prospective study from 09/11/2001 to 31/12/2002. The problems related to the quality of information were classified into four types (inaccuracy/confusion, error/lack of information, discordance between SPC sections, and discordance between generic SPCs). Outcome measures were: (1) the number of notifications and the number of SPCs integrated into the database during the study period; and (2) the percentage of notifications of each type: with or without potential patient impact, with or without later correction of the SPC, and per section. Overall, 2.7% (85/3151) of SPCs integrated into the database were the subject of a problem notification. Notifications according to type of problem were inaccuracy/confusion (32%), error/lack of information (13%), discordance between SPC sections (27%), and discordance between generic SPCs (28%). 55% of problems were evaluated as 'likely to have an impact on the patient' and 45% as 'unlikely to have an impact on the patient'. Twenty-two of the problems reported to the French drug agency were corrected, and new updated SPCs were published with the corrections. Our efforts to improve the quality of drug information sources through a continuous "infovigilance" process need to be continued and extended to other information sources.
Coarse Grid Modeling of Turbine Film Cooling Flows Using Volumetric Source Terms
NASA Technical Reports Server (NTRS)
Heidmann, James D.; Hunter, Scott D.
2001-01-01
The recent trend in numerical modeling of turbine film cooling flows has been toward higher fidelity grids and more complex geometries. This trend has been enabled by the rapid increase in computing power available to researchers. However, the turbine design community requires fast turnaround time in its design computations, rendering these comprehensive simulations ineffective in the design cycle. The present study describes a methodology for implementing a volumetric source term distribution in a coarse grid calculation that can model the small-scale and three-dimensional effects present in turbine film cooling flows. This model could be implemented in turbine design codes or in multistage turbomachinery codes such as APNASA, where the computational grid size may be larger than the film hole size. Detailed computations of a single row of 35 deg round holes on a flat plate have been obtained for blowing ratios of 0.5, 0.8, and 1.0, and density ratios of 1.0 and 2.0 using a multiblock grid system to resolve the flows on both sides of the plate as well as inside the hole itself. These detailed flow fields were spatially averaged to generate a field of volumetric source terms for each conservative flow variable. Solutions were also obtained using three coarse grids having streamwise and spanwise grid spacings of 3d, 1d, and d/3. These coarse grid solutions used the integrated hole exit mass, momentum, energy, and turbulence quantities from the detailed solutions as volumetric source terms. It is shown that a uniform source term addition over a distance from the wall on the order of the hole diameter is able to predict adiabatic film effectiveness better than a near-wall source term model, while strictly enforcing correct values of integrated boundary layer quantities.
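The core bookkeeping of the volumetric source-term approach can be sketched simply. The function below is an invented illustration, not the study's implementation: it deposits the integrated mass flow of one cooling hole uniformly over the coarse cells lying within one hole diameter d of the wall (the distribution the abstract found superior to a near-wall-only model), using made-up cell geometry; the study itself derived the sources by spatially averaging detailed multiblock solutions, and did the same for momentum, energy, and turbulence quantities.

```python
import numpy as np

def film_cooling_mass_source(y_centers, dy, area, d, mdot_hole):
    """Per-cell volumetric mass source (kg/s/m^3) for a wall-normal column
    of coarse cells; the hole's mass flow is spread over cells with y < d."""
    src = np.zeros_like(y_centers)
    in_layer = y_centers < d               # cells within one diameter of wall
    vol = area * dy * in_layer.sum()       # total volume receiving the source
    src[in_layer] = mdot_hole / vol
    return src

y = np.arange(0.5, 10.5) * 1e-3            # cell centers, 1 mm spacing (toy)
src = film_cooling_mass_source(y, dy=1e-3, area=4e-6, d=3e-3, mdot_hole=1e-4)
```

Summing src over the cell volumes recovers exactly the hole's mass flow, which is the conservation property that lets a coarse grid stand in for a resolved hole.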
Facilitative Components of Collaborative Learning: A Review of Nine Health Research Networks
Rittner, Jessica Levin; Johnson, Karin E.; Gerteis, Jessie; Miller, Therese
2017-01-01
Objective: Collaborative research networks are increasingly used as an effective mechanism for accelerating knowledge transfer into policy and practice. This paper explored the characteristics and collaborative learning approaches of nine health research networks. Data sources/study setting: Semi-structured interviews with representatives from eight diverse US health services research networks conducted between November 2012 and January 2013 and program evaluation data from a ninth. Study design: The qualitative analysis assessed each network's purpose, duration, funding sources, governance structure, methods used to foster collaboration, and barriers and facilitators to collaborative learning. Data collection: The authors reviewed detailed notes from the interviews to distill salient themes. Principal findings: Face-to-face meetings, intentional facilitation and communication, shared vision, trust among members and willingness to work together were key facilitators of collaborative learning. Competing priorities for members, limited funding and lack of long-term support and geographic dispersion were the main barriers to coordination and collaboration across research network members. Conclusion: The findings illustrate the importance of collaborative learning in research networks and the challenges to evaluating the success of research network functionality. Conducting readiness assessments and developing process and outcome evaluation metrics will advance the design and show the impact of collaborative research networks. PMID:28277202
Possible Dual Earthquake-Landslide Source of the 13 November 2016 Kaikoura, New Zealand Tsunami
NASA Astrophysics Data System (ADS)
Heidarzadeh, Mohammad; Satake, Kenji
2017-10-01
An earthquake (Mw 7.8) with a complicated rupture mechanism occurred on the NE coast of South Island, New Zealand, on 13 November 2016 (UTC) in a complex tectonic setting comprising a transition strike-slip zone between two subduction zones. The earthquake generated a moderate tsunami with zero-to-crest amplitude of 257 cm at the near-field tide gauge station of Kaikoura. Spectral analysis of the tsunami observations showed dual peaks at 3.6-5.7 and 5.7-56 min, which we attribute to the potential landslide and earthquake sources of the tsunami, respectively. Tsunami simulations showed that a source model with slip on an offshore plate-interface fault reproduces the near-field tsunami observation in terms of amplitude, but fails in terms of tsunami period. On the other hand, a source model without offshore slip fails to reproduce the first peak, but the later phases are reproduced well in terms of both amplitude and period. It can be inferred that an offshore source must be involved, but it needs to be smaller in size than the plate-interface slip, which most likely points to a confined submarine landslide source, consistent with the dual-peak tsunami spectrum. We estimated the dimension of the potential submarine landslide at 8-10 km.
The influence of convective activity on the vorticity budget
NASA Technical Reports Server (NTRS)
Townsend, T. L.; Scoggins, J. R.
1983-01-01
The influence of convective activity on the vorticity budget was determined during the AVE VII and AVE-SESAME I periods. This was accomplished by evaluating each term in the expanded vorticity equation, with twisting/tilting and friction representing the residual of all other terms. Convective areas were delineated by use of radar summary charts. The influence of convective activity was established by analyzing contoured fields of each term as well as numerical values and profiles of the various terms in convective and nonconvective areas. Vertical motion was computed by the kinematic method, and all computations were performed over the central United States using a grid spacing of 158 km. The results show that, in convective areas in particular, the residual is of comparable magnitude to the horizontal advection and divergence terms, and therefore cannot be neglected. In convective areas, the residual term represents a sink of vorticity below 500 mb and a strong source near 300 mb. In nonconvective areas, the residual was small in magnitude at all levels, but tended to be negative (vorticity sink) at 300 mb. The local change term, over convective areas, tended to be balanced by the residual term, and appeared to be a good indicator of development (vorticity becoming more cyclonic). Finally, the shape of the vertical profiles of the terms in the budget equation agreed with those found by other investigators for easterly waves, but the terms were one order of magnitude larger than those for easterly waves.
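For reference, a standard pressure-coordinate form of the budget being evaluated (the exact expansion used in the study may differ) groups the explicitly computed terms on the right with a residual R absorbing twisting/tilting and friction:

```latex
\frac{\partial \zeta}{\partial t}
  = -\mathbf{V}\cdot\nabla\,(\zeta + f)
    - \omega\,\frac{\partial \zeta}{\partial p}
    - (\zeta + f)\,\nabla\cdot\mathbf{V}
    + R
```

Here ζ is relative vorticity, f the Coriolis parameter, V the horizontal wind, and ω the pressure vertical velocity; R is the residual term whose sign changes (sink below 500 mb, source near 300 mb in convective areas) are the study's central result.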
NASA Astrophysics Data System (ADS)
Soelistijanto, B.; Muliadi, V.
2018-03-01
Diffie-Hellman (DH) provides an efficient key exchange system by reducing the number of cryptographic keys distributed in the network. In this method, a node broadcasts a single public key to all nodes in the network, and in turn each peer uses this key to establish a shared secret key which then can be utilized to encrypt and decrypt traffic between the peer and the given node. In this paper, we evaluate the key transfer delay and cost performance of DH in opportunistic mobile networks, a specific scenario of MANETs where complete end-to-end paths rarely exist between sources and destinations; consequently, the end-to-end delays in these networks are much greater than typical MANETs. Simulation results, driven by a random node movement model and real human mobility traces, showed that DH outperforms a typical key distribution scheme based on the RSA algorithm in terms of key transfer delay, measured by average key convergence time; however, DH performs as well as the benchmark in terms of key transfer cost, evaluated by total key (copies) forwards.
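The single-broadcast exchange described above reduces, for each node-peer pair, to the classic DH computation. The sketch below uses toy group parameters chosen so the example runs instantly; a deployed system would use a standardized group of 2048 bits or more (e.g., the RFC 3526 groups).

```python
import secrets

p = 2**127 - 1        # a Mersenne prime -- far too small for real use
g = 3                 # illustrative base

a = secrets.randbelow(p - 3) + 2     # broadcasting node's private key
b = secrets.randbelow(p - 3) + 2     # one peer's private key

A = pow(g, a, p)      # the single public key the node broadcasts
B = pow(g, b, p)      # the peer's public key, returned once

k_node = pow(B, a, p)  # node's view of the shared secret
k_peer = pow(A, b, p)  # peer's view: identical, since (g^b)^a = (g^a)^b mod p
```

Each peer derives its own pairwise secret from the one broadcast value, which is why DH needs far fewer key transfers than distributing a distinct RSA-encrypted key to every peer.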
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, P.; Putkovich, R.P.
1981-07-01
A study was conducted of the requirements for and technologies applicable to power conditioning equipment in residential solar photovoltaic systems. A survey of companies known or thought to manufacture power conditioning equipment was conducted to assess the technology. Technical issues regarding ac and dc interface requirements were studied. A baseline design was selected as a good example of existing technology which would not need significant development effort for its implementation. Alternative technologies are evaluated to determine which meet the baseline specification, and their costs and losses are evaluated. Areas in which cost improvements can be obtained are studied, and the three best candidate technologies--the current-sourced converter, the HF front end converter, and the programmed wave converter--are compared. It is concluded that the designs investigated will meet, or with slight improvement could meet, short term efficiency goals. Long term efficiency goals could be met if an isolation transformer were not required in the power conditioning equipment. None of the technologies studied can meet cost goals unless further improvements are possible. (LEW)
Web Page Content and Quality Assessed for Shoulder Replacement.
Matthews, John R; Harrison, Caitlyn M; Hughes, Travis M; Dezfuli, Bobby; Sheppard, Joseph
2016-01-01
The Internet has become a major source for obtaining health-related information. This study assesses and compares the quality of information available online for shoulder replacement using medical (total shoulder arthroplasty [TSA]) and nontechnical (shoulder replacement [SR]) terminology. Three evaluators reviewed 90 websites for each search term across 3 search engines (Google, Yahoo, and Bing). Websites were grouped into categories, identified as commercial or noncommercial, and evaluated with the DISCERN questionnaire. Total shoulder arthroplasty provided 53 unique sites compared to 38 websites for SR. Of the 53 TSA websites, 30% were health professional-oriented websites versus 18% of SR websites. Shoulder replacement websites provided more patient-oriented information at 48%, versus 45% of TSA websites. In total, SR websites provided 47% (42/90) noncommercial websites, with the highest number seen in Yahoo, compared with TSA at 37% (33/90), with Google providing 13 of the 33 websites (39%). Using the nonmedical terminology with Yahoo's search engine returned the most noncommercial and patient-oriented websites. However, the quality of information found online was highly variable, with most websites being unreliable and incomplete, regardless of search term.
BioFed: federated query processing over life sciences linked open data.
Hasnain, Ali; Mehmood, Qaiser; Sana E Zainab, Syeda; Saleem, Muhammad; Warren, Claude; Zehra, Durre; Decker, Stefan; Rebholz-Schuhmann, Dietrich
2017-03-15
Biomedical data, e.g. from knowledge bases and ontologies, is increasingly made available following open linked data principles, at best as RDF triple data. This is a necessary step towards unified access to biological data sets, but it still requires solutions to query multiple endpoints for their heterogeneous data to eventually retrieve all the meaningful information. Suggested solutions are based on query federation approaches, which require the submission of SPARQL queries to endpoints. Due to the size and complexity of available data, these solutions have to be optimised for efficient retrieval times and for users in life sciences research. Last but not least, over time, the reliability of data resources in terms of access and quality has to be monitored. Our solution (BioFed) federates data over 130 SPARQL endpoints in life sciences and tailors query submission according to the provenance information. BioFed has been evaluated against the state-of-the-art solution FedX and forms an important benchmark for the life science domain. The efficient cataloguing approach of the federated query processing system 'BioFed', the triple-pattern-wise source selection, and the semantic source normalisation form the core of our solution. It gathers and integrates data from newly identified public endpoints for federated access. Basic provenance information is linked to the retrieved data. Finally, BioFed makes use of the latest SPARQL standard (i.e., 1.1) to leverage the full benefits for query federation. The evaluation is based on 10 simple and 10 complex queries, which address data in 10 major and very popular data sources (e.g., DrugBank, SIDER). BioFed is a single point of access for a large number of SPARQL endpoints providing life science data. It facilitates efficient query generation for data access and provides basic provenance information in combination with the retrieved data.
BioFed fully supports SPARQL 1.1 and reports each endpoint's availability based on the EndpointData graph. Our evaluation of BioFed against FedX is based on 20 heterogeneous federated SPARQL queries and shows competitive execution performance in comparison to FedX, which can be attributed to the provision of provenance information for the source selection. Developing and testing federated query engines for life sciences data is still a challenging task. According to our findings, it is advantageous to optimise the source selection. The cataloguing of SPARQL endpoints, including type and property indexing, leads to efficient querying of data resources over the Web of Data. This could be further improved through the use of ontologies, e.g., for abstract normalisation of query terms.
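The triple-pattern-wise source selection at the core of this kind of federation engine can be sketched in a few lines. The catalogue contents, endpoint URLs and predicate names below are invented placeholders for illustration, not BioFed's actual index.

```python
# Illustrative sketch of triple-pattern-wise source selection against a
# pre-built catalogue mapping each SPARQL endpoint to the predicates it
# serves. All endpoint URLs and predicates here are hypothetical.
CATALOGUE = {
    "http://drugbank.example/sparql": {"db:indication", "db:target"},
    "http://sider.example/sparql": {"sider:sideEffect"},
    "http://kegg.example/sparql": {"kegg:pathway", "db:target"},
}

def select_sources(triple_patterns):
    """For each triple pattern, keep only the endpoints whose catalogue
    entry contains the pattern's predicate; an unbound predicate (?p)
    matches every endpoint."""
    selection = {}
    for subj, pred, obj in triple_patterns:
        if pred.startswith("?"):          # unbound predicate: query everywhere
            selection[(subj, pred, obj)] = set(CATALOGUE)
        else:
            selection[(subj, pred, obj)] = {
                ep for ep, preds in CATALOGUE.items() if pred in preds
            }
    return selection

sel = select_sources([("?drug", "db:target", "?protein"),
                      ("?drug", "sider:sideEffect", "?effect")])
```

Pruning endpoints per triple pattern before query submission is what keeps the number of remote requests, and hence retrieval time, manageable as the number of federated endpoints grows.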
McDonald, Brian C; Goldstein, Allen H; Harley, Robert A
2015-04-21
A fuel-based approach is used to assess long-term trends (1970-2010) in mobile source emissions of black carbon (BC) and organic aerosol (OA, including both primary emissions and secondary formation). The main focus of this analysis is the Los Angeles Basin, where a long record of measurements is available to infer trends in ambient concentrations of BC and organic carbon (OC), with OC used here as a proxy for OA. Mobile source emissions and ambient concentrations have decreased similarly, reflecting the importance of on- and off-road engines as sources of BC and OA in urban areas. In 1970, the on-road sector accounted for ∼90% of total mobile source emissions of BC and OA (primary + secondary). Over time, as on-road engine emissions have been controlled, the relative importance of off-road sources has grown. By 2010, off-road engines were estimated to account for 37 ± 20% and 45 ± 16% of total mobile source contributions to BC and OA, respectively, in the Los Angeles area. This study highlights both the success of efforts to control on-road emission sources, and the importance of considering off-road engine and other VOC source contributions when assessing long-term emission and ambient air quality trends.
Mitchell, Karen J; Mather, Mara; Johnson, Marcia K; Raye, Carol L; Greene, Erich J
2006-10-02
We investigated the hypothesis that arousal recruits attention to item information, thereby disrupting working memory processes that help bind items to context. Using functional magnetic resonance imaging, we compared brain activity when participants remembered negative or neutral picture-location conjunctions (source memory) versus pictures only. Behaviorally, negative trials showed disruption of short-term source, but not picture, memory; long-term picture recognition memory was better for negative than for neutral pictures. Activity in areas involved in working memory and feature integration (precentral gyrus and its intersect with superior temporal gyrus) was attenuated on negative compared with neutral source trials relative to picture-only trials. Visual processing areas (middle occipital and lingual gyri) showed greater activity for negative than for neutral trials, especially on picture-only trials.
Circular current loops, magnetic dipoles and spherical harmonic analysis.
Alldredge, L.R.
1980-01-01
Spherical harmonic analysis (SHA) is the most used method of describing the Earth's magnetic field, even though spherical harmonic coefficients (SHC) almost completely defy interpretation in terms of real sources. Some moderately successful efforts have been made to represent the field in terms of dipoles placed in the core in an effort to have the model come closer to representing real sources. Dipole sources are only a first approximation to the real sources which are thought to be a very complicated network of electrical currents in the core of the Earth. -Author
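For reference, the expansion behind SHA of the geomagnetic field is the standard spherical harmonic series for the scalar potential, whose Gauss coefficients g_n^m and h_n^m are the spherical harmonic coefficients discussed above:

```latex
V(r,\theta,\phi) = a \sum_{n=1}^{N} \left(\frac{a}{r}\right)^{n+1}
  \sum_{m=0}^{n} \left( g_n^m \cos m\phi + h_n^m \sin m\phi \right)
  P_n^m(\cos\theta), \qquad \mathbf{B} = -\nabla V
```

where a is the Earth's reference radius, (r, θ, φ) are geocentric spherical coordinates, and P_n^m are the Schmidt quasi-normalized associated Legendre functions. Each coefficient describes a global pattern rather than a localized current system, which is why the coefficients resist interpretation in terms of real core sources.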
NASA Astrophysics Data System (ADS)
Hogue, T. S.; Rust, A.
2016-12-01
Fire frequency is increasing across mid-elevation forests, especially in the Northern Rockies, Sierra Nevada, southern Cascades, as well as the coastal ranges in California and southern Oregon. Numerous studies have noted increased discharge, floods and debris flows after wildfire. More recent work also shows increased water yield during dry seasons for up to ten years post-fire. However, few studies have evaluated long-term water quality response in fire-impacted watersheds. The current presentation will overview recent development of an extensive database on post-fire water quality response across the western U.S. A range of water quality parameters were gathered from 271 burned watersheds through local, state and federal agencies. Short- and long-term responses were evaluated for watersheds with at least 5 years of pre-fire data. Over 30 watersheds showed significant increases in NO3-, NO2-, NH3, and total nitrogen loading in the initial five years after fire, and loading remained elevated ten years after fire. Burn severity influenced the degree of nitrogen response: more severely burned watersheds showed higher nitrogen loading than less severely burned watersheds. Dissolved and total phosphorus showed significant increases in 32 watersheds for the first five years after fire. Dissolved ions such as calcium, magnesium, and chloride were also exported from over 32 watersheds, primarily during the first five years after fire, with the majority of impacted watersheds returning to pre-fire water quality conditions after ten years. Ongoing work includes evaluating key determinants that drive short- and long-term response and developing predictive models for post-fire water quality. Watersheds impacted by wildfire are known to pose significant risks for downstream communities. Understanding short- and long-term water quality change that can impact regional water supplies is critical for establishing potential treatment priorities and alternative source planning.
NASA Technical Reports Server (NTRS)
Tiwari, Vidhu S.; Kalluru, Rajamohan R.; Yueh, Fang-Yu; Singh, Jagdish P.; SaintCyr, William
2007-01-01
A spontaneous Raman scattering optical fiber sensor is developed for a specific need of NASA/SSC for long-term detection and monitoring of the quality of liquid oxygen (LOX) in the delivery line during ground testing of rocket engines. The sensor performance was tested in the laboratory with different excitation light sources, using various mixtures of liquid oxygen and liquid nitrogen as samples to evaluate performance for the LOX quality application. The study of the sensor performance shows that this sensor offers a great deal of flexibility and provides a cost-effective solution for the application. However, an improved system response time is needed for real-time, quantitative monitoring of the quality of cryogenic fluids in harsh environments.
Computed myography: three-dimensional reconstruction of motor functions from surface EMG data
NASA Astrophysics Data System (ADS)
van den Doel, Kees; Ascher, Uri M.; Pai, Dinesh K.
2008-12-01
We describe a methodology called computed myography to qualitatively and quantitatively determine the activation level of individual muscles by voltage measurements from an array of voltage sensors on the skin surface. A finite element model for electrostatics simulation is constructed from morphometric data. For the inverse problem, we utilize a generalized Tikhonov regularization. This imposes smoothness on the reconstructed sources inside the muscles and suppresses sources outside the muscles using a penalty term. Results from experiments with simulated and human data are presented for activation reconstructions of three muscles in the upper arm (biceps brachii, brachialis and triceps). This approach potentially offers a new clinical tool to sensitively assess muscle function in patients suffering from neurological disorders (e.g., spinal cord injury), and could more accurately guide advances in the evaluation of specific rehabilitation training regimens.
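The generalized Tikhonov step at the heart of such an inverse problem, minimising ||Ax - b||² + λ||Lx||², has a closed-form normal-equations solution. The tiny 2x2 system below is a made-up stand-in for the finite element forward operator, shown only to illustrate the regularised solve, not the authors' model.

```python
def tikhonov_2x2(A, b, L, lam):
    """Solve (A^T A + lam * L^T L) x = A^T b for a 2x2 system
    (A, L given as lists of rows) via Cramer's rule."""
    # normal-equations matrix M = A^T A + lam * L^T L and right-hand side A^T b
    M = [[sum(A[k][i] * A[k][j] for k in range(2))
          + lam * sum(L[k][i] * L[k][j] for k in range(2))
          for j in range(2)] for i in range(2)]
    rhs = [sum(A[k][i] * b[k] for k in range(2)) for i in range(2)]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(M[1][1] * rhs[0] - M[0][1] * rhs[1]) / det,
            (M[0][0] * rhs[1] - M[1][0] * rhs[0]) / det]

A = [[1.0, 1.0], [1.0, 1.0001]]   # nearly singular stand-in forward operator
b = [2.0, 2.0001]                 # noise-free data consistent with x = (1, 1)
L = [[1.0, 0.0], [0.0, 1.0]]      # identity penalty (smoothness surrogate)
x = tikhonov_2x2(A, b, L, 1e-6)   # close to (1, 1) despite ill-conditioning
```

The penalty term λ||Lx||² is what tames the near-singular directions of A; in the paper's setting L additionally encodes smoothness inside the muscles and suppression outside them.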
McDougle, Leon; Way, David P; Lee, Winona K; Morfin, Jose A; Mavis, Brian E; Matthews, De'Andrea; Latham-Sadler, Brenda A; Clinchot, Daniel M
2015-08-01
The National Postbaccalaureate Collaborative (NPBC) is a partnership of Postbaccalaureate Programs (PBPs) dedicated to helping promising college graduates from disadvantaged and underrepresented backgrounds get into and succeed in medical school. This study aims to determine long-term program outcomes by looking at PBP graduates, who are now practicing physicians, in terms of health care service to the poor and underserved and contribution to health care workforce diversity. We surveyed the PBP graduates and a randomly drawn sample of non-PBP graduates from the affiliated 10 medical schools stratified by the year of medical school graduation (1996-2002). The PBP graduates were more likely to be providing care in federally designated underserved areas and practicing in institutional settings that enable access to care for vulnerable populations. The NPBC graduates serve a critical role in providing access to care for underserved populations and serve as a source for health care workforce diversity.
McDougle, Leon; Way, David P.; Lee, Winona K.; Morfin, Jose A.; Mavis, Brian E.; Wiggins, De’Andrea; Latham-Sadler, Brenda A.; Clinchot, Daniel M.
2016-01-01
The National Postbaccalaureate Collaborative (NPBC) is a partnership of Postbaccalaureate Programs (PBPs) dedicated to helping promising college graduates from disadvantaged and underrepresented backgrounds get into and succeed in medical school. This study aims to determine long-term program outcomes by looking at PBP graduates, who are now practicing physicians, in terms of healthcare service to the poor and underserved and contribution to healthcare workforce diversity. Methods We surveyed the PBP graduates and a randomly drawn sample of non-PBP graduates from the affiliated 10 medical schools stratified by the year of medical school graduation (1996-2002). Results The PBP graduates were more likely to be providing care in federally designated underserved areas and practicing in institutional settings that enable access to care for vulnerable populations. Conclusion The NPBC graduates serve a critical role in providing access to care for underserved populations and serve as a source for healthcare workforce diversity. PMID:26320900
Nuclear Medicine and Resources for Patients: How Complex are Online Patient Educational Materials?
Hansberry, David R; Shah, Kush; Agarwal, Nitin; Kim, Sung M; Intenzo, Charles M
2018-02-02
The Internet is a major source of healthcare information for patients. The American Medical Association (AMA) and National Institutes of Health (NIH) recommend that consumer healthcare websites be written between a 3rd and 7th grade reading level. The purpose of this study is to evaluate the readability of patient education websites pertaining to nuclear medicine. Methods: Ten search terms were Googled and the top 10 links for each term were collected and analyzed for readability using 10 well-established readability scales. Results: Collectively, the 99 articles were written at an 11.8 grade level (standard deviation, 3.4). Only 5 of the 99 articles were written at the NIH- and AMA-recommended 3rd to 7th grade level. Conclusion: There is a clear discordance between the readability of nuclear medicine-related patient education materials and the NIH and AMA guidelines. This disconnect may negatively impact patient understanding, contributing to poor health outcomes.
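One of the well-established readability scales used in studies of this kind is the Flesch-Kincaid grade level, which is computed directly from word, sentence and syllable counts. The counts below are made-up illustrative values, not data from the study.

```python
def fk_grade(total_words, total_sentences, total_syllables):
    """Flesch-Kincaid grade level: estimates the U.S. school grade needed
    to understand a text from average sentence length and average
    syllables per word."""
    return (0.39 * (total_words / total_sentences)
            + 11.8 * (total_syllables / total_words)
            - 15.59)

# hypothetical passage: 100 words, 5 sentences, 130 syllables
grade = fk_grade(total_words=100, total_sentences=5, total_syllables=130)
```

A passage scoring above 7 on this scale would already exceed the NIH/AMA recommendation cited above.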
A study of numerical methods for hyperbolic conservation laws with stiff source terms
NASA Technical Reports Server (NTRS)
Leveque, R. J.; Yee, H. C.
1988-01-01
The proper modeling of nonequilibrium gas dynamics is required in certain regimes of hypersonic flow. For inviscid flow this gives a system of conservation laws coupled with source terms representing the chemistry. Often a wide range of time scales is present in the problem, leading to numerical difficulties as in stiff systems of ordinary differential equations. Stability can be achieved by using implicit methods, but other numerical difficulties are observed. The behavior of typical numerical methods on a simple advection equation with a parameter-dependent source term was studied. Two approaches to incorporate the source term were utilized: MacCormack type predictor-corrector methods with flux limiters, and splitting methods in which the fluid dynamics and chemistry are handled in separate steps. Various comparisons over a wide range of parameter values were made. In the stiff case where the solution contains discontinuities, incorrect numerical propagation speeds are observed with all of the methods considered. This phenomenon is studied and explained.
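A minimal sketch of the splitting approach on a model problem of the kind studied, u_t + u_x = -μ u(u-1)(u-1/2): one first-order upwind advection step followed by a sub-stepped explicit update of the source term. This is an illustrative toy, with invented grid parameters; a production code would treat the stiff source implicitly, and it is precisely in the stiff regime with discontinuities that such schemes can propagate fronts at incorrect speeds.

```python
def step(u, dt, dx, mu, substeps=20):
    # advection step: first-order upwind, wave speed 1, inflow u = 1 at left
    new = [1.0] + [u[i] - (dt / dx) * (u[i] - u[i - 1])
                   for i in range(1, len(u))]
    # source step: sub-stepped explicit Euler for the reaction term; the
    # equilibria u = 0 and u = 1 are stable, u = 1/2 is unstable
    for _ in range(substeps):
        new = [v - (dt / substeps) * mu * v * (v - 1.0) * (v - 0.5)
               for v in new]
    return new

nx, dx, dt, mu = 100, 0.01, 0.005, 100.0        # CFL = dt/dx = 0.5
u = [1.0 if i * dx < 0.3 else 0.0 for i in range(nx)]  # step initial data
for _ in range(30):                              # advance to t = 0.15
    u = step(u, dt, dx, mu)
```

With a stiff source, numerically smeared intermediate values get driven to the nearest equilibrium within a single step, which is the mechanism behind the incorrect propagation speeds the paper analyzes.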
Where to search top-K biomedical ontologies?
Oliveira, Daniela; Butt, Anila Sahar; Haller, Armin; Rebholz-Schuhmann, Dietrich; Sahay, Ratnesh
2018-03-20
Searching for precise terms and terminological definitions in the biomedical data space is problematic, as researchers find overlapping, closely related and even equivalent concepts in a single or multiple ontologies. Search engines that retrieve ontological resources often suggest an extensive list of search results for a given input term, which leads to the tedious task of selecting the best-fit ontological resource (class or property) for the input term and reduces user confidence in the retrieval engines. A systematic evaluation of these search engines is necessary to understand their strengths and weaknesses in different search requirements. We have implemented seven comparable Information Retrieval ranking algorithms to search through ontologies and compared them against four search engines for ontologies. Free-text queries have been performed, the outcomes have been judged by experts and the ranking algorithms and search engines have been evaluated against the expert-based ground truth (GT). In addition, we propose a probabilistic GT that is developed automatically to provide deeper insights and confidence to the expert-based GT as well as evaluating a broader range of search queries. The main outcome of this work is the identification of key search factors for biomedical ontologies together with search requirements and a set of recommendations that will help biomedical experts and ontology engineers to select the best-suited retrieval mechanism in their search scenarios. We expect that this evaluation will allow researchers and practitioners to apply the current search techniques more reliably and that it will help them to select the right solution for their daily work. The source code (of seven ranking algorithms), ground truths and experimental results are available at https://github.com/danielapoliveira/bioont-search-benchmark.
Jiang, Luohua; Zhang, Ben; Smith, Matthew Lee; Lorden, Andrea L; Radcliff, Tiffany A; Lorig, Kate; Howell, Benjamin L; Whitelaw, Nancy; Ory, Marcia G
2015-01-01
To evaluate the concordance between self-reported data and variables obtained from Medicare administrative data in terms of chronic conditions and health care utilization. Retrospective observational study. We analyzed data from a sample of Medicare beneficiaries who were part of the National Study of Chronic Disease Self-Management Program (CDSMP) and were eligible for the Centers for Medicare and Medicaid Services (CMS) pilot evaluation of CDSMP (n = 119). Self-reported and Medicare claims-based chronic conditions and health care utilization were examined. Percent agreement, the kappa statistic (κ), and Pearson's correlation coefficient were used to evaluate concordance. The two data sources had substantial agreement for diabetes and chronic obstructive pulmonary disease (COPD) (κ = 0.75 and κ = 0.60, respectively), moderate agreement for cancer and heart disease (κ = 0.50 and κ = 0.47, respectively), and fair agreement for depression (κ = 0.26). With respect to health care utilization, the two data sources had almost perfect or substantial concordance for number of hospitalizations (κ = 0.69-0.79), moderate concordance for emergency department (ED) care utilization (κ = 0.45-0.61), and generally low agreement for number of physician visits (κ ≤ 0.31). Either self-reports or claims-based administrative data for diabetes, COPD, and hospitalizations can be used to analyze Medicare beneficiaries in the US. Yet, caution must be taken when only one data source is available for other types of chronic conditions and health care utilization.
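The kappa statistic reported throughout the abstract above measures chance-corrected agreement between two raters or data sources. A minimal implementation for a square agreement table, with an invented 2x2 example (presence/absence of a condition in self-report vs. claims):

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (list of lists of counts):
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = sum(sum(row) for row in table)
    p_obs = sum(table[i][i] for i in range(len(table))) / n
    p_exp = sum(
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(len(table))
    )
    return (p_obs - p_exp) / (1 - p_exp)

# hypothetical counts: rows = self-report yes/no, columns = claims yes/no
k = cohens_kappa([[40, 10], [5, 45]])   # substantial agreement
```

On the conventional interpretation used in the study, values of 0.61-0.80 indicate substantial agreement and 0.41-0.60 moderate agreement.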
Auditing the multiply-related concepts within the UMLS.
Mougin, Fleur; Grabar, Natalia
2014-10-01
This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms.
Evaluation of holistic sexuality education: A European expert group consensus agreement.
Ketting, Evert; Friele, Minou; Michielsen, Kristien
2016-01-01
Holistic sexuality education (HSE) is a new concept in sexuality education (SE). Since it differs from other types of SE in a number of important respects, strategies developed for the evaluation of the latter are not necessarily applicable to HSE. In this paper the authors provide a basis for discussion on how to evaluate HSE. First, the international literature on evaluation of SE in general was reviewed in terms of its applicability to HSE. Second, the European Expert Group on Sexuality Education extensively discussed the requirements of its evaluation and suggested appropriate indicators and methods for evaluating HSE. The European experience in SE is scarcely represented in the general evaluation literature. The majority of the literature focuses on impact and neglects programme and implementation evaluations. Furthermore, the current literature demonstrates that evaluation criteria predominantly focus on the public health impact, while there is not yet a consensus on sexual well-being criteria and aspects of positive sexuality, which are crucial parts of HSE. Finally, experimental designs are still considered the gold standard, yet several of the conditions for their use are not fulfilled in HSE. Realising that a new evaluation framework for HSE is needed, the European expert group initiated its development and agreed upon a number of indicators that provide a starting point for further discussion. Aside from the health impact, the quality of SE programmes and their implementation also deserve attention and should be evaluated. To be applicable to HSE, the evaluation criteria need to cover more than the typical public health aspects. Since they do not register long-term and multi-component characteristics, evaluation methods such as randomised controlled trials are not sufficiently suitable for HSE. 
The evaluation design should rely on a number of different information sources from mixed methods that are complemented and triangulated to build a plausible case for the effectiveness of SE in general and HSE in particular.
Ghannam, K; El-Fadel, M
2013-02-01
This paper examines the relative source contribution to ground-level concentrations of carbon monoxide (CO), nitrogen dioxide (NO2), and PM10 (particulate matter with an aerodynamic diameter < 10 μm) in a coastal urban area due to emissions from an industrial complex with multiple stacks, quarrying activities, and a nearby highway. For this purpose, an inventory of CO, oxides of nitrogen (NOx), and PM10 emissions was coupled with the non-steady-state Mesoscale Model 5/California Puff (CALPUFF) Dispersion Modeling system to simulate individual source contributions under several spatial and temporal scales. As the contribution of a particular source to ground-level concentrations can be evaluated by simulating that source's emissions alone, or alternatively total emissions excluding that source, a set of emission sensitivity simulations was designed to examine whether CALPUFF maintains a linear relationship between emission rates and predicted concentrations in cases where emitted plumes overlap and chemical transformations are simulated. Source apportionment revealed that ground-level releases extended over large areas (i.e., the highway and quarries) dominated the contribution to exposure levels over elevated point sources, despite the fact that cumulative emissions from point sources are higher. Sensitivity analysis indicated that chemical transformations of NOx are insignificant, possibly due to short-range plume transport, with CALPUFF exhibiting a linear response to changes in emission rate. The current paper points to the significance of ground-level emissions in contributing to urban air pollution exposure and questions the viability of the prevailing paradigm of point-source emission reduction, especially as the incremental improvement in air quality associated with this common abatement strategy may not deliver the desired benefit in terms of lower exposure, despite costly emissions capping.
The application of atmospheric dispersion models for source apportionment helps in identifying major contributors to regional air pollution. In industrial urban areas where multiple sources with different geometry contribute to emissions, ground-level releases extended over large areas such as roads and quarries often dominate the contribution to ground-level air pollution. Industrial emissions released at elevated stack heights may experience significant dilution, resulting in minor contribution to exposure at ground level. In such contexts, emission reduction, which is invariably the abatement strategy targeting industries at a significant investment in control equipment or process change, may result in minimal return on investment in terms of improvement in air quality at sensitive receptors.
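The linearity between emission rate and predicted concentration that the sensitivity simulations test for, and the strong dilution of elevated releases at ground level, are both exact in the textbook Gaussian plume model, a far simpler model than CALPUFF used here purely for illustration (all parameter values below are invented):

```python
import math

def plume_ground(Q, u, H, sigma_y, sigma_z, y=0.0):
    """Ground-level concentration C(x, y, 0) from a continuous point source
    (standard Gaussian plume with full ground reflection): emission rate Q,
    wind speed u, effective stack height H, dispersion coefficients
    sigma_y, sigma_z evaluated at the downwind distance of interest."""
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * math.exp(-H**2 / (2 * sigma_z**2)))

c1 = plume_ground(Q=10.0, u=3.0, H=50.0, sigma_y=80.0, sigma_z=40.0)
c2 = plume_ground(Q=20.0, u=3.0, H=50.0, sigma_y=80.0, sigma_z=40.0)
# a near-ground release of the same strength hits receptors much harder
c_ground = plume_ground(Q=10.0, u=3.0, H=2.0, sigma_y=80.0, sigma_z=40.0)
```

Doubling Q exactly doubles the ground-level concentration, while the exp(-H²/2σz²) factor is what makes elevated stacks minor contributors to exposure relative to ground-level releases such as roads and quarries.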
Source-term characterisation and solid speciation of plutonium at the Semipalatinsk NTS, Kazakhstan.
Nápoles, H Jiménez; León Vintró, L; Mitchell, P I; Omarova, A; Burkitbayev, M; Priest, N D; Artemyev, O; Lukashenko, S
2004-01-01
New data on the concentrations of key fission/activation products and transuranium nuclides in samples of soil and water from the Semipalatinsk Nuclear Test Site are presented and interpreted. Sampling was carried out at Ground Zero, Lake Balapan, the Tel'kem craters and reference locations within the test site boundary well removed from localised sources. Radionuclide ratios have been used to characterise the source term(s) at each of these sites. The geochemical partitioning of plutonium has also been examined and it is shown that the bulk of the plutonium contamination at most of the sites examined is in a highly refractory, non-labile form.
Intra-dialytic exercise training: a pragmatic approach.
Greenwood, Sharlene A; Naish, Patrick; Clark, Rachel; O'Connor, Ellen; Pursey, Victoria A; Macdougall, Iain C; Mercer, Thomas H; Koufaki, Pelagia
2014-09-01
This continuing education paper outlines the skills and knowledge required to plan, implement and evaluate a pragmatic approach to intra-dialytic exercise training. The aim of this continuing education article is to enable the nephrology multi-disciplinary team (MDT) to plan, implement and evaluate the provision of intra-dialytic exercise training for patients receiving haemodialysis therapy. After reading this article the reader should be able to: appreciate the level of the evidence base for the clinical effectiveness of renal exercise rehabilitation and locate credible sources of research and educational information; understand and consider the need for appropriate evaluation and assessment outcomes as part of a renal rehabilitation plan; understand the components of exercise programming and prescription as part of an integrated renal rehabilitation plan; and develop a sustainable longer-term exercise and physical activity plan.
Evaluation Study of a Wireless Multimedia Traffic-Oriented Network Model
NASA Astrophysics Data System (ADS)
Vasiliadis, D. C.; Rizos, G. E.; Vassilakis, C.
2008-11-01
In this paper, a wireless multimedia traffic-oriented network scheme over a fourth-generation (4G) system is presented and analyzed. We conducted an extensive evaluation study for various mobility configurations in order to incorporate the behavior of the IEEE 802.11b standard over a test-bed wireless multimedia network model. In this context, the Quality of Service (QoS) over this network is vital for providing a reliable high-bandwidth platform for data-intensive sources like video streaming. Therefore, the main QoS metrics considered were the bandwidth of both dropped and lost packets and the mean packet delay under various traffic conditions. Finally, we used a generic distance-vector routing protocol based on an implementation of the Distributed Bellman-Ford algorithm. The performance of the test-bed network model has been evaluated using the NS-2 simulation environment.
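Distance-vector routing of the kind mentioned above rests on Bellman-Ford relaxation. A minimal centralized sketch on an invented four-node topology (the distributed protocol runs the same relaxation cooperatively, with each node exchanging its distance vector with its neighbours):

```python
def bellman_ford(nodes, edges, source):
    """Shortest-path distances from `source` over bidirectional links
    (u, v, w), via at most |V|-1 rounds of edge relaxation."""
    dist = {n: float("inf") for n in nodes}
    dist[source] = 0.0
    for _ in range(len(nodes) - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:     # relax u -> v
                dist[v] = dist[u] + w
            if dist[v] + w < dist[u]:     # relax v -> u (links are symmetric)
                dist[u] = dist[v] + w
    return dist

# hypothetical topology: direct A-C link costs more than routing via B
nodes = ["A", "B", "C", "D"]
edges = [("A", "B", 1.0), ("B", "C", 2.0), ("A", "C", 5.0), ("C", "D", 1.0)]
dist = bellman_ford(nodes, edges, "A")
```

The distributed variant converges to the same distances but can converge slowly after link failures, one of the behaviors a simulator such as NS-2 is used to study.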
[Comparison of 4 methods of evaluating protein quality in vegetable sources].
Muñoz de Maquiña, A; Gross, R; Schoeneberger, H; Villacorta, L
1983-12-01
This study compares the practical value of the following methods for evaluating the protein quality of 41 diets of plant origin: protein efficiency ratio (PER), blood urea concentration in rats (BUC), relative nutritive value (RNV), and predicted protein value (PPV). Results demonstrated low correlations between PER and RNV (r = 0.66), PER and PPV (r = 0.53), and RNV and PPV (r = 0.54), whereas there was a high negative correlation between PER and BUC (r = -0.89). These different procedures can be useful and valid for distinct and well-defined objectives, but the evaluation of results must be made in accordance with the purpose of the experiment. In assessing the protein quality of foodstuffs, it is therefore recommended that mathematical computer models be developed which take into account the cybernetic system of amino acid metabolism. This would definitely reduce the current need for expensive long-term biological assays.
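The correlations reported above (e.g., r = -0.89 between PER and BUC) are Pearson product-moment coefficients, which can be computed directly from paired measurements:

```python
def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length series:
    covariance divided by the product of standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# made-up illustrative data: a perfectly inverse relationship gives r = -1
r = pearson_r([1.0, 2.0, 3.0], [3.0, 2.0, 1.0])
```

A strongly negative r, as between PER and BUC, means one method can stand in for the other in ranking diets, whereas the weak pairwise correlations among PER, RNV and PPV indicate they capture different aspects of protein quality.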
Evaluating Discovery Services Architectures in the Context of the Internet of Things
NASA Astrophysics Data System (ADS)
Polytarchos, Elias; Eliakis, Stelios; Bochtis, Dimitris; Pramatari, Katerina
As the "Internet of Things" is expected to grow rapidly in the coming years, the need to develop and deploy efficient and scalable Discovery Services in this context is very important for its success. Thus, the ability to evaluate and compare the performance of different Discovery Services architectures is vital if we want to assert that a given design is better at meeting the requirements of a specific application. The purpose of this chapter is to provide a paradigm for the evaluation of different Discovery Services for the Internet of Things in terms of efficiency, scalability and performance through the use of simulations. The methodology presented applies Discovery Services to a supply chain, using the Service Lookup Service Discovery Service implemented in OMNeT++, an open-source network simulation suite. We then delve into the simulation design and the details of our findings.
Validity of the mockwitness paradigm: testing the assumptions.
McQuiston, Dawn E; Malpass, Roy S
2002-08-01
Mockwitness identifications are used to provide a quantitative measure of lineup fairness. Some theoretical and practical assumptions of this paradigm have not been studied in terms of mockwitnesses' decision processes and procedural variation (e.g., instructions, lineup presentation method), and the current experiment was conducted to empirically evaluate these assumptions. Four hundred and eighty mockwitnesses were given physical information about a culprit, received 1 of 4 variations of lineup instructions, and were asked to identify the culprit from either a fair or unfair sequential lineup containing 1 of 2 targets. Lineup bias estimates varied as a result of lineup fairness and the target presented. Mockwitnesses generally reported that the target's physical description was their main source of identifying information. Our findings support the use of mockwitness identifications as a useful technique for sequential lineup evaluation, but only for mockwitnesses who selected only 1 lineup member. Recommendations for the use of this evaluation procedure are discussed.
NASA Astrophysics Data System (ADS)
Rajaona, Harizo; Septier, François; Armand, Patrick; Delignon, Yves; Olry, Christophe; Albergel, Armand; Moussafir, Jacques
2015-12-01
In the eventuality of an accidental or intentional atmospheric release, the reconstruction of the source term using measurements from a set of sensors is an important and challenging inverse problem. A rapid and accurate estimation of the source allows faster and more efficient action by first-response teams, in addition to providing better damage assessment. This paper presents a Bayesian probabilistic approach to estimate the location and the temporal emission profile of a pointwise source. The release rate is evaluated analytically by using a Gaussian assumption on its prior distribution, and is enhanced with a positivity constraint to improve the estimation. The source location is obtained by means of an advanced iterative Monte Carlo technique called Adaptive Multiple Importance Sampling (AMIS), which uses a recycling process at each iteration to accelerate its convergence. The proposed methodology is tested using synthetic and real concentration data in the framework of the Fusion Field Trials 2007 (FFT-07) experiment. The quality of the obtained results is comparable to that of the Markov chain Monte Carlo (MCMC) algorithm, a popular Bayesian method used for source estimation. Moreover, the adaptive processing of the AMIS provides better sampling efficiency by reusing all the generated samples.
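The self-normalised importance sampling step at the core of AMIS-style estimation can be sketched on a one-dimensional toy problem. The Gaussian prior, the sensor model and all numbers below are invented for illustration; AMIS additionally adapts the proposal between iterations and recycles samples across them.

```python
import math
import random

random.seed(0)

def likelihood(x, obs=2.0, sigma=0.5):
    """Hypothetical Gaussian sensor model: one noisy observation at `obs`
    of a scalar source parameter x (unnormalised density)."""
    return math.exp(-(obs - x) ** 2 / (2 * sigma ** 2))

# draw from the prior N(0, 3^2), used here as the proposal, and weight each
# sample by the likelihood: self-normalised importance sampling targets the
# posterior without knowing its normalising constant
xs = [random.gauss(0.0, 3.0) for _ in range(20000)]
ws = [likelihood(x) for x in xs]
posterior_mean = sum(w * x for w, x in zip(ws, xs)) / sum(ws)
```

With a fixed proposal, most samples land far from the data and carry negligible weight; adapting the proposal toward the high-weight region and reusing all past samples, as AMIS does, is what raises the sampling efficiency mentioned above.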
Naturally occurring 32Si and low-background silicon dark matter detectors
Orrell, John L.; Arnquist, Isaac J.; Bliss, Mary; ...
2018-02-10
Here, the naturally occurring radioisotope 32Si represents a potentially limiting background in future dark matter direct-detection experiments. We investigate sources of 32Si and the vectors by which it comes to reside in silicon crystals used for fabrication of radiation detectors. We infer that the 32Si concentration in commercial single-crystal silicon is likely variable, dependent upon the specific geologic and hydrologic history of the source (or sources) of silicon “ore” and the details of the silicon-refinement process. The silicon production industry is large, highly segmented by refining step, and multifaceted in terms of final product type, from which we conclude that production of 32Si-mitigated crystals requires both targeted silicon material selection and a dedicated refinement-through-crystal-production process. We review options for source material selection, including quartz from an underground source and silicon isotopically reduced in 32Si. To quantitatively evaluate the 32Si content in silicon metal and precursor materials, we propose analytic methods employing chemical processing and radiometric measurements. Ultimately, it appears feasible to produce silicon detectors with low levels of 32Si, though significant assay method development is required to validate this claim and thereby enable a quality assurance program during an actual controlled silicon-detector production cycle.