Sample records for source term development

  1. Source-term development for a contaminant plume for use by multimedia risk assessment models

    NASA Astrophysics Data System (ADS)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.

    2000-02-01

    Multimedia modelers from the US Environmental Protection Agency (EPA) and US Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: MEPAS, MMSOILS, PRESTO, and RESRAD. These models represent typical analytically based tools that are used in human-risk and endangerment assessments at installations containing radioactive and hazardous contaminants. The objective is to demonstrate an approach for developing an adequate source term by simplifying an existing, real-world, 90Sr plume at DOE's Hanford installation in Richland, WA, for use in a multimedia benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. Source characteristics and a release mechanism are developed and described; also described is a typical process and procedure that an analyst would follow in developing a source term for using this class of analytical tool in a preliminary assessment.

  2. A Well-Balanced Path-Integral f-Wave Method for Hyperbolic Problems with Source Terms

    PubMed Central

    2014-01-01

    Systems of hyperbolic partial differential equations with source terms (balance laws) arise in many applications where it is important to compute accurate time-dependent solutions modeling small perturbations of equilibrium solutions in which the source terms balance the hyperbolic part. The f-wave version of the wave-propagation algorithm is one approach, but requires the use of a particular averaged value of the source terms at each cell interface in order to be “well balanced” and exactly maintain steady states. A general approach to choosing this average is developed using the theory of path conservative methods. A scalar advection equation with a decay or growth term is introduced as a model problem for numerical experiments. PMID:24563581
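
    As a concrete illustration of the idea (a minimal sketch, not the paper's path-conservative construction), the scalar model problem q_t + u q_x = -lambda*q can be discretized with a first-order f-wave update in which the interface source average is chosen so that the f-wave vanishes exactly at the equilibrium q(x) ~ exp(-lambda*x/u). The logarithmic-mean average used below is one such choice; all names and parameter values are illustrative.

        import numpy as np

        # First-order f-wave update for q_t + u q_x = -lam*q with u > 0.
        # The interface source average Psi uses the logarithmic mean of the two
        # neighboring cell values, so the discrete equilibrium is preserved exactly.

        def log_mean(a, b):
            """(b - a) / ln(b/a), with the limit value a when b is numerically equal to a."""
            with np.errstate(divide="ignore", invalid="ignore"):
                lm = (b - a) / np.log(b / a)
            return np.where(np.isclose(a, b), a, lm)

        def step(q, u, lam, dx, dt):
            qm, qp = q[:-1], q[1:]                # left/right states at each interface
            psi = -lam * log_mean(qm, qp)         # well-balanced source average
            z = u * (qp - qm) - dx * psi          # f-wave: flux difference minus source
            qnew = q.copy()
            qnew[1:] -= dt / dx * z               # for u > 0 the whole wave enters the right cell
            return qnew

        u, lam, dx, dt = 1.0, 2.0, 0.05, 0.02
        x = np.arange(0.0, 1.0, dx)
        q = np.exp(-lam * x / u)                  # steady state of u q_x = -lam q
        print(np.max(np.abs(step(q, u, lam, dx, dt) - q)))   # ~1e-16: equilibrium preserved

    Replacing the logarithmic mean with a plain arithmetic average leaves a scheme that still converges, but the f-wave no longer vanishes at equilibrium, so the discrete steady state slowly drifts; choosing the interface average carefully is the point of the well-balanced construction.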

  3. Source inventory for Department of Energy solid low-level radioactive waste disposal facilities: What it means and how to get one of your own

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, M.A.

    1991-12-31

    In conducting a performance assessment for a low-level waste (LLW) disposal facility, one of the important considerations for determining the source term, which is defined as the amount of radioactivity being released from the facility, is the quantity of radioactive material present. This quantity, which will be referred to as the source inventory, is generally estimated through a review of historical records and waste tracking systems at the LLW facility. In theory, estimating the total source inventory for Department of Energy (DOE) LLW disposal facilities should be possible by reviewing the national data base maintained for LLW operations, the Solid Waste Information Management System (SWIMS), or through the annual report that summarizes the SWIMS data, the Integrated Data Base (IDB) report. However, in practice, there are some difficulties in making this estimate. This is not unexpected, since the SWIMS and the IDB were not developed with the goal of developing a performance assessment source term in mind. The practical shortcomings of using the existing data to develop a source term for DOE facilities will be discussed in this paper.

  4. Toward a Mechanistic Source Term in Advanced Reactors: Characterization of Radionuclide Transport and Retention in a Sodium Cooled Fast Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunett, Acacia J.; Bucknor, Matthew; Grabaskas, David

    A vital component of the U.S. reactor licensing process is an integrated safety analysis in which a source term representing the release of radionuclides during normal operation and accident sequences is analyzed. Historically, source term analyses have utilized bounding, deterministic assumptions regarding radionuclide release. However, advancements in technical capabilities and the knowledge state have enabled the development of more realistic and best-estimate retention and release models such that a mechanistic source term assessment can be expected to be a required component of future licensing of advanced reactors. Recently, as part of a Regulatory Technology Development Plan effort for sodium cooled fast reactors (SFRs), Argonne National Laboratory has investigated the current state of knowledge of potential source terms in an SFR via an extensive review of previous domestic experiments, accidents, and operation. As part of this work, the significant sources and transport processes of radionuclides in an SFR have been identified and characterized. This effort examines all stages of release and source term evolution, beginning with release from the fuel pin and ending with retention in containment. Radionuclide sources considered in this effort include releases originating both in-vessel (e.g. in-core fuel, primary sodium, cover gas cleanup system, etc.) and ex-vessel (e.g. spent fuel storage, handling, and movement). Releases resulting from a primary sodium fire are also considered as a potential source. For each release group, dominant transport phenomena are identified and qualitatively discussed. The key product of this effort was the development of concise, inclusive diagrams that illustrate the release and retention mechanisms at a high level, where unique schematics have been developed for in-vessel, ex-vessel and sodium fire releases. This review effort has also found that despite the substantial range of phenomena affecting radionuclide release, the current state of knowledge is extensive, and in most areas may be sufficient. Several knowledge gaps were identified, such as uncertainty in release from molten fuel and availability of thermodynamic data for lanthanides and actinides in liquid sodium. However, the overall findings suggest that high retention rates can be expected within the fuel and primary sodium for all radionuclides other than noble gases.

  5. Directional Unfolded Source Term (DUST) for Compton Cameras.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Dean J.; Horne, Steven M.; O'Brien, Sean

    2018-03-01

    A Directional Unfolded Source Term (DUST) algorithm was developed to enable improved spectral analysis capabilities using data collected by Compton cameras. Achieving this objective required modification of the detector response function in the Gamma Detector Response and Analysis Software (GADRAS). Experimental data that were collected in support of this work include measurements of calibration sources at a range of separation distances and cylindrical depleted uranium castings.

  6. Open source posturography.

    PubMed

    Rey-Martinez, Jorge; Pérez-Fernández, Nicolás

    2016-12-01

    Objective: To develop and validate posturography software and share its source code in open source terms. Methods: Prospective, non-randomized validation study. Twenty consecutive adults underwent two balance assessment tests: six-condition posturography was performed using clinically approved software and a force platform, and the same conditions were measured using the newly developed open source software with a low-cost force platform. The intra-class correlation coefficient of the sway area obtained from the center-of-pressure variations in both devices for the six conditions was the main variable used for validation. Results: Excellent concordance between RombergLab and the clinically approved force platform was obtained (intra-class correlation coefficient = 0.94), reaching the proposed validation goal of 0.9. A Bland and Altman graphic concordance plot was also obtained. Conclusions: The developed software (RombergLab) is a validated balance assessment tool; its reliability depends on the technical specifications of the force platform used. The source code used to develop RombergLab was published in open source terms.

  7. Observation-based source terms in the third-generation wave model WAVEWATCH

    NASA Astrophysics Data System (ADS)

    Zieger, Stefan; Babanin, Alexander V.; Erick Rogers, W.; Young, Ian R.

    2015-12-01

    Measurements collected during the AUSWEX field campaign, at Lake George (Australia), resulted in new insights into the processes of wind wave interaction and whitecapping dissipation, and consequently new parameterizations of the input and dissipation source terms. The new nonlinear wind input term developed accounts for dependence of the growth on wave steepness, airflow separation, and for negative growth rate under adverse winds. The new dissipation terms feature the inherent breaking term, a cumulative dissipation term and a term due to production of turbulence by waves, which is particularly relevant for decaying seas and for swell. The latter is consistent with the observed decay rate of ocean swell. This paper describes these source terms implemented in WAVEWATCH III® and evaluates the performance against existing source terms in academic duration-limited tests, against buoy measurements for windsea-dominated conditions, under conditions of extreme wind forcing (Hurricane Katrina), and against altimeter data in global hindcasts. Results show agreement by means of growth curves as well as integral and spectral parameters in the simulations and hindcast.

  8. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regards to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  9. 22 CFR 228.12 - Long-term leases.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations, Vol. 1, 2010-04-01. Long-term leases. Section 228.12, Foreign Relations, AGENCY FOR INTERNATIONAL DEVELOPMENT, RULES ON SOURCE, ORIGIN AND NATIONALITY FOR COMMODITIES AND... agreement is subject to the source and origin requirements of this subpart B. For purposes of this subpart B...

  10. Flowsheets and source terms for radioactive waste projections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forsberg, C.W.

    1985-03-01

    Flowsheets and source terms used to generate radioactive waste projections in the Integrated Data Base (IDB) Program are given. Volumes of each waste type generated per unit product throughput have been determined for the following facilities: uranium mining, UF6 conversion, uranium enrichment, fuel fabrication, boiling-water reactors (BWRs), pressurized-water reactors (PWRs), and fuel reprocessing. Source terms for DOE/defense wastes have been developed. Expected wastes from typical decommissioning operations for each facility type have been determined. All wastes are also characterized by isotopic composition at time of generation and by general chemical composition. 70 references, 21 figures, 53 tables.

  11. Algorithms and analytical solutions for rapidly approximating long-term dispersion from line and area sources

    NASA Astrophysics Data System (ADS)

    Barrett, Steven R. H.; Britter, Rex E.

    Predicting long-term mean pollutant concentrations in the vicinity of airports, roads and other industrial sources is frequently of concern in regulatory and public health contexts. Many emissions are represented geometrically as ground-level line or area sources. Well developed modelling tools such as AERMOD and ADMS are able to model dispersion from finite (i.e. non-point) sources with considerable accuracy, drawing upon an up-to-date understanding of boundary layer behaviour. Due to mathematical difficulties associated with line and area sources, computationally expensive numerical integration schemes have been developed. For example, some models decompose area sources into a large number of line sources orthogonal to the mean wind direction, for which an analytical (Gaussian) solution exists. Models also employ a time-series approach, which involves computing mean pollutant concentrations for every hour over one or more years of meteorological data. This can give rise to computer runtimes of several days for assessment of a site. While this may be acceptable for assessment of a single industrial complex, airport, etc., this level of computational cost precludes national or international policy assessments at the level of detail available with dispersion modelling. In this paper, we extend previous work [S.R.H. Barrett, R.E. Britter, 2008. Development of algorithms and approximations for rapid operational air quality modelling. Atmospheric Environment 42 (2008) 8105-8111] to line and area sources. We introduce approximations which allow for the development of new analytical solutions for long-term mean dispersion from line and area sources, based on hypergeometric functions. We describe how these solutions can be parameterized from a single point source run from an existing advanced dispersion model, thereby accounting for all processes modelled in the more costly algorithms. The parameterization method combined with the analytical solutions for long-term mean dispersion is shown to produce results several orders of magnitude more efficiently with a loss of accuracy small compared to the absolute accuracy of advanced dispersion models near sources. The method can be readily incorporated into existing dispersion models, and may allow for additional computation time to be expended on modelling dispersion processes more accurately in future, rather than on accounting for source geometry.
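
    To make the line-source decomposition concrete (a rough sketch under assumed power-law dispersion coefficients, not the paper's hypergeometric solutions), the snippet below integrates a ground-level Gaussian point-source kernel along a crosswind line and compares the result with the standard crosswind-integrated closed form; the wind speed and sigma_y, sigma_z curves are placeholders.

        import numpy as np
        from scipy.integrate import quad

        U = 5.0                                  # mean wind speed, m/s (placeholder)
        sigma_y = lambda x: 0.08 * x**0.9        # crosswind spread, m (placeholder power law)
        sigma_z = lambda x: 0.06 * x**0.8        # vertical spread, m (placeholder power law)

        def point_conc(q, x, y):
            """Ground-level concentration from a ground-level point source of strength q (g/s)."""
            return q / (np.pi * U * sigma_y(x) * sigma_z(x)) * np.exp(-y**2 / (2 * sigma_y(x)**2))

        def line_conc_numeric(q_l, x, half_width=2000.0):
            """Sum point-source contributions along a finite crosswind line of strength q_l (g/s/m)."""
            val, _ = quad(lambda y: point_conc(q_l, x, y), -half_width, half_width)
            return val

        def line_conc_analytic(q_l, x):
            """Infinite ground-level crosswind line source: C = sqrt(2/pi) * q_l / (U * sigma_z)."""
            return np.sqrt(2.0 / np.pi) * q_l / (U * sigma_z(x))

        x, q_l = 500.0, 0.01                     # receptor 500 m downwind; 0.01 g/s per metre
        print(line_conc_numeric(q_l, x), line_conc_analytic(q_l, x))   # the two agree closely

    Replacing the brute-force quadrature with closed forms of this kind (and, for finite lines and areas, with the hypergeometric expressions derived in the paper) is what removes the repeated numerical integration from long-term averaging.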

  12. A method for the development of disease-specific reference standards vocabularies from textual biomedical literature resources

    PubMed Central

    Wang, Liqin; Bray, Bruce E.; Shi, Jianlin; Fiol, Guilherme Del; Haug, Peter J.

    2017-01-01

    Objective Disease-specific vocabularies are fundamental to many knowledge-based intelligent systems and applications like text annotation, cohort selection, disease diagnostic modeling, and therapy recommendation. Reference standards are critical in the development and validation of automated methods for disease-specific vocabularies. The goal of the present study is to design and test a generalizable method for the development of vocabulary reference standards from expert-curated, disease-specific biomedical literature resources. Methods We formed disease-specific corpora from literature resources like textbooks, evidence-based synthesized online sources, clinical practice guidelines, and journal articles. Medical experts annotated and adjudicated disease-specific terms in four classes (i.e., causes or risk factors, signs or symptoms, diagnostic tests or results, and treatment). Annotations were mapped to UMLS concepts. We assessed source variation, the contribution of each source to build disease-specific vocabularies, the saturation of the vocabularies with respect to the number of used sources, and the generalizability of the method with different diseases. Results The study resulted in 2588 string-unique annotations for heart failure in four classes, and 193 and 425 respectively for pulmonary embolism and rheumatoid arthritis in the treatment class. Approximately 80% of the annotations were mapped to UMLS concepts. The agreement among heart failure sources ranged between 0.28 and 0.46. The contribution of these sources to the final vocabulary ranged between 18% and 49%. With the sources explored, the heart failure vocabulary reached near saturation in all four classes with the inclusion of a minimum of six sources (or between four and seven sources if counting only terms that occurred in two or more sources). It took fewer sources to reach near saturation for the other two diseases in terms of the treatment class. Conclusions We developed a method for the development of disease-specific reference vocabularies. Expert-curated biomedical literature resources are substantial for acquiring disease-specific medical knowledge. It is feasible to reach near saturation in a disease-specific vocabulary using a relatively small number of literature sources. PMID:26971304

  13. PACE. A Program for Acquiring Competence in Entrepreneurship. Resource Guide. Research and Development Series No. 194 D.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This Program for Acquiring Competence in Entrepreneurship (PACE) resource guide contains an "Annotated Glossary of Business Terms" and listings of sources of information. The glossary includes approximately 100 terms, of which the instructor should have a working knowledge. It may also be used as a handout for students. Sources of…

  14. Sample Based Unit Liter Dose Estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JENSEN, L.

    The Tank Waste Characterization Program has taken many core samples, grab samples, and auger samples from the single-shell and double-shell tanks during the past 10 years. Consequently, the amount of sample data available has increased, both in terms of quantity of sample results and the number of tanks characterized. More and better data are available than when the current radiological and toxicological source terms used in the Basis for Interim Operation (BIO) (FDH 1999a) and the Final Safety Analysis Report (FSAR) (FDH 1999b) were developed. The Nuclear Safety and Licensing (NS and L) organization wants to use the new data to upgrade the radiological and toxicological source terms used in the BIO and FSAR. The NS and L organization requested assistance in producing a statistically based process for developing the source terms. This report describes the statistical techniques used and the assumptions made to support the development of a new radiological source term for liquid and solid wastes stored in single-shell and double-shell tanks. The results given in this report are a revision to similar results given in an earlier version of the document (Jensen and Wilmarth 1999). The main difference between the results in this document and the earlier version is that the dose conversion factors (DCF) for converting µCi/g or µCi/L to Sv/L (sieverts per liter) have changed. There are now two DCFs, one based on ICRP-68 and one based on ICRP-71 (Brevick 2000).
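
    The unit-conversion step underlying such a per-liter dose is simple arithmetic, sketched below; the nuclide concentrations and dose conversion factors are made-up placeholders, not values from the report or from ICRP-68/71.

        # Unit liter dose: sum over nuclides of concentration (uCi/L) x 3.7e4 (Bq/uCi) x DCF (Sv/Bq).
        BQ_PER_UCI = 3.7e4

        concentrations_uci_per_l = {"Cs-137": 120.0, "Sr-90": 85.0}   # hypothetical tank concentrations
        dcf_sv_per_bq = {"Cs-137": 4.6e-9, "Sr-90": 3.0e-8}           # hypothetical dose conversion factors

        unit_liter_dose_sv = sum(
            conc * BQ_PER_UCI * dcf_sv_per_bq[nuclide]
            for nuclide, conc in concentrations_uci_per_l.items()
        )
        print(f"Unit liter dose: {unit_liter_dose_sv:.3e} Sv/L")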

  15. Efficient Development of High Fidelity Structured Volume Grids for Hypersonic Flow Simulations

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    2003-01-01

    A new technique for the control of grid line spacing and intersection angles of a structured volume grid, using elliptic partial differential equations (PDEs), is presented. Existing structured grid generation algorithms make use of source term hybridization to provide control of grid lines, imposing orthogonality implicitly at the boundary and explicitly on the interior of the domain. A bridging function between the two types of grid line control is typically used to blend the different orthogonality formulations. It is shown that utilizing such a bridging function with source term hybridization can result in excessive use of computational resources and diminished robustness. A new approach, Anisotropic Lagrange Based Trans-Finite Interpolation (ALBTFI), is offered as a replacement to source term hybridization. The ALBTFI technique captures the essence of the desired grid controls while improving the convergence rate of the elliptic PDEs when compared with source term hybridization. Grid generation on a blunt cone and a Shuttle Orbiter is used to demonstrate and assess the ALBTFI technique, which is shown to be as much as 50% faster, more robust, and to produce higher quality grids than source term hybridization.

  16. BWR ASSEMBLY SOURCE TERMS FOR WASTE PACKAGE DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.L. Lotz

    1997-02-15

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) to provide boiling water reactor (BWR) assembly radiation source term data for use during Waste Package (WP) design. The BWR assembly radiation source terms are to be used for evaluation of radiolysis effects at the WP surface, and for personnel shielding requirements during assembly or WP handling operations. The objectives of this evaluation are to generate BWR assembly radiation source terms that bound selected groupings of BWR assemblies, with regard to assembly average burnup and cooling time, which comprise the anticipated MGDS BWR commercial spent nuclear fuel (SNF) waste stream. The source term data is to be provided in a form which can easily be utilized in subsequent shielding/radiation dose calculations. Since these calculations may also be used for Total System Performance Assessment (TSPA), with appropriate justification provided by TSPA, or radionuclide release rate analysis, the grams of each element and additional cooling times out to 25 years will also be calculated and the data included in the output files.

  17. Common Calibration Source for Monitoring Long-term Ozone Trends

    NASA Technical Reports Server (NTRS)

    Kowalewski, Matthew

    2004-01-01

    Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and limited lifetimes of satellite monitoring instruments demand that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Monitoring Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.

  18. Algorithm Development and Application of High Order Numerical Methods for Shocked and Rapid Changing Solutions

    DTIC Science & Technology

    2007-12-06

    High order well-balanced schemes to a class of hyperbolic systems with source terms, Boletin de la Sociedad Espanola de Matematica Aplicada, v34 (2006), pp. 69-80.

  19. Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2004-01-01

    A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.

  20. Entropy-Based Bounds On Redundancies Of Huffman Codes

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic J.

    1992-01-01

    Report presents extension of theory of redundancy of binary prefix code of Huffman type which includes derivation of variety of bounds expressed in terms of entropy of source and size of alphabet. Recent developments yielded bounds on redundancy of Huffman code in terms of probabilities of various components in source alphabet. In practice, redundancies of optimal prefix codes often closer to 0 than to 1.
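
    A small worked example of the quantity being bounded (illustrative only, not the report's derivation): build an optimal binary prefix code for a toy source, then compare its expected codeword length with the source entropy; the difference, the redundancy, falls in [0, 1).

        import heapq
        import math

        def huffman_lengths(probs):
            """Codeword lengths of an optimal binary prefix (Huffman) code for the given probabilities."""
            heap = [(p, i, [i]) for i, p in enumerate(probs)]   # (prob, tiebreak, symbols in subtree)
            heapq.heapify(heap)
            lengths = [0] * len(probs)
            counter = len(probs)
            while len(heap) > 1:
                p1, _, s1 = heapq.heappop(heap)
                p2, _, s2 = heapq.heappop(heap)
                for s in s1 + s2:
                    lengths[s] += 1                             # each merge adds one bit to its members
                heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
                counter += 1
            return lengths

        probs = [0.4, 0.3, 0.2, 0.1]
        lengths = huffman_lengths(probs)
        avg_len = sum(p * l for p, l in zip(probs, lengths))          # 1.9 bits/symbol here
        entropy = -sum(p * math.log2(p) for p in probs)               # about 1.846 bits/symbol
        print(lengths, avg_len - entropy)                             # redundancy ~0.054, within [0, 1)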

  1. A two-dimensional transient analytical solution for a ponded ditch drainage system under the influence of source/sink

    NASA Astrophysics Data System (ADS)

    Sarmah, Ratan; Tiwari, Shubham

    2018-03-01

    An analytical solution is developed for predicting two-dimensional transient seepage into a ditch drainage network receiving water from a non-uniform steady ponding field at the soil surface under the influence of source/sink terms in the flow domain. The flow domain is assumed to be saturated, homogeneous and anisotropic in nature and to have finite extents in the horizontal and vertical directions. The drains are assumed to be vertical and to penetrate to the impervious layer. The water levels in the drains are unequal and invariant with time. The flow field is also assumed to be under the continuous influence of a time- and space-dependent arbitrary source/sink term. The correctness of the proposed model is checked against a numerical code developed for the purpose and against an existing analytical solution for a simplified case. The study highlights the significance of the source/sink influence on subsurface flow. With the imposition of the source and sink terms in the flow domain, the pathlines and travel times of water particles deviate from their original positions; in addition, the side and top discharges to the drains are strongly influenced by the source/sink terms. The travel times and pathlines of water particles are also observed to depend on the height of water in the ditches and on the location of the source/sink activation area.

  2. Development and evaluation of a lightweight sensor system for aerial emission sampling from open area sources

    EPA Science Inventory

    A new sensor system for mobile and aerial emission sampling was developed for open area pollutant sources, such as prescribed forest burns. The sensor system, termed “Kolibri”, consists of multiple low-cost air quality sensors measuring CO2, CO, samplers for particulate matter wi...

  3. Development and evaluation of a lightweight sensor system for aerial emission sampling from open area sources (Abstract)

    EPA Science Inventory

    A new sensor system for mobile and aerial emission sampling was developed for open area pollutant sources, such as prescribed forest burns. The sensor system, termed “Kolibri”, consists of multiple low-cost air quality sensors measuring CO2, CO, samplers for particulate matter wi...

  4. Development and evaluation of a lightweight sensor system for emission sampling from open area sources

    EPA Science Inventory

    A new sensor system for mobile and aerial emission sampling was developed for open area sources, such as open burning. The sensor system, termed “Kolibri”, consists of multiple low-cost air quality sensors measuring CO2, CO, and black carbon, samplers for particulate matter with ...

  5. Sources for Developing a Theory of Visual Literacy.

    ERIC Educational Resources Information Center

    Hortin, John A.

    Organized as a bibliographic essay, this paper examines the many sources available for developing a theory of visual literacy. Several definitions are offered in order to clarify the meaning of the term "visual literacy" so that meaningful research can be conducted on the topic. Based on the review of resources, three recommendations are offered…

  6. Quantum Information Science

    DTIC Science & Technology

    2012-02-01

    group velocity matched temporal compensator crystal assembly to increase the usable range of entangled photon sources, and (vi) the development and characterization of a new multiply-entangled photon source that increased the usable number of photon pairs by a factor of six.

  7. PFLOTRAN-RepoTREND Source Term Comparison Summary.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frederick, Jennifer M.

    Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.

  8. Prediction of discretization error using the error transport equation

    NASA Astrophysics Data System (ADS)

    Celik, Ismail B.; Parsons, Don Roscoe

    2017-06-01

    This study focuses on an approach to quantify the discretization error associated with numerical solutions of partial differential equations by solving an error transport equation (ETE). The goal is to develop a method that can be used to adequately predict the discretization error using the numerical solution on only one grid/mesh. The primary problem associated with solving the ETE is the formulation of the error source term which is required for accurately predicting the transport of the error. In this study, a novel approach is considered which involves fitting the numerical solution with a series of locally smooth curves and then blending them together with a weighted spline approach. The result is a continuously differentiable analytic expression that can be used to determine the error source term. Once the source term has been developed, the ETE can easily be solved using the same solver that is used to obtain the original numerical solution. The new methodology is applied to the two-dimensional Navier-Stokes equations in the laminar flow regime. A simple unsteady flow case is also considered. The discretization error predictions based on the methodology presented in this study are in good agreement with the 'true error'. While in most cases the error predictions are not quite as accurate as those from Richardson extrapolation, the results are reasonable and only require one numerical grid. The current results indicate that there is much promise going forward with the newly developed error source term evaluation technique and the ETE.
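
    For a concrete sense of what the error source term looks like, consider linear advection discretized with first-order upwind (an illustrative case, not the study's Navier-Stokes setting); the modified-equation argument gives the source directly:

        % Linear advection u_t + a u_x = 0, first-order upwind, CFL number \nu = a\Delta t/\Delta x.
        % The numerical solution u_h approximately satisfies the modified equation
        %   \partial_t u_h + a\,\partial_x u_h = \kappa\,\partial_{xx} u_h,
        % so the discretization error e = u - u_h obeys an advection equation whose source
        % term is computable from derivatives of the numerical solution itself:
        \begin{equation}
          \partial_t e + a\,\partial_x e \;=\; -\,\kappa\,\partial_{xx} u_h \;\equiv\; S(x,t),
          \qquad \kappa = \frac{a\,\Delta x}{2}\left(1 - \frac{a\,\Delta t}{\Delta x}\right).
        \end{equation}

    In the study, the analogous source term is built by fitting locally smooth curves to the numerical solution and differentiating the blended fit, which avoids writing out the truncation error for each scheme by hand.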

  9. Further development of a global pollution model for CO, CH4, and CH2O

    NASA Technical Reports Server (NTRS)

    Peters, L. K.

    1975-01-01

    Global tropospheric pollution models are developed that describe the transport and the physical and chemical processes occurring between the principal sources and sinks of CH4 and CO. Results are given of long term static chemical kinetic computer simulations and preliminary short term dynamic simulations.

  10. Source Credibility in Tobacco Control Messaging

    PubMed Central

    Schmidt, Allison M.; Ranney, Leah M.; Pepper, Jessica K.; Goldstein, Adam O.

    2016-01-01

    Objectives Perceived credibility of a message’s source can affect persuasion. This paper reviews how beliefs about the source of tobacco control messages may encourage attitude and behavior change. Methods We conducted a series of searches of the peer-reviewed literature using terms from communication and public health fields. We reviewed research on source credibility, its underlying concepts, and its relation to the persuasiveness of tobacco control messages. Results We recommend an agenda for future research to bridge the gaps between communication literature on source credibility and tobacco control research. Our recommendations are to study the impact of source credibility on persuasion with long-term behavior change outcomes, in different populations and demographic groups, by developing new credibility measures that are topic- and organization-specific, by measuring how credibility operates across media platforms, and by identifying factors that enhance credibility and persuasion. Conclusions This manuscript reviews the state of research on source credibility and identifies gaps that are maximally relevant to tobacco control communication. Knowing first whether a source is perceived as credible, and second, how to enhance perceived credibility, can inform the development of future tobacco control campaigns and regulatory communications. PMID:27525298

  11. Evaluation of the Hydrologic Source Term from Underground Nuclear Tests on Pahute Mesa at the Nevada Test Site: The CHESHIRE Test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawloski, G A; Tompson, A F B; Carle, S F

    The objectives of this report are to develop, summarize, and interpret a series of detailed unclassified simulations that forecast the nature and extent of radionuclide release and near-field migration in groundwater away from the CHESHIRE underground nuclear test at Pahute Mesa at the NTS over 1000 yrs. Collectively, these results are called the CHESHIRE Hydrologic Source Term (HST). The CHESHIRE underground nuclear test was one of 76 underground nuclear tests that were fired below or within 100 m of the water table between 1965 and 1992 in Areas 19 and 20 of the NTS. These areas now comprise the Pahute Mesa Corrective Action Unit (CAU) for which a separate subregional scale flow and transport model is being developed by the UGTA Project to forecast the larger-scale migration of radionuclides from underground tests on Pahute Mesa. The current simulations are being developed, on one hand, to more fully understand the complex coupled processes involved in radionuclide migration, with a specific focus on the CHESHIRE test. While remaining unclassified, they are as site specific as possible and involve a level of modeling detail that is commensurate with the most fundamental processes, conservative assumptions, and representative data sets available. However, the simulation results are also being developed so that they may be simplified and interpreted for use as a source term boundary condition at the CHESHIRE location in the Pahute Mesa CAU model. In addition, the processes of simplification and interpretation will provide generalized insight as to how the source term behavior at other tests may be considered or otherwise represented in the Pahute Mesa CAU model.

  12. Investigating Local Sustainable Environmental Perspectives of Kenyan Community Members and Teachers

    ERIC Educational Resources Information Center

    Quigley, Cassie F.; Dogbey, James; Che, S. Megan; Hallo, Jeffrey

    2015-01-01

    Efforts to conserve and preserve the environment in developing or marginalized locales frequently involve a one-way transfer of knowledge and materials from a source in a more developed location. This situation often degenerates into a short-term donor project which risks little to no long-term impacts on local or indigenous relationships with the…

  13. Passive Localization of Multiple Sources Using Widely-Spaced Arrays With Application to Marine Mammals

    DTIC Science & Technology

    2008-09-30

    developing methods to simultaneously track multiple vocalizing marine mammals, we hope to contribute to the fields of marine mammal bioacoustics, ecology, and anthropogenic impact mitigation. (Award N00014-05-1-0074: OA Graduate Traineeship for E-M Nosal.) The long-term goal of our research is to develop algorithms that use widely...

  14. 75 FR 51004 - Drill Pipe From the People's Republic of China: Preliminary Determination of Sales at Less Than...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-18

    ....; Jiangsu Shuguang Huayang Drilling Tool Co., Ltd.; and Jiangyin Long-Bright Drill Pipe Manufacturing Co..., Ukraine, and Peru are countries comparable to the PRC in terms of economic development. See April 20, 2010... countries comparable to the PRC in terms of economic development. See Surrogate Country List. The sources of...

  15. REVIEW OF METHODS FOR REMOTE SENSING OF ATMOSPHERIC EMISSIONS FROM STATIONARY SOURCES

    EPA Science Inventory

    The report reviews the commercially available and developing technologies for the application of remote sensing to the measurement of source emissions. The term 'remote sensing technology', as applied in the report, means the detection or concentration measurement of trace atmosp...

  16. America's Energy Potential: A Summary and Explanation; Committee on Interior and Insular Affairs, U.S. House of Representatives, Ninety-Third Congress, First Session. [Committee Print].

    ERIC Educational Resources Information Center

    Udall, Morris K.

    This report reviews America's current energy position. The energy sources studied include oil and gas, coal, nuclear energy, solar energy, and geothermal energy. Each source is analyzed in terms of current use, technology for extracting and developing the energy, research and development funding, and projections for future consumption and…

  17. Developing Design Criteria and Scale Up Methods for Water-Stable Metal-Organic Frameworks for Adsorption Applications

    DTIC Science & Technology

    2014-09-08

    Figure 1.4: Number of publications containing the term “metal-organic frameworks” (Source: ISI Web of Science, retrieved April 14th, 2014). ... recorded with a PerkinElmer Spectrum One 10 in the range 400-4000 cm^-1. To record the IR spectrum, an IR beam is passed through the sample (in...

  18. Mapping water availability, projected use and cost in the western United States

    NASA Astrophysics Data System (ADS)

    Tidwell, Vincent C.; Moreland, Barbara D.; Zemlick, Katie M.; Roberts, Barry L.; Passell, Howard D.; Jensen, Daniel; Forsgren, Christopher; Sehlke, Gerald; Cook, Margaret A.; King, Carey W.; Larsen, Sara

    2014-05-01

    New demands for water can be satisfied through a variety of source options. In some basins surface and/or groundwater may be available through permitting with the state water management agency (termed unappropriated water), alternatively water might be purchased and transferred out of its current use to another (termed appropriated water), or non-traditional water sources can be captured and treated (e.g., wastewater). The relative availability and cost of each source are key factors in the development decision. Unfortunately, these measures are location dependent with no consistent or comparable set of data available for evaluating competing water sources. With the help of western water managers, water availability was mapped for over 1200 watersheds throughout the western US. Five water sources were individually examined, including unappropriated surface water, unappropriated groundwater, appropriated water, municipal wastewater and brackish groundwater. Also mapped was projected change in consumptive water use from 2010 to 2030. Associated costs to acquire, convey and treat the water, as necessary, for each of the five sources were estimated. These metrics were developed to support regional water planning and policy analysis with initial application to electric transmission planning in the western US.

  19. Toward a Mechanistic Source Term in Advanced Reactors: A Review of Past U.S. SFR Incidents, Experiments, and Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, Matthew; Brunett, Acacia J.; Grabaskas, David

    In 2015, as part of a Regulatory Technology Development Plan (RTDP) effort for sodium-cooled fast reactors (SFRs), Argonne National Laboratory investigated the current state of knowledge of source term development for a metal-fueled, pool-type SFR. This paper provides a summary of past domestic metal-fueled SFR incidents and experiments and highlights information relevant to source term estimations that were gathered as part of the RTDP effort. The incidents described in this paper include fuel pin failures at the Sodium Reactor Experiment (SRE) facility in July of 1959, the Fermi I meltdown that occurred in October of 1966, and the repeated melting of a fuel element within an experimental capsule at the Experimental Breeder Reactor II (EBR-II) from November 1967 to May 1968. The experiments described in this paper include the Run-Beyond-Cladding-Breach tests that were performed at EBR-II in 1985 and a series of severe transient overpower tests conducted at the Transient Reactor Test Facility (TREAT) in the mid-1980s.

  20. Emergency Preparedness technology support to the Health and Safety Executive (HSE), Nuclear Installations Inspectorate (NII) of the United Kingdom. Appendix A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Kula, K.R.

    1994-03-01

    The Nuclear Installations Inspectorate (NII) of the United Kingdom (UK) suggested the use of an accident progression logic model method developed by Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) for K Reactor to predict the magnitude and timing of radioactivity releases (the source term) based on an advanced logic model methodology. Predicted releases are output from the personal computer-based model in a level-of-confidence format. Additional technical discussions eventually led to a request from the NII to develop a proposal for assembling a similar technology to predict source terms for the UK's advanced gas-cooled reactor (AGR) type. To respond to this request, WSRC is submitting a proposal to provide contractual assistance as specified in the Scope of Work. The work will produce, document, and transfer technology associated with a Decision-Oriented Source Term Estimator for Emergency Preparedness (DOSE-EP) for the NII to apply to AGRs in the United Kingdom. This document, Appendix A, is part of this proposal.

  1. The Case for Open Source: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    Open source has continued to evolve and in the past three years the development of a graphical user interface has made it increasingly accessible and viable for end users without special training. Open source relies to a great extent on the free software movement. In this context, the term free refers not to cost, but to the freedom users have to…

  2. Predicting vertically-nonsequential wetting patterns with a source-responsive model

    USGS Publications Warehouse

    Nimmo, John R.; Mitchell, Lara

    2013-01-01

    Water infiltrating into soil of natural structure often causes wetting patterns that do not develop in an orderly sequence. Because traditional unsaturated flow models represent a water advance that proceeds sequentially, they fail to predict irregular development of water distribution. In the source-responsive model, a diffuse domain (D) represents flow within soil matrix material following traditional formulations, and a source-responsive domain (S), characterized in terms of the capacity for preferential flow and its degree of activation, represents preferential flow as it responds to changing water-source conditions. In this paper we assume water undergoing rapid source-responsive transport at any particular time is of negligibly small volume; it becomes sensible at the time and depth where domain transfer occurs. A first-order transfer term represents abstraction from the S to the D domain which renders the water sensible. In tests with lab and field data, for some cases the model shows good quantitative agreement, and in all cases it captures the characteristic patterns of wetting that proceed nonsequentially in the vertical direction. In these tests we determined the values of the essential characterizing functions by inverse modeling. These functions relate directly to observable soil characteristics, rendering them amenable to evaluation and improvement through hydropedologic development.

  3. Vehicle Miles Traveled (VMT) Fees : Preliminary Report – Tasks 1 and 2.

    DOT National Transportation Integrated Search

    2014-03-01

    Fuel taxes are the primary source of funding for state and federal transportation programs and have been for well over 80 years. However, the long term viability of this revenue source is in question. The development of more fuel efficient engine tec...

  4. Spent fuel radionuclide source-term model for assessing spent fuel performance in geological disposal. Part I: Assessment of the instant release fraction

    NASA Astrophysics Data System (ADS)

    Johnson, Lawrence; Ferry, Cécile; Poinssot, Christophe; Lovera, Patrick

    2005-11-01

    A source-term model for the short-term release of radionuclides from spent nuclear fuel (SNF) has been developed. It provides quantitative estimates of the fraction of various radionuclides that are expected to be released rapidly (the instant release fraction, or IRF) when water contacts the UO2 or MOX fuel after container breaching in a geological repository. The estimates are based on correlation of leaching data for radionuclides with fuel burnup and fission gas release. Extrapolation of the data to higher fuel burnup values is based on examination of data on fuel restructuring, such as rim development, and on fission gas release data, which permits bounding IRF values to be estimated assuming that radionuclide releases will be less than fission gas release. The consideration of long-term solid-state changes influencing the IRF prior to canister breaching is addressed by evaluating alpha self-irradiation enhanced diffusion, which may gradually increase the accumulation of fission products at grain boundaries.

  5. The Scaling of Broadband Shock-Associated Noise with Increasing Temperature

    NASA Technical Reports Server (NTRS)

    Miller, Steven A.

    2012-01-01

    A physical explanation for the saturation of broadband shock-associated noise (BBSAN) intensity with increasing jet stagnation temperature has eluded investigators. An explanation is proposed for this phenomenon with the use of an acoustic analogy. For this purpose the acoustic analogy of Morris and Miller is examined. To isolate the relevant physics, the scaling of BBSAN at the peak intensity level at the sideline (psi = 90 degrees) observer location is examined. Scaling terms are isolated from the acoustic analogy and the result is compared using a convergent nozzle with the experiments of Bridges and Brown and using a convergent-divergent nozzle with the experiments of Kuo, McLaughlin, and Morris at four nozzle pressure ratios in increments of total temperature ratios from one to four. The equivalent source within the framework of the acoustic analogy for BBSAN is based on local field quantities at shock wave shear layer interactions. The equivalent source combined with accurate calculations of the propagation of sound through the jet shear layer, using an adjoint vector Green's function solver of the linearized Euler equations, allows for predictions that retain the scaling with respect to stagnation pressure and allows for the accurate saturation of BBSAN with increasing stagnation temperature. This is a minor change to the source model relative to the previously developed models. The full development of the scaling term is shown. The sources and vector Green's function solver are informed by steady Reynolds-Averaged Navier-Stokes solutions. These solutions are examined as a function of stagnation temperature at the first shock wave shear layer interaction. It is discovered that saturation of BBSAN with increasing jet stagnation temperature occurs due to a balance between the amplification of the sound propagation through the shear layer and the source term scaling.

  6. Development of a Hard X-ray Beam Position Monitor for Insertion Device Beams at the APS

    NASA Astrophysics Data System (ADS)

    Decker, Glenn; Rosenbaum, Gerd; Singh, Om

    2006-11-01

    Long-term pointing stability requirements at the Advanced Photon Source (APS) are very stringent, at the level of 500 nanoradians peak-to-peak or better over a one-week time frame. Conventional rf beam position monitors (BPMs) close to the insertion device source points are incapable of assuring this level of stability, owing to mechanical, thermal, and electronic stability limitations. Insertion device gap-dependent systematic errors associated with the present ultraviolet photon beam position monitors similarly limit their ability to control long-term pointing stability. We report on the development of a new BPM design sensitive only to hard x-rays. Early experimental results will be presented.

  7. The Funding of Long-Term Care in Canada: What Do We Know, What Should We Know?

    PubMed

    Grignon, Michel; Spencer, Byron G

    2018-06-01

    Long-term care is a growing component of health care spending but how much is spent or who bears the cost is uncertain, and the measures vary depending on the source used. We drew on regularly published series and ad hoc publications to compile preferred estimates of the share of long-term care spending in total health care spending, the private share of long-term care spending, and the share of residential care within long-term care. For each series, we compared estimates obtainable from published sources (CIHI [Canadian Institute for Health Information] and OECD [Organization for Economic Cooperation and Development]) with our preferred estimates. We conclude that using published series without adjustment would lead to spurious conclusions on the level and evolution of spending on long-term care in Canada as well as on the distribution of costs between private and public funders and between residential and home care.

  8. Long-term monitoring of molecular markers can distinguish different seasonal patterns of fecal indicating bacteria sources.

    PubMed

    Riedel, Timothy E; Thulsiraj, Vanessa; Zimmer-Faust, Amity G; Dagit, Rosi; Krug, Jenna; Hanley, Kaitlyn T; Adamek, Krista; Ebentier, Darcy L; Torres, Robert; Cobian, Uriel; Peterson, Sophie; Jay, Jennifer A

    2015-03-15

    Elevated levels of fecal indicator bacteria (FIB) have been observed at Topanga Beach, CA, USA. To identify the FIB sources, a microbial source tracking study using a dog-, a gull- and two human-associated molecular markers was conducted at 10 sites over 21 months. Historical data suggest that episodic discharge from the lagoon at the mouth of Topanga Creek is the main source of bacteria to the beach. A decline in creek FIB/markers downstream from upper watershed development and a sharp increase in FIB/markers at the lagoon sites suggest sources are local to the lagoon. At the lagoon and beach, human markers are detected sporadically, dog marker peaks in abundance mid-winter, and gull marker is chronically elevated. Varied seasonal patterns of FIB and source markers were identified showing the importance of applying a suite of markers over long-term spatial and temporal sampling to identify a complex combination of sources of contamination.

  9. An extension of the Lighthill theory of jet noise to encompass refraction and shielding

    NASA Technical Reports Server (NTRS)

    Ribner, Herbert S.

    1995-01-01

    A formalism for jet noise prediction is derived that includes the refractive 'cone of silence' and other effects; outside the cone it approximates the simple Lighthill format. A key step is deferral of the simplifying assumption of uniform density in the dominant 'source' term. The result is conversion to a convected wave equation retaining the basic Lighthill source term. The main effect is to amend the Lighthill solution to allow for refraction by mean flow gradients, achieved via a frequency-dependent directional factor. A general formula for power spectral density emitted from unit volume is developed as the Lighthill-based value multiplied by a squared 'normalized' Green's function (the directional factor), referred to a stationary point source. The convective motion of the sources, with its powerful amplifying effect, also directional, is already accounted for in the Lighthill format: wave convection and source convection are decoupled. The normalized Green's function appears to be near unity outside the refraction-dominated 'cone of silence'; this validates our long-term practice of using Lighthill-based approaches outside the cone, with extension inside via the Green's function. The function is obtained either experimentally (injected 'point' source) or numerically (computational aeroacoustics). Approximation by unity seems adequate except near the cone and except when there are shrouding jets: in that case the difference from unity quantifies the shielding effect. Further extension yields dipole and monopole source terms (cf. Morfey, Mani, and others) when the mean flow possesses density gradients (e.g., hot jets).

  10. Aerostat-lofted instrument and sampling method for determination of emissions from open area sources

    EPA Science Inventory

    An aerostat-borne instrument and sampling method was developed to characterize air samples from area sources, such as emissions from open burning. The 10 kg battery-powered instrument system, termed "the Flyer," is lofted with a helium-filled aerostat of 4 m nominal diameter and ...

  11. ERROR IN ANNUAL AVERAGE DUE TO USE OF LESS THAN EVERYDAY MEASUREMENTS

    EPA Science Inventory

    Long-term averages of the concentration of PM mass and components are of interest for determining compliance with annual averages, for developing exposure surrogates for cross-sectional epidemiologic studies of the long-term effects of PM, and for determination of aerosol sources by chem...

  12. Physical/chemical closed-loop water-recycling

    NASA Technical Reports Server (NTRS)

    Herrmann, Cal C.; Wydeven, Theodore

    1991-01-01

    Water needs, water sources, and means for recycling water are examined in terms appropriate to the water quality requirements of a small crew and spacecraft intended for long duration exploration missions. Inorganic, organic, and biological hazards are estimated for waste water sources. Sensitivities to these hazards for human uses are estimated. The water recycling processes considered are humidity condensation, carbon dioxide reduction, waste oxidation, distillation, reverse osmosis, pervaporation, electrodialysis, ion exchange, carbon sorption, and electrochemical oxidation. Limitations and applications of these processes are evaluated in terms of water quality objectives. Computerized simulation of some of these chemical processes is examined. Recommendations are made for development of new water recycling technology and improvement of existing technology for near term application to life support systems for humans in space. The technological developments are equally applicable to water needs on Earth, in regions where extensive water recycling is needed or where advanced water treatment is essential to meet EPA health standards.

  13. Nurses' Information Seeking Behavior for Clinical Practice: A Case Study in a Developing Country.

    PubMed

    Sarbaz, Masoumeh; Kimiafar, Khalil; Sheikhtaheri, Abbas; Taherzadeh, Zhila; Eslami, Saeid

    2016-01-01

    We used a validated questionnaire to survey Iranian nurses' information-seeking behavior and their confidence in different information sources. The most frequently used sources were the "Internet" and "personal experiences" (54.8% and 48.2%, respectively). "English medical journals" (61.9%) and "English textbooks" (41.3%) were the least frequently used sources. Nurses felt high confidence in sources such as "international instructions/guidelines" (58.6%) and "English medical textbooks" (50.4%). The main reasons for selecting sources were easy accessibility, being up to date, and reliability. Google, PubMed, and UpToDate were the most used electronic sources. In addition, there were differences in the use of some of these resources by nurses' age and gender. In developing information sources for nurses, factors such as reliability, availability, and how up to date the resources are should be emphasized.

  14. Hydrologic Source Term Processes and Models for the Clearwater and Wineskin Tests, Rainier Mesa, Nevada National Security Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carle, Steven F.

    2011-05-04

    This report describes the development, processes, and results of a hydrologic source term (HST) model for the CLEARWATER (U12q) and WINESKIN (U12r) tests located on Rainier Mesa, Nevada National Security Site, Nevada (Figure 1.1). Of the 61 underground tests (involving 62 unique detonations) conducted on Rainier Mesa (Area 12) between 1957 and 1992 (USDOE, 2015), the CLEARWATER and WINESKIN tests present many unique features that warrant a separate HST modeling effort from other Rainier Mesa tests.

  15. Bremsstrahlung Dose Yield for High-Intensity Short-Pulse Laser–Solid Experiments

    DOE PAGES

    Liang, Taiee; Bauer, Johannes M.; Liu, James C.; ...

    2016-12-01

    A bremsstrahlung source term has been developed by the Radiation Protection (RP) group at SLAC National Accelerator Laboratory for high-intensity short-pulse laser–solid experiments between 10^17 and 10^22 W cm^-2. This source term couples the particle-in-cell plasma code EPOCH and the radiation transport code FLUKA to estimate the bremsstrahlung dose yield from laser–solid interactions. EPOCH characterizes the energy distribution, angular distribution, and laser-to-electron conversion efficiency of the hot electrons from laser–solid interactions, and FLUKA utilizes this hot electron source term to calculate a bremsstrahlung dose yield (mSv per J of laser energy on target). The goal of this paper is to provide RP guidelines and hazard analysis for high-intensity laser facilities. Finally, a comparison of the calculated bremsstrahlung dose yields to radiation measurement data is also made.

  16. Development of surrogate models for the prediction of the flow around an aircraft propeller

    NASA Astrophysics Data System (ADS)

    Salpigidou, Christina; Misirlis, Dimitris; Vlahostergios, Zinon; Yakinthos, Kyros

    2018-05-01

    In the present work, the derivation of two surrogate models (SMs) for modelling the flow around a propeller for small aircraft is presented. Both methodologies use derived functions based on computations with the detailed propeller geometry. The computations were performed using the k-ω shear stress transport model for turbulence. In the SMs, the propeller was modelled in a computational domain of disk-like geometry, where source terms were introduced in the momentum equations. In the first SM, the source terms were polynomial functions of swirl and thrust, mainly related to the propeller radius. In the second SM, regression analysis was used to correlate the source terms with the velocity distribution through the propeller. The proposed SMs achieved faster convergence relative to the detailed model, while also providing results closer to the available operational data. The regression-based model was the most accurate and required less computational time for convergence.
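
    A minimal sketch of the first surrogate's idea, assuming an actuator-disk-style implementation: polynomial axial (thrust) and tangential (swirl) momentum sources are evaluated over the disk radius and would be added to the momentum equations inside the disk-shaped zone. The polynomial coefficients, radius, and function names below are illustrative assumptions, not values from the paper.

      import numpy as np

      def disk_momentum_sources(r, R, thrust_poly, swirl_poly):
          """Axial and tangential momentum sources (per unit volume) at radius r of the disk."""
          x = np.clip(r / R, 0.0, 1.0)            # non-dimensional radial position
          s_axial = np.polyval(thrust_poly, x)    # thrust-producing axial source (assumed polynomial)
          s_swirl = np.polyval(swirl_poly, x)     # swirl-producing tangential source (assumed polynomial)
          return s_axial, s_swirl

      # Evaluate on a few radial stations of a hypothetical 0.8 m propeller disk.
      for r in np.linspace(0.0, 0.8, 5):
          print(r, disk_momentum_sources(r, 0.8, thrust_poly=[120.0, -40.0, 5.0],
                                         swirl_poly=[30.0, -10.0, 1.0]))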

  17. Bremsstrahlung Dose Yield for High-Intensity Short-Pulse Laser–Solid Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, Taiee; Bauer, Johannes M.; Liu, James C.

    A bremsstrahlung source term has been developed by the Radiation Protection (RP) group at SLAC National Accelerator Laboratory for high-intensity short-pulse laser–solid experiments between 10^17 and 10^22 W cm^-2. This source term couples the particle-in-cell plasma code EPOCH and the radiation transport code FLUKA to estimate the bremsstrahlung dose yield from laser–solid interactions. EPOCH characterizes the energy distribution, angular distribution, and laser-to-electron conversion efficiency of the hot electrons from laser–solid interactions, and FLUKA utilizes this hot electron source term to calculate a bremsstrahlung dose yield (mSv per J of laser energy on target). The goal of this paper is to provide RP guidelines and hazard analysis for high-intensity laser facilities. Finally, a comparison of the calculated bremsstrahlung dose yields to radiation measurement data is also made.

  18. On the application of subcell resolution to conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Chang, Shih-Hung

    1989-01-01

    LeVeque and Yee recently investigated a one-dimensional scalar conservation law with stiff source terms modeling reacting flow problems and discovered that, for the very stiff case, most of the current finite difference methods developed for non-reacting flows would produce wrong solutions when there is a propagating discontinuity. A numerical scheme, essentially nonoscillatory/subcell resolution - characteristic direction (ENO/SRCD), is proposed for solving conservation laws with stiff source terms. This scheme is a modification of Harten's ENO scheme with subcell resolution, ENO/SR. The locations of the discontinuities and the characteristic directions are essential in the design. Strang's time-splitting method is used and time evolutions are done by advancing along the characteristics. Numerical experiments using this scheme show excellent results on the model problem of LeVeque and Yee. Comparisons of the results of ENO, ENO/SR, and ENO/SRCD are also presented.
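
    A brief sketch of the splitting structure described above, under the assumption that the LeVeque-Yee model problem takes the form u_t + u_x = -mu*u*(u-1)*(u-1/2). The advection step here is plain first-order upwind rather than the ENO/SRCD scheme, so it only illustrates how Strang splitting alternates transport and stiff-source updates.

      import numpy as np

      def advect_upwind(u, dt, dx):
          # periodic first-order upwind step for unit advection speed
          return u - (dt / dx) * (u - np.roll(u, 1))

      def react(u, dt, mu, substeps=50):
          # integrate du/dt = -mu*u*(u-1)*(u-0.5) with small explicit substeps;
          # a truly stiff case would use an implicit or exact integrator instead
          h = dt / substeps
          for _ in range(substeps):
              u = u - h * mu * u * (u - 1.0) * (u - 0.5)
          return u

      def strang_step(u, dt, dx, mu):
          u = react(u, 0.5 * dt, mu)      # half step of the source term
          u = advect_upwind(u, dt, dx)    # full advection step
          return react(u, 0.5 * dt, mu)   # second half step of the source term

      x = np.linspace(0.0, 1.0, 200, endpoint=False)
      u = np.where(x < 0.3, 1.0, 0.0)     # step profile whose discontinuity propagates
      dx = x[1] - x[0]
      for _ in range(100):
          u = strang_step(u, dt=0.4 * dx, dx=dx, mu=100.0)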

  19. Classification of light sources and their interaction with active and passive environments

    NASA Astrophysics Data System (ADS)

    El-Dardiry, Ramy G. S.; Faez, Sanli; Lagendijk, Ad

    2011-03-01

    Emission from a molecular light source depends on its optical and chemical environment. This dependence is different for various sources. We present a general classification in terms of constant-amplitude and constant-power sources. Using this classification, we have described the response to both changes in the local density of states and stimulated emission. The unforeseen consequences of this classification are illustrated for photonic studies by random laser experiments and are in good agreement with our correspondingly developed theory. Our results require a revision of studies on sources in complex media.

  20. Sources of Cigarettes among Adolescent Smokers: Free or Purchased?

    ERIC Educational Resources Information Center

    Jansen, Paul; Toomey, Traci L.; Nelson, Toben F.; Fabian, Lindsey E. A.; Lenk, Kathleen M.; Forster, Jean L.

    2011-01-01

    Few studies have described youth cigarette sources in terms of whether the cigarettes were free or purchased. Understanding the different ways youth obtain tobacco can guide development of interventions to more effectively reduce youth smoking. Purpose: To determine the propensity for youth to purchase cigarettes versus obtain cigarettes for free,…

  1. Assessment of macroseismic intensity in the Nile basin, Egypt

    NASA Astrophysics Data System (ADS)

    Fergany, Elsayed

    2018-01-01

    This work intends to assess deterministic seismic hazard and risk in terms of the maximum expected intensity map of the Egyptian Nile basin sector. A seismic source zone model of Egypt was delineated based on an updated and compatible earthquake catalog in 2015, focal mechanisms, and the common tectonic elements. Four effective seismic source zones were identified along the Nile basin. The observed macroseismic intensity data along the basin were used to develop an intensity prediction equation defined in terms of moment magnitude. An expected maximum intensity map was then produced based on the developed intensity prediction equation, the identified effective seismic source zones, and the maximum expected magnitude for each zone along the basin. The earthquake hazard and risk are discussed and analyzed in view of the maximum expected moment magnitude and the maximum expected intensity values for each effective source zone. Moderate expected magnitudes are expected to pose high risk to the Cairo and Aswan regions. The results of this study could serve as a recommendation for the planners in charge of mitigating the seismic risk at these strategic zones of Egypt.
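
    A small sketch of how an intensity prediction equation of an assumed form I = a + b*Mw + c*log10(R) could be fitted to observed intensities by least squares. The functional form, coefficients, and synthetic data are illustrative assumptions; the paper's actual equation and dataset are not reproduced here.

      import numpy as np

      rng = np.random.default_rng(1)
      mw = rng.uniform(4.0, 7.0, 200)                   # moment magnitudes (synthetic)
      r = rng.uniform(5.0, 200.0, 200)                  # distances in km (synthetic)
      intensity = 1.5 + 1.2 * mw - 3.0 * np.log10(r) + rng.normal(0.0, 0.3, 200)

      # least-squares fit of the assumed intensity prediction equation
      A = np.column_stack([np.ones_like(mw), mw, np.log10(r)])
      coeffs, *_ = np.linalg.lstsq(A, intensity, rcond=None)
      print("fitted a, b, c:", coeffs)                  # should be close to 1.5, 1.2, -3.0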

  2. Use of commercial and social sources of alcohol by underage drinkers: the role of pubertal timing.

    PubMed

    Storvoll, Elisabet E; Pape, Hilde; Rossow, Ingeborg

    2008-01-01

    We have explored whether alcohol use and procurement of alcohol from commercial and social sources vary with pubertal timing. A sub-sample of 9291 Norwegian minors (13-17 year-olds) was extracted from a nationwide school survey (response rate: 92%). Adolescents who had matured early (early developers, EDs) reported higher consumption and more alcohol-related harm than those who had matured late (late developers, LDs) or at the "normal" time (on time developers, ODs). Purchases from on-premise and off-premise outlets were much more important sources of alcohol for EDs than for ODs and LDs - both in relative and absolute terms. Moreover, EDs were somewhat more likely to obtain alcohol from social sources. Taken together, the findings indicate that adolescents who mature early have access to a larger variety of sources of alcohol than adolescents who mature later - which in turn may explain their increased level of drinking.

  3. Management of Ultimate Risk of Nuclear Power Plants by Source Terms - Lessons Learned from the Chernobyl Accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genn Saji

    2006-07-01

    The term 'ultimate risk' is used here to describe the probabilities and radiological consequences that should be incorporated in siting, containment design, and accident management of nuclear power plants for hypothetical accidents. It is closely related to the source terms specified in siting criteria, which assure an adequate separation of the radioactive inventories of the plants from the public in the event of a hypothetical and severe accident situation. The author would like to point out that current source terms, which are based on the information from the Windscale accident (1957) through TID-14844, are very outdated and do not incorporate lessons learned from either the Three Mile Island (TMI, 1979) or the Chernobyl (1986) accident, two of the most severe accidents ever experienced. As a result of the observations of benign radionuclides released at TMI, the technical community in the US felt that a more realistic evaluation of severe reactor accident source terms was necessary. Against this background, the 'source term research project' was organized in 1984 to respond to these challenges. Unfortunately, soon after the final report from this project was released, the Chernobyl accident occurred. Due to the enormous consequences induced by the accident, the once optimistic perspectives on establishing a more realistic source term were completely shattered. The Chernobyl accident, with its human death toll and dispersion of a large part of the fission fragment inventories into the environment, created a significant degradation in the public's acceptance of nuclear energy throughout the world. In spite of this, nuclear communities have been prudent in responding to the public's anxiety towards the ultimate safety of nuclear plants, since there still remained many unknown points revolving around the mechanism of the Chernobyl accident. In order to resolve some of these mysteries, the author has performed a scoping study of the dispersion and deposition mechanisms of fuel particles and fission fragments during the initial phase of the Chernobyl accident. Through this study, it is now possible to generally reconstruct the radiological consequences by using a dispersion calculation technique, combined with the meteorological data at the time of the accident and land contamination densities of 137Cs measured and reported around the Chernobyl area. Although it is challenging to incorporate lessons learned from the Chernobyl accident into the source term issues, the author has already developed an example of safety goals by incorporating the radiological consequences of the accident. The example provides safety goals by specifying source term releases in a graded approach in combination with probabilities, i.e. risks. The author believes that future source term specifications should be directly linked with safety goals. (author)

  4. The Application of Function Points to Predict Source Lines of Code for Software Development

    DTIC Science & Technology

    1992-09-01

    there are some disadvantages. Software estimating tools are expensive. A single tool may cost more than $15,000 due to the high market value of the... term and Lang variables simultaneously only added marginal improvements over models with these terms included singularly. Using all the available

  5. Source-term development for a contaminant plume for use by multimedia risk assessment models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.

    1999-12-01

    Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments for use at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world, Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.

  6. A controlled variation scheme for convection treatment in pressure-based algorithm

    NASA Technical Reports Server (NTRS)

    Shyy, Wei; Thakur, Siddharth; Tucker, Kevin

    1993-01-01

    Convection effects and source terms are two primary sources of difficulty in computing turbulent reacting flows typically encountered in propulsion devices. The present work intends to elucidate the individual as well as the collective roles of convection and source terms in the fluid flow equations, and to devise appropriate treatments and implementations to improve our current capability of predicting such flows. A controlled variation scheme (CVS) has been under development in the context of a pressure-based algorithm, which has the characteristic of adaptively regulating the amount of numerical diffusivity, relative to the central difference scheme, according to the variation in the local flow field. Both the basic concepts and a pragmatic assessment will be presented to highlight the status of this work.

  7. A highly sensitive search strategy for clinical trials in Literatura Latino Americana e do Caribe em Ciências da Saúde (LILACS) was developed.

    PubMed

    Manríquez, Juan J

    2008-04-01

    Systematic reviews should include as many articles as possible. However, many systematic reviews use only databases with high English-language content as sources of trials. Literatura Latino Americana e do Caribe em Ciências da Saúde (LILACS) is an underused source of trials, and there is no validated strategy for searching for clinical trials in this database. The objective of this study was to develop a sensitive search strategy for clinical trials in LILACS. An analytical survey was performed. Several single- and multiple-term search strategies were tested for their ability to retrieve clinical trials in LILACS. Sensitivity, specificity, and accuracy of each single- and multiple-term strategy were calculated using the results of a hand-search of 44 Chilean journals as the gold standard. After combining the most sensitive, specific, and accurate single- and multiple-term search strategies, a strategy with a sensitivity of 97.75% (95% confidence interval [CI] = 95.98-99.53) and a specificity of 61.85% (95% CI = 61.19-62.51) was obtained. LILACS is a source of trials that could improve systematic reviews. A new, highly sensitive search strategy for clinical trials in LILACS has been developed. It is hoped this search strategy will improve and increase the utilization of LILACS in future systematic reviews.
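
    For reference, the reported sensitivity and specificity follow the usual definitions against the hand-search gold standard; a small sketch with invented counts (not the study's data) shows the arithmetic.

      def search_strategy_metrics(tp, fp, fn, tn):
          sensitivity = tp / (tp + fn)               # retrieved trials / all true trials
          specificity = tn / (tn + fp)               # excluded non-trials / all non-trials
          accuracy = (tp + tn) / (tp + fp + fn + tn)
          return sensitivity, specificity, accuracy

      # Invented counts for illustration only.
      print(search_strategy_metrics(tp=435, fp=3100, fn=10, tn=5030))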

  8. Coupling long and short term decisions in the design of urban water supply infrastructure for added reliability and flexibility

    NASA Astrophysics Data System (ADS)

    Marques, G.; Fraga, C. C. S.; Medellin-Azuara, J.

    2016-12-01

    The expansion and operation of urban water supply systems under growing demands, hydrologic uncertainty, and water scarcity requires a strategic combination of supply sources for reliability, reduced costs, and improved operational flexibility. The design and operation of such a portfolio of water supply sources involves integration of long- and short-term planning to determine what and when to expand, and how much to use of each supply source, accounting for interest rates, economies of scale, and hydrologic variability. This research presents an integrated methodology coupling dynamic programming optimization with quadratic programming to optimize the expansion (long term) and operations (short term) of multiple water supply alternatives. Lagrange multipliers produced by the short-term model provide a signal about the marginal opportunity cost of expansion to the long-term model, in an iterative procedure. A simulation model hosts the water supply infrastructure and hydrologic conditions. Results allow (a) identification of trade-offs between cost and reliability of different expansion paths and water use decisions; (b) evaluation of water transfers between urban supply systems; and (c) evaluation of potential gains by reducing water system losses as a portfolio component. The latter is critical in several developing countries where water supply system losses are high and often neglected in favor of more system expansion.
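
    A toy sketch of the coupling idea, with invented cost curves: the short-term dispatch problem returns a shadow price (Lagrange multiplier) on a binding capacity constraint, and the long-term model expands capacity whenever that marginal value of capacity exceeds an assumed unit expansion cost. This only illustrates the information exchanged between the two levels, not the paper's formulation.

      def short_term_dispatch(demand, cap1, a1=1.0, a2=4.0):
          """Split demand between a cheap capped source (1) and an expensive backup (2).

          Quadratic costs 0.5*a*q**2; returns use levels and the shadow price of cap1."""
          q1 = demand * a2 / (a1 + a2)             # unconstrained split equalizes marginal costs
          if q1 <= cap1:
              return q1, demand - q1, 0.0          # capacity constraint not binding
          q1, q2 = cap1, demand - cap1
          shadow = a2 * q2 - a1 * q1               # marginal value of one more unit of cap1
          return q1, q2, shadow

      def expand_until_worthwhile(demand, cap1, unit_expansion_cost, step=0.1):
          while True:
              _, _, shadow = short_term_dispatch(demand, cap1)
              if shadow <= unit_expansion_cost:    # expansion no longer pays for itself
                  return cap1
              cap1 += step                         # long-term decision: add capacity

      print(expand_until_worthwhile(demand=10.0, cap1=2.0, unit_expansion_cost=5.0))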

  9. Global biodiversity monitoring: from data sources to essential biodiversity variables

    USGS Publications Warehouse

    Proenca, Vania; Martin, Laura J.; Pereira, Henrique M.; Fernandez, Miguel; McRae, Louise; Belnap, Jayne; Böhm, Monika; Brummitt, Neil; Garcia-Moreno, Jaime; Gregory, Richard D.; Honrado, Joao P; Jürgens, Norbert; Opige, Michael; Schmeller, Dirk S.; Tiago, Patricia; van Sway, Chris A

    2016-01-01

    Essential Biodiversity Variables (EBVs) consolidate information from varied biodiversity observation sources. Here we demonstrate the links between data sources, EBVs and indicators and discuss how different sources of biodiversity observations can be harnessed to inform EBVs. We classify sources of primary observations into four types: extensive and intensive monitoring schemes, ecological field studies and satellite remote sensing. We characterize their geographic, taxonomic and temporal coverage. Ecological field studies and intensive monitoring schemes inform a wide range of EBVs, but the former tend to deliver short-term data, while the geographic coverage of the latter is limited. In contrast, extensive monitoring schemes mostly inform the population abundance EBV, but deliver long-term data across an extensive network of sites. Satellite remote sensing is particularly suited to providing information on ecosystem function and structure EBVs. Biases behind data sources may affect the representativeness of global biodiversity datasets. To improve them, researchers must assess data sources and then develop strategies to compensate for identified gaps. We draw on the population abundance dataset informing the Living Planet Index (LPI) to illustrate the effects of data sources on EBV representativeness. We find that long-term monitoring schemes informing the LPI are still scarce outside of Europe and North America and that ecological field studies play a key role in covering that gap. Achieving representative EBV datasets will depend both on the ability to integrate available data, through data harmonization and modeling efforts, and on the establishment of new monitoring programs to address critical data gaps.

  10. Quiet Clean Short-Haul Experimental Engine (QCSEE): Acoustic treatment development and design

    NASA Technical Reports Server (NTRS)

    Clemons, A.

    1979-01-01

    Acoustic treatment designs for the quiet clean short-haul experimental engines are defined. The procedures used in the development of each noise-source suppressor device are presented and discussed in detail. A complete description of all treatment concepts considered and the test facilities utilized in obtaining background data used in treatment development are also described. Additional supporting investigations that are complementary to the treatment development work are presented. The expected suppression results for each treatment configuration are given in terms of delta SPL versus frequency and in terms of delta PNdB.

  11. Upper and lower bounds of ground-motion variabilities: implication for source properties

    NASA Astrophysics Data System (ADS)

    Cotton, Fabrice; Reddy-Kotha, Sreeram; Bora, Sanjay; Bindi, Dino

    2017-04-01

    One of the key challenges of seismology is to be able to analyse the physical factors that control earthquake and ground-motion variabilities. Such analysis is particularly important to calibrate physics-based simulations and seismic hazard estimations at high frequencies. Within the framework of ground-motion prediction equation (GMPE) development, ground-motion residuals (differences between recorded ground motions and the values predicted by a GMPE) are computed. The exponential growth of seismological near-source records and modern GMPE analysis techniques allows these residuals to be partitioned into between-event and within-event components. In particular, the between-event term quantifies all those repeatable source effects (e.g. related to stress-drop or kappa-source variability) which have not been accounted for by the magnitude-dependent term of the model. In this presentation, we first discuss the between-event variabilities computed both in the Fourier and response-spectra domains, using recent high-quality global accelerometric datasets (e.g. NGA-West2, RESORCE, KiK-net). These analyses lead to the assessment of upper bounds for the ground-motion variability. Then, we compare these upper bounds with lower bounds estimated by analysing seismic sequences which occurred on specific fault systems (e.g., located in Central Italy or in Japan). We show that the lower bounds of between-event variabilities are surprisingly large, which indicates a large variability of earthquake dynamic properties even within the same fault system. Finally, these upper and lower bounds of ground-shaking variability are discussed in terms of variability of earthquake physical properties (e.g., stress-drop and kappa-source).
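
    A compact sketch of the residual partition on synthetic data: the between-event terms are approximated here by per-event mean residuals rather than by the mixed-effects (e.g. REML) estimation normally used in GMPE work, so the numbers only illustrate the decomposition.

      import numpy as np

      rng = np.random.default_rng(0)
      n_events, n_records = 50, 20
      tau_true, phi_true = 0.3, 0.5                        # between- / within-event std. dev.
      event_terms = rng.normal(0.0, tau_true, n_events)
      residuals = event_terms[:, None] + rng.normal(0.0, phi_true, (n_events, n_records))

      between = residuals.mean(axis=1)                     # approximate event terms
      within = residuals - between[:, None]                # record-to-record variability
      print("tau ~", round(between.std(ddof=1), 2), "phi ~", round(within.std(ddof=1), 2))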

  12. Liquid-metal-ion source development for space propulsion at ARC.

    PubMed

    Tajmar, M; Scharlemann, C; Genovese, A; Buldrini, N; Steiger, W; Vasiljevich, I

    2009-04-01

    The Austrian Research Centers have a long history of developing indium Liquid-Metal-Ion Sources (LMIS) for space applications including spacecraft charging compensators, SIMS, and propulsion. Specifically, the application as a thruster requires long-term operation as well as high-current operation, which is very challenging. Recently, we demonstrated the operation of a cluster of single LMIS at an average current of 100 μA each for more than 4800 h and developed models for tip erosion and droplet deposition suggesting that such a LMIS can operate up to 20,000 h or more. In order to drastically increase the current, a porous multi-tip source that allows operation up to several mA was developed. Our paper will highlight the problem areas and challenges from our LMIS development focusing on space propulsion applications.

  13. Physical/chemical closed-loop water-recycling for long-duration missions

    NASA Technical Reports Server (NTRS)

    Herrmann, Cal C.; Wydeven, Ted

    1990-01-01

    Water needs, water sources, and means for recycling water are examined in terms appropriate to the water quality requirements of a small crew and spacecraft intended for long duration exploration missions. Inorganic, organic, and biological hazards are estimated for waste water sources. Sensitivities to these hazards for human uses are estimated. The water recycling processes considered are humidity condensation, carbon dioxide reduction, waste oxidation, distillation, reverse osmosis, pervaporation, electrodialysis, ion exchange, carbon sorption, and electrochemical oxidation. Limitations and applications of these processes are evaluated in terms of water quality objectives. Computerized simulation of some of these chemical processes is examined. Recommendations are made for development of new water recycling technology and improvement of existing technology for near-term application to life support systems for humans in space. The technological developments are equally applicable to water needs on Earth, in regions where extensive water recycling is needed or where advanced water treatment is essential to meet EPA health standards.

  14. Reflections on the role of open source in health information system interoperability.

    PubMed

    Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G

    2007-01-01

    This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.

  15. A model for jet-noise analysis using pressure-gradient correlations on an imaginary cone

    NASA Technical Reports Server (NTRS)

    Norum, T. D.

    1974-01-01

    The technique for determining the near and far acoustic field of a jet through measurements of pressure-gradient correlations on an imaginary conical surface surrounding the jet is discussed. The necessary analytical developments are presented, and their feasibility is checked by using a point source as the sound generator. The distribution of the apparent sources on the cone, equivalent to the point source, is determined in terms of the pressure-gradient correlations.

  16. Term Paper Resource Guide to Twentieth-Century United States History.

    ERIC Educational Resources Information Center

    Muccigrosso, Robert; Blazek, Ron; Maggio, Teri

    Geared to the needs of high school and undergraduate students, this guide presents 500 term paper ideas and print and nonprint sources on 20th-century U.S. history. Entries on 100 of the most important topics, issues, and developments in U.S. history, beginning with the Spanish-American War, are organized in chronological order. Each entry…

  17. Energy for Development: Third World Options. Worldwatch Paper 15.

    ERIC Educational Resources Information Center

    Hayes, Denis

    Focusing on the need for energy to sustain economic development on a long-term basis, the document examines energy options of the post-petroleum era in developing nations. Nuclear power and solar power are the most important among proposed alternative energy sources. Limited applicability of nuclear technology to the Third World is discussed.…

  18. Path to Market for Compact Modular Fusion Power Cores

    NASA Astrophysics Data System (ADS)

    Woodruff, Simon; Baerny, Jennifer K.; Mattor, Nathan; Stoulil, Don; Miller, Ronald; Marston, Theodore

    2012-08-01

    The benefits of an energy source whose reactants are plentiful and whose products are benign are hard to measure, but at no time in history has this energy source been more needed. Nuclear fusion continues to promise to be this energy source. However, the path to market for fusion systems is still regularly a matter for long-term (20+ year) plans. This white paper is intended to stimulate discussion of faster commercialization paths, distilling guidance from investors, utilities, and the wider energy research community (including from ARPA-E). There is great interest in a small modular fusion system that can be developed quickly and inexpensively. A simple model shows how compact modular fusion can produce a low-cost development path by optimizing traditional systems that burn deuterium and tritium, operating not only at high magnetic field strength, but also by omitting some components so that the core becomes more compact and easier to maintain. The dominant hurdles to the development of low-cost, practical fusion systems are discussed, primarily in terms of the constraints placed on the cost of development stages in the private sector. The main finding presented here is that the bridge from the DOE Office of Science to the energy market can come at the Proof of Principle development stage, provided the concept is sufficiently compact and inexpensive that its development allows for a normal technology commercialization path.

  19. Performance evaluation of a permanent ring magnet based helicon plasma source for negative ion source research

    NASA Astrophysics Data System (ADS)

    Pandey, Arun; Bandyopadhyay, M.; Sudhir, Dass; Chakraborty, A.

    2017-10-01

    Helicon wave heated plasmas are much more efficient in terms of ionization per unit power consumed. A permanent magnet based compact helicon wave heated plasma source is developed in the Institute for Plasma Research, after carefully optimizing the geometry, the frequency of the RF power, and the magnetic field conditions. The HELicon Experiment for Negative ion-I source is the single driver helicon plasma source that is being studied for the development of a large sized, multi-driver negative hydrogen ion source. In this paper, the details about the single driver machine and the results from the characterization of the device are presented. A parametric study at different pressures and magnetic field values using a 13.56 MHz RF source has been carried out in argon plasma, as an initial step towards source characterization. A theoretical model is also presented for the particle and power balance in the plasma. The ambipolar diffusion process taking place in a magnetized helicon plasma is also discussed.

  20. Performance evaluation of a permanent ring magnet based helicon plasma source for negative ion source research.

    PubMed

    Pandey, Arun; Bandyopadhyay, M; Sudhir, Dass; Chakraborty, A

    2017-10-01

    Helicon wave heated plasmas are much more efficient in terms of ionization per unit power consumed. A permanent magnet based compact helicon wave heated plasma source is developed in the Institute for Plasma Research, after carefully optimizing the geometry, the frequency of the RF power, and the magnetic field conditions. The HELicon Experiment for Negative ion-I source is the single driver helicon plasma source that is being studied for the development of a large sized, multi-driver negative hydrogen ion source. In this paper, the details about the single driver machine and the results from the characterization of the device are presented. A parametric study at different pressures and magnetic field values using a 13.56 MHz RF source has been carried out in argon plasma, as an initial step towards source characterization. A theoretical model is also presented for the particle and power balance in the plasma. The ambipolar diffusion process taking place in a magnetized helicon plasma is also discussed.

  1. Filtered Mass Density Function for Design Simulation of High Speed Airbreathing Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Drozda, T. G.; Sheikhi, R. M.; Givi, Peyman

    2001-01-01

    The objective of this research is to develop and implement new methodology for large eddy simulation (LES) of high-speed reacting turbulent flows. We have just completed two (2) years of Phase I of this research. This annual report provides a brief and up-to-date summary of our activities during the period September 1, 2000 through August 31, 2001. In the work within the past year, a methodology termed "velocity-scalar filtered density function" (VSFDF) is developed and implemented for large eddy simulation (LES) of turbulent flows. In this methodology the effects of the unresolved subgrid scales (SGS) are taken into account by considering the joint probability density function (PDF) of all of the components of the velocity and scalar vectors. An exact transport equation is derived for the VSFDF in which the effects of the unresolved SGS convection, SGS velocity-scalar source, and SGS scalar-scalar source terms appear in closed form. The remaining unclosed terms in this equation are modeled. A system of stochastic differential equations (SDEs) which yields statistically equivalent results to the modeled VSFDF transport equation is constructed. These SDEs are solved numerically by a Lagrangian Monte Carlo procedure. The consistency of the proposed SDEs and the convergence of the Monte Carlo solution are assessed by comparison with results obtained by an Eulerian LES procedure in which the corresponding transport equations for the first two SGS moments are solved. The unclosed SGS convection, SGS velocity-scalar source, and SGS scalar-scalar source terms in the Eulerian LES are replaced by corresponding terms from the VSFDF equation. The consistency of the results is then analyzed for the case of a two-dimensional mixing layer.
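
    A generic sketch of the Lagrangian Monte Carlo machinery only: notional particles carrying position, velocity, and a scalar are advanced with an Euler-Maruyama step of a Langevin-type SDE. The drift and diffusion coefficients are placeholders and do not represent the closed VSFDF model of the report.

      import numpy as np

      def advance_particles(x, u, phi, dt, relax_u=5.0, relax_phi=2.0, noise=0.2, rng=None):
          rng = rng or np.random.default_rng()
          x = x + u * dt                                              # move particles
          u = u - relax_u * u * dt + noise * np.sqrt(dt) * rng.standard_normal(u.shape)
          phi = phi - relax_phi * (phi - phi.mean()) * dt             # placeholder scalar mixing
          return x, u, phi

      rng = np.random.default_rng(1)
      x = np.zeros(10000)
      u = rng.standard_normal(10000)
      phi = rng.uniform(0.0, 1.0, 10000)
      for _ in range(200):
          x, u, phi = advance_particles(x, u, phi, dt=1e-3, rng=rng)
      print(u.var(), phi.var())    # SGS moments are estimated from the particle ensemble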

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barry, Kenneth

    The Nuclear Energy Institute (NEI) Small Modular Reactor (SMR) Licensing Task Force (TF) has been evaluating licensing issues unique and important to iPWRs, ranking these issues, and developing NEI position papers for submittal to the U.S. Nuclear Regulatory Commission (NRC) during the past three years. Papers have been developed and submitted to the NRC in a range of areas including: Price-Anderson Act, NRC annual fees, security, modularity, and staffing. In December 2012, NEI completed a draft position paper on SMR source terms and participated in an NRC public meeting presenting a summary of this paper, which was subsequently submitted to the NRC. One important conclusion of the source term paper was the evaluation and selection of high-importance areas where additional research would have a significant impact on source terms. The highest ranked research area was iPWR containment aerosol natural deposition. The NRC accepts the use of existing aerosol deposition correlations in Regulatory Guide 1.183, but these were developed for large light water reactor (LWR) containments. Application of these correlations to an iPWR design has resulted in greater than a ten-fold reduction of containment airborne aerosol inventory as compared to large LWRs. Development and experimental justification of containment aerosol natural deposition correlations specifically for the unique iPWR containments is expected to result in a large reduction of design basis and beyond-design-basis accident source terms with concomitantly smaller dose to workers and the public. Therefore, NRC acceptance of iPWR containment aerosol natural deposition correlations will directly support the industry's goal of reducing the Emergency Planning Zone (EPZ) for SMRs. Based on the results in this work, it is clear that thermophoresis is relatively unimportant for iPWRs. Gravitational settling is well understood, and may be the dominant process for a dry environment. Diffusiophoresis and enhanced settling by particle growth are the dominant processes for determining DFs for expected conditions in an iPWR containment. These processes are dependent on the area-to-volume (A/V) ratio, which should benefit iPWR designs because these reactors have higher A/Vs compared to existing LWRs.

  3. Accuracy-preserving source term quadrature for third-order edge-based discretization

    NASA Astrophysics Data System (ADS)

    Nishikawa, Hiroaki; Liu, Yi

    2017-09-01

    In this paper, we derive a family of source term quadrature formulas for preserving third-order accuracy of the node-centered edge-based discretization for conservation laws with source terms on arbitrary simplex grids. A three-parameter family of source term quadrature formulas is derived, and as a subset, a one-parameter family of economical formulas is identified that does not require second derivatives of the source term. Among the economical formulas, a unique formula is then derived that does not require gradients of the source term at neighbor nodes, thus leading to a significantly smaller discretization stencil for source terms. All the formulas derived in this paper do not require a boundary closure, and therefore can be directly applied at boundary nodes. Numerical results are presented to demonstrate third-order accuracy at interior and boundary nodes for one-dimensional grids and linear triangular/tetrahedral grids over straight and curved geometries.

  4. Sources of Evidence-of-Learning: Learning and Assessment in the Era of Big Data

    ERIC Educational Resources Information Center

    Cope, Bill; Kalantzis, Mary

    2015-01-01

    This article sets out to explore a shift in the sources of evidence-of-learning in the era of networked computing. One of the key features of recent developments has been popularly characterized as "big data". We begin by examining, in general terms, the frame of reference of contemporary debates on machine intelligence and the role of…

  5. INEEL Subregional Conceptual Model Report Volume 3: Summary of Existing Knowledge of Natural and Anthropogenic Influences on the Release of Contaminants to the Subsurface Environment from Waste Source Terms at the INEEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul L. Wichlacz

    2003-09-01

    This source-term summary document is intended to describe the current understanding of contaminant source terms and the conceptual model for potential source-term release to the environment at the Idaho National Engineering and Environmental Laboratory (INEEL), as presented in published INEEL reports. The document presents a generalized conceptual model of the sources of contamination and describes the general categories of source terms, primary waste forms, and factors that affect the release of contaminants from the waste form into the vadose zone and Snake River Plain Aquifer. Where the information has previously been published and is readily available, summaries of the inventory of contaminants are also included. Uncertainties that affect the estimation of the source term release are also discussed where they have been identified by the Source Term Technical Advisory Group. Areas in which additional information is needed (i.e., research needs) are also identified.

  6. A Need for a Theory of Visual Literacy.

    ERIC Educational Resources Information Center

    Hortin, John A.

    1982-01-01

    Examines sources available for developing a theory of visual literacy and attempts to clarify the meaning of the term. Suggests that visual thinking, a concept supported by recent research on mental imagery, visualization, and dual coding, ought to be the emphasis for future theory development. (FL)

  7. Tracing the source of difficult to settle fine particles which cause turbidity in the Hitotsuse Reservoir, Japan.

    PubMed

    Murakami, Toshiki; Suzuki, Yoshihiro; Oishi, Hiroyuki; Ito, Kenichi; Nakao, Toshio

    2013-05-15

    A unique method to trace the source of "difficult-to-settle fine particles," which are a causative factor of long-term turbidity in reservoirs, was developed. This method is characterized by cluster analysis of XRD (X-ray diffraction) data and homology comparison of major component compositions between "difficult-to-settle fine particles" contained in landslide soil samples taken from upstream of a dam and suspended "long-term turbid water particles" in the reservoir, which is subject to long-term turbidity. The experiment carried out to validate the proposed method demonstrated a high likelihood of an almost identical match between "difficult-to-settle fine particles" taken from landslide soils at specific locations and "long-term turbid water particles" taken from a reservoir. This method has the potential to identify the substances causing long-term turbidity and the locations of the soils from which those substances came. Appropriate countermeasures can then be taken at those specific locations. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. A Methodology for the Integration of a Mechanistic Source Term Analysis in a Probabilistic Framework for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk-significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk-significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.

  9. Seismic hazard assessment over time: Modelling earthquakes in Taiwan

    NASA Astrophysics Data System (ADS)

    Chan, Chung-Han; Wang, Yu; Wang, Yu-Ju; Lee, Ya-Ting

    2017-04-01

    To assess the seismic hazard with temporal change in Taiwan, we develop a new approach combining the Brownian Passage Time (BPT) model and the Coulomb stress change, and implement the seismogenic source parameters from the Taiwan Earthquake Model (TEM). The BPT model was adopted to describe the rupture recurrence intervals of the specific fault sources, together with the time elapsed since the last fault rupture, to derive their long-term rupture probability. We also evaluate the short-term seismicity rate change based on the static Coulomb stress interaction between seismogenic sources. By considering the above time-dependent factors, our new combined model suggests an increased long-term seismic hazard in the vicinity of active faults along the western Coastal Plain and the Longitudinal Valley, where active faults have short recurrence intervals and long elapsed times since their last ruptures, and/or short-term elevated hazard levels right after the occurrence of large earthquakes due to the stress-triggering effect. The stress enhanced by the February 6, 2016, Meinong ML 6.6 earthquake also significantly increased rupture probabilities of several neighbouring seismogenic sources in southwestern Taiwan and raised the hazard level in the near future. Our approach draws on the advantage of incorporating long- and short-term models to provide time-dependent earthquake probability constraints. Our time-dependent model considers more detailed information than any other published model. It thus offers decision-makers and public officials an adequate basis for rapid evaluations of and responses to future emergency scenarios such as victim relocation and sheltering.
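
    A small sketch of the long-term (BPT) part only: the conditional probability of rupture in the next interval, given the time elapsed since the last rupture, using the equivalence of the BPT distribution with an inverse Gaussian (here via scipy). The recurrence interval and aperiodicity values are invented, not TEM fault-source parameters, and the Coulomb stress adjustment is not included.

      from scipy.stats import invgauss

      def bpt_conditional_probability(elapsed, dt, mean_recurrence, aperiodicity):
          # BPT = inverse Gaussian with mean T and shape T / alpha**2;
          # scipy's invgauss(mu, scale=lam) has mean mu*lam and shape lam.
          lam = mean_recurrence / aperiodicity**2
          dist = invgauss(mean_recurrence / lam, scale=lam)
          return (dist.cdf(elapsed + dt) - dist.cdf(elapsed)) / dist.sf(elapsed)

      # Hypothetical fault: 200-year mean recurrence, aperiodicity 0.5, 150 years elapsed.
      print(bpt_conditional_probability(elapsed=150.0, dt=50.0,
                                        mean_recurrence=200.0, aperiodicity=0.5))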

  10. PHENOstruct: Prediction of human phenotype ontology terms using heterogeneous data sources.

    PubMed

    Kahanda, Indika; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa

    2015-01-01

    The human phenotype ontology (HPO) was recently developed as a standardized vocabulary for describing the phenotype abnormalities associated with human diseases. At present, only a small fraction of human protein coding genes have HPO annotations. But, researchers believe that a large portion of currently unannotated genes are related to disease phenotypes. Therefore, it is important to predict gene-HPO term associations using accurate computational methods. In this work we demonstrate the performance advantage of the structured SVM approach which was shown to be highly effective for Gene Ontology term prediction in comparison to several baseline methods. Furthermore, we highlight a collection of informative data sources suitable for the problem of predicting gene-HPO associations, including large scale literature mining data.

  11. Leveraging terminological resources for mapping between rare disease information sources.

    PubMed

    Rance, Bastien; Snyder, Michelle; Lewis, Janine; Bodenreider, Olivier

    2013-01-01

    Rare disease information sources are incompletely and inconsistently cross-referenced to one another, making it difficult for information seekers to navigate across them. The development of such cross-references, established manually by experts, is generally labor intensive and costly. Our objective is to develop an automatic mapping between two of the major rare disease information sources, GARD and Orphanet, by leveraging terminological resources, especially the UMLS. We map the rare disease terms from Orphanet and ORDR to the UMLS. We use the UMLS as a pivot to bridge between the rare disease terminologies. We compare our results to a mapping obtained through manually established cross-references to OMIM. Our mapping has a precision of 94%, a recall of 63%, and an F1-score of 76%. Our automatic mapping should help facilitate the development of more complete and consistent cross-references between GARD and Orphanet, and is applicable to other rare disease information sources as well.
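
    A toy sketch of the pivot idea and the scoring: each source term is joined to a target term through a shared UMLS concept identifier (CUI), and the resulting mapping is compared against a manually established reference. All identifiers and counts below are invented for illustration.

      def map_via_pivot(source_to_cui, cui_to_target):
          # keep only source terms whose UMLS concept also maps to a target term
          return {s: cui_to_target[c] for s, c in source_to_cui.items() if c in cui_to_target}

      def precision_recall_f1(predicted, reference):
          tp = sum(1 for k, v in predicted.items() if reference.get(k) == v)
          precision = tp / len(predicted) if predicted else 0.0
          recall = tp / len(reference) if reference else 0.0
          f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
          return precision, recall, f1

      gard_to_cui = {"GARD:1": "C01", "GARD:2": "C02", "GARD:3": "C03"}   # invented
      cui_to_orpha = {"C01": "ORPHA:11", "C02": "ORPHA:22"}               # invented
      reference = {"GARD:1": "ORPHA:11", "GARD:2": "ORPHA:22", "GARD:3": "ORPHA:33"}
      print(precision_recall_f1(map_via_pivot(gard_to_cui, cui_to_orpha), reference))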

  12. Residual Gas in Closed Systems. III: Development and Reduction of Gases Generated by Source Materials

    NASA Technical Reports Server (NTRS)

    Palosz, W.

    2003-01-01

    The amounts and composition of residual gases formed in sealed ampoules loaded with different sources (elements and II-VI and IV-VI compounds) after consecutive annealings were investigated. A given source was subjected to a series of heat treatments, with intermediate measurements and removal of the gas accumulated in the system. The results of these experiments are discussed in terms of the underlying thermochemical and kinetic phenomena and practical limitations of reducing the amount of residual gases in sealed ampoules.

  13. Tracking Nitrogen Sources, Transformation, and Transport at a Basin Scale with Complex Plain River Networks.

    PubMed

    Yi, Qitao; Chen, Qiuwen; Hu, Liuming; Shi, Wenqing

    2017-05-16

    This research developed an innovative approach to reveal nitrogen sources, transformation, and transport in large and complex river networks in the Taihu Lake basin using measurements of the dual stable isotopes of nitrate. The spatial patterns of δ15N corresponded to the urbanization level, and the nitrogen cycle was associated with the hydrological regime at the basin level. During the high-flow season of summer, nonpoint sources from fertilizer/soils and atmospheric deposition constituted the highest proportion of the total nitrogen load. The point sources from sewage/manure, with high ammonium concentrations and high δ15N and δ18O contents in the form of nitrate, accounted for the largest inputs among all sources during the low-flow season of winter. Hot-spot areas with heavy point-source pollution were identified, and the pollutant transport routes were revealed. Nitrification occurred widely during the warm seasons, with decreased δ18O values, whereas great potential for denitrification existed during the low-flow seasons of autumn and spring. The study showed that point-source reduction could have effects over the short term; however, long-term efforts to substantially control agricultural nonpoint sources are essential to eutrophication alleviation for the receiving lake, which clarifies the relationship between point- and nonpoint-source control.
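
    As a minimal illustration of the isotope-based apportionment idea, a two-end-member linear mixing calculation is shown below; the end-member δ15N values and the sample value are assumptions, and the study's dual-isotope analysis is considerably richer than this.

      def source_fraction(delta_sample, delta_source_a, delta_source_b):
          """Fraction of nitrate attributable to source A under simple two-end-member mixing."""
          return (delta_sample - delta_source_b) / (delta_source_a - delta_source_b)

      # Assumed end members: sewage/manure ~ +12 permil, fertilizer/soil ~ +3 permil.
      print(source_fraction(delta_sample=8.0, delta_source_a=12.0, delta_source_b=3.0))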

  14. LES-Modeling of a Partially Premixed Flame using a Deconvolution Turbulence Closure

    NASA Astrophysics Data System (ADS)

    Wang, Qing; Wu, Hao; Ihme, Matthias

    2015-11-01

    The modeling of the turbulence/chemistry interaction in partially premixed and multi-stream combustion remains an outstanding issue. By extending a recently developed constrained minimum mean-square error deconvolution (CMMSED) method, the objective of this work is to develop a source-term closure for turbulent multi-stream combustion. In this method, the chemical source term is obtained from a three-stream flamelet model, and CMMSED is used as the closure model, thereby eliminating the need for presumed-PDF modeling. The model is applied to LES of a piloted turbulent jet flame with inhomogeneous inlets, and simulation results are compared with experiments. Comparisons with presumed-PDF methods are performed, and issues regarding resolution and conservation of the CMMSED method are examined. The author would like to acknowledge the support of funding from the Stanford Graduate Fellowship.

  15. High current liquid metal ion source using porous tungsten multiemitters.

    PubMed

    Tajmar, M; Vasiljevich, I; Grienauer, W

    2010-12-01

    We recently developed an indium Liquid-Metal-Ion-Source that can emit currents from sub-μA up to several mA. It is based on a porous tungsten crown structure with 28 individual emitters, which is manufactured using Micro-Powder Injection Molding (μPIM) and electrochemical etching. The emitter combines the advantages of internal capillary feeding with excellent emission properties due to micron-size tips. Significant progress was made on the homogeneity of the emission over its current-voltage characteristic as well as on investigating its long-term stability. This LMIS seems very suitable for space propulsion as well as for micro/nano manufacturing applications with greatly increased milling/drilling speeds. This paper summarizes the latest developments on our porous multiemitters with respect to manufacturing, emission properties and long-term testing. Copyright © 2010 Elsevier B.V. All rights reserved.

  16. Comparative genomic and physiological analysis of nutrient response to NH4+, NH4+:NO3- and NO3- in barley seedlings.

    PubMed

    Lopes, Marta S; Araus, José L

    2008-09-01

    Long-term differences in photosynthesis, respiration and growth of plants receiving distinct nitrogen (N) sources imply that N metabolism generates signals that regulate metabolism and development. The molecular basis of these signals remains unclear. Here we studied the gene expression profiles of barley (Hordeum vulgare L. cv. Graphic) seedlings fertilized either with ammonium (NH4+), with ammonium and nitrate (NH4+:NO3-), or with nitrate (NO3-) only. Our transcriptome analysis after 48 h of growth in these N sources showed major changes in the expression of genes involved in N metabolism (nitrate reductase), signalling (protein kinases and protein phosphatases), and photosynthesis (chlorophyll a/b-binding protein and a PsbQ domain), with increases in NO3- as compared with NH4+. Moreover, NH4+ assimilation induced genes participating in C and sugar metabolism (phosphoglycerate kinase, glucosyltransferase and galactokinase), respiration (cytochrome c oxidase), protein fate (heat shock proteins) and development (MTN3-like protein). These changes in gene expression could well explain the long-term growth depression observed in NH4+ plants. Even though a few genes participating in protein fate (proteases) and development (OsNAC5) were upregulated in NH4+ as compared with NH4+:NO3-, the general pattern of expression was quite similar between these two N sources. Taken together, these results indicated that other downstream mechanisms should be involved in the synergistic long-term response to NH4+:NO3-.

  17. Photovoltaics as a terrestrial energy source. Volume 3: An overview

    NASA Technical Reports Server (NTRS)

    Smith, J. L.

    1980-01-01

    Photovoltaic (PV) systems were evaluated in terms of their potential for terrestrial application. A comprehensive overview of important issues which bear on photovoltaic (PV) system development is presented. Studies of PV system costs, the societal implications of PV system development, and strategies in PV research and development in relationship to current energy policies are summarized.

  18. Development of a wireless air pollution sensor package for aerial-sampling of emissions

    EPA Science Inventory

    A new sensor system for mobile and aerial emission sampling was developed for open area pollutant sources, such as prescribed forest burns. The sensor system, termed “Kolibri”, consists of multiple low-cost air quality sensors measuring CO2, CO, samplers for particulate matter wi...

  19. DEVELOPMENT AND VALIDATION OF AN AIR-TO-BEEF FOOD CHAIN MODEL FOR DIOXIN-LIKE COMPOUNDS

    EPA Science Inventory

    A model for predicting concentrations of dioxin-like compounds in beef is developed and tested. The key premise of the model is that concentrations of these compounds in air are the source term, or starting point, for estimating beef concentrations. Vapor-phase concentrations t...

  20. Managing multicentre clinical trials with open source.

    PubMed

    Raptis, Dimitri Aristotle; Mettler, Tobias; Fischer, Michael Alexander; Patak, Michael; Lesurtel, Mickael; Eshmuminov, Dilmurodjon; de Rougemont, Olivier; Graf, Rolf; Clavien, Pierre-Alain; Breitenstein, Stefan

    2014-03-01

    Multicentre clinical trials are challenged by a high administrative burden, data management pitfalls and costs. This leads to reduced enthusiasm and commitment of the physicians involved and thus to a reluctance to conduct multicentre clinical trials. The purpose of this study was to develop a web-based open source platform to support a multi-centre clinical trial. Using the design science research approach, we developed a web-based, multi-centre clinical trial management system on Drupal, an open source software framework distributed under the terms of the General Public License. This system was evaluated by user testing, has supported several completed and on-going clinical trials, and is available for free download. Open source clinical trial management systems are capable of supporting multi-centre clinical trials by enhancing efficiency, quality of data management and collaboration.

  1. Insights Gained from Forensic Analysis with MELCOR of the Fukushima-Daiichi Accidents.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Nathan C.; Gauntt, Randall O.

    Since the accidents at Fukushima-Daiichi, Sandia National Laboratories has been modeling these accident scenarios using the severe accident analysis code MELCOR. MELCOR is a widely used computer code developed at Sandia National Laboratories since ~1982 for the U.S. Nuclear Regulatory Commission. Insights from the modeling of these accidents are being used to better inform future code development and potentially improve accident management. To date, the need to better capture in-vessel thermal hydraulics and ex-vessel melt coolability and concrete interactions has led to the implementation of new models. The most recent analyses, presented in this paper, have been in support of the Organization for Economic Cooperation and Development Nuclear Energy Agency's (OECD/NEA) Benchmark Study of the Accident at the Fukushima Daiichi Nuclear Power Station (BSAF) Project. The goal of this project is to accurately capture the source term from all three releases and then model the atmospheric dispersion. To do this, a forensic approach is used in which available plant data and release timings inform the modeled MELCOR accident scenario. For example, containment failures, core slumping events and lower head failure timings are all enforced parameters in these analyses. This approach is fundamentally different from the blind code assessment analysis often used in standard problem exercises. The timings of these events are informed by representative spikes or decreases in plant data. The combination of improvements to the MELCOR source code resulting from previous accident analyses and this forensic approach has allowed Sandia to generate representative and plausible source terms for all three accidents at Fukushima Daiichi out to three weeks after the accident, capturing both early and late releases. In particular, using the source terms developed by MELCOR as input to the MACCS software code, which models atmospheric dispersion and deposition, we are able to reasonably capture the deposition of radionuclides to the northwest of the reactor site.

  2. Generation of GHS Scores from TEST and online sources ...

    EPA Pesticide Factsheets

    Alternatives assessment frameworks such as DfE (Design for the Environment) evaluate chemical alternatives in terms of human health effects, ecotoxicity, and fate. T.E.S.T. (Toxicity Estimation Software Tool) can be utilized to evaluate human health in terms of acute oral rat toxicity, developmental toxicity, endocrine activity, and mutagenicity. It can be used to evaluate ecotoxicity (in terms of acute fathead minnow toxicity) and fate (in terms of bioconcentration factor). It can also be used to estimate a variety of key physicochemical properties such as melting point, boiling point, vapor pressure, water solubility, and bioconcentration factor. A web-based version of T.E.S.T. is currently being developed to allow predictions to be made from other web tools. Online data sources such as NCCT’s Chemistry Dashboard, REACH dossiers, or ChemHat.org can also be utilized to obtain GHS (Globally Harmonized System) scores for comparing alternatives. The purpose of this talk is to show how GHS data can be obtained from literature sources and from T.E.S.T. These data will be used to compare chemical alternatives in the alternatives assessment dashboard (a 2018 CSS product).

  3. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for the operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the currently available estimates are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms starting from a broad information gathering. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for dust source term production prediction for future devices is presented.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gydesen, S.P.

    The purpose of this letter report is to reconstruct, from available information, the data that can be used to develop a daily reactor operating history for 1960--1964. The information needed for source term calculations (as determined by the Source Terms Task Leader) was extracted and included in this report. The data on the amount of uranium dissolved by the separations plants (expressed both as tons and as MW) are also included in this compilation.

  5. Green materials for sustainable development

    NASA Astrophysics Data System (ADS)

    Purwasasmita, B. S.

    2017-03-01

    Sustainable development is a multidisciplinary concept combining ecological, social and economic aspects to construct a liveable human living system. Sustainable development can be supported through the development of green materials. Green materials offer unique characteristics and properties, including abundance in nature, low toxicity, economic affordability and versatility in terms of physical and chemical properties. Green materials can be applied in numerous fields of science and technology, including energy, building, construction and infrastructure, materials science and engineering, and pollution management and technology. For instance, green materials can be developed as a source for energy production: biomass-based sources can be developed for biodiesel and bioethanol production. Biomass-based materials can also be transformed into advanced functionalized materials for bio-applications, such as the transformation of chitin into chitosan, which is further used in biomedicine, biomaterials and tissue engineering. Recently, cellulose-based and lignocellulose-based materials have attracted attention as sources for developing functional materials for biomaterials, reinforcing materials and nanotechnology. Furthermore, the development of pigment materials from green material sources has gained interest due to their unique properties. Finally, Indonesia, as a large country with great biodiversity, can advance the development of green materials to strengthen national competitiveness and develop materials technology for the future.

  6. An Improved Elastic and Nonelastic Neutron Transport Algorithm for Space Radiation

    NASA Technical Reports Server (NTRS)

    Clowdsley, Martha S.; Wilson, John W.; Heinbockel, John H.; Tripathi, R. K.; Singleterry, Robert C., Jr.; Shinn, Judy L.

    2000-01-01

    A neutron transport algorithm including both elastic and nonelastic particle interaction processes for use in space radiation protection for arbitrary shield material is developed. The algorithm is based upon a multiple energy grouping and analysis of the straight-ahead Boltzmann equation by using a mean value theorem for integrals. The algorithm is then coupled to the Langley HZETRN code through a bidirectional neutron evaporation source term. Evaluation of the neutron fluence generated by the solar particle event of February 23, 1956, for an aluminum-water shield-target configuration is then compared with MCNPX and LAHET Monte Carlo calculations for the same shield-target configuration. With the Monte Carlo calculation as a benchmark, the algorithm developed in this paper showed a great improvement in results over the unmodified HZETRN solution. In addition, a high-energy bidirectional neutron source based on a formula by Ranft showed even further improvement of the fluence results over previous results near the front of the water target, where diffusion out of the front surface is important. Effects of improved interaction cross sections are modest compared with the addition of the high-energy bidirectional source terms.

  7. Analysis of streamflow distribution of non-point source nitrogen export from long-term urban-rural catchments to guide watershed management in the Chesapeake Bay watershed

    NASA Astrophysics Data System (ADS)

    Duncan, J. M.; Band, L. E.; Groffman, P.

    2017-12-01

    Discharge, land use, and watershed management practices (stream restoration and stormwater control measures) have been found to be important determinants of nitrogen (N) export to receiving waters. We used long-term water quality stations from the Baltimore Ecosystem Study Long-Term Ecological Research (BES LTER) site to quantify nitrogen export across streamflow conditions at the small watershed scale. We calculated nitrate and total nitrogen fluxes using a methodology that allows for changes over time: weighted regressions on time, discharge, and seasonality. Here we tested the hypotheses that (a) while the largest N stream fluxes occur during storm events, there is not a clear relationship between N flux and discharge, and (b) N export patterns are aseasonal in developed watersheds where sources are larger and retention capacity is lower. The goal is to scale understanding from small watersheds to larger ones. Developing a better understanding of hydrologic controls on nitrogen export is essential for successful adaptive watershed management at societally meaningful spatial scales.
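    A minimal sketch of the regression structure underlying such flux estimates is given below; it fits a single global model of ln(concentration) on time, ln(discharge) and seasonal harmonics with NumPy, whereas the weighted-regression method cited in the abstract refits this relation locally with weights. All data in the example are synthetic.

        import numpy as np

        # ln(C) = b0 + b1*t + b2*ln(Q) + b3*sin(2*pi*t) + b4*cos(2*pi*t)
        def fit_concentration_model(t_years, q, conc):
            X = np.column_stack([
                np.ones_like(t_years),
                t_years,
                np.log(q),
                np.sin(2 * np.pi * t_years),
                np.cos(2 * np.pi * t_years),
            ])
            beta, *_ = np.linalg.lstsq(X, np.log(conc), rcond=None)
            return beta, X

        # Hypothetical daily record (decimal years, discharge in m3/s, nitrate in mg/L)
        rng = np.random.default_rng(0)
        t = np.linspace(2000.0, 2010.0, 3650)
        q = np.exp(rng.normal(1.0, 0.5, t.size))
        c = np.exp(0.2 + 0.01 * (t - 2000) - 0.3 * np.log(q)
                   + 0.1 * np.sin(2 * np.pi * t) + rng.normal(0, 0.1, t.size))

        beta, X = fit_concentration_model(t, q, c)
        flux = np.exp(X @ beta) * q * 86.4        # mg/L * m3/s * 86.4 -> kg/day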

  8. Development and Performance of a Filter Radiometer Monitor System for Integrating Sphere Sources

    NASA Technical Reports Server (NTRS)

    Ding, Leibo; Kowalewski, Matthew G.; Cooper, John W.; Smith, GIlbert R.; Barnes, Robert A.; Waluschka, Eugene; Butler, James J.

    2011-01-01

    The NASA Goddard Space Flight Center (GSFC) Radiometric Calibration Laboratory (RCL) maintains several large integrating sphere sources covering the visible to the shortwave infrared wavelength range. Two critical functional requirements of an integrating sphere source are short- and long-term operational stability and repeatability. Monitoring the source is essential in determining the origin of systematic errors, thus increasing confidence in source performance and quantifying repeatability. If monitor data fall outside the established parameters, this could be an indication that the source requires maintenance or re-calibration against the National Institute of Standards and Technology (NIST) irradiance standard. The GSFC RCL has developed a Filter Radiometer Monitoring System (FRMS) to continuously monitor the performance of its integrating sphere calibration sources in the 400 to 2400 nm region. Sphere output change mechanisms include lamp aging, coating (e.g. BaSO4) deterioration, and ambient water vapor level. The FRMS wavelength bands are selected to quantify changes caused by these mechanisms. The FRMS design and operation are presented, as well as data from monitoring four of the RCL's integrating sphere sources.

  9. Evaluating soybean breeding lines developed from different sources of resistance to phomopsis seed decay

    USDA-ARS?s Scientific Manuscript database

    Phomopsis seed decay (PSD) causes poor soybean seed quality worldwide. The primary causal agent of PSD is Phomopsis longicolla (syn. Diaporthe longicolla). Breeding for PSD-resistance is the most effective long-term strategy to control this disease. To develop soybean lines with resistance to PSD, m...

  10. On the Development of Citizenship Education Outlook in China

    ERIC Educational Resources Information Center

    Xiaoman, Zhu; Xiujun, Feng

    2008-01-01

    A source-identifying and comparative study of the development of the outlook on citizenship education in China and the Western countries indicates that there emerges a tendency of similar orientations in terms of relations between citizens and the state and society, between citizens' rights and obligations and between citizenship education and…

  11. Applied research opportunities in developed campgrounds

    Treesearch

    Carl P. Wiedemann

    2002-01-01

    Developed area camping is an important recreational activity in terms of both participation and as a source of revenue for public agencies. A major challenge for administrators in the public sector is how to increase revenues on limited budgets without sacrificing customer satisfaction. Applied research could make a valuable contribution to decision making, but not...

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.C. Ryman

    This calculation is a revision of a previous calculation (Ref. 7.5) that bears the same title and has the document identifier BBAC00000-01717-0210-00006 REV 01. The purpose of this revision is to remove TBV (to-be-verified) -41 10 associated with the output files of the previous version (Ref. 7.30). The purpose of this and the previous calculation is to generate source terms for a representative boiling water reactor (BWR) spent nuclear fuel (SNF) assembly for the first one million years after the SNF is discharged from the reactors. This calculation includes an examination of several ways to represent BWR assemblies and operating conditions in SAS2H in order to quantify the effects these representations may have on source terms. These source terms provide information characterizing the neutron and gamma spectra in particles per second, the decay heat in watts, and radionuclide inventories in curies. Source terms are generated for a range of burnups and enrichments (see Table 2) that are representative of the waste stream and stainless steel (SS) clad assemblies. During this revision, it was determined that the burnups used for the computer runs of the previous revision were actually about 1.7% less than the stated, or nominal, burnups. See Section 6.6 for a discussion of how to account for this effect before using any source terms from this calculation. The source term due to the activation of corrosion products deposited on the surfaces of the assembly from the coolant is also calculated. The results of this calculation support many areas of the Monitored Geologic Repository (MGR), including thermal evaluation, radiation dose determination, radiological safety analyses, surface and subsurface facility designs, and total system performance assessment. This includes MGR items classified as Quality Level 1, for example, the Uncanistered Spent Nuclear Fuel Disposal Container (Ref. 7.27, page 7). Therefore, this calculation is subject to the requirements of the Quality Assurance Requirements and Description (Ref. 7.28). The performance of the calculation and development of this document are carried out in accordance with AP-3.124, ''Design Calculation and Analyses'' (Ref. 7.29).

  13. Development of episodic and autobiographical memory: The importance of remembering forgetting

    PubMed Central

    Bauer, Patricia J.

    2015-01-01

    Some memories of the events of our lives have a long shelf-life—they remain accessible to recollection even after long delays. Yet many other of our experiences are forgotten, sometimes very soon after they take place. In spite of the prevalence of forgetting, theories of the development of episodic and autobiographical memory largely ignore it as a potential source of variance in explanation of age-related variability in long-term recall. They focus instead on what may be viewed as positive developmental changes, that is, changes that result in improvements in the quality of memory representations that are formed. The purpose of this review is to highlight the role of forgetting as an important variable in understanding the development of episodic and autobiographical memory. Forgetting processes are implicated as a source of variability in long-term recall due to the protracted course of development of the neural substrate responsible for transformation of fleeting experiences into memory traces that can be integrated into long-term stores and retrieved at later points in time. It is logical to assume that while the substrate is developing, neural processing is relatively inefficient and ineffective, resulting in loss of information from memory (i.e., forgetting). For this reason, focus on developmental increases in the quality of representations of past events and experiences will tell only a part of the story of how memory develops. A more complete account is afforded when we also consider changes in forgetting. PMID:26644633

  14. Next Generation of Leaching Tests

    EPA Science Inventory

    A corresponding abstract has been cleared for this presentation. The four methods comprising the Leaching Environmental Assessment Framework are described along with the tools to support implementation of the more rigorous and accurate source terms that are developed using LEAF ...

  15. Darwin Core: An Evolving Community-Developed Biodiversity Data Standard

    PubMed Central

    Wieczorek, John; Bloom, David; Guralnick, Robert; Blum, Stan; Döring, Markus; Giovanni, Renato; Robertson, Tim; Vieglais, David

    2012-01-01

    Biodiversity data derive from myriad sources stored in various formats on many distinct hardware and software platforms. An essential step towards understanding global patterns of biodiversity is to provide a standardized view of these heterogeneous data sources to improve interoperability. Fundamental to this advance are definitions of common terms. This paper describes the evolution and development of Darwin Core, a data standard for publishing and integrating biodiversity information. We focus on the categories of terms that define the standard, differences between simple and relational Darwin Core, how the standard has been implemented, and the community processes that are essential for maintenance and growth of the standard. We present case-study extensions of the Darwin Core into new research communities, including metagenomics and genetic resources. We close by showing how Darwin Core records are integrated to create new knowledge products documenting species distributions and changes due to environmental perturbations. PMID:22238640
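    A minimal example of what a "simple Darwin Core" occurrence record can look like in practice is sketched below; the field names are standard Darwin Core terms, while the identifier and observation values are invented for illustration.

        import csv

        # A minimal "simple Darwin Core" occurrence record written as CSV.
        fieldnames = ["occurrenceID", "basisOfRecord", "scientificName", "eventDate",
                      "decimalLatitude", "decimalLongitude", "countryCode"]

        record = {
            "occurrenceID": "urn:example:occ:0001",      # hypothetical identifier
            "basisOfRecord": "HumanObservation",
            "scientificName": "Puma concolor",
            "eventDate": "2012-01-15",
            "decimalLatitude": "37.42",
            "decimalLongitude": "-122.17",
            "countryCode": "US",
        }

        with open("occurrence.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=fieldnames)
            writer.writeheader()
            writer.writerow(record)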

  16. Reprint of: High current liquid metal ion source using porous tungsten multiemitters.

    PubMed

    Tajmar, M; Vasiljevich, I; Grienauer, W

    2011-05-01

    We recently developed an indium Liquid-Metal-Ion-Source that can emit currents from sub-μA up to several mA. It is based on a porous tungsten crown structure with 28 individual emitters, which is manufactured using Micro-Powder Injection Molding (μPIM) and electrochemical etching. The emitter combines the advantages of internal capillary feeding with excellent emission properties due to micron-size tips. Significant progress was made on the homogeneity of the emission over its current-voltage characteristic as well as on investigating its long-term stability. This LMIS seems very suitable for space propulsion as well as for micro/nano manufacturing applications with greatly increased milling/drilling speeds. This paper summarizes the latest developments on our porous multiemitters with respect to manufacturing, emission properties and long-term testing. Copyright © 2010 Elsevier B.V. All rights reserved.

  17. Effect of Loss on Multiplexed Single-Photon Sources (Open Access Publisher’s Version)

    DTIC Science & Technology

    2015-04-28

    lossy components on near- and long-term experimental goals, we simulate the multiplexed sources when used for many-photon state generation under various... efficient integer factorization and digital quantum simulation [7, 8], which relies critically on the development of a high-performance, on-demand photon ... (SPDC) or spontaneous four-wave mixing: parametric processes which use a pump laser in a nonlinear material to spontaneously generate photon pairs

  18. Hand Gesture Data Collection Procedure Using a Myo Armband for Machine Learning

    DTIC Science & Technology

    2015-09-01

    ...data using a Myo armband. The source code for this work is included as an Appendix. 15. SUBJECT TERMS: Myo, Machine Learning, Classifier, Data... development in multiple platforms (e.g., Windows, iOS, Android, etc.) and many languages (e.g., Java, C++, C#, Lua, etc.). For the data collection

  19. Spurious Solutions Of Nonlinear Differential Equations

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Sweby, P. K.; Griffiths, D. F.

    1992-01-01

    Report utilizes a nonlinear-dynamics approach to investigate possible sources of errors, slow convergence, and non-convergence of steady-state numerical solutions when using a time-dependent approach for problems containing nonlinear source terms. Emphasizes implications for development of algorithms in CFD and computational sciences in general. The main fundamental conclusion of the study is that qualitative features of nonlinear differential equations cannot be adequately represented by the finite-difference method and vice versa.

  20. Economic dispatch optimization for system integrating renewable energy sources

    NASA Astrophysics Data System (ADS)

    Jihane, Kartite; Mohamed, Cherkaoui

    2018-05-01

    Nowadays, the use of energy is growing, especially in the transportation and electricity industries. However, this energy is largely based on conventional sources which pollute the environment. A multi-source system is seen as the best solution for sustainable development. This paper proposes the Economic Dispatch (ED) of a hybrid renewable power system. The hybrid system is composed of ten thermal generators, a photovoltaic (PV) generator and a wind turbine generator. To show the importance of renewable energy sources (RES) in the energy mix, we ran the simulation for the system integrating PV only and PV plus wind. The results show that the system with renewable energy sources is more promising than the system without RES in terms of fuel cost.
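    A minimal sketch of the economic dispatch idea is shown below: renewables are treated as zero-marginal-cost generation that reduces the load served by thermal units with quadratic costs, and the equal-incremental-cost condition is solved in closed form. The cost coefficients and load values are illustrative assumptions, and generator limits and network constraints are ignored.

        import numpy as np

        # Cost_i(P) = a_i + b_i*P + c_i*P^2  ->  optimal P_i = (lam - b_i) / (2*c_i)
        def dispatch(demand, pv, wind, b, c):
            net = demand - pv - wind                       # residual load for thermal units
            lam = (net + np.sum(b / (2 * c))) / np.sum(1 / (2 * c))
            return (lam - b) / (2 * c), lam

        # Hypothetical cost coefficients for three thermal units
        b = np.array([20.0, 25.0, 30.0])      # $/MWh
        c = np.array([0.05, 0.04, 0.06])      # $/MW^2h

        P, lam = dispatch(demand=800.0, pv=120.0, wind=80.0, b=b, c=c)
        print(P, P.sum())   # thermal outputs summing to the 600 MW residual load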

  1. Derivation and application of the reciprocity relations for radiative transfer with internal illumination

    NASA Technical Reports Server (NTRS)

    Cogley, A. C.

    1975-01-01

    A Green's function formulation is used to derive basic reciprocity relations for planar radiative transfer in a general medium with internal illumination. Reciprocity (or functional symmetry) allows an explicit and generalized development of the equivalence between source and probability functions. Assuming similar symmetry in three-dimensional space, a general relationship is derived between planar-source intensity and point-source total directional energy. These quantities are expressed in terms of standard (universal) functions associated with the planar medium, while all results are derived from the differential equation of radiative transfer.

  2. Next generation data harmonization

    NASA Astrophysics Data System (ADS)

    Armstrong, Chandler; Brown, Ryan M.; Chaves, Jillian; Czerniejewski, Adam; Del Vecchio, Justin; Perkins, Timothy K.; Rudnicki, Ron; Tauer, Greg

    2015-05-01

    Analysts are presented with a never-ending stream of data sources. Subsets of data sources that could solve a problem are often easily identified, but the process of aligning the data sets is time consuming. Many semantic technologies allow for fast harmonization of data to overcome these problems. These include ontologies that serve as alignment targets, visual tools and natural language processing that generate semantic graphs in terms of the ontologies, and analytics that leverage these graphs. This research reviews a developed prototype that employs all these approaches to perform analysis across disparate data sources documenting violent, extremist events.

  3. 77 FR 19740 - Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant Accident

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2010-0249] Water Sources for Long-Term Recirculation Cooling... Regulatory Guide (RG) 1.82, ``Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant... regarding the sumps and suppression pools that provide water sources for emergency core cooling, containment...

  4. Development of Approach for Long-Term Management of Disused Sealed Radioactive Sources - 13630

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kinker, M.; Reber, E.; Mansoux, H.

    Radioactive sources are used widely throughout the world in a variety of medical, industrial, research and military applications. When such radioactive sources are no longer used and are not intended to be used for the practice for which an authorization was granted, they are designated as 'disused sources'. Whether appropriate controls are in place during the useful life of a source or not, the end of this useful life is often a turning point after which it is more difficult to ensure the safety and security of the source over time. For various reasons, many disused sources cannot be returned to the manufacturer or the supplier for reuse or recycling. When these attempts fail, disused sources should be declared as radioactive waste and should be managed as such, in compliance with relevant international legal instruments and safety standards. However, disposal remains an unresolved issue in many countries, due in part to limited public acceptance, insufficient funding, and a lack of practical examples of strategies for determining suitable disposal options. As a result, disused sources are often stored indefinitely at the facilities where they were once used. In order to prevent disused sources from becoming orphan sources, each country must develop and implement a comprehensive waste management strategy that includes disposal of disused sources. The International Atomic Energy Agency (IAEA) fosters international cooperation between countries and encourages the development of a harmonized 'cradle to grave' approach to managing sources consistent with international legal instruments, IAEA safety standards, and international good practices. This 'cradle to grave' approach requires the development of a national policy and implementing strategy, an adequate legal and regulatory framework, and adequate resources and infrastructure that cover the entire life cycle, from production and use of radioactive sources to disposal. (authors)

  5. A large eddy simulation scheme for turbulent reacting flows

    NASA Technical Reports Server (NTRS)

    Gao, Feng

    1993-01-01

    The recent development of the dynamic subgrid-scale (SGS) model has provided a consistent method for generating localized turbulent mixing models and has opened up great possibilities for applying the large eddy simulation (LES) technique to real-world problems. Given that direct numerical simulation (DNS) cannot solve engineering flow problems in the foreseeable future (Reynolds 1989), LES is certainly an attractive alternative. It seems only natural to bring this new development in SGS modeling to bear on reacting flows. The major stumbling block for introducing LES to reacting flow problems has been the proper modeling of the reaction source terms. Various models have been proposed, but none of them has a wide range of applicability. For example, some of the models in combustion have been based on the flamelet assumption, which is only valid for relatively fast reactions. Some other models have neglected the effects of chemical reactions on the turbulent mixing time scale, which is certainly not valid for fast and non-isothermal reactions. The probability density function (PDF) method can be usefully employed to deal with the modeling of the reaction source terms. In order to fit into the framework of LES, a new PDF, the large eddy PDF (LEPDF), is introduced. This PDF provides an accurate representation for the filtered chemical source terms and can be readily calculated in the simulations. The details of this scheme are described.
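    For contrast with the LEPDF idea, the sketch below evaluates a filtered source term with the conventional presumed beta-PDF approach that closures of this kind aim to improve upon; the reaction-rate function, mean and variance are arbitrary stand-ins.

        import numpy as np
        from scipy.stats import beta as beta_dist

        def filtered_source(omega, z_mean, z_var, n=400):
            """Integrate omega(Z) against a beta PDF with the given mean and variance."""
            gamma = z_mean * (1.0 - z_mean) / z_var - 1.0   # requires 0 < var < mean*(1-mean)
            a, b = z_mean * gamma, (1.0 - z_mean) * gamma
            z = np.linspace(1e-6, 1.0 - 1e-6, n)
            pdf = beta_dist.pdf(z, a, b)
            return np.trapz(omega(z) * pdf, z)

        omega = lambda z: z**2 * (1.0 - z)                  # stand-in reaction rate
        print(filtered_source(omega, z_mean=0.4, z_var=0.05))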

  6. A modification of the Regional Nutrient Management model (ReNuMa) to identify long-term changes in riverine nitrogen sources

    NASA Astrophysics Data System (ADS)

    Hu, Minpeng; Liu, Yanmei; Wang, Jiahui; Dahlgren, Randy A.; Chen, Dingjiang

    2018-06-01

    Source apportionment is critical for guiding the development of efficient watershed nitrogen (N) pollution control measures. The ReNuMa (Regional Nutrient Management) model, a semi-empirical, semi-process-oriented model with modest data requirements, has been widely used for riverine N source apportionment. However, the ReNuMa model has limitations for addressing long-term N dynamics because it ignores temporal changes in atmospheric N deposition rates and N-leaching lag effects. This work modified the ReNuMa model by revising the source code to allow yearly changes in atmospheric N deposition and incorporation of N-leaching lag effects into N transport processes. The appropriate N-leaching lag time was determined from cross-correlation analysis between annual watershed individual N source inputs and riverine N export. Accuracy of the modified ReNuMa model was demonstrated through analysis of a 31-year water quality record (1980-2010) from the Yongan watershed in eastern China. The revisions considerably improved the accuracy (Nash-Sutcliffe coefficient increased by ∼0.2) of the modified ReNuMa model for predicting riverine N loads. The modified model explicitly identified annual and seasonal changes in contributions of various N sources (i.e., point vs. nonpoint source, surface runoff vs. groundwater) to riverine N loads as well as the fate of watershed anthropogenic N inputs. Model results were consistent with previously modeled or observed lag time lengths as well as changes in riverine chloride and nitrate concentrations during the low-flow regime and available N levels in agricultural soils of this watershed. The modified ReNuMa model is applicable for addressing long-term changes in riverine N sources, providing decision-makers with critical information for guiding watershed N pollution control strategies.
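    The lag-selection step can be illustrated with a short cross-correlation calculation like the one below; the input and export series are synthetic, with a three-year lag built in, and the real analysis would of course use the observed watershed record.

        import numpy as np

        def best_lag(inputs, exports, max_lag=10):
            """Pick the lag (in years) that maximizes input-export cross-correlation."""
            x = (inputs - inputs.mean()) / inputs.std()
            y = (exports - exports.mean()) / exports.std()
            corrs = [np.corrcoef(x[:len(x) - k], y[k:])[0, 1] for k in range(max_lag + 1)]
            return int(np.argmax(corrs)), corrs

        rng = np.random.default_rng(1)
        n_input = 100 + rng.normal(0, 10, 31).cumsum()    # 31 years of N input (synthetic)
        riv_export = np.roll(0.3 * n_input, 3) + rng.normal(0, 2, 31)
        riv_export[:3] = riv_export[3]                    # pad the rolled-in years

        lag, corrs = best_lag(n_input, riv_export)
        print("estimated lag (years):", lag)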

  7. Sol-gel coated ion sources for liquid chromatography-direct electron ionization mass spectrometry.

    PubMed

    Riboni, Nicolò; Magrini, Laura; Bianchi, Federica; Careri, Maria; Cappiello, Achille

    2017-07-25

    Advances in interfacing liquid chromatography and electron ionization mass spectrometry are presented. New ion source coatings synthesized by sol-gel technology were developed and tested as vaporization surfaces in terms of peak intensity, peak width and peak delay for the liquid chromatography-direct electron ionization mass spectrometry (Direct-EI) determination of environmental pollutants like polycyclic aromatic hydrocarbons and steroids. Silica-, titania-, and zirconia-based coatings were sprayed inside the stainless steel ion source and characterized in terms of thermal stability, film thickness and morphology. Negligible weight losses until 350-400 °C were observed for all the materials, with coating thicknesses in the 6 (±1)-11 (±2) μm range for an optimal ionization process. The best performances in terms of both peak intensity and peak width were obtained by using the silica-based coating: the detection of the investigated compounds was feasible at low ng μl⁻¹ levels with a good precision (RSD < 9% for polycyclic aromatic hydrocarbons and <11% for hormones). Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Numerical Simulations of Reacting Flows Using Asynchrony-Tolerant Schemes for Exascale Computing

    NASA Astrophysics Data System (ADS)

    Cleary, Emmet; Konduri, Aditya; Chen, Jacqueline

    2017-11-01

    Communication and data synchronization between processing elements (PEs) are likely to pose a major challenge in scalability of solvers at the exascale. Recently developed asynchrony-tolerant (AT) finite difference schemes address this issue by relaxing communication and synchronization between PEs at a mathematical level while preserving accuracy, resulting in improved scalability. The performance of these schemes has been validated for simple linear and nonlinear homogeneous PDEs. However, many problems of practical interest are governed by highly nonlinear PDEs with source terms, whose solution may be sensitive to perturbations caused by communication asynchrony. The current work applies the AT schemes to combustion problems with chemical source terms, yielding a stiff system of PDEs with nonlinear source terms highly sensitive to temperature. Examples shown will use single-step and multi-step CH4 mechanisms for 1D premixed and nonpremixed flames. Error analysis will be discussed both in physical and spectral space. Results show that additional errors introduced by the AT schemes are negligible and the schemes preserve their accuracy. We acknowledge funding from the DOE Computational Science Graduate Fellowship administered by the Krell Institute.

  9. Supersonic propulsion simulation by incorporating component models in the large perturbation inlet (LAPIN) computer code

    NASA Technical Reports Server (NTRS)

    Cole, Gary L.; Richard, Jacques C.

    1991-01-01

    An approach to simulating the internal flows of supersonic propulsion systems is presented. The approach is based on a fairly simple modification of the Large Perturbation Inlet (LAPIN) computer code. LAPIN uses a quasi-one dimensional, inviscid, unsteady formulation of the continuity, momentum, and energy equations. The equations are solved using a shock capturing, finite difference algorithm. The original code, developed for simulating supersonic inlets, includes engineering models of unstart/restart, bleed, bypass, and variable duct geometry, by means of source terms in the equations. The source terms also provide a mechanism for incorporating, with the inlet, propulsion system components such as compressor stages, combustors, and turbine stages. This requires each component to be distributed axially over a number of grid points. Because of the distributed nature of such components, this representation should be more accurate than a lumped parameter model. Components can be modeled by performance map(s), which in turn are used to compute the source terms. The general approach is described. Then, simulation of a compressor/fan stage is discussed to show the approach in detail.
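    The idea of representing a component as a distributed source term can be illustrated on a scalar model problem, as in the sketch below; this is a toy upwind advection update with a source spread over a block of grid cells, not the LAPIN formulation itself, and all numbers are arbitrary.

        import numpy as np

        nx, dx, dt, u = 200, 0.01, 0.0005, 5.0   # grid, step sizes, advection speed
        q = np.ones(nx)                          # conserved quantity (energy-like)
        source = np.zeros(nx)
        source[80:120] = 50.0                    # "component" spread over cells 80-119

        for _ in range(1000):
            dqdx = (q - np.roll(q, 1)) / dx      # first-order upwind derivative (u > 0)
            q = q - dt * u * dqdx + dt * source  # advection plus distributed source
            q[0] = 1.0                           # fixed inflow value

        # q now rises smoothly across the "component" region rather than jumping at a point.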

  10. The Development of Lifecycle Data for Hydrogen Fuel Production and Delivery

    DOT National Transportation Integrated Search

    2017-10-01

    An evaluation of renewable hydrogen production technologies anticipated to be available in the short, mid- and long-term timeframes was conducted. Renewable conversion pathways often rely on a combination of renewable and fossil energy sources, with ...

  11. A Comprehensive Probabilistic Tsunami Hazard Assessment: Multiple Sources and Short-Term Interactions

    NASA Astrophysics Data System (ADS)

    Anita, G.; Selva, J.; Laura, S.

    2011-12-01

    We develop a comprehensive and total probabilistic tsunami hazard assessment (TotPTHA), in which many different possible source types contribute to the definition of the total tsunami hazard at given target sites. In a multi-hazard and multi-risk perspective, such an innovative approach allows, in principle, consideration of all possible tsunamigenic sources, from seismic events to slides, asteroids, volcanic eruptions, etc. In this respect, we also formally introduce and discuss the treatment of interaction/cascade effects in the TotPTHA analysis. We demonstrate how external triggering events may induce significant temporary variations in the tsunami hazard. Because of this, such effects should always be considered, at least in short-term applications, to obtain unbiased analyses. Finally, we prove the feasibility of the TotPTHA and of the treatment of interaction/cascade effects by applying this methodology to an ideal region with realistic characteristics (Neverland).

  12. Leveraging Terminological Resources for Mapping between Rare Disease Information Sources

    PubMed Central

    Rance, Bastien; Snyder, Michelle; Lewis, Janine; Bodenreider, Olivier

    2015-01-01

    Background Rare disease information sources are incompletely and inconsistently cross-referenced to one another, making it difficult for information seekers to navigate across them. The development of such cross-references established manually by experts is generally labor intensive and costly. Objectives To develop an automatic mapping between two of the major rare disease information sources, GARD and Orphanet, by leveraging terminological resources, especially the UMLS. Methods We map the rare disease terms from Orphanet and ORDR to the UMLS. We use the UMLS as a pivot to bridge between the rare disease terminologies. We compare our results to a mapping obtained through manually established cross-references to OMIM. Results Our mapping has a precision of 94%, a recall of 63% and an F1-score of 76%. Our automatic mapping should help facilitate the development of more complete and consistent cross-references between GARD and Orphanet, and is applicable to other rare disease information sources as well. PMID:23920611
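    A toy version of the UMLS-pivot idea is sketched below: two vocabularies are each mapped to UMLS concept unique identifiers (CUIs), and codes are paired whenever they share a CUI. The specific codes and CUIs shown are fabricated for illustration and do not represent the actual GARD-Orphanet mapping.

        # GARD term -> UMLS CUIs and Orphanet ID -> UMLS CUIs (hypothetical lookups)
        gard_to_cui = {
            "GARD:0005846": {"C0019562"},
            "GARD:0000001": {"C0085077"},
        }
        orphanet_to_cui = {
            "ORPHA:909": {"C0019562"},
            "ORPHA:744": {"C0036161"},
        }

        def pivot_map(src, dst):
            """Pair source and destination codes that share at least one UMLS CUI."""
            mapping = {}
            for s_code, s_cuis in src.items():
                hits = [d_code for d_code, d_cuis in dst.items() if s_cuis & d_cuis]
                if hits:
                    mapping[s_code] = hits
            return mapping

        print(pivot_map(gard_to_cui, orphanet_to_cui))   # {'GARD:0005846': ['ORPHA:909']}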

  13. Radiological analysis of plutonium glass batches with natural/enriched boron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rainisch, R.

    2000-06-22

    The disposition of surplus plutonium inventories by the US Department of Energy (DOE) includes the immobilization of certain plutonium materials in a borosilicate glass matrix, also referred to as vitrification. This paper addresses source terms of plutonium masses immobilized in a borosilicate glass matrix where the glass components include both natural boron and enriched boron. The calculated source terms pertain to neutron and gamma source strength (particles per second) and source spectrum changes. The calculated source terms corresponding to natural boron and enriched boron are compared to determine the benefits (decrease in radiation source terms) of the use of enriched boron. The analysis of plutonium glass source terms shows that a large component of the neutron source terms is due to (α, n) reactions. The americium-241 and plutonium present in the glass emit alpha particles (α). These alpha particles interact with low-Z nuclides like B-11, B-10, and O-17 in the glass to produce neutrons. The low-Z nuclides are referred to as target particles. The reference glass contains 9.4 wt percent B2O3. Boron-11 was found to strongly support the (α, n) reactions in the glass matrix. B-11 has a natural abundance of over 80 percent. The (α, n) reaction rates for B-10 are lower than for B-11, and the analysis shows that the plutonium glass neutron source terms can be reduced by artificially enriching natural boron with B-10. The natural abundance of B-10 is 19.9 percent. Boron enriched to 96-wt percent B-10 or above can be obtained commercially. Since lower source terms imply lower dose rates to radiation workers handling the plutonium glass materials, it is important to know the achievable decrease in source terms as a result of boron enrichment. Plutonium materials are normally handled in glove boxes with shielded glass windows, and the work entails both extremity and whole-body exposures. Lowering the source terms of the plutonium batches will make the handling of these materials less difficult and will reduce radiation exposure to operating workers.

  14. Source term evaluation model for high-level radioactive waste repository with decay chain build-up.

    PubMed

    Chopra, Manish; Sunny, Faby; Oza, R B

    2016-09-18

    A source term model based on a two-component leach flux concept is developed for a high-level radioactive waste repository. The long-lived radionuclides associated with high-level waste may give rise to a build-up of activity because of radioactive decay chains. The ingrowth of progeny is incorporated in the model using Bateman decay chain build-up equations. The model is applied to different radionuclides present in the high-level radioactive waste, which form part of decay chains (4n to 4n + 3 series), and the activity of the parent and daughter radionuclides leaching out of the waste matrix is estimated. Two cases are considered: one in which only the parent is initially present in the waste and another in which daughters are also initially present in the waste matrix. The incorporation of in situ production of daughter radionuclides in the source is important for realistic estimates. It is shown that the inclusion of decay chain build-up is essential to avoid underestimation in the radiological impact assessment of the repository. The model can be a useful tool for evaluating the source term of the radionuclide transport models used for the radiological impact assessment of high-level radioactive waste repositories.
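    A compact implementation of the Bateman build-up solution for a linear chain with only the parent present at time zero is sketched below; the half-lives and initial inventory are arbitrary illustrative values, and the formula assumes all decay constants are distinct.

        import numpy as np

        def bateman(lam, n1_0, t):
            """Atoms of each member of a linear decay chain at time t (distinct lambdas)."""
            lam = np.asarray(lam, dtype=float)
            n = np.zeros(lam.size)
            for k in range(lam.size):                     # k-th chain member (0 = parent)
                coeff = n1_0 * np.prod(lam[:k])           # lambda_1 * ... * lambda_k
                total = 0.0
                for i in range(k + 1):
                    denom = np.prod([lam[j] - lam[i] for j in range(k + 1) if j != i])
                    total += np.exp(-lam[i] * t) / denom
                n[k] = coeff * total
            return n

        half_lives = np.array([1.0e4, 5.0e2, 3.0e1])      # years, purely illustrative
        lam = np.log(2.0) / half_lives
        atoms = bateman(lam, n1_0=1.0e20, t=1.0e3)        # inventory after 1000 years
        activity = lam * atoms                            # decays per year for each member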

  15. Hydrogeochemistry of the drinking water sources of Derebogazi Village (Kahramanmaras) and their effects on human health.

    PubMed

    Uras, Yusuf; Uysal, Yagmur; Arikan, Tugba Atilan; Kop, Alican; Caliskan, Mustafa

    2015-06-01

    The aim of this study was to investigate the sources of drinking water for Derebogazi Village, Kahramanmaras Province, Turkey, in terms of hydrogeochemistry, isotope geochemistry, and medical geology. Water samples were obtained from seven different water sources in the area, all of which are located within quartzite units of Paleozoic age, and isotopic analyses of (18)O and (2)H (deuterium) were conducted on the samples. Samples were collected from the region for 1 year. Water quality of the samples was assessed in terms of various water quality parameters, such as temperature, pH, conductivity, alkalinity, trace element concentrations, anion-cation measurements, and metal concentrations, using ion chromatography, inductively coupled plasma (ICP) mass spectrometry, and ICP-optical emission spectrometry techniques. Regional health surveys had revealed that the heights of local people are significantly below the average for the country. In terms of medical geology, the sampled drinking water from the seven sources was deficient in calcium and magnesium ions, which promote bone development. Bone mineral density screening tests were conducted on ten females using dual energy X-ray absorptiometry to investigate possible developmental disorder(s) and potential for mineral loss in the region. Of these ten women, three had T-scores close to the osteoporosis range (T-score < -2.5).

  16. Photovoltaics as a terrestrial energy source. Volume 1: An introduction

    NASA Technical Reports Server (NTRS)

    Smith, J. L.

    1980-01-01

    Photovoltaic (PV) systems were examined for their potential for terrestrial application and future development. Photovoltaic technology, existing and potential photovoltaic applications, and the National Photovoltaics Program are reviewed. The competitive environment for this electrical source, affected by the presence or absence of utility-supplied power, is evaluated in terms of system prices. The roles of technological breakthroughs, directed research and technology development, learning curves, and commercial demonstrations in the National Program are discussed. The potential for photovoltaics to displace oil consumption is examined, as are the potential benefits of employing PV in either central-station or non-utility-owned, small, distributed systems.

  17. SNL Five-Year Facilities & Infrastructure Plan FY2015-2019

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cipriani, Ralph J.

    2014-12-01

    Sandia’s development vision is to provide an agile, flexible, safer, more secure, and efficient enterprise that leverages the scientific and technical capabilities of the workforce and supports national security requirements in multiple areas. Sandia’s Five-Year Facilities & Infrastructure Planning program represents a tool to budget and prioritize immediate and short-term actions from indirect funding sources in light of the bigger picture of proposed investments from direct-funded, Work for Others and other funding sources. As a complementary F&I investment program, Sandia’s indirect investment program supports incremental achievement of the development vision within a constrained resource environment.

  18. Large Energy Development Projects: Lessons Learned from Space and Politics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmitt, Harrison H.

    2005-04-15

    The challenge to the global energy future lies in meeting the needs and aspirations of the ten to twelve billion earthlings that will be on this planet by 2050. At least an eight-fold increase in annual production will be required by the middle of this century. The energy sources that can be considered developed and 'in the box' for consideration as sources for major increases in supply over the next half century are fossil fuels, nuclear fission, and, to a lesser degree, various forms of direct and stored solar energy and conservation. None of these near-term sources of energy will provide an eight-fold or more increase in energy supply, for various technical, environmental and political reasons. Only a few potential energy sources that fall 'out of the box' appear worthy of additional consideration as possible contributors to energy demand in 2050 and beyond. These particular candidates are deuterium-tritium fusion, space solar energy, and lunar helium-3 fusion. The primary advantage that lunar helium-3 fusion will have over other 'out of the box' energy sources in the pre-2050 timeframe is a clear path into the private capital markets. The development and demonstration of new energy sources will require several development paths, each of Apollo-like complexity and each with sub-paths of parallel development for critical functions and components.

  19. Fission Product Appearance Rate Coefficients in Design Basis Source Term Determinations - Past and Present

    NASA Astrophysics Data System (ADS)

    Perez, Pedro B.; Hamawi, John N.

    2017-09-01

    Nuclear power plant radiation protection design features are based on radionuclide source terms derived from conservative assumptions that envelope expected operating experience. Two parameters that significantly affect the radionuclide concentrations in the source term are the failed fuel fraction and the effective fission product appearance rate coefficients. The failed fuel fraction may be a regulatory-based assumption, such as in the U.S. Appearance rate coefficients are not specified in regulatory requirements, but have been referenced to experimental data that are over 50 years old. No doubt the source terms are conservative, as demonstrated by operating experience that has included failed fuel, but they may be too conservative, leading to over-designed shielding for normal operations as an example. Design basis source term methodologies for normal operations had not advanced until EPRI published an updated ANSI/ANS 18.1 source term basis document in 2015. Our paper revisits the fission product appearance rate coefficients as applied in the derivation of source terms following the original U.S. NRC NUREG-0017 methodology. New coefficients have been calculated based on recent EPRI results, which demonstrate the conservatism in nuclear power plant shielding design.

  20. Master's Training as a Part of Young Researcher's Professional Development: British and Ukrainian Experience

    ERIC Educational Resources Information Center

    Bidyuk, Natalya

    2014-01-01

    The problem of the professional development of young researchers in terms of Master's training has been analyzed. The analysis of the literature references, documental and other sources gave grounds to state that the basic principle of Master's professional training is a research-oriented paradigm. The necessity of using the innovative ideas of…

  1. MUFFSgenMC: An Open Source MUon Flexible Framework for Spectral GENeration for Monte Carlo Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatzidakis, Stylianos; Greulich, Christopher

    A cosmic ray Muon Flexible Framework for Spectral GENeration for Monte Carlo Applications (MUFFSgenMC) has been developed to support state-of-the-art cosmic ray muon tomographic applications. The flexible framework allows for easy and fast creation of source terms for popular Monte Carlo applications like GEANT4 and MCNP. This code framework simplifies the process of simulations used for cosmic ray muon tomography.
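    One way such a framework can produce a source term is by sampling muon energies from a surface spectrum; the sketch below uses a simplified Gaisser-type parameterization with rejection sampling and is only an illustration of the approach, not the MUFFSgenMC implementation, with the energy range and zenith angle chosen arbitrarily.

        import numpy as np

        def gaisser_flux(e_gev, cos_theta):
            """Approximate differential muon flux (arbitrary normalization for sampling)."""
            return e_gev**-2.7 * (1.0 / (1.0 + 1.1 * e_gev * cos_theta / 115.0)
                                  + 0.054 / (1.0 + 1.1 * e_gev * cos_theta / 850.0))

        def sample_energies(n, cos_theta=1.0, e_min=1.0, e_max=1.0e4, seed=0):
            """Rejection-sample muon energies (GeV) between e_min and e_max."""
            rng = np.random.default_rng(seed)
            # Propose log-uniformly in E; the target density per ln(E) is E*flux(E),
            # which peaks at e_min because the spectrum falls steeply with energy.
            w_max = e_min * gaisser_flux(e_min, cos_theta)
            out = []
            while len(out) < n:
                e = np.exp(rng.uniform(np.log(e_min), np.log(e_max)))
                if rng.uniform(0.0, w_max) < e * gaisser_flux(e, cos_theta):
                    out.append(e)
            return np.array(out)

        energies = sample_energies(1000)   # e.g., feed into an MCNP or GEANT4 source definition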

  2. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  3. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  4. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  5. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  6. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  7. Outlook for alternative energy sources. [aviation fuels

    NASA Technical Reports Server (NTRS)

    Card, M. E.

    1980-01-01

    Predictions are made concerning the development of alternative energy sources in light of the present national energy situation. Particular emphasis is given to the impact of alternative fuels development on aviation fuels. The outlook for aircraft fuels is that, in the near term, there will probably be no major fuel changes, but minor specification changes may be possible if supplies decrease. In the midterm, a broad-cut fuel may be used if current development efforts are successful. As synfuel production levels increase beyond the 1990s, there may be some mixtures of petroleum-based and synfuel products, with the possibility of some shale distillate and indirect coal liquefaction products near the year 2000.

  8. A glossary for biometeorology

    NASA Astrophysics Data System (ADS)

    Gosling, Simon N.; Bryce, Erin K.; Dixon, P. Grady; Gabriel, Katharina M. A.; Gosling, Elaine Y.; Hanes, Jonathan M.; Hondula, David M.; Liang, Liang; Bustos Mac Lean, Priscilla Ayleen; Muthers, Stefan; Nascimento, Sheila Tavares; Petralli, Martina; Vanos, Jennifer K.; Wanka, Eva R.

    2014-03-01

    Here we present, for the first time, a glossary of biometeorological terms. The glossary aims to address the need for a reliable source of biometeorological definitions, thereby facilitating communication and mutual understanding in this rapidly expanding field. A total of 171 terms are defined, with reference to 234 citations. It is anticipated that the glossary will be revisited in coming years, updating terms and adding new terms, as appropriate. The glossary is intended to provide a useful resource to the biometeorology community, and to this end, readers are encouraged to contact the lead author to suggest additional terms for inclusion in later versions of the glossary as a result of new and emerging developments in the field.

  9. A glossary for biometeorology.

    PubMed

    Gosling, Simon N; Bryce, Erin K; Dixon, P Grady; Gabriel, Katharina M A; Gosling, Elaine Y; Hanes, Jonathan M; Hondula, David M; Liang, Liang; Bustos Mac Lean, Priscilla Ayleen; Muthers, Stefan; Nascimento, Sheila Tavares; Petralli, Martina; Vanos, Jennifer K; Wanka, Eva R

    2014-03-01

    Here we present, for the first time, a glossary of biometeorological terms. The glossary aims to address the need for a reliable source of biometeorological definitions, thereby facilitating communication and mutual understanding in this rapidly expanding field. A total of 171 terms are defined, with reference to 234 citations. It is anticipated that the glossary will be revisited in coming years, updating terms and adding new terms, as appropriate. The glossary is intended to provide a useful resource to the biometeorology community, and to this end, readers are encouraged to contact the lead author to suggest additional terms for inclusion in later versions of the glossary as a result of new and emerging developments in the field.

  10. Bovine placenta: a review on morphology, components, and defects from terminology and clinical perspectives.

    PubMed

    Peter, Augustine T

    2013-10-15

    The bovine placenta has been the subject of many studies. Concurrently, several specialized terms have been developed to describe its development, morphology, components, function, and pathology. Many of these terms are simple, some are difficult to understand and use, and others are antiquated and may not be scientifically accurate. Defining and adopting terminology for the bovine placenta that is clear, precise and understandable, and available in a single source is expected to facilitate exchange of clinical and research information. This review presents a brief overview of the current knowledge regarding the bovine placenta and attempts to define terms. In this process, conventional terminology is presented, and contemporary and novel terms are proposed from a biological perspective. For example, use of terms such as syndesmochorial, retained placenta, and large offspring syndrome should be revisited. Furthermore, the clinical relevance of the structure and function of the bovine placenta is reviewed. Finally, terms discussed in this review are summarized (in table format). Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Atomic processes and equation of state of high Z plasmas for EUV sources and their effects on the spatial and temporal evolution of the plasmas

    NASA Astrophysics Data System (ADS)

    Sasaki, Akira; Sunahara, Atushi; Furukawa, Hiroyuki; Nishihara, Katsunobu; Nishikawa, Takeshi; Koike, Fumihiro

    2016-03-01

    Laser-produced plasma (LPP) extreme ultraviolet (EUV) light sources have been intensively investigated due to their potential application to next-generation semiconductor technology. Current studies focus on the atomic processes and hydrodynamics of plasmas to develop shorter wavelength sources at λ = 6.x nm as well as to improve the conversion efficiency (CE) of λ = 13.5 nm sources. This paper examines the atomic processes of mid-Z elements, which are potential candidates for a λ = 6.x nm source using n = 3-3 transitions. Furthermore, a method to calculate the hydrodynamics of the plasmas in terms of the initial interaction with a relatively weak prepulse laser is presented.

  12. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Metal Fuel Radionuclide Release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-02-01

    The development of an accurate and defensible mechanistic source term will be vital for the future licensing efforts of metal fuel, pool-type sodium fast reactors. To assist in the creation of a comprehensive mechanistic source term, the current effort sought to estimate the release fraction of radionuclides from metal fuel pins to the primary sodium coolant during fuel pin failures at a variety of temperature conditions. These release estimates were based on the findings of an extensive literature search, which reviewed past experimentation and reactor fuel damage accidents. Data sources for each radionuclide of interest were reviewed to establish release fractions, along with possible release dependencies, and the corresponding uncertainty levels. Although the current knowledge base is substantial, and radionuclide release fractions were established for the elements deemed important for the determination of offsite consequences following a reactor accident, gaps were found pertaining to several radionuclides. First, there is uncertainty regarding the transport behavior of several radionuclides (iodine, barium, strontium, tellurium, and europium) during metal fuel irradiation to high burnup levels. The migration of these radionuclides within the fuel matrix and bond sodium region can greatly affect their release during pin failure incidents. Post-irradiation examination of existing high burnup metal fuel can likely resolve this knowledge gap. Second, data regarding the radionuclide release from molten high burnup metal fuel in sodium is sparse, which makes the assessment of radionuclide release from fuel melting accidents at high fuel burnup levels difficult. This gap could be addressed through fuel melting experimentation with samples from the existing high burnup metal fuel inventory.

  13. [Comments on "A practical dictionary of Chinese medicine" by Wiseman].

    PubMed

    Lan, Feng-li

    2006-02-01

    At least 24 Chinese-English dictionaries of Chinese medicine have been published in China during the recent 24 years (1984-2003). This thesis comments on "A Practical Dictionary of Chinese Medicine" by Wiseman, agreeing with its establishing principles, sources, and formation methods of the English system of Chinese medical terminology, and pointing out its defects. The author holds that study of the origin and development of TCM terms and standardization of Chinese medical terms in different layers, i.e., Chinese medical terms in classics, commonly used modern TCM terms, and integrative medical texts, are prerequisites to the standardization of the English translation of Chinese medical terms.

  14. Bayesian estimation of a source term of radiation release with approximately known nuclide ratios

    NASA Astrophysics Data System (ADS)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek

    2016-04-01

    We are concerned with estimation of a source term in the case of an accidental release from a known location, e.g., a power plant. Usually, the source term of an accidental release of radiation comprises a mixture of nuclides. The gamma dose rate measurements do not provide direct information on the source term composition. However, physical properties of the respective nuclides (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g., from a known reactor inventory. The proposed method is based on a linear inverse model where the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned and further regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to uncertainty in the ratios, the diagonal elements of the covariance matrix are considered to be unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following the Bayesian approach, we estimate all parameters of the model from the data, so that y, M, and the known ratios are the only inputs of the method. Since the inference of the model is intractable, we follow the Variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on a simulated 6-hour power plant release where 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with a method with unknown nuclide ratios is given to demonstrate the usefulness of the proposed approach. This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
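
    In its simplest non-Bayesian reduction, the model above (y = Mx with a prior built from approximately known nuclide ratios and a positivity constraint) becomes a bound-constrained, regularized least-squares (MAP) estimate. The Python sketch below shows only that reduction; it is not the Variational Bayes algorithm of the abstract, and the fixed noise and prior standard deviations, like all variable names, are assumptions.

```python
import numpy as np
from scipy.optimize import lsq_linear

def map_source_term(M, y, x_prior, prior_sd, noise_sd):
    """MAP estimate of a nonnegative source term x in y ~ M @ x + noise,
    with an independent Gaussian prior x ~ N(x_prior, diag(prior_sd**2)).
    Deterministic simplification: the prior/noise variances are fixed here,
    whereas a full Bayesian treatment would estimate them from the data."""
    # Stack the data misfit and the prior misfit into one least-squares system.
    A = np.vstack([M / noise_sd, np.diag(1.0 / prior_sd)])
    b = np.concatenate([y / noise_sd, x_prior / prior_sd])
    return lsq_linear(A, b, bounds=(0.0, np.inf)).x  # enforce nonnegative releases

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.uniform(0.0, 1.0, size=(24, 3))          # toy SRS matrix: 24 observations, 3 nuclides
    x_true = np.array([5.0, 2.5, 1.0])               # true release rates (arbitrary units)
    y = M @ x_true + rng.normal(0.0, 0.05, size=24)  # noisy dose-rate observations
    x_prior = np.array([4.0, 2.0, 1.2])              # prior mean from approximate nuclide ratios
    prior_sd = np.array([2.0, 1.0, 0.6])             # prior spread reflecting ratio uncertainty
    print(map_source_term(M, y, x_prior, prior_sd, noise_sd=0.05))
```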

  15. Bound-preserving modified exponential Runge-Kutta discontinuous Galerkin methods for scalar hyperbolic equations with stiff source terms

    NASA Astrophysics Data System (ADS)

    Huang, Juntao; Shu, Chi-Wang

    2018-05-01

    In this paper, we develop bound-preserving modified exponential Runge-Kutta (RK) discontinuous Galerkin (DG) schemes to solve scalar hyperbolic equations with stiff source terms by extending the idea in Zhang and Shu [43]. Exponential strong stability preserving (SSP) high order time discretizations are constructed and then modified to overcome the stiffness and preserve the bound of the numerical solutions. It is also straightforward to extend the method to two dimensions on rectangular and triangular meshes. Even though we only discuss the bound-preserving limiter for DG schemes, it can also be applied to high order finite volume schemes, such as weighted essentially non-oscillatory (WENO) finite volume schemes as well.
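
    The intuition behind combining an exponential treatment of the stiff term with a bound-preserving update can be seen on a linear model problem. The step below is a first-order illustration only (exponential Euler with upwind advection), not the modified exponential RK DG scheme of the paper.

```latex
% Illustrative exponential Euler step for u_t + a u_x = -\sigma u with a > 0 and \sigma \gg 1.
% Under the CFL condition the upwind update is a convex combination of cell values,
% and the exact decay factor e^{-\sigma \Delta t} \in (0,1] cannot push the result
% outside bounds (with zero as the lower bound) already satisfied by that combination.
\[
  u_j^{n+1}
  \;=\;
  e^{-\sigma \Delta t}
  \left( u_j^{n} \;-\; \frac{a\,\Delta t}{\Delta x}\,\bigl(u_j^{n} - u_{j-1}^{n}\bigr) \right),
  \qquad
  0 \le \frac{a\,\Delta t}{\Delta x} \le 1 .
\]
```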

  16. Source term evaluation for combustion modeling

    NASA Technical Reports Server (NTRS)

    Sussman, Myles A.

    1993-01-01

    A modification is developed for application to the source terms used in combustion modeling. The modification accounts for the error of the finite difference scheme in regions where chain-branching chemical reactions produce exponential growth of species densities. The modification is first applied to a one-dimensional scalar model problem. It is then generalized to multiple chemical species, and used in quasi-one-dimensional computations of shock-induced combustion in a channel. Grid refinement studies demonstrate the improved accuracy of the method using this modification. The algorithm is applied in two spatial dimensions and used in simulations of steady and unsteady shock-induced combustion. Comparisons with ballistic range experiments give confidence in the numerical technique and the 9-species hydrogen-air chemistry model.

  17. Efficiency calibration and minimum detectable activity concentration of a real-time UAV airborne sensor system with two gamma spectrometers.

    PubMed

    Tang, Xiao-Bin; Meng, Jia; Wang, Peng; Cao, Ye; Huang, Xi; Wen, Liang-Sheng; Chen, Da

    2016-04-01

    A small-sized UAV (NH-UAV) airborne system with two gamma spectrometers (LaBr3 detector and HPGe detector) was developed to monitor activity concentration in serious nuclear accidents, such as the Fukushima nuclear accident. The efficiency calibration and determination of minimum detectable activity concentration (MDAC) of the specific system were studied by MC simulations at different flight altitudes, different horizontal distances from the detection position to the source term center and different source term sizes. Both air and ground radiation were considered in the models. The results obtained may provide instructive suggestions for in-situ radioactivity measurements of NH-UAV. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Methods for nuclear air-cleaning-system accident-consequence assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrae, R.W.; Bolstad, J.W.; Gregory, W.S.

    1982-01-01

    This paper describes a multilaboratory research program that is directed toward addressing many questions that analysts face when performing air cleaning accident consequence assessments. The program involves developing analytical tools and supportive experimental data that will be useful in making more realistic assessments of accident source terms within and up to the atmospheric boundaries of nuclear fuel cycle facilities. The types of accidents considered in this study include fires, explosions, spills, tornadoes, criticalities, and equipment failures. The main focus of the program is developing an accident analysis handbook (AAH). We will describe the contents of the AAH, which include descriptions of selected nuclear fuel cycle facilities, process unit operations, source-term development, and accident consequence analyses. Three computer codes designed to predict gas and material propagation through facility air cleaning systems are described. These computer codes address accidents involving fires (FIRAC), explosions (EXPAC), and tornadoes (TORAC). The handbook relies on many illustrative examples to show the analyst how to approach accident consequence assessments. We will use the FIRAC code and a hypothetical fire scenario to illustrate the accident analysis capability.

  19. Multi-Detector Analysis System for Spent Nuclear Fuel Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reber, Edward Lawrence; Aryaeinejad, Rahmat; Cole, Jerald Donald

    1999-09-01

    The Spent Nuclear Fuel (SNF) Non-Destructive Analysis (NDA) program at INEEL is developing a system to characterize SNF for fissile mass, radiation source term, and fissile isotopic content. The system is based on the integration of the Fission Assay Tomography System (FATS) and the Gamma-Neutron Analysis Technique (GNAT) developed under programs supported by the DOE Office of Non-proliferation and National Security. Both FATS and GNAT were developed as separate systems to provide information on the location of special nuclear material in weapons configuration (FATS role), and to measure isotopic ratios of fissile material to determine if the material was from a weapon (GNAT role). FATS is capable of not only determining the presence and location of fissile material but also the quantity of fissile material present to within 50%. GNAT determines the ratios of the fissile and fissionable material by coincidence methods that allow the two promptly produced fission fragments to be identified. Therefore, from the combination of FATS and GNAT, MDAS is able to measure the fissile material, radiation source term, and fissile isotopic content.

  20. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2014-01-01 2014-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  1. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2012-01-01 2012-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  2. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2010-01-01 2010-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  3. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2013-01-01 2013-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  4. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2011-01-01 2011-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  5. Implementation of the Leaching Environmental Assessment Framework

    EPA Science Inventory

    New leaching tests are available in the U.S. for developing more accurate source terms for use in fate and transport models. For beneficial use or disposal, the use of the leaching environmental assessment framework (LEAF) will provide leaching results that reflect field condit...

  6. Increasing Confidence In Treatment Performance Assessment Using Geostatistical Methods

    EPA Science Inventory

    It is well established that the presence of dense non-aqueous phase liquids (DNAPLs) such as trichloroethylene (TCE) in aquifer systems represents a very long-term source of groundwater contamination. Significant effort in recent years has been focussed on developing effective me...

  7. Environmental performance of bio-based and biodegradable plastics: the road ahead.

    PubMed

    Lambert, Scott; Wagner, Martin

    2017-11-13

    Future plastic materials will be very different from those that are used today. The increasing importance of sustainability promotes the development of bio-based and biodegradable polymers, sometimes misleadingly referred to as 'bioplastics'. Because both terms imply "green" sources and "clean" removal, this paper aims at critically discussing the sometimes-conflicting terminology as well as renewable sources, with a special focus on the degradation of these polymers in natural environments. With regard to the former, we review innovations in feedstock development (e.g. microalgae and food wastes). In terms of the latter, we highlight the effects that polymer structure, additives, and environmental variables have on plastic biodegradability. We argue that the 'biodegradable' end-product does not necessarily degrade once emitted to the environment, because chemical additives used to make it fit for purpose increase its longevity. In the future, this trend may continue as the plastics industry is also expected to be a major user of nanocomposites. Overall, there is a need to assess the performance of polymer innovations in terms of their biodegradability, especially under realistic waste management and environmental conditions, to avoid the unwanted release of plastic degradation products in receiving environments.

  8. Coral proxy record of decadal-scale reduction in base flow from Moloka'i, Hawaii

    USGS Publications Warehouse

    Prouty, Nancy G.; Jupiter, Stacy D.; Field, Michael E.; McCulloch, Malcolm T.

    2009-01-01

    Groundwater is a major resource in Hawaii and is the principal source of water for municipal, agricultural, and industrial use. With a growing population, a long-term downward trend in rainfall, and the need for proper groundwater management, a better understanding of the hydroclimatological system is essential. Proxy records from corals can supplement long-term observational networks, offering an accessible source of hydrologic and climate information. To develop a qualitative proxy for historic groundwater discharge to coastal waters, a suite of rare earth elements and yttrium (REYs) were analyzed from coral cores collected along the south shore of Moloka'i, Hawaii. The coral REY to calcium (Ca) ratios were evaluated against hydrological parameters, yielding the strongest relationship to base flow. Dissolution of REYs from labradorite and olivine in the basaltic rock aquifers is likely the primary source of coastal ocean REYs. There was a statistically significant downward trend (−40%) in subannually resolved REY/Ca ratios over the last century. This is consistent with long-term records of stream discharge from Moloka'i, which imply a downward trend in base flow since 1913. A decrease in base flow is observed statewide, consistent with the long-term downward trend in annual rainfall over much of the state. With greater demands on freshwater resources, it is appropriate for withdrawal scenarios to consider long-term trends and short-term climate variability. It is possible that coral paleohydrological records can be used to conduct model-data comparisons in groundwater flow models used to simulate changes in groundwater level and coastal discharge.

  9. Towards A Topological Framework for Integrating Semantic Information Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Hogan, Emilie A.; Robinson, Michael

    2014-09-07

    In this position paper we argue for the role that topological modeling principles can play in providing a framework for sensor integration. While this methodology has been used successfully with standard (quantitative) sensors, we are developing it in new directions to make it appropriate specifically for semantic information sources, including key terms, ontology terms, and other general Boolean, categorical, ordinal, and partially ordered data types. We illustrate the basics of the methodology in an extended use case/example and discuss the path forward.

  10. Retrofitting impervious urban infrastructure with green technology for rainfall-runoff restoration, indirect reuse and pollution load reduction.

    PubMed

    Sansalone, John; Raje, Saurabh; Kertesz, Ruben; Maccarone, Kerrilynn; Seltzer, Karl; Siminari, Michele; Simms, Peter; Wood, Brandon

    2013-12-01

    The built environs alter hydrology and water resource chemistry. Florida is subject to nutrient criteria and is promulgating "no-net-load-increase" criteria for runoff and constituents (nutrients and particulate matter, PM). With such criteria, green infrastructure, hydrologic restoration, indirect reuse and source control are potential design solutions. The study simulates runoff and constituent load control through urban source area re-design to provide long-term "no-net-load-increases". A long-term continuous simulation of pre- and post-development response for an existing surface parking facility is quantified. Retrofits include a biofiltration area reactor (BAR) for hydrologic and denitrification control. A linear infiltration reactor (LIR) of cementitious permeable pavement (CPP) provides infiltration, adsorption and filtration. Pavement cleaning provided source control. Simulation of climate and source area data indicates re-design achieves "no-net-load-increases" at lower costs compared to standard construction. The retrofit system yields lower cost per nutrient load treated compared to Best Management Practices (BMPs). Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Screening and validation of EXTraS data products

    NASA Astrophysics Data System (ADS)

    Carpano, Stefania; Haberl, F.; De Luca, A.; Tiengo, A.; Israel, G.; Rodriguez, G.; Belfiore, A.; Rosen, S.; Read, A.; Wilms, J.; Kreikenbohm, A.; Law-Green, D.

    2015-09-01

    The EXTraS project (Exploring the X-ray Transient and variable Sky) is aimed at fully exploring the serendipitous content of the XMM-Newton EPIC database in the time domain. The project is funded within the EU/FP7-Cooperation Space framework and is carried out by a collaboration including INAF (Italy), IUSS (Italy), CNR/IMATI (Italy), University of Leicester (UK), MPE (Germany) and ECAP (Germany). The tasks consist in characterising aperiodic variability for all 3XMM sources, searching for short-term periodic variability in hundreds of thousands of sources, detecting new transient sources that are missed by standard source detection and hence do not belong to the 3XMM catalogue, searching for long-term variability by measuring fluxes or upper limits for both pointed and slew observations, and finally performing multiwavelength characterisation and classification. Screening and validation of the different products is essential in order to reject flawed results generated by the automatic pipelines. We present here the screening tool we developed in the form of a Graphical User Interface and our plans for a systematic screening of the different catalogues.

  12. An Empirical Study of the Change Project as Both Teaching Tool and Outcome of an Educational Leadership Development Program.

    ERIC Educational Resources Information Center

    King, Jean A.; Schleisman, Jane; Kistler, Susan

    The Bush Foundation's leadership-development programs are an important source of inservice leadership training in Minnesota. The extent to which these programs influence pre-collegiate education is explored. The paper draws on a longitudinal study that asked two basic questions: what are the long-term effects of the Bush Leadership Programs on…

  13. The Relationship of Parental Knowledge to the Development of Extremely Low Birth Weight Infants.

    ERIC Educational Resources Information Center

    Dichtelmiller, Margo; And Others

    1992-01-01

    This study found that mothers (n=40) of extremely high-risk infants (averaging 1000 grams birthweight and 28 weeks gestational age) called upon the same experiences and sources of information as mothers of full-term infants. Infants of mothers with above average knowledge about infancy scored significantly higher on tests of infant development at…

  14. Parents' Faith and Hope during the Pediatric Palliative Phase and the Association with Long-Term Parental Adjustment.

    PubMed

    van der Geest, Ivana M M; van den Heuvel-Eibrink, Marry M; Falkenburg, Nette; Michiels, Erna M C; van Vliet, Liesbeth; Pieters, Rob; Darlington, Anne-Sophie E

    2015-05-01

    The loss of a child is associated with an increased risk for developing psychological problems. However, studies investigating the impact of parents' faith and hope for a cure during the palliative phase on long-term parental psychological functioning are limited. The study's objective was to explore the role of faith and hope as a source of coping and indicator of long-term parental adjustment. Eighty-nine parents of 57 children who died of cancer completed questionnaires retrospectively, exploring faith, hope, and sources of coping, and measuring parents' current level of grief and depression. For 19 parents (21%) faith was very important during the palliative phase. The majority of parents remained hopeful for a meaningful time with their child (n=68, 76%); a pain-free death (n=58, 65%); and a cure (n=30, 34%). Their child (n=70, 79%) was parents' main source of coping. Twelve parents (14%) suffered from traumatic grief, and 22 parents (25%) showed symptoms of depression. Parents' faith was not associated with less long-term traumatic grief (OR=0.86, p=0.51) or symptoms of depression (OR=0.95, p=0.74), and parents' hope for a cure was not related to more long-term traumatic grief (OR=1.07, p=0.71) or symptoms of depression (OR=1.12, p=0.47). Faith was important for a minority of parents and was not associated with less long-term traumatic grief or symptoms of depression. The majority of parents remained hopeful. Hope for a cure was not associated with more long-term traumatic grief or symptoms of depression.

  15. Computational Fluid Dynamics Simulation of Flows in an Oxidation Ditch Driven by a New Surface Aerator.

    PubMed

    Huang, Weidong; Li, Kun; Wang, Gan; Wang, Yingzhe

    2013-11-01

    In this article, we present a newly designed inverse umbrella surface aerator and test its performance in driving flow in an oxidation ditch. Results show that it performs better in driving the oxidation ditch than the original design, with higher average velocity and a more uniform flow field. We also present a computational fluid dynamics model for predicting the flow field in an oxidation ditch driven by a surface aerator. An improved momentum source term approach to simulate the flow field of the oxidation ditch driven by an inverse umbrella surface aerator was developed and validated through experiments. Four turbulence models were investigated with the approach, including the standard k-ε model, RNG k-ε model, realizable k-ε model, and Reynolds stress model, and the predicted data were compared with those calculated with the multiple rotating reference frame (MRF) approach and the sliding mesh (SM) approach. Results of the momentum source term approach are in good agreement with the experimental data, and its prediction accuracy is better than that of MRF and close to that of SM. It is also found that the momentum source term approach has lower computational expense, is simpler to preprocess, and is easier to use.
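
    A generic way to realize a momentum source term approach of this kind, shown here only as an illustration and not as the authors' specific implementation, is to add a relaxation-type body force in the cells swept by the aerator that drives the local velocity toward a prescribed target velocity; the target velocity u_t and relaxation time τ are assumed quantities.

```latex
% Illustrative relaxation-type momentum source applied only inside the aerator region:
\[
  \mathbf{S}_m(\mathbf{x}) \;=\;
  \begin{cases}
    \dfrac{\rho\,\bigl(\mathbf{u}_t(\mathbf{x}) - \mathbf{u}(\mathbf{x})\bigr)}{\tau},
      & \mathbf{x} \in \Omega_{\mathrm{aerator}},\\[6pt]
    \mathbf{0}, & \text{otherwise.}
  \end{cases}
\]
```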

  16. Updating source term and atmospheric dispersion simulations for the dose reconstruction in Fukushima Daiichi Nuclear Power Station Accident

    NASA Astrophysics Data System (ADS)

    Nagai, Haruyasu; Terada, Hiroaki; Tsuduki, Katsunori; Katata, Genki; Ota, Masakazu; Furuno, Akiko; Akari, Shusaku

    2017-09-01

    In order to assess the radiological dose to the public resulting from the Fukushima Daiichi Nuclear Power Station (FDNPS) accident in Japan, especially for the early phase of the accident when no measured data are available for that purpose, the spatial and temporal distributions of radioactive materials in the environment are reconstructed by computer simulations. In this study, by refining the source term of radioactive materials discharged into the atmosphere and modifying the atmospheric transport, dispersion and deposition model (ATDM), the atmospheric dispersion simulation of radioactive materials is improved. Then, a database of the spatiotemporal distribution of radioactive materials in the air and on the ground surface is developed from the output of the simulation. This database is used in other studies for dose assessment by coupling with the behavioral patterns of evacuees from the FDNPS accident. By improving the ATDM simulation to use a new meteorological model and a sophisticated deposition scheme, the ATDM simulations reproduced the 137Cs and 131I deposition patterns well. For better reproducibility of dispersion processes, further refinement of the source term was carried out by optimizing it to the improved ATDM simulation using new monitoring data.

  17. Piecewise synonyms for enhanced UMLS source terminology integration.

    PubMed

    Huang, Kuo-Chuan; Geller, James; Halper, Michael; Cimino, James J

    2007-10-11

    The UMLS contains more than 100 source vocabularies and is growing via the integration of others. When integrating a new source, the source terms already in the UMLS must first be found. The easiest approach to this is simple string matching. However, string matching usually does not find all concepts that should be found. A new methodology, based on the notion of piecewise synonyms, for enhancing the process of concept discovery in the UMLS is presented. This methodology is supported by first creating a general synonym dictionary based on the UMLS. Each multi-word source term is decomposed into its component words, allowing for the generation of separate synonyms for each word from the general synonym dictionary. The recombination of these synonyms into new terms creates an expanded pool of matching candidates for terms from the source. The methodology is demonstrated with respect to an existing UMLS source. It shows a 34% improvement over simple string matching.
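
    The recombination step described above amounts to expanding each multi-word source term through per-word synonym sets and taking their Cartesian product. The Python sketch below is a hypothetical illustration of that idea with a made-up synonym dictionary; it is not the authors' UMLS-based implementation.

```python
from itertools import product

# Hypothetical general synonym dictionary (word -> synonym set including the word itself).
SYNONYMS = {
    "kidney": {"kidney", "renal"},
    "failure": {"failure", "insufficiency"},
    "chronic": {"chronic", "long-standing"},
}

def piecewise_candidates(term):
    """Generate candidate matching strings for a multi-word term by substituting
    per-word synonyms and recombining them into new terms."""
    words = term.lower().split()
    per_word = [sorted(SYNONYMS.get(w, {w})) for w in words]
    return {" ".join(combo) for combo in product(*per_word)}

if __name__ == "__main__":
    for candidate in sorted(piecewise_candidates("chronic kidney failure")):
        print(candidate)  # e.g. "chronic renal insufficiency", "long-standing kidney failure", ...
```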

  18. Water Resources Adaptation to Global Changes: Risk Management through Sustainable Infrastructure Planning and Managements

    EPA Science Inventory

    Global changes due to cyclic and long-term climatic variations, demographic changes and economic development, have impacts on the quality and quantity of potable and irrigation source waters. Internal and external climatic forcings, for example, redistribute precipitation season...

  19. Water Resources Adaptation to Global Changes: Risk Management through Sustainable Infrastructure Planning and Management - Paper

    EPA Science Inventory

    Global changes due to cyclic and long-term climatic variations, demographic changes and economic development, have impacts on the quality and quantity of potable and irrigation source waters. Internal and external climatic forcings, for example, redistribute precipitation season...

  20. Building Large Collections of Chinese and English Medical Terms from Semi-Structured and Encyclopedia Websites

    PubMed Central

    Xu, Yan; Wang, Yining; Sun, Jian-Tao; Zhang, Jianwen; Tsujii, Junichi; Chang, Eric

    2013-01-01

    To build large collections of medical terms from semi-structured information sources (e.g. tables, lists, etc.) and encyclopedia sites on the web. The terms are classified into the three semantic categories, Medical Problems, Medications, and Medical Tests, which were used in i2b2 challenge tasks. We developed two systems, one for Chinese and another for English terms. The two systems share the same methodology and use the same software with minimum language dependent parts. We produced large collections of terms by exploiting billions of semi-structured information sources and encyclopedia sites on the Web. The standard performance metric of recall (R) is extended to three different types of Recall to take the surface variability of terms into consideration. They are Surface Recall (R(S)), Object Recall (R(O)), and Surface Head recall (R(H)). We use two test sets for Chinese. For English, we use a collection of terms in the 2010 i2b2 text. Two collections of terms, one for English and the other for Chinese, have been created. The terms in these collections are classified as either of Medical Problems, Medications, or Medical Tests in the i2b2 challenge tasks. The English collection contains 49,249 (Problems), 89,591 (Medications) and 25,107 (Tests) terms, while the Chinese one contains 66,780 (Problems), 101,025 (Medications), and 15,032 (Tests) terms. The proposed method of constructing a large collection of medical terms is both efficient and effective, and, most of all, independent of language. The collections will be made publicly available. PMID:23874426

  1. Building large collections of Chinese and English medical terms from semi-structured and encyclopedia websites.

    PubMed

    Xu, Yan; Wang, Yining; Sun, Jian-Tao; Zhang, Jianwen; Tsujii, Junichi; Chang, Eric

    2013-01-01

    To build large collections of medical terms from semi-structured information sources (e.g. tables, lists, etc.) and encyclopedia sites on the web. The terms are classified into the three semantic categories, Medical Problems, Medications, and Medical Tests, which were used in i2b2 challenge tasks. We developed two systems, one for Chinese and another for English terms. The two systems share the same methodology and use the same software with minimum language dependent parts. We produced large collections of terms by exploiting billions of semi-structured information sources and encyclopedia sites on the Web. The standard performance metric of recall (R) is extended to three different types of Recall to take the surface variability of terms into consideration. They are Surface Recall (R(S)), Object Recall (R(O)), and Surface Head recall (R(H)). We use two test sets for Chinese. For English, we use a collection of terms in the 2010 i2b2 text. Two collections of terms, one for English and the other for Chinese, have been created. The terms in these collections are classified as either of Medical Problems, Medications, or Medical Tests in the i2b2 challenge tasks. The English collection contains 49,249 (Problems), 89,591 (Medications) and 25,107 (Tests) terms, while the Chinese one contains 66,780 (Problems), 101,025 (Medications), and 15,032 (Tests) terms. The proposed method of constructing a large collection of medical terms is both efficient and effective, and, most of all, independent of language. The collections will be made publicly available.

  2. Low birth weight and air pollution in California: Which sources and components drive the risk?

    PubMed

    Laurent, Olivier; Hu, Jianlin; Li, Lianfa; Kleeman, Michael J; Bartell, Scott M; Cockburn, Myles; Escobedo, Loraine; Wu, Jun

    2016-01-01

    Intrauterine growth restriction has been associated with exposure to air pollution, but there is a need to clarify which sources and components are most likely responsible. This study investigated the associations between low birth weight (LBW, <2500g) in term born infants (≥37 gestational weeks) and air pollution by source and composition in California, over the period 2001-2008. Complementary exposure models were used: an empirical Bayesian kriging model for the interpolation of ambient pollutant measurements, a source-oriented chemical transport model (using California emission inventories) that estimated fine and ultrafine particulate matter (PM2.5 and PM0.1, respectively) mass concentrations (4km×4km) by source and composition, a line-source roadway dispersion model at fine resolution, and traffic index estimates. Birth weight was obtained from California birth certificate records. A case-cohort design was used. Five controls per term LBW case were randomly selected (without covariate matching or stratification) from among term births. The resulting datasets were analyzed by logistic regression with a random effect by hospital, using generalized additive mixed models adjusted for race/ethnicity, education, maternal age and household income. In total 72,632 singleton term LBW cases were included. Term LBW was positively and significantly associated with interpolated measurements of ozone but not total fine PM or nitrogen dioxide. No significant association was observed between term LBW and primary PM from all sources grouped together. A positive significant association was observed for secondary organic aerosols. Exposure to elemental carbon (EC), nitrates and ammonium were also positively and significantly associated with term LBW, but only for exposure during the third trimester of pregnancy. Significant positive associations were observed between term LBW risk and primary PM emitted by on-road gasoline and diesel or by commercial meat cooking sources. Primary PM from wood burning was inversely associated with term LBW. Significant positive associations were also observed between term LBW and ultrafine particle numbers modeled with the line-source roadway dispersion model, traffic density and proximity to roadways. This large study based on complementary exposure metrics suggests that not only primary pollution sources (traffic and commercial meat cooking) but also EC and secondary pollutants are risk factors for term LBW. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Regional and temporal differences in nitrate trends discerned from long-term water quality monitoring data

    USGS Publications Warehouse

    Stets, Edward G.; Kelly, Valerie J.; Crawford, Charles G.

    2015-01-01

    Riverine nitrate (NO3) is a well-documented driver of eutrophication and hypoxia in coastal areas. The development of the elevated river NO3 concentration is linked to anthropogenic inputs from municipal, agricultural, and atmospheric sources. The intensity of these sources has varied regionally, through time, and in response to multiple causes such as economic drivers and policy responses. This study uses long-term water quality, land use, and other ancillary data to further describe the evolution of river NO3 concentrations at 22 monitoring stations in the United States (U.S.). The stations were selected for long-term data availability and to represent a range of climate and land-use conditions. We examined NO3 at the monitoring stations, using a flow-weighting scheme meant to account for interannual flow variability allowing greater focus on river chemical conditions. River NO3 concentrations increased strongly during 1945-1980 at most of the stations and have remained elevated, but stopped increasing during 1981-2008. NO3 increased to a greater extent at monitoring stations in the Midwest U.S. and less so at those in the Eastern and Western U.S. We discuss 20th Century agricultural development in the U.S. and demonstrate that regional differences in NO3 concentration patterns were strongly related to an agricultural index developed using principal components analysis. This unique century-scale dataset adds to our understanding of long-term NO3 patterns in the U.S.

  4. Nodal Green’s Function Method Singular Source Term and Burnable Poison Treatment in Hexagonal Geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A.A. Bingham; R.M. Ferrer; A.M. Ougouag

    2009-09-01

    An accurate and computationally efficient two or three-dimensional neutron diffusion model will be necessary for the development, safety parameters computation, and fuel cycle analysis of a prismatic Very High Temperature Reactor (VHTR) design under Next Generation Nuclear Plant Project (NGNP). For this purpose, an analytical nodal Green’s function solution for the transverse integrated neutron diffusion equation is developed in two and three-dimensional hexagonal geometry. This scheme is incorporated into HEXPEDITE, a code first developed by Fitzpatrick and Ougouag. HEXPEDITE neglects non-physical discontinuity terms that arise in the transverse leakage due to the transverse integration procedure application to hexagonal geometry and cannot account for the effects of burnable poisons across nodal boundaries. The test code being developed for this document accounts for these terms by maintaining an inventory of neutrons by using the nodal balance equation as a constraint of the neutron flux equation. The method developed in this report is intended to restore neutron conservation and increase the accuracy of the code by adding these terms to the transverse integrated flux solution and applying the nodal Green’s function solution to the resulting equation to derive a semi-analytical solution.

  5. High-order scheme for the source-sink term in a one-dimensional water temperature model

    PubMed Central

    Jing, Zheng; Kang, Ling

    2017-01-01

    The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method was assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data. PMID:28264005

  6. High-order scheme for the source-sink term in a one-dimensional water temperature model.

    PubMed

    Jing, Zheng; Kang, Ling

    2017-01-01

    The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method was assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data.
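
    The two-step operator splitting described above can be sketched, under simplifying assumptions, as follows: a first-order explicit update of the source-sink term (the paper uses a high-order scheme here) followed by a Crank-Nicolson update of vertical diffusion. All variable names, parameter values, and the insulated boundary conditions are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def step_split(T, dt, dz, kappa, source):
    """One operator-splitting step for T_t = kappa * T_zz + source(z):
    (1) explicit update of the source-sink term (a simplification of the
        high-order source-term scheme described in the abstract), then
    (2) Crank-Nicolson update of the diffusion term with insulated boundaries."""
    n = T.size
    T_star = T + dt * source                     # step 1: source-sink term

    r = kappa * dt / dz**2                       # step 2: Crank-Nicolson diffusion
    L = np.zeros((n, n))
    for i in range(1, n - 1):                    # interior second-difference stencil
        L[i, i - 1], L[i, i], L[i, i + 1] = 1.0, -2.0, 1.0
    L[0, 0], L[0, 1] = -1.0, 1.0                 # zero-flux boundaries (mirror-cell approximation)
    L[-1, -1], L[-1, -2] = -1.0, 1.0
    A = np.eye(n) - 0.5 * r * L
    B = np.eye(n) + 0.5 * r * L
    return np.linalg.solve(A, B @ T_star)

if __name__ == "__main__":
    z = np.linspace(0.0, 10.0, 101)              # depth grid (m)
    T = np.full_like(z, 15.0)                    # initial water temperature (deg C)
    heating = 2e-4 * np.exp(-0.5 * z)            # toy solar heating rate (deg C per s)
    for _ in range(600):                         # integrate 10 minutes with dt = 1 s
        T = step_split(T, dt=1.0, dz=z[1] - z[0], kappa=1e-4, source=heating)
    print(f"surface T = {T[0]:.3f} C, bottom T = {T[-1]:.3f} C")
```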

  7. Developing Performance Estimates for High Precision Astrometry with TMT

    NASA Astrophysics Data System (ADS)

    Schoeck, Matthias; Do, Tuan; Ellerbroek, Brent; Herriot, Glen; Meyer, Leo; Suzuki, Ryuji; Wang, Lianqi; Yelda, Sylvana

    2013-12-01

    Adaptive optics on Extremely Large Telescopes will open up many new science cases or expand existing science into regimes unattainable with the current generation of telescopes. One example of this is high-precision astrometry, which has requirements in the range from 10 to 50 micro-arc-seconds for some instruments and science cases. Achieving these requirements imposes stringent constraints on the design of the entire observatory, but also on the calibration procedures, observing sequences and the data analysis techniques. This paper summarizes our efforts to develop a top down astrometry error budget for TMT. It is predominantly developed for the first-light AO system, NFIRAOS, and the IRIS instrument, but many terms are applicable to other configurations as well. Astrometry error sources are divided into 5 categories: Reference source and catalog errors, atmospheric refraction correction errors, other residual atmospheric effects, opto-mechanical errors and focal plane measurement errors. Results are developed in parametric form whenever possible. However, almost every error term in the error budget depends on the details of the astrometry observations, such as whether absolute or differential astrometry is the goal, whether one observes a sparse or crowded field, what the time scales of interest are, etc. Thus, it is not possible to develop a single error budget that applies to all science cases and separate budgets are developed and detailed for key astrometric observations. Our error budget is consistent with the requirements for differential astrometry of tens of micro-arc-seconds for certain science cases. While no show stoppers have been found, the work has resulted in several modifications to the NFIRAOS optical surface specifications and reference source design that will help improve the achievable astrometry precision even further.

  8. Southeast Asia Report.

    DTIC Science & Technology

    1987-04-27

    have imported these materials at prices higher than the material prices set by the state. The price of coconut oil in the southern provinces has...important source of exports. Here it is necessary to pay attention to developing the various kinds of food products: vegetables, beans, peanuts, oil ...short-term industrial crops, and must effectively develop such long-range industrial crops as coffee, tea, pepper, coconuts, etc., to fully utilize

  9. Influence of regeneration method and tissue source on the frequency of somatic variation in Populus to infection by Septoria musiva

    Treesearch

    Michael E. Ostry; Ronald L. Hackett; Charles H. Michler; R. Serres; B. McCown

    1994-01-01

    Septoria leaf spot and canker are serious diseases of many hybrid poplar clones in plantations established for biomass production. Developing resistant clones through breeding is the best long-term strategy to minimize tree damage caused by this disease. Tissue culture and somaclonal selection techniques may reduce the time needed to develop disease resistance in...

  10. Guidelines for Analysis of Indigeneous and Private Health Care Planning in Developing Countries. International Health Planning Methods Series, Volume 6.

    ERIC Educational Resources Information Center

    Scrimshaw, Susan

    This guidebook is both a practical tool and a source book to help health planners assess the importance, extent, and impact of indigenous and private sector medical systems in developing nations. Guidelines are provided for assessment in terms of: use patterns; the meaning and importance to users of various available health services; and ways of…

  11. Parameterizing unresolved obstacles with source terms in wave modeling: A real-world application

    NASA Astrophysics Data System (ADS)

    Mentaschi, Lorenzo; Kakoulaki, Georgia; Vousdoukas, Michalis; Voukouvalas, Evangelos; Feyen, Luc; Besio, Giovanni

    2018-06-01

    Parameterizing the dissipative effects of small, unresolved coastal features is fundamental to improving the skill of wave models. The established technique to deal with this problem consists in reducing the amount of energy advected within the propagation scheme, and is currently available only for regular grids. To find a more general approach, Mentaschi et al., 2015b formulated a technique based on source terms, and validated it on synthetic case studies. This technique separates the parameterization of the unresolved features from the energy advection, and can therefore be applied to any numerical scheme and to any type of mesh. Here we developed an open-source library for the estimation of the transparency coefficients needed by this approach, from bathymetric data and for any type of mesh. The spectral wave model WAVEWATCH III was used to show that in a real-world domain, such as the Caribbean Sea, the proposed approach has skill comparable to, and sometimes better than, that of the established propagation-based technique.
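
    A generic way to express such a parameterization, given here purely as an illustration and not necessarily in the exact form used by the cited work, is a local sink term proportional to the spectral energy, with a rate set by the blocked fraction of energy per cell crossing; α(θ) denotes the direction-dependent transparency of the cell estimated from high-resolution bathymetry, c_g the group velocity, and Δx the cell size along the propagation direction.

```latex
% Illustrative sink term for unresolved obstacles acting on the spectrum E(f, \theta):
\[
  S_{\mathrm{obs}}(f,\theta)
  \;=\;
  -\,\frac{c_g(f)\,\bigl(1-\alpha(\theta)\bigr)}{\Delta x}\;E(f,\theta),
\]
% so that energy crossing one cell is attenuated by roughly exp(-(1-\alpha)),
% i.e. close to the transparency \alpha when the blocked fraction is small.
```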

  12. The sound of moving bodies. Ph.D. Thesis - Cambridge Univ.

    NASA Technical Reports Server (NTRS)

    Brentner, Kenneth Steven

    1990-01-01

    The importance of the quadrupole source term in the Ffowcs Williams and Hawkings (FWH) equation was addressed. The quadrupole source contains fundamental components of the complete fluid mechanics problem, which are ignored only at the risk of error. The results made it clear that any application of the acoustic analogy should begin with all of the source terms in the FWH theory. The direct calculation of the acoustic field as part of the complete unsteady fluid mechanics problem using CFD is considered. It was shown that aeroelastic calculations can indeed be made with CFD codes. The results indicate that the acoustic field is the most susceptible component of the computation to numerical error. Therefore, the ability to measure the damping of acoustic waves is absolutely essential to the development of acoustic computations. Essential groundwork for a new approach to the problem of sound generation by moving bodies is presented. This new computational acoustic approach holds the promise of solving many problems hitherto pushed aside.

  13. A point-source outbreak of Coccidioidomycosis among a highway construction crew.

    PubMed

    Nicas, Mark

    2018-01-01

    Coccidioidomycosis is an infection caused by inhaling spores of the soil fungus Coccidioides immitis (hereafter termed Cocci). Cocci is endemic in certain areas of California. When soil containing the fungus is disturbed, as during earth-moving activities, respirable Cocci spores can become airborne and be inhaled by persons in the vicinity. This article describes a cluster of seven Coccidioidomycosis cases among a highway construction crew that occurred in June/July 2008 in Kern County, CA, which is among the most highly endemic regions for Cocci in California. The exposures spanned no more than seven work days, and illness developed within two to three weeks of the exposures. Given the common source of exposure (soil dust generated at the work site) and the multiple cases occurring close in time, the cluster can also be termed a "point-source outbreak." The contractor was not informed of the infection risk and did not take adequate precautions against dust exposure. Appropriate engineering/administrative controls and respiratory protection are discussed.

  14. EPA observational studies of children's respiratory health in the Detroit and Dearborn, Michigan

    EPA Science Inventory

    Previous research has suggested that long-term exposures to mobile-source emissions might be associated with the development of allergies and asthma in children. Between 2004 and 2007, EPA scientists successfully conducted nested observational studies of children aged 7-12 years ...

  15. The Current Status of Behaviorism and Neurofeedback

    ERIC Educational Resources Information Center

    Fultz, Dwight E.

    2009-01-01

    There appears to be no dominant conceptual model for the process and outcomes of neurofeedback among practitioners or manufacturers. Behaviorists are well-positioned to develop a neuroscience-based source code in which neural activity is described in behavioral terms, providing a basis for behavioral conceptualization and education of…

  16. Multisource inverse-geometry CT. Part I. System concept and development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Man, Bruno, E-mail: deman@ge.com; Harrison, Dan

    Purpose: This paper presents an overview of multisource inverse-geometry computed tomography (IGCT) as well as the development of a gantry-based research prototype system. The development of the distributed x-ray source is covered in a companion paper [V. B. Neculaes et al., “Multisource inverse-geometry CT. Part II. X-ray source design and prototype,” Med. Phys. 43, 4617–4627 (2016)]. While progress updates of this development have been presented at conferences and in journal papers, this paper is the first comprehensive overview of the multisource inverse-geometry CT concept and prototype. The authors also provide a review of all previous IGCT related publications. Methods: The authors designed and implemented a gantry-based 32-source IGCT scanner with 22 cm field-of-view, 16 cm z-coverage, 1 s rotation time, 1.09 × 1.024 mm detector cell size, as low as 0.4 × 0.8 mm focal spot size and 80–140 kVp x-ray source voltage. The system is built using commercially available CT components and a custom made distributed x-ray source. The authors developed dedicated controls, calibrations, and reconstruction algorithms and evaluated the system performance using phantoms and small animals. Results: The authors performed IGCT system experiments and demonstrated tube current up to 125 mA with up to 32 focal spots. The authors measured a spatial resolution of 13 lp/cm at 5% cutoff. The scatter-to-primary ratio is estimated 62% for a 32 cm water phantom at 140 kVp. The authors scanned several phantoms and small animals. The initial images have relatively high noise due to the low x-ray flux levels but minimal artifacts. Conclusions: IGCT has unique benefits in terms of dose-efficiency and cone-beam artifacts, but comes with challenges in terms of scattered radiation and x-ray flux limits. To the authors’ knowledge, their prototype is the first gantry-based IGCT scanner. The authors summarized the design and implementation of the scanner and the authors presented results with phantoms and small animals.

  17. Inexpensive, Low Power, Open-Source Data Logging in the Field

    NASA Astrophysics Data System (ADS)

    Sandell, C. T.; Wickert, A. D.

    2016-12-01

    Collecting a robust data set of environmental conditions with commercial equipment is often cost prohibitive. I present the ALog, a general-purpose, inexpensive, low-power, open-source data logger that has proven its durability on long-term deployments in the harsh conditions of high altitude glaciers and humid river deltas. The ALog was developed to fill the need for capable, rugged, easy-to-use, inexpensive, open-source hardware targeted at long-term remote deployment in nearly any environment. Building on the popular Arduino platform, the hardware features a high-precision clock, full-size SD card slot for high-volume data storage, screw terminals, six analog inputs, two digital inputs, one digital interrupt, 3.3V and 5V power outputs, and SPI and I2C communication capability. The design is focused on extremely low power consumption, allowing the ALog to be deployed for years on a single set of common alkaline batteries. The power efficiency of the ALog eliminates the difficulties associated with field power collection, including additional hardware and installation costs, dependence on weather conditions, possible equipment failure, and the transport of bulky/heavy equipment to a remote site. Battery power increases the range of suitable data collection sites (e.g., sites too shaded for photovoltaics) and allows for low-profile installation options (including underground). The ALog has gone through continuous development with over four years of successful data collection in hydrologic field research. Over this time, software support for a wide range of sensors has been made available, such as ultrasonic rangefinders (for water level, snow accumulation and glacial melt), temperature sensors (air and groundwater), humidity sensors, pyranometers, inclinometers, rain gauges, soil moisture and water potential sensors, resistance-based tools to measure frost heave, and cameras that trigger on events. The software developed for use with the ALog allows simple integration of established commercial sensors, including example implementation code so users with limited programming knowledge can get up and running with ease. All development files, including design schematics, circuit board layouts, and source code files, are open-source to further eliminate barriers to its use and allow community development contribution.

  18. Multisource inverse-geometry CT. Part I. System concept and development

    PubMed Central

    De Man, Bruno; Uribe, Jorge; Baek, Jongduk; Harrison, Dan; Yin, Zhye; Longtin, Randy; Roy, Jaydeep; Waters, Bill; Wilson, Colin; Short, Jonathan; Inzinna, Lou; Reynolds, Joseph; Neculaes, V. Bogdan; Frutschy, Kristopher; Senzig, Bob; Pelc, Norbert

    2016-01-01

    Purpose: This paper presents an overview of multisource inverse-geometry computed tomography (IGCT) as well as the development of a gantry-based research prototype system. The development of the distributed x-ray source is covered in a companion paper [V. B. Neculaes et al., “Multisource inverse-geometry CT. Part II. X-ray source design and prototype,” Med. Phys. 43, 4617–4627 (2016)]. While progress updates of this development have been presented at conferences and in journal papers, this paper is the first comprehensive overview of the multisource inverse-geometry CT concept and prototype. The authors also provide a review of all previous IGCT related publications. Methods: The authors designed and implemented a gantry-based 32-source IGCT scanner with 22 cm field-of-view, 16 cm z-coverage, 1 s rotation time, 1.09 × 1.024 mm detector cell size, as low as 0.4 × 0.8 mm focal spot size and 80–140 kVp x-ray source voltage. The system is built using commercially available CT components and a custom-made distributed x-ray source. The authors developed dedicated controls, calibrations, and reconstruction algorithms and evaluated the system performance using phantoms and small animals. Results: The authors performed IGCT system experiments and demonstrated tube current up to 125 mA with up to 32 focal spots. The authors measured a spatial resolution of 13 lp/cm at 5% cutoff. The scatter-to-primary ratio is estimated at 62% for a 32 cm water phantom at 140 kVp. The authors scanned several phantoms and small animals. The initial images have relatively high noise due to the low x-ray flux levels but minimal artifacts. Conclusions: IGCT has unique benefits in terms of dose-efficiency and cone-beam artifacts, but comes with challenges in terms of scattered radiation and x-ray flux limits. To the authors’ knowledge, their prototype is the first gantry-based IGCT scanner. The authors summarized the design and implementation of the scanner and presented results with phantoms and small animals. PMID:27487877

  19. Monitoring Heritage Buildings with Open Source Hardware Sensors: A Case Study of the Mosque-Cathedral of Córdoba

    PubMed Central

    Mesas-Carrascosa, Francisco Javier; Verdú Santano, Daniel; Meroño de Larriva, Jose Emilio; Ortíz Cordero, Rafael; Hidalgo Fernández, Rafael Enrique; García-Ferrer, Alfonso

    2016-01-01

    A number of physical factors can adversely affect cultural heritage. Therefore, monitoring parameters involved in the deterioration process, principally temperature and relative humidity, is useful for preventive conservation. In this study, a total of 15 microclimate stations using open source hardware were developed and stationed at the Mosque-Cathedral of Córdoba, which is registered with UNESCO for its outstanding universal value, to assess the behavior of interior temperature and relative humidity in relation to exterior weather conditions, public hours and interior design. Long-term monitoring of these parameters is of interest in terms of preservation and reducing the costs of future conservation strategies. Results from monitoring are presented to demonstrate the usefulness of this system. PMID:27690056

  20. Fish-Eye Observing with Phased Array Radio Telescopes

    NASA Astrophysics Data System (ADS)

    Wijnholds, S. J.

    The radio astronomical community is currently developing and building several new radio telescopes based on phased array technology. These telescopes provide a large field of view that may in principle span a full hemisphere. This makes calibration and imaging very challenging tasks due to the complex source structures and direction dependent radio wave propagation effects. In this thesis, calibration and imaging methods are developed based on least squares estimation of instrument and source parameters. Monte Carlo simulations and actual observations with several prototypes show that this model-based approach provides statistically and computationally efficient solutions. The error analysis provides a rigorous mathematical framework to assess the imaging performance of current and future radio telescopes in terms of the effective noise, which is the combined effect of propagated calibration errors, noise in the data and source confusion.

  1. Aeromicrobiology/air quality

    USGS Publications Warehouse

    Andersen, Gary L.; Frisch, A.S.; Kellogg, Christina A.; Levetin, E.; Lighthart, Bruce; Paterno, D.

    2009-01-01

    The most prevalent microorganisms (viruses, bacteria, and fungi) are introduced into the atmosphere from many anthropogenic sources, such as agricultural, industrial, and urban activities, termed microbial air pollution (MAP), as well as from natural sources. The latter include soil, vegetation, and ocean surfaces that have been disturbed by atmospheric turbulence. The airborne concentrations range from nil to great numbers and change as functions of time of day, season, location, and upwind sources. While airborne, they may settle out immediately or be transported great distances. Further, most viable airborne cells can be rendered nonviable due to temperature effects, dehydration or rehydration, UV radiation, and/or air pollution effects. Mathematical microbial survival models that simulate these effects have been developed.

  2. Characterization of the Multi-Blade 10B-based detector at the CRISP reflectometer at ISIS for neutron reflectometry at ESS

    NASA Astrophysics Data System (ADS)

    Piscitelli, F.; Mauri, G.; Messi, F.; Anastasopoulos, M.; Arnold, T.; Glavic, A.; Höglund, C.; Ilves, T.; Lopez Higuera, I.; Pazmandi, P.; Raspino, D.; Robinson, L.; Schmidt, S.; Svensson, P.; Varga, D.; Hall-Wilton, R.

    2018-05-01

    The Multi-Blade is a Boron-10-based gaseous thermal neutron detector developed to face the challenges arising in neutron reflectometry at neutron sources. Neutron reflectometers are challenging instruments in terms of instantaneous counting rate and spatial resolution. This detector has been designed according to the requirements given by the reflectometers at the European Spallation Source (ESS) in Sweden. The Multi-Blade has been installed and tested on the CRISP reflectometer at the ISIS neutron and muon source in the U.K. The results of the detailed detector characterization are discussed in this manuscript.

  3. Recent skyshine calculations at Jefferson Lab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Degtyarenko, P.

    1997-12-01

    New calculations of the skyshine dose distribution of neutrons and secondary photons have been performed at Jefferson Lab using the Monte Carlo method. The dose dependence on neutron energy, distance to the neutron source, polar angle of a source neutron, and azimuthal angle between the observation point and the momentum direction of a source neutron have been studied. The azimuthally asymmetric term in the skyshine dose distribution is shown to be important in the dose calculations around high-energy accelerator facilities. A parameterization formula and corresponding computer code have been developed which can be used for detailed calculations of the skyshine dose maps.
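
    The abstract does not reproduce the parameterization itself. As a purely illustrative sketch of the kind of functional form commonly fitted to Monte Carlo skyshine results (an amplitude times an exponential attenuation over an inverse-square fall-off), one might write the following; the function name and the form are assumptions, not the Jefferson Lab formula:

        import numpy as np

        def skyshine_dose(r, a, lam):
            """Generic skyshine dose-versus-distance shape D(r) = a * exp(-r/lam) / r**2.

            a and lam would be fitted to Monte Carlo results for a given source
            spectrum, polar angle, and geometry; this is only a common functional
            shape, not the parameterization developed at Jefferson Lab.
            """
            r = np.asarray(r, dtype=float)
            return a * np.exp(-r / lam) / r**2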

  4. Joint optimization of source, mask, and pupil in optical lithography

    NASA Astrophysics Data System (ADS)

    Li, Jia; Lam, Edmund Y.

    2014-03-01

    Mask topography effects need to be taken into consideration for more advanced resolution enhancement techniques in optical lithography. However, a rigorous 3D mask model achieves high accuracy at a large computational cost. This work develops a combined source, mask and pupil optimization (SMPO) approach by taking advantage of the fact that pupil phase manipulation is capable of partially compensating for mask topography effects. We first design the pupil wavefront function by incorporating primary and secondary spherical aberration through the coefficients of the Zernike polynomials, and achieve an optimal source-mask pair under the condition of an aberrated pupil. Evaluations against conventional source mask optimization (SMO) without incorporating pupil aberrations show that SMPO provides improved performance in terms of pattern fidelity and process window sizes.
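
    A minimal sketch of how primary and secondary spherical aberration can be injected into a pupil wavefront through Zernike radial polynomials, as described above; the function name, the coefficient units, and the omission of Zernike normalization factors are illustrative assumptions rather than the paper's implementation:

        import numpy as np

        def pupil_function(rho, c_primary=0.0, c_secondary=0.0):
            """Pupil wavefront with primary (Z_4^0) and secondary (Z_6^0) spherical aberration.

            rho : normalized radial pupil coordinate (rho <= 1); the m = 0 spherical
                  terms are rotationally symmetric, so no azimuthal angle is needed.
            c_primary, c_secondary : aberration coefficients in waves (unnormalized).
            """
            z40 = 6 * rho**4 - 6 * rho**2 + 1                    # primary spherical, radial part
            z60 = 20 * rho**6 - 30 * rho**4 + 12 * rho**2 - 1    # secondary spherical, radial part
            phase = 2 * np.pi * (c_primary * z40 + c_secondary * z60)
            return np.where(rho <= 1.0, np.exp(1j * phase), 0.0)

    In a joint optimization, c_primary and c_secondary would be varied together with the source and mask variables when evaluating pattern fidelity and process window size.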

  5. Characterization and long term operation of a novel superconducting undulator with 15 mm period length in a synchrotron light source

    NASA Astrophysics Data System (ADS)

    Casalbuoni, S.; Cecilia, A.; Gerstl, S.; Glamann, N.; Grau, A. W.; Holubek, T.; Meuter, C.; de Jauregui, D. Saez; Voutta, R.; Boffo, C.; Gerhard, Th.; Turenne, M.; Walter, W.

    2016-11-01

    A new cryogen-free, full-scale (1.5 m long) superconducting undulator with a period length of 15 mm (SCU15) has been successfully tested in the ANKA storage ring. This represents a very important milestone in the development of superconducting undulators for third- and fourth-generation light sources, carried out by the collaboration between the Karlsruhe Institute of Technology and the industrial partner Babcock Noell GmbH. SCU15 is the first full-length device worldwide that, with beam, reaches a higher peak field than would be expected from an ideal cryogenic permanent magnet undulator of the same geometry (vacuum gap and period length) built with the best available material, PrFeB. After a summary of the design and main parameters of the device, we present the characterization in terms of spectral properties and the long-term operation of the SCU15 in the ANKA storage ring.

  6. 26 CFR 1.737-1 - Recognition of precontribution gain.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Property A1 and Property A2 is long-term, U.S.-source capital gain or loss. The character of gain on Property A3 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real... long-term, U.S.-source capital gain ($10,000 gain on Property A1 and $8,000 loss on Property A2) and $1...

  7. Source term model evaluations for the low-level waste facility performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yim, M.S.; Su, S.I.

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  8. Fate of hydrocarbon pollutants in source and non-source control sustainable drainage systems.

    PubMed

    Roinas, Georgios; Mant, Cath; Williams, John B

    2014-01-01

    Sustainable drainage (SuDs) is an established method for managing runoff from developments, and source control is part of accepted design philosophy. However, there are limited studies into the contribution source control makes to pollutant removal, especially for roads. This study examines organic pollutants, total petroleum hydrocarbons (TPH) and polycyclic aromatic hydrocarbons (PAHs), in paired source and non-source control full-scale SuDs systems. Sites were selected to cover local roads, trunk roads and housing developments, with a range of SuDs, including porous asphalt, swales, detention basins and ponds. Soil and water samples were taken bi-monthly over 12 months to assess pollutant loads. Results show first flush patterns in storm events for solids, but not for TPH. The patterns of removal for specific PAHs were also different, reflecting varying physico-chemical properties. The potential of trunk roads for pollution was illustrated by peak runoff for TPH of > 17,000 μg/l. Overall there was no significant difference between pollutant loads from source and non-source control systems, but the dynamic nature of runoff means that longer-term data are required. The outcomes of this project will increase understanding of organic pollutants behaviour in SuDs. This will provide design guidance about the most appropriate systems for treating these pollutants.

  9. Numerical and experimental evaluations of the flow past nested chevrons

    NASA Technical Reports Server (NTRS)

    Foss, J. F.; Foss, J. K.; Spalart, P. R.

    1989-01-01

    An effort is made to contribute to the development of CFD by relating the successful use of vortex dynamics in the computation of the pressure drop past a planar array of chevron-shaped obstructions. An ensemble of results was used to compute the loss coefficient k, stimulating an experimental program for the assessment of the measured loss coefficient for the same geometry. The most provocative result of this study has been the representation of kinetic energy production in terms of vorticity source terms.

  10. SoS Navigator 2.0: A Context-Based Approach to System-of-Systems Challenges

    DTIC Science & Technology

    2008-06-01

    in a Postindustrial Age. MIT Press, 1984. [Kolb 1984] Kolb, David A. Experiential Learning: Experience as the Source of Learning and Development...terms of experiential learning, and the work of Rosen [Rosen 1991] in terms of the relational approach to understanding anticipative systems. Our...Supporting Techniques and Tools; The Learning/Transformation Cycle; Summary of SoS Navigator Processes and Techniques; Case Summaries

  11. Photovoltaic village power application: Assessment of the near-term market

    NASA Technical Reports Server (NTRS)

    Rosenblum, L.; Bifano, W. J.; Poley, W. A.; Scudder, L. R.

    1978-01-01

    The village power application represents a potential market for photovoltaics. The price of energy for photovoltaic systems was compared to that of utility line extensions and diesel generators. The potential domestic demand was defined in both the government and commercial sectors. The foreign demand and sources of funding for village power systems in the developing countries were also discussed briefly. It was concluded that a near-term domestic market of at least 12 MW (peak) and a foreign market of about 10 GW (peak) exist.

  12. Photovoltaic water pumping applications: Assessment of the near-term market

    NASA Technical Reports Server (NTRS)

    Rosenblum, L.; Bifano, W. J.; Scudder, L. R.; Poley, W. A.; Cusick, J. P.

    1978-01-01

    Water pumping applications represent a potential market for photovoltaics. The price of energy for photovoltaic systems was compared to that of utility line extensions and diesel generators. The potential domestic demand was defined in the government, commercial/institutional and public sectors. The foreign demand and sources of funding for water pumping systems in the developing countries were also discussed briefly. It was concluded that a near-term domestic market of at least 240 megawatts and a foreign market of about 6 gigawatts exist.

  13. SISSY: An efficient and automatic algorithm for the analysis of EEG sources based on structured sparsity.

    PubMed

    Becker, H; Albera, L; Comon, P; Nunes, J-C; Gribonval, R; Fleureau, J; Guillotel, P; Merlet, I

    2017-08-15

    Over the past decades, a multitude of different brain source imaging algorithms have been developed to identify the neural generators underlying the surface electroencephalography measurements. While most of these techniques focus on determining the source positions, only a small number of recently developed algorithms provides an indication of the spatial extent of the distributed sources. In a recent comparison of brain source imaging approaches, the VB-SCCD algorithm has been shown to be one of the most promising algorithms among these methods. However, this technique suffers from several problems: it leads to amplitude-biased source estimates, it has difficulties in separating close sources, and it has a high computational complexity due to its implementation using second order cone programming. To overcome these problems, we propose to include an additional regularization term that imposes sparsity in the original source domain and to solve the resulting optimization problem using the alternating direction method of multipliers. Furthermore, we show that the algorithm yields more robust solutions by taking into account the temporal structure of the data. We also propose a new method to automatically threshold the estimated source distribution, which permits delineation of the active brain regions. The new algorithm, called Source Imaging based on Structured Sparsity (SISSY), is analyzed by means of realistic computer simulations and is validated on the clinical data of four patients. Copyright © 2017 Elsevier Inc. All rights reserved.
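
    As a simplified illustration of the optimization machinery mentioned above, the sketch below applies the alternating direction method of multipliers (ADMM) to an L1-regularized (sparsity-only) least-squares problem; the full SISSY algorithm additionally includes the spatial-gradient (structured sparsity) term, the temporal structure, and the automatic thresholding step, all omitted here, and every name is illustrative:

        import numpy as np

        def soft_threshold(v, k):
            return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

        def admm_sparse_inverse(A, b, lam=0.1, rho=1.0, n_iter=200):
            """ADMM for min_x 0.5*||Ax - b||^2 + lam*||x||_1 (sparsity term only)."""
            m, n = A.shape
            x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
            Atb = A.T @ b
            L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))   # factor once, reuse every iteration
            for _ in range(n_iter):
                # x-update: quadratic subproblem solved with the cached Cholesky factorization
                x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
                z = soft_threshold(x + u, lam / rho)            # sparsity-promoting proximal step
                u = u + x - z                                   # scaled dual update
            return z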

  14. Bayesian source term determination with unknown covariance of measurements

    NASA Astrophysics Data System (ADS)

    Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav

    2017-04-01

    Determination of a source term of release of a hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimation of the source term in the conventional linear inverse problem, y = Mx, where the relationship between the vector of observations y and the unknown source term x is described using the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as the optimization problem of minimizing, over x, (y - Mx)^T R^{-1} (y - Mx) + x^T B^{-1} x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. There are different types of regularization arising for different choices of the matrices R and B; for example, Tikhonov regularization takes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix of the likelihood R is also unknown. We consider two potential choices of the structure of the matrix R. The first is a diagonal matrix and the second is a locally correlated structure using information on the topology of the measuring network. Since the inference of the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated on an application of the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
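
    For fixed covariance matrices R and B, the quadratic objective above has a closed-form minimizer; the sketch below (Python/NumPy, illustrative names) shows that single step, whereas the full method wraps it in a variational Bayes loop that also updates R and B:

        import numpy as np

        def estimate_source_term(M, y, R, B):
            """Minimize (y - Mx)^T R^-1 (y - Mx) + x^T B^-1 x in closed form.

            M : (m, n) source-receptor-sensitivity (SRS) matrix
            y : (m,)   observations
            R : (m, m) measurement-error covariance
            B : (n, n) prior (regularization) covariance of the source term
            """
            Ri, Bi = np.linalg.inv(R), np.linalg.inv(B)
            # Setting the gradient to zero gives (M^T R^-1 M + B^-1) x = M^T R^-1 y
            return np.linalg.solve(M.T @ Ri @ M + Bi, M.T @ Ri @ y)

        # Tikhonov regularization corresponds to R = I and B = I / alpha:
        rng = np.random.default_rng(0)
        M = rng.normal(size=(50, 20))
        x_true = np.abs(rng.normal(size=20))              # releases are non-negative
        y = M @ x_true + 0.01 * rng.normal(size=50)       # noisy synthetic observations
        x_hat = estimate_source_term(M, y, np.eye(50), 10.0 * np.eye(20))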

  15. Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2005-01-01

    A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparing with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared to the two-dimensional plate using a steady mass flow boundary condition, which was used to simulate a steady micro jet. The model was also compared to two three-dimensional flat plate cases using a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
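
    The basic idea, adding a jet's mass flow and momentum to the discretized equations in the cells it occupies, can be sketched as follows; this is not the OVERFLOW implementation, and the function name, the uniform per-cell splitting, and the single momentum component are illustrative assumptions:

        import numpy as np

        def apply_jet_source(rhs_mass, rhs_mom, jet_cells, mdot, u_jet, cell_volume):
            """Deposit a steady micro jet's mass flow and momentum as volume sources.

            Instead of resolving the jet orifice with the grid, the jet's mass flow
            (mdot) and momentum flux (mdot * u_jet) are added to the right-hand sides
            of the continuity and momentum equations in the cells the jet occupies.
            """
            per_cell = 1.0 / len(jet_cells)
            for c in jet_cells:
                rhs_mass[c] += mdot * per_cell / cell_volume
                rhs_mom[c] += mdot * u_jet * per_cell / cell_volume
            return rhs_mass, rhs_mom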

  16. Implementation issues of the nearfield equivalent source imaging microphone array

    NASA Astrophysics Data System (ADS)

    Bai, Mingsian R.; Lin, Jia-Hong; Tseng, Chih-Wen

    2011-01-01

    This paper revisits a nearfield microphone array technique termed nearfield equivalent source imaging (NESI) proposed previously. In particular, various issues concerning the implementation of the NESI algorithm are examined. The NESI can be implemented in both the time domain and the frequency domain. Acoustical variables including sound pressure, particle velocity, active intensity and sound power are calculated by using multichannel inverse filters. Issues concerning sensor deployment are also investigated for the nearfield array. The uniform array outperformed a random array previously optimized for far-field imaging, which contradicts the conventional wisdom in far-field arrays. For applications in which only a patch array with scarce sensors is available, a virtual microphone approach is employed to ameliorate edge effects using extrapolation and to improve imaging resolution using interpolation. To enhance the processing efficiency of the time-domain NESI, an eigensystem realization algorithm (ERA) is developed. Several filtering methods are compared in terms of computational complexity. Significant saving on computations can be achieved using ERA and the frequency-domain NESI, as compared to the traditional method. The NESI technique was also experimentally validated using practical sources including a 125 cc scooter and a wooden box model with a loudspeaker fitted inside. The NESI technique proved effective in identifying the broadband and non-stationary noise produced by these sources.

  17. Programmable solid state atom sources for nanofabrication.

    PubMed

    Han, Han; Imboden, Matthias; Stark, Thomas; del Corro, Pablo G; Pardo, Flavio; Bolle, Cristian A; Lally, Richard W; Bishop, David J

    2015-06-28

    In this paper we discuss the development of a MEMS-based solid state atom source that can provide controllable atom deposition ranging over eight orders of magnitude, from ten atoms per square micron up to hundreds of atomic layers, on a target ∼1 mm away. Using a micron-scale silicon plate as a thermal evaporation source we demonstrate the deposition of indium, silver, gold, copper, iron, aluminum, lead and tin. Because of their small sizes and rapid thermal response times, pulse width modulation techniques are a powerful way to control the atomic flux. Pulsing the source with precise voltages and timing provides control in terms of when and how many atoms get deposited. By arranging many of these devices into an array, one has a multi-material, programmable solid state evaporation source. These micro atom sources are a complementary technology that can enhance the capability of a variety of nano-fabrication techniques.

  18. Introduction to Agricultural Marketing.

    ERIC Educational Resources Information Center

    Futrell, Gene; And Others

    This marketing unit focuses on the importance of forecasting in order for a farm family to develop marketing plans. It describes sources of information and includes a glossary of marketing terms and exercises using both fundamental and technical methods to predict prices in order to improve forecasting ability. The unit is organized in the…

  19. 75 FR 25270 - Administration for Children and Families; Single-Source Program Expansion Supplement Grant

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-07

    ...--(i) To assist refugees in obtaining the skills which are necessary for economic self-sufficiency... the Ethiopian Community Development Council, Inc. (ECDC), located in Arlington, VA. Current economic... collaboration to meet these challenges. Provision of technical assistance is essential to support the long- term...

  20. Monitoring Knowledge Base (MKB)

    EPA Pesticide Factsheets

    The Monitoring Knowledge Base (MKB) is a compilation of emissions measurement and monitoring techniques associated with air pollution control devices, industrial process descriptions, and permitting techniques, including flexible permit development. Using MKB, one can gain a comprehensive understanding of emissions sources, control devices, and monitoring techniques, enabling one to determine appropriate permit terms and conditions.

  1. The telecommunications and data acquisition report

    NASA Technical Reports Server (NTRS)

    Renzetti, N. A. (Editor)

    1982-01-01

    Developments in Earth-based radio technology are reported. The Deep Space Network is discussed in terms of its advanced systems, network and facility engineering and implementation, operations, and energy sources. Problems in pulse communication and radio frequency interference are addressed with emphasis on pulse position modulation and laser beam collimation.

  2. Artificial Intelligence and School Library Media Centers.

    ERIC Educational Resources Information Center

    Young, Robert J.

    1990-01-01

    Discusses developments in artificial intelligence in terms of their impact on school library media centers and the role of media specialists. Possible uses of expert systems, hypertext, and CD-ROM technologies in school media centers are examined and the challenges presented by these technologies are discussed. Fourteen sources of additional…

  3. Validation of Operational Multiscale Environment Model With Grid Adaptivity (OMEGA).

    DTIC Science & Technology

    1995-12-01

    Center for the period of the Chernobyl Nuclear Accident. The physics of the model is tested using National Weather Service Medium Range Forecast data by...Climatology Center for the first three days following the release at the Chernobyl Nuclear Plant. A user-defined source term was developed to simulate

  4. Error sources in passive and active microwave satellite soil moisture over Australia

    USDA-ARS?s Scientific Manuscript database

    Development of a long-term climate record of soil moisture (SM) involves combining historic and present satellite-retrieved SM data sets. This in turn requires a consistent characterization and deep understanding of the systematic differences and errors in the individual data sets, which vary due to...

  5. What Is Information Design?

    ERIC Educational Resources Information Center

    Redish, Janice C. (Ginny)

    2000-01-01

    Defines two meanings of information design: the overall process of developing a successful document; and the way the information is presented on the screen (layout, typography, color, and so forth). Discusses the future importance of both of these meanings of information design, in terms of design for the web and single-sources (planning…

  6. NEXT GENERATION LEACHING TESTS FOR EVALUATING LEACHING OF INORGANIC CONSTITUENTS

    EPA Science Inventory

    In the U.S. as in other countries, there is increased interest in using industrial by-products as alternative or secondary materials, helping to conserve virgin or raw materials. The LEAF and associated test methods are being used to develop the source term for leaching or any i...

  7. NSRD-10: Leak Path Factor Guidance Using MELCOR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Louie, David; Humphries, Larry L.

    Estimates of the source term from a U.S. Department of Energy (DOE) nuclear facility require that the analysts know how to apply the simulation tools used, such as the MELCOR code, particularly for a complicated facility that may include an air ventilation system and other active systems that can influence the environmental pathway of the materials released. DOE has designated MELCOR 1.8.5, an unsupported version, as a DOE ToolBox code in its Central Registry, which includes a leak-path-factor guidance report written in 2004 that did not include experimental validation data. To continue to use this MELCOR version requires additional verification and validation, which may not be feasible from a project cost standpoint. Instead, the most recent MELCOR version should be used. Without developer support and experimental data validation, it is difficult to convince regulators that the calculated source term from the DOE facility is accurate and defensible. This research replaces the obsolete version in the 2004 DOE leak path factor guidance report by using MELCOR 2.1 (the latest version of MELCOR with continuing modeling development and user support) and by including applicable experimental data from the reactor safety arena and from DOE-HDBK-3010. This research provides best practice values used in MELCOR 2.1 specifically for the leak path determination. With these enhancements, the revised leak-path-guidance report should provide confidence to the DOE safety analyst who would be using MELCOR as a source-term determination tool for mitigated accident evaluations.

  8. NASA's Quiet Aircraft Technology Project

    NASA Technical Reports Server (NTRS)

    Whitfield, Charlotte E.

    2004-01-01

    NASA's Quiet Aircraft Technology Project is developing physics-based understanding, models and concepts to discover and realize technology that will, when implemented, achieve the goals of a reduction of one-half in perceived community noise (relative to 1997) by 2007 and a further one-half in the far term. Noise sources generated by both the engine and the airframe are considered, and the effects of engine/airframe integration are accounted for through the propulsion airframe aeroacoustics element. Assessments of the contribution of individual source noise reductions to the reduction in community noise are developed to guide the work and the development of new tools for evaluation of unconventional aircraft is underway. Life in the real world is taken into account with the development of more accurate airport noise models and flight guidance methodology, and in addition, technology is being developed that will further reduce interior noise at current weight levels or enable the use of lighter-weight structures at current noise levels.

  9. Long-Term Exposure to Transportation Noise in Relation to Development of Obesity—a Cohort Study

    PubMed Central

    Eriksson, Charlotta; Lind, Tomas; Mitkovskaya, Natalya; Wallas, Alva; Ögren, Mikael; Östenson, Claes-Göran; Pershagen, Göran

    2017-01-01

    Background: Exposure to transportation noise is widespread and has been associated with obesity in some studies. However, the evidence from longitudinal studies is limited and little is known about effects of combined exposure to different noise sources. Objectives: The aim of this longitudinal study was to estimate the association between exposure to noise from road traffic, railways, or aircraft and the development of obesity markers. Methods: We assessed individual long-term exposure to road traffic, railway, and aircraft noise based on residential histories in a cohort of 5,184 men and women from Stockholm County. Noise levels were estimated at the most exposed façade of each dwelling. Waist circumference, weight, and height were measured at recruitment and after an average of 8.9 y of follow-up. Extensive information on potential confounders was available from repeated questionnaires and registers. Results: Waist circumference increased 0.04 cm/y (95% CI: 0.02, 0.06) and 0.16 cm/y (95% CI: 0.14, 0.17) per 10 dB Lden in relation to road traffic and aircraft noise, respectively. No corresponding association was seen for railway noise. Weight gain was only related to aircraft noise exposure. A similar pattern occurred for incidence rate ratios (IRRs) of central obesity and overweight. The IRR of central obesity increased from 1.22 (95% CI: 1.08, 1.39) in those exposed to only one source of transportation noise to 2.26 (95% CI: 1.55, 3.29) among those exposed to all three sources. Conclusion: Our results link transportation noise exposure to development of obesity and suggest that combined exposure from different sources may be particularly harmful. https://doi.org/10.1289/EHP1910 PMID:29161230

  10. On multidisciplinary research on the application of remote sensing to water resources problems. [Wisconsin

    NASA Technical Reports Server (NTRS)

    Clapp, J. L.

    1973-01-01

    Research objectives during 1972-73 were to: (1) Ascertain the extent to which special aerial photography can be operationally used in monitoring water pollution parameters. (2) Ascertain the effectiveness of remote sensing in the investigation of nearshore mixing and coastal entrapment in large water bodies. (3) Develop an explicit relationship of the extent of the mixing zone in terms of the outfall, effluent and water body characteristics. (4) Develop and demonstrate the use of the remote sensing method as an effective legal implement through which administrative agencies and courts can not only investigate possible pollution sources but also legally prove the source of water pollution. (5) Evaluate the field potential of remote sensing techniques in monitoring algal blooms and aquatic macrophytes, and the use of these as indicators of lake eutrophication level. (6) Develop a remote sensing technique for the determination of the location and extent of hydrologically active source areas in a watershed.

  11. Developing novel peat isotope proxies from vascular plant-dominated peatlands of New Zealand to reconstruct Southern Hemisphere climate dynamics

    NASA Astrophysics Data System (ADS)

    Roland, T.; Amesbury, M. J.; Charman, D.; Newnham, R.; Royles, J.; Griffiths, H.; Ratcliffe, J.; Rees, A.; Campbell, D.; Baisden, T.; Keller, E. D.

    2017-12-01

    The Southern Annular Mode (SAM) is a key control on the strength and position of the southern westerly winds (SWW), which are a major influence on Southern Hemisphere (SH) mid- to high-latitude climate. A shift towards a more positive SAM has occurred since the 1950s, driven by ozone layer thinning and enhanced by greenhouse gas driven warming. Although these recent changes are thought to be unprecedented over the last 1000 years, the longer-term behaviour of the SAM is poorly understood. We are developing stable isotope proxies from plant cellulose in vascular plant-dominated (Empodisma spp.) peatlands in New Zealand that we hypothesise are related to changes in past temperature (δ13C) and precipitation moisture source (δ18O). The moisture source signal is driven by the balance between Southern Ocean sources (depleted δ18O) and sub-tropical sources (enriched δ18O), reflecting the relative states of SAM and the El Niño-Southern Oscillation. We aim to provide palaeoclimatic context for the recent positive trend in the SAM, and explore the long-term relationship between the SAM and ENSO, testing the contention that tropical Pacific variability is a key influence on past and future SAM variability. Terrestrial palaeoclimate records in the Southern Hemisphere are often spatially isolated and temporally fragmented. However, New Zealand is ideally placed to test such hypotheses as it registers strong correlations between SAM, temperature and precipitation, and it straddles the zone of interaction between the SWW and sub-tropical moisture sources, reflected in a strong precipitation δ18O gradient. We report data from surface samples across New Zealand and explore the spatial and temporal patterns in stable isotopes in cellulose and water that we will use to interpret the palaeoenvironmental data. Preliminary downcore data will be used to demonstrate the efficacy of this approach to reconstructing moisture sources and temperature linked to moisture source variability.

  12. Computation of nonlinear ultrasound fields using a linearized contrast source method.

    PubMed

    Verweij, Martin D; Demi, Libertario; van Dongen, Koen W A

    2013-08-01

    Nonlinear ultrasound is important in medical diagnostics because imaging of the higher harmonics improves resolution and reduces scattering artifacts. Second harmonic imaging is currently standard, and higher harmonic imaging is under investigation. The efficient development of novel imaging modalities and equipment requires accurate simulations of nonlinear wave fields in large volumes of realistic (lossy, inhomogeneous) media. The Iterative Nonlinear Contrast Source (INCS) method has been developed to deal with spatiotemporal domains measuring hundreds of wavelengths and periods. This full wave method considers the nonlinear term of the Westervelt equation as a nonlinear contrast source, and solves the equivalent integral equation via the Neumann iterative solution. Recently, the method has been extended with a contrast source that accounts for spatially varying attenuation. The current paper addresses the problem that the Neumann iterative solution converges badly for strong contrast sources. The remedy is linearization of the nonlinear contrast source, combined with application of more advanced methods for solving the resulting integral equation. Numerical results show that linearization in combination with a Bi-Conjugate Gradient Stabilized method allows the INCS method to deal with fairly strong, inhomogeneous attenuation, while the error due to the linearization can be eliminated by restarting the iterative scheme.
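
    The convergence issue described above can be illustrated with a toy linearized version of the integral equation u = u0 + K u: the Neumann (fixed-point) iteration only converges when the contrast operator K is weak enough, while a Krylov solver such as Bi-CGSTAB remains usable for stronger contrast. The sketch below uses dense matrices and SciPy for brevity and is illustrative only, not the INCS implementation:

        import numpy as np
        from scipy.sparse.linalg import bicgstab

        def neumann_solve(K, u0, n_iter=50):
            """Fixed-point (Neumann) iteration u <- u0 + K u; diverges if ||K|| >= 1."""
            u = u0.copy()
            for _ in range(n_iter):
                u = u0 + K @ u
            return u

        def bicgstab_solve(K, u0):
            """Solve the same linearized equation (I - K) u = u0 with Bi-CGSTAB."""
            A = np.eye(K.shape[0]) - K
            u, info = bicgstab(A, u0)
            if info != 0:
                raise RuntimeError("Bi-CGSTAB did not converge")
            return u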

  13. Review of atmospheric ammonia data in the context of developing technologies, changing climate, and future policy evidence needs

    NASA Astrophysics Data System (ADS)

    Braban, Christine; Tang, Sim; Bealey, Bill; Roberts, Elin; Stephens, Amy; Galloway, Megan; Greenwood, Sarah; Sutton, Mark; Nemitz, Eiko; Leaver, David

    2017-04-01

    Ambient ammonia measurements have been undertaken both to understand sources and concentrations at background and vulnerable ecosystems and for long-term monitoring of concentrations. Because ammonia concentrations are projected to increase in the coming decades, and because implementing mitigation strategies poses significant policy challenges, it is useful to assess what has been measured, where, and why. In this study, a review of the literature has shown that ammonia measurements are frequently not publicly reported and in general not deposited in open data centres available for research. The specific sectors where measurements have been undertaken are: agricultural point-source assessments, agricultural surface exchange measurements, sensitive ecosystem monitoring, landscape/regional studies, and governmental long-term monitoring. Less frequently, ammonia is measured as part of an intensive atmospheric chemistry field campaign. Technology is developing, which means a shift from chemical denuder methods to spectroscopic techniques may be possible; however, chemical denuder techniques with off-line laboratory analysis will likely remain an economical approach for some time to come. This paper reviews existing datasets from the different sectors of research and integrates them into a global picture to allow both a long-term understanding and comparison with future measurements.

  14. History of surgery for atrial fibrillation.

    PubMed

    Edgerton, Zachary J; Edgerton, James R

    2009-12-01

    There is a rich history of surgery for atrial fibrillation. Initial procedures were aimed at controlling the ventricular response rate. Later procedures were directed at converting atrial fibrillation to normal sinus rhythm. These culminated in the Cox Maze III procedure. While highly effective, the complexity and morbidity of the cut and sew Maze III limited its adoption. Enabling technology has developed alternate energy sources designed to produce a transmural atrial scar without cutting and sewing. Termed the Maze IV, this lessened the morbidity of the procedure and widened the applicability. Further advances in minimal access techniques are now being developed to allow totally thorascopic placement of all the left atrial lesions on the full, beating heart, using alternate energy sources.

  15. Development of stable, narrow spectral line-width, fiber delivered laser source for spin exchange optical pumping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Bo; Tong, Xin; Jiang, Chenyang

    2015-06-05

    In this study, we developed a stable, narrow spectral line-width, fiber delivered laser source for spin exchange optical pumping. An optimized external cavity equipped with an off-the-shelf volume holographic grating narrowed the spectral line-width of a 100 W high-power diode laser and stabilized the laser spectrum. The laser spectrum showed a high side mode suppression ratio of >30 dB and good long-term stability (center wavelength drifting within ±0.002 nm during 220 h of operation). Finally, our laser is delivered by a multimode fiber with power ~70 W, center wavelength of 794.77 nm, and spectral bandwidth of ~0.12 nm.

  16. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
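
    The bookkeeping burden described above amounts to generating one input file per assembly from tabulated core and pool histories. The sketch below shows that pattern with a deliberately hypothetical, placeholder input format; it is not the real ORIGAMI syntax, the SCALE file structures, or the Fulcrum GUI interface, and all field and column names are assumptions:

        import csv
        import pathlib

        def write_assembly_inputs(assembly_csv, out_dir="origami_inputs"):
            """Create one hypothetical input file per assembly from a bookkeeping CSV."""
            out = pathlib.Path(out_dir)
            out.mkdir(exist_ok=True)
            with open(assembly_csv, newline="") as f:
                for row in csv.DictReader(f):
                    lines = [
                        "' hypothetical ORIGAMI-style input -- placeholder, not the real syntax",
                        f"assembly={row['asm_id']}",
                        f"enrichment={row['enrich']}",
                        f"burnup={row['burnup']}",
                        f"discharge_date={row['discharge']}",
                    ]
                    (out / f"{row['asm_id']}.inp").write_text("\n".join(lines) + "\n")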

  17. High-Order Residual-Distribution Hyperbolic Advection-Diffusion Schemes: 3rd-, 4th-, and 6th-Order

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza R.; Nishikawa, Hiroaki

    2014-01-01

    In this paper, spatially high-order Residual-Distribution (RD) schemes using the first-order hyperbolic system method are proposed for general time-dependent advection-diffusion problems. The corresponding second-order time-dependent hyperbolic advection-diffusion scheme was first introduced in [NASA/TM-2014-218175, 2014], where rapid convergences over each physical time step, with typically less than five Newton iterations, were shown. In that method, the time-dependent hyperbolic advection-diffusion system (linear and nonlinear) was discretized by the second-order upwind RD scheme in a unified manner, and the system of implicit-residual-equations was solved efficiently by Newton's method over every physical time step. In this paper, two techniques for the source term discretization are proposed: 1) reformulation of the source terms with their divergence forms, and 2) correction to the trapezoidal rule for the source term discretization. Third-, fourth-, and sixth-order RD schemes are then proposed with the above techniques that, relative to the second-order RD scheme, only cost the evaluation of either the first derivative or both the first and the second derivatives of the source terms. A special fourth-order RD scheme is also proposed that is even less computationally expensive than the third-order RD schemes. The second-order Jacobian formulation was used for all the proposed high-order schemes. The numerical results are then presented for both steady and time-dependent linear and nonlinear advection-diffusion problems. It is shown that these newly developed high-order RD schemes are remarkably efficient and capable of producing the solutions and the gradients to the same order of accuracy of the proposed RD schemes with rapid convergence over each physical time step, typically less than ten Newton iterations.
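
    For reference, the baseline quadrature that such corrections modify is the trapezoidal rule applied to the source term over an element. Written in LaTeX, with the derivative-based correction shown here being the classical Euler-Maclaurin form (given only to illustrate how endpoint derivatives of the source raise the order of accuracy; the paper's exact correction is not reproduced in the abstract):

        \int_{x_j}^{x_{j+1}} s(x)\,dx \;\approx\; \frac{h}{2}\bigl[s(x_j) + s(x_{j+1})\bigr]
        \;-\; \frac{h^2}{12}\bigl[s'(x_{j+1}) - s'(x_j)\bigr], \qquad h = x_{j+1} - x_j .

    The uncorrected trapezoidal rule has a local error of order h^3, while the derivative-corrected form is locally of order h^5, which is consistent with the abstract's statement that the higher-order schemes only require the first, or the first and second, derivatives of the source terms.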

  18. [Effects of long-term fertilization on microbial biomass carbon and nitrogen and on carbon source utilization of microbes in a red soil].

    PubMed

    Sun, Feng-xia; Zhang, Wei-hua; Xu, Ming-gang; Zhang, Wen-ju; Li, Zhao-qiang; Zhang, Jing-ye

    2010-11-01

    In order to explore the effects of long-term fertilization on the microbiological characters of red soil, soil samples were collected from a 19-year long-term experimental field in Qiyang of Hunan, with their microbial biomass carbon (MBC) and nitrogen (MBN) and microbial utilization ratio of carbon sources analyzed. The results showed that after 19-year fertilization, the soil MBC and MBN under the application of organic manure and of organic manure plus inorganic fertilizers were 231 and 81 mg x kg(-1) soil, and 148 and 73 mg x kg(-1) soil, respectively, being significantly higher than those under non-fertilization, inorganic fertilization, and inorganic fertilization plus straw incorporation. The ratio of soil MBN to total N under the application of organic manure and of organic manure plus inorganic fertilizers was averagely 6.0%, significantly higher than that under non-fertilization and inorganic fertilization. Biolog-ECO analysis showed that the average well color development (AWCD) value was in the order of applying organic manure plus inorganic fertilizers = applying organic manure > non-fertilization > inorganic fertilization = inorganic fertilization plus straw incorporation. Under the application of organic manure or of organic manure plus inorganic fertilizers, the microbial utilization rate of carbon sources, including carbohydrates, carboxylic acids, amino acids, polymers, phenols, and amines increased; while under inorganic fertilization plus straw incorporation, the utilization rate of polymers was the highest, and that of carbohydrates was the lowest. Our results suggested that long-term application of organic manure could increase the red soil MBC, MBN, and microbial utilization rate of carbon sources, improve soil fertility, and maintain a better crop productivity.

  19. The influence of cross-order terms in interface mobilities for structure-borne sound source characterization

    NASA Astrophysics Data System (ADS)

    Bonhoff, H. A.; Petersson, B. A. T.

    2010-08-01

    For the characterization of structure-borne sound sources with multi-point or continuous interfaces, substantial simplifications and physical insight can be obtained by incorporating the concept of interface mobilities. The applicability of interface mobilities, however, relies upon the admissibility of neglecting the so-called cross-order terms. Hence, the objective of the present paper is to clarify the importance and significance of cross-order terms for the characterization of vibrational sources. From previous studies, four conditions have been identified for which the cross-order terms can become more influential. Such are non-circular interface geometries, structures with distinctively differing transfer paths as well as a suppression of the zero-order motion and cases where the contact forces are either in phase or out of phase. In a theoretical study, the former four conditions are investigated regarding the frequency range and magnitude of a possible strengthening of the cross-order terms. For an experimental analysis, two source-receiver installations are selected, suitably designed to obtain strong cross-order terms. The transmitted power and the source descriptors are predicted by the approximations of the interface mobility approach and compared with the complete calculations. Neglecting the cross-order terms can result in large misinterpretations at certain frequencies. On average, however, the cross-order terms are found to be insignificant and can be neglected with good approximation. The general applicability of interface mobilities for structure-borne sound source characterization and the description of the transmission process thereby is confirmed.
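
    The structure of the approximation can be stated compactly. Expanding the velocity and force along the interface in spatial Fourier orders, the p-th order velocity component is, in LaTeX notation,

        v_p = \sum_q Y_{pq} F_q \;\approx\; Y_{pp} F_p ,

    where the Y_{pq} with p \neq q are the cross-order interface mobilities; the interface-mobility approximation discussed above consists of retaining only the equal-order terms Y_{pp}. The indices and symbols here are generic and are not copied from the paper's equations.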

  20. Photovoltaic village power application: assessment of the near-term market

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenblum, L.; Bifano, W.J.; Poley, W.A.

    1978-01-01

    A preliminary assessment of the near-term market for photovoltaic village power applications is presented. One of the objectives of the Department of Energy's (DOE) National Photovoltaic Program is to stimulate the demand for photovoltaic power systems so that appropriate markets will be developed in the near-term to support the increasing photovoltaic production capacity also being developed by DOE. The village power application represents such a potential market for photovoltaics. The price of energy for photovoltaic systems is compared to that of utility line extensions and diesel generators. The potential "domestic" demand (including the 50 states of the union plus the areas under legal control of the U.S. government) is defined in both the government and commercial sectors. The foreign demand and sources of funding for village power systems in the developing countries are also discussed briefly. It is concluded that a near-term domestic market of at least 12 MW (peak) and a foreign market of about 10 GW (peak) exist and that significant market penetration should be possible beginning in the 1981-82 period.

  1. Recognition-Primed Decisions, Ethical Intuition and Borrowing Experience

    DTIC Science & Technology

    2012-04-30

    Sigmund Freud (6 May 1856 – 23 September 1939) was an Austrian neurologist who founded the discipline of psychoanalysis. Kornblum, W. ...employees as well as to shape their behavior and actions. Setting command climate involves a process that Sigmund Freud termed socialization...According to Freud, socialization is the primary source of moral development, and people develop morally when one's internal desires are overcome by

  2. Development of Accommodation Models for Soldiers in Vehicles: Squad

    DTIC Science & Technology

    2014-09-01

    Data from a previous study of Soldier posture and position were analyzed to develop statistical...range of seat height and seat back angle. All of the models include the effects of body armor and body-borne gear. Subject terms: Anthropometry

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeze, R.A.

    Many emerging remediation technologies are designed to remove contaminant mass from source zones at DNAPL sites in response to regulatory requirements. There is often concern in the regulated community as to whether mass removal actually reduces risk, or whether the small risk reductions achieved warrant the large costs incurred. This paper sets out a framework for quantifying the degree to which risk is reduced as mass is removed from shallow, saturated, low-permeability, dual-porosity, DNAPL source zones. Risk is defined in terms of meeting an alternate concentration level (ACL) at a compliance well in an aquifer underlying the source zone. The ACL is back-calculated from a carcinogenic health-risk characterization at a downstream water-supply well. Source-zone mass-removal efficiencies are heavily dependent on the distribution of mass between media (fractures, matrix) and phases (dissolved, sorbed, free product). Due to the uncertainties in currently available technology performance data, the scope of the paper is limited to developing a framework for generic technologies rather than making risk-reduction calculations for specific technologies. Despite the qualitative nature of the exercise, results imply that very high mass-removal efficiencies are required to achieve significant long-term risk reduction with technology applications of finite duration. 17 refs., 7 figs., 6 tabs.

  4. ISS Ambient Air Quality: Updated Inventory of Known Aerosol Sources

    NASA Technical Reports Server (NTRS)

    Meyer, Marit

    2014-01-01

    Spacecraft cabin air quality is of fundamental importance to crew health, with concerns encompassing both gaseous contaminants and particulate matter. Little opportunity exists for direct measurement of aerosol concentrations on the International Space Station (ISS); however, an aerosol source model was developed for the purpose of filtration and ventilation systems design. This model has been applied successfully; however, since the initial effort, an increase in the number of crewmembers from 3 to 6 and new processes on board the ISS necessitate an updated aerosol inventory to accurately reflect the current ambient aerosol conditions. Results from recent analyses of dust samples from ISS, combined with a literature review, provide new predicted aerosol emission rates in terms of size-segregated mass and number concentration. Some new aerosol sources have been considered and added to the existing array of materials. The goal of this work is to provide updated filtration model inputs which can verify that the current ISS filtration system is adequate and filter lifetime targets are met. This inventory of aerosol sources is applicable to other spacecraft, and becomes more important as NASA considers future long-term exploration missions, which will preclude the opportunity for resupply of filtration products.

  5. Systematically biological prioritizing remediation sites based on datasets of biological investigations and heavy metals in soil

    NASA Astrophysics Data System (ADS)

    Lin, Wei-Chih; Lin, Yu-Pin; Anthony, Johnathen

    2015-04-01

    Heavy metal pollution has adverse effects not only on the focal invertebrate species of this study, such as reduced pupa weight and increased larval mortality, but also on the higher trophic level organisms which feed on them, either directly or indirectly, through the process of biomagnification. Despite this, few studies regarding remediation prioritization take species distributions or biological conservation priorities into consideration. This study develops a novel approach for delineating sites which are both contaminated by any of 5 readily bioaccumulated heavy metal soil contaminants and of high ecological importance for the highly mobile, low trophic level focal species. The conservation priority of each site was based on the projected distributions of 6 moth species simulated via the presence-only maximum entropy species distribution model, followed by the subsequent application of a systematic conservation tool. In order to increase the number of available samples, we also integrated crowd-sourced data with professionally-collected data via a novel optimization procedure based on a simulated annealing algorithm. This integration procedure is important because, while crowd-sourced data can drastically increase the number of data samples available to ecologists, the quality or reliability of crowd-sourced data can be called into question, adding yet another source of uncertainty in projecting species distributions. The optimization method screens crowd-sourced data in terms of the environmental variables which correspond to the professionally-collected data. The sample distribution data were derived from two different sources: the EnjoyMoths project in Taiwan (crowd-sourced data) and the Global Biodiversity Information Facility (GBIF) field data (professional data). The distributions of heavy metal concentrations were generated via 1000 iterations of a geostatistical co-simulation approach. The uncertainties in the distributions of the heavy metals were then quantified based on the overall consistency between realizations. Finally, Information-Gap Decision Theory (IGDT) was applied to rank the remediation priorities of contaminated sites in terms of both the spatial consensus of multiple heavy metal realizations and the priority of specific conservation areas. Our results show that the crowd-sourced optimization algorithm developed in this study is effective at selecting suitable data from crowd-sourced data. By using this technique the available sample data increased to totals of 96, 162, 72, 62, 69 and 62 records, i.e., 2.6, 1.6, 2.5, 1.6, 1.2 and 1.8 times the numbers originally available through the GBIF professionally-assembled database. Additionally, for all species considered, the performance of models based on the combination of both data sources, in terms of test-AUC values, exceeded that of models based on a single data source. Furthermore, the additional optimization-selected data lowered the overall variability, and therefore uncertainty, of model outputs. Based on the projected species distributions, our results revealed that around 30% of high species hotspot areas were also identified as contaminated. The decision-making tool, IGDT, successfully yielded remediation plans in terms of specific ecological value requirements, false positive tolerance rates of contaminated areas, and expected decision robustness. The proposed approach can be applied both to identify high conservation priority sites contaminated by heavy metals, based on the combination of screened crowd-sourced and professionally-collected data, and to make robust remediation decisions.
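    As a rough illustration of the screening idea described above, the following sketch (our own simplification, not the authors' procedure) uses simulated annealing to select a subset of crowd-sourced records whose environmental covariates best match the professionally collected data; the objective function, cooling schedule, and all numbers are assumptions.

      # Simplified sketch of crowd-sourced data screening by simulated annealing:
      # keep the subset of crowd records whose environmental covariates best match
      # the professionally collected records (crude objective, assumed data).
      import numpy as np

      def mismatch(keep, crowd_env, pro_env):
          if keep.sum() == 0:
              return np.inf
          return np.linalg.norm(crowd_env[keep].mean(axis=0) - pro_env.mean(axis=0))

      def screen(crowd_env, pro_env, n_iter=5000, t0=1.0, seed=0):
          rng = np.random.default_rng(seed)
          keep = rng.random(len(crowd_env)) < 0.5          # random initial subset
          cost = mismatch(keep, crowd_env, pro_env)
          for k in range(n_iter):
              temp = t0 * (1.0 - k / n_iter) + 1e-9        # linear cooling schedule
              cand = keep.copy()
              cand[rng.integers(len(cand))] ^= True        # flip one record in or out
              c = mismatch(cand, crowd_env, pro_env)
              if c < cost or rng.random() < np.exp((cost - c) / temp):
                  keep, cost = cand, c                     # accept better or uphill move
          return keep

      # Toy usage with two environmental covariates; half the crowd data is off-target.
      rng = np.random.default_rng(1)
      pro = rng.normal([0.0, 1.0], 0.3, size=(40, 2))
      crowd = np.vstack([rng.normal([0.0, 1.0], 0.3, (60, 2)),
                         rng.normal([2.0, -1.0], 0.3, (60, 2))])
      print("kept", screen(crowd, pro).sum(), "of", len(crowd), "crowd records")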

  6. Using Satellite Observations to Evaluate the AeroCOM Volcanic Emissions Inventory and the Dispersal of Volcanic SO2 Clouds in MERRA

    NASA Technical Reports Server (NTRS)

    Hughes, Eric J.; Krotkov, Nickolay; da Silva, Arlindo; Colarco, Peter

    2015-01-01

    Simulation of volcanic emissions in climate models requires information that describes the eruption of the emissions into the atmosphere. While the total amount of gases and aerosols released from a volcanic eruption can be readily estimated from satellite observations, information about the source parameters, like injection altitude, eruption time, and duration, is often not directly known. The AeroCOM volcanic emissions inventory provides estimates of eruption source parameters and has been used to initialize volcanic emissions in reanalysis projects, like MERRA. The AeroCOM volcanic emission inventory provides an eruption's daily SO2 flux and plume top altitude, yet an eruption can be very short lived, lasting only a few hours, and emit clouds at multiple altitudes. Case studies comparing the satellite-observed dispersal of volcanic SO2 clouds to simulations in MERRA have shown mixed results. Some cases, e.g. Okmok (2008), show good agreement with observations; for other eruptions, e.g. Sierra Negra (2005), the observed initial SO2 mass is half of that in the simulations; in still other cases, e.g. Soufriere Hills (2006), the initial SO2 amount agrees with the observations but the dispersal rates differ substantially. In the aviation hazards community, deriving accurate source terms is crucial for monitoring and short-term forecasting (24-h) of volcanic clouds. Back trajectory methods have been developed which use satellite observations and transport models to estimate the injection altitude, eruption time, and eruption duration of observed volcanic clouds. These methods can provide eruption timing estimates at a 2-hour temporal resolution and estimate the altitude and depth of a volcanic cloud. To better understand the differences between MERRA simulations and volcanic SO2 observations, back trajectory methods are used to estimate the source term parameters for a few volcanic eruptions and compared to their corresponding entries in the AeroCOM volcanic emission inventory. The nature of these mixed results is discussed with respect to the source term estimates.

  7. Annual Rates on Seismogenic Italian Sources with Models of Long-Term Predictability for the Time-Dependent Seismic Hazard Assessment In Italy

    NASA Astrophysics Data System (ADS)

    Murru, Maura; Falcone, Giuseppe; Console, Rodolfo

    2016-04-01

    The present study is carried out in the framework of the Center for Seismic Hazard (CPS) of INGV, under the agreement signed in 2015 with the Department of Civil Protection for developing a new seismic hazard model for the country that can update the current reference (MPS04-S1; zonesismiche.mi.ingv.it and esse1.mi.ingv.it) released between 2004 and 2006. In this initiative, we participate with the Long-Term Stress Transfer (LTST) Model to provide the annual occurrence rate of a seismic event over the entire Italian territory, for a minimum magnitude of Mw 4.5, considering bins of 0.1 magnitude units on geographical cells of 0.1° x 0.1°. Our methodology is based on the fusion of a statistical time-dependent renewal model (Brownian Passage Time, BPT; Matthews et al., 2002) with a physical model that considers the permanent stress change that a seismogenic source undergoes as a result of the earthquakes occurring on surrounding sources. For each considered catalog (historical, instrumental, and individual seismogenic sources) we determined a distinct rate value for each 0.1° x 0.1° cell for the next 50 yrs. If the cell falls within one of the sources in question, we adopted the respective rate value, which refers only to the magnitude of the characteristic event. This rate value is divided by the number of grid cells that fall on the horizontal projection of the source. If instead the cells fall outside of any seismic source, we considered the average value of the rate obtained from the historical and the instrumental catalogs, using the method of Frankel (1995). The annual occurrence rate was computed for each of the three considered distributions (Poisson, BPT, and BPT with inclusion of stress transfer).
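    As a minimal illustration of the renewal-model ingredient described above, the following sketch (our own, with assumed parameter values) evaluates the BPT conditional probability of an event in the next 50 years and converts it to an equivalent annual rate; the stress-transfer correction and the cell-wise bookkeeping are omitted.

      # BPT (inverse Gaussian) conditional probability and equivalent annual rate.
      import numpy as np
      from scipy.stats import norm

      def bpt_cdf(t, mu, alpha):
          """CDF of the BPT distribution with mean recurrence time mu and
          aperiodicity alpha (inverse Gaussian parameterization)."""
          u = np.sqrt(mu / t) / alpha
          return norm.cdf(u * (t / mu - 1.0)) + np.exp(2.0 / alpha**2) * norm.cdf(-u * (t / mu + 1.0))

      def annual_rate(mu, alpha, t_elapsed, horizon=50.0):
          """Event probability in the next `horizon` years, conditioned on
          `t_elapsed` years since the last event, as an equivalent Poisson rate."""
          p = (bpt_cdf(t_elapsed + horizon, mu, alpha) - bpt_cdf(t_elapsed, mu, alpha)) \
              / (1.0 - bpt_cdf(t_elapsed, mu, alpha))
          return -np.log(1.0 - p) / horizon

      # Assumed example: 400-yr mean recurrence, aperiodicity 0.5, 250 yr elapsed.
      print(annual_rate(mu=400.0, alpha=0.5, t_elapsed=250.0))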

  8. Computational Fluid Dynamics Simulation of Flows in an Oxidation Ditch Driven by a New Surface Aerator

    PubMed Central

    Huang, Weidong; Li, Kun; Wang, Gan; Wang, Yingzhe

    2013-01-01

    In this article, we present a newly designed inverse umbrella surface aerator and test its performance in driving the flow of an oxidation ditch. Results show that it drives the oxidation ditch better than the original design, with higher average velocity and a more uniform flow field. We also present a computational fluid dynamics model for predicting the flow field in an oxidation ditch driven by a surface aerator. An improved momentum source term approach for simulating the flow field of the oxidation ditch driven by an inverse umbrella surface aerator was developed and validated through experiments. Four turbulence models were investigated with the approach, including the standard k−ɛ model, RNG k−ɛ model, realizable k−ɛ model, and Reynolds stress model, and the predicted data were compared with those calculated with the multiple rotating reference frame (MRF) approach and the sliding mesh (SM) approach. Results of the momentum source term approach are in good agreement with the experimental data, and its prediction accuracy is better than MRF and close to SM. It is also found that the momentum source term approach has lower computational expense, is simpler to preprocess, and is easier to use. PMID:24302850

  9. OpenNFT: An open-source Python/Matlab framework for real-time fMRI neurofeedback training based on activity, connectivity and multivariate pattern analysis.

    PubMed

    Koush, Yury; Ashburner, John; Prilepin, Evgeny; Sladky, Ronald; Zeidman, Peter; Bibikov, Sergei; Scharnowski, Frank; Nikonorov, Artem; De Ville, Dimitri Van

    2017-08-01

    Neurofeedback based on real-time functional magnetic resonance imaging (rt-fMRI) is a novel and rapidly developing research field. It allows for training of voluntary control over localized brain activity and connectivity and has demonstrated promising clinical applications. Because of the rapid technical developments of MRI techniques and the availability of high-performance computing, new methodological advances in rt-fMRI neurofeedback become possible. Here we outline the core components of a novel open-source neurofeedback framework, termed Open NeuroFeedback Training (OpenNFT), which efficiently integrates these new developments. This framework is implemented using Python and Matlab source code to allow for diverse functionality, high modularity, and rapid extendibility of the software depending on the user's needs. In addition, it provides an easy interface to the functionality of Statistical Parametric Mapping (SPM), which is also open-source and one of the most widely used fMRI data analysis packages. We demonstrate the functionality of our new framework by describing case studies that include neurofeedback protocols based on brain activity levels, effective connectivity models, and pattern classification approaches. This open-source initiative provides a suitable framework to actively engage in the development of novel neurofeedback approaches, so that local methodological developments can be easily made accessible to a wider range of users.

  10. Observed ground-motion variabilities and implication for source properties

    NASA Astrophysics Data System (ADS)

    Cotton, F.; Bora, S. S.; Bindi, D.; Specht, S.; Drouet, S.; Derras, B.; Pina-Valdes, J.

    2016-12-01

    One of the key challenges of seismology is to calibrate and analyse the physical factors that control earthquake and ground-motion variabilities. Within the framework of empirical ground-motion prediction equation (GMPE) development, ground-motion residuals (differences between recorded ground motions and the values predicted by a GMPE) are computed. The exponential growth of seismological near-field records and modern regression algorithms make it possible to decompose these residuals into between-event and within-event components. The between-event term quantifies all residual effects of the source (e.g. stress drop) that are not accounted for by the magnitude term, the only source parameter of the model. Between-event residuals provide a new and rather robust way to analyse the physical factors that control earthquake source properties and associated variabilities. We will first show the correlation between classical stress drops and between-event residuals. We will also explain why between-event residuals may be a more robust way (compared to classical stress-drop analysis) to analyse earthquake source properties. We will then calibrate between-event variabilities using recent high-quality global accelerometric datasets (NGA-West 2, RESORCE) and datasets from recent earthquake sequences (L'Aquila, Iquique, Kumamoto). The obtained between-event variabilities will be used to evaluate the variability of earthquake stress drops, but also the variability of source properties that cannot be explained by classical Brune stress-drop variations. We will finally use the between-event residual analysis to discuss regional variations of source properties, differences between aftershocks and mainshocks, and potential magnitude dependencies of source characteristics.
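    A toy sketch of the residual decomposition discussed above (our own illustration, not the authors' code): event terms are approximated by per-event averages of the total residuals, a crude stand-in for the mixed-effects regressions normally used in GMPE development.

      # Split total GMPE residuals into between-event and within-event components.
      import numpy as np

      def decompose_residuals(event_ids, residuals):
          event_ids = np.asarray(event_ids)
          residuals = np.asarray(residuals, dtype=float)
          between = np.zeros_like(residuals)
          for eid in np.unique(event_ids):
              mask = event_ids == eid
              between[mask] = residuals[mask].mean()   # event term (between-event residual)
          within = residuals - between                  # record-specific (within-event) residual
          return between, within

      # Toy usage: three records from two events.
      dB, dW = decompose_residuals(["eq1", "eq1", "eq2"], [0.3, 0.1, -0.4])
      print("between-event:", dB, "within-event:", dW)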

  11. Terms used by nurses to describe patient problems: can SNOMED III represent nursing concepts in the patient record?

    PubMed Central

    Henry, S B; Holzemer, W L; Reilly, C A; Campbell, K E

    1994-01-01

    OBJECTIVE: To analyze the terms used by nurses in a variety of data sources and to test the feasibility of using SNOMED III to represent nursing terms. DESIGN: Prospective research design with manual matching of terms to the SNOMED III vocabulary. MEASUREMENTS: The terms used by nurses to describe patient problems during 485 episodes of care for 201 patients hospitalized for Pneumocystis carinii pneumonia were identified. Problems from four data sources (nurse interview, intershift report, nursing care plan, and nurse progress note/flowsheet) were classified based on the substantive area of the problem and on the terminology used to describe the problem. A test subset of the 25 most frequently used terms from the two written data sources (nursing care plan and nurse progress note/flowsheet) were manually matched to SNOMED III terms to test the feasibility of using that existing vocabulary to represent nursing terms. RESULTS: Nurses most frequently described patient problems as signs/symptoms in the verbal nurse interview and intershift report. In the written data sources, problems were recorded as North American Nursing Diagnosis Association (NANDA) terms and signs/symptoms with similar frequencies. Of the nursing terms in the test subset, 69% were represented using one or more SNOMED III terms. PMID:7719788

  12. A Semi-implicit Treatment of Porous Media in Steady-State CFD.

    PubMed

    Domaingo, Andreas; Langmayr, Daniel; Somogyi, Bence; Almbauer, Raimund

    There are many situations in computational fluid dynamics which require the definition of source terms in the Navier-Stokes equations. These source terms not only make it possible to model the physics of interest but also have a strong impact on the reliability, stability, and convergence of the numerics involved. Therefore, sophisticated numerical approaches exist for the description of such source terms. In this paper, we focus on the source terms present in the Navier-Stokes or Euler equations due to porous media, in particular the Darcy-Forchheimer equation. We introduce a method for the numerical treatment of the source term which is independent of the spatial discretization and based on linearization. In this description, the source term is treated in a fully implicit way whereas the other flow variables can be computed in an implicit or explicit manner. This leads to a more robust description in comparison with a fully explicit approach. The method is well suited to be combined with coarse-grid-CFD on Cartesian grids, which makes it especially favorable for accelerated solution of coupled 1D-3D problems. To demonstrate the applicability and robustness of the proposed method, a proof-of-concept example in 1D, as well as more complex examples in 2D and 3D, is presented.
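    A minimal 1D sketch of the linearized, fully implicit source-term treatment described above, assuming a Darcy-Forchheimer sink of the form S(u) = -(a*u + b*u*|u|); the coefficients and the explicit flux residual are placeholders, not values from the paper.

      # Point-wise implicit update of a linearized Darcy-Forchheimer momentum sink.
      import numpy as np

      def update_velocity(u, flux_rhs, dt, a, b):
          """One step: explicit flux residual `flux_rhs`, implicit linearized source."""
          s  = -(a * u + b * u * np.abs(u))      # source at the old time level
          ds = -(a + 2.0 * b * np.abs(u))        # dS/du, always <= 0
          # Solve (1 - dt*ds) * u_new = u + dt*(flux_rhs + s - ds*u) cell by cell.
          return (u + dt * (flux_rhs + s - ds * u)) / (1.0 - dt * ds)

      # Toy usage: large porous-media coefficients remain stable at this time step.
      u = np.array([1.0, 0.5])
      print(update_velocity(u, flux_rhs=np.zeros_like(u), dt=0.1, a=50.0, b=10.0))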

  13. A Monte Carlo simulation study for the gamma-ray/neutron dual-particle imager using rotational modulation collimator (RMC).

    PubMed

    Kim, Hyun Suk; Choi, Hong Yeop; Lee, Gyemin; Ye, Sung-Joon; Smith, Martin B; Kim, Geehyun

    2018-03-01

    The aim of this work is to develop a gamma-ray/neutron dual-particle imager, based on rotational modulation collimators (RMCs) and pulse shape discrimination (PSD)-capable scintillators, for possible applications for radioactivity monitoring as well as nuclear security and safeguards. A Monte Carlo simulation study was performed to design an RMC system for the dual-particle imaging, and modulation patterns were obtained for gamma-ray and neutron sources in various configurations. We applied an image reconstruction algorithm utilizing the maximum-likelihood expectation-maximization method based on the analytical modeling of source-detector configurations, to the Monte Carlo simulation results. Both gamma-ray and neutron source distributions were reconstructed and evaluated in terms of signal-to-noise ratio, showing the viability of developing an RMC-based gamma-ray/neutron dual-particle imager using PSD-capable scintillators.
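    As an illustration of the reconstruction step mentioned above, the following sketch (not the authors' code) applies a generic maximum-likelihood expectation-maximization (MLEM) update with an assumed system matrix mapping source-pixel intensities to modulated detector counts.

      # Generic MLEM reconstruction for Poisson counts y ~ Poisson(A @ x).
      import numpy as np

      def mlem(A, y, n_iter=50, eps=1e-12):
          x = np.ones(A.shape[1])                 # flat initial image
          sens = A.sum(axis=0) + eps              # sensitivity image (A^T 1)
          for _ in range(n_iter):
              proj = A @ x + eps                  # forward projection
              x *= (A.T @ (y / proj)) / sens      # multiplicative EM update
          return x

      # Toy usage: two source pixels observed through six modulation bins.
      rng = np.random.default_rng(0)
      A = rng.uniform(0.1, 1.0, size=(6, 2))
      y = rng.poisson(A @ np.array([5.0, 1.0]))
      print(mlem(A, y))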

  14. A high accuracy sequential solver for simulation and active control of a longitudinal combustion instability

    NASA Technical Reports Server (NTRS)

    Shyy, W.; Thakur, S.; Udaykumar, H. S.

    1993-01-01

    A high accuracy convection scheme using a sequential solution technique has been developed and applied to simulate the longitudinal combustion instability and its active control. The scheme has been devised in the spirit of the Total Variation Diminishing (TVD) concept with special source term treatment. Because of the substantial heat release effect, a clear delineation of the key elements employed by the scheme, i.e., the adjustable damping factor and the source term treatment, has been made. Compared with the first-order upwind scheme previously utilized, the present results exhibit less damping and are free from spurious oscillations, offering improved quantitative accuracy while confirming the spectral analysis reported earlier. A simple feedback type of active control has been found to be capable of enhancing or attenuating the magnitude of the combustion instability.

  15. Formulation of US international energy policies

    NASA Astrophysics Data System (ADS)

    1980-09-01

    To find out how the United States develops international energy policy, GAO reviewed five major energy issues covering the period from early 1977 through 1979. The issues are: vulnerabilities to petroleum supply interruptions; long-term national security strategy on imported oil prices; export of U.S. oil and gas production equipment and technology to the Soviet Union; World Bank initiatives to assist in financing oil and gas exploration and development in oil-importing developing countries; and the role of gas imports relative to the nation's future sources of gas.

  16. Open source software integrated into data services of Japanese planetary explorations

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.

    2015-12-01

    Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data with a simple method such as HTTP directory listing for long-term preservation, while DARTS tries to provide rich web applications for ease of access with modern web technologies based on open source software. This presentation showcases the availability of open source software through our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS). As a WMS server, the open source software MapServer is adopted. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data. The main purpose of this application is public outreach. The NASA World Wind Java SDK is used for development. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations. It uses Highcharts to draw graphs on web browsers. Flow is a tool to simulate the field of view of an instrument onboard a spacecraft. This tool itself is open source software developed by JAXA/ISAS and is licensed under the BSD 3-Clause License. The SPICE Toolkit is essential to compile Flow. The SPICE Toolkit is also open source software, developed by NASA/JPL, and its website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool to integrate DARTS services.

  17. Estimation of the caesium-137 source term from the Fukushima Daiichi nuclear power plant using a consistent joint assimilation of air concentration and deposition observations

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne

    2014-01-01

    Inverse modelling techniques can be used to estimate the amount of radionuclides and the temporal profile of the source term released in the atmosphere during the accident of the Fukushima Daiichi nuclear power plant in March 2011. In Winiarek et al. (2012b), the lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques, using activity concentration measurements. The importance of an objective assessment of prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such a critical context, where the meteorological conditions can make the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, the retrieved source term being very sensitive to this estimation. We propose to extend the use of these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to compute an estimate of the caesium-137 source term jointly using all available data about this radionuclide, such as activity concentrations in the air, but also daily fallout measurements and total cumulated fallout measurements. It is crucial to properly and simultaneously estimate the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition to reliably estimate the a posteriori uncertainty of the estimated source term. Using such techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6-19.3 PBq with an estimated standard deviation range of 15-20% depending on the method and the data sets. The “blind” time intervals of the source term have also been strongly mitigated compared to the first estimations based only on activity concentration data.
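    A schematic sketch of the kind of linear-Gaussian inversion that underlies such source-term estimates (our own illustration, not the authors' implementation): given a source-receptor matrix H, observations y with error covariance R, and a background xb with covariance B, the posterior mean and covariance follow directly, and the posterior covariance supplies the a posteriori uncertainty.

      # Linear-Gaussian source-term inversion: MAP estimate and posterior covariance.
      import numpy as np

      def invert_source(H, y, xb, B, R):
          Binv = np.linalg.inv(B)
          Rinv = np.linalg.inv(R)
          P = np.linalg.inv(Binv + H.T @ Rinv @ H)     # posterior covariance
          x = P @ (Binv @ xb + H.T @ Rinv @ y)         # posterior (MAP) mean
          return x, P

      # Toy usage: three release periods observed through four measurements (assumed numbers).
      rng = np.random.default_rng(1)
      H = rng.uniform(0, 1, size=(4, 3))
      y = H @ np.array([2.0, 0.5, 1.0]) + 0.05 * rng.standard_normal(4)
      x_hat, P = invert_source(H, y, xb=np.zeros(3), B=4.0 * np.eye(3), R=0.05**2 * np.eye(4))
      print(x_hat, np.sqrt(np.diag(P)))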

  18. Effect of animal-source food supplement prior to and during pregnancy on birthweight and prematurity in rural Vietnam: a brief study description.

    PubMed

    Tu, Ngu; King, Janet C; Dirren, Henri; Thu, Hoang Nga; Ngoc, Quyen Phi; Diep, Anh Nguyen Thi

    2014-12-01

    Maternal nutritional status is an important predictor of infant birthweight. Most previous attempts to improve birthweight through multiple micronutrient supplementation have been initiated after women are pregnant. Interventions to improve maternal nutritional status prior to conception may be more effective in preventing low birthweight and improving other infant health outcomes. To compare the effects of maternal supplementation with animal-source food from preconception to term or from mid-gestation to term with routine prenatal care on birthweight, the prevalence of preterm births, intrauterine growth restriction, and infant growth during the first 12 months of life and on maternal nutrient status and the incidence of maternal and infant infections. Young women from 29 rural communes in northwestern Vietnam were recruited when they registered to marry and were randomized to one of three interventions: animal-source food supplement 5 days per week from marriage to term (approximately 13 months), animal-source food supplement 5 days per week from 16 weeks of gestation to term (approximately 5 months), or routine prenatal care without supplemental feeding. Data on infant birthweight and gestational age, maternal and infant anthropometry, micronutrient status, and infections in the infant and mother were collected at various time points. In a preliminary study of women of reproductive age in this area of Vietnam, 40% of the women were underweight (body mass index < 18.5) and anemic. About 50% had infections. Rice was the dietary staple, and nutrient-rich, animal-source foods were rarely consumed by women. Iron, zinc, vitamin A, folate, and vitamin B12 intakes were inadequate in about 40% of the women. The study is still ongoing, and further data are not yet available. The results of this study will provide important data regarding whether improved intake of micronutrient-rich animal-source foods that are locally available and affordable before and during pregnancy improves maternal and infant health and development. This food-based approach may have global implications regarding how and when to initiate sustainable nutritional interventions to improve maternal and infant health.

  19. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Trial Calculation. Work Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-02-01

    The overall objective of the SFR Regulatory Technology Development Plan (RTDP) effort is to identify and address potential impediments to the SFR regulatory licensing process. In FY14, an analysis by Argonne identified the development of an SFR-specific MST methodology as an existing licensing gap with high regulatory importance and a potentially long lead-time to closure. This work was followed by an initial examination of the current state-of-knowledge regarding SFR source term development (ANLART-3), which reported several potential gaps. Among these were the potential inadequacies of current computational tools to properly model and assess the transport and retention of radionuclides during a metal fuel pool-type SFR core damage incident. The objective of the current work is to determine the adequacy of existing computational tools, and the associated knowledge database, for the calculation of an SFR MST. To accomplish this task, a trial MST calculation will be performed using available computational tools to establish their limitations with regard to relevant radionuclide release/retention/transport phenomena. The application of existing modeling tools will provide a definitive test to assess their suitability for an SFR MST calculation, while also identifying potential gaps in the current knowledge base and providing insight into open issues regarding regulatory criteria/requirements. The findings of this analysis will assist in determining future research and development needs.

  20. An efficient and stable hydrodynamic model with novel source term discretization schemes for overland flow and flood simulations

    NASA Astrophysics Data System (ADS)

    Xia, Xilin; Liang, Qiuhua; Ming, Xiaodong; Hou, Jingming

    2017-05-01

    Numerical models solving the full 2-D shallow water equations (SWEs) have been increasingly used to simulate overland flows and better understand the transient flow dynamics of flash floods in a catchment. However, there still exist key challenges that have not yet been resolved for the development of fully dynamic overland flow models, related to (1) the difficulty of maintaining numerical stability and accuracy in the limit of disappearing water depth and (2) inaccurate estimation of velocities and discharges on slopes as a result of strong nonlinearity of friction terms. This paper aims to tackle these key research challenges and present a new numerical scheme for accurately and efficiently modeling large-scale transient overland flows over complex terrains. The proposed scheme features a novel surface reconstruction method (SRM) to correctly compute slope source terms and maintain numerical stability at small water depth, and a new implicit discretization method to handle the highly nonlinear friction terms. The resulting shallow water overland flow model is first validated against analytical and experimental test cases and then applied to simulate a hypothetical rainfall event in the 42 km2 Haltwhistle Burn catchment, UK.
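    A simplified sketch of a point-wise implicit friction update in the spirit of the scheme described above (our own stand-in, not the paper's discretization), using the Manning law S_f = -g n^2 q|q| / h^(7/3) with the friction factor evaluated semi-implicitly so the update stays bounded as the depth vanishes.

      # Semi-implicit Manning friction update for the unit discharge q.
      import numpy as np

      G = 9.81  # gravitational acceleration (m/s^2)

      def apply_friction(q_star, h, dt, n_manning, h_min=1e-6):
          """q_star: unit discharge after the explicit flux and slope update."""
          h = np.maximum(h, h_min)                 # guard against vanishing depth
          denom = 1.0 + dt * G * n_manning**2 * np.abs(q_star) / h**(7.0 / 3.0)
          return q_star / denom                    # implicit friction relaxation

      # Toy usage: thin sheet flow on a rough surface (assumed values).
      print(apply_friction(q_star=np.array([0.02, 0.2]), h=np.array([0.005, 0.05]),
                           dt=1.0, n_manning=0.03))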

  1. The aromatic amino acids biosynthetic pathway: A core platform for products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lievense, J.C.; Frost, J.W.

    The aromatic amino acids biosynthetic pathway is viewed conventionally and primarily as the source of the amino acids L-tyrosine, L-phenylalanine, and L-tryptophan. The authors have recognized the expanded role of the pathway as the major source of aromatic raw materials on earth. With the development of metabolic engineering approaches, it is now possible to biosynthesize a wide variety of aromatic compounds from inexpensive, clean, abundant, renewable sugars using fermentation methods. Examples of already and soon-to-be commercialized biosynthesis of such compounds are described. The long-term prospects are also assessed.

  2. A boundary element approach to optimization of active noise control sources on three-dimensional structures

    NASA Technical Reports Server (NTRS)

    Cunefare, K. A.; Koopmann, G. H.

    1991-01-01

    This paper presents the theoretical development of an approach to active noise control (ANC) applicable to three-dimensional radiators. The active noise control technique, termed ANC Optimization Analysis, is based on minimizing the total radiated power by adding secondary acoustic sources on the primary noise source. ANC Optimization Analysis determines the optimum magnitude and phase at which to drive the secondary control sources in order to achieve the best possible reduction in the total radiated power from the noise source/control source combination. For example, ANC Optimization Analysis predicts a 20 dB reduction in the total power radiated from a sphere of radius a at a dimensionless wavenumber ka of 0.125, for a single control source representing 2.5 percent of the total area of the sphere. ANC Optimization Analysis is based on a boundary element formulation of the Helmholtz Integral Equation, and thus, the optimization analysis applies to a single frequency, while multiple frequencies can be treated through repeated analyses.

  3. OntoFox: web-based support for ontology reuse

    PubMed Central

    2010-01-01

    Background Ontology development is a rapidly growing area of research, especially in the life sciences domain. To promote collaboration and interoperability between different projects, the OBO Foundry principles require that these ontologies be open and non-redundant, avoiding duplication of terms through the re-use of existing resources. As current options to do so present various difficulties, a new approach, MIREOT, allows specifying import of single terms. Initial implementations allow for controlled import of selected annotations and certain classes of related terms. Findings OntoFox http://ontofox.hegroup.org/ is a web-based system that allows users to input terms, fetch selected properties, annotations, and certain classes of related terms from the source ontologies and save the results using the RDF/XML serialization of the Web Ontology Language (OWL). Compared to an initial implementation of MIREOT, OntoFox allows additional and more easily configurable options for selecting and rewriting annotation properties, and for inclusion of all or a computed subset of terms between low and top level terms. Additional methods for including related classes include a SPARQL-based ontology term retrieval algorithm that extracts terms related to a given set of signature terms and an option to extract the hierarchy rooted at a specified ontology term. OntoFox's output can be directly imported into a developer's ontology. OntoFox currently supports term retrieval from a selection of 15 ontologies accessible via SPARQL endpoints and allows users to extend this by specifying additional endpoints. An OntoFox application in the development of the Vaccine Ontology (VO) is demonstrated. Conclusions OntoFox provides a timely publicly available service, providing different options for users to collect terms from external ontologies, making them available for reuse by import into client OWL ontologies. PMID:20569493

  4. Multisource geological data mining and its utilization of uranium resources exploration

    NASA Astrophysics Data System (ADS)

    Zhang, Jie-lin

    2009-10-01

    Nuclear energy, as one of the clean energy sources, plays an important role in China's economic development, and according to the national long-term development strategy many more nuclear power plants will be built in the next few years, posing a great challenge for uranium resources exploration. Research and practice on mineral exploration demonstrate that utilizing modern Earth Observing System (EOS) technology and developing new multi-source geological data mining methods are effective approaches to uranium deposit prospecting. Based on data mining and knowledge discovery technology, this paper uses multi-source geological data to characterize the electromagnetic spectral, geophysical and spatial information of uranium mineralization factors, and provides technical support for uranium prospecting integrated with field remote sensing geological surveys. Multi-source geological data used in this paper include satellite hyperspectral imagery (Hyperion), high spatial resolution remote sensing data, uranium geological information, airborne radiometric data, and aeromagnetic and gravity data, and related data mining methods have been developed, such as data fusion of optical data and Radarsat imagery, information integration of remote sensing and geophysical data, and so on. Based on the above approaches, the multi-geoscience information of uranium mineralization factors, including complex polystage rock masses, mineralization-controlling faults and hydrothermal alterations, has been identified, the metallogenic potential of uranium has been evaluated, and some predicting areas have been located.

  5. Long Term 2 Second Round Source Water Monitoring and Bin Placement Memo

    EPA Pesticide Factsheets

    The Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) applies to all public water systems served by a surface water source or public water systems served by a ground water source under the direct influence of surface water.

  6. Propagation of sound waves through a linear shear layer: A closed form solution

    NASA Technical Reports Server (NTRS)

    Scott, J. N.

    1978-01-01

    Closed form solutions are presented for sound propagation from a line source in or near a shear layer. The analysis was exact for all frequencies and was developed assuming a linear velocity profile in the shear layer. This assumption allowed the solution to be expressed in terms of parabolic cylinder functions. The solution is presented for a line monopole source first embedded in the uniform flow and then in the shear layer. Solutions are also discussed for certain types of dipole and quadrupole sources. Asymptotic expansions of the exact solutions for small and large values of Strouhal number gave expressions which correspond to solutions previously obtained for these limiting cases.

  7. 3D reconstruction software comparison for short sequences

    NASA Astrophysics Data System (ADS)

    Strupczewski, Adam; Czupryński, Błażej

    2014-11-01

    Large scale multiview reconstruction is recently a very popular area of research. There are many open source tools that can be downloaded and run on a personal computer. However, there are few, if any, comparisons between all the available software in terms of accuracy on small datasets that a single user can create. The typical datasets for testing of the software are archeological sites or cities, comprising thousands of images. This paper presents a comparison of currently available open source multiview reconstruction software for small datasets. It also compares the open source solutions with a simple structure from motion pipeline developed by the authors from scratch with the use of OpenCV and Eigen libraries.

  8. A Parameter Identification Method for Helicopter Noise Source Identification and Physics-Based Semi-Empirical Modeling

    NASA Technical Reports Server (NTRS)

    Greenwood, Eric, II; Schmitz, Fredric H.

    2010-01-01

    A new physics-based parameter identification method for rotor harmonic noise sources is developed using an acoustic inverse simulation technique. This new method allows for the identification of individual rotor harmonic noise sources and allows them to be characterized in terms of their individual non-dimensional governing parameters. This new method is applied to both wind tunnel measurements and ground noise measurements of two-bladed rotors. The method is shown to match the parametric trends of main rotor Blade-Vortex Interaction (BVI) noise, allowing accurate estimates of BVI noise to be made for operating conditions based on a small number of measurements taken at different operating conditions.

  9. Uncertainty, variability, and earthquake physics in ground‐motion prediction equations

    USGS Publications Warehouse

    Baltay, Annemarie S.; Hanks, Thomas C.; Abrahamson, Norm A.

    2017-01-01

    Residuals between ground‐motion data and ground‐motion prediction equations (GMPEs) can be decomposed into terms representing earthquake source, path, and site effects. These terms can be cast in terms of repeatable (epistemic) residuals and the random (aleatory) components. Identifying the repeatable residuals leads to a GMPE with reduced uncertainty for a specific source, site, or path location, which in turn can yield a lower hazard level at small probabilities of exceedance. We illustrate a schematic framework for this residual partitioning with a dataset from the ANZA network, which straddles the central San Jacinto fault in southern California. The dataset consists of more than 3200 1.15≤M≤3 earthquakes and their peak ground accelerations (PGAs), recorded at close distances (R≤20  km). We construct a small‐magnitude GMPE for these PGA data, incorporating VS30 site conditions and geometrical spreading. Identification and removal of the repeatable source, path, and site terms yield an overall reduction in the standard deviation from 0.97 (in ln units) to 0.44, for a nonergodic assumption, that is, for a single‐source location, single site, and single path. We give examples of relationships between independent seismological observables and the repeatable terms. We find a correlation between location‐based source terms and stress drops in the San Jacinto fault zone region; an explanation of the site term as a function of kappa, the near‐site attenuation parameter; and a suggestion that the path component can be related directly to elastic structure. These correlations allow the repeatable source location, site, and path terms to be determined a priori using independent geophysical relationships. Those terms could be incorporated into location‐specific GMPEs for more accurate and precise ground‐motion prediction.
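    A schematic illustration (our own, with synthetic numbers) of the standard-deviation reduction described above: repeatable event and site terms are stripped from synthetic ln(PGA) residuals by removing group means, and the remaining scatter is compared with the total.

      # Reduce residual standard deviation by removing repeatable group terms.
      import numpy as np

      def remove_group_means(res, groups):
          res = np.asarray(res, dtype=float).copy()
          for g in groups:                          # e.g. event ids, station ids, path bins
              g = np.asarray(g)
              for key in np.unique(g):
                  m = g == key
                  res[m] -= res[m].mean()           # strip the repeatable part
          return res

      # Synthetic residuals = event term + site term + random within-event part.
      rng = np.random.default_rng(2)
      events, stations = rng.integers(0, 20, 500), rng.integers(0, 10, 500)
      total = rng.normal(0, 0.4, 20)[events] + rng.normal(0, 0.5, 10)[stations] \
              + rng.normal(0, 0.45, 500)
      print(total.std(), remove_group_means(total, [events, stations]).std())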

  10. The Nature of Sustainability as Viewed by European Students

    ERIC Educational Resources Information Center

    Lockley, John; Jarrath, Martin

    2013-01-01

    Sustainability as a concept, though well understood in general terms, is often politically captured by interest groups and as such expressed through issues like concern for global climate change or the need to develop more efficient energy sources, to address regional, national or international priorities. Education for sustainability as a concept…

  11. 76 FR 54235 - Supplement to the FY2010 Single-Source Cooperative Agreement With the World Health Organization...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-31

    ... Continue Development of Sustainable Influenza Vaccine Production Capacity in Under Resourced Nations... prove critical to the long-term viability of this program while bolstering the influenza vaccine... light of the threat of an influenza pandemic it was originally designed with the goals of bolstering...

  12. No Longer Conveyor but Creator: Developing an Epistemology of the World Wide Web.

    ERIC Educational Resources Information Center

    Trombley, Laura E. Skandera; Flanagan, William G.

    2001-01-01

    Discusses the impact of the World Wide Web in terms of epistemology. Topics include technological innovations, including new dimensions of virtuality; the accessibility of information; tracking Web use via cookies; how the Web transforms the process of learning and knowing; linking information sources; and the Web as an information delivery…

  13. Extension Approach for an Effective Fisheries and Aquaculture Extension Service in India

    ERIC Educational Resources Information Center

    Kumaran, M.; Vimala, D. Deboral; Chandrasekaran, V. S.; Alagappan, M.; Raja, S.

    2012-01-01

    Purpose: Public-funded fisheries extension services have been blamed as poor and responsible for the slow pace of aquaculture development in India. The present investigation aimed to find concrete interventions to streamline the extension service by understanding the research-extension-farmer linkage indirectly in terms of information sources of…

  14. Classes of Legitimate Evidence for Identifying Effective Teaching.

    ERIC Educational Resources Information Center

    Wagner, Paul A.

    A criterion for selecting sources of evidence to evaluate effective teaching is described. It is suggested that teaching effectiveness is not measured solely in terms of cognitive change in students but in the extent to which academics practice teaching in accordance with the moral dictates of the profession. In developing a teacher effectiveness…

  15. Role of Universities in the National Innovation System. Discussion Paper

    ERIC Educational Resources Information Center

    Group of Eight (NJ1), 2011

    2011-01-01

    Over recent years governments have been placing more emphasis on innovation as a source of national competitiveness. Governments now assess their investments across many areas in terms of the contribution that such investments make to increasing innovation. This has been especially significant for education and in particular for the development of…

  16. Exploring Biology Teachers' Pedagogical Content Knowledge in the Teaching of Genetics in Swaziland Science Classrooms

    ERIC Educational Resources Information Center

    Mthethwa-Kunene, Eunice; Onwu, Gilbert Oke; de Villiers, Rian

    2015-01-01

    This study explored the pedagogical content knowledge (PCK) and its development of four experienced biology teachers in the context of teaching school genetics. PCK was defined in terms of teacher content knowledge, pedagogical knowledge and knowledge of students' preconceptions and learning difficulties. Data sources of teacher knowledge base…

  17. Data, Data Everywhere--Not a Report in Sight!

    ERIC Educational Resources Information Center

    Norman, Wendy

    2003-01-01

    Presents six steps of data warehouse development that result in valuable, long-term reporting solutions, discussing how to choose the right reporting vehicle. The six steps are: defining one's needs; mapping the source for each element; extracting the data; cleaning and verifying the data; moving the data into a relational database; and developing…

  18. Efficient 1.6 Micron Laser Source for Methane DIAL

    NASA Technical Reports Server (NTRS)

    Shuman, Timothy; Burnham, Ralph; Nehrir, Amin R.; Ismail, Syed; Hair, Johnathan W.

    2013-01-01

    Methane is a potent greenhouse gas and on a per molecule basis has a warming influence 72 times that of carbon dioxide over a 20 year horizon. Therefore, it is important to look at near-term radiative effects due to methane to develop mitigation strategies to counteract global warming trends via ground- and airborne-based measurement systems. These systems require the development of a time-resolved DIAL capability using a narrow-line laser source allowing observation of atmospheric methane on local, regional and global scales. In this work, a demonstrated and efficient nonlinear conversion scheme meeting the performance requirements of a deployable methane DIAL system is presented. By combining a single-frequency 1064 nm pump source and a seeded KTP OPO, more than 5 mJ of 1.6 µm pulse energy is generated with conversion efficiencies in excess of 20%. Even without active cavity control, instrument-limited linewidths (50 pm) were achieved with an estimated spectral purity of 95%. Tunable operation over 400 pm (limited by the tuning range of the seed laser) was also demonstrated. This source demonstrates the critical capabilities needed for a methane DIAL system, motivating additional development of the technology.

  19. Personal assistance services in the workplace: A literature review.

    PubMed

    Dowler, Denetta L; Solovieva, Tatiana I; Walls, Richard T

    2011-10-01

    Personal assistance services (PAS) can be valuable adjuncts to the complement of accommodations that support workers with disabilities. This literature review explored the professional literature on the use of PAS in the workplace. Bibliographic sources were used to locate relevant research studies on the use of PAS in the workplace. The studies in this review used both qualitative and quantitative methods to identify current definitions of work-related and personal care-related PAS, agency-directed versus consumer-directed PAS, long-term and short-term funding issues, development of PAS policy, and barriers to successful implementation of PAS. The studies uncovered issues related to (a) recruiting, training, and retaining personal assistants, (b) employer concerns, (c) costs and benefits of workplace PAS, (d) wages and incentives for personal assistants, and (e) sources for financing PAS as a workplace accommodation. The findings reveal the value and benefits of effective PAS on the job. PAS can lead to successful employment of people with disabilities when other accommodations cannot provide adequate workplace support. Additionally, the evolution of workplace PAS is dependent on development of realistic PAS policy and funding options.

  20. Recent H- diagnostics, plasma simulations, and 2X scaled Penning ion source developments at the Rutherford Appleton Laboratory

    NASA Astrophysics Data System (ADS)

    Lawrie, S. R.; Faircloth, D. C.; Smith, J. D.; Sarmento, T. M.; Whitehead, M. O.; Wood, T.; Perkins, M.; Macgregor, J.; Abel, R.

    2018-05-01

    A vessel for extraction and source plasma analyses is being used for Penning H- ion source development at the Rutherford Appleton Laboratory. A new set of optical elements including an einzel lens has been installed, which transports over 80 mA of H- beam successfully. Simultaneously, a 2X scaled Penning source has been developed to reduce cathode power density. The 2X source is now delivering a 65 mA H- ion beam at 10% duty factor, meeting its design criteria. The long-term viability of the einzel lens and 2X source is now being evaluated, so new diagnostic devices have been installed. A pair of electrostatic deflector plates is used to correct beam misalignment and perform fast chopping, with a voltage rise time of 24 ns. A suite of four quartz crystal microbalances has shown that the cesium flux in the vacuum vessel is only increased by a factor of two, despite the absence of a dedicated cold trap. Finally, an infrared camera has demonstrated good agreement with thermal simulations but has indicated unexpected heating due to beam loss on the downstream electrode. These types of diagnostics are suitable for monitoring all operational ion sources. In addition to experimental campaigns and new diagnostic tools, the high-performance VSim and COMSOL software packages are being used for plasma simulations of two novel ion thrusters for space propulsion applications. In parallel, a VSim framework has been established to include arbitrary temperature and cesium fields to allow the modeling of surface physics in H- ion sources.

  1. Ancient Glass: A Literature Search and its Role in Waste Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strachan, Denis M.; Pierce, Eric M.

    2010-07-01

    When developing a performance assessment model for the long-term disposal of immobilized low-activity waste (ILAW) glass, it is desirable to determine the durability of glass forms over very long periods of time. However, testing is limited to short time spans, so experiments are performed under conditions that accelerate the key geochemical processes that control weathering. Verification that models currently being used can reliably calculate the long-term behavior of ILAW glass is a key component of the overall PA strategy. Therefore, Pacific Northwest National Laboratory was contracted by Washington River Protection Solutions, LLC to evaluate alternative strategies that can be used for PA source term model validation. One viable alternative strategy is the use of independent experimental data from archaeological studies of ancient or natural glass contained in the literature. These results represent potential independent experiments dating back approximately 3600 years (ca. 1600 BCE) in the case of ancient glass and 10^6 years or more in the case of natural glass. The results of this literature review suggest that additional experimental data may be needed before the results from archaeological studies can be used as a tool for model validation of glass weathering and, more specifically, disposal facility performance. This is largely because none of the existing data sets contains all of the information required to conduct PA source term calculations. For example, in many cases the sediments surrounding the glass were not collected and analyzed; therefore, the data required to compare computer simulations of concentration flux are not available. This type of information is important to understanding the element release profile from the glass to the surrounding environment and provides a metric that can be used to calibrate source term models. Although useful, the available literature sources do not contain the information needed to simulate the long-term performance of nuclear waste glasses in near-surface or deep geologic repositories. The information that will be required includes 1) experimental measurements to quantify the model parameters, 2) detailed analyses of altered glass samples, and 3) detailed analyses of the sediment surrounding the ancient glass samples.

  2. Interlaboratory study of the ion source memory effect in 36Cl accelerator mass spectrometry

    NASA Astrophysics Data System (ADS)

    Pavetich, Stefan; Akhmadaliev, Shavkat; Arnold, Maurice; Aumaître, Georges; Bourlès, Didier; Buchriegler, Josef; Golser, Robin; Keddadouche, Karim; Martschini, Martin; Merchel, Silke; Rugel, Georg; Steier, Peter

    2014-06-01

    Understanding and minimization of contaminations in the ion source due to cross-contamination and long-term memory effect is one of the key issues for accurate accelerator mass spectrometry (AMS) measurements of volatile elements. The focus of this work is on the investigation of the long-term memory effect for the volatile element chlorine, and the minimization of this effect in the ion source of the Dresden accelerator mass spectrometry facility (DREAMS). For this purpose, one of the two original HVE ion sources at the DREAMS facility was modified, allowing the use of larger sample holders having individual target apertures. Additionally, a more open geometry was used to improve the vacuum level. To evaluate this improvement in comparison to other up-to-date ion sources, an interlaboratory comparison had been initiated. The long-term memory effect of the four Cs sputter ion sources at DREAMS (two sources: original and modified), ASTER (Accélérateur pour les Sciences de la Terre, Environnement, Risques) and VERA (Vienna Environmental Research Accelerator) had been investigated by measuring samples of natural 35Cl/37Cl-ratio and samples highly-enriched in 35Cl (35Cl/37Cl ∼ 999). Besides investigating and comparing the individual levels of long-term memory, recovery time constants could be calculated. The tests show that all four sources suffer from long-term memory, but the modified DREAMS ion source showed the lowest level of contamination. The recovery times of the four ion sources were widely spread between 61 and 1390 s, where the modified DREAMS ion source with values between 156 and 262 s showed the fastest recovery in 80% of the measurements.
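    As an illustration of how a recovery time constant can be extracted (our own sketch with assumed data, not the published analysis), an exponential recovery of the measured isotope ratio after a sample change can be fitted:

      # Fit an exponential recovery of the 35Cl/37Cl ratio after switching samples.
      import numpy as np
      from scipy.optimize import curve_fit

      def recovery(t, r_final, amplitude, tau):
          return r_final + amplitude * np.exp(-t / tau)

      t = np.linspace(0, 1200, 25)                          # seconds after sample change
      noise = 1 + 0.01 * np.random.default_rng(3).standard_normal(t.size)
      r = recovery(t, 3.13, 50.0, 200.0) * noise            # synthetic measurements
      popt, _ = curve_fit(recovery, t, r, p0=(3.0, 40.0, 150.0))
      print("recovery time constant ~ %.0f s" % popt[2])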

  3. Semi-implicit and fully implicit shock-capturing methods for hyperbolic conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Shinn, J. L.

    1986-01-01

    Some numerical aspects of finite-difference algorithms for nonlinear multidimensional hyperbolic conservation laws with stiff nonhomogeneous (source) terms are discussed. If the stiffness is entirely dominated by the source term, a semi-implicit shock-capturing method is proposed, provided that the Jacobian of the source terms possesses certain properties. The proposed semi-implicit method can be viewed as a variant of the Bussing and Murman point-implicit scheme with a more appropriate numerical dissipation for the computation of strong shock waves. However, if the stiffness is not solely dominated by the source terms, a fully implicit method would be a better choice. The situation is more complicated for problems in more than one dimension, and the presence of stiff source terms further complicates the solution procedures for alternating direction implicit (ADI) methods. Several alternatives are discussed. The primary motivation for constructing these schemes was to address thermally and chemically nonequilibrium flows in the hypersonic regime. Due to the unique structure of the eigenvalues and eigenvectors for fluid flows of this type, the computation can be simplified, thus providing a more efficient solution procedure than one might have anticipated.
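    A generic sketch of the point-implicit idea described above (our illustration, not the report's scheme): only the source-term Jacobian is treated implicitly, the flux residual stays explicit, and each cell requires one small linear solve per step.

      # Point-implicit step for a stiff source term S(u) with explicit flux residual R.
      import numpy as np

      def point_implicit_step(u, R_expl, S, dSdu, dt):
          """Solve (I - dt*dS/du) du = dt*(R_expl + S(u)) and return u + du."""
          J = dSdu(u)
          rhs = dt * (R_expl + S(u))
          du = np.linalg.solve(np.eye(len(u)) - dt * J, rhs)
          return u + du

      # Toy stiff relaxation toward equilibrium with time scale tau << dt: stays stable.
      tau, u_eq = 1e-3, np.array([1.0, 2.0])
      S = lambda v: (u_eq - v) / tau
      dSdu = lambda v: -np.eye(2) / tau
      print(point_implicit_step(np.array([0.0, 0.0]), np.zeros(2), S, dSdu, dt=0.1))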

  4. Watershed nitrogen and phosphorus balance: The upper Potomac River basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaworski, N.A.; Groffman, P.M.; Keller, A.A.

    1992-01-01

    Nitrogen and phosphorus mass balances were estimated for the portion of the Potomac River basin watershed located above Washington, D.C. The total nitrogen (N) balance included seven input source terms, six sinks, and one 'change-in-storage' term, but was simplified to five input terms and three output terms. The phosphorus (P) balance had four input and three output terms. The estimated balances are based on watershed data from seven information sources. Major sources of nitrogen are animal waste and atmospheric deposition. The major sources of phosphorus are animal waste and fertilizer. The major sink for nitrogen is combined denitrification, volatilization, and change-in-storage. The major sink for phosphorus is change-in-storage. River exports of N and P were 17% and 8%, respectively, of the total N and P inputs. Over 60% of the N and P were volatilized or stored. The major input and output terms on the budget are estimated from direct measurements, but the change-in-storage term is calculated by difference. The factors regulating retention and storage processes are discussed and research needs are identified.
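    A toy illustration (hypothetical numbers, not the paper's data) of how the combined change-in-storage term of such a budget is closed by difference between measured inputs and measured outputs:

      # Close a nutrient budget by difference (all values hypothetical, kt N/yr).
      inputs  = {"atmospheric deposition": 30.0, "fertilizer": 25.0,
                 "animal waste": 40.0, "point sources": 5.0}
      outputs = {"river export": 17.0, "crop harvest": 28.0}
      residual = sum(inputs.values()) - sum(outputs.values())
      print(f"denitrification + volatilization + change-in-storage: {residual} kt N/yr")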

  5. Turbulent transport in premixed flames

    NASA Technical Reports Server (NTRS)

    Rutland, C. J.; Cant, R. S.

    1994-01-01

    Simulations of planar, premixed turbulent flames with heat release were used to study turbulent transport. Reynolds stress and Reynolds flux budgets were obtained and used to guide the investigation of important physical effects. Essentially all pressure terms in the transport equations were found to be significant. In the Reynolds flux equations, these terms are the major source of counter-gradient transport. Viscous and molecular terms were also found to be significant, with both dilatational and solenoidal terms contributing to the Reynolds stress dissipation. The BML theory of premixed turbulent combustion was critically examined in detail. The BML bimodal pdf was found to agree well with the DNS data. All BML decompositions, through the third moments, show very good agreement with the DNS results. Several BML models for conditional terms were checked using the DNS data and were found to require more extensive development.

  6. Assessing the risk of second malignancies after modern radiotherapy

    PubMed Central

    Newhauser, Wayne D.; Durante, Marco

    2014-01-01

    Recent advances in radiotherapy have enabled the use of different types of particles, such as protons and heavy ions, as well as refinements to the treatment of tumours with standard sources (photons). However, the risk of second cancers arising in long-term survivors continues to be a problem. The long-term risks from treatments such as particle therapy have not yet been determined and are unlikely to become apparent for many years. Therefore, there is a need to develop risk assessments based on our current knowledge of radiation-induced carcinogenesis. PMID:21593785

  7. Glossary of reference terms for alternative test methods and their validation.

    PubMed

    Ferrario, Daniele; Brustio, Roberta; Hartung, Thomas

    2014-01-01

    This glossary was developed to provide technical references to support work in the field of alternatives to animal testing. It was compiled from various existing reference documents coming from different sources and is meant to be a point of reference on alternatives to animal testing. Given the ever-increasing number of alternative test methods and approaches being developed over the last decades, a combination, revision, and harmonization of earlier published collections of terms used in the validation of such methods is required. The need to update previous glossary efforts came from the acknowledgement that new words have emerged with the development of new approaches, while others have become obsolete, and the meaning of some terms has partially changed over time. With this glossary we intend to provide guidance on issues related to the validation of new or updated testing methods consistent with current approaches. Moreover, because of new developments and technologies, a glossary needs to be a living, constantly updated document. An Internet-based version of this compilation may be found at http://altweb.jhsph.edu/, allowing the addition of new material.

  8. Typology of historical sources and the reconstruction of long-term historical changes of riverine fish: a case study of the Austrian Danube and northern Russian rivers

    PubMed Central

    Haidvogl, Gertrud; Lajus, Dmitry; Pont, Didier; Schmid, Martin; Jungwirth, Mathias; Lajus, Julia

    2014-01-01

    Historical data are widely used in river ecology to define reference conditions or to investigate the evolution of aquatic systems. Most studies rely on printed documents from the 19th century, thus missing pre-industrial states and human impacts. This article discusses historical sources that can be used to reconstruct the development of riverine fish communities from the Late Middle Ages until the mid-20th century. Based on the studies of the Austrian Danube and northern Russian rivers, we propose a classification scheme of printed and archival sources and describe their fish ecological contents. Five types of sources were identified using the origin of sources as the first criterion: (i) early scientific surveys, (ii) fishery sources, (iii) fish trading sources, (iv) fish consumption sources and (v) cultural representations of fish. Except for early scientific surveys, all these sources were produced within economic and administrative contexts. They did not aim to report about historical fish communities, but do contain information about commercial fish and their exploitation. All historical data need further analysis for a fish ecological interpretation. Three case studies from the investigated Austrian and Russian rivers demonstrate the use of different source types and underline the necessity for a combination of different sources and a methodology combining different disciplinary approaches. Using a large variety of historical sources to reconstruct the development of past fish ecological conditions can support future river management by going beyond the usual approach of static historical reference conditions. PMID:25284959

  9. Replacing effective spectral radiance by temperature in occupational exposure limits to protect against retinal thermal injury from light and near IR radiation.

    PubMed

    Madjidi, Faramarz; Behroozy, Ali

    2014-01-01

    Exposure to visible light and near infrared (NIR) radiation in the wavelength region of 380 to 1400 nm may cause thermal retinal injury. In this analysis, the effective spectral radiance of a hot source is replaced by its temperature in the exposure limit values for the 380-1400 nm region. This article describes the development and implementation of a computer code to predict the temperatures corresponding to the exposure limits proposed by the American Conference of Governmental Industrial Hygienists (ACGIH). Viewing duration and apparent diameter of the source were inputs to the computer code. In the first stage, an infinite series was derived for the calculation of spectral radiance by integrating Planck's law. In the second stage, to calculate the effective spectral radiance, the initial terms of this infinite series were retained and the integration was performed by multiplying these terms by a weighting factor R(λ) over the wavelength region 380-1400 nm. In the third stage, the computer code was used to find the source temperature that emits the same effective spectral radiance. As a result, based only on measuring the source temperature and accounting for the exposure time and the apparent diameter of the source, it is possible to decide whether exposure to visible and NIR radiation in any 8-hr workday is permissible. The substitution of source temperature for effective spectral radiance provides a convenient way to evaluate exposure to visible light and NIR.
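    A minimal sketch of the three stages follows, under stated assumptions: Planck's law gives the spectral radiance, a weighted integral over 380-1400 nm gives the effective radiance, and a root search recovers the temperature that matches a target value. The flat weighting function is a placeholder, not the ACGIH R(λ) table, and the integration uses a simple trapezoid rule rather than the series expansion described in the abstract.

```python
import numpy as np

h, c, kB = 6.62607015e-34, 2.99792458e8, 1.380649e-23   # SI constants

def planck_radiance(lam, T):
    """Planck spectral radiance B(lambda, T) in W sr^-1 m^-3."""
    return (2.0 * h * c**2 / lam**5) / np.expm1(h * c / (lam * kB * T))

def effective_radiance(T, weight):
    """Integrate R(lambda)*B(lambda, T) over 380-1400 nm (trapezoid rule)."""
    lam = np.linspace(380e-9, 1400e-9, 2000)
    return np.trapz(weight(lam) * planck_radiance(lam, T), lam)

# Placeholder weighting function -- NOT the actual ACGIH R(lambda) table.
weight = lambda lam: np.ones_like(lam)

def temperature_for_radiance(L_target, weight, lo=500.0, hi=6000.0):
    """Bisection for the source temperature whose effective radiance is L_target
    (the effective radiance increases monotonically with temperature)."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if effective_radiance(mid, weight) < L_target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(temperature_for_radiance(effective_radiance(3000.0, weight), weight))  # ~3000 K
```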

  10. A Study of Regional Waveform Calibration in the Eastern Mediterranean Region.

    NASA Astrophysics Data System (ADS)

    di Luccio, F.; Pino, A.; Thio, H.

    2002-12-01

    We modeled Pnl phases from several moderate-magnitude events in the eastern Mediterranean to test methods and to develop path calibrations for source determination. The study region, spanning from the eastern part of the Hellenic arc to the eastern Anatolian fault, is mostly affected by moderate earthquakes that can nevertheless produce significant damage. The selected area comprises several tectonic environments, which increases the difficulty of waveform modeling. The results of this study are useful for the analysis of regional seismicity and for seismic hazard assessment, in particular because very few broadband seismic stations are available in the selected area. The resulting velocity model gives a crustal thickness of 30 km and low upper-mantle velocities. The inversion procedure applied to determine the source mechanism was successful, including in discriminating depth, for the entire range of selected paths. We conclude that, with proper calibration of the seismic structure and high-quality broadband data, it is possible to determine the seismic source mechanism even with a single station.

  11. Taste and odor occurrence in Lake William C. Bowen and Municipal Reservoir #1, Spartanburg County, South Carolina

    USGS Publications Warehouse

    Journey, Celeste; Arrington, Jane M.

    2009-01-01

    The U.S. Geological Survey and Spartanburg Water are working cooperatively on an ongoing study of Lake Bowen and Reservoir #1 to identify environmental factors that enhance or influence the production of geosmin in the source-water reservoirs. Spartanburg Water is using information from this study to develop management strategies to reduce (short-term solution) and prevent (long-term solution) geosmin occurrence. Spartanburg Water utility treats and distributes drinking water to the Spartanburg area of South Carolina. The drinking water sources for the area are Lake William C. Bowen (Lake Bowen) and Municipal Reservoir #1 (Reservoir #1), located north of Spartanburg. These reservoirs, which were formed by the impoundment of the South Pacolet River, were assessed in 2006 by the South Carolina Department of Health and Environmental Control (SCDHEC) as being fully supportive of all uses based on established criteria. Nonetheless, Spartanburg Water had noted periodic taste and odor problems due to the presence of geosmin, a naturally occurring compound in the source water. Geosmin is not harmful, but its presence in drinking water is aesthetically unpleasant.

  12. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  13. Task III: Development of an Effective Computational Methodology for Body Force Representation of High-speed Rotor 37

    NASA Technical Reports Server (NTRS)

    Tan, Choon-Sooi; Suder, Kenneth (Technical Monitor)

    2003-01-01

    A framework for an effective computational methodology for characterizing the stability and the impact of distortion in high-speed multi-stage compressors is being developed. The methodology consists of using a few isolated-blade-row Navier-Stokes solutions for each blade row to construct a body force database. The purpose of the body force database is to replace each blade row in a multi-stage compressor by a body force distribution that produces the same pressure rise and flow turning. To do this, each body force database is generated in such a way that it can respond to changes in local flow conditions. Once the database is generated, no further Navier-Stokes computations are necessary. The process is repeated for every blade row in the multi-stage compressor. The body forces are then embedded as source terms in an Euler solver. The method is developed to have the capability to compute the performance in a flow that has radial as well as circumferential non-uniformity with a length scale larger than a blade pitch; thus it can potentially be used to characterize the stability of a compressor under design. It is these two latter features, as well as the accompanying procedure to obtain the body force representation, that distinguish the present methodology from the streamline curvature method. The overall computational procedures have been developed. A dimensional analysis was carried out to determine the local flow conditions for parameterizing the magnitudes of the local body force representation of blade rows. An Euler solver was modified to embed the body forces as source terms. The results from the dimensional analysis show that the body forces can be parameterized in terms of the two relative flow angles, the relative Mach number, and the Reynolds number. For flow in a high-speed transonic blade row, they can be parameterized in terms of the local relative Mach number alone.
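    The sketch below illustrates the lookup-and-embed idea in its simplest, single-parameter form. The table values, parameterization, and names are illustrative placeholders, not the Rotor 37 database; the full method would also key the table on the two relative flow angles and the Reynolds number.

```python
import numpy as np

# Hypothetical single-parameter body-force database for one blade row:
# force per unit volume tabulated against the local relative Mach number.
mach_table  = np.array([0.5, 0.7, 0.9, 1.1, 1.3])
force_table = np.array([1.0e4, 1.8e4, 2.9e4, 3.5e4, 3.2e4])    # N/m^3

def body_force(local_rel_mach):
    """Look up the blade-row body force for the current local flow state."""
    return np.interp(local_rel_mach, mach_table, force_table)

# Embedding as a source term: force times cell volume is added to the
# momentum residual of each Euler cell inside the blade-row region.
cell_volume = 2.0e-6            # m^3, illustrative
local_rel_mach = 0.95
momentum_source = body_force(local_rel_mach) * cell_volume
print(momentum_source)
```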

  14. Comparison of Physics Frameworks for WebGL-Based Game Engine

    NASA Astrophysics Data System (ADS)

    Yogya, Resa; Kosala, Raymond

    2014-03-01

    Recently, a new technology called WebGL has shown a lot of potential for developing games. However, since this technology is still new, much of its potential in the game development area remains unexplored. This paper tries to uncover the potential of integrating physics frameworks with WebGL technology in a game engine for developing 2D or 3D games. Specifically, we integrated three open source physics frameworks: Bullet, Cannon, and JigLib into a WebGL-based game engine. Through experiments, we assessed these frameworks in terms of their correctness or accuracy, performance, completeness and compatibility. The results show that it is possible to integrate open source physics frameworks into a WebGL-based game engine, and Bullet is the best physics framework to be integrated into the WebGL-based game engine.

  15. Development and Characterization of a Laser-Induced Acoustic Desorption Source.

    PubMed

    Huang, Zhipeng; Ossenbrüggen, Tim; Rubinsky, Igor; Schust, Matthias; Horke, Daniel A; Küpper, Jochen

    2018-03-20

    A laser-induced acoustic desorption source, developed for use at central facilities, such as free-electron lasers, is presented. It features prolonged measurement times and a fixed interaction point. A novel sample deposition method using aerosol spraying provides uniform sample coverage and hence stable signal intensity. Utilizing strong-field ionization as a universal detection scheme, the produced molecular plume is characterized in terms of number density, spatial extent, fragmentation, temporal distribution, translational velocity, and translational temperature. The effect of desorption laser intensity on these plume properties is evaluated. While translational velocity is invariant for different desorption laser intensities, pointing to a nonthermal desorption mechanism, the translational temperature increases significantly and higher fragmentation is observed with increased desorption laser fluence.

  16. Unregulated private wells in the Republic of Ireland: consumer awareness, source susceptibility and protective actions.

    PubMed

    Hynds, Paul D; Misstear, Bruce D; Gill, Laurence W

    2013-09-30

    While the safety of public drinking water supplies in the Republic of Ireland is governed and monitored at both local and national levels, there are currently no legislative tools in place relating to private supplies. It is therefore paramount that private well owners (and users) be aware of source specifications and potential contamination risks, to ensure adequate water quality. The objective of this study was to investigate the level of awareness among private well owners in the Republic of Ireland relating to source characterisation and groundwater contamination issues. This was undertaken through interviews with 245 private well owners. Statistical analysis indicates that respondents' source type significantly influences owner awareness, particularly regarding well construction and design parameters. Water treatment, source maintenance and regular water quality testing are considered the three primary "protective actions" (or "stewardship activities") against consumption of contaminated groundwater and were reported as being absent in 64%, 72% and 40% of cases, respectively. Results indicate that the level of awareness exhibited by well users did not significantly affect the likelihood of their source being contaminated (source susceptibility); increased awareness on behalf of well users was associated with increased levels of protective action, particularly among borehole owners. Hence, lower levels of awareness may result in increased contraction of waterborne illnesses where contaminants have entered the well. Accordingly, focused educational strategies to increase awareness among private groundwater users are advocated in the short term; the development and introduction of formal legislation is recommended in the long term, including an integrated programme of well inspections and risk assessments. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Co-development, innovation and mutual learning--or how we need to turn the world upside down.

    PubMed

    Crisp, Nigel

    2015-12-01

    This paper describes the scope for mutual learning and the sharing of innovation between different parts of the world. It argues that the top-down concept of international development--with its connotation that low-income countries need to develop in ways that emulate progress in richer, more "developed" ones--needs to be replaced with the idea of co-development and learning and sharing together. Similarly, it advocates replacing the term "reverse innovation" with the concept of global sourcing of innovation. Copyright © 2015. Published by Elsevier Inc.

  18. Develop a Business Plan

    DTIC Science & Technology

    1993-10-01

    DEVELOP A BUSINESS PLAN. Final report (October 1, 1993) on a short-term research and development task proposed under contract DLA900-87-D-0017. Executive summary: funding for CAR's activities has come from a variety of sources: federal, state, Clemson...

  19. Modernization and unification: Strategic goals for NASA STI program

    NASA Technical Reports Server (NTRS)

    Blados, W.; Cotter, Gladys A.

    1993-01-01

    Information is increasingly becoming a strategic resource in all societies and economies. The NASA Scientific and Technical Information (STI) Program has initiated a modernization program to address the strategic importance and changing characteristics of information. This modernization effort applies new technology to current processes to provide near-term benefits to the user. At the same time, we are developing a long-term modernization strategy designed to transition the program to a multimedia, global 'library without walls.' Notwithstanding this modernization program, it is recognized that no one information center can hope to collect all the relevant data. We see information and information systems changing and becoming more international in scope. We are finding that many nations are expending resources on national systems which duplicate each other. At the same time that this duplication exists, many useful sources of aerospace information are not being collected because of resource limitations. If nations cooperate to develop an international aerospace information system, resources can be used efficiently to cover expanded sources of information. We must consider forming a coalition to collect and provide access to disparate, multidisciplinary sources of information, and to develop standardized tools for documenting and manipulating this data and information. In view of recent technological developments in information science and technology, as well as the reality of scarce resources in all nations, it is time to explore the mutually beneficial possibilities offered by cooperation and international resource sharing. International resources need to be mobilized in a coordinated manner to move us towards this goal. This paper reviews the NASA modernization program and raises for consideration new possibilities for unification of the various aerospace database efforts toward a cooperative international aerospace database initiative that can optimize the cost/benefit equation for all participants.

  20. Developing a comprehensive time series of GDP per capita for 210 countries from 1950 to 2015

    PubMed Central

    2012-01-01

    Background Income has been extensively studied and utilized as a determinant of health. There are several sources of income expressed as gross domestic product (GDP) per capita, but there are no time series that are complete for the years between 1950 and 2015 for the 210 countries for which data exist. It is in the interest of population health research to establish a global time series that is complete from 1950 to 2015. Methods We collected GDP per capita estimates expressed in either constant US dollar terms or international dollar terms (corrected for purchasing power parity) from seven sources. We applied several stages of models, including ordinary least-squares regressions and mixed effects models, to complete each of the seven source series from 1950 to 2015. The three US dollar and four international dollar series were each averaged to produce two new GDP per capita series. Results and discussion Nine complete series from 1950 to 2015 for 210 countries are available for use. These series can serve various analytical purposes and can illustrate myriad economic trends and features. The derivation of the two new series allows for researchers to avoid any series-specific biases that may exist. The modeling approach used is flexible and will allow for yearly updating as new estimates are produced by the source series. Conclusion GDP per capita is a necessary tool in population health research, and our development and implementation of a new method has allowed for the most comprehensive known time series to date. PMID:22846561
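    A toy sketch of one stage of such series completion follows: an ordinary least-squares fit over the overlapping years of a sparse source and a more complete reference series, used to predict the missing years. The data and the log-log functional form are illustrative assumptions only; the study also used mixed-effects models.

```python
import numpy as np

# Synthetic illustration of filling gaps in one GDP-per-capita source using an
# OLS fit against a more complete reference series over their overlapping years.
years = np.arange(1950, 1961)
reference = np.array([1.00, 1.03, 1.07, 1.10, 1.15, 1.18, 1.22, 1.27, 1.30, 1.34, 1.40])
sparse    = np.array([np.nan, np.nan, 2.10, 2.16, 2.26, np.nan, 2.40, 2.50, np.nan, 2.64, np.nan])

mask = ~np.isnan(sparse)
# Log-log OLS fit on the overlapping years, then predict the missing years.
slope, intercept = np.polyfit(np.log(reference[mask]), np.log(sparse[mask]), 1)
completed = sparse.copy()
completed[~mask] = np.exp(intercept + slope * np.log(reference[~mask]))
print(np.round(completed, 2))
```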

  1. A real-time laser feedback control method for the three-wave laser source used in the polarimeter-interferometer diagnostic on Joint-TEXT tokamak

    NASA Astrophysics Data System (ADS)

    Xiong, C. Y.; Chen, J.; Li, Q.; Liu, Y.; Gao, L.

    2014-12-01

    A three-wave laser polarimeter-interferometer, equipped with three independent far-infrared laser sources, has been developed on Joint-TEXT (J-TEXT) tokamak. The diagnostic system is capable of high-resolution temporal and phase measurement of the Faraday angle and line-integrated density. However, for long-term operation (>10 min), the free-running lasers can lead to large drifts of the intermediate frequencies (˜100-˜500 kHz/10 min) and decay of laser power (˜10%-˜20%/10 min), which act to degrade diagnostic performance. In addition, these effects lead to increased maintenance cost and limit measurement applicability to long pulse/steady state experiments. To solve this problem, a real-time feedback control method of the laser source is proposed. By accurately controlling the length of each laser cavity, both the intermediate frequencies and laser power can be simultaneously controlled: the intermediate frequencies are controlled according to the pre-set values, while the laser powers are maintained at an optimal level. Based on this approach, a real-time feedback control system has been developed and applied on J-TEXT polarimeter-interferometer. Long-term (theoretically no time limit) feedback of intermediate frequencies (maximum change less than ±12 kHz) and laser powers (maximum relative power change less than ±7%) has been successfully achieved.

  2. A real-time laser feedback control method for the three-wave laser source used in the polarimeter-interferometer diagnostic on Joint-TEXT tokamak.

    PubMed

    Xiong, C Y; Chen, J; Li, Q; Liu, Y; Gao, L

    2014-12-01

    A three-wave laser polarimeter-interferometer, equipped with three independent far-infrared laser sources, has been developed on Joint-TEXT (J-TEXT) tokamak. The diagnostic system is capable of high-resolution temporal and phase measurement of the Faraday angle and line-integrated density. However, for long-term operation (>10 min), the free-running lasers can lead to large drifts of the intermediate frequencies (∼100-∼500 kHz/10 min) and decay of laser power (∼10%-∼20%/10 min), which act to degrade diagnostic performance. In addition, these effects lead to increased maintenance cost and limit measurement applicability to long pulse/steady state experiments. To solve this problem, a real-time feedback control method of the laser source is proposed. By accurately controlling the length of each laser cavity, both the intermediate frequencies and laser power can be simultaneously controlled: the intermediate frequencies are controlled according to the pre-set values, while the laser powers are maintained at an optimal level. Based on this approach, a real-time feedback control system has been developed and applied on J-TEXT polarimeter-interferometer. Long-term (theoretically no time limit) feedback of intermediate frequencies (maximum change less than ±12 kHz) and laser powers (maximum relative power change less than ±7%) has been successfully achieved.
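    A minimal sketch of the kind of feedback loop described: a proportional-integral controller trims one cavity length to hold the intermediate frequency at its pre-set value while the free-running frequency slowly drifts. The plant model, gains, and actuator sensitivity are assumptions for illustration, not the J-TEXT implementation.

```python
def simulate_pi_lock(f_set=455.0, steps=600):
    """Discrete proportional-integral lock of an intermediate frequency (kHz)."""
    f_free = 300.0            # free-running intermediate frequency, drifts slowly
    trim = 0.0                # cavity-length trim command (arbitrary units)
    gain = 50.0               # assumed IF shift per unit of trim (kHz/unit)
    kp, ki, integral = 0.01, 0.002, 0.0
    for _ in range(steps):
        f_if = f_free + gain * trim      # current intermediate frequency
        error = f_set - f_if
        integral += error
        trim = kp * error + ki * integral
        f_free += -0.05                  # slow drift of the free-running lasers
    return f_free + gain * trim

print(round(simulate_pi_lock(), 1))      # settles close to the 455 kHz set-point
```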

  3. Related Studies in Long Term Lithium Battery Stability

    NASA Technical Reports Server (NTRS)

    Horning, R. J.; Chua, D. L.

    1984-01-01

    The continuing growth of the use of lithium electrochemical systems in a wide variety of both military and industrial applications is primarily a result of the significant benefits associated with the technology such as high energy density, wide temperature operation and long term stability. The stability or long term storage capability of a battery is a function of several factors, each important to the overall storage life and, therefore, each potentially a problem area if not addressed during the design, development and evaluation phases of the product cycle. Design (e.g., reserve vs active), inherent material thermal stability, material compatibility and self-discharge characteristics are examples of factors key to the storability of a power source.

  4. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohd, Shukri; Holford, Karen M.; Pullin, Rhys

    2014-02-12

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the DeltaT location method but requires no initial acoustic calibration of the structure.

  5. Time-frequency approach to underdetermined blind source separation.

    PubMed

    Xie, Shengli; Yang, Liu; Yang, Jun-Mei; Zhou, Guoxu; Xiang, Yong

    2012-02-01

    This paper presents a new time-frequency (TF) underdetermined blind source separation approach based on the Wigner-Ville distribution (WVD) and the Khatri-Rao product to separate N non-stationary sources from M (M < N) mixtures. First, an improved method is proposed for estimating the mixing matrix, where the negative value of the auto WVD of the sources is fully considered. Then, after extracting all the auto-term TF points, the auto WVD value of the sources at every auto-term TF point can be found exactly with the proposed approach, no matter how many active sources there are, as long as N ≤ 2M-1. Further discussion about the extraction of auto-term TF points is made, and finally numerical simulation results are presented to show the superiority of the proposed algorithm by comparing it with existing ones.
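    A minimal sketch of the discrete Wigner-Ville distribution that this class of method builds on is given below; the lag handling and frequency-bin convention are simplified for illustration and are not the authors' algorithm.

```python
import numpy as np

def wigner_ville(x):
    """Discrete (pseudo) Wigner-Ville distribution of an analytic signal x.

    Each time slice is an FFT over the symmetric lag product
    x[n+m] * conj(x[n-m]); with this convention a tone at normalized
    frequency f peaks near bin 2*f*N.
    """
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        m_max = min(n, N - 1 - n)              # lags available at this time index
        kernel = np.zeros(N, dtype=complex)
        for m in range(-m_max, m_max + 1):
            kernel[m % N] = x[n + m] * np.conj(x[n - m])
        W[n] = np.real(np.fft.fft(kernel))
    return W

# Toy two-component signal: the WVD concentrates energy along each component's
# instantaneous frequency (auto-terms), plus oscillating cross-terms between them.
t = np.arange(128)
x = np.exp(2j * np.pi * 0.1 * t) + np.exp(2j * np.pi * 0.3 * t)
W = wigner_ville(x)
print(W.shape)                                  # one N x N time-frequency map
```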

  6. Helicon plasma generator-assisted surface conversion ion source for the production of H- ion beams at the Los Alamos Neutron Science Centera)

    NASA Astrophysics Data System (ADS)

    Tarvainen, O.; Rouleau, G.; Keller, R.; Geros, E.; Stelzer, J.; Ferris, J.

    2008-02-01

    The converter-type negative ion source currently employed at the Los Alamos Neutron Science Center (LANSCE) is based on cesium enhanced surface production of H- ion beams in a filament-driven discharge. In this kind of ion source the extracted H- beam current is limited by the achievable plasma density, which depends primarily on the electron emission current from the filaments. The emission current can be increased by increasing the filament temperature but, unfortunately, this leads not only to shorter filament lifetime but also to an increase in metal evaporation from the filament, which deposits on the H- converter surface and degrades its performance. Therefore, we have started an ion source development project focused on replacing these thermionic cathodes (filaments) of the converter source by a helicon plasma generator capable of producing high-density hydrogen plasmas with low electron energy. Our studies have so far shown that the plasma density of the surface conversion source can be increased significantly by exciting a helicon wave in the plasma, and we expect to improve the performance of the surface converter H- ion source in terms of beam brightness and time between services. The design of this new source and preliminary results are presented, along with a discussion of physical processes relevant for H- ion beam production with this novel design. Ultimately, we perceive this approach as an interim step towards our long-term goal, combining a helicon plasma generator with an SNS-type main discharge chamber, which will allow us to individually optimize the plasma properties of the plasma cathode (helicon) and H- production (main discharge) in order to further improve the brightness of extracted H- ion beams.

  7. Helicon plasma generator-assisted surface conversion ion source for the production of H(-) ion beams at the Los Alamos Neutron Science Center.

    PubMed

    Tarvainen, O; Rouleau, G; Keller, R; Geros, E; Stelzer, J; Ferris, J

    2008-02-01

    The converter-type negative ion source currently employed at the Los Alamos Neutron Science Center (LANSCE) is based on cesium enhanced surface production of H(-) ion beams in a filament-driven discharge. In this kind of ion source the extracted H(-) beam current is limited by the achievable plasma density, which depends primarily on the electron emission current from the filaments. The emission current can be increased by increasing the filament temperature but, unfortunately, this leads not only to shorter filament lifetime but also to an increase in metal evaporation from the filament, which deposits on the H(-) converter surface and degrades its performance. Therefore, we have started an ion source development project focused on replacing these thermionic cathodes (filaments) of the converter source by a helicon plasma generator capable of producing high-density hydrogen plasmas with low electron energy. Our studies have so far shown that the plasma density of the surface conversion source can be increased significantly by exciting a helicon wave in the plasma, and we expect to improve the performance of the surface converter H(-) ion source in terms of beam brightness and time between services. The design of this new source and preliminary results are presented, along with a discussion of physical processes relevant for H(-) ion beam production with this novel design. Ultimately, we perceive this approach as an interim step towards our long-term goal, combining a helicon plasma generator with an SNS-type main discharge chamber, which will allow us to individually optimize the plasma properties of the plasma cathode (helicon) and H(-) production (main discharge) in order to further improve the brightness of extracted H(-) ion beams.

  8. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 15 2010-04-01 2010-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  9. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 15 2011-04-01 2011-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  10. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 26 Internal Revenue 15 2012-04-01 2012-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  11. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 26 Internal Revenue 15 2014-04-01 2014-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  12. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 26 Internal Revenue 15 2013-04-01 2013-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  13. The Plant Research Unit: Long-Term Plant Growth Support for Space Station

    NASA Technical Reports Server (NTRS)

    Heathcote, D. G.; Brown, C. S.; Goins, G. D.; Kliss, M.; Levine, H.; Lomax, P. A.; Porter, R. L.; Wheeler, R.

    1996-01-01

    The specifications of the plant research unit (PRU) plant habitat, designed for space station operations, are presented. A prototype brassboard model of the PRU is described, and the results of the subsystems tests are outlined. The effects of the long term red light emitting diode (LED) illumination as the sole source for plant development were compared with red LEDs supplemented with blue wavelengths, and white fluorescent sources. It was found that wheat and Arabidopsis were able to complete a life cycle under red LEDs alone, but with differences in physiology and morphology. The differences noted were greatest for the Arabidopsis, where the time to flowering was increased under red illumination. The addition of 10 percent of blue light was effective in eliminating the observed differences. The results of the comparative testing of three nutrient delivery systems for the PRU are discussed.

  14. Endophytic Fungi—Alternative Sources of Cytotoxic Compounds: A Review

    PubMed Central

    Uzma, Fazilath; Mohan, Chakrabhavi D.; Hashem, Abeer; Konappa, Narasimha M.; Rangappa, Shobith; Kamath, Praveen V.; Singh, Bhim P.; Mudili, Venkataramana; Gupta, Vijai K.; Siddaiah, Chandra N.; Chowdappa, Srinivas; Alqarawi, Abdulaziz A.; Abd_Allah, Elsayed F.

    2018-01-01

    Cancer is a major cause of death worldwide, with an increasing number of cases being reported annually. The elevated rate of mortality necessitates a global challenge to explore newer sources of anticancer drugs. Recent advancements in cancer treatment involve the discovery and development of new and improved chemotherapeutics derived from natural or synthetic sources. Natural sources offer the potential of finding new structural classes with unique bioactivities for cancer therapy. Endophytic fungi represent a rich source of bioactive metabolites that can be manipulated to produce desirable novel analogs for chemotherapy. This review offers a current and integrative account of clinically used anticancer drugs such as taxol, podophyllotoxin, camptothecin, and vinca alkaloids in terms of their mechanism of action, isolation from endophytic fungi and their characterization, yield obtained, and fungal strain improvement strategies. It also covers recent literature on endophytic fungal metabolites from terrestrial, mangrove, and marine sources as potential anticancer agents and emphasizes the findings for cytotoxic bioactive compounds tested against specific cancer cell lines. PMID:29755344

  15. Frequency tunable electronic sources working at room temperature in the 1 to 3 THz band

    NASA Astrophysics Data System (ADS)

    Maestrini, Alain; Mehdi, Imran; Siles, José V.; Lin, Robert; Lee, Choonsup; Chattopadhyay, Goutam; Pearson, John; Siegel, Peter

    2012-10-01

    Compact, room temperature terahertz sources are much needed in the 1 to 3 THz band for developing multi-pixel heterodyne receivers for astrophysics and planetary science or for building short-range, high-spatial-resolution THz imaging systems able to see through low-water-content and non-metallic materials, smoke or dust for a variety of applications ranging from the inspection of art artifacts to the detection of masked or concealed objects. All-solid-state electronic sources based on a W-band synthesizer followed by a high-power W-band amplifier and a cascade of Schottky diode based THz frequency multipliers are now capable of producing more than 1 mW at 0.9 THz, 50 μW at 2 THz and 18 μW at 2.6 THz without the need of any cryogenic system. These sources are frequency agile and have a relative bandwidth of 10 to 15%, limited by the high-power W-band amplifiers. The paper will present the latest developments of this technology and its perspective in terms of frequency range, bandwidth and power.

  16. Development Status of Ion Source at J-PARC Linac Test Stand

    NASA Astrophysics Data System (ADS)

    Yamazaki, S.; Takagi, A.; Ikegami, K.; Ohkoshi, K.; Ueno, A.; Koizumi, I.; Oguri, H.

    The Japan Proton Accelerator Research Complex (J-PARC) linac power upgrade program is now in progress in parallel with user operation. To realize a nominal performance of 1 MW at the 3 GeV Rapid Cycling Synchrotron and 0.75 MW at the Main Ring synchrotron, we need to upgrade the peak beam current (50 mA) of the linac. For the upgrade program, we are testing a new front-end system, which comprises a cesiated RF-driven H- ion source and a new radio-frequency quadrupole linac (RFQ). The H- ion source was developed to satisfy the J-PARC upgrade requirements of an H- ion-beam current of 60 mA and a lifetime of more than 50 days. On February 6, 2014, the first 50 mA H- beams were accelerated by the RFQ during a beam test. To demonstrate the performance of the ion source before its installation in the summer of 2014, we tested its long-term stability through continuous beam operation, which included estimating the lifetime of the RF antenna and evaluating the cesium consumption.

  17. Microbial quality of improved drinking water sources: evidence from western Kenya and southern Vietnam.

    PubMed

    Grady, Caitlin A; Kipkorir, Emmanuel C; Nguyen, Kien; Blatchley, E R

    2015-06-01

    In recent decades, more than 2 billion people have gained access to improved drinking water sources thanks to extensive effort from governments, and public and private sector entities. Despite this progress, many water sector development interventions do not provide access to safe water or fail to be sustained for long-term use. The authors examined drinking water quality of previously implemented water improvement projects in three communities in western Kenya and three communities in southern Vietnam. The cross-sectional study of 219 households included measurements of viable Escherichia coli. High rates of E. coli prevalence in these improved water sources were found in many of the samples. These findings suggest that measures above and beyond the traditional 'improved source' definition may be necessary to ensure truly safe water throughout these regions.

  18. Analytical method for optimal source reduction with monitored natural attenuation in contaminated aquifers

    USGS Publications Warehouse

    Widdowson, M.A.; Chapelle, F.H.; Brauner, J.S.; ,

    2003-01-01

    A method is developed for optimizing monitored natural attenuation (MNA) and the reduction in the aqueous source zone concentration (ΔC) required to meet a site-specific regulatory target concentration. The mathematical model consists of two one-dimensional equations of mass balance for the aqueous phase contaminant, to coincide with up to two distinct zones of transformation, and appropriate boundary and intermediate conditions. The solution is written in terms of zone-dependent Peclet and Damköhler numbers. The model is illustrated at a chlorinated solvent site where MNA was implemented following source treatment using in-situ chemical oxidation. The results demonstrate that by not taking into account a variable natural attenuation capacity (NAC), a lower target ΔC is predicted, resulting in unnecessary source concentration reduction and cost with little benefit to achieving site-specific remediation goals.
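    As a rough illustration of how such a solution is used, the sketch below assumes a single uniform zone and the standard steady-state solution of 1-D advection-dispersion with first-order decay, written in terms of Peclet and Damköhler numbers (the paper's model couples up to two zones). The parameter values are illustrative, not the site data.

```python
import numpy as np

def attenuation_factor(Pe, Da):
    """Steady-state attenuation C(L)/C0 for 1-D advection-dispersion with
    first-order decay, in terms of Pe = v*L/D and Da = lambda*L/v."""
    return np.exp(0.5 * Pe * (1.0 - np.sqrt(1.0 + 4.0 * Da / Pe)))

# Illustrative numbers only.
C_source, C_target = 5.0, 0.005      # mg/L at the source and at the compliance point
Pe, Da = 20.0, 3.0                   # one uniform attenuation zone

C_allowed = C_target / attenuation_factor(Pe, Da)   # maximum allowable source concentration
delta_C = max(C_source - C_allowed, 0.0)            # required source reduction
print(f"attenuation factor  = {attenuation_factor(Pe, Da):.3e}")
print(f"required reduction  = {delta_C:.2f} mg/L")
```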

  19. Integrating diverse forage sources reduces feed gaps on mixed crop-livestock farms.

    PubMed

    Bell, L W; Moore, A D; Thomas, D T

    2017-12-04

    Highly variable climates induce large variability in the supply of forage for livestock and so farmers must manage their livestock systems to reduce the risk of feed gaps (i.e. periods when livestock feed demand exceeds forage supply). However, mixed crop-livestock farmers can utilise a range of feed sources on their farms to help mitigate these risks. This paper reports on the development and application of a simple whole-farm feed-energy balance calculator which is used to evaluate the frequency and magnitude of feed gaps. The calculator matches long-term simulations of variation in forage and metabolisable energy supply from diverse sources against energy demand for different livestock enterprises. Scenarios of increasing the diversity of forage sources in livestock systems is investigated for six locations selected to span Australia's crop-livestock zone. We found that systems relying on only one feed source were prone to higher risk of feed gaps, and hence, would often have to reduce stocking rates to mitigate these risks or use supplementary feed. At all sites, by adding more feed sources to the farm feedbase the continuity of supply of both fresh and carry-over forage was improved, reducing the frequency and magnitude of feed deficits. However, there were diminishing returns from making the feedbase more complex, with combinations of two to three feed sources typically achieving the maximum benefits in terms of reducing the risk of feed gaps. Higher stocking rates could be maintained while limiting risk when combinations of other feed sources were introduced into the feedbase. For the same level of risk, a feedbase relying on a diversity of forage sources could support stocking rates 1.4 to 3 times higher than if they were using a single pasture source. This suggests that there is significant capacity to mitigate both risk of feed gaps at the same time as increasing 'safe' stocking rates through better integration of feed sources on mixed crop-livestock farms across diverse regions and climates.
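    A minimal sketch of the feed-gap bookkeeping follows, under assumed, illustrative numbers rather than the calculator's long-term simulation inputs: monthly metabolisable-energy supply from several feed sources is summed and compared with livestock demand to flag deficit months and their magnitude.

```python
import numpy as np

# Illustrative monthly metabolisable-energy supply (MJ ME per ha per month).
months = ["J", "F", "M", "A", "M", "J", "J", "A", "S", "O", "N", "D"]
supply = {
    "perennial pasture": np.array([900, 700, 500, 400, 300, 250, 300, 500, 900, 1200, 1200, 1000]),
    "forage crop":       np.array([  0,   0,   0, 200, 600, 800, 800, 600, 200,    0,    0,    0]),
    "crop stubble":      np.array([400, 300, 200,   0,   0,   0,   0,   0,   0,    0,  200,  400]),
}
demand = np.full(12, 1100.0)          # MJ ME per ha per month for the chosen stocking rate

total_supply = sum(supply.values())
gap = demand - total_supply           # positive values are feed gaps
in_gap = gap > 0
print("months with a feed gap:", [m for m, g in zip(months, in_gap) if g])
print("mean deficit in gap months (MJ ME/ha):", round(gap[in_gap].mean(), 1))
```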

  20. 12 CFR 201.4 - Availability and terms of credit.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... overnight, as a backup source of funding to a depository institution that is in generally sound financial... to a few weeks as a backup source of funding to a depository institution if, in the judgment of the... very short-term basis, usually overnight, as a backup source of funding to a depository institution...

  1. El estres y los ninos pequenos (Stress and Young Children). ERIC Digest.

    ERIC Educational Resources Information Center

    Jewett, Jan; Peterson, Karen

    Traditionally, stress has been defined in terms of its source (internal, such as hunger, pain, sensitivity to noise and external, such as separation from family, change in family composition, exposure to conflict or violence). Although the research literature tends to focus on the impact of single-variable stressors on children's development, in…

  2. NASA Cold Land Processes Experiment (CLPX 2002/03): Ground-based and near-surface meteorological observations

    Treesearch

    Kelly Elder; Don Cline; Angus Goodbody; Paul Houser; Glen E. Liston; Larry Mahrt; Nick Rutter

    2009-01-01

    A short-term meteorological database has been developed for the Cold Land Processes Experiment (CLPX). This database includes meteorological observations from stations designed and deployed exclusively for CLPX as well as observations available from other sources located in the small regional study area (SRSA) in north-central Colorado. The measured weather parameters...

  3. Self-Esteem, Creativity, and Music: Implications and Directions for Research.

    ERIC Educational Resources Information Center

    VanderArk, Sherman

    1989-01-01

    This paper seeks to give potentially pertinent information and ideas for the development of a model and of hypotheses that are relevant in terms of combining the areas of self-concept and creativity. Selected sources from the areas of psychology, education, and music education are presented as the basis for ideas and thoughts for further research.…

  4. From Little Acorns..: Environmental Action as a Source of Well-Being for Schoolchildren

    ERIC Educational Resources Information Center

    Waite, S.; Goodenough, A.; Norris, V.; Puttick, N.

    2016-01-01

    Pastoral care in education may take many forms but increasing emphasis on education for sustainable development (ESD) and concern about children's disconnection from nature suggests that our understanding of care should perhaps encompass the more than human world. The study described in this article examines longer term perspectives on well-being…

  5. Toward an Integrated System of Income Acquisition and Management: Four Community College Responses.

    ERIC Educational Resources Information Center

    Birmingham, Kathryn M.

    This study argues that community college funding and resource development must become a long-term core function of the institution due to changes in the source of revenue for community colleges. The research problem was: (1) to identify and describe how organizational structure and management activities have changed in four community colleges in…

  6. 78 FR 48156 - Update to An Inventory of Sources and Environmental Releases of Dioxin-Like Compounds in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-07

    ... Environmental Assessment (NCEA) within EPA's Office of Research and Development. In November 2006, EPA released... classified as preliminary and not included in the quantitative inventory. The updated inventory lists the top... 2000. The quantitative results are expressed in terms of the toxicity equivalent (TEQ) of the mixture...

  7. Ohm's Law and Solar Energy. Courseware Evaluation for Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Gates, Earl; And Others

    This courseware evaluation rates the Ohm's Law and Solar Energy program developed by the Iowa Department of Public Instruction. (The program--not contained in this document--covers Ohm's law and resistance problems, passive solar energy, and project ideas and sources.) Part A describes the program in terms of subject area (construction and…

  8. Development of Accommodation Models for Soldiers in Vehicles: Squad

    DTIC Science & Technology

    2014-09-01

    Data from a previous study...body armor and body borne gear. Subject terms: Anthropometry, Posture, Vehicle Occupants, Accommodation.

  9. A review of methods for predicting air pollution dispersion

    NASA Technical Reports Server (NTRS)

    Mathis, J. J., Jr.; Grose, W. L.

    1973-01-01

    Air pollution modeling, and problem areas in air pollution dispersion modeling were surveyed. Emission source inventory, meteorological data, and turbulent diffusion are discussed in terms of developing a dispersion model. Existing mathematical models of urban air pollution, and highway and airport models are discussed along with their limitations. Recommendations for improving modeling capabilities are included.

  10. Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Frederick, J. M.; Hammond, G. E.

    2017-12-01

    Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing its negative side effects? This presentation will discuss steps taken in the development of PFLOTRAN (www.pflotran.org), an open source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continue to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we've leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, as well as unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributions, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A
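    As a generic illustration of the automated-testing practice mentioned above (a pattern sketch only, not PFLOTRAN's actual test harness), a regression test pins a small verification case against a stored reference solution so that legacy behaviour is checked on every change:

```python
import numpy as np

def run_model(parameters):
    """Stand-in for invoking the simulator on a small verification problem."""
    x = np.linspace(0.0, 1.0, 5)
    return parameters["peak"] * np.exp(-x / parameters["decay_length"])

# Reference solution recorded in the repository when the case was last verified.
BASELINE = np.array([1.0, 0.36787944, 0.13533528, 0.04978707, 0.01831564])

def test_baseline_case():
    result = run_model({"peak": 1.0, "decay_length": 0.25})
    assert np.allclose(result, BASELINE, rtol=1e-6), "regression against stored baseline"

test_baseline_case()
```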

  11. Gendered Sources of Distress and Resilience among Afghan Refugees in Northern California: A Cross-Sectional Study.

    PubMed

    Stempel, Carl; Sami, Nilofar; Koga, Patrick Marius; Alemi, Qais; Smith, Valerie; Shirazi, Aida

    2016-12-28

    Recent studies have emphasized the influence of resettlement factors on the mental health of refugees resettling in developed countries. However, little research has addressed gender differences in the nature and influence of resettlement stressors and sources of resilience. We address this gap in knowledge by investigating how gender moderates and mediates the influence of several sources of distress and resilience among 259 Afghan refugees residing in Northern California (USA). Gender moderated the effects of four factors on levels of distress. Intimate and extended family ties have little correlation with men's distress levels, but are strongly associated with lower distress for women. English ability is positively associated with lower distress for women, but not men. In terms of gender ideology, traditionally oriented women and egalitarian men have lower levels of distress. And experiencing greater dissonant acculturation increases distress for men, but not women. The influence of gender interaction terms is substantial and patterns may reflect difficulty adapting to a different gender order. Future studies of similar populations should investigate gender differences in sources of distress and resilience, and efforts to assist new arrivals might inform them of changes in gender roles they may experience, and facilitate opportunities to renegotiate gender roles.

  12. Gendered Sources of Distress and Resilience among Afghan Refugees in Northern California: A Cross-Sectional Study

    PubMed Central

    Stempel, Carl; Sami, Nilofar; Koga, Patrick Marius; Alemi, Qais; Smith, Valerie; Shirazi, Aida

    2016-01-01

    Recent studies have emphasized the influence of resettlement factors on the mental health of refugees resettling in developed countries. However, little research has addressed gender differences in the nature and influence of resettlement stressors and sources of resilience. We address this gap in knowledge by investigating how gender moderates and mediates the influence of several sources of distress and resilience among 259 Afghan refugees residing in Northern California (USA). Gender moderated the effects of four factors on levels of distress. Intimate and extended family ties have little correlation with men’s distress levels, but are strongly associated with lower distress for women. English ability is positively associated with lower distress for women, but not men. In terms of gender ideology, traditionally oriented women and egalitarian men have lower levels of distress. And experiencing greater dissonant acculturation increases distress for men, but not women. The influence of gender interaction terms is substantial and patterns may reflect difficulty adapting to a different gender order. Future studies of similar populations should investigate gender differences in sources of distress and resilience, and efforts to assist new arrivals might inform them of changes in gender roles they may experience, and facilitate opportunities to renegotiate gender roles. PMID:28036054

  13. Multigrid Method for Modeling Multi-Dimensional Combustion with Detailed Chemistry

    NASA Technical Reports Server (NTRS)

    Zheng, Xiaoqing; Liu, Chaoqun; Liao, Changming; Liu, Zhining; McCormick, Steve

    1996-01-01

    A highly accurate and efficient numerical method is developed for modeling 3-D reacting flows with detailed chemistry. A contravariant velocity-based governing system is developed for general curvilinear coordinates to maintain simplicity of the continuity equation and compactness of the discretization stencil. A fully-implicit backward Euler technique and a third-order monotone upwind-biased scheme on a staggered grid are used for the respective temporal and spatial terms. An efficient semi-coarsening multigrid method based on line-distributive relaxation is used as the flow solver. The species equations are solved in a fully coupled way and the chemical reaction source terms are treated implicitly. Example results are shown for a 3-D gas turbine combustor with strong swirling inflows.
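
    The implicit treatment of chemical reaction source terms mentioned above can be illustrated with a minimal backward Euler sketch: each time step solves a nonlinear equation by Newton iteration, which is what keeps stiff reaction terms stable at large time steps. The single-species rate law below is a stand-in model problem under assumed parameters, not the detailed chemistry or multigrid solver of the paper.

```python
import numpy as np

def backward_euler_reaction(y0, k, dt, nsteps, tol=1e-12, max_newton=50):
    """Integrate the stiff model source term dy/dt = -k*y**2 with backward Euler.

    Each step solves the implicit equation y_new - y_old + dt*k*y_new**2 = 0
    by Newton iteration, mimicking an implicit treatment of a chemical source
    term (model problem only, not the detailed chemistry of the paper)."""
    y = y0
    history = [y]
    for _ in range(nsteps):
        y_new = y                                   # initial guess: previous value
        for _ in range(max_newton):
            residual = y_new - y + dt * k * y_new**2
            jacobian = 1.0 + 2.0 * dt * k * y_new
            delta = -residual / jacobian
            y_new += delta
            if abs(delta) < tol:
                break
        y = y_new
        history.append(y)
    return np.array(history)

# A stiff decay (k*dt = 100) that an explicit scheme would need tiny steps for
print(backward_euler_reaction(y0=1.0, k=1e4, dt=0.01, nsteps=5))
```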

  14. Geochemical constraints on sustainable development: Can an advanced global economy achieve long-term stability?

    NASA Astrophysics Data System (ADS)

    Pickard, William F.

    2008-04-01

    The eighty-one stable chemical elements are examined individually with respect to (i) recent annual demand and (ii) worst case long-term availability in a distant future in which they must be extracted from the background sources of air, seawater, and ordinary rock. It is shown that, if a conventional use scenario is envisioned, the supplies of ruthenium, rhodium, palladium, tellurium, rhenium, osmium, iridium, platinum, gold, and especially phosphorus will be questionable while the supplies of copper, zinc, molybdenum, silver, cadmium, tin, antimony, tungsten, mercury, lead, and bismuth will be inadequate. It is therefore concluded that, in the long run, only the promotion of massive recycling and substitution technologies will suffice to maintain the global industrial society now developing.

  15. Developing a historical climatology of Wales from Welsh and English language sources

    NASA Astrophysics Data System (ADS)

    MacDonald, N.; Davies, S. J.; Jones, C. A.; Charnell-White, C.

    2009-04-01

    Historical documentary records are recognised as valuable in understanding long term climate variability. In the UK, the Central England Temperature Series (1772- ) and the Lamb weather catalogue (1861- ) provide a detailed climate record for England, but the value of these archives in Wales and Scotland is more limited, though some long term instrumental series exist, particularly for cities such as Cardiff. The spatial distance from the central England area and a lower density of instrumental stations in Wales have limited understanding of climate variability during the instrumental period (~1750- ). This paper illustrates that historical documentary records represent a considerable resource that, to date, has been underutilised in developing a more complete understanding of past weather and climate within many parts of Western Europe.

  16. OrChem - An open source chemistry search engine for Oracle(R).

    PubMed

    Rijnbeek, Mark; Steinbeck, Christoph

    2009-10-22

    Registration, indexing and searching of chemical structures in relational databases are among the core areas of cheminformatics. However, little detail has been published on the inner workings of search engines, and their development has been mostly closed-source. We decided to develop an open source chemistry extension for Oracle, the de facto database platform in the commercial world. Here we present OrChem, an extension for the Oracle 11G database that adds registration and indexing of chemical structures to support fast substructure and similarity searching. The cheminformatics functionality is provided by the Chemistry Development Kit. OrChem provides similarity searching with response times on the order of seconds for databases with millions of compounds, depending on a given similarity cut-off. For substructure searching, it can make use of multiple processor cores on today's powerful database servers to provide fast response times in equally large data sets. OrChem is free software and can be redistributed and/or modified under the terms of the GNU Lesser General Public License as published by the Free Software Foundation. All software is available via http://orchem.sourceforge.net.
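
    Behind a similarity search with a cut-off, as described above, is typically a fingerprint comparison such as the Tanimoto coefficient. The sketch below illustrates the idea on toy fingerprints; it is a conceptual example, not OrChem's actual SQL/PLSQL interface or the Chemistry Development Kit API.

```python
def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto (Jaccard) similarity between two structural fingerprints,
    represented here as sets of 'on' bit positions."""
    shared = len(fp_a & fp_b)
    total = len(fp_a) + len(fp_b) - shared
    return shared / total if total else 0.0

# Hypothetical fingerprints; a search engine keeps only hits above a cut-off
query = {1, 5, 9, 42, 77}
candidate = {1, 5, 9, 42, 80, 93}
cutoff = 0.6
score = tanimoto(query, candidate)
if score >= cutoff:
    print("hit:", round(score, 3))
```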

  17. Solving transient acoustic boundary value problems with equivalent sources using a lumped parameter approach.

    PubMed

    Fahnline, John B

    2016-12-01

    An equivalent source method is developed for solving transient acoustic boundary value problems. The method assumes the boundary surface is discretized in terms of triangular or quadrilateral elements and that the solution is represented using the acoustic fields of discrete sources placed at the element centers. Also, the boundary condition is assumed to be specified for the normal component of the surface velocity as a function of time, and the source amplitudes are determined to match the known elemental volume velocity vector at a series of discrete time steps. Equations are given for marching-on-in-time schemes to solve for the source amplitudes at each time step for simple, dipole, and tripole source formulations. Several example problems are solved to illustrate the results and to validate the formulations, including problems with closed boundary surfaces where long-time numerical instabilities typically occur. A simple relationship between the simple and dipole source amplitudes in the tripole source formulation is derived so that the source radiates primarily in the direction of the outward surface normal. The tripole source formulation is shown to eliminate interior acoustic resonances and long-time numerical instabilities.
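
    The marching-on-in-time idea described above can be sketched generically: at each step, the instantaneous influence matrix is solved against the known boundary data minus the contributions of previously determined source amplitudes. The influence matrices below are placeholders; in the actual method they would come from the simple, dipole, or tripole source fields evaluated at the element centers.

```python
import numpy as np

def march_on_in_time(influence, boundary_data):
    """Generic marching-on-in-time solve: A[0] q[n] = v[n] - sum_k A[k] q[n-k].

    influence     : list of (M, M) matrices; influence[0] is the instantaneous term
    boundary_data : (N, M) array of known data (e.g. elemental volume velocities)
    Returns the (N, M) history of source amplitudes q (structure-only sketch)."""
    n_steps, m = boundary_data.shape
    q = np.zeros((n_steps, m))
    for n in range(n_steps):
        rhs = boundary_data[n].copy()
        # subtract contributions of previously determined amplitudes
        for k in range(1, min(n + 1, len(influence))):
            rhs -= influence[k] @ q[n - k]
        q[n] = np.linalg.solve(influence[0], rhs)
    return q
```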

  18. Extended lattice Boltzmann scheme for droplet combustion.

    PubMed

    Ashna, Mostafa; Rahimian, Mohammad Hassan; Fakhari, Abbas

    2017-05-01

    The available lattice Boltzmann (LB) models for combustion or phase change are focused on either single-phase flow combustion or two-phase flow with evaporation assuming a constant density for both liquid and gas phases. To pave the way towards simulation of spray combustion, we propose a two-phase LB method for modeling combustion of liquid fuel droplets. We develop an LB scheme to model phase change and combustion by taking into account the density variation in the gas phase and accounting for the chemical reaction based on the Cahn-Hilliard free-energy approach. Evaporation of the liquid fuel is modeled by adding a source term to the continuity equation, which accounts for the nontrivial divergence of the velocity field. The low-Mach-number approximation in the governing Navier-Stokes and energy equations is used to incorporate source terms due to heat release from chemical reactions, density variation, and nonluminous radiative heat loss. Additionally, the conservation equation for chemical species is formulated by including a source term due to chemical reaction. To validate the model, we consider the combustion of n-heptane and n-butanol droplets in stagnant air using overall single-step reactions. The diameter history and flame standoff ratio obtained from the proposed LB method are found to be in good agreement with available numerical and experimental data. The present LB scheme is believed to be a promising approach for modeling spray combustion.

  19. Short-Term Rhizosphere Effect on Available Carbon Sources, Phenanthrene Degradation, and Active Microbiome in an Aged-Contaminated Industrial Soil

    PubMed Central

    Thomas, François; Cébron, Aurélie

    2016-01-01

    Over the last decades, understanding of the effects of plants on soil microbiomes has greatly advanced. However, knowledge on the assembly of rhizospheric communities in aged-contaminated industrial soils is still limited, especially with regard to transcriptionally active microbiomes and their link to the quality or quantity of carbon sources. We compared the short-term (2–10 days) dynamics of bacterial communities and potential PAH-degrading bacteria in bare or ryegrass-planted aged-contaminated soil spiked with phenanthrene, put in relation with dissolved organic carbon (DOC) sources and polycyclic aromatic hydrocarbon (PAH) pollution. Both resident and active bacterial communities (analyzed from DNA and RNA, respectively) showed higher species richness and smaller dispersion between replicates in planted soils. Root development strongly favored the activity of Pseudomonadales within the first 2 days, and of members of Actinobacteria, Caulobacterales, Rhizobiales, and Xanthomonadales within 6–10 days. Plants slowed down the dissipation of phenanthrene, while root exudation provided a cocktail of labile substrates that might preferentially fuel microbial growth. Although the abundance of PAH-degrading genes increased in planted soil, their transcription level stayed similar to bare soil. In addition, network analysis revealed that plants induced an early shift in the identity of potential phenanthrene degraders, which might influence PAH dissipation on the long-term. PMID:26903971

  20. Attenuation Tomography of Northern California and the Yellow Sea / Korean Peninsula from Coda-source Normalized and Direct Lg Amplitudes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, S R; Dreger, D S; Phillips, W S

    2008-07-16

    Inversions for regional attenuation (1/Q) of Lg are performed in two different regions. The path attenuation component of the Lg spectrum is isolated using the coda-source normalization method, which corrects the Lg spectral amplitude for the source using the stable, coda-derived source spectra. Tomographic images of Northern California agree well with one-dimensional (1-D) Lg Q estimated from five different methods. We note there is some tendency for tomographic smoothing to increase Q relative to targeted 1-D methods. For example, in the San Francisco Bay Area, which contains high attenuation relative to the rest of its region, Q is over-estimated by approximately 30. Coda-source normalized attenuation tomography is also carried out for the Yellow Sea/Korean Peninsula (YSKP), where output parameters (site, source, and path terms) are compared with those from the amplitude tomography method of Phillips et al. (2005) as well as a new method that ties the source term to the MDAC formulation (Walter and Taylor, 2001). The source terms show similar scatter between the coda-source corrected and MDAC source perturbation methods, whereas the amplitude method has the greatest correlation with estimated true source magnitude. The coda-source better represents the source spectra compared to the estimated magnitude and could be the cause of the scatter. The similarity in the source terms between the coda-source and MDAC-linked methods shows that the latter method may approximate the effect of the former, and therefore could be useful in regions without coda-derived sources. The site terms from the MDAC-linked method correlate slightly with global Vs30 measurements. While the coda-source and amplitude ratio methods do not correlate with Vs30 measurements, they do correlate with one another, which provides confidence that the two methods are consistent. The path 1/Q values are very similar between the coda-source and amplitude ratio methods except for small differences in the Da-xin-anling Mountains, in the northern YSKP. However, there is one large difference between the MDAC-linked method and the others in the region near stations TJN and INCN, which points to site effects as the cause for the difference.
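
    For orientation, the path factor that such Lg attenuation inversions estimate is commonly written as geometric spreading times an exponential anelastic decay controlled by Q. The functional form and parameter values below are a standard textbook assumption used only for illustration, not values or methods from this study.

```python
import numpy as np

def lg_path_term(freq_hz, dist_km, q, velocity_km_s=3.5, geom_exponent=0.5):
    """Standard Lg path factor: geometric spreading times anelastic decay,
    P(f, r) = r**(-gamma) * exp(-pi * f * r / (Q * v)).
    Parameter values are illustrative assumptions, not values from the study."""
    return dist_km ** (-geom_exponent) * np.exp(
        -np.pi * freq_hz * dist_km / (q * velocity_km_s)
    )

# Relative amplitude at 1 Hz over 300 km for Q = 200 versus Q = 400
print(lg_path_term(1.0, 300.0, 200.0), lg_path_term(1.0, 300.0, 400.0))
```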

  1. Transparent mediation-based access to multiple yeast data sources using an ontology driven interface.

    PubMed

    Briache, Abdelaali; Marrakchi, Kamar; Kerzazi, Amine; Navas-Delgado, Ismael; Rossi Hassani, Badr D; Lairini, Khalid; Aldana-Montes, José F

    2012-01-25

    Saccharomyces cerevisiae is recognized as a model system representing a simple eukaryote whose genome can be easily manipulated. Information solicited by scientists on its biological entities (Proteins, Genes, RNAs...) is scattered within several data sources like SGD, Yeastract, CYGD-MIPS, BioGrid, PhosphoGrid, etc. Because of the heterogeneity of these sources, querying them separately and then manually combining the returned results is a complex and time-consuming task for biologists, most of whom are not bioinformatics experts. It also limits the use that can be made of the available data. To provide transparent and simultaneous access to yeast sources, we have developed YeastMed: an XML and mediator-based system. In this paper, we present our approach to developing this system, which takes advantage of SB-KOM to perform the necessary query transformations and a set of Data Services to reach the integrated data sources. The system is composed of a set of modules that depend heavily on XML and Semantic Web technologies. User queries are expressed in terms of a domain ontology through a simple form-based web interface. YeastMed is the first mediation-based system specific for integrating yeast data sources. It was conceived mainly to help biologists find relevant data simultaneously from multiple data sources. It has a biologist-friendly interface that is easy to use. The system is available at http://www.khaos.uma.es/yeastmed/.

  2. 10 CFR 40.41 - Terms and conditions of licenses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Terms and conditions of licenses. 40.41 Section 40.41 Energy NUCLEAR REGULATORY COMMISSION DOMESTIC LICENSING OF SOURCE MATERIAL Licenses § 40.41 Terms and... the regulations in this part shall confine his possession and use of source or byproduct material to...

  3. Single Crystal Diffuse Neutron Scattering

    DOE PAGES

    Welberry, Richard; Whitfield, Ross

    2018-01-11

    Diffuse neutron scattering has become a valuable tool for investigating local structure in materials ranging from organic molecular crystals containing only light atoms to piezo-ceramics that frequently contain heavy elements. Although neutron sources will never be able to compete with X-rays in terms of the available flux, the special properties of neutrons, viz. the ability to explore inelastic scattering events, the fact that scattering lengths do not vary systematically with atomic number and their ability to scatter from magnetic moments, provide strong motivation for developing neutron diffuse scattering methods. Here, we compare three different instruments that have been used by us to collect neutron diffuse scattering data. Two of these are on a spallation source and one on a reactor source.

  4. Single Crystal Diffuse Neutron Scattering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welberry, Richard; Whitfield, Ross

    Diffuse neutron scattering has become a valuable tool for investigating local structure in materials ranging from organic molecular crystals containing only light atoms to piezo-ceramics that frequently contain heavy elements. Although neutron sources will never be able to compete with X-rays in terms of the available flux, the special properties of neutrons, viz. the ability to explore inelastic scattering events, the fact that scattering lengths do not vary systematically with atomic number and their ability to scatter from magnetic moments, provide strong motivation for developing neutron diffuse scattering methods. Here, we compare three different instruments that have been used by us to collect neutron diffuse scattering data. Two of these are on a spallation source and one on a reactor source.

  5. Magnetostrophic balance in planetary dynamos - Predictions for Neptune's magnetosphere

    NASA Technical Reports Server (NTRS)

    Curtis, S. A.; Ness, N. F.

    1986-01-01

    With the purpose of estimating Neptune's magnetic field and its implications for nonthermal Neptune radio emissions, a new scaling law for planetary magnetic fields was developed in terms of externally observable parameters (the planet's mean density, radius, mass, rotation rate, and internal heat source luminosity). From a comparison of theory and observations by Voyager it was concluded that planetary dynamos are two-state systems with either zero intrinsic magnetic field (for planets with low internal heat source) or (for planets with the internal heat source sufficiently strong to drive convection) a magnetic field near the upper bound determined from magnetostrophic balance. It is noted that mass loading of the Neptune magnetosphere by Triton may play an important role in the generation of nonthermal radio emissions.

  6. Identifying Greater Sage-Grouse source and sink habitats for conservation planning in an energy development landscape.

    PubMed

    Kirol, Christopher P; Beck, Jeffrey L; Huzurbazar, Snehalata V; Holloran, Matthew J; Miller, Scott N

    2015-06-01

    Conserving a declining species that is facing many threats, including overlap of its habitats with energy extraction activities, depends upon identifying and prioritizing the value of the habitats that remain. In addition, habitat quality is often compromised when source habitats are lost or fragmented due to anthropogenic development. Our objective was to build an ecological model to classify and map habitat quality in terms of source or sink dynamics for Greater Sage-Grouse (Centrocercus urophasianus) in the Atlantic Rim Project Area (ARPA), a developing coalbed natural gas field in south-central Wyoming, USA. We used occurrence and survival modeling to evaluate relationships between environmental and anthropogenic variables at multiple spatial scales and for all female summer life stages, including nesting, brood-rearing, and non-brooding females. For each life stage, we created resource selection functions (RSFs). We weighted the RSFs and combined them to form a female summer occurrence map. We modeled survival also as a function of spatial variables for nest, brood, and adult female summer survival. Our survival-models were mapped as survival probability functions individually and then combined with fixed vital rates in a fitness metric model that, when mapped, predicted habitat productivity (productivity map). Our results demonstrate a suite of environmental and anthropogenic variables at multiple scales that were predictive of occurrence and survival. We created a source-sink map by overlaying our female summer occurrence map and productivity map to predict habitats contributing to population surpluses (source habitats) or deficits (sink habitat) and low-occurrence habitats on the landscape. The source-sink map predicted that of the Sage-Grouse habitat within the ARPA, 30% was primary source, 29% was secondary source, 4% was primary sink, 6% was secondary sink, and 31% was low occurrence. Our results provide evidence that energy development and avoidance of energy infrastructure were probably reducing the amount of source habitat within the ARPA landscape. Our source-sink map provides managers with a means of prioritizing habitats for conservation planning based on source and sink dynamics. The spatial identification of high value (i.e., primary source) as well as suboptimal (i.e., primary sink) habitats allows for informed energy development to minimize effects on local wildlife populations.

  7. Citizen Sensors for SHM: Towards a Crowdsourcing Platform

    PubMed Central

    Ozer, Ekin; Feng, Maria Q.; Feng, Dongming

    2015-01-01

    This paper presents an innovative structural health monitoring (SHM) platform in terms of how it integrates smartphone sensors, the web, and crowdsourcing. The ubiquity of smartphones has provided an opportunity to create low-cost sensor networks for SHM. Crowdsourcing has given rise to citizen initiatives becoming a vast source of inexpensive, valuable but heterogeneous data. Previously, the authors have investigated the reliability of smartphone accelerometers for vibration-based SHM. This paper takes a step further to integrate mobile sensing and web-based computing for a prospective crowdsourcing-based SHM platform. An iOS application was developed to enable citizens to measure structural vibration and upload the data to a server with smartphones. A web-based platform was developed to collect and process the data automatically and store the processed data, such as modal properties of the structure, for long-term SHM purposes. Finally, the integrated mobile and web-based platforms were tested to collect the low-amplitude ambient vibration data of a bridge structure. Possible sources of uncertainties related to citizens were investigated, including the phone location, coupling conditions, and sampling duration. The field test results showed that the vibration data acquired by smartphones operated by citizens without expertise are useful for identifying structural modal properties with high accuracy. This platform can be further developed into an automated, smart, sustainable, cost-free system for long-term monitoring of structural integrity of spatially distributed urban infrastructure. Citizen Sensors for SHM will be a novel participatory sensing platform in the way that it offers hybrid solutions to transitional crowdsourcing parameters. PMID:26102490
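
    A minimal illustration of the kind of server-side processing such a platform might perform is extracting a dominant modal frequency from an ambient vibration record by FFT peak picking. The sketch below is a toy example on synthetic data and is not the authors' processing pipeline.

```python
import numpy as np

def dominant_frequency(accel, fs):
    """Estimate the dominant vibration frequency (a crude proxy for the first
    modal frequency) from an acceleration record via the FFT magnitude peak."""
    accel = np.asarray(accel) - np.mean(accel)        # remove the DC offset
    spectrum = np.abs(np.fft.rfft(accel))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]         # skip the zero-frequency bin

# Synthetic ambient-like record: 2 Hz response sampled at 100 Hz plus noise
fs = 100.0
t = np.arange(0.0, 30.0, 1.0 / fs)
record = np.sin(2 * np.pi * 2.0 * t) + 0.3 * np.random.randn(t.size)
print(dominant_frequency(record, fs))                 # approximately 2.0 Hz
```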

  8. Functional Annotation of the Arabidopsis Genome Using Controlled Vocabularies

    PubMed Central

    Berardini, Tanya Z.; Mundodi, Suparna; Reiser, Leonore; Huala, Eva; Garcia-Hernandez, Margarita; Zhang, Peifen; Mueller, Lukas A.; Yoon, Jungwoon; Doyle, Aisling; Lander, Gabriel; Moseyko, Nick; Yoo, Danny; Xu, Iris; Zoeckler, Brandon; Montoya, Mary; Miller, Neil; Weems, Dan; Rhee, Seung Y.

    2004-01-01

    Controlled vocabularies are increasingly used by databases to describe genes and gene products because they facilitate identification of similar genes within an organism or among different organisms. One of The Arabidopsis Information Resource's goals is to associate all Arabidopsis genes with terms developed by the Gene Ontology Consortium that describe the molecular function, biological process, and subcellular location of a gene product. We have also developed terms describing Arabidopsis anatomy and developmental stages and use these to annotate published gene expression data. As of March 2004, we used computational and manual annotation methods to make 85,666 annotations representing 26,624 unique loci. We focus on associating genes to controlled vocabulary terms based on experimental data from the literature and use The Arabidopsis Information Resource-developed PubSearch software to facilitate this process. Each annotation is tagged with a combination of evidence codes, evidence descriptions, and references that provide a robust means to assess data quality. Annotation of all Arabidopsis genes will allow quantitative comparisons between sets of genes derived from sources such as microarray experiments. The Arabidopsis annotation data will also facilitate annotation of newly sequenced plant genomes by using sequence similarity to transfer annotations to homologous genes. In addition, complete and up-to-date annotations will make unknown genes easy to identify and target for experimentation. Here, we describe the process of Arabidopsis functional annotation using a variety of data sources and illustrate several ways in which this information can be accessed and used to infer knowledge about Arabidopsis and other plant species. PMID:15173566

  9. Quality evaluation of carbonaceous industrial by-products and its effect on properties of autoclave aerated concrete

    NASA Astrophysics Data System (ADS)

    Fomina, E. V.; Lesovik, V. S.; Fomin, A. E.; Kozhukhova, N. I.; Lebedev, M. S.

    2018-03-01

    Argillite is a carbonaceous industrial by-product that is a potential resource for an environmentally friendly and resource-saving construction industry. In this research, the chemical and mineral composition as well as the particle size distribution of argillite were studied and used to develop autoclave aerated concrete with argillite as a partial substitute for quartz sand. The effect of argillite as a mineral admixture in autoclave aerated concrete was investigated in terms of compressive and tensile strength, density, heat conductivity, etc. The obtained results demonstrate the efficiency of argillite as an energy-saving material in autoclave construction composites.

  10. Utilizing Descriptive Statements from the Biodiversity Heritage Library to Expand the Hymenoptera Anatomy Ontology

    PubMed Central

    Seltmann, Katja C.; Pénzes, Zsolt; Yoder, Matthew J.; Bertone, Matthew A.; Deans, Andrew R.

    2013-01-01

    Hymenoptera, the insect order that includes sawflies, bees, wasps, and ants, exhibits an incredible diversity of phenotypes, with over 145,000 species described in a corpus of textual knowledge since Carolus Linnaeus. In the absence of specialized training, often spanning decades, however, these articles can be challenging to decipher. Much of the vocabulary is domain-specific (e.g., Hymenoptera biology), historically without a comprehensive glossary, and contains much homonymous and synonymous terminology. The Hymenoptera Anatomy Ontology was developed to surmount this challenge and to aid future communication related to hymenopteran anatomy, as well as provide support for domain experts so they may actively benefit from the anatomy ontology development. As part of HAO development, an active learning, dictionary-based, natural language recognition tool was implemented to facilitate Hymenoptera anatomy term discovery in literature. We present this tool, referred to as the ‘Proofer’, as part of an iterative approach to growing phenotype-relevant ontologies, regardless of domain. The process of ontology development results in a critical mass of terms that is applied as a filter to the source collection of articles in order to reveal term occurrence and biases in natural language species descriptions. Our results indicate that taxonomists use domain-specific terminology that follows taxonomic specialization, particularly at superfamily and family level groupings and that the developed Proofer tool is effective for term discovery, facilitating ontology construction. PMID:23441153

  11. Utilizing descriptive statements from the biodiversity heritage library to expand the Hymenoptera Anatomy Ontology.

    PubMed

    Seltmann, Katja C; Pénzes, Zsolt; Yoder, Matthew J; Bertone, Matthew A; Deans, Andrew R

    2013-01-01

    Hymenoptera, the insect order that includes sawflies, bees, wasps, and ants, exhibits an incredible diversity of phenotypes, with over 145,000 species described in a corpus of textual knowledge since Carolus Linnaeus. In the absence of specialized training, often spanning decades, however, these articles can be challenging to decipher. Much of the vocabulary is domain-specific (e.g., Hymenoptera biology), historically without a comprehensive glossary, and contains much homonymous and synonymous terminology. The Hymenoptera Anatomy Ontology was developed to surmount this challenge and to aid future communication related to hymenopteran anatomy, as well as provide support for domain experts so they may actively benefit from the anatomy ontology development. As part of HAO development, an active learning, dictionary-based, natural language recognition tool was implemented to facilitate Hymenoptera anatomy term discovery in literature. We present this tool, referred to as the 'Proofer', as part of an iterative approach to growing phenotype-relevant ontologies, regardless of domain. The process of ontology development results in a critical mass of terms that is applied as a filter to the source collection of articles in order to reveal term occurrence and biases in natural language species descriptions. Our results indicate that taxonomists use domain-specific terminology that follows taxonomic specialization, particularly at superfamily and family level groupings and that the developed Proofer tool is effective for term discovery, facilitating ontology construction.

  12. Enhancing GADRAS Source Term Inputs for Creation of Synthetic Spectra.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horne, Steven M.; Harding, Lee

    The Gamma Detector Response and Analysis Software (GADRAS) team has enhanced the source term input for the creation of synthetic spectra. These enhancements include the following: allowing users to programmatically provide source information to GADRAS through memory, rather than through a string limited to 256 characters; allowing users to provide their own source decay database information; and updating the default GADRAS decay database to fix errors and include coincident gamma information.

  13. Localization of sound sources in a room with one microphone

    NASA Astrophysics Data System (ADS)

    Peić Tukuljac, Helena; Lissek, Hervé; Vandergheynst, Pierre

    2017-08-01

    Estimation of the location of sound sources is usually done using microphone arrays. Such settings provide an environment where we know the differences between the signals received at different microphones in terms of phase or attenuation, which enables localization of the sound sources. In our solution we exploit the properties of the room transfer function in order to localize a sound source inside a room with only one microphone. The shape of the room and the position of the microphone are assumed to be known. Design guidelines and limitations of the sensing matrix are given. The implementation is based on sparsity in terms of the voxels in the room that are occupied by a source. What is especially interesting about our solution is that we provide localization of the sound sources not only in the horizontal plane, but in terms of the full 3D coordinates inside the room.
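
    The voxel-sparsity formulation can be sketched as a standard sparse-recovery problem: given a sensing matrix whose columns encode the (known) room response of each voxel, recover the few occupied voxels from a single-microphone measurement. The orthogonal matching pursuit below is a generic stand-in for whatever solver the authors use, with a random matrix replacing the room-transfer-function-based sensing matrix.

```python
import numpy as np

def omp(sensing_matrix, measurement, n_sources):
    """Toy orthogonal matching pursuit: recover a sparse voxel-occupancy vector
    x from y = A x, where each column of A is the (known) response of one voxel."""
    residual = measurement.copy()
    support = []
    coeffs = np.array([])
    for _ in range(n_sources):
        correlations = np.abs(sensing_matrix.T @ residual)
        correlations[support] = 0.0                    # do not reselect voxels
        support.append(int(np.argmax(correlations)))
        sub = sensing_matrix[:, support]
        coeffs, *_ = np.linalg.lstsq(sub, measurement, rcond=None)
        residual = measurement - sub @ coeffs
    x = np.zeros(sensing_matrix.shape[1])
    x[support] = coeffs
    return x

# Synthetic check: 1 occupied voxel out of 50, 200 measurement samples
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))                     # placeholder sensing matrix
x_true = np.zeros(50)
x_true[17] = 1.0
print(np.argmax(np.abs(omp(A, A @ x_true, 1))))        # recovers voxel 17
```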

  14. A theoretical prediction of the acoustic pressure generated by turbulence-flame front interactions

    NASA Technical Reports Server (NTRS)

    Huff, R. G.

    1984-01-01

    The equations of momentum and continuity are combined and linearized yielding the one dimensional nonhomogeneous acoustic wave equation. Three terms in the non-homogeneous equation act as acoustic sources and are taken to be forcing functions acting on the homogeneous wave equation. The three source terms are: fluctuating entropy, turbulence gradients, and turbulence-flame interactions. Each source term is discussed. The turbulence-flame interaction source is used as the basis for computing the source acoustic pressure from the Fourier transformed wave equation. Pressure fluctuations created in turbopump gas generators and turbines may act as a forcing function for turbine and propellant tube vibrations in Earth to orbit space propulsion systems and could reduce their life expectancy. A preliminary assessment of the acoustic pressure fluctuations in such systems is presented.

  15. A theoretical prediction of the acoustic pressure generated by turbulence-flame front interactions

    NASA Technical Reports Server (NTRS)

    Huff, R. G.

    1984-01-01

    The equations of momentum and continuity are combined and linearized yielding the one dimensional nonhomogeneous acoustic wave equation. Three terms in the non-homogeneous equation act as acoustic sources and are taken to be forcing functions acting on the homogeneous wave equation. The three source terms are: fluctuating entropy, turbulence gradients, and turbulence-flame interactions. Each source term is discussed. The turbulence-flame interaction source is used as the basis for computing the source acoustic pressure from the Fourier transformed wave equation. Pressure fluctuations created in turbopump gas generators and turbines may act as a forcing function for turbine and propellant tube vibrations in earth to orbit space propulsion systems and could reduce their life expectancy. A preliminary assessment of the acoustic pressure fluctuations in such systems is presented.

  16. Inverse modeling of the Chernobyl source term using atmospheric concentration and deposition measurements

    NASA Astrophysics Data System (ADS)

    Evangeliou, Nikolaos; Hamburger, Thomas; Cozic, Anne; Balkanski, Yves; Stohl, Andreas

    2017-07-01

    This paper describes the results of an inverse modeling study for the determination of the source term of the radionuclides 134Cs, 137Cs and 131I released after the Chernobyl accident. The accident occurred on 26 April 1986 in the Former Soviet Union and released about 10^19 Bq of radioactive materials that were transported as far away as the USA and Japan. Thereafter, several attempts to assess the magnitude of the emissions were made that were based on the knowledge of the core inventory and the levels of the spent fuel. More recently, when modeling tools were further developed, inverse modeling techniques were applied to the Chernobyl case for source term quantification. However, because radioactivity is a sensitive topic for the public and attracts a lot of attention, high-quality measurements, which are essential for inverse modeling, were not made available except for a few sparse activity concentration measurements far from the source and far from the main direction of the radioactive fallout. For the first time, we apply Bayesian inversion of the Chernobyl source term using not only activity concentrations but also deposition measurements from the most recent public data set. These observations come from a data rescue effort that started more than 10 years ago, with the final goal of providing the available measurements to anyone interested. With regard to our inverse modeling results, emissions of 134Cs were estimated to be 80 PBq, or 30-50 % higher than what was previously published. From the released amount of 134Cs, about 70 PBq were deposited all over Europe. Similar to 134Cs, emissions of 137Cs were estimated as 86 PBq, on the same order as previously reported results. Finally, 131I emissions of 1365 PBq were found, which are about 10 % less than the prior total releases. The inversion pushes the injection heights of the three radionuclides to higher altitudes (up to about 3 km) than previously assumed (≈ 2.2 km) in order to better match both concentration and deposition observations over Europe. The results of the present inversion were confirmed using an independent Eulerian model, for which deposition patterns were also improved when using the estimated posterior releases. Although the independent model tends to underestimate deposition in countries that are not in the main direction of the plume, it reproduces country-level deposition very efficiently. The results were also tested for robustness against different setups of the inversion through sensitivity runs. The source term data from this study are publicly available.
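
    The essence of such a Bayesian (Gaussian) source-term inversion can be written as a regularized least-squares problem balancing misfit to observations against departure from the prior emissions. The sketch below shows only that structure; the real source-receptor matrix comes from an atmospheric transport model and the error covariances are far richer than the scalar uncertainties assumed here.

```python
import numpy as np

def gaussian_source_inversion(M, y, x_prior, sigma_obs, sigma_prior):
    """Posterior-mean emissions from a Gaussian (Tikhonov-style) inversion:
    minimize ||(y - M x) / sigma_obs||^2 + ||(x - x_prior) / sigma_prior||^2.

    M       : (n_obs, n_src) placeholder source-receptor matrix
    y       : (n_obs,) observed concentrations or depositions
    x_prior : (n_src,) prior emission estimate per time/height segment"""
    n_src = M.shape[1]
    A = np.vstack([M / sigma_obs, np.eye(n_src) / sigma_prior])
    b = np.concatenate([y / sigma_obs, x_prior / sigma_prior])
    x_post, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x_post
```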

  17. Indigenous Manufacturing realization of TWIN Source

    NASA Astrophysics Data System (ADS)

    Pandey, R.; Bandyopadhyay, M.; Parmar, D.; Yadav, R.; Tyagi, H.; Soni, J.; Shishangiya, H.; Sudhir Kumar, D.; Shah, S.; Bansal, G.; Pandya, K.; Parmar, K.; Vuppugalla, M.; Gahlaut, A.; Chakraborty, A.

    2017-04-01

    TWIN source is a two RF driver based negative ion source that has been planned to bridge the gap between the single driver based ROBIN source (currently operational) and the eight driver based DNB source (to be operated under the IN-TF test facility). TWIN source experiments have been planned at IPR, keeping the objective of the long term domestic fusion programme, to gain operational experience on vacuum-immersed multi-driver RF based negative ion sources. The high vacuum compatible components of the twin source have been designed at IPR with an emphasis on indigenous development. These components of the TWIN source are mainly stainless steel and OFC-Cu. Being components that receive high heat flux, one of the major functional requirements is continuous heat removal with water as the cooling medium. Hence, the stainless steel parts are provided with externally milled cooling lines, which are covered with a layer of OFC-Cu on the side receiving the high heat flux. Manufacturability of the twin source components requires joining of these dissimilar materials via processes such as electrodeposition, electron beam welding and vacuum brazing. Any of these manufacturing processes shall give a vacuum tight joint with adequate joint strength at the operating temperature and pressure. As part of the indigenous development effort, vacuum brazing (in a non-nuclear environment) has been chosen for joining the dissimilar materials of the twin source, being one of the most reliable joining techniques and commercially feasible across suppliers in the country. Manufacturing design improvisation of the components has been carried out to suit the vacuum brazing process requirements and to ease some of the machining without compromising the functional and operational requirements. This paper illustrates the details of the indigenous development effort, the design improvisation to suit manufacturability, vacuum brazing basics, and the procedures for the twin source components.

  18. A Novel Airborne Carbon Isotope Analyzer for Methane and Carbon Dioxide Source Fingerprinting

    NASA Astrophysics Data System (ADS)

    Berman, E. S.; Huang, Y. W.; Owano, T. G.; Leifer, I.

    2014-12-01

    Recent field studies on major sources of the important greenhouse gas methane (CH4) indicate significant underestimation of methane release from fossil fuel industrial (FFI) and animal husbandry sources, among others. In addition, uncertainties still exist with respect to carbon dioxide (CO2) measurements, especially source fingerprinting. CO2 isotopic analysis provides a valuable in situ measurement approach to fingerprint CH4 and CO2 as associated with combustion sources, leakage from geologic reservoirs, or biogenic sources. As a result, these measurements can characterize strong combustion source plumes, such as power plant emissions, and discriminate these emissions from other sources. As part of the COMEX (CO2 and MEthane eXperiment) campaign, a novel CO2 isotopic analyzer was installed and collected data aboard the CIRPAS Twin Otter aircraft. Developing methods to derive CH4 and CO2 budgets from remote sensing data is the goal of the summer 2014 COMEX campaign, which combines hyperspectral imaging (HSI) and non-imaging spectroscopy (NIS) with in situ airborne and surface data. COMEX leverages the synergy between high spatial resolution HSI and moderate spatial resolution NIS. The carbon dioxide isotope analyzer developed by Los Gatos Research (LGR) uses LGR's patented Off-Axis ICOS (Integrated Cavity Output Spectroscopy) technology and incorporates proprietary internal thermal control for high sensitivity and optimal instrument stability. This analyzer measures CO2 concentration as well as δ13C, δ18O, and δ17O from CO2 at natural abundance (100-3000 ppm). The laboratory accuracy is ±1.2 ppm (1σ) in CO2 from 370-1000 ppm, with a long-term (1000 s) precision of ±0.012 ppm. The long-term precision for both δ13C and δ18O is 0.04 ‰, and for δ17O is 0.06 ‰. The analyzer was field-tested as part of the COWGAS campaign, a precursor campaign to COMEX in March 2014, where it successfully discriminated plumes related to combustion processes associated with dairy activities (tractor exhaust) from plumes and sources in air enriched in methane and ammonia from bovine activities, including waste maintenance. Methodology, laboratory data, field data from COWGAS, and field data from the COMEX campaign acquired by LGR's carbon isotope analyzer as well as other COMEX analyzers are presented.
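
    Isotope results such as δ13C and δ18O are conventionally reported in delta notation relative to a standard. A minimal sketch of that conversion is given below; the reference ratio in the example is an approximate literature value for the VPDB standard, included only for illustration.

```python
def delta_per_mil(ratio_sample, ratio_standard):
    """Delta notation used for isotope results such as d13C and d18O:
    delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (ratio_sample / ratio_standard - 1.0) * 1000.0

# Example with an approximate VPDB 13C/12C reference ratio (literature value)
print(round(delta_per_mil(0.0111, 0.011180), 2))   # a plume depleted in 13C
```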

  19. Auditing the multiply-related concepts within the UMLS

    PubMed Central

    Mougin, Fleur; Grabar, Natalia

    2014-01-01

    Objective This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. Methods We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. Results At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Discussion Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. PMID:24464853

  20. Source reconstruction via the spatiotemporal Kalman filter and LORETA from EEG time series with 32 or fewer electrodes.

    PubMed

    Hamid, Laith; Al Farawn, Ali; Merlet, Isabelle; Japaridze, Natia; Heute, Ulrich; Stephani, Ulrich; Galka, Andreas; Wendling, Fabrice; Siniatchkin, Michael

    2017-07-01

    The clinical routine of non-invasive electroencephalography (EEG) is usually performed with 8-40 electrodes, especially in long-term monitoring, in infants, or in emergency care. There is a need in clinical and scientific brain imaging to develop inverse solution methods that can reconstruct brain sources from these low-density EEG recordings. In this proof-of-principle paper we investigate the performance of the spatiotemporal Kalman filter (STKF) in EEG source reconstruction with 9, 19 and 32 electrodes. We used simulated EEG data of epileptic spikes generated from lateral frontal and lateral temporal brain sources using state-of-the-art neuronal population models. For validation of source reconstruction, we compared STKF results to the location of the simulated source and to the results of the low-resolution brain electromagnetic tomography (LORETA) standard inverse solution. STKF consistently showed less localization bias compared to LORETA, especially when the number of electrodes was decreased. The results encourage further research into the application of the STKF in source reconstruction of brain activity from low-density EEG recordings.
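
    For orientation, the predict/update cycle at the core of any Kalman-filter-based source reconstruction looks as follows. This is a generic linear Kalman filter step, not the spatiotemporal formulation of the paper; the state would hold source amplitudes on a brain grid and the observation matrix would be the EEG lead field, both of which are placeholders here.

```python
import numpy as np

def kalman_step(x, P, y, A, C, Q, R):
    """One predict/update cycle of a linear Kalman filter for
    x_k = A x_{k-1} + w  and  y_k = C x_k + v.
    In a source-reconstruction setting, x would hold source amplitudes on a
    brain grid and C the EEG lead field; both are placeholders here."""
    # predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # update with the new EEG sample y
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (y - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new
```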

  1. Low-level rf control of Spallation Neutron Source: System and characterization

    NASA Astrophysics Data System (ADS)

    Ma, Hengjie; Champion, Mark; Crofford, Mark; Kasemir, Kay-Uwe; Piller, Maurice; Doolittle, Lawrence; Ratti, Alex

    2006-03-01

    The low-level rf control system currently commissioned throughout the Spallation Neutron Source (SNS) LINAC evolved from three design iterations over one year of intensive research and development. Its digital hardware implementation is efficient and has succeeded in achieving a minimum latency of less than 150 ns, which is key to accomplishing all-digital feedback control over the full bandwidth. The control bandwidth is analyzed in the frequency domain and characterized by testing its transient response. The hardware implementation also includes the provision of a time-shared input channel for a superior differential phase measurement between the cavity field and the reference. A companion cosimulation system for the digital hardware was developed to ensure reliable long-term supportability. A large effort has also been made in operations software development for practical issues such as process automation, cavity filling, beam loading compensation, and cavity mechanical resonance suppression.

  2. Technical Study of a Standalone Photovoltaic–Wind Energy Based Hybrid Power Supply Systems for Island Electrification in Malaysia

    PubMed Central

    Samrat, Nahidul Hoque; Ahmad, Norhafizan; Choudhury, Imtiaz Ahmed; Taha, Zahari

    2015-01-01

    Energy is one of the most important factors in the socioeconomic development of a country. In a developing country like Malaysia, the development of islands is mostly related to the availability of electric power. Power generated by renewable energy sources has recently become one of the most promising solutions for the electrification of islands and remote rural areas. But high dependency on weather conditions and the unpredictable nature of these renewable energy sources are the main drawbacks. To overcome this weakness, different green energy sources and power electronic converters need to be integrated with each other. This study presents a battery storage hybrid standalone photovoltaic-wind energy power supply system. In the proposed standalone hybrid system, a DC-DC buck-boost bidirectional converter controller is used to accumulate the surplus hybrid power in the battery bank and to supply this power to the load during hybrid power shortages while maintaining a constant dc-link voltage. A three-phase voltage source inverter complex vector control scheme is used to control the load side voltage in terms of the voltage amplitude and frequency. Based on the simulation results obtained from MATLAB/Simulink, it has been found that the overall hybrid framework is capable of working under variable weather and load conditions. PMID:26121032

  3. Technical Study of a Standalone Photovoltaic-Wind Energy Based Hybrid Power Supply Systems for Island Electrification in Malaysia.

    PubMed

    Samrat, Nahidul Hoque; Ahmad, Norhafizan; Choudhury, Imtiaz Ahmed; Taha, Zahari

    2015-01-01

    Energy is one of the most important factors in the socioeconomic development of a country. In a developing country like Malaysia, the development of islands is mostly related to the availability of electric power. Power generated by renewable energy sources has recently become one of the most promising solutions for the electrification of islands and remote rural areas. But high dependency on weather conditions and the unpredictable nature of these renewable energy sources are the main drawbacks. To overcome this weakness, different green energy sources and power electronic converters need to be integrated with each other. This study presents a battery storage hybrid standalone photovoltaic-wind energy power supply system. In the proposed standalone hybrid system, a DC-DC buck-boost bidirectional converter controller is used to accumulate the surplus hybrid power in the battery bank and to supply this power to the load during hybrid power shortages while maintaining a constant dc-link voltage. A three-phase voltage source inverter complex vector control scheme is used to control the load side voltage in terms of the voltage amplitude and frequency. Based on the simulation results obtained from MATLAB/Simulink, it has been found that the overall hybrid framework is capable of working under variable weather and load conditions.

  4. Path spectra derived from inversion of source and site spectra for earthquakes in Southern California

    NASA Astrophysics Data System (ADS)

    Klimasewski, A.; Sahakian, V. J.; Baltay, A.; Boatwright, J.; Fletcher, J. B.; Baker, L. M.

    2017-12-01

    A large source of epistemic uncertainty in Ground Motion Prediction Equations (GMPEs) is derived from the path term, currently represented as a simple geometric spreading and intrinsic attenuation term. Including additional physical relationships between the path properties and predicted ground motions would produce more accurate and precise, region-specific GMPEs by reclassifying some of the random, aleatory uncertainty as epistemic. This study focuses on regions of Southern California, using data from the Anza network and the Southern California Seismic Network to create a catalog of events of magnitude 2.5 and larger from 1998 to 2016. The catalog encompasses regions of varying geology and therefore varying path and site attenuation. Within this catalog of events, we investigate several collections of event region-to-station pairs, each of which share similar origin locations and stations so that all events have similar paths. Compared with a simple regional GMPE, these paths consistently have high or low residuals. By working with events that have the same path, we can isolate source and site effects, and focus on the remaining residual as path effects. We decompose the recordings into source and site spectra for each unique event and site in our greater Southern California regional database using the inversion method of Andrews (1986). This model represents each natural-log record spectrum as the sum of its natural-log event and site spectra, while constraining each record to a reference site or Brune source spectrum. We estimate a regional, path-specific anelastic attenuation (Q) and site attenuation (t*) from the inversion site spectra, and corner frequency from the inversion event spectra. We then compute the residuals between the observed record data and the inversion model prediction (event*site spectra). This residual is representative of path effects, likely anelastic attenuation along the path that varies from the regional median attenuation. We examine the residuals for our different sets independently to see how path terms differ between event-to-station collections. The path-specific information gained from this can inform the development of terms for regional GMPEs, through understanding of these seismological phenomena.
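
    The Andrews-style decomposition mentioned above amounts to a per-frequency linear least-squares problem in the log domain: each log record spectrum is the sum of an event term and a site term, with a constraint to remove the trade-off between the two. The sketch below uses a zero reference-site constraint for simplicity; the abstract notes that either a reference site or a Brune source spectrum can serve this role.

```python
import numpy as np

def decompose_spectra(log_amp, event_idx, site_idx, n_events, n_sites, ref_site=0):
    """Per-frequency least-squares separation of log record spectra into event
    and site terms, log A_k = E_event(k) + S_site(k), with the reference site
    term constrained to zero to remove the event/site trade-off.

    log_amp   : (n_records,) natural-log amplitudes at one frequency
    event_idx : event index of each record; site_idx : station index of each record"""
    n_records = len(log_amp)
    G = np.zeros((n_records + 1, n_events + n_sites))
    for k in range(n_records):
        G[k, event_idx[k]] = 1.0
        G[k, n_events + site_idx[k]] = 1.0
    G[n_records, n_events + ref_site] = 1.0            # constraint row
    d = np.concatenate([log_amp, [0.0]])
    m, *_ = np.linalg.lstsq(G, d, rcond=None)
    return m[:n_events], m[n_events:]                  # event terms, site terms
```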

  5. Palatal development of preterm and low birthweight infants compared to term infants – What do we know? Part 1: The palate of the term newborn

    PubMed Central

    Hohoff, Ariane; Rabe, Heike; Ehmer, Ulrike; Harms, Erik

    2005-01-01

    Background The evidence on prematurity as 'a priori' a risk for palatal disturbances that increase the need for orthodontic or orthognathic treatment is still weak. Further well-designed clinical studies are needed. The objective of this review is to provide a fundamental analysis of methodologies, confounding factors, and outcomes of studies on palatal development. One focus of this review is the analysis of studies on the palate of the term newborn, since knowing what is 'normal' is a precondition of being able to assess abnormalities. Methods A search profile based on Cochrane search strategies applied to 10 medical databases was used to identify existing studies. Articles, mainly those published before 1960, were identified from hand searches in textbooks, encyclopedias, reference lists and bibliographies. Sources in English, German, and French of more than a century were included. Data for term infants were recalculated if particular information about weight, length, or maturity was given. The extracted values, especially those from non-English paper sources, were provided unfiltered for comparison. Results The search strategy yielded 182 articles, of which 155 articles remained for final analysis. Morphology of the term newborn's palate was of great interest in the first half of the last century. Two general methodologies were used to assess palatal morphology: visual and metrical descriptions. Most of the studies on term infants suffer from lack of reliability tests. The groove system was recognized as the distinctive feature of the infant palate. The shape of the palate of the term infant may vary considerably, both visually and metrically. Gender, race, mode of delivery, and nasal deformities were identified as causes contributing to altered palatal morphology. Until today, anatomical features of the newborn's palate are subject to a non-uniform nomenclature. Conclusion Today's knowledge of a newborn's 'normal' palatal morphology is based on non-standardized and limited methodologies for measuring a three-dimensional shape. This shortcoming increases bias and is the reason for contradictory research results, especially if pathologic conditions like syndromes or prematurity are involved. Adequate measurement techniques are needed and the 'normal palatal morphology' should be defined prior to new clinical studies on palatal development. PMID:16270908

  6. Generating a focused view of disease ontology cancer terms for pan-cancer data integration and analysis

    PubMed Central

    Wu, Tsung-Jung; Schriml, Lynn M.; Chen, Qing-Rong; Colbert, Maureen; Crichton, Daniel J.; Finney, Richard; Hu, Ying; Kibbe, Warren A.; Kincaid, Heather; Meerzaman, Daoud; Mitraka, Elvira; Pan, Yang; Smith, Krista M.; Srivastava, Sudhir; Ward, Sari; Yan, Cheng; Mazumder, Raja

    2015-01-01

    Bio-ontologies provide terminologies for the scientific community to describe biomedical entities in a standardized manner. There are multiple initiatives that are developing biomedical terminologies for the purpose of providing better annotation, data integration and mining capabilities. Terminology resources devised for multiple purposes inherently diverge in content and structure. A major issue of biomedical data integration is the development of overlapping terms, ambiguous classifications and inconsistencies represented across databases and publications. The disease ontology (DO) was developed over the past decade to address data integration, standardization and annotation issues for human disease data. We have established a DO cancer project to be a focused view of cancer terms within the DO. The DO cancer project mapped 386 cancer terms from the Catalogue of Somatic Mutations in Cancer (COSMIC), The Cancer Genome Atlas (TCGA), International Cancer Genome Consortium, Therapeutically Applicable Research to Generate Effective Treatments, Integrative Oncogenomics and the Early Detection Research Network into a cohesive set of 187 DO terms represented by 63 top-level DO cancer terms. For example, the COSMIC term ‘kidney, NS, carcinoma, clear_cell_renal_cell_carcinoma’ and TCGA term ‘Kidney renal clear cell carcinoma’ were both grouped to the term ‘Disease Ontology Identification (DOID):4467 / renal clear cell carcinoma’ which was mapped to the TopNodes_DOcancerslim term ‘DOID:263 / kidney cancer’. Mapping of diverse cancer terms to DO and the use of top level terms (DO slims) will enable pan-cancer analysis across datasets generated from any of the cancer term sources where pan-cancer means including or relating to all or multiple types of cancer. The terms can be browsed from the DO web site (http://www.disease-ontology.org) and downloaded from the DO’s Apache Subversion or GitHub repositories. Database URL: http://www.disease-ontology.org PMID:25841438

  7. The long-term problems of contaminated land: Sources, impacts and countermeasures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baes, C.F. III

    1986-11-01

    This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed "broken arrows").

  8. 42: An Open-Source Simulation Tool for Study and Design of Spacecraft Attitude Control Systems

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric

    2018-01-01

    Simulation is an important tool in the analysis and design of spacecraft attitude control systems. The speaker will discuss the simulation tool, called simply 42, that he has developed over the years to support his own work as an engineer in the Attitude Control Systems Engineering Branch at NASA Goddard Space Flight Center. 42 was intended from the outset to be high-fidelity and powerful, but also fast and easy to use. 42 is publicly available as open source since 2014. The speaker will describe some of 42's models and features, and discuss its applicability to studies ranging from early concept studies through the design cycle, integration, and operations. He will outline 42's architecture and share some thoughts on simulation development as a long-term project.

  9. A study of the prediction of cruise noise and laminar flow control noise criteria for subsonic air transports

    NASA Technical Reports Server (NTRS)

    Swift, G.; Mungur, P.

    1979-01-01

    General procedures for the prediction of component noise levels incident upon airframe surfaces during cruise are developed. Contributing noise sources are those associated with the propulsion system, the airframe and the laminar flow control (LFC) system. Transformation procedures from the best prediction base of each noise source to the transonic cruise condition are established. Two approaches to LFC/acoustic criteria are developed. The first is a semi-empirical extension of the X-21 LFC/acoustic criteria to include sensitivity to the spectrum and directionality of the sound field. In the second, the more fundamental problem of how sound excites boundary layer disturbances is analyzed by deriving and solving an inhomogeneous Orr-Sommerfeld equation in which the source terms are proportional to the production and dissipation of sound induced fluctuating vorticity. Numerical solutions are obtained and compared with corresponding measurements. Recommendations are made to improve and validate both the cruise noise prediction methods and the LFC/acoustic criteria.
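    As a point of reference for the second approach described above, the homogeneous Orr-Sommerfeld operator with a generic right-hand-side forcing F(y) can be written as follows. This is the standard textbook form, with F(y) standing in schematically for the abstract's source terms tied to sound-induced vorticity production and dissipation; it is not necessarily the paper's exact formulation.

```latex
% Standard Orr-Sommerfeld equation for the disturbance amplitude \phi(y),
% written with a generic forcing term F(y) to indicate the inhomogeneous form.
(U - c)\left(\phi'' - \alpha^{2}\phi\right) - U''\phi
  - \frac{1}{i\alpha\,\mathrm{Re}}\left(\phi'''' - 2\alpha^{2}\phi'' + \alpha^{4}\phi\right) = F(y)
```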

  10. Multiple Solutions of Real-time Tsunami Forecasting Using Short-term Inundation Forecasting for Tsunamis Tool

    NASA Astrophysics Data System (ADS)

    Gica, E.

    2016-12-01

    The Short-term Inundation Forecasting for Tsunamis (SIFT) tool, developed by NOAA Center for Tsunami Research (NCTR) at the Pacific Marine Environmental Laboratory (PMEL), is used in forecast operations at the Tsunami Warning Centers in Alaska and Hawaii. The SIFT tool relies on a pre-computed tsunami propagation database, real-time DART buoy data, and an inversion algorithm to define the tsunami source. The tsunami propagation database is composed of 50 km × 100 km unit sources, simulated basin-wide for at least 24 hours. Different combinations of unit sources, DART buoys, and length of real-time DART buoy data can generate a wide range of results within the defined tsunami source. For an inexperienced SIFT user, the primary challenge is to determine which solution, among multiple solutions for a single tsunami event, would provide the best forecast in real time. This study investigates how the use of different tsunami sources affects simulated tsunamis at tide gauge locations. Using the tide gauge at Hilo, Hawaii, a total of 50 possible solutions for the 2011 Tohoku tsunami are considered. Maximum tsunami wave amplitude and root mean square error results are used to compare tide gauge data and the simulated tsunami time series. Results of this study will facilitate SIFT users' efforts to determine if the simulated tide gauge tsunami time series from a specific tsunami source solution would be within the range of possible solutions. This study will serve as the basis for investigating more historical tsunami events and tide gauge locations.
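    The two ranking metrics named above, maximum wave amplitude and root mean square error against the tide gauge record, can be computed directly from aligned time series. A minimal Python sketch is given below; the synthetic arrays merely stand in for the Hilo tide gauge data and a candidate SIFT solution.

```python
import numpy as np

def score_solution(observed, simulated):
    """Compare one candidate source solution against tide gauge data using
    maximum amplitude and root mean square error (series assumed time-aligned)."""
    rmse = np.sqrt(np.mean((observed - simulated) ** 2))
    return observed.max(), simulated.max(), rmse

# illustrative synthetic series standing in for measured and simulated records
t = np.linspace(0.0, 6 * 3600.0, 720)                               # six hours, 30 s sampling
observed = 0.8 * np.sin(2 * np.pi * t / 1800.0) * np.exp(-t / 7200.0)
simulated = 0.7 * np.sin(2 * np.pi * (t - 120.0) / 1800.0) * np.exp(-t / 7200.0)
print(score_solution(observed, simulated))
```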

  11. Computation of high Reynolds number internal/external flows

    NASA Technical Reports Server (NTRS)

    Cline, M. C.; Wilmoth, R. G.

    1981-01-01

    A general, user oriented computer program, called VNAP2, has been developed to calculate high Reynolds number, internal/external flows. VNAP2 solves the two-dimensional, time-dependent Navier-Stokes equations. The turbulence is modeled with either a mixing-length, a one transport equation, or a two transport equation model. Interior grid points are computed using the explicit MacCormack scheme with special procedures to speed up the calculation in the fine grid. All boundary conditions are calculated using a reference plane characteristic scheme with the viscous terms treated as source terms. Several internal, external, and internal/external flow calculations are presented.
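    A minimal illustration of the explicit MacCormack predictor-corrector with a right-hand-side source term is sketched below for 1D linear advection on a periodic grid. This is only a schematic analogue of the scheme named in the abstract, not the VNAP2 implementation, and all parameter values are assumed.

```python
import numpy as np

# Explicit MacCormack scheme for du/dt + a*du/dx = s(x) on a periodic grid,
# with the source term added on the right-hand side at both stages.
nx, a = 200, 1.0
dx = 1.0 / nx
dt = 0.8 * dx / a                       # CFL number 0.8
x = np.arange(nx) * dx
u = np.exp(-((x - 0.25) ** 2) / 0.002)  # initial pulse
s = 0.1 * np.sin(2 * np.pi * x)         # prescribed source term (illustrative)

for _ in range(200):
    # predictor: forward difference in space
    up = u - a * dt / dx * (np.roll(u, -1) - u) + dt * s
    # corrector: backward difference applied to the predicted field
    u = 0.5 * (u + up - a * dt / dx * (up - np.roll(up, 1)) + dt * s)
```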

  12. Computation of high Reynolds number internal/external flows

    NASA Technical Reports Server (NTRS)

    Cline, M. C.; Wilmoth, R. G.

    1981-01-01

    A general, user oriented computer program, called VNAP2, was developed to calculate high Reynolds number, internal/external flows. The VNAP2 program solves the two dimensional, time dependent Navier-Stokes equations. The turbulence is modeled with either a mixing-length, a one transport equation, or a two transport equation model. Interior grid points are computed using the explicit MacCormack scheme with special procedures to speed up the calculation in the fine grid. All boundary conditions are calculated using a reference plane characteristic scheme with the viscous terms treated as source terms. Several internal, external, and internal/external flow calculations are presented.

  13. Computation of high Reynolds number internal/external flows

    NASA Technical Reports Server (NTRS)

    Cline, M. C.; Wilmoth, R. G.

    1981-01-01

    A general, user oriented computer program, called VNAP2, developed to calculate high Reynolds number internal/external flows, is described. The program solves the two dimensional, time dependent Navier-Stokes equations. Turbulence is modeled with either a mixing length, a one transport equation, or a two transport equation model. Interior grid points are computed using the explicit MacCormack scheme with special procedures to speed up the calculation in the fine grid. All boundary conditions are calculated using a reference plane characteristic scheme with the viscous terms treated as source terms. Several internal, external, and internal/external flow calculations are presented.

  14. Rayleigh, the unit for light radiance.

    PubMed

    Baker, D J

    1974-09-01

    A formula accurate to 0.7% is derived for easy conversion of power spectral radiance L(λ) in W cm⁻² sr⁻¹ µm⁻¹ to rayleigh spectral radiance R(λ) in rayleigh/µm: R(λ) = 2πλL(λ) × 10¹³, where the wavelength λ is in µm. The rationale for the rayleigh unit is discussed in terms of a photon rate factor and a solid angle factor. The latter is developed in terms of an equivalence theorem about optical receivers and extended sources, and the concept is extended to the computation of photon volume emission rates from altitude profiles of zenith radiance.
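    The quoted conversion is easy to apply directly; a small Python helper implementing R(λ) = 2πλL(λ) × 10¹³ (λ in µm, L in W cm⁻² sr⁻¹ µm⁻¹) is shown below with an illustrative input value.

```python
import math

def radiance_to_rayleighs(L_lambda, wavelength_um):
    """Convert power spectral radiance L(lambda) [W cm^-2 sr^-1 um^-1] to
    rayleigh spectral radiance R(lambda) [rayleigh um^-1] using the
    ~0.7%-accurate formula R(lambda) = 2*pi*lambda*L(lambda)*1e13."""
    return 2.0 * math.pi * wavelength_um * L_lambda * 1e13

# example: 1e-12 W cm^-2 sr^-1 um^-1 at the 557.7 nm airglow line
print(radiance_to_rayleighs(1e-12, 0.5577))   # ~35 rayleigh/um
```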

  15. Structural and Cultural Factors in Successful Aging Among Older Hispanics

    PubMed Central

    Angel, Ronald J.

    2014-01-01

    Successful or healthful aging are terms that draw attention to life course issues related to individual, physical, and psychologic development and maturation, but they also draw attention to the material basis of successful aging and the social structures that determine one’s place in the social hierarchy. This article focuses on barriers to optimal aging for Hispanics, especially those of Mexican origin, and argues that cultural factors and social class are closely associated. The reduction of health disparities and equity in medical and long-term care requires an understanding of both cultural and material sources of differential health levels. PMID:19065093

  16. Static and dynamic models in economics

    NASA Astrophysics Data System (ADS)

    Safiullin, N. Z.; Safiullin, B. L.

    2018-05-01

    In this article, the authors consider the impact of information and advertising on consumer behavior and on the process of product differentiation. Advertising, television, radio, magazines and direct mail, as major mass media channels, may act as sources of information. Economic modernization is aimed at accelerating the development of knowledge-intensive industries, which strengthen Russia's position in the world economy; at recovering the Russian manufacturing base; at developing import substitution industries; and at limited participation in international labor specialization.

  17. Guidelines for development of NASA (National Aeronautics and Space Administration) computer security training programs

    NASA Technical Reports Server (NTRS)

    Tompkins, F. G.

    1983-01-01

    The report presents guidance for the NASA Computer Security Program Manager and the NASA Center Computer Security Officials as they develop training requirements and implement computer security training programs. NASA audiences are categorized based on the computer security knowledge required to accomplish identified job functions. Training requirements, in terms of training subject areas, are presented for both computer security program management personnel and computer resource providers and users. Sources of computer security training are identified.

  18. Spatial and Temporal Trends in Global Emissions of Nitrogen Oxides from 1960 to 2014.

    PubMed

    Huang, Tianbo; Zhu, Xi; Zhong, Qirui; Yun, Xiao; Meng, Wenjun; Li, Bengang; Ma, Jianmin; Zeng, Eddy Y; Tao, Shu

    2017-07-18

    The quantification of nitrogen oxide (NOx) emissions is critical for air quality modeling. Based on updated fuel consumption and emission factor databases, a global emission inventory was compiled with high spatial (0.1° × 0.1°), temporal (monthly), and source (87 sources) resolutions for the period 1960 to 2014. The monthly emission data have been uploaded online (http://inventory.pku.edu.cn), along with a number of other air pollutant and greenhouse gas data for free download. Differences in source profiles, not global total quantities, between our results and those reported previously were found. There were significant differences in total and per capita emissions and emission intensities among countries, especially between the developing and developed countries. Globally, the total annual NOx emissions finally stopped increasing in 2013 after continuously increasing over several decades, largely due to strict control measures taken in China in recent years. Nevertheless, the peak year of NOx emissions was later than for many other major air pollutants. Per capita emissions, either among countries or over years, follow typical inverted U-shaped environmental Kuznets curves, indicating that the emissions increased during the early stage of development and were restrained when socioeconomic development reached certain points. Although the trends are similar among countries, the turning points of developing countries appeared sooner than those of developed countries in terms of development status, confirming late-move advantages.

  19. Describing knowledge encounters in healthcare: a mixed studies systematic review and development of a classification.

    PubMed

    Hurst, Dominic; Mickan, Sharon

    2017-03-14

    Implementation science seeks to promote the uptake of research and other evidence-based findings into practice, but for healthcare professionals, this is complex as practice draws on, in addition to scientific principles, rules of thumb and a store of practical wisdom acquired from a range of informational and experiential sources. The aims of this review were to identify sources of information and professional experiences encountered by healthcare workers and from this to build a classification system, for use in future observational studies, that describes influences on how healthcare professionals acquire and use information in their clinical practice. This was a mixed studies systematic review of observational studies. OVID MEDLINE, Embase, and Google Scholar were searched using terms around information, knowledge or evidence and sharing, searching and utilisation, combined with terms relating to healthcare groups. Studies were eligible if one of the intentions was to identify information or experiential encounters by healthcare workers. Data was extracted by one author after piloting with another. Studies were assessed using the Mixed Methods Appraisal Tool (MMAT). The primary outcome extracted was the information source or professional experience encounter. Similar encounters were grouped together as single constructs. Our synthesis involved a mixed approach using the top-down logic of the Bliss Bibliographic Classification System (BC2) to generate classification categories and a bottom-up approach to develop descriptive codes (or "facets") for each category, from the data. The generic terms of BC2 were customised by an iterative process of thematic content analysis. Facets were developed by using available theory and keeping in mind the pragmatic end use of the classification. Eighty studies were included from which 178 discrete knowledge encounters were extracted. Six classification categories were developed: what information or experience was encountered; how was the information or experience encountered; what was the mode of encounter; from whom did the information originate or with whom was the experience; how many participants were there; and where did the encounter take place. For each of these categories, relevant descriptive facets were identified. We have sought to identify and classify all knowledge encounters, and we have developed a faceted description of key categories which will support richer descriptions and interrogations of knowledge encounters in healthcare research.

  20. Evolution of air pollution source contributions over one decade, derived by PM10 and PM2.5 source apportionment in two metropolitan urban areas in Greece

    NASA Astrophysics Data System (ADS)

    Diapouli, E.; Manousakas, M.; Vratolis, S.; Vasilatou, V.; Maggos, Th; Saraga, D.; Grigoratos, Th; Argyropoulos, G.; Voutsa, D.; Samara, C.; Eleftheriadis, K.

    2017-09-01

    Metropolitan urban areas in Greece have been known to suffer from poor air quality, due to a variety of emission sources, topography and climatic conditions favouring the accumulation of pollution. While a number of control measures have been implemented since the 1990s, resulting in reductions of atmospheric pollution and changes in emission source contributions, the financial crisis which started in 2009 has significantly altered this picture. The present study is the first effort to assess the contribution of emission sources to PM10 and PM2.5 concentration levels and their long-term variability (over 5-10 years), in the two largest metropolitan urban areas in Greece (Athens and Thessaloniki). Intensive measurement campaigns were conducted during 2011-2012 at suburban, urban background and urban traffic sites in these two cities. In addition, available datasets from previous measurements in Athens and Thessaloniki were used in order to assess the long-term variability of concentrations and sources. Chemical composition analysis of the 2011-2012 samples showed that carbonaceous matter was the most abundant component for both PM size fractions. A significant increase of carbonaceous particle concentrations and of the OC/EC ratio during the cold period, especially in the residential urban background sites, pointed towards domestic heating and more particularly wood (biomass) burning as a significant source. PMF analysis further supported this finding. Biomass burning was the largest contributing source at the two urban background sites (with mean contributions for the two size fractions in the range of 24-46%). Secondary aerosol formation (sulphate, nitrate & organics) was also a major contributing source for both size fractions at the suburban and urban background sites. At the urban traffic site, vehicular traffic (exhaust and non-exhaust emissions) was the source with the highest contributions, accounting for 44% of PM10 and 37% of PM2.5, respectively. The long-term variability of emission sources in the two cities (over 5-10 years), assessed through a harmonized application of the PMF technique on recent and past year data, clearly demonstrates the effective reduction in emissions during the last decade due to control measures and technological development; however, it also reflects the effects of the financial crisis in Greece during these years, which has led to decreased economic activities and the adoption of more polluting practices by the local population in an effort to reduce living costs.
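    Positive matrix factorization resolves a samples-by-species concentration matrix into non-negative factor contributions and factor profiles. The sketch below uses scikit-learn's NMF as an unweighted stand-in for PMF (true PMF weights each entry by its measurement uncertainty); the random matrix is only a placeholder for measured PM composition data.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
X = rng.random((120, 20))            # placeholder: 120 filter samples x 20 chemical species

model = NMF(n_components=5, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)           # factor contributions per sample (time series)
F = model.components_                # factor profiles (species "fingerprints")

# relative share of the reconstructed data matrix attributed to each factor
share = G.sum(axis=0) * F.sum(axis=1)
print(share / share.sum())
```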

  1. Biotic Nitrogen Enrichment Regulates Calcium Sources to Forests

    NASA Astrophysics Data System (ADS)

    Pett-Ridge, J. C.; Perakis, S. S.; Hynicka, J. D.

    2015-12-01

    Calcium is an essential nutrient in forest ecosystems that is susceptible to leaching loss and depletion. Calcium depletion can affect plant and animal productivity, soil acid buffering capacity, and fluxes of carbon and water. Excess nitrogen supply and associated soil acidification are often implicated in short-term calcium loss from soils, but the long-term role of nitrogen enrichment on calcium sources and resupply is unknown. Here we use strontium isotopes (87Sr/86Sr) as a proxy for calcium to investigate how soil nitrogen enrichment from biological nitrogen fixation interacts with bedrock calcium to regulate both short-term available supplies and the long-term sources of calcium in montane conifer forests. Our study examines 22 sites in western Oregon, spanning a 20-fold range of bedrock calcium on sedimentary and basaltic lithologies. In contrast to previous studies emphasizing abiotic control of weathering as a determinant of long-term ecosystem calcium dynamics and sources (via bedrock fertility, climate, or topographic/tectonic controls), we find instead that biotic nitrogen enrichment of soil can strongly regulate calcium sources and supplies in forest ecosystems. For forests on calcium-rich basaltic bedrock, increasing nitrogen enrichment causes calcium sources to shift from rock-weathering to atmospheric dominance, with minimal influence from other major soil forming factors, despite regionally high rates of tectonic uplift and erosion that can rejuvenate weathering supply of soil minerals. For forests on calcium-poor sedimentary bedrock, we find that atmospheric inputs dominate regardless of degree of nitrogen enrichment. Short-term measures of soil and ecosystem calcium fertility are decoupled from calcium source sustainability, with fundamental implications for understanding nitrogen impacts, both in natural ecosystems and in the context of global change. Our finding that long-term nitrogen enrichment increases forest reliance on atmospheric calcium helps explain reports of greater ecological calcium limitation in an increasingly nitrogen-rich world.
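    The usual way to turn an 87Sr/86Sr measurement into a source estimate is two-endmember mixing between atmospheric and rock-weathering endmembers. The sketch below shows that calculation with illustrative endmember ratios, which are assumptions for the example rather than values from the study.

```python
# Two-endmember mixing of Sr isotopes as a proxy for calcium sources.
R_ATM = 0.7092    # assumed atmospheric/sea-salt 87Sr/86Sr endmember
R_ROCK = 0.7035   # assumed basaltic bedrock 87Sr/86Sr endmember

def atmospheric_fraction(r_sample, r_atm=R_ATM, r_rock=R_ROCK):
    """Fraction of Sr (and, by proxy, Ca) derived from atmospheric inputs."""
    return (r_sample - r_rock) / (r_atm - r_rock)

print(atmospheric_fraction(0.7080))   # ~0.79 under these assumed endmembers
```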

  2. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, D.; Brunett, A.; Passerini, S.

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  3. Aerosol Microphysics and Radiation Integration

    DTIC Science & Technology

    2007-09-30

    http://www.nrlmry.navy.mil/flambe/ LONG-TERM GOALS This project works toward the development and support of real time global prognostic aerosol...Burning Emissions (FLAMBE) project were transitioned to the Fleet Numerical Oceanographic Center (FNMOC) Monterey in FY07. Meteorological guidance...Hyer, E. J. and J. S. Reid (2006), Evaluating the impact of improvements to the FLAMBE smoke source model on forecasts of aerosol distribution

  4. Seventh Graders' Perceptions of College and Career Aspiration Supports in Two Urban Charter Middle Schools

    ERIC Educational Resources Information Center

    Berardi-Demo, Linda

    2012-01-01

    College and career aspirations are important to the development of students' short and long term educational and personal goals. Although students rely on information they receive and are influenced by experiences in which they engage in a variety of settings, for many, school is an important source of college and career information. How…

  5. Plasma Radiation Source Development Program

    DTIC Science & Technology

    2006-03-01

    shell mass distributions perform better than thin shells. The dual plenum, double shell load has unique diagnostic features that enhance our...as implosion time increases. SUBJECT TERMS: Z-pinch, x-ray diagnostics, Rayleigh-Taylor instability, pulsed-power, x-ray spectroscopy, supersonic...feature permits some very useful diagnostics that shed light on critical details of the implosion process. See Section 3 for details. We have

  6. Mitigating climate change through small-scale forestry in the USA: opportunities and challenges

    Treesearch

    Susan Charnley; David Diaz; Hannah Gosnell

    2010-01-01

    Forest management for carbon sequestration is a low-cost, low-technology, relatively easy way to help mitigate global climate change that can be adopted now while additional long-term solutions are developed. Carbon-oriented management of forests also offers forest owners an opportunity to obtain a new source of income, and commonly has environmental co-benefits. The...

  7. An Improved Neutron Transport Algorithm for Space Radiation

    NASA Technical Reports Server (NTRS)

    Heinbockel, John H.; Clowdsley, Martha S.; Wilson, John W.

    2000-01-01

    A low-energy neutron transport algorithm for use in space radiation protection is developed. The algorithm is based upon a multigroup analysis of the straight-ahead Boltzmann equation by using a mean value theorem for integrals. This analysis is accomplished by solving a realistic but simplified neutron transport test problem. The test problem is analyzed by using numerical and analytical procedures to obtain an accurate solution within specified error bounds. Results from the test problem are then used for determining mean values associated with rescattering terms that are associated with a multigroup solution of the straight-ahead Boltzmann equation. The algorithm is then coupled to the Langley HZETRN code through the evaporation source term. Evaluation of the neutron fluence generated by the solar particle event of February 23, 1956, for a water and an aluminum-water shield-target configuration is then compared with LAHET and MCNPX Monte Carlo code calculations for the same shield-target configuration. The algorithm developed showed a great improvement in results over the unmodified HZETRN solution. In addition, a two-directional solution of the evaporation source showed even further improvement of the fluence near the front of the water target where diffusion from the front surface is important.
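    For orientation, the one-dimensional (straight-ahead) transport equation underlying this kind of multigroup analysis can be written in the generic form below. The notation is schematic rather than the paper's exact formulation, with S_j representing, for example, the evaporation source term coupled in from HZETRN.

```latex
% Straight-ahead Boltzmann transport equation (schematic form):
% \phi_j is the fluence of particle type j, \sigma_j the total macroscopic
% cross section, \sigma_{jk} the differential production cross section,
% and S_j an external source such as the evaporation source.
\left[\frac{\partial}{\partial x} + \sigma_j(E)\right]\phi_j(x,E)
  = \sum_k \int_{E}^{\infty} \sigma_{jk}(E,E')\,\phi_k(x,E')\,\mathrm{d}E' + S_j(x,E)
```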

  8. [Analysis of the quality of data issued from Beirut's hospitals in order to measure short-term health effects of air pollution].

    PubMed

    Mrad Nakhlé, M; Farah, W; Ziade, N; Abboud, M; Gerard, J; Zaarour, R; Saliba, N; Dabar, G; Abdel Massih, T; Zoghbi, A; Coussa-Koniski, M-L; Annesi-Maesano, I

    2013-12-01

    The effects of air pollution on human health have been the subject of much public health research. Several techniques and methods of analysis have been developed. Thus, Beirut Air Pollution and Health Effects (BAPHE) was designed to develop a methodology adapted to the context of the city of Beirut in order to quantify the short-term health effects of air pollution. The quality of data collected from emergency units was analyzed in order to properly estimate hospitalizations via these units. This study examined the process of selecting and validating health and pollution indicators. The different sources of data from emergency units were not correlated. BAPHE was therefore reoriented towards collecting health data from the emergency registry of each hospital. A pilot study determined the appropriate health indicators for BAPHE and created a classification methodology for data collection. In Lebanon, several studies have attempted to indirectly assess the impact of air pollution on health. They had limitations and weaknesses and offered no recommendations regarding the sources and quality of data. The present analysis will be useful for BAPHE and for planning further studies. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  9. Progress in the development of PDF turbulence models for combustion

    NASA Technical Reports Server (NTRS)

    Hsu, Andrew T.

    1991-01-01

    A combined Monte Carlo-computational fluid dynamic (CFD) algorithm was developed recently at Lewis Research Center (LeRC) for turbulent reacting flows. In this algorithm, conventional CFD schemes are employed to obtain the velocity field and other velocity-related turbulent quantities, and a Monte Carlo scheme is used to solve the evolution equation for the probability density function (pdf) of species mass fraction and temperature. In combustion computations, the predictions of chemical reaction rates (the source terms in the species conservation equation) are poor if conventional turbulence models are used. The main difficulty lies in the fact that the reaction rate is highly nonlinear, and the use of averaged temperature produces excessively large errors. Moment closure models for the source terms have attained only limited success. The probability density function (pdf) method seems to be the only alternative at the present time that uses local instantaneous values of the temperature, density, etc., in predicting chemical reaction rates, and thus may be the only viable approach for more accurate turbulent combustion calculations. Assumed pdf's are useful in simple problems; however, for more general combustion problems, the solution of an evolution equation for the pdf is necessary.
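    The point about averaged temperature can be made with a few lines of Python: because the Arrhenius rate is strongly nonlinear, the mean of the instantaneous rate over temperature fluctuations differs greatly from the rate evaluated at the mean temperature, which is what a PDF-based average captures and a simple mean-temperature closure does not. All numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
A, Ta = 1.0e10, 15000.0                       # assumed pre-exponential factor and activation temperature [K]
T = rng.normal(1500.0, 150.0, 100_000)        # fluctuating temperature samples: mean 1500 K, rms 150 K

rate_at_mean_T = A * np.exp(-Ta / T.mean())   # conventional closure using the averaged temperature
mean_of_rate = np.mean(A * np.exp(-Ta / T))   # PDF-style average of the instantaneous rate

print(rate_at_mean_T, mean_of_rate, mean_of_rate / rate_at_mean_T)  # ratio well above 1
```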

  10. An Assessment of Fission Product Scrubbing in Sodium Pools Following a Core Damage Event in a Sodium Cooled Fast Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, M.; Farmer, M.; Grabaskas, D.

    The U.S. Nuclear Regulatory Commission has stated that mechanistic source term (MST) calculations are expected to be required as part of the advanced reactor licensing process. A recent study by Argonne National Laboratory has concluded that fission product scrubbing in sodium pools is an important aspect of an MST calculation for a sodium-cooled fast reactor (SFR). To model the phenomena associated with sodium pool scrubbing, a computational tool, developed as part of the Integral Fast Reactor (IFR) program, was utilized in an MST trial calculation. This tool was developed by applying classical theories of aerosol scrubbing to the decontamination of gases produced as a result of postulated fuel pin failures during an SFR accident scenario. The model currently considers aerosol capture by Brownian diffusion, inertial deposition, and gravitational sedimentation. The effects of sodium vapour condensation on aerosol scrubbing are also treated. This paper provides details of the individual scrubbing mechanisms utilized in the IFR code as well as results from a trial mechanistic source term assessment led by Argonne National Laboratory in 2016.
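    If the individual capture mechanisms are treated as acting independently on the aerosol carried by a rising bubble, their efficiencies combine multiplicatively into an overall decontamination factor. The sketch below shows that bookkeeping only; it is a simplifying illustration, not the IFR code's model, and the efficiencies are made up.

```python
def overall_decontamination_factor(efficiencies):
    """Combine single-mechanism capture efficiencies (assumed independent)
    into an overall pool-scrubbing decontamination factor DF = m_in / m_out."""
    penetration = 1.0
    for eta in efficiencies:
        penetration *= (1.0 - eta)    # fraction of aerosol surviving each mechanism
    return 1.0 / penetration

# illustrative efficiencies: Brownian diffusion, inertial deposition, sedimentation
print(overall_decontamination_factor([0.60, 0.30, 0.10]))   # DF ~ 4
```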

  11. Finite element solution to passive scalar transport behind line sources under neutral and unstable stratification

    NASA Astrophysics Data System (ADS)

    Liu, Chun-Ho; Leung, Dennis Y. C.

    2006-02-01

    This study employed a direct numerical simulation (DNS) technique to contrast the plume behaviours and mixing of passive scalar emitted from line sources (aligned with the spanwise direction) in neutrally and unstably stratified open-channel flows. The DNS model was developed using the Galerkin finite element method (FEM) employing trilinear brick elements with equal-order interpolating polynomials that solved the momentum and continuity equations, together with conservation of energy and mass equations in incompressible flow. The second-order accurate fractional-step method was used to handle the implicit velocity-pressure coupling in incompressible flow. It also segregated the solution to the advection and diffusion terms, which were then integrated in time, respectively, by the explicit third-order accurate Runge-Kutta method and the implicit second-order accurate Crank-Nicolson method. The buoyancy term under unstable stratification was integrated in time explicitly by the first-order accurate Euler method. The DNS FEM model calculated the scalar-plume development and the mean plume path. In particular, it calculated the plume meandering in the wall-normal direction under unstable stratification that agreed well with the laboratory and field measurements, as well as previous modelling results available in literature.
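    The time-integration strategy described above (explicit third-order Runge-Kutta for advection, Crank-Nicolson for diffusion, explicit Euler for the buoyancy-type source) can be illustrated on a 1D periodic scalar transport problem. The sketch below is only an analogue of that splitting with assumed parameters, not the FEM model itself.

```python
import numpy as np

nx = 200
dx = 1.0 / nx
x = np.arange(nx) * dx
u, D, s0 = 1.0, 1.0e-3, 0.05          # advection speed, diffusivity, source strength (assumed)
dt = 0.4 * dx / u

c = np.exp(-((x - 0.3) ** 2) / (2 * 0.02 ** 2))   # initial scalar blob

def adv_rhs(c):
    # first-order upwind derivative for u > 0 on a periodic grid
    return -u * (c - np.roll(c, 1)) / dx

# Crank-Nicolson matrices for the periodic diffusion operator
I = np.eye(nx)
lap = (np.roll(I, 1, axis=1) - 2 * I + np.roll(I, -1, axis=1)) / dx**2
A = I - 0.5 * dt * D * lap
B = I + 0.5 * dt * D * lap

for _ in range(500):
    # advection: explicit SSP third-order Runge-Kutta
    c1 = c + dt * adv_rhs(c)
    c2 = 0.75 * c + 0.25 * (c1 + dt * adv_rhs(c1))
    c = c / 3.0 + 2.0 / 3.0 * (c2 + dt * adv_rhs(c2))
    # diffusion: implicit second-order Crank-Nicolson
    c = np.linalg.solve(A, B @ c)
    # buoyancy-like source: explicit first-order Euler
    c = c + dt * s0
```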

  12. Modernization and new technologies: Coping with the information explosion

    NASA Technical Reports Server (NTRS)

    Blados, Walter R.; Cotter, Gladys A.

    1993-01-01

    Information has become a valuable and strategic resource in all societies and economies. Scientific and technical information is especially important in developing and maintaining a strong national science and technology base. The expanding use of information technology, the growth of interdisciplinary research, and an increase in international collaboration are changing the characteristics of information. This modernization effort applies new technology to current processes to provide near-term benefits to the user. At the same time, we are developing a long-term modernization strategy designed to transition the program to a multimedia, global 'library without walls'. Notwithstanding this modernization program, it is recognized that no one information center can hope to collect all the relevant data. We see information and information systems changing and becoming more international in scope. We are finding that many nations are expending resources on national systems which duplicate each other. At the same time that this duplication exists, many useful sources of aerospace information are not being collected to cover expanded sources of information. This paper reviews the NASA modernization program and raises for consideration new possibilities for unification of the various aerospace database efforts toward a cooperative international aerospace database initiative, one that can optimize the cost/benefit equation for all participants.

  13. The legal system. Part 1: it's not just for lawyers.

    PubMed

    Boylan-Kemp, Jo

    This article is the first of two providing an introduction to the foundational elements of the English legal system. The 'English legal system' is a rather generic term that is often used to refer to the different sources of law and the court system in which the law is practiced. Students of law will study the English legal system as a specific topic, but it is equally important for those who work within a profession that is regulated by the law (as nursing is) to also develop an understanding of the legal boundaries within which such a profession works. Part one, therefore, will consider the matters that form the cornerstone of our legal system, such as the constitution, and it will also explain the specific legal terms and doctrines that influence how our law is made and developed. Part two will then go on to consider the different sources of law that can be found within the English legal system. The aim of these articles is to describe these principles in a way that makes them easily understandable by those who are not involved with practicing law but who instead work within other disciplines, such as nursing.

  14. A summary of the results from the DOE advanced gas reactor (AGR) fuel development and qualification program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petti, David Andrew

    2017-04-01

    Modular high temperature gas-cooled reactor (HTGR) designs were developed to provide natural safety, which prevents core damage under all licensing basis events. The principle that guides their design concepts is to passively maintain core temperatures below fission product release thresholds under all accident scenarios. The required level of fuel performance and fission product retention reduces the radioactive source term by many orders of magnitude relative to source terms for other reactor types and allows a graded approach to emergency planning and the potential elimination of the need for evacuation and sheltering beyond a small exclusion area. Achieving this level, however, is predicated on exceptionally high coated-particle fuel fabrication quality and excellent performance under normal operation and accident conditions. The design goal of modular HTGRs is to meet the Environmental Protection Agency (EPA) Protective Action Guides (PAGs) for offsite dose at the Exclusion Area Boundary (EAB). To achieve this, the reactor design concepts require a level of fuel integrity that is far better than that achieved for all prior U.S.-manufactured tristructural isotropic (TRISO) coated particle fuel.

  15. Fundamental Rotorcraft Acoustic Modeling From Experiments (FRAME)

    NASA Technical Reports Server (NTRS)

    Greenwood, Eric

    2011-01-01

    A new methodology is developed for the construction of helicopter source noise models for use in mission planning tools from experimental measurements of helicopter external noise radiation. The models are constructed by employing a parameter identification method to an assumed analytical model of the rotor harmonic noise sources. This new method allows for the identification of individual rotor harmonic noise sources and allows them to be characterized in terms of their individual non-dimensional governing parameters. The method is applied to both wind tunnel measurements and ground noise measurements of two-bladed rotors. The method is shown to match the parametric trends of main rotor harmonic noise, allowing accurate estimates of the dominant rotorcraft noise sources to be made for operating conditions based on a small number of measurements taken at different operating conditions. The ability of this method to estimate changes in noise radiation due to changes in ambient conditions is also demonstrated.

  16. Uncertainty principles for inverse source problems for electromagnetic and elastic waves

    NASA Astrophysics Data System (ADS)

    Griesmaier, Roland; Sylvester, John

    2018-06-01

    In isotropic homogeneous media, far fields of time-harmonic electromagnetic waves radiated by compactly supported volume currents, and elastic waves radiated by compactly supported body force densities can be modelled in very similar fashions. Both are projected restricted Fourier transforms of vector-valued source terms. In this work we generalize two types of uncertainty principles recently developed for far fields of scalar-valued time-harmonic waves in Griesmaier and Sylvester (2017 SIAM J. Appl. Math. 77 154–80) to this vector-valued setting. These uncertainty principles yield stability criteria and algorithms for splitting far fields radiated by collections of well-separated sources into the far fields radiated by individual source components, and for the restoration of missing data segments. We discuss proper regularization strategies for these inverse problems, provide stability estimates based on the new uncertainty principles, and comment on reconstruction schemes. A numerical example illustrates our theoretical findings.

  17. Reconstructing source terms from atmospheric concentration measurements: Optimality analysis of an inversion technique

    NASA Astrophysics Data System (ADS)

    Turbelin, Grégory; Singh, Sarvesh Kumar; Issartel, Jean-Pierre

    2014-12-01

    In the event of an accidental or intentional contaminant release in the atmosphere, it is imperative, for managing emergency response, to diagnose the release parameters of the source from measured data. Reconstruction of the source information exploiting measured data is called an inverse problem. To solve such a problem, several techniques are currently being developed. The first part of this paper provides a detailed description of one of them, known as the renormalization method. This technique, proposed by Issartel (2005), has been derived using an approach different from that of standard inversion methods and gives a linear solution to the continuous Source Term Estimation (STE) problem. In the second part of this paper, the discrete counterpart of this method is presented. By using matrix notation, common in data assimilation and suitable for numerical computing, it is shown that the discrete renormalized solution belongs to a family of well-known inverse solutions (minimum weighted norm solutions), which can be computed by using the concept of generalized inverse operator. It is shown that, when the weight matrix satisfies the renormalization condition, this operator satisfies the criteria used in geophysics to define good inverses. Notably, by means of the Model Resolution Matrix (MRM) formalism, we demonstrate that the renormalized solution fulfils optimal properties for the localization of single point sources. Throughout the article, the main concepts are illustrated with data from a wind tunnel experiment conducted at the Environmental Flow Research Centre at the University of Surrey, UK.
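    The family of solutions referred to above, the minimum weighted-norm solutions of an underdetermined linear system, can be written explicitly as x = W⁻¹Aᵀ(AW⁻¹Aᵀ)⁻¹y. The Python sketch below evaluates that formula for a diagonal weight matrix; the matrices are random placeholders, not data from the wind tunnel experiment.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 8, 200                       # few receptors, many candidate source grid cells
A = rng.random((m, n))              # source-receptor sensitivity matrix (placeholder)
y = rng.random(m)                   # measured concentrations (placeholder)
w = rng.random(n) + 0.5             # diagonal weights (e.g. a renormalization weight function)

Winv_At = A.T / w[:, None]          # W^-1 A^T for diagonal W
x = Winv_At @ np.linalg.solve(A @ Winv_At, y)   # minimum weighted-norm solution

print(np.allclose(A @ x, y))        # True: the estimate reproduces the measurements exactly
```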

  18. The mass-zero spin-two field and gravitational theory.

    NASA Technical Reports Server (NTRS)

    Coulter, C. A.

    1972-01-01

    Demonstration that the conventional theory of the mass-zero spin-two field with sources introduces extraneous nonspin-two field components in source regions and fails to be covariant under the full or restricted conformal group. A modified theory is given, expressed in terms of the physical components of mass-zero spin-two field rather than in terms of 'potentials,' which has no extraneous components inside or outside sources, and which is covariant under the full conformal group. For a proper choice of source term, this modified theory has the correct Newtonian limit and automatically implies that a symmetric second-rank source tensor has zero divergence. It is shown that possibly a generally covariant form of the spin-two theory derived here can be constructed to agree with general relativity in all currently accessible experimental situations.

  19. An Assessment of Some Design Constraints on Heat Production of a 3D Conceptual EGS Model Using an Open-Source Geothermal Reservoir Simulation Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yidong Xia; Mitch Plummer; Robert Podgorney

    2016-02-01

    Performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that 1) the fracture horizontal spacing has a profound effect on the long-term performance of heat production, 2) the downward deviation angle for the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and 3) the thermal energy production rate and lifespan have a close dependence on water mass flow rate. The results also indicate that the heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite element based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes that are either closed-source or commercially available in this area, this new open-source code has demonstrated a code development strategy that aims to provide unparalleled ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.

  20. Whole-plant adjustments in coconut (Cocos nucifera) in response to sink-source imbalance.

    PubMed

    Mialet-Serra, I; Clement-Vidal, A; Roupsard, O; Jourdan, C; Dingkuhn, M

    2008-08-01

    Coconut (Cocos nucifera L.) is a perennial tropical monocotyledon that produces fruit continuously. The physiological function of the large amounts of sucrose stored in coconut stems is unknown. To test the hypothesis that reserve storage and mobilization enable the crop to adjust to variable sink-source relationships at the scale of the whole plant, we investigated the dynamics of dry matter production, yield and yield components, and concentrations of nonstructural carbohydrate reserves in a coconut plantation on Vanuatu Island in the South Pacific. Two treatments were implemented continuously over 29 months (April 2002 to August 2004): 50% leaf pruning (to reduce the source) and 100% fruit and inflorescence pruning (to reduce the sink). The pruning treatments had little effect on carbohydrate reserves because they affected only petioles, not the main reserve pool in the stem. Both pruning treatments greatly reduced dry matter production of the reproductive compartment, but vegetative growth and development were negligibly affected by treatment and season. Leaf pruning increased radiation-use efficiency (RUE) initially, and fruit pruning greatly reduced RUE throughout the experiment. Changes in RUE were negatively correlated with leaflet soluble sugar concentration, indicating feedback inhibition of photosynthesis. We conclude that vegetative development and growth of coconut show little phenotypic plasticity, assimilate demand for growth being largely independent of a fluctuating assimilate supply. The resulting sink-source imbalances were partly compensated for by transitory reserves and, more importantly, by variable RUE in the short term, and by adjustment of fruit load in the long term. Possible physiological mechanisms are discussed, as well as modeling concepts that may be applied to coconut and similar tree crops.

  1. On the Development of Spray Submodels Based on Droplet Size Moments

    NASA Astrophysics Data System (ADS)

    Beck, J. C.; Watkins, A. P.

    2002-11-01

    Hitherto, all polydisperse spray models have been based on discretising the liquid flow field into groups of equally sized droplets. The authors have recently developed a spray model that captures the full polydisperse nature of the spray flow without using droplet size classes (Beck, 2000, Ph.D. thesis, UMIST; Beck and Watkins, 2001, Proc. R. Soc. London A). The parameters used to describe the distribution of droplet sizes are the moments of the droplet size distribution function. Transport equations are written for the two moments which represent the liquid mass and surface area, and two more moments representing the sum of drop radii and droplet number are approximated via use of a presumed distribution function, which is allowed to vary in space and time. The velocities to be used in the two transport equations are obtained by defining moment-average quantities and constructing further transport equations for the relevant moment-average velocities. An equation for the energy of the liquid phase and standard gas phase equations, including a k-ɛ turbulence model, are also solved. All the equations are solved in an Eulerian framework using the finite-volume approach, and the phases are coupled through source terms. Effects such as interphase drag, droplet breakup, and droplet-droplet collisions are also captured through the use of source terms. The development of the submodels to describe these effects is the subject of this paper. All the source terms for the hydrodynamics of the spray are derived in this paper in terms of the four moments of the droplet size distribution in order to find the net effect on the whole spray flow field. The development of similar submodels to describe heat and mass transfer effects between the phases is the subject of a further paper (Beck and Watkins, 2001, J. Heat Fluid Flow). The model has been applied to a wide variety of different sprays, including high-pressure diesel sprays, wide-angle solid-cone water sprays, hollow-cone sprays, and evaporating sprays. The comparisons of the results with experimental data show that the model performs well. The interphase drag model, along with the model for the turbulent dispersion of the liquid, produces excellent agreement in the spray penetration results, and the moment-average velocity approach gives good radial distributions of droplet size, showing the capability of the model to predict polydisperse behaviour. Good submodel performance results in droplet breakup, collisions, and evaporation effects (see Beck and Watkins, 2001, J. Heat Fluid Flow) also being captured successfully.
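    The moments referred to above are the radial moments of the droplet number distribution; writing them out makes the physical identification explicit. This is the generic definition (proportionality constants omitted), not necessarily the authors' exact notation.

```latex
% Radial moments of the droplet number distribution n(r):
% M_0 ~ droplet number density, M_1 ~ sum of droplet radii,
% M_2 ~ liquid surface area, M_3 ~ liquid volume (mass).
M_k = \int_0^{\infty} r^{k}\, n(r)\,\mathrm{d}r, \qquad k = 0, 1, 2, 3
```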

  2. Evaluating Decoupling Process in OECD Countries: Case Study of Turkey

    NASA Astrophysics Data System (ADS)

    An, Nazan; Şengün Ucal, Meltem; Kurnaz, M. Levent

    2017-04-01

    Climate change is among the foremost present and future problems facing humanity. It is now largely attributed to human activities, and economic activities are the source of the pressure those activities place on the environment. Ensuring the sustainability of resources for the future appears possible only by reducing the pressure of these economic activities on the environment. Given increasing population pressure and growth-focused economies, achieving decoupling is not easy on a global basis. Developing countries, in particular, are known to face difficulties, especially in accessing reliable data, during the transition to and implementation of decoupling. The decoupling practices and established calculation methods of developed countries can serve as a guide for developing countries. In this study, we calculated a comparative decoupling index for OECD countries and Turkey, subject to data suitability, and showed the differences between them, indicating the level of decoupling (weak, stable, strong) for each country. We believe the comparison with Turkey can serve as an example for other developing countries. Acknowledgement: This research has been supported by Bogazici University Research Fund Grant Number 12220.
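    The abstract does not state which decoupling index was used, so as one common possibility a Tapio-style elasticity of emissions with respect to GDP is sketched below; the classification thresholds and the input figures are assumptions for illustration only.

```python
def decoupling_elasticity(co2_t0, co2_t1, gdp_t0, gdp_t1):
    """Tapio-style elasticity: relative change in emissions per relative change in GDP."""
    return ((co2_t1 - co2_t0) / co2_t0) / ((gdp_t1 - gdp_t0) / gdp_t0)

def classify(e):
    # assumed thresholds, loosely mirroring the weak/stable/strong wording above
    if e < 0.0:
        return "strong decoupling"
    if e < 0.8:
        return "weak decoupling"
    return "coupled (no decoupling)"

e = decoupling_elasticity(co2_t0=350.0, co2_t1=340.0, gdp_t0=1000.0, gdp_t1=1100.0)
print(round(e, 3), classify(e))
```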

  3. Plant Perception and Short-Term Responses to Phytophagous Insects and Mites.

    PubMed

    Santamaria, M Estrella; Arnaiz, Ana; Gonzalez-Melendi, Pablo; Martinez, Manuel; Diaz, Isabel

    2018-05-03

    Plant–pest relationships involve complex processes encompassing a network of molecules, signals, and regulators for overcoming defenses they develop against each other. Phytophagous arthropods identify plants mainly as a source of food. In turn, plants develop a variety of strategies to avoid damage and survive. The success of plant defenses depends on rapid and specific recognition of the phytophagous threat. Subsequently, plants trigger a cascade of short-term responses that eventually result in the production of a wide range of compounds with defense properties. This review deals with the main features involved in the interaction between plants and phytophagous insects and acari, focusing on early responses from the plant side. A general landscape of the diverse strategies employed by plants within the first hours after pest perception to block the capability of phytophagous insects to develop mechanisms of resistance is presented, with the potential of providing alternatives for pest control.

  4. Pharmaceutical Industry in Syria

    PubMed Central

    2010-01-01

    The aim of this article is to present the development of the pharmaceutical industry in Syria using national and international public data sources. At the end of the 1980s, the pharmaceutical industry in Syria was very weak, covering 6% of national needs. In less than 20 years, with government support in the form of a legal framework and strategic political engagement, the Syrian pharmaceutical industry came to cover almost 90% of national drug needs and exported drugs to around 52 Arab countries. Beyond covering the local market, the main added values of this development were drug exports amounting to 150 million dollars per year and jobs for 17,000 Syrians, of whom around 85% are women. Strong and weak points of the pharmaceutical sector are considered in the article, and further interventions to support sustainable development are proposed by the author. PMID:20945828

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Adair, Kristin Lynn; Durfee, Justin David.

    When developing linear programming models, issues such as budget limitations, customer requirements, or licensing may preclude the use of commercial linear programming solvers. In such cases, one option is to use an open-source linear programming solver. A survey of linear programming tools was conducted to identify potential open-source solvers. From this survey, four open-source solvers were tested using a collection of linear programming test problems and the results were compared to IBM ILOG CPLEX Optimizer (CPLEX) [1], an industry standard. The solvers considered were: COIN-OR Linear Programming (CLP) [2], [3], GNU Linear Programming Kit (GLPK) [4], lp_solve [5] and Modular In-core Nonlinear Optimization System (MINOS) [6]. As no open-source solver outperforms CPLEX, this study demonstrates the power of commercial linear programming software. CLP was found to be the top performing open-source solver considered in terms of capability and speed. GLPK also performed well but cannot match the speed of CLP or CPLEX. lp_solve and MINOS were considerably slower and encountered issues when solving several test problems.
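    A toy problem of the kind used in such benchmarks can be posed and handed to one of the open-source solvers named above. The snippet below uses PuLP as a convenient Python front end and its GLPK command-line interface; both are assumptions about tooling rather than the study's own setup.

```python
import pulp

# maximize 3x + 2y subject to x + y <= 4, x + 3y <= 6, x, y >= 0
prob = pulp.LpProblem("toy_lp", pulp.LpMaximize)
x = pulp.LpVariable("x", lowBound=0)
y = pulp.LpVariable("y", lowBound=0)
prob += 3 * x + 2 * y              # objective
prob += x + y <= 4
prob += x + 3 * y <= 6

prob.solve(pulp.GLPK_CMD(msg=False))   # or prob.solve() for PuLP's bundled CBC solver
print(pulp.LpStatus[prob.status], pulp.value(x), pulp.value(y), pulp.value(prob.objective))
```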

  6. Free-space quantum key distribution with a high generation rate potassium titanyl phosphate waveguide photon-pair source

    NASA Astrophysics Data System (ADS)

    Wilson, Jeffrey D.; Chaffee, Dalton W.; Wilson, Nathaniel C.; Lekki, John D.; Tokars, Roger P.; Pouch, John J.; Roberts, Tony D.; Battle, Philip R.; Floyd, Bertram; Lind, Alexander J.; Cavin, John D.; Helmick, Spencer R.

    2016-09-01

    A high generation rate photon-pair source using a dual element periodically-poled potassium titanyl phosphate (PP KTP) waveguide is described. The fully integrated photon-pair source consists of a 1064-nm pump diode laser, fiber-coupled to a dual element waveguide within which a pair of 1064-nm photons are up-converted to a single 532-nm photon in the first stage. In the second stage, the 532-nm photon is down-converted to an entangled photon-pair at 800 nm and 1600 nm which are fiber-coupled at the waveguide output. The photon-pair source features a high pair generation rate, a compact power-efficient package, and continuous wave (CW) or pulsed operation. This is a significant step towards the long term goal of developing sources for high-rate Quantum Key Distribution (QKD) to enable Earth-space secure communications. Characterization and test results are presented. Details and preliminary results of a laboratory free space QKD experiment with the B92 protocol are also presented.

  7. Free-Space Quantum Key Distribution with a High Generation Rate Potassium Titanyl Phosphate Waveguide Photon-Pair Source

    NASA Technical Reports Server (NTRS)

    Wilson, Jeffrey D.; Chaffee, Dalton W.; Wilson, Nathaniel C.; Lekki, John D.; Tokars, Roger P.; Pouch, John J.; Roberts, Tony D.; Battle, Philip; Floyd, Bertram M.; Lind, Alexander J.

    2016-01-01

    A high generation rate photon-pair source using a dual element periodically-poled potassium titanyl phosphate (PP KTP) waveguide is described. The fully integrated photon-pair source consists of a 1064-nanometer pump diode laser, fiber-coupled to a dual element waveguide within which a pair of 1064-nanometer photons are up-converted to a single 532-nanometer photon in the first stage. In the second stage, the 532-nanometer photon is down-converted to an entangled photon-pair at 800 nanometer and 1600 nanometer which are fiber-coupled at the waveguide output. The photon-pair source features a high pair generation rate, a compact power-efficient package, and continuous wave (CW) or pulsed operation. This is a significant step towards the long term goal of developing sources for high-rate Quantum Key Distribution (QKD) to enable Earth-space secure communications. Characterization and test results are presented. Details and preliminary results of a laboratory free-space QKD experiment with the B92 protocol are also presented.

  8. The potential contribution of geothermal energy to electricity supply in Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Chandrasekharam, D.; Lashin, Aref; Al Arifi, Nassir

    2016-10-01

    With electricity demand increasing at 7.5% per year, a major concern for Saudi Arabia is the amount of CO2 being emitted. The country has the potential to generate 200×10⁶ kWh from hydrothermal sources and 120×10⁶ TWh from Enhanced Geothermal System (EGS) sources. In addition to electricity generation and desalination, the country has substantial resources for direct applications such as space cooling and heating, a sector that consumes 80% of the electricity generated from fossil fuels. Geothermal energy can easily offset the 17 million kWh of electricity that is being used for desalination. At least part of the 181,000 Gg of CO2 emitted by conventional space cooling units can also be mitigated immediately through ground-source heat pump technology. Future development of EGS sources, together with the wet geothermal systems, will strengthen the country in terms of oil reserves saved and increased exports.

  9. Neuroimaging Evidence for Agenda-Dependent Monitoring of Different Features during Short-Term Source Memory Tests

    ERIC Educational Resources Information Center

    Mitchell, Karen J.; Raye, Carol L.; McGuire, Joseph T.; Frankel, Hillary; Greene, Erich J.; Johnson, Marcia K.

    2008-01-01

    A short-term source monitoring procedure with functional magnetic resonance imaging assessed neural activity when participants made judgments about the format of 1 of 4 studied items (picture, word), the encoding task performed (cost, place), or whether an item was old or new. The results support findings from long-term memory studies showing that…

  10. Evaluation of Long-term Performance of Enhanced Anaerobic Source Zone Bioremediation using mass flux

    NASA Astrophysics Data System (ADS)

    Haluska, A.; Cho, J.; Hatzinger, P.; Annable, M. D.

    2017-12-01

    Chlorinated ethene DNAPL source zones in groundwater act as potential long-term sources of contamination as they dissolve, yielding concentrations well above MCLs and posing an on-going public health risk. Enhanced bioremediation has been applied to treat many source zones with significant promise, but the long-term sustainability of this technology has not been thoroughly assessed. This study evaluated the long-term effectiveness of enhanced anaerobic source zone bioremediation at chloroethene-contaminated sites to determine if the treatment prevented contaminant rebound and removed NAPL from the source zone. Long-term performance was evaluated based on achieving MCL-based contaminant mass fluxes in parent compound concentrations during different monitoring periods. Groundwater concentration versus time data was compiled for six sites, and post-remedial contaminant mass flux data was then measured using passive flux meters at wells both within and down-gradient of the source zone. Post-remedial mass flux data was then combined with pre-remedial water quality data to estimate pre-remedial mass flux. This information was used to characterize a DNAPL dissolution source strength function, such as the Power Law Model and the Equilibrium Streamtube model. The six sites characterized for this study were (1) Former Charleston Air Force Base, Charleston, SC; (2) Dover Air Force Base, Dover, DE; (3) Treasure Island Naval Station, San Francisco, CA; (4) Former Raritan Arsenal, Edison, NJ; (5) Naval Air Station, Jacksonville, FL; and (6) Former Naval Air Station, Alameda, CA. Contaminant mass fluxes decreased for all the sites by the end of the post-treatment monitoring period and rebound was limited within the source zone. Post-remedial source strength function estimates suggest that decreases in contaminant mass flux will continue to occur at these sites, but a mass flux based on MCL levels may never be exceeded. Thus, site clean-up goals should be evaluated as order-of-magnitude reductions. Additionally, sites may require monitoring for a minimum of 5 years in order to sufficiently evaluate remedial performance. The study shows that enhanced anaerobic source zone bioremediation contributed to a modest reduction of source zone contaminant mass discharge and appears to have mitigated rebound of chlorinated ethenes.
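    The power-law source strength function mentioned above relates flux-averaged concentration to remaining source mass as C/C0 = (M/M0)^Γ; coupling it to a simple mass balance gives a depletion curve. The sketch below integrates that pair of relations with illustrative parameter values, which are assumptions and not fitted to any of the six sites.

```python
# Power-law DNAPL source depletion: C(t)/C0 = (M(t)/M0)**Gamma, dM/dt = -Q*C(t).
M0, C0 = 100.0, 0.05        # initial source mass [kg], initial concentration [kg/m^3] (assumed)
Q, Gamma = 2000.0, 1.0      # groundwater flow through source [m^3/yr], power-law exponent (assumed)

dt, t_end = 0.01, 30.0      # time step and horizon [yr]
M, t = M0, 0.0
while t < t_end and M > 0.0:
    C = C0 * (M / M0) ** Gamma
    M = max(M - Q * C * dt, 0.0)
    t += dt

print(f"after {t:.1f} yr: source mass {M:.2e} kg, concentration {C:.2e} kg/m^3")
```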

  11. Development of departmental standard for traceability of measured activity for I-131 therapy capsules used in nuclear medicine.

    PubMed

    Ravichandran, Ramamoorthy; Binukumar, Jp

    2011-01-01

    International Basic Safety Standards (International Atomic Energy Agency, IAEA) provide guidance levels for diagnostic procedures in nuclear medicine, indicating the maximum usual activity for various diagnostic tests in terms of the activities of injected radioactive formulations. An accuracy of ±10% in the activities of administered radiopharmaceuticals is recommended for the expected outcome of diagnostic and therapeutic nuclear medicine procedures. It is recommended that the long-term stability of isotope calibrators used in nuclear medicine be checked periodically using a long-lived check source, such as Cs-137, of suitable activity. In view of the unavailability of such a radioactive source, we tried to develop methods to maintain the traceability of these instruments for certifying measured activities for human use. Two re-entrant chambers [HDR 1000 and the Selectron Source Dosimetry System (SSDS)] with I-125 and Ir-192 calibration factors in the Department of Radiotherapy were used to measure Iodine-131 (I-131) therapy capsules to establish traceability to the Mark V isotope calibrator of the Department of Nuclear Medicine. Special nylon jigs were fabricated to keep the I-131 capsule holder in position. Measured activities in all the chambers showed good agreement. The accuracy of the SSDS chamber in measuring Ir-192 activities over the last 5 years was within 0.5%, validating its role as a departmental standard for measuring activity. The above method was adopted because the mean energies of I-131 and Ir-192 are comparable.

  12. The new generation of beta-cells: replication, stem cell differentiation, and the role of small molecules.

    PubMed

    Borowiak, Malgorzata

    2010-01-01

    Diabetic patients suffer from the loss of insulin-secreting β-cells, or from an improperly functioning β-cell mass. Due to the increasing prevalence of diabetes across the world, there is a compelling need for a renewable source of cells that could replace pancreatic β-cells. In recent years, several promising approaches to the generation of new β-cells have been developed. These include directed differentiation of pluripotent cells such as embryonic stem (ES) cells or induced pluripotent stem (iPS) cells, and reprogramming of mature tissue cells. High-yield methods to differentiate cell populations into β-cells, definitive endoderm, and pancreatic progenitors have been established using growth factors and small molecules. However, the final step of directed differentiation, generating functional, mature β-cells in sufficient quantities, has yet to be achieved in vitro. Besides the needs of transplantation medicine, a renewable source of β-cells would also be important as a platform to study the pathogenesis of diabetes and to seek alternative treatments. Finally, by generating new β-cells, we could learn more about pancreatic development and β-cell specification. This review gives an overview of pancreas ontogenesis from the perspective of stem cell differentiation and highlights the critical role of small molecules in generating a renewable β-cell source. It also discusses longer-term challenges and opportunities in moving towards a therapeutic goal for diabetes.

  13. Federated Access to Heterogeneous Information Resources in the Neuroscience Information Framework (NIF)

    PubMed Central

    Gupta, Amarnath; Bug, William; Marenco, Luis; Qian, Xufei; Condit, Christopher; Rangarajan, Arun; Müller, Hans Michael; Miller, Perry L.; Sanders, Brian; Grethe, Jeffrey S.; Astakhov, Vadim; Shepherd, Gordon; Sternberg, Paul W.; Martone, Maryann E.

    2009-01-01

    The overarching goal of the NIF (Neuroscience Information Framework) project is to be a one-stop shop for neuroscience. This paper provides a technical overview of how the system is designed. The technical goal of the first version of the NIF system was to develop an information system that a neuroscientist can use to locate relevant information from a wide variety of information sources by simple keyword queries. Although the user provides only keywords to retrieve information, the NIF system is designed to treat them as concepts whose meanings are interpreted by the system. Thus, a search for a term should also find records containing synonyms of that term. The system is targeted to find information from web pages, publications, databases, web sites built upon databases, XML documents and any other modality in which such information may be published. We have designed a system to achieve this functionality. A central element in the system is an ontology called NIFSTD (for NIF Standard), constructed by amalgamating a number of known and newly developed ontologies. NIFSTD is used by our ontology management module, called OntoQuest, to perform ontology-based search over data sources. The NIF architecture currently provides three different mechanisms for searching heterogeneous data sources, including relational databases, web sites, XML documents and the full text of publications. Version 1.0 of the NIF system is currently in beta test and may be accessed through http://nif.nih.gov. PMID:18958629

  14. Federated access to heterogeneous information resources in the Neuroscience Information Framework (NIF).

    PubMed

    Gupta, Amarnath; Bug, William; Marenco, Luis; Qian, Xufei; Condit, Christopher; Rangarajan, Arun; Müller, Hans Michael; Miller, Perry L; Sanders, Brian; Grethe, Jeffrey S; Astakhov, Vadim; Shepherd, Gordon; Sternberg, Paul W; Martone, Maryann E

    2008-09-01

    The overarching goal of the NIF (Neuroscience Information Framework) project is to be a one-stop shop for neuroscience. This paper provides a technical overview of how the system is designed. The technical goal of the first version of the NIF system was to develop an information system that a neuroscientist can use to locate relevant information from a wide variety of information sources by simple keyword queries. Although the user provides only keywords to retrieve information, the NIF system is designed to treat them as concepts whose meanings are interpreted by the system. Thus, a search for a term should also find records containing synonyms of that term. The system is targeted to find information from web pages, publications, databases, web sites built upon databases, XML documents and any other modality in which such information may be published. We have designed a system to achieve this functionality. A central element in the system is an ontology called NIFSTD (for NIF Standard), constructed by amalgamating a number of known and newly developed ontologies. NIFSTD is used by our ontology management module, called OntoQuest, to perform ontology-based search over data sources. The NIF architecture currently provides three different mechanisms for searching heterogeneous data sources, including relational databases, web sites, XML documents and the full text of publications. Version 1.0 of the NIF system is currently in beta test and may be accessed through http://nif.nih.gov.
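
    The concept-based search described above can be illustrated with a toy keyword-expansion step: a query term is mapped to its synonyms in an ontology, and any of the expanded terms is matched against the records. The miniature "ontology" and records below are invented for illustration; they are not NIFSTD, OntoQuest, or the actual NIF federation.

```python
# Toy illustration of concept-based keyword expansion, in the spirit of the
# ontology-backed search described above; the mini "ontology" and records are
# invented stand-ins, not NIFSTD or the actual NIF data sources.
ontology = {
    "purkinje cell": {"synonyms": ["purkinje neuron", "cerebellar purkinje cell"]},
    "hippocampus": {"synonyms": ["ammon's horn", "cornu ammonis"]},
}

records = [
    "Electrophysiology of cerebellar Purkinje neurons in slice preparations",
    "Gene expression atlas of the mouse cornu ammonis",
    "A database of cortical interneuron morphologies",
]

def concept_search(keyword, records, ontology):
    """Expand a keyword to its concept synonyms, then match any of them."""
    terms = {keyword.lower()}
    terms.update(ontology.get(keyword.lower(), {}).get("synonyms", []))
    return [r for r in records if any(t in r.lower() for t in terms)]

print(concept_search("Purkinje cell", records, ontology))
print(concept_search("hippocampus", records, ontology))
```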

  15. Efficient RF energy harvesting by using a fractal structured rectenna system

    NASA Astrophysics Data System (ADS)

    Oh, Sechang; Ramasamy, Mouli; Varadan, Vijay K.

    2014-04-01

    A rectenna system collects RF energy, converts it into direct current, and delivers it to power electronic devices or recharge batteries. It consists of an antenna for receiving RF power, an input filter for energy processing and impedance matching, a rectifier, an output filter, and a load resistor. However, conventional rectenna systems have a drawback in terms of power generation, as the single resonant frequency of an antenna can generate only low power compared to multiple resonant frequencies. A multi-band rectenna system is an optimal solution to generate more power. This paper proposes the design of a novel rectenna system, which involves developing a multi-band rectenna with a fractal-structured antenna to facilitate an increase in energy harvesting from various sources such as Wi-Fi, TV signals, mobile networks and other ambient sources, eliminating the limitation of a single-band technique. The use of fractal antennas offers certain prominent advantages in terms of size and multiple resonances. Even though a fractal antenna incorporates multiple resonances, controlling the resonant frequencies is an important aspect of generating power from the various desired RF sources. Hence, this paper also describes the design parameters of the fractal antenna and the methods to control the multi-band frequencies.

  16. Nonlinear synthesis of infrasound propagation through an inhomogeneous, absorbing atmosphere.

    PubMed

    de Groot-Hedlin, C D

    2012-08-01

    An accurate and efficient method to predict infrasound amplitudes from large explosions in the atmosphere is required for diverse source types, including bolides, volcanic eruptions, and nuclear and chemical explosions. A finite-difference, time-domain approach is developed to solve a set of nonlinear fluid dynamic equations for total pressure, temperature, and density fields rather than acoustic perturbations. Three key features for the purpose of synthesizing nonlinear infrasound propagation in realistic media are that it includes gravitational terms, it allows for acoustic absorption, including molecular vibration losses at frequencies well below the molecular vibration frequencies, and the environmental models are constrained to have axial symmetry, allowing a three-dimensional simulation to be reduced to two dimensions. Numerical experiments are performed to assess the algorithm's accuracy and the effect of source amplitudes and atmospheric variability on infrasound waveforms and shock formation. Results show that infrasound waveforms steepen and their associated spectra are shifted to higher frequencies for nonlinear sources, leading to enhanced infrasound attenuation. Results also indicate that nonlinear infrasound amplitudes depend strongly on atmospheric temperature and pressure variations. The solution for total field variables and insertion of gravitational terms also allows for the computation of other disturbances generated by explosions, including gravity waves.

  17. NASA thesaurus. Volume 3: Definitions

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Publication of NASA Thesaurus definitions began with Supplement 1 to the 1985 NASA Thesaurus. The definitions given here represent the complete file of over 3,200 definitions, complemented by nearly 1,000 use references. Definitions of more common or general scientific terms are given a NASA slant if one exists. Certain terms are not defined as a matter of policy: common names, chemical elements, specific models of computers, and nontechnical terms. The NASA Thesaurus predates by a number of years the systematic effort to define terms; therefore, not all Thesaurus terms have been defined. Nevertheless, definitions of older terms are continually being added. The following data are provided for each entry: term in uppercase/lowercase form, definition, source, and year the term (not the definition) was added to the NASA Thesaurus. The NASA History Office is the authority for capitalization in satellite and spacecraft names. Definitions with no source given were constructed by lexicographers at the NASA Scientific and Technical Information (STI) Facility who rely on the following sources for their information: experts in the field, literature searches from the NASA STI database, and specialized references.

  18. A Subspace Pursuit–based Iterative Greedy Hierarchical Solution to the Neuromagnetic Inverse Problem

    PubMed Central

    Babadi, Behtash; Obregon-Henao, Gabriel; Lamus, Camilo; Hämäläinen, Matti S.; Brown, Emery N.; Purdon, Patrick L.

    2013-01-01

    Magnetoencephalography (MEG) is an important non-invasive method for studying activity within the human brain. Source localization methods can be used to estimate spatiotemporal activity from MEG measurements with high temporal resolution, but the spatial resolution of these estimates is poor due to the ill-posed nature of the MEG inverse problem. Recent developments in source localization methodology have emphasized temporal as well as spatial constraints to improve source localization accuracy, but these methods can be computationally intense. Solutions emphasizing spatial sparsity hold tremendous promise, since the underlying neurophysiological processes generating MEG signals are often sparse in nature, whether in the form of focal sources, or distributed sources representing large-scale functional networks. Recent developments in the theory of compressed sensing (CS) provide a rigorous framework to estimate signals with sparse structure. In particular, a class of CS algorithms referred to as greedy pursuit algorithms can provide both high recovery accuracy and low computational complexity. Greedy pursuit algorithms are difficult to apply directly to the MEG inverse problem because of the high-dimensional structure of the MEG source space and the high spatial correlation in MEG measurements. In this paper, we develop a novel greedy pursuit algorithm for sparse MEG source localization that overcomes these fundamental problems. This algorithm, which we refer to as the Subspace Pursuit-based Iterative Greedy Hierarchical (SPIGH) inverse solution, exhibits very low computational complexity while achieving very high localization accuracy. We evaluate the performance of the proposed algorithm using comprehensive simulations, as well as the analysis of human MEG data during spontaneous brain activity and somatosensory stimuli. These studies reveal substantial performance gains provided by the SPIGH algorithm in terms of computational complexity, localization accuracy, and robustness. PMID:24055554
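
    As a minimal illustration of the greedy-pursuit idea behind sparse source localization (not the SPIGH algorithm itself, which adds subspace selection and a hierarchical treatment of the MEG source space), the sketch below runs plain orthogonal matching pursuit on a toy underdetermined linear problem with three active "sources"; the sensing matrix and dimensions are arbitrary.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick k columns of A to explain y.
    Illustrative stand-in for greedy sparse recovery; not the SPIGH algorithm."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # re-fit coefficients on the selected support by least squares
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x, support

rng = np.random.default_rng(0)
A = rng.standard_normal((64, 256))            # toy "lead field": sensors x sources
A /= np.linalg.norm(A, axis=0)                # unit-norm columns
x_true = np.zeros(256)
x_true[[10, 50, 200]] = [1.0, -2.0, 1.5]      # three "active sources"
y = A @ x_true + 0.01 * rng.standard_normal(64)
x_hat, support = omp(A, y, k=3)
print(sorted(support))                        # expected to recover [10, 50, 200]
```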

  19. Total energy management for nursing homes and other long-term care institutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1977-01-01

    The purpose of this publication is to provide the basic instruction needed to implement the most effective form of energy conservation--Total Energy Management, or TEM--in your long-term care facility. The effort required is worthwhile for many different reasons: TEM is self-paying; TEM promotes energy conservation without negative impact on health care services; and energy costs will continue to escalate. Following the introductory chapter, chapters are titled: Understanding Energy Consumption; Initiating a Total Energy Management Program; Developing Energy Consumption Data; Conducting the Facility Survey; Developing and Implementing the Basic Plan; Communication and Motivation; Monitoring Your Program and Keeping It Effective; and Guidelines for Energy Conservation. Two appendices furnish information on building information for TEM and sources of information for energy management. (MCW)

  20. Coping with carbon: a near-term strategy to limit carbon dioxide emissions from power stations.

    PubMed

    Breeze, Paul

    2008-11-13

    Burning coal to generate electricity is one of the key sources of atmospheric carbon dioxide emissions, so targeting coal-fired power plants offers one of the easiest ways of reducing global carbon emissions. Given that the world's largest economies all rely heavily on coal for electricity production, eliminating coal combustion is not an option. Indeed, coal consumption is likely to increase over the next 20-30 years. However, the introduction of more efficient steam cycles will improve the emission performance of these plants over the short term. To achieve a reduction in carbon emissions from coal-fired plants, however, it will be necessary to develop and introduce carbon capture and sequestration technologies. Given adequate investment, these technologies should be capable of commercial development by ca. 2020.

  1. Breast-Feeding Friendly, but Not Formula Averse.

    PubMed

    Lewis, Juanita

    2017-11-01

    Breast-feeding is the optimal source of newborn nutrition in term infants and is associated with multiple short- and long-term health benefits. Establishment of breast-feeding may be difficult in a small subset of mothers, which can lead to adverse consequences in the newborn. Some of the consequences of suboptimal nutritional provision to the newborn, such as severe hyperbilirubinemia and breast-feeding-associated hypernatremic dehydration, can have devastating and long-lasting sequelae. Timely identification of mothers and newborns at risk for developing these complications is necessary to avoid significant morbidity and mortality. In these cases, the judicious use of formula supplementation may be considered. However, more studies are necessary to develop comprehensive formula supplementation criteria and guidelines for pediatric medical providers. [Pediatr Ann. 2017;46(11):e402-e408.]. Copyright 2017, SLACK Incorporated.

  2. Modeling the contribution of point sources and non-point sources to Thachin River water pollution.

    PubMed

    Schaffner, Monika; Bader, Hans-Peter; Scheidegger, Ruth

    2009-08-15

    Major rivers in developing and emerging countries increasingly suffer from severe degradation of water quality. The current study uses a mathematical Material Flow Analysis (MMFA) as a complementary approach to address the degradation of river water quality due to nutrient pollution in the Thachin River Basin in Central Thailand. This paper gives an overview of the origins and flow paths of the various point and non-point pollution sources in the Thachin River Basin (in terms of nitrogen and phosphorus) and quantifies their relative importance within the system. The key parameters influencing the main nutrient flows are determined and possible mitigation measures discussed. The results show that aquaculture (as a point source) and rice farming (as a non-point source) are the key nutrient sources in the Thachin River Basin. Other point sources such as pig farms, households and industries, which were previously cited as the most relevant pollution sources in terms of organic pollution, play less significant roles in comparison. This order of importance shifts when considering the model results at the provincial level. Crosschecks with secondary data and field studies confirm the plausibility of our simulations. Specific nutrient loads for the pollution sources are derived; these can be used for a first broad quantification of nutrient pollution in comparable river basins. Based on an identification of the sensitive model parameters, possible mitigation scenarios are determined and their potential to reduce the nutrient load evaluated. A comparison of simulated nutrient loads with measured nutrient concentrations shows that nutrient retention in the river system may be significant. Sedimentation in the slow-flowing surface water network, as well as nitrogen emission to the air from the warm, oxygen-deficient waters, is certainly partly responsible, but wetlands along the river banks could also play an important role as nutrient sinks.

  3. Assessing the Impact of Source-Zone Remediation Efforts at the Contaminant-Plume Scale Through Analysis of Contaminant Mass Discharge

    PubMed Central

    Brusseau, M. L.; Hatton, J.; DiGuiseppi, W.

    2011-01-01

    The long-term impact of source-zone remediation efforts was assessed for a large site contaminated by trichloroethene. The impact of the remediation efforts (soil vapor extraction and in-situ chemical oxidation) was assessed through analysis of plume-scale contaminant mass discharge, which was measured using a high-resolution data set obtained from 23 years of operation of a large pump-and-treat system. The initial contaminant mass discharge peaked at approximately 7 kg/d, and then declined to approximately 2 kg/d. This latter value was sustained for several years prior to the initiation of source-zone remediation efforts. The contaminant mass discharge in 2010, measured several years after completion of the two source-zone remediation actions, was approximately 0.2 kg/d, which is ten times lower than the value prior to source-zone remediation. The time-continuous contaminant mass discharge data can be used to evaluate the impact of the source-zone remediation efforts on reducing the time required to operate the pump-and-treat system, and to estimate the cost savings associated with the decreased operational period. While significant reductions have been achieved, it is evident that the remediation efforts have not completely eliminated contaminant mass discharge and associated risk. Remaining contaminant mass contributing to the current mass discharge is hypothesized to comprise poorly-accessible mass in the source zones, as well as aqueous (and sorbed) mass present in the extensive lower-permeability units located within and adjacent to the contaminant plume. The fate of these sources is an issue of critical import to the remediation of chlorinated-solvent contaminated sites, and development of methods to address these sources will be required to achieve successful long-term management of such sites and to ultimately transition them to closure. PMID:22115080

  4. Size distribution, directional source contributions and pollution status of PM from Chengdu, China during a long-term sampling campaign.

    PubMed

    Shi, Guo-Liang; Tian, Ying-Ze; Ma, Tong; Song, Dan-Lin; Zhou, Lai-Dong; Han, Bo; Feng, Yin-Chang; Russell, Armistead G

    2017-06-01

    Long-term and synchronous monitoring of PM10 and PM2.5 was conducted in Chengdu, China from 2007 to 2013. The levels, variations, compositions and size distributions were investigated. The sources were quantified by two-way and three-way receptor models (PMF2, ME2-2way and ME2-3way). Consistent results were found: the primary source categories contributed 63.4% (PMF2), 64.8% (ME2-2way) and 66.8% (ME2-3way) to PM10, and 60.9% (PMF2), 65.5% (ME2-2way) and 61.0% (ME2-3way) to PM2.5. Secondary sources contributed 31.8% (PMF2), 32.9% (ME2-2way) and 31.7% (ME2-3way) to PM10, and 35.0% (PMF2), 33.8% (ME2-2way) and 36.0% (ME2-3way) to PM2.5. The size distribution of source categories was estimated better by the ME2-3way method; the three-way model can simultaneously consider chemical species, temporal variability and PM sizes, whereas a two-way model independently computes datasets of different sizes. A method called source directional apportionment (SDA) was employed to quantify the contributions from various directions for each source category. Crustal dust from the east-north-east (ENE) contributed the most to both PM10 (12.7%) and PM2.5 (9.7%) in Chengdu, followed by crustal dust from the south-east (SE) for PM10 (9.8%) and secondary nitrate & secondary organic carbon from ENE for PM2.5 (9.6%). Source contributions from different directions are associated with meteorological conditions, source locations and emission patterns during the sampling period. These findings and methods provide useful tools to better understand PM pollution status and to develop effective pollution control strategies. Copyright © 2016. Published by Elsevier B.V.
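
    Receptor models such as PMF factor a samples-by-species concentration matrix into non-negative source contributions and source profiles. The sketch below uses plain (unweighted) NMF from scikit-learn on synthetic data as a stand-in; it omits the measurement-uncertainty weighting that distinguishes PMF/ME-2, and all dimensions and values are invented.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
# Synthetic data matrix: 200 samples x 15 chemical species (concentrations)
G_true = rng.gamma(2.0, 1.0, size=(200, 4))     # true source contributions
F_true = rng.gamma(2.0, 1.0, size=(4, 15))      # true source profiles
X = G_true @ F_true + rng.normal(0, 0.05, size=(200, 15)).clip(min=0)

# Unweighted NMF as a stand-in for PMF: X ~ G @ F with G, F >= 0
model = NMF(n_components=4, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)      # estimated contributions (samples x factors)
F = model.components_           # estimated profiles (factors x species)

# Average percentage contribution of each factor to total mass
mass_by_factor = (G * F.sum(axis=1)).sum(axis=0)
print(100 * mass_by_factor / mass_by_factor.sum())
```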

  5. Towards resiliency with micro-grids: Portfolio optimization and investment under uncertainty

    NASA Astrophysics Data System (ADS)

    Gharieh, Kaveh

    Energy security and a sustained supply of power are critical for community welfare and economic growth. In the face of the increased frequency and intensity of extreme weather conditions, which can result in power grid outages, the value of micro-grids for improving communities' power reliability and resiliency is becoming more important. The capability of micro-grids to operate in islanded mode under stressed conditions dramatically decreases the economic loss to critical infrastructure during power shortages. More widespread participation of micro-grids in the wholesale energy market in the near future makes the development of new investment models necessary. However, market and price risks in the short and long term, along with the impacts of risk factors, must be taken into consideration in developing new investment models. This work proposes a set of models and tools to address different problems associated with micro-grid assets, including optimal portfolio selection, investment and financing, at both the community level and for a sample critical infrastructure (a wastewater treatment plant). The models account for short-term operational volatilities and long-term market uncertainties. A number of analytical methodologies and financial concepts have been adopted to develop the aforementioned models, as follows. (1) Capital budgeting planning and portfolio optimization models with Monte Carlo stochastic scenario generation are applied to derive the optimal investment decision for a portfolio of micro-grid assets considering risk factors and multiple sources of uncertainty. (2) Real option theory, Monte Carlo simulation and stochastic optimization techniques are applied to obtain optimal modularized investment decisions for hydrogen tri-generation systems in wastewater treatment facilities, considering multiple sources of uncertainty. (3) A Public-Private Partnership (PPP) financing concept coupled with an investment-horizon approach is applied to estimate the public and private parties' revenue shares from a community-level micro-grid project over the assets' lifetime, considering their optimal operation under uncertainty.
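
    A minimal sketch of the scenario-based portfolio idea in item (1): enumerate asset combinations under a capital budget, simulate Monte Carlo NPV scenarios for each, and rank them by a mean-minus-risk score. The asset names, costs, benefit distributions, and risk-aversion weight are all hypothetical and are not taken from the dissertation.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical micro-grid asset candidates:
# (name, capital cost $M, mean annual net benefit $M, relative volatility)
assets = [("PV array", 3.0, 0.55, 0.15),
          ("Battery storage", 2.0, 0.35, 0.20),
          ("CHP unit", 4.0, 0.80, 0.30),
          ("Fuel cell", 2.5, 0.40, 0.35)]
budget, years, rate, n_scen, risk_aversion = 7.0, 15, 0.06, 5000, 0.5
discount = 1.0 / (1.0 + rate) ** np.arange(1, years + 1)

def npv_scenarios(selection):
    """Monte Carlo NPV of a portfolio: stochastic annual benefits minus upfront cost."""
    cost = sum(assets[i][1] for i in selection)
    benefits = np.zeros((n_scen, years))
    for i in selection:
        _, _, mu, sigma = assets[i]
        benefits += rng.normal(mu, sigma * mu, size=(n_scen, years))
    return benefits @ discount - cost

best = None
for r in range(1, len(assets) + 1):
    for sel in itertools.combinations(range(len(assets)), r):
        if sum(assets[i][1] for i in sel) > budget:
            continue                                  # respect the capital budget
        npv = npv_scenarios(sel)
        score = npv.mean() - risk_aversion * npv.std()  # mean-risk objective
        if best is None or score > best[0]:
            best = (score, sel)

print("best portfolio:", [assets[i][0] for i in best[1]], "score:", round(best[0], 2))
```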

  6. On the inclusion of mass source terms in a single-relaxation-time lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Aursjø, Olav; Jettestuen, Espen; Vinningland, Jan Ludvig; Hiorth, Aksel

    2018-05-01

    We present a lattice Boltzmann algorithm for incorporating a mass source in a fluid flow system. The proposed mass source/sink term, included in the lattice Boltzmann equation, maintains the Galilean invariance and the accuracy of the overall method, while introducing a mass source/sink term in the fluid dynamical equations. The method can, for instance, be used to inject or withdraw fluid from any preferred lattice node in a system. This suggests that injection and withdrawal of fluid does not have to be introduced through cumbersome, and sometimes less accurate, boundary conditions. The method also suggests that, through a chosen equation of state relating mass density to pressure, the proposed mass source term makes it possible to set a preferred pressure at any lattice node in a system. We demonstrate how this model handles injection and withdrawal of a fluid, and we show how it can be used to incorporate pressure boundaries. The accuracy of the algorithm is identified through a Chapman-Enskog expansion of the model and supported by numerical simulations.
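
    A minimal D2Q9 BGK sketch of the general idea, in which mass is injected at a single node by adding a weight-proportional term to the distributions after collision. This is the crudest possible variant, shown only to make the bookkeeping concrete; it is not the Galilean-invariant source term derived in the paper.

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
nx, ny, tau, steps = 64, 64, 0.8, 500
src_node, q_in = (32, 32), 1e-4            # injection node and mass added per step

def equilibrium(rho, ux, uy):
    cu = c[:, 0, None, None]*ux + c[:, 1, None, None]*uy
    usq = ux**2 + uy**2
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

f = equilibrium(np.ones((nx, ny)), np.zeros((nx, ny)), np.zeros((nx, ny)))
for _ in range(steps):
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    # BGK collision
    f += -(f - equilibrium(rho, ux, uy)) / tau
    # crude mass source: add weight-proportional mass at one node (illustrative only)
    f[:, src_node[0], src_node[1]] += w * q_in
    # streaming on a periodic lattice
    for i in range(9):
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)

print("total mass:", f.sum(), "expected ~", nx*ny + steps*q_in)
```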

  7. A UMLS-based spell checker for natural language processing in vaccine safety.

    PubMed

    Tolentino, Herman D; Matters, Michael D; Walop, Wikke; Law, Barbara; Tong, Wesley; Liu, Fang; Fontelo, Paul; Kohl, Katrin; Payne, Daniel C

    2007-02-12

    The Institute of Medicine has identified patient safety as a key goal for health care in the United States. Detecting vaccine adverse events is an important public health activity that contributes to patient safety. Reports about adverse events following immunization (AEFI) from surveillance systems contain free-text components that can be analyzed using natural language processing. To extract Unified Medical Language System (UMLS) concepts from free text and classify AEFI reports based on concepts they contain, we first needed to clean the text by expanding abbreviations and shortcuts and correcting spelling errors. Our objective in this paper was to create a UMLS-based spelling error correction tool as a first step in the natural language processing (NLP) pipeline for AEFI reports. We developed spell checking algorithms using open source tools. We used de-identified AEFI surveillance reports to create free-text data sets for analysis. After expansion of abbreviated clinical terms and shortcuts, we performed spelling correction in four steps: (1) error detection, (2) word list generation, (3) word list disambiguation and (4) error correction. We then measured the performance of the resulting spell checker by comparing it to manual correction. We used 12,056 words to train the spell checker and tested its performance on 8,131 words. During testing, sensitivity, specificity, and positive predictive value (PPV) for the spell checker were 74% (95% CI: 74-75), 100% (95% CI: 100-100), and 47% (95% CI: 46%-48%), respectively. We created a prototype spell checker that can be used to process AEFI reports. We used the UMLS Specialist Lexicon as the primary source of dictionary terms and the WordNet lexicon as a secondary source. We used the UMLS as a domain-specific source of dictionary terms to compare potentially misspelled words in the corpus. The prototype sensitivity was comparable to currently available tools, but the specificity was much superior. The slow processing speed may be improved by trimming it down to the most useful component algorithms. Other investigators may find the methods we developed useful for cleaning text using lexicons specific to their area of interest.

  8. A UMLS-based spell checker for natural language processing in vaccine safety

    PubMed Central

    Tolentino, Herman D; Matters, Michael D; Walop, Wikke; Law, Barbara; Tong, Wesley; Liu, Fang; Fontelo, Paul; Kohl, Katrin; Payne, Daniel C

    2007-01-01

    Background The Institute of Medicine has identified patient safety as a key goal for health care in the United States. Detecting vaccine adverse events is an important public health activity that contributes to patient safety. Reports about adverse events following immunization (AEFI) from surveillance systems contain free-text components that can be analyzed using natural language processing. To extract Unified Medical Language System (UMLS) concepts from free text and classify AEFI reports based on concepts they contain, we first needed to clean the text by expanding abbreviations and shortcuts and correcting spelling errors. Our objective in this paper was to create a UMLS-based spelling error correction tool as a first step in the natural language processing (NLP) pipeline for AEFI reports. Methods We developed spell checking algorithms using open source tools. We used de-identified AEFI surveillance reports to create free-text data sets for analysis. After expansion of abbreviated clinical terms and shortcuts, we performed spelling correction in four steps: (1) error detection, (2) word list generation, (3) word list disambiguation and (4) error correction. We then measured the performance of the resulting spell checker by comparing it to manual correction. Results We used 12,056 words to train the spell checker and tested its performance on 8,131 words. During testing, sensitivity, specificity, and positive predictive value (PPV) for the spell checker were 74% (95% CI: 74–75), 100% (95% CI: 100–100), and 47% (95% CI: 46%–48%), respectively. Conclusion We created a prototype spell checker that can be used to process AEFI reports. We used the UMLS Specialist Lexicon as the primary source of dictionary terms and the WordNet lexicon as a secondary source. We used the UMLS as a domain-specific source of dictionary terms to compare potentially misspelled words in the corpus. The prototype sensitivity was comparable to currently available tools, but the specificity was much superior. The slow processing speed may be improved by trimming it down to the most useful component algorithms. Other investigators may find the methods we developed useful for cleaning text using lexicons specific to their area of interest. PMID:17295907
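
    The four-step correction pipeline (error detection, word-list generation, disambiguation, correction) can be sketched with a small edit-similarity spell checker. The in-memory word-frequency table below merely stands in for the UMLS SPECIALIST Lexicon and WordNet, and the frequency-based disambiguation is a simplification of the authors' approach.

```python
import re
from difflib import get_close_matches

# Toy lexicon with corpus frequencies, standing in for the UMLS SPECIALIST
# Lexicon / WordNet (illustrative only).
lexicon = {"injection": 120, "site": 300, "erythema": 45, "fever": 500,
           "swelling": 80, "patient": 900, "developed": 400, "after": 1000}

def spell_correct(text):
    corrected = []
    for token in re.findall(r"[A-Za-z]+|\S", text.lower()):
        # Step 1: error detection - alphabetic tokens absent from the lexicon
        if not token.isalpha() or token in lexicon:
            corrected.append(token)
            continue
        # Step 2: word-list generation - near matches by string similarity
        candidates = get_close_matches(token, lexicon, n=3, cutoff=0.75)
        if not candidates:
            corrected.append(token)          # leave unknown words untouched
            continue
        # Steps 3-4: disambiguation by corpus frequency, then correction
        corrected.append(max(candidates, key=lambda c: lexicon[c]))
    return " ".join(corrected)

print(spell_correct("Patient developped fevre and sweling at the injction site"))
```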

  9. Modeling individual differences in working memory performance: a source activation account

    PubMed Central

    Daily, Larry Z.; Lovett, Marsha C.; Reder, Lynne M.

    2008-01-01

    Working memory resources are needed for processing and maintenance of information during cognitive tasks. Many models have been developed to capture the effects of limited working memory resources on performance. However, most of these models do not account for the finding that different individuals show different sensitivities to working memory demands, and none of the models predicts individual subjects' patterns of performance. We propose a computational model that accounts for differences in working memory capacity in terms of a quantity called source activation, which is used to maintain goal-relevant information in an available state. We apply this model to capture the working memory effects of individual subjects at a fine level of detail across two experiments. This, we argue, strengthens the interpretation of source activation as working memory capacity. PMID:19079561
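
    A rough sketch of the source-activation idea, assuming an ACT-R-style formulation in which a fixed capacity W is divided among the items currently maintained in the goal: per-item activation, and hence retrieval probability, falls as either W decreases (lower-capacity individuals) or memory load rises. The functional form and numbers below are illustrative, not the authors' fitted model.

```python
import numpy as np

def recall_prob(W, load, base=0.0, strength=1.5, noise_s=0.45, threshold=0.8):
    """Illustrative ACT-R-style source activation (not the authors' exact model).
    A fixed capacity W is divided among the `load` items held in the goal, so an
    individual item receives activation base + (W / load) * strength; recall
    probability is a logistic function of activation above a retrieval threshold."""
    activation = base + (W / load) * strength
    return 1.0 / (1.0 + np.exp(-(activation - threshold) / noise_s))

for W in (0.7, 1.0, 1.3):          # lower W models a lower-capacity individual
    print(W, [round(recall_prob(W, load), 2) for load in (2, 4, 6)])
```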

  10. Multiple Kernel Learning with Random Effects for Predicting Longitudinal Outcomes and Data Integration

    PubMed Central

    Chen, Tianle; Zeng, Donglin

    2015-01-01

    Summary Predicting disease risk and progression is one of the main goals in many clinical research studies. Cohort studies on the natural history and etiology of chronic diseases span years and data are collected at multiple visits. Although kernel-based statistical learning methods are proven to be powerful for a wide range of disease prediction problems, these methods are only well studied for independent data but not for longitudinal data. It is thus important to develop time-sensitive prediction rules that make use of the longitudinal nature of the data. In this paper, we develop a novel statistical learning method for longitudinal data by introducing subject-specific short-term and long-term latent effects through a designed kernel to account for within-subject correlation of longitudinal measurements. Since the presence of multiple sources of data is increasingly common, we embed our method in a multiple kernel learning framework and propose a regularized multiple kernel statistical learning with random effects to construct effective nonparametric prediction rules. Our method allows easy integration of various heterogeneous data sources and takes advantage of correlation among longitudinal measures to increase prediction power. We use different kernels for each data source taking advantage of the distinctive feature of each data modality, and then optimally combine data across modalities. We apply the developed methods to two large epidemiological studies, one on Huntington's disease and the other on Alzheimer's Disease (Alzheimer's Disease Neuroimaging Initiative, ADNI) where we explore a unique opportunity to combine imaging and genetic data to study prediction of mild cognitive impairment, and show a substantial gain in performance while accounting for the longitudinal aspect of the data. PMID:26177419
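
    A stripped-down sketch of combining per-modality kernels: an "imaging" kernel and a "genetic" kernel are mixed with fixed weights and used in kernel ridge regression on synthetic data. The actual method additionally learns the kernel weights and includes subject-specific random effects to capture longitudinal correlation, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 120
X_img = rng.standard_normal((n, 10))      # stand-in for imaging features
X_gen = rng.standard_normal((n, 50))      # stand-in for genetic features
y = X_img[:, 0] - 0.5 * X_gen[:, 1] + 0.3 * rng.standard_normal(n)

def rbf(X, gamma):
    """Gaussian (RBF) kernel matrix for one data modality."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Combine per-modality kernels with fixed weights (the paper learns these weights)
beta = (0.6, 0.4)
K = beta[0] * rbf(X_img, 0.1) + beta[1] * rbf(X_gen, 0.02)

# Kernel ridge regression on the combined kernel
lam = 1.0
alpha = np.linalg.solve(K + lam * np.eye(n), y)
y_hat = K @ alpha
print("in-sample correlation:", round(np.corrcoef(y, y_hat)[0, 1], 3))
```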

  11. Erratum to Surface-wave Green’s tensors in the near field

    USGS Publications Warehouse

    Haney, Matthew M.; Nakahara, Hisashi

    2016-01-01

    Haney and Nakahara (2014) derived expressions for surface‐wave Green’s tensors that included near‐field behavior. Building on the result for a force source, Haney and Nakahara (2014) further derived expressions for a general point moment tensor source using the exact Green’s tensors. However, it has come to our attention that, although the Green’s tensors were correct, the resulting expressions for a general point moment tensor source were missing some terms. In this erratum, we provide updated expressions with these missing terms. The inclusion of the missing terms changes the example given in Haney and Nakahara (2014).

  12. Clinical and analytical characteristics and short-term evolution of enteroviral meningitis in young infants presenting with fever without source.

    PubMed

    Gomez, Borja; Mintegi, Santiago; Rubio, Mari Cruz; Garcia, Diego; Garcia, Silvia; Benito, Javier

    2012-06-01

    The objective of this study was to describe the characteristics of enteroviral meningitis diagnosed in a pediatric emergency department among infants younger than 3 months with fever without source, and its short-term evolution. This was a retrospective, cross-sectional, 6-year descriptive study including all infants younger than 3 months who presented with fever without source and who were diagnosed with enteroviral meningitis. A lumbar puncture was performed at the first emergency visit in 398 (29.5%) of 1348 infants, and 65 (4.8%) were diagnosed with enteroviral meningitis, 33 of them (50.7%) between May and July. Among these 65 infants, 61 were classified as well-appearing; parents reported irritability in 16 (25.3%) of them (without statistical significance when compared with infants without meningitis). Forty-one (63.0%) had no altered infectious parameters (white blood cell [WBC] count between 5000 and 15,000/μL, absolute neutrophil count less than 10,000/μL, and C-reactive protein less than 20 g/L), and 39 (60%) had no pleocytosis. All of the 65 infants recovered well, and none of them developed short-term complications. The symptoms in infants younger than 3 months with enteroviral meningitis were similar to those in infants with a self-limited febrile process without intracranial infection. C-reactive protein and WBC count were not good predictors of enteroviral meningitis. Cerebrospinal fluid WBC count was normal in many of these infants, so performing a viral test is recommended for febrile infants younger than 3 months in whom a lumbar puncture is performed during the warm months. The short-term evolution was benign.

  13. Phosphatidylcholine supplementation in pregnant women consuming moderate-choline diets does not enhance infant cognitive function: a randomized, double-blind, placebo-controlled trial

    PubMed Central

    Goldman, Barbara Davis; Fischer, Leslie M; da Costa, Kerry-Ann; Reznick, J Steven; Zeisel, Steven H

    2012-01-01

    Background: Choline is essential for fetal brain development, and it is not known whether a typical American diet contains enough choline to ensure optimal brain development. Objective: The study was undertaken to determine whether supplementing pregnant women with phosphatidylcholine (the main dietary source of choline) improves the cognitive abilities of their offspring. Design: In a double-blind, randomized controlled trial, 140 pregnant women were randomly assigned to receive supplemental phosphatidylcholine (750 mg) or a placebo (corn oil) from 18 wk gestation through 90 d postpartum. Their infants (n = 99) were tested for short-term visuospatial memory, long-term episodic memory, language development, and global development at 10 and 12 mo of age. Results: The women studied ate diets that delivered ∼360 mg choline/d in foods (∼80% of the recommended intake for pregnant women, 65% of the recommended intake for lactating women). The phosphatidylcholine supplements were well tolerated. Groups did not differ significantly in global development, language development, short-term visuospatial memory, or long-term episodic memory. Conclusions: Phosphatidylcholine supplementation of pregnant women eating diets containing moderate amounts of choline did not enhance their infants’ brain function. It is possible that a longer follow-up period would reveal late-emerging effects. Moreover, future studies should determine whether supplementing mothers eating diets much lower in choline content, such as those consumed in several low-income countries, would enhance infant brain development. This trial was registered at clinicaltrials.gov as NCT00678925. PMID:23134891

  14. The influence of initial conditions on dispersion and reactions

    NASA Astrophysics Data System (ADS)

    Wood, B. D.

    2016-12-01

    In various generalizations of the reaction-dispersion problem, researchers have developed frameworks in which the apparent dispersion coefficient can be negative. Such dispersion coefficients raise several difficult questions. Most importantly, the presence of a negative dispersion coefficient at the macroscale leads to a macroscale representation that exhibits an apparent decrease in entropy with increasing time, which appears to violate basic thermodynamic principles. In addition, the proposition of a negative dispersion coefficient leads to an inherently ill-posed mathematical transport equation. The ill-posedness arises because, if discontinuous initial conditions are allowed, there is no unique initial condition that corresponds to a later-time concentration distribution. In this presentation, we explain how negative dispersion coefficients actually arise because the governing differential equation for early times should, when derived correctly, incorporate a term that depends upon the initial and boundary conditions. Reactions introduce a similar phenomenon, where the structure of the initial and boundary conditions influences the form of the macroscopic balance equations. When upscaling is done properly, new equations are developed that include source terms that are not present in the classical (late-time) reaction-dispersion equation. These source terms depend upon the structure of the initial condition of the reacting species, and they decrease exponentially in time (thus, the equations converge to the conventional ones at asymptotic times). With this formulation, the resulting dispersion tensor is always positive-semi-definite, and the reaction terms directly incorporate information about the state of mixedness of the system. This formulation avoids many of the problems that would be engendered by defining negative-definite dispersion tensors, and it properly represents the effective rate of reaction at early times.
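
    Schematically, the upscaled balance described above takes a form like the following, where the extra source term is set by the initial condition and decays in time; the simple exponential decay and the symbols used here are illustrative placeholders rather than the paper's derived closure:

```latex
\frac{\partial \langle c \rangle}{\partial t}
  + v\,\frac{\partial \langle c \rangle}{\partial x}
  = \frac{\partial}{\partial x}\!\left( D^{*}(t)\,\frac{\partial \langle c \rangle}{\partial x} \right)
  - k^{*}(t)\,\langle c \rangle
  + s_{0}(x)\, e^{-t/\tau},
\qquad D^{*}(t) \ge 0 .
```

    Here s_0(x) carries the memory of the initial condition and tau is a mixing time scale; as t grows the extra term vanishes and the classical reaction-dispersion equation, with a non-negative dispersion coefficient, is recovered.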

  15. Scientific and technical challenges on the road towards fusion electricity

    NASA Astrophysics Data System (ADS)

    Donné, A. J. H.; Federici, G.; Litaudon, X.; McDonald, D. C.

    2017-10-01

    The goal of the European Fusion Roadmap is to deliver fusion electricity to the grid early in the second half of this century. It breaks the quest for fusion energy into eight missions, and for each of them it describes a research and development programme to address all the open technical gaps in physics and technology and estimates the required resources. It points out the need to intensify industrial involvement and to seek all opportunities for collaboration outside Europe. The roadmap covers three periods: the short term, which runs parallel to the European Research Framework Programme Horizon 2020, the medium term and the long term. ITER is the key facility of the roadmap as it is expected to achieve most of the important milestones on the path to fusion power. Thus, the vast majority of present resources are dedicated to ITER and its accompanying experiments. The medium term is focussed on taking ITER into operation and bringing it to full power, as well as on preparing the construction of a demonstration power plant, DEMO, which will for the first time demonstrate fusion electricity to the grid around the middle of this century. Building and operating DEMO is the subject of the last roadmap phase: the long term. Clearly, the Fusion Roadmap is tightly connected to the ITER schedule. Three key milestones are the first operation of ITER, the start of the DT operation in ITER and reaching the full performance at which the thermal fusion power is 10 times the power put into the plasma. The Engineering Design Activity of DEMO needs to start a few years after the first ITER plasma, while the start of the construction phase will be a few years after ITER reaches full performance. In this way ITER can give viable input to the design and development of DEMO. Because the neutron fluence in DEMO will be much higher than in ITER, it is important to develop and validate materials that can handle these very high neutron loads. For the testing of the materials, a dedicated 14 MeV neutron source is needed. This DEMO Oriented Neutron Source (DONES) is therefore an important facility to support the fusion roadmap.

  16. Globalization, women's migration, and the long-term-care workforce.

    PubMed

    Browne, Colette V; Braun, Kathryn L

    2008-02-01

    With the aging of the world's population comes a rising need for qualified direct long-term-care (DLTC) workers (i.e., those who provide personal care to frail and disabled older adults). Developed nations are increasingly turning to immigrant women to fill these needs. In this article, we examine the impact of three global trends (population aging, globalization, and women's migration) on the supply of and demand for DLTC workers in the United States. Following an overview of these trends, we identify three areas with embedded social justice issues that are shaping the DLTC workforce in the United States, with a specific focus on immigrant workers in these settings. These include world poverty and economic inequalities, the feminization and colorization of labor (especially in long-term care), and empowerment and women's rights. We conclude with a discussion of the contradictory effects that both population aging and globalization have on immigrant women, source countries, and the long-term-care workforce in the United States. We raise a number of policy, practice, and research implications and questions. For policy makers and long-term-care administrators in receiver nations such as the United States, meeting DLTC worker needs with immigrants may result in greater access to needed employees but also in the continued devaluation of eldercare as a profession. Source (supply) nations must balance the real and potential economic benefits of remittances from women who migrate for labor with the negative consequences of disrupting family care traditions and draining those countries' long-term-care workforces.

  17. Application of Geodetic VLBI Data to Obtaining Long-Term Light Curves for Astrophysics

    NASA Technical Reports Server (NTRS)

    Kijima, Masachika

    2010-01-01

    The long-term light curve is important for research on binary black holes and disk instability in AGNs. Light curves have been drawn mainly using single-dish data provided by the University of Michigan Radio Observatory and the Metsahovi Radio Observatory; hence, research has so far been limited to the sources monitored by those observatories. I attempt to draw light curves using VLBI data for sources that have not been monitored by any single-dish observatory. I developed software, analyzed all geodetic VLBI data available at the IVS Data Centers, and drew light curves at 8 GHz. In this report, I show tentative results for two AGNs. I compared two light curves of 4C39.25, one drawn from single-dish data and one from VLBI data, and confirmed that the two light curves were consistent. Furthermore, I succeeded in drawing the light curve of 0454-234 with VLBI data, a source that has not been monitored by any single-dish observatory. In this report, I suggest that the geodetic VLBI archive data are useful for obtaining long-term light curves at radio bands for astrophysics.

  18. Thinking the "unthinkable": why Philip Morris considered quitting

    PubMed Central

    Smith, E; Malone, R

    2003-01-01

    Objective: To investigate the genesis and development of tobacco company Philip Morris's recent image enhancement strategies and analyse their significance. Data sources: Internal Philip Morris documents, made available by the terms of the Master Settlement Agreement between the tobacco companies and the attorneys general of 46 states, and secondary newspaper sources. Study selection: Searches of the Philip Morris documents website (www.pmdocs.com) began with terms such as "image management" and "identity" and expanded as relevant new terms (consultant names, project names, and dates) were identified, using a "snowball" sampling strategy. Findings and conclusions: In the early 1990s, Philip Morris, faced with increasing pressures generated both externally, from the non-smokers' rights and public health communities, and internally, from the conflicts among its varied operating companies, seriously considered leaving the tobacco business. Discussions of this option, which occurred at the highest levels of management, focused on the changing social climate regarding tobacco and smoking that the tobacco control movement had effected. However, this option was rejected in favour of the image enhancement strategy that culminated with the recent "Altria" name change. This analysis suggests that advocacy efforts have the potential to significantly denormalise tobacco as a corporate enterprise. PMID:12773733

  19. New VLBI2010 scheduling strategies and implications on the terrestrial reference frames.

    PubMed

    Sun, Jing; Böhm, Johannes; Nilsson, Tobias; Krásná, Hana; Böhm, Sigrid; Schuh, Harald

    In connection with the work for the next generation VLBI2010 Global Observing System (VGOS) of the International VLBI Service for Geodesy and Astrometry, a new scheduling package (Vie_Sched) has been developed at the Vienna University of Technology as a part of the Vienna VLBI Software. In addition to the classical station-based approach it is equipped with a new scheduling strategy based on the radio sources to be observed. We introduce different configurations of source-based scheduling options and investigate the implications on present and future VLBI2010 geodetic schedules. By comparison to existing VLBI schedules of the continuous campaign CONT11, we find that the source-based approach with two sources has a performance similar to the station-based approach in terms of number of observations, sky coverage, and geodetic parameters. For an artificial 16 station VLBI2010 network, the source-based approach with four sources provides an improved distribution of source observations on the celestial sphere. Monte Carlo simulations yield slightly better repeatabilities of station coordinates with the source-based approach with two sources or four sources than the classical strategy. The new VLBI scheduling software with its alternative scheduling strategy offers a promising option with respect to applications of the VGOS.

  20. Open Source Paradigm: A Synopsis of The Cathedral and the Bazaar for Health and Social Care.

    PubMed

    Benson, Tim

    2016-07-04

    Open source software (OSS) is becoming more fashionable in health and social care, although the ideas are not new. However, progress has been slower than many had expected. The purpose is to summarise the Free/Libre Open Source Software (FLOSS) paradigm in terms of what it is, how it impacts users and software engineers, and how it can work as a business model in the health and social care sectors. Much of this paper is a synopsis of Eric Raymond's seminal book The Cathedral and the Bazaar, which was the first comprehensive description of the open source ecosystem, set out in three long essays. Direct quotes from the book are used liberally, without reference to specific passages. The first part contrasts open and closed source approaches to software development and support. The second part describes the culture and practices of the open source movement. The third part considers business models. A key benefit of open source is that users can access the source code and collaborate on improving the software if they wish. Closed source code may be regarded as a strategic business risk that may be unacceptable if there is an open source alternative. The sharing culture of the open source movement fits well with that of health and social care.

  1. New VLBI2010 scheduling strategies and implications on the terrestrial reference frames

    NASA Astrophysics Data System (ADS)

    Sun, Jing; Böhm, Johannes; Nilsson, Tobias; Krásná, Hana; Böhm, Sigrid; Schuh, Harald

    2014-05-01

    In connection with the work for the next generation VLBI2010 Global Observing System (VGOS) of the International VLBI Service for Geodesy and Astrometry, a new scheduling package (Vie_Sched) has been developed at the Vienna University of Technology as a part of the Vienna VLBI Software. In addition to the classical station-based approach it is equipped with a new scheduling strategy based on the radio sources to be observed. We introduce different configurations of source-based scheduling options and investigate the implications on present and future VLBI2010 geodetic schedules. By comparison to existing VLBI schedules of the continuous campaign CONT11, we find that the source-based approach with two sources has a performance similar to the station-based approach in terms of number of observations, sky coverage, and geodetic parameters. For an artificial 16 station VLBI2010 network, the source-based approach with four sources provides an improved distribution of source observations on the celestial sphere. Monte Carlo simulations yield slightly better repeatabilities of station coordinates with the source-based approach with two sources or four sources than the classical strategy. The new VLBI scheduling software with its alternative scheduling strategy offers a promising option with respect to applications of the VGOS.

  2. Performance Analysis of Physical Layer Security of Opportunistic Scheduling in Multiuser Multirelay Cooperative Networks

    PubMed Central

    Shim, Kyusung; Do, Nhu Tri; An, Beongku

    2017-01-01

    In this paper, we study the physical layer security (PLS) of opportunistic scheduling for uplink scenarios of multiuser multirelay cooperative networks. To this end, we propose a low-complexity source relay selection scheme with comparable secrecy performance, called the proposed source relay selection (PSRS) scheme. Specifically, the PSRS scheme first selects the least vulnerable source and then selects the relay that maximizes the system secrecy capacity for the given selected source. Additionally, the maximal ratio combining (MRC) technique and the selection combining (SC) technique are each considered at the eavesdropper. To investigate the system performance in terms of secrecy outage probability (SOP), closed-form expressions for the SOP are derived. The developed analysis is corroborated through Monte Carlo simulation. Numerical results show that the PSRS scheme significantly improves the secrecy performance of the system compared to that of the random source relay selection scheme, but does not outperform the optimal joint source relay selection (OJSRS) scheme. However, the PSRS scheme drastically reduces the required number of channel state information (CSI) estimations compared to that required by the OJSRS scheme, especially in dense cooperative networks. PMID:28212286
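
    A toy version of the two-stage selection, assuming Rayleigh-fading (exponentially distributed) channel power gains, a decode-and-forward bottleneck on the main link, and MRC at the eavesdropper: pick the source with the weakest link to the eavesdropper, then the relay maximizing the resulting secrecy capacity. The gain model and SNR bookkeeping are simplified relative to the paper's exact system model.

```python
import numpy as np

rng = np.random.default_rng(4)
n_src, n_rel, P = 4, 3, 10.0       # sources, relays, transmit SNR (linear scale)

# Exponential (Rayleigh-fading power) gains: source->relay, relay->destination,
# source->eavesdropper, relay->eavesdropper
g_sr = rng.exponential(1.0, (n_src, n_rel))
g_rd = rng.exponential(1.0, n_rel)
g_se = rng.exponential(1.0, n_src)
g_re = rng.exponential(1.0, n_rel)

# Stage 1: least vulnerable source = weakest source-to-eavesdropper link
s = int(np.argmin(g_se))

def secrecy_capacity(s, r):
    """Secrecy capacity for source s relayed by r under the stated assumptions."""
    snr_main = min(P * g_sr[s, r], P * g_rd[r])   # decode-and-forward bottleneck
    snr_eve = P * g_se[s] + P * g_re[r]           # MRC at the eavesdropper
    return max(np.log2(1 + snr_main) - np.log2(1 + snr_eve), 0.0)

# Stage 2: relay maximizing the secrecy capacity for the selected source
r = int(np.argmax([secrecy_capacity(s, r) for r in range(n_rel)]))
print(f"selected source {s}, relay {r}, secrecy capacity "
      f"{secrecy_capacity(s, r):.3f} bits/s/Hz")
```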

  3. Local tsunamis and earthquake source parameters

    USGS Publications Warehouse

    Geist, Eric L.; Dmowska, Renata; Saltzman, Barry

    1999-01-01

    This chapter establishes the relationship among earthquake source parameters and the generation, propagation, and run-up of local tsunamis. In general terms, displacement of the seafloor during the earthquake rupture is modeled using the elastic dislocation theory for which the displacement field is dependent on the slip distribution, fault geometry, and the elastic response and properties of the medium. Specifically, nonlinear long-wave theory governs the propagation and run-up of tsunamis. A parametric study is devised to examine the relative importance of individual earthquake source parameters on local tsunamis, because the physics that describes tsunamis from generation through run-up is complex. Analysis of the source parameters of various tsunamigenic earthquakes have indicated that the details of the earthquake source, namely, nonuniform distribution of slip along the fault plane, have a significant effect on the local tsunami run-up. Numerical methods have been developed to address the realistic bathymetric and shoreline conditions. The accuracy of determining the run-up on shore is directly dependent on the source parameters of the earthquake, which provide the initial conditions used for the hydrodynamic models.
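
    As a small numeric example of the kind of source parameters involved, the standard textbook relations between moment magnitude, seismic moment, and average fault slip (M0 = mu * A * D, Mw = (2/3)(log10 M0 - 9.1)) can be used to back out the mean slip for an assumed rupture geometry; the magnitude, fault dimensions, and shear modulus below are hypothetical and are not taken from the chapter.

```python
# Standard source-parameter relations (not specific to this chapter):
#   M0 = mu * A * D_bar      (seismic moment, N*m)
#   Mw = (2/3) * (log10(M0) - 9.1)
mu = 3.0e10                      # assumed crustal shear modulus, Pa
length_km, width_km = 100.0, 50.0
Mw = 8.0                         # hypothetical subduction-zone earthquake

M0 = 10 ** (1.5 * Mw + 9.1)                      # invert the magnitude relation
area = (length_km * 1e3) * (width_km * 1e3)      # rupture area, m^2
avg_slip = M0 / (mu * area)                      # average slip over the fault, m

print(f"M0 = {M0:.2e} N*m, average slip = {avg_slip:.1f} m")
```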

  4. Early-life skin microbiota in hospitalized preterm and full-term infants.

    PubMed

    Younge, Noelle E; Araújo-Pérez, Félix; Brandon, Debra; Seed, Patrick C

    2018-05-31

    The infant skin microbiota may serve as a reservoir of bacteria that contribute to neonatal infections and stimulate local and systemic immune development. The objectives of our study were to characterize the skin microbiota of preterm and full-term infants during their birth hospitalization and describe its relationship to the microbiota of other body sites and the hospital environment. We conducted a cross-sectional study of 129 infants, including 40 preterm and 89 full-term infants. Samples were collected from five sites: the forehead and posterior auricular scalp (skin upper body); the periumbilical region, inguinal folds, and upper thighs (skin lower body); the oral cavity; the infant's immediate environment; and stool. Staphylococcus, Streptococcus, Enterococcus, and enteric Gram-negative bacteria including Escherichia and Enterobacter dominated the skin microbiota. The preterm infant microbiota at multiple sites had lower alpha diversity and greater enrichment with Staphylococcus and Escherichia than the microbiota of comparable sites in full-term infants. The community structure was highly variable among individuals but differed significantly by body site, postnatal age, and gestational age. Source tracking indicated that each body site both contributed to and received microbiota from other body sites and the hospital environment. The skin microbiota of preterm and full-term infants varied across individuals, by body site, and by the infant's developmental stage. The skin harbored many organisms that are common pathogens in hospitalized infants. Bacterial source tracking suggests that microbiota are commonly exchanged across body sites and the hospital environment as microbial communities mature in infancy.

  5. Development of radon sources with a high stability and a wide range

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fukutsu, K.; Yamada, Y.

    A solid 222Rn (radon) source using a fibrous and porous SiC ceramic disk was developed. The emission rate of radon emanating from the disk depended on the 226Ra content and the sintering temperature. A 226Ra sulfate (226RaSO4) solution was dropped on a fibrous SiC ceramic disk of 33 mm diameter and 1 mm thickness, which was then sintered at 400 °C. The radon concentration from a disk containing 1.85 MBq of 226Ra was measured to be 38 kBq/m^3 at a carrier airflow rate of 0.5 L/min. By adjusting the 226Ra content or the sweep airflow rate, the radon concentration was easily controlled over a range of more than three orders of magnitude. The concentration was very stable over the long term. The compactness of the source disk made it easy to handle the source container and to shield the gamma radiation from 226Ra and its decay products. Such advantages in a radon generation system are desirable for experiments involving high-level, large-scale radon exposure.

  6. Sources of cigarettes among adolescent smokers: Free or purchased?

    PubMed

    Jansen, Paul; Toomey, Traci L; Nelson, Toben F; Fabian, Lindsey E A; Lenk, Kathleen M; Forster, Jean L

    Few studies have described youth cigarette sources in terms of whether the cigarettes were free or purchased. Understanding the different ways youth obtain tobacco can guide development of interventions to more effectively reduce youth smoking. Our objective was to determine the propensity for youth to purchase cigarettes versus obtain cigarettes for free, and the factors associated with each obtainment method. Our sample included 812 youth ages 12-17 who reported ever smoking a whole cigarette. Our outcome was the source of the last cigarette smoked (purchased vs. free) and independent variables included demographics, smoking behaviors, and smoking status of parents/siblings/friends. We conducted logistic regression to assess relationships between the outcome and independent variables. Eighty-four percent of youth obtained their last cigarette for free and 16% purchased their last cigarette. Youth who smoked less and had less weekly spending money were more likely to have obtained their last cigarette for free. Youth smokers appear to have a high propensity to obtain their cigarettes for free, particularly those who smoke relatively infrequently. Interventions that target sources of free cigarettes have the potential to reduce the progression of youth smoking at a critical stage in its development.

  7. Design of reflector contours to satisfy photometric criteria using physically realizable light sources

    NASA Astrophysics Data System (ADS)

    Spencer, Domina E.

    2001-11-01

    Traditionally, reflector design has been confined to the use of surfaces defined in terms of conic sections, assuming that all light sources can be considered to be point sources. In the middle of the twentieth century, it was recognized that major improvements could be made if the shape of the reflector was designed to produce a desired distribution of light from an actual light source. Cylindrical reflectors were created which illuminated airport runways using fluorescent lamps in such a way that pilots could make visual landings safely even in fog. These reflector contours were called macrofocal parabolic cylinders. Other new reflector contours introduced were macrofocal elliptic cylinders, which confined the light to long rectangles. Surfaces of revolution of the fourth degree were also developed, which made possible uniform floodlighting of a circular region. These were called horned and peaked quartics. The optimum solution of the automotive headlighting problem has not yet been found. The paper concludes with a discussion of the possibility of developing reflectors which are neither cylindrical nor rotational but will produce the optimum field of view for the automobile driver both in clear weather and in fog.

  8. Meeting China's electricity needs through clean energy sources: A 2030 low-carbon energy roadmap

    NASA Astrophysics Data System (ADS)

    Hu, Zheng

    China is undergoing rapid economic development that generates a significant increase in energy demand, primarily for electricity. China's energy supply relies heavily on coal, which leads to high carbon emissions. This dissertation explores opportunities for meeting China's growing power demand through clean energy sources. The utilization of China's clean energy sources, as well as demand-side management, is still at an initial phase; the development of clean energy sources would therefore require substantial government support in order to be competitive in the market. One widely used means of incorporating clean energy into power-sector planning is Integrated Resource Strategic Planning (IRSP), which aims to minimize long-term electricity costs while screening various power supply options in the supply and demand analysis. The IRSP tool tackles the energy problem from the perspective of power sector regulators and provides different policy scenarios to quantify the impacts of combined incentives. Through three scenario studies, Business as Usual, High Renewable, and Renewable and Demand Side Management, this dissertation identifies the optimized scenario for China to achieve its 2030 clean energy target. The scenarios are assessed along energy, economic, environmental, and equity dimensions.

  9. What is What in the Nanoworld: A Handbook on Nanoscience and Nanotechnology

    NASA Astrophysics Data System (ADS)

    Borisenko, Victor E.; Ossicini, Stefano

    2004-10-01

    This introductory reference handbook summarizes the terms and definitions, most important phenomena, and regularities discovered in the physics, chemistry, technology, and application of nanostructures. These nanostructures are typically inorganic and organic structures at the atomic scale. Fast-progressing nanoelectronics and optoelectronics, molecular electronics and spintronics, and nanotechnology and quantum processing of information are of strategic importance for the information society of the 21st century. The short form of information taken from textbooks, special encyclopedias, and recent original books and papers provides fast support in understanding "old" and new terms of nanoscience and technology widely used in the scientific literature on recent developments. Such support is indeed important when one reads a scientific paper presenting new results in nanoscience. A representative collection of fundamental terms and definitions from quantum physics and quantum chemistry, special mathematics, organic and inorganic chemistry, solid state physics, and materials science and technology is accompanied by recommended secondary sources (books, reviews, websites) for an extended study of a subject. Each entry interprets the term or definition under consideration and briefly presents the main features of the phenomena behind it. Additional information in the form of notes ("First described in: …", "Recognition: …", "More details in: …") supplements the entries and gives a historical retrospective of the subject with references to further sources. The handbook is ideal for answering questions of undergraduate and PhD students studying the physics of low-dimensional structures, nanoelectronics, and nanotechnology that relate to unfamiliar terms and definitions. It provides fast support when one wants to know or recall the essence of a scientific term, especially when it contains a personal name in its title, as in "Anderson localization", "Aharonov-Bohm effect", or "Bose-Einstein condensate". More than 1000 entries, from a few sentences to a page in length.

  10. Theoretical and Experimental Aspects of Acoustic Modelling of Engine Exhaust Systems with Applications to a Vacuum Pump

    NASA Astrophysics Data System (ADS)

    Sridhara, Basavapatna Sitaramaiah

    In an internal combustion engine, the engine is the noise source and the exhaust pipe is the main transmitter of noise. Mufflers are often used to reduce the engine noise level in the exhaust pipe. To optimize a muffler design, a series of experiments could be conducted using various mufflers installed in the exhaust pipe, measuring the radiated sound pressure for each configuration. However, this is not a very efficient method. A second approach would be to develop a scheme involving only a few measurements which can predict the radiated sound pressure at a specified distance from the open end of the exhaust pipe. In this work, the engine exhaust system was modelled as a lumped source-muffler-termination system. An expression for the predicted sound pressure level was derived in terms of the source and termination impedances, and the muffler geometry. The pressure source and monopole radiation models were used for the source and the open end of the exhaust pipe. The four-pole parameters were used to relate the acoustic properties at two different cross sections of the muffler and the pipe. The developed formulation was verified through a series of experiments. Two loudspeakers and a reciprocating-type vacuum pump were used as sound sources during the tests. The source impedance was measured using the direct, two-load and four-load methods. A simple expansion chamber and a side-branch resonator were used as mufflers. Sound pressure level measurements for the prediction scheme were made for several source-muffler and source-straight pipe combinations. The predicted and measured sound pressure levels were compared for all cases considered. In all cases, correlation of the experimental results and those predicted by the developed expressions was good. Predicted and measured values of the insertion loss of the mufflers were compared, and the agreement between the two was good. Also, an error analysis of the four-load method was done.
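
    For reference, the four-pole (transfer matrix) description mentioned above can be written, for the simplest case of a uniform pipe element of length L and cross-sectional area S in a stationary medium, in the standard textbook form below; the thesis' formulation, which includes source and termination impedances and the four-pole parameters of actual mufflers, is more general.

      \begin{pmatrix} p_1 \\ U_1 \end{pmatrix} =
      \begin{pmatrix} \cos kL & j\,\dfrac{\rho_0 c}{S}\,\sin kL \\[4pt]
                      j\,\dfrac{S}{\rho_0 c}\,\sin kL & \cos kL \end{pmatrix}
      \begin{pmatrix} p_2 \\ U_2 \end{pmatrix}, \qquad k = \omega / c,

    where p is the acoustic pressure, U the acoustic volume velocity, rho_0 the mean density, and c the speed of sound.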

  11. Development and evaluation of a lightweight sensor system ...

    EPA Pesticide Factsheets

    A new sensor system for mobile and aerial emission sampling was developed for open-area pollutant sources, such as prescribed forest burns. The sensor system, termed "Kolibri", consists of multiple low-cost air quality sensors measuring CO2 and CO, samplers for particulate matter with a diameter of 2.5 µm or less (PM2.5), and samplers for volatile organic compounds (VOCs). This extended abstract, intended for oral or poster presentation at this summer's AWMA conference, presents some of the first verification data from laboratory and burn calibration of the newly developed sensor and sampler system for ground and aerial sampling.

  12. A Benchmark Study of Large Contract Supplier Monitoring Within DOD and Private Industry

    DTIC Science & Technology

    1994-03-01

    Only fragments of the report text are available. The table of contents lists sections on long-term supplier relationships, global sourcing, and refocusing on customer quality, and the abstract fragments indicate that private-industry initiatives (supplier monitoring and recognition, a reduced number of suppliers, global sourcing, and long-term contractor relationships) were benchmarked against DCMC large-contract supplier monitoring practices. Subject terms: Benchmark Study of Large Contract Supplier Monitoring.

  13. SNAP 19 Pioneer F and G. Final Report

    DOE R&D Accomplishments Database

    1973-06-01

    The generator developed for the Pioneer mission evolved from the SNAP 19 RTGs launched aboard the NIMBUS III spacecraft. In order to satisfy the power requirements and environment of an earth-escape trajectory, significant modifications were made to the thermoelectric converter, heat source, and structural configuration. Specifically, a TAGS-2N thermoelectric couple was designed to provide higher efficiency and improved long-term power performance, and the electrical circuitry was modified to yield a very low magnetic field from current flow in the RTG. A new heat source was employed to satisfy operational requirements, and its integration with the generator required alteration of the method of providing support to the fuel capsule.

  14. Creative industry in supporting economy growth in Indonesia: Perspective of regional innovation system

    NASA Astrophysics Data System (ADS)

    Hidayat, AR R. T.; Asmara, A. Y.

    2017-06-01

    The creative industry has been one of the most influential sources of economic growth worldwide since the 2000s. The concept was introduced by John Howkins [1], who argued that economic growth depends on new ideas. It responds to concerns about industrial-based economies and marks a shift from the industrial (manufacturing) economy to the creative economy, in which intellectual capital is the main asset. As a developing country, Indonesia has paid serious attention to the creative industry sectors since 2009, through President Instruction Number 6 Year 2009 on the Development of the Creative Economy in Indonesia [23]. Since Joko Widodo became President of the Republic of Indonesia, the creative economy has been developed further through the formation of a creative economy agency (Bekraf). The creative economy is now one of the new economic sources promoted by the Government of Indonesia, and many creative sectors are being encouraged to complement the national economy. In this context, the perspective of the regional innovation system is important for understanding what kind of creative industry the Government of Indonesia expects; innovation and the creative economy are two terms that cannot be separated from each other. This paper uses a case study of Indonesia as its research methodology, with the regional innovation system as the main perspective. The result is that the creative industry and innovation are mutually related at the conceptual level; in practice, both are aimed at supporting national economic growth in Indonesia.

  15. Adaptation

    ERIC Educational Resources Information Center

    Littlejohn, Emily

    2018-01-01

    "Adaptation" originally began as a scientific term, but from 1860 to today it most often refers to an altered version of a text, film, or other literary source. When this term was first analyzed, humanities scholars often measured adaptations against their source texts, frequently privileging "original" texts. However, this…

  16. 40 CFR 401.11 - General definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Environmental Protection Agency. (d) The term point source means any discernible, confined and discrete conveyance, including but not limited to any pipe, ditch, channel, tunnel, conduit, well, discrete fissure... which pollutants are or may be discharged. (e) The term new source means any building, structure...

  17. Airport-Noise Levels and Annoyance Model (ALAMO) system's reference manual

    NASA Technical Reports Server (NTRS)

    Deloach, R.; Donaldson, J. L.; Johnson, M. J.

    1986-01-01

    The Airport-Noise Levels and Annoyance Model (ALAMO) is described in terms of its constituent modules, the execution of the ALAMO procedure files necessary for system execution, and the source code documentation associated with code development at Langley Research Center. The modules constituting ALAMO are presented both in flow graph form and through a description of the subroutines and functions that comprise them.

  18. Chips: A Tool for Developing Software Interfaces Interactively.

    DTIC Science & Technology

    1987-10-01

    Only fragments of the report text are available. They indicate that Chips supplies simple and direct access to the source code and data of an application through the objects on the screen, making it easy to develop software interfaces interactively. Keywords: object-oriented programming, user interface management systems, programming environments.

  19. Development of Iron Aluminides.

    DTIC Science & Technology

    1986-03-01

    Only fragments of the report text are available. They describe the vulnerability of the U.S. gas turbine engine industry in terms of its heavy reliance on non-domestic sources for much of its strategic metals requirements, noting that the questionable future availability of chromium, used in the superalloys of gas turbine engines, poses a potentially serious threat to these applications.

  20. Public Sector/Private Sector Interaction in Providing Information Services. Report to the NCLIS from the Public Sector/Private Sector Task Force.

    ERIC Educational Resources Information Center

    National Commission on Libraries and Information Science, Washington, DC.

    The results of a 2-year study on the interactions between government and private sector information activities are presented in terms of principles and guidelines for federal policy to support the development and use of information resources, products, and services, and to implement the principles. Discussions address sources of conflict between…

  1. Coarse Grid Modeling of Turbine Film Cooling Flows Using Volumetric Source Terms

    NASA Technical Reports Server (NTRS)

    Heidmann, James D.; Hunter, Scott D.

    2001-01-01

    The recent trend in numerical modeling of turbine film cooling flows has been toward higher fidelity grids and more complex geometries. This trend has been enabled by the rapid increase in computing power available to researchers. However, the turbine design community requires fast turnaround time in its design computations, rendering these comprehensive simulations ineffective in the design cycle. The present study describes a methodology for implementing a volumetric source term distribution in a coarse grid calculation that can model the small-scale and three-dimensional effects present in turbine film cooling flows. This model could be implemented in turbine design codes or in multistage turbomachinery codes such as APNASA, where the computational grid size may be larger than the film hole size. Detailed computations of a single row of 35 deg round holes on a flat plate have been obtained for blowing ratios of 0.5, 0.8, and 1.0, and density ratios of 1.0 and 2.0 using a multiblock grid system to resolve the flows on both sides of the plate as well as inside the hole itself. These detailed flow fields were spatially averaged to generate a field of volumetric source terms for each conservative flow variable. Solutions were also obtained using three coarse grids having streamwise and spanwise grid spacings of 3d, 1d, and d/3. These coarse grid solutions used the integrated hole exit mass, momentum, energy, and turbulence quantities from the detailed solutions as volumetric source terms. It is shown that a uniform source term addition over a distance from the wall on the order of the hole diameter is able to predict adiabatic film effectiveness better than a near-wall source term model, while strictly enforcing correct values of integrated boundary layer quantities.
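
    A minimal sketch of the kind of bookkeeping such a model involves is given below: integrated hole-exit fluxes from a detailed solution are deposited uniformly as volumetric sources over a column of near-wall coarse cells. All function and variable names are illustrative assumptions; the actual implementation in the paper or in codes such as APNASA is not reproduced here.

      import numpy as np

      def film_cooling_sources(mass_flux, momentum_flux, energy_flux, cell_volumes):
          """Convert integrated hole-exit fluxes (kg/s, N, W) into per-cell volumetric
          source terms by spreading them uniformly over the receiving coarse cells
          (e.g. a column of cells extending roughly one hole diameter from the wall)."""
          v_total = float(np.sum(cell_volumes))     # total receiving volume, m^3
          return {
              "mass":     mass_flux / v_total,      # kg/(m^3 s), added to each cell
              "momentum": momentum_flux / v_total,  # N/m^3
              "energy":   energy_flux / v_total,    # W/m^3
          }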

  2. OrChem - An open source chemistry search engine for Oracle®

    PubMed Central

    2009-01-01

    Background Registration, indexing and searching of chemical structures in relational databases is one of the core areas of cheminformatics. However, little detail has been published on the inner workings of search engines and their development has been mostly closed-source. We decided to develop an open source chemistry extension for Oracle, the de facto database platform in the commercial world. Results Here we present OrChem, an extension for the Oracle 11G database that adds registration and indexing of chemical structures to support fast substructure and similarity searching. The cheminformatics functionality is provided by the Chemistry Development Kit. OrChem provides similarity searching with response times in the order of seconds for databases with millions of compounds, depending on a given similarity cut-off. For substructure searching, it can make use of multiple processor cores on today's powerful database servers to provide fast response times in equally large data sets. Availability OrChem is free software and can be redistributed and/or modified under the terms of the GNU Lesser General Public License as published by the Free Software Foundation. All software is available via http://orchem.sourceforge.net. PMID:20298521

  3. An Analysis of Sources of Technological Change in Efficiency Improvement of Fluorescent Lamp Systems

    NASA Astrophysics Data System (ADS)

    Imanaka, Takeo

    In Japan, energy-efficient fluorescent lamp systems using rare-earth phosphors and electronic ballasts have diffused rapidly since the 1990s. This report investigates the sources of technological change behind the efficiency improvement of fluorescent lamp systems: (i) fluorescent lamps and luminaires have been under steady technological development toward more energy-efficient lighting, and the concepts needed to achieve high efficiency were found in those activities; however, it took a long time before they were realized and became widely used; (ii) electronic ballasts and rare-earth phosphors give fluorescent lamp systems not only energy efficiency but also other valued properties such as compactness, light weight, higher output, and better color rendering, which have also been expected by users and have induced research and development (R&D); (iii) affordable electronic ballasts were made possible by the power MOSFET, a new technology based on IC processes and developed for the large markets of information and communication technologies and mobile devices; and (iv) rare-earth phosphors became available after rare-earth industries developed to supply phosphors for color television. In terms of sources of technological change, (i) corresponds to R&D aimed at the particular purpose, in this case energy efficiency, whereas (ii), (iii), and (iv) correspond to spillovers from activities aimed at other purposes. This case provides an actual example in which spillovers were the critical sources of technological change in an energy technology.

  4. Possible Dual Earthquake-Landslide Source of the 13 November 2016 Kaikoura, New Zealand Tsunami

    NASA Astrophysics Data System (ADS)

    Heidarzadeh, Mohammad; Satake, Kenji

    2017-10-01

    An earthquake (Mw 7.8) with a complicated rupture mechanism occurred on the NE coast of South Island, New Zealand, on 13 November 2016 (UTC) in a complex tectonic setting comprising a transitional strike-slip zone between two subduction zones. The earthquake generated a moderate tsunami with a zero-to-crest amplitude of 257 cm at the near-field tide gauge station of Kaikoura. Spectral analysis of the tsunami observations showed dual peaks at 3.6-5.7 and 5.7-56 min, which we attribute to the potential landslide and earthquake sources of the tsunami, respectively. Tsunami simulations showed that a source model with slip on an offshore plate-interface fault reproduces the near-field tsunami observation in terms of amplitude, but fails in terms of tsunami period. On the other hand, a source model without offshore slip fails to reproduce the first peak, but the later phases are reproduced well in terms of both amplitude and period. It can be inferred that an offshore source must be involved, but that it needs to be smaller in size than the plate-interface slip, which most likely points to a confined submarine landslide source, consistent with the dual-peak tsunami spectrum. We estimate the dimension of the potential submarine landslide at 8-10 km.
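
    The dual-peak identification rests on ordinary spectral analysis of the tide gauge record. The sketch below shows one plausible way to compute such a spectrum from a detided sea-level time series; the windowing and detiding choices are assumptions, not the authors' processing chain.

      import numpy as np

      def tsunami_spectrum(eta, dt):
          """Power spectrum of a detided sea-level record eta (m) sampled every dt
          seconds. Returns periods in minutes and spectral power (arbitrary units)."""
          eta = np.asarray(eta, float) - np.mean(eta)   # remove residual mean level
          windowed = eta * np.hanning(len(eta))         # taper to reduce spectral leakage
          power = np.abs(np.fft.rfft(windowed)) ** 2
          freq = np.fft.rfftfreq(len(eta), d=dt)        # Hz
          periods_min = np.empty_like(freq)
          periods_min[0] = np.inf                       # zero-frequency bin has no period
          periods_min[1:] = 1.0 / freq[1:] / 60.0
          return periods_min, power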

  5. The solution of three-variable duct-flow equations

    NASA Technical Reports Server (NTRS)

    Stuart, A. R.; Hetherington, R.

    1974-01-01

    This paper establishes a numerical method for the solution of three-variable problems, applied here to rotational flows through ducts of various cross sections. An iterative scheme is developed, the main feature of which is the addition of a duplicate variable to the forward component of velocity. Two forward components of velocity result from integrating two sets of first order ordinary differential equations for the streamline curvatures, in intersecting directions across the duct. Two pseudo-continuity equations are introduced with source/sink terms, whose strengths are dependent on the difference between the forward components of velocity. When convergence is obtained, the two forward components of velocity are identical, the source/sink terms are zero, and the original equations are satisfied. A computer program solves the exact equations and boundary conditions numerically. The method is economical and compares successfully with experiments on bent ducts of circular and rectangular cross section where secondary flows are caused by gradients of total pressure upstream.

  6. Parenting knowledge: experiential and sociodemographic factors in European American mothers of young children.

    PubMed

    Bornstein, Marc H; Cote, Linda R; Haynes, O Maurice; Hahn, Chun-Shin; Park, Yoonjung

    2010-11-01

    Knowledge of child rearing and child development is relevant to parenting and the well-being of children. Using a sociodemographically heterogeneous sample of 268 European American mothers of 2-year-olds, we assessed the state of mothers' parenting knowledge; compared parenting knowledge in groups of mothers who varied in terms of parenthood and social status; and identified principal sources of mothers' parenting knowledge in terms of social factors, parenting supports, and formal classes. On the whole, European American mothers demonstrated fair but less than complete basic parenting knowledge; age, education, and rated helpfulness of written materials each uniquely contributed to mothers' knowledge. Adult mothers scored higher than adolescent mothers, and mothers improved in their knowledge of parenting from their first to their second child (and were stable across time). No differences were found between mothers of girls and boys, mothers who varied in employment status, or birth and adoptive mothers. The implications of variation in parenting knowledge and its sources for parenting education and clinical interactions with parents are discussed.

  7. Design and implementation of wireless dose logger network for radiological emergency decision support system.

    PubMed

    Gopalakrishnan, V; Baskaran, R; Venkatraman, B

    2016-08-01

    A decision support system (DSS) is implemented in Radiological Safety Division, Indira Gandhi Centre for Atomic Research for providing guidance for emergency decision making in case of an inadvertent nuclear accident. Real time gamma dose rate measurement around the stack is used for estimating the radioactive release rate (source term) by using inverse calculation. Wireless gamma dose logging network is designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate and the details are presented in the paper. The network uses XBee-Pro wireless modules and PSoC controller for wireless interfacing, and the data are logged at the base station. A LabView based program is developed to receive the data, display it on the Google Map, plot the data over the time scale, and register the data in a file to share with DSS software. The DSS at the base station evaluates the real time source term to assess radiation impact.
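
    The abstract does not spell out the inverse calculation, but the basic idea of estimating a stack release rate from environmental dose rates can be sketched as a one-parameter least-squares problem, assuming precomputed dose-per-unit-release factors for each station; the names and the single-release-rate assumption below are illustrative only, not the DSS's actual algorithm.

      import numpy as np

      def estimate_release_rate(dose_rates, dose_factors):
          """Estimate a stack release rate Q (Bq/s) from gamma dose rates D_i (Gy/h)
          measured at several stations, given factors k_i (Gy/h per Bq/s) such that
          D_i is approximately k_i * Q. Returns the non-negative least-squares fit."""
          k = np.asarray(dose_factors, float)
          d = np.asarray(dose_rates, float)
          q = float(k @ d) / float(k @ k)   # closed-form one-parameter least squares
          return max(q, 0.0)                # a physical release rate cannot be negative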

  8. Design and implementation of wireless dose logger network for radiological emergency decision support system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopalakrishnan, V.; Baskaran, R.; Venkatraman, B.

    A decision support system (DSS) is implemented in Radiological Safety Division, Indira Gandhi Centre for Atomic Research for providing guidance for emergency decision making in case of an inadvertent nuclear accident. Real time gamma dose rate measurement around the stack is used for estimating the radioactive release rate (source term) by using inverse calculation. Wireless gamma dose logging network is designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate and the details are presented in the paper. The network uses XBee-Pro wireless modules and PSoC controller for wireless interfacing, and the data are logged at the base station. A LabView based program is developed to receive the data, display it on the Google Map, plot the data over the time scale, and register the data in a file to share with DSS software. The DSS at the base station evaluates the real time source term to assess radiation impact.

  9. Long-term trends in California mobile source emissions and ambient concentrations of black carbon and organic aerosol.

    PubMed

    McDonald, Brian C; Goldstein, Allen H; Harley, Robert A

    2015-04-21

    A fuel-based approach is used to assess long-term trends (1970-2010) in mobile source emissions of black carbon (BC) and organic aerosol (OA, including both primary emissions and secondary formation). The main focus of this analysis is the Los Angeles Basin, where a long record of measurements is available to infer trends in ambient concentrations of BC and organic carbon (OC), with OC used here as a proxy for OA. Mobile source emissions and ambient concentrations have decreased similarly, reflecting the importance of on- and off-road engines as sources of BC and OA in urban areas. In 1970, the on-road sector accounted for ∼90% of total mobile source emissions of BC and OA (primary + secondary). Over time, as on-road engine emissions have been controlled, the relative importance of off-road sources has grown. By 2010, off-road engines were estimated to account for 37 ± 20% and 45 ± 16% of total mobile source contributions to BC and OA, respectively, in the Los Angeles area. This study highlights both the success of efforts to control on-road emission sources, and the importance of considering off-road engine and other VOC source contributions when assessing long-term emission and ambient air quality trends.

  10. A functional magnetic resonance imaging investigation of short-term source and item memory for negative pictures.

    PubMed

    Mitchell, Karen J; Mather, Mara; Johnson, Marcia K; Raye, Carol L; Greene, Erich J

    2006-10-02

    We investigated the hypothesis that arousal recruits attention to item information, thereby disrupting working memory processes that help bind items to context. Using functional magnetic resonance imaging, we compared brain activity when participants remembered negative or neutral picture-location conjunctions (source memory) versus pictures only. Behaviorally, negative trials showed disruption of short-term source, but not picture, memory; long-term picture recognition memory was better for negative than for neutral pictures. Activity in areas involved in working memory and feature integration (precentral gyrus and its intersect with superior temporal gyrus) was attenuated on negative compared with neutral source trials relative to picture-only trials. Visual processing areas (middle occipital and lingual gyri) showed greater activity for negative than for neutral trials, especially on picture-only trials.

  11. Circular current loops, magnetic dipoles and spherical harmonic analysis.

    USGS Publications Warehouse

    Alldredge, L.R.

    1980-01-01

    Spherical harmonic analysis (SHA) is the most used method of describing the Earth's magnetic field, even though spherical harmonic coefficients (SHC) almost completely defy interpretation in terms of real sources. Some moderately successful efforts have been made to represent the field in terms of dipoles placed in the core in an effort to have the model come closer to representing real sources. Dipole sources are only a first approximation to the real sources which are thought to be a very complicated network of electrical currents in the core of the Earth. -Author
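
    For context, the spherical harmonic analysis referred to here expands the scalar potential of the main geomagnetic field in the standard textbook Gauss form, the SHC being the coefficients g_n^m and h_n^m:

      V(r,\theta,\phi) \;=\; a \sum_{n=1}^{N} \left(\frac{a}{r}\right)^{n+1}
        \sum_{m=0}^{n} \left( g_n^m \cos m\phi + h_n^m \sin m\phi \right)
        P_n^m(\cos\theta), \qquad \mathbf{B} = -\nabla V,

    where a is the Earth's reference radius, (r, theta, phi) are geocentric spherical coordinates, and P_n^m are Schmidt quasi-normalized associated Legendre functions. The difficulty noted in the abstract is that these coefficients describe the field itself rather than the core current systems that produce it.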

  12. Predicting Near-Term Water Quality from Satellite Observations of Watershed Conditions

    NASA Astrophysics Data System (ADS)

    Weiss, W. J.; Wang, L.; Hoffman, K.; West, D.; Mehta, A. V.; Lee, C.

    2017-12-01

    Despite the strong influence of watershed conditions on source water quality, most water utilities and water resource agencies do not currently have the capability to monitor watershed sources of contamination with great temporal or spatial detail. Typically, knowledge of source water quality is limited to periodic grab sampling; automated monitoring of a limited number of parameters at a few select locations; and/or monitoring relevant constituents at a treatment plant intake. While important, such observations are not sufficient to inform proactive watershed or source water management at a monthly or seasonal scale. Satellite remote sensing data on the other hand can provide a snapshot of an entire watershed at regular, sub-monthly intervals, helping analysts characterize watershed conditions and identify trends that could signal changes in source water quality. Accordingly, the authors are investigating correlations between satellite remote sensing observations of watersheds and source water quality, at a variety of spatial and temporal scales and lags. While correlations between remote sensing observations and direct in situ measurements of water quality have been well described in the literature, there are few studies that link remote sensing observations across a watershed with near-term predictions of water quality. In this presentation, the authors will describe results of statistical analyses and discuss how these results are being used to inform development of a desktop decision support tool to support predictive application of remote sensing data. Predictor variables under evaluation include parameters that describe vegetative conditions; parameters that describe climate/weather conditions; and non-remote sensing, in situ measurements. Water quality parameters under investigation include nitrogen, phosphorus, organic carbon, chlorophyll-a, and turbidity.

  13. A study of numerical methods for hyperbolic conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Leveque, R. J.; Yee, H. C.

    1988-01-01

    The proper modeling of nonequilibrium gas dynamics is required in certain regimes of hypersonic flow. For inviscid flow this gives a system of conservation laws coupled with source terms representing the chemistry. Often a wide range of time scales is present in the problem, leading to numerical difficulties as in stiff systems of ordinary differential equations. Stability can be achieved by using implicit methods, but other numerical difficulties are observed. The behavior of typical numerical methods on a simple advection equation with a parameter-dependent source term was studied. Two approaches to incorporate the source term were utilized: MacCormack type predictor-corrector methods with flux limiters, and splitting methods in which the fluid dynamics and chemistry are handled in separate steps. Various comparisons over a wide range of parameter values were made. In the stiff case where the solution contains discontinuities, incorrect numerical propagation speeds are observed with all of the methods considered. This phenomenon is studied and explained.
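
    As a concrete illustration of the splitting approach discussed above, the sketch below advances the model problem u_t + a*u_x = psi(u) by one step: an upwind advection sweep followed by an ODE step for the source. It is a simplified stand-in (explicit sub-stepping rather than an implicit solve), not one of the schemes analyzed in the paper, and in the stiff, discontinuous case it would exhibit exactly the kind of incorrect propagation speeds the paper studies.

      import numpy as np

      def split_step(u, a, dt, dx, psi, n_sub=20):
          """One Godunov-split step for u_t + a*u_x = psi(u) on a periodic grid.
          Step 1: first-order upwind advection (assumes a > 0).
          Step 2: the source ODE u' = psi(u), integrated with explicit sub-steps;
          a genuinely stiff source would normally require an implicit ODE solver."""
          u = u - a * dt / dx * (u - np.roll(u, 1))   # advection sweep
          for _ in range(n_sub):                      # reaction sweep
              u = u + (dt / n_sub) * psi(u)
          return u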

  14. Growth, productivity, and scientific impact of sources of HIV/AIDS research information, with a focus on eastern and southern Africa.

    PubMed

    Bosire Onyancha, Omwoyo

    2008-05-01

    As channels of communicating HIV/AIDS research information, serial publications and particularly journals are increasingly used in response to the pandemic. The last few decades have witnessed a proliferation of sources of HIV/AIDS-related information, bringing many challenges to collection-development librarians as well as to researchers. This study uses an informetric approach to examine the growth, productivity and scientific impact of these sources, during the period 1980 to 2005, and especially to measure performance in the publication and dissemination of HIV/AIDS research about or from eastern or southern Africa. Data were collected from MEDLINE, Science Citation Index (SCI), Social Sciences Citation Index (SSCI), and Ulrich's Periodical Directory. The analysis used Sitkis version 1.5, Microsoft Office Access, Microsoft Office Excel, Bibexcel, and Citespace version 2.0.1. The specific objectives were to identify the number of sources of HIV/AIDS-related information that have been published in the region, the coverage of these in key bibliographic databases, the most commonly used publication type for HIV/AIDS research, the countries in which the sources are published, the sources' productivity in terms of numbers of papers and citations, the most influential sources, the subject coverage of the sources, and the core sources of HIV/AIDS-information.

  15. In-Space Propulsion Solar Electric Propulsion Program Overview of 2006

    NASA Technical Reports Server (NTRS)

    Baggett, Randy M.; Hulgan, Wendy W.; Dankanich, John W.; Bechtel, Robert T.

    2006-01-01

    Electric propulsion development throughout NASA is primarily implemented by the In-Space Propulsion Technology Project at NASA MSFC under the management of the Science Mission Directorate. The Solar Electric Propulsion technology area's objective is to develop near- and mid-term SEP technology to enhance or enable mission capture while minimizing risk and cost to the end user. Major activities include developing NASA's Evolutionary Xenon Thruster (NEXT), implementing a Standard Architecture, and developing a long-life High Voltage Hall Accelerator (HiVHAC). Lower-level investments include advanced feed system development, advanced cathode testing, and xenon recovery testing. Progress on current investments and future plans are discussed.

  16. Auditing the multiply-related concepts within the UMLS.

    PubMed

    Mougin, Fleur; Grabar, Natalia

    2014-10-01

    This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms.

  17. Water homeostasis and diabetes insipidus in horses.

    PubMed

    Schott, Harold C

    2011-04-01

    Diabetes insipidus (DI) is a rare disorder of horses characterized by profound polyuria and polydipsia (PU/PD), which can be caused by loss of production of arginine vasopressin (AVP). This condition is termed neurogenic or central DI. DI may also develop with absence or loss of AVP receptors or activity on the basolateral membrane of collecting-duct epithelial cells. This condition is termed nephrogenic DI. Equine clinicians may differentiate true DI from more common causes of PU/PD by a systematic diagnostic approach. DI may not be a correctable disorder, and supportive care of affected horses requires an adequate water source.

  18. Lessons Learned Through the Follow-up of the Long-Term Effects of Over-Exposure to an Ir192 Industrial Radiography Source in Bangladesh

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jalil, A.; Rabbani, G.; Hossain, M. K.

    2003-02-24

    An industrial radiographer was accidentally over-exposed while taking radiographs of weld joints of gas pipelines in 1985 in Bangladesh. Symptoms of high radiation exposure occurred immediately after the accident, and skin erythema developed, leading to progressive tissue deterioration. The consequences of this over-exposure are being followed up to assess the long-term effects of ionizing radiation on the victim. Progressive tissue deterioration has already led to multiple surgeries and successive amputations of the fingertips. Lessons learned from this accident are also reported in this paper.

  19. Mechanisms of Post-Infarct Left Ventricular Remodeling

    PubMed Central

    French, Brent A.; Kramer, Christopher M.

    2008-01-01

    Heart failure secondary to myocardial infarction (MI) remains a major source of morbidity and mortality. Long-term outcome after MI can largely be defined in terms of its impact on the size and shape of the left ventricle (i.e., LV remodeling). Three major mechanisms contribute to LV remodeling: 1) early infarct expansion, 2) subsequent infarct extension into adjacent noninfarcted myocardium, and 3) late hypertrophy in the remote LV. Future developments in preventing post-MI heart failure will depend not only on identifying drugs targeting each of these individual mechanisms, but also on diagnostic techniques capable of assessing efficacy against each mechanism. PMID:18690295

  20. Evaluation of an unsteady flamelet progress variable model for autoignition and flame development in compositionally stratified mixtures

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Saumyadip; Abraham, John

    2012-07-01

    The unsteady flamelet progress variable (UFPV) model has been proposed by Pitsch and Ihme ["An unsteady/flamelet progress variable method for LES of nonpremixed turbulent combustion," AIAA Paper No. 2005-557, 2005] for modeling the averaged/filtered chemistry source terms in Reynolds-averaged simulations and large eddy simulations of non-premixed combustion. In the UFPV model, a look-up table of source terms is generated as a function of mixture fraction Z, scalar dissipation rate χ, and progress variable C by solving the unsteady flamelet equations. The assumption is that the unsteady flamelet represents the evolution of the reacting mixing layer in the non-premixed flame. We assess the accuracy of the model in predicting autoignition and flame development in compositionally stratified n-heptane/air mixtures using direct numerical simulations (DNS). The focus in this work is primarily on assessing the accuracy of the probability density functions (PDFs) employed for obtaining averaged source terms. The performance of commonly employed presumed functions, such as the Dirac delta function, the β distribution function, and the statistically most likely distribution (SMLD) approach, in approximating the shapes of the PDFs of the reactive and conserved scalars is evaluated. For unimodal distributions, it is observed that functions that need two-moment information, e.g., the β distribution function and the SMLD approach with two-moment closure, are able to reasonably approximate the actual PDF. As the distribution becomes multimodal, higher-moment information is required. Differences are observed between the ignition trends obtained from DNS and those predicted by the look-up table, especially for smaller gradients where the flamelet assumption becomes less applicable. The formulation assumes that the shape of the χ(Z) profile can be modeled by an error function which remains unchanged in the presence of heat release. We show that this assumption is not accurate.
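
    The presumed-PDF averaging evaluated in the paper can be illustrated by the two-moment β closure for the mixture fraction: given a flamelet source term as a function of Z, its mean is obtained by integrating against a β PDF built from the mean and variance of Z. The sketch below is only illustrative (crude clipping and quadrature), not the authors' implementation, and it omits the joint dependence on the scalar dissipation rate and progress variable.

      import numpy as np
      from scipy.stats import beta

      def presumed_beta_average(src_of_z, z_mean, z_var, nz=400):
          """Average src_of_z(Z) over a presumed beta PDF of the mixture fraction
          with prescribed mean and variance (two-moment closure)."""
          z_var = max(z_var, 1e-12)                              # avoid division by zero
          z_var = min(z_var, 0.999 * z_mean * (1.0 - z_mean))    # keep beta parameters positive
          gamma = z_mean * (1.0 - z_mean) / z_var - 1.0
          a, b = z_mean * gamma, (1.0 - z_mean) * gamma
          z = np.linspace(1e-6, 1.0 - 1e-6, nz)
          pdf = beta.pdf(z, a, b)
          dz = z[1] - z[0]
          return float(np.sum(src_of_z(z) * pdf) * dz / (np.sum(pdf) * dz))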

  1. The Consortium of Advanced Residential Buildings (CARB) - A Building America Energy Efficient Housing Partnership

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robb Aldrich; Lois Arena; Dianne Griffiths

    2010-12-31

    This final report summarizes the work conducted by the Consortium of Advanced Residential Buildings (CARB) (http://www.carb-swa.com/), one of the Building America Energy Efficient Housing Partnership industry teams, for the period January 1, 2008 to December 31, 2010. The Building America Program (BAP) is part of the Department of Energy (DOE), Energy Efficiency and Renewable Energy, Building Technologies Program (BTP). The long-term goal of the BAP is to develop cost-effective, production-ready systems in five major climate zones that will result in zero energy homes (ZEH) that produce as much energy as they use on an annual basis by 2020. CARB is led by Steven Winter Associates, Inc. with Davis Energy Group, Inc. (DEG), MaGrann Associates, and Johnson Research, LLC as team members. In partnership with our numerous builders and industry partners, work was performed in three primary areas: advanced systems research, prototype home development, and technical support for communities of high performance homes. Our advanced systems research focuses on developing a better understanding of the installed performance of advanced technology systems when integrated in a whole-house scenario. Technology systems researched included high-R wall assemblies, non-ducted air-source heat pumps, low-load HVAC systems, solar thermal water heating, ventilation systems, cold-climate ground- and air-source heat pumps, a hot/dry-climate air-to-water heat pump, condensing boilers, evaporative condensers, and water heating. CARB continued to support several prototype home projects in the design and specification phase. These projects are located in all five program climate regions and most are targeting greater than 50% source energy savings over the Building America Benchmark home. CARB provided technical support and developed builder project case studies, to be included in near-term Joule Milestone reports, for the following community-scale projects: SBER Overlook at Clipper Mill (mixed-humid climate) and William Ryan Homes - Tampa (hot-humid climate).

  2. H- radio frequency source development at the Spallation Neutron Source.

    PubMed

    Welton, R F; Dudnikov, V G; Gawne, K R; Han, B X; Murray, S N; Pennisi, T R; Roseberry, R T; Santana, M; Stockli, M P; Turvey, M W

    2012-02-01

    The Spallation Neutron Source (SNS) now routinely operates nearly 1 MW of beam power on target with a highly persistent ∼38 mA peak current in the linac and an availability of ∼90%. H(-) beam pulses (∼1 ms, 60 Hz) are produced by a Cs-enhanced, multicusp ion source closely coupled with an electrostatic low energy beam transport (LEBT), which focuses the 65 kV beam into a radio frequency quadrupole accelerator. The source plasma is generated by RF excitation (2 MHz, ∼60 kW) of a copper antenna that has been encased in a ∼0.7 mm thick layer of porcelain enamel and immersed into the plasma chamber. The ion source and LEBT normally have a combined availability of ∼99%. Recent increases in duty factor and RF power have made antenna failures a leading cause of downtime. This report first identifies the physical mechanism of antenna failure from a statistical inspection of ∼75 antennas which ran at the SNS, scanning electron microscopy studies of antenna surfaces, cross-sectional cuts, and analysis of calorimetric heating measurements. Failure mitigation efforts are then described, which include modifying the antenna geometry and our acceptance/installation criteria. Progress and status of the development of the SNS external antenna source, a long-term solution to the internal antenna problem, are then discussed. Currently, this source is capable of delivering beam currents to the SNS comparable to those of the baseline source, and an earlier version has briefly demonstrated unanalyzed currents up to ∼100 mA (1 ms, 60 Hz) on the test stand. In particular, this paper discusses plasma ignition (dc and RF plasma guns), antenna reliability, magnet overheating, and insufficient beam persistence.

  3. Source-term characterisation and solid speciation of plutonium at the Semipalatinsk NTS, Kazakhstan.

    PubMed

    Nápoles, H Jiménez; León Vintró, L; Mitchell, P I; Omarova, A; Burkitbayev, M; Priest, N D; Artemyev, O; Lukashenko, S

    2004-01-01

    New data on the concentrations of key fission/activation products and transuranium nuclides in samples of soil and water from the Semipalatinsk Nuclear Test Site are presented and interpreted. Sampling was carried out at Ground Zero, Lake Balapan, the Tel'kem craters and reference locations within the test site boundary well removed from localised sources. Radionuclide ratios have been used to characterise the source term(s) at each of these sites. The geochemical partitioning of plutonium has also been examined and it is shown that the bulk of the plutonium contamination at most of the sites examined is in a highly refractory, non-labile form.

  4. Informal Taxation*

    PubMed Central

    Olken, Benjamin A.; Singhal, Monica

    2011-01-01

    Informal payments are a frequently overlooked source of local public finance in developing countries. We use microdata from ten countries to establish stylized facts on the magnitude, form, and distributional implications of this “informal taxation.” Informal taxation is widespread, particularly in rural areas, with substantial in-kind labor payments. The wealthy pay more, but pay less in percentage terms, and informal taxes are more regressive than formal taxes. Failing to include informal taxation underestimates household tax burdens and revenue decentralization in developing countries. We discuss various explanations for and implications of these observed stylized facts. PMID:22199993

  5. Bayesian inverse modeling and source location of an unintended 131I release in Europe in the fall of 2011

    NASA Astrophysics Data System (ADS)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek; Šindelářová, Kateřina; Hýža, Miroslav; Stohl, Andreas

    2017-10-01

    In the fall of 2011, iodine-131 (131I) was detected at several radionuclide monitoring stations in central Europe. After investigation, the International Atomic Energy Agency (IAEA) was informed by Hungarian authorities that 131I was released from the Institute of Isotopes Ltd. in Budapest, Hungary. It was reported that a total activity of 342 GBq of 131I was emitted between 8 September and 16 November 2011. In this study, we use the ambient concentration measurements of 131I to determine the location of the release as well as its magnitude and temporal variation. As the location of the release and an estimate of the source strength became eventually known, this accident represents a realistic test case for inversion models. For our source reconstruction, we use no prior knowledge. Instead, we estimate the source location and emission variation using only the available 131I measurements. Subsequently, we use the partial information about the source term available from the Hungarian authorities for validation of our results. For the source determination, we first perform backward runs of atmospheric transport models and obtain source-receptor sensitivity (SRS) matrices for each grid cell of our study domain. We use two dispersion models, FLEXPART and Hysplit, driven with meteorological analysis data from the global forecast system (GFS) and from European Centre for Medium-range Weather Forecasts (ECMWF) weather forecast models. Second, we use a recently developed inverse method, least-squares with adaptive prior covariance (LS-APC), to determine the 131I emissions and their temporal variation from the measurements and computed SRS matrices. For each grid cell of our simulation domain, we evaluate the probability that the release was generated in that cell using Bayesian model selection. The model selection procedure also provides information about the most suitable dispersion model for the source term reconstruction. Third, we select the most probable location of the release with its associated source term and perform a forward model simulation to study the consequences of the iodine release. Results of these procedures are compared with the known release location and reported information about its time variation. We find that our algorithm could successfully locate the actual release site. The estimated release period is also in agreement with the values reported by IAEA and the reported total released activity of 342 GBq is within the 99 % confidence interval of the posterior distribution of our most likely model.
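
    The core linear inverse problem underlying the reconstruction is y ≈ M x, where y collects the 131I concentration measurements, M is the source-receptor sensitivity matrix for a candidate grid cell, and x is the emission time series. The LS-APC algorithm used in the paper additionally estimates prior covariances and performs Bayesian model selection over candidate cells; the sketch below shows only a plain non-negative least-squares stand-in for a single candidate location.

      import numpy as np
      from scipy.optimize import nnls

      def estimate_emissions(srs_matrix, measurements):
          """Solve min ||M x - y||_2 subject to x >= 0 for the emission time series x
          (e.g. Bq per time step), given an SRS matrix M and measurement vector y."""
          m = np.asarray(srs_matrix, float)
          y = np.asarray(measurements, float)
          x, residual = nnls(m, y)   # non-negative least squares
          return x, residual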

  6. Lithium-Ion Batteries for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Surampudi, S.; Halpert, G.; Marsh, R. A.; James, R.

    1999-01-01

    This presentation reviews: (1) the goals and objectives, (2) the NASA and Air Force requirements, (3) the potential near-term missions, (4) the management approach, (5) the technical approach, and (6) the program road map. The objectives of the program include: (1) develop high specific energy and long life lithium-ion cells and smart batteries for aerospace and defense applications, (2) establish domestic production sources, and (3) demonstrate technological readiness for various missions. The management approach is to encourage the teaming of universities, R&D organizations, and battery manufacturing companies, to build on existing commercial and government technology, and to develop two sources for manufacturing cells and batteries. The technical approach includes: (1) develop advanced electrode materials and electrolytes to achieve improved low temperature performance and long cycle life, (2) optimize cell design to improve specific energy, cycle life and safety, (3) establish manufacturing processes to ensure predictable performance, (4) develop aerospace lithium-ion cells in various Ah sizes and voltages, (5) develop electronics for smart battery management, (6) develop a performance database required for various applications, and (7) demonstrate technology readiness for the various missions. Charts which review the requirements for the Li-ion battery development program are presented.

  7. Low Reynolds number k-epsilon modelling with the aid of direct simulation data

    NASA Technical Reports Server (NTRS)

    Rodi, W.; Mansour, N. N.

    1993-01-01

    The constant C_mu and the near-wall damping function f_mu in the eddy-viscosity relation of the k-epsilon model are evaluated from direct numerical simulation (DNS) data for developed channel and boundary layer flow at two Reynolds numbers each. Various existing f_mu model functions are compared with the DNS data, and a new function is fitted to the high-Reynolds-number channel flow data. The epsilon-budget is computed for the fully developed channel flow. The relative magnitude of the terms in the epsilon-equation is analyzed with the aid of scaling arguments, and the parameter governing this magnitude is established. Models for the sum of all source and sink terms in the epsilon-equation are tested against the DNS data, and an improved model is proposed.
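
    The eddy-viscosity relation referred to above is, in its standard form,

      \nu_t \;=\; C_\mu \, f_\mu \, \frac{k^2}{\varepsilon}, \qquad C_\mu \approx 0.09,

    where f_mu tends to unity far from walls and damps the eddy viscosity in the viscous near-wall region. One widely used low-Reynolds-number form (Launder and Sharma) is f_mu = exp[-3.4 / (1 + Re_t/50)^2] with the turbulence Reynolds number Re_t = k^2/(nu*epsilon); the specific function fitted to the DNS channel-flow data in this paper is not reproduced here.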

  8. Free Electron coherent sources: From microwave to X-rays

    NASA Astrophysics Data System (ADS)

    Dattoli, Giuseppe; Di Palma, Emanuele; Pagnutti, Simonetta; Sabia, Elio

    2018-04-01

    The term Free Electron Laser (FEL) will be used, in this paper, to indicate a wide collection of devices aimed at providing coherent electromagnetic radiation from a beam of "free" electrons, i.e. electrons not bound in atomic or molecular states. This article reviews the similarities that link different sources of coherent radiation across the electromagnetic spectrum, from microwaves to X-rays, and compares the analogies with conventional laser sources. We develop a point of view that allows a unified analytical treatment of these devices through the introduction of appropriate global variables (e.g. gain, saturation intensity, inhomogeneous broadening parameters, longitudinal mode coupling strength), yielding a very effective way of determining the relevant design parameters. The paper also looks at more speculative aspects of FEL physics, which may address the relevance of quantum effects in the lasing process.

  9. Emission of Sound from Turbulence Convected by a Parallel Mean Flow in the Presence of a Confining Duct

    NASA Technical Reports Server (NTRS)

    Goldstein, Marvin E.; Leib, Stewart J.

    1999-01-01

    An approximate method for calculating the noise generated by a turbulent flow within a semi-infinite duct of arbitrary cross section is developed. It is based on a previously derived high-frequency solution to Lilley's equation, which describes the sound propagation in a transversely-sheared mean flow. The source term is simplified by assuming the turbulence to be axisymmetric about the mean flow direction. Numerical results are presented for the special case of a ring source in a circular duct with an axisymmetric mean flow. They show that the internally generated noise is suppressed at sufficiently large upstream angles in a hard-walled duct, and that acoustic liners can significantly reduce the sound radiated in both the upstream and downstream regions, depending upon the source location and Mach number of the flow.

  10. Emission of Sound From Turbulence Convected by a Parallel Mean Flow in the Presence of a Confining Duct

    NASA Technical Reports Server (NTRS)

    Goldstein, Marvin E.; Leib, Stewart J.

    1999-01-01

    An approximate method for calculating the noise generated by a turbulent flow within a semi-infinite duct of arbitrary cross section is developed. It is based on a previously derived high-frequency solution to Lilley's equation, which describes the sound propagation in a transversely-sheared mean flow. The source term is simplified by assuming the turbulence to be axisymmetric about the mean flow direction. Numerical results are presented for the special case of a ring source in a circular duct with an axisymmetric mean flow. They show that the internally generated noise is suppressed at sufficiently large upstream angles in a hard-walled duct, and that acoustic liners can significantly reduce the sound radiated in both the upstream and downstream regions, depending upon the source location and Mach number of the flow.

  11. Anaerobic fitness tests: what are we measuring?

    PubMed

    Van Praagh, Emmanuel

    2007-01-01

    Anaerobic fitness, during growth and development, has not received the same attention from researchers as aerobic fitness. This is surprising given the level of anaerobic energy used daily during childhood and adolescence. During physical activity and sport, the child is spontaneously more attracted to short-burst movements than to long-term activities. It is, however, well known that in anaerobic activities such as sprint cycling, sprint running or sprint swimming, the child's performance is distinctly poorer than that of the adult. This partly reflects the child's lesser ability to generate mechanical energy from chemical energy sources during short-term high-intensity work or exercise. Direct measurement of the rate or capacity of anaerobic pathways for energy turnover presents several ethical and methodological difficulties. Therefore, rather than measure energy supply, pediatric exercise scientists have concentrated on measuring short-term power output by means of standardized protocol tests such as short-term cycling power tests, running tests or vertical jump tests. There is, however, no perfect test and, therefore, it is important to acknowledge the benefits and limitations of each testing method. Mass-related short-term power output was shown to increase dramatically during growth and development, whereas the corresponding increase in peak blood lactate was considerably lower. This suggests that the observed difference between children and adolescents during short-term power output testing may be related to neuromuscular factors, hormonal factors and improved motor coordination.
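
    As a rough illustration of how mass-related short-term power output is obtained from a standardized cycling protocol, the Python sketch below computes peak and mean power from flywheel revolutions in a Wingate-style 30-s sprint; the 7.5% body-mass load and 6 m of flywheel travel per pedal revolution are conventional Monark-ergometer assumptions, not values taken from this article.

        def wingate_power(body_mass_kg, revs_per_5s, m_per_rev=6.0,
                          load_fraction=0.075, g=9.81):
            """Power per 5-s interval of a Wingate-style sprint test.
            Load = 7.5% of body mass (assumed Monark convention);
            work per interval = braking force * distance pedalled."""
            force_n = load_fraction * body_mass_kg * g
            powers = [force_n * revs * m_per_rev / 5.0 for revs in revs_per_5s]
            peak, mean = max(powers), sum(powers) / len(powers)
            return peak, mean, peak / body_mass_kg  # W, W, W/kg

        # Example: a 30-kg child completing 11, 10, 9, 8, 7, 6 revolutions
        print(wingate_power(30.0, [11, 10, 9, 8, 7, 6]))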

  12. A brief compendium of correlations and analytical formulae for the thermal field generated by a heat source embedded in porous and purely-conductive media

    NASA Astrophysics Data System (ADS)

    Conti, P.; Testi, D.; Grassi, W.

    2017-11-01

    This work reviews and compares suitable models for the thermal analysis of forced convection over a heat source in a porous medium. The set of available models refers to an infinite medium in which a fluid moves over three different heat source geometries: the moving infinite line source, the moving finite line source, and the moving infinite cylindrical source. In this perspective, the present work provides a plain and handy compendium of the above-mentioned models for forced external convection in porous media; in addition, we propose a dimensionless analysis to quantify the reciprocal deviations among the available models, helping in the selection of the most suitable one for the specific case of interest. Under specific conditions, the advection term becomes negligible in terms of heat transfer performance, allowing the use of purely conductive models. For that reason, available analytical and numerical solutions for purely conductive media are also reviewed and compared, again by dimensionless criteria, so that one can choose the simplest solution, with significant benefits in terms of computational effort and interpretation of the results. The main outcomes presented in the paper are: the conditions under which the system can be considered subject to a Darcy flow, the minimal distance beyond which the finite dimension of the heat source does not affect the thermal field, and the critical fluid velocity needed to have a significant contribution of the advection term in the overall heat transfer process.
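
    For reference, the moving infinite line source (MILS) mentioned above admits a classical steady-state solution of Carslaw-and-Jaeger type for the temperature rise around a line source of strength q' per unit length in a medium with uniform seepage flow. The Python sketch below evaluates that form; the effective conductivity, diffusivity, and velocity of the saturated porous medium are assumed inputs chosen only for illustration.

        import numpy as np
        from scipy.special import k0  # modified Bessel function, 2nd kind, order 0

        def mils_temperature_rise(x, y, q_per_len, k_eff, alpha_eff, u_eff):
            """Steady-state MILS temperature rise:
                dT = q' / (2*pi*k_eff) * exp(u*x / (2*alpha)) * K0(u*r / (2*alpha))
            with x along the flow direction and r = sqrt(x**2 + y**2)."""
            r = np.hypot(x, y)
            return (q_per_len / (2.0 * np.pi * k_eff)
                    * np.exp(u_eff * x / (2.0 * alpha_eff))
                    * k0(u_eff * r / (2.0 * alpha_eff)))

        # Example: 50 W/m source, observation point 1 m downstream, 1 m cross-stream
        print(mils_temperature_rise(1.0, 1.0, q_per_len=50.0,
                                    k_eff=2.0, alpha_eff=1e-6, u_eff=1e-7))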

  13. Impact of saline water sources on hypertension and cardiovascular disease risk in coastal Bangladesh

    NASA Astrophysics Data System (ADS)

    Butler, Adrian; Hoque, Mohammad; Mathewson, Eleanor; Ahmed, Kazi; Rahman, Moshuir; Vineis, Paolo; Scheelbeek, Pauline

    2016-04-01

    Southern Bangladesh is periodically affected by tropical-cyclone-induced storm surges. Such events can result in the inundation of large areas of the coastal plain by sea water. Over time these episodic influxes of saline water have led to the build-up of high salinities (e.g. > 1,000 mg/l) in the shallow (up to ca. 150 m depth) groundwater. Owing to the highly saline groundwater, local communities have developed alternative surface water sources by constructing artificial drinking water ponds, which collect monsoonal rainwater. These have far greater storage than traditional rainwater harvesting systems, which typically use 40 litre storage containers that are quickly depleted during the dry season. Unfortunately, the ponds can also become salinised during storm surge events, the impacts of which can last for a number of years. A combined hydrological and epidemiological research programme has been undertaken over the past two years to understand the potential health risks associated with these saline water sources, as excessive intake of sodium can lead to hypertension and an increased risk of cardiovascular disease (such as stroke and heart attack). An important aspect of the selected research sites was the variety of drinking water sources available. These included managed aquifer recharge sites where monsoonal rainwater is stored in near-surface (semi-)confined aquifers for abstraction during the dry season. This provided an opportunity to assess the effects of interventions with lower-salinity sources. Adjusting for confounding factors such as age, gender and diet, the results show a significant association between salinity and blood pressure. Furthermore, the results show that such impacts are reversible. In order to evaluate the costs and benefits of such interventions, a water-salinity dose-impact model is being developed to assess the effectiveness of alternative drinking water sources, such as enhanced rainwater harvesting and localised solar distillation, as well as the long-term risks from traditional water sources due to climate change. Preliminary results from the model will be presented showing the relative impacts of these interventions. These highlight the need for an integrated approach to salinity management in such coastal deltas in order to improve the long-term health of local communities living in these areas.
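
    As a minimal sketch of the kind of adjusted association described above (blood pressure regressed on drinking-water salinity while controlling for age and sex), the Python snippet below fits an ordinary least-squares model. The variable names, coefficients, and data are synthetic and purely illustrative; they are not the study's data or results.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200
        salinity = rng.uniform(100.0, 2000.0, n)   # drinking-water salinity, mg/l (synthetic)
        age = rng.uniform(20.0, 70.0, n)
        sex = rng.integers(0, 2, n).astype(float)
        sbp = 100.0 + 0.01 * salinity + 0.5 * age + 3.0 * sex + rng.normal(0.0, 8.0, n)

        # Adjusted linear model: SBP ~ intercept + salinity + age + sex
        X = np.column_stack([np.ones(n), salinity, age, sex])
        coef, *_ = np.linalg.lstsq(X, sbp, rcond=None)
        print("adjusted salinity coefficient (mmHg per mg/l):", coef[1])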

  14. A critical review of principal traffic noise models: Strategies and implications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garg, Naveen, E-mail: ngarg@mail.nplindia.ernet.in; Department of Mechanical, Production and Industrial Engineering, Delhi Technological University, Delhi 110042; Maji, Sagar

    2014-04-01

    The paper presents an exhaustive comparison of the principal traffic noise models adopted in recent years in developed nations. The comparison is drawn on the basis of technical attributes, including source modelling and sound propagation algorithms. Although the characterization of the source in terms of rolling and propulsion noise, in conjunction with advanced numerical methods for sound propagation, has significantly reduced the uncertainty in traffic noise predictions, the approach followed is quite complex and requires specialized mathematical skills, which can be cumbersome for town planners. It is also sometimes difficult to identify the best approach when a variety of solutions have been proposed. This paper critically reviews these aspects of the recent models developed and adapted in some countries, and also discusses the strategies followed and the implications of these models. Highlights: • Principal traffic noise models developed are reviewed. • Sound propagation algorithms used in traffic noise models are compared. • Implications of the models are discussed.
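
    The rolling/propulsion split mentioned above is typically expressed as two speed-dependent emission terms combined energetically, in the general functional form used by CNOSSOS-EU-style models: L_WR = A_R + B_R*log10(v/70) for rolling noise and L_WP = A_P + B_P*(v - 70)/70 for propulsion noise. The Python sketch below implements only this generic form; the coefficient values are placeholders, not the official coefficients of any particular model.

        import math

        def vehicle_source_level(v_kmh, a_r=80.0, b_r=30.0, a_p=95.0, b_p=-1.3,
                                 v_ref=70.0):
            """Combine rolling and propulsion sound power for one light vehicle.
            Placeholder coefficients; only the functional form is illustrated."""
            l_wr = a_r + b_r * math.log10(v_kmh / v_ref)          # rolling noise
            l_wp = a_p + b_p * (v_kmh - v_ref) / v_ref            # propulsion noise
            return 10.0 * math.log10(10 ** (l_wr / 10.0) + 10 ** (l_wp / 10.0))

        for v in (30, 50, 70, 110):
            print(v, "km/h ->", round(vehicle_source_level(v), 1), "dB re 1 pW")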

  15. Characterization of PTO and Idle Behavior for Utility Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duran, Adam W.; Konan, Arnaud M.; Miller, Eric S.

    This report presents the results of analyses performed on utility vehicle data, composed primarily of aerial lift bucket trucks sampled from the National Renewable Energy Laboratory's Fleet DNA database, to characterize power takeoff (PTO) and idle operating behavior for utility trucks. Two major data sources were examined in this study: a 75-vehicle sample of Odyne electric PTO (ePTO)-equipped vehicles drawn from multiple fleets spread across the United States, and 10 conventional PTO-equipped Pacific Gas and Electric fleet vehicles operating in California. Novel data mining approaches were developed to identify PTO and idle operating states for each of the datasets using telematics and controller area network/onboard diagnostics data channels. These methods were applied to the individual datasets and aggregated to develop utilization curves and distributions describing PTO and idle behavior in both absolute and relative operating terms. This report also includes background information on the source vehicles, development of the analysis methodology, and conclusions regarding the study's findings.
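
    The state-identification step described above amounts to labelling each telemetry sample as key-off, idling, PTO-active, or driving from telematics/CAN channels. The Python sketch below shows one simple rule-based labelling pass over assumed channel names (vehicle speed, engine speed, and a PTO flag); it is an illustrative heuristic, not NREL's Fleet DNA methodology.

        from dataclasses import dataclass

        @dataclass
        class Sample:
            vehicle_speed_kmh: float
            engine_rpm: float
            pto_active: bool  # hypothetical CAN/OBD channel

        def classify(s, speed_eps=1.0, rpm_min=300.0):
            """Rule-based operating-state label for one telemetry sample."""
            if s.engine_rpm < rpm_min:
                return "key-off"
            if s.vehicle_speed_kmh > speed_eps:
                return "driving"
            return "pto" if s.pto_active else "idle"

        samples = [Sample(0.0, 700.0, True), Sample(0.0, 650.0, False),
                   Sample(42.0, 1500.0, False)]
        print([classify(s) for s in samples])  # ['pto', 'idle', 'driving']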

  16. MIXOPTIM: A tool for the evaluation and the optimization of the electricity mix in a territory

    NASA Astrophysics Data System (ADS)

    Bonin, Bernard; Safa, Henri; Laureau, Axel; Merle-Lucotte, Elsa; Miss, Joachim; Richet, Yann

    2014-09-01

    This article presents a method for calculating the generation cost of a mixture of electricity sources, based on a Monte Carlo simulation of the production output that accounts for fluctuations in demand and the stochastic availability of the various power sources composing the mix. This evaluation shows that, for a given electricity mix, the cost has a non-linear dependence on the demand level. The second part of the paper addresses the management of intermittency: a method based on spectral decomposition of the imposed power fluctuations is developed to calculate the minimum amount of controllable power sources needed to follow these fluctuations. This can be converted into a viability criterion for the mix, included in the MIXOPTIM software. In the third part of the paper, the MIXOPTIM cost evaluation method is applied to the multi-criteria optimization of the mix according to three main criteria: the cost of the mix; its impact on climate in terms of CO2 production; and the security of supply.
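
    As an illustration of the Monte Carlo approach described, the Python sketch below samples the availability of each source and the demand level, dispatches the available sources in a fixed merit order, and averages the resulting cost per MWh of demand; the source parameters and the shortfall penalty are placeholder assumptions, not MIXOPTIM data.

        import random

        # (name, capacity MW, availability probability, variable cost EUR/MWh); placeholders
        SOURCES = [("nuclear", 3000.0, 0.90, 30.0),
                   ("wind",    2000.0, 0.30, 10.0),
                   ("gas",     2500.0, 0.95, 80.0)]
        SHORTFALL_PENALTY = 3000.0  # EUR/MWh of unserved demand (assumed)

        def simulate_mix_cost(mean_demand_mw=4000.0, sigma_mw=500.0,
                              n_draws=10_000, seed=1):
            """Average generation cost of the mix (EUR per MWh of demand) under
            stochastic demand and source availability, with merit-order dispatch."""
            rng = random.Random(seed)
            total_cost = total_energy = 0.0
            for _ in range(n_draws):
                demand = max(0.0, rng.gauss(mean_demand_mw, sigma_mw))
                remaining = demand
                for _name, cap, p_avail, cost in SOURCES:
                    produced = min(remaining, cap if rng.random() < p_avail else 0.0)
                    total_cost += produced * cost
                    remaining -= produced
                total_cost += remaining * SHORTFALL_PENALTY  # unserved energy
                total_energy += demand
            return total_cost / total_energy

        print(round(simulate_mix_cost(), 2), "EUR/MWh")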

  17. Ultrabright continuously tunable terahertz-wave generation at room temperature

    PubMed Central

    Hayashi, Shin'ichiro; Nawata, Kouji; Taira, Takunori; Shikata, Jun-ichi; Kawase, Kodo; Minamide, Hiroaki

    2014-01-01

    The hottest frequency region in terms of research currently lies in the 'frequency gap' between microwaves and infrared: terahertz waves. Although new methods for generating terahertz radiation have been developed, most sources cannot generate high-brightness terahertz beams. Here we demonstrate the generation of ultrabright terahertz waves (brightness ~0.2 GW/(sr·cm^2), brightness temperature of ~10^18 K, peak power of >50 kW) using parametric wavelength conversion in a nonlinear crystal; this is brighter than many specialized sources such as far-infrared free-electron lasers (~10^16 K, ~2 kW). We revealed novel parametric wavelength conversion using stimulated Raman scattering in LiNbO3, without stimulated Brillouin scattering, using a recently developed microchip laser. Furthermore, nonlinear up-conversion techniques allow the intense terahertz waves to be visualized and their frequency determined. These results are very promising for extending applied research into the terahertz region, and we expect that this source will open up new research fields such as nonlinear optics in the terahertz region. PMID:24898269

  18. Ultrabright continuously tunable terahertz-wave generation at room temperature.

    PubMed

    Hayashi, Shin'ichiro; Nawata, Kouji; Taira, Takunori; Shikata, Jun-ichi; Kawase, Kodo; Minamide, Hiroaki

    2014-06-05

    The hottest frequency region in terms of research currently lies in the 'frequency gap' between microwaves and infrared: terahertz waves. Although new methods for generating terahertz radiation have been developed, most sources cannot generate high-brightness terahertz beams. Here we demonstrate the generation of ultrabright terahertz waves (brightness ~0.2 GW/(sr·cm^2), brightness temperature of ~10^18 K, peak power of >50 kW) using parametric wavelength conversion in a nonlinear crystal; this is brighter than many specialized sources such as far-infrared free-electron lasers (~10^16 K, ~2 kW). We revealed novel parametric wavelength conversion using stimulated Raman scattering in LiNbO3, without stimulated Brillouin scattering, using a recently developed microchip laser. Furthermore, nonlinear up-conversion techniques allow the intense terahertz waves to be visualized and their frequency determined. These results are very promising for extending applied research into the terahertz region, and we expect that this source will open up new research fields such as nonlinear optics in the terahertz region.

  19. A source to deliver mesoscopic particles for laser plasma studies

    NASA Astrophysics Data System (ADS)

    Gopal, R.; Kumar, R.; Anand, M.; Kulkarni, A.; Singh, D. P.; Krishnan, S. R.; Sharma, V.; Krishnamurthy, M.

    2017-02-01

    Intense ultrashort-laser-produced plasmas are a source of high-brightness, short bursts of X-rays, electrons, and high-energy ions. Laser energy absorption and its subsequent dispersal depend strongly on the laser parameters and also on the initial size and shape of the target. The ability to change the shape, size, and material composition of the matter that absorbs light is of paramount importance, not only from a fundamental physics point of view but also for potentially developing laser plasma sources tailored for specific applications. The idea of preparing mesoscopic particles of a desired size/shape and suspending them in vacuum for laser plasma acceleration is a sparsely explored domain. In this report we outline the development of a mechanism for delivering microparticles into an effusive jet in vacuum for laser plasma studies. We characterise the device in terms of particle density, particle size distribution, and duration of operation under conditions suitable for laser plasma studies. We also present the first results of X-ray emission from microcrystals of boric acid, which extends to 100 keV even at relatively mild intensities of 10^16 W/cm^2.

  20. The Linac Coherent Light Source: Recent Developments and Future Plans

    DOE PAGES

    Schoenlein, R. W.; Boutet, S.; Minitti, M. P.; ...

    2017-08-18

    The development of X-ray free-electron lasers (XFELs) has launched a new era in X-ray science by providing ultrafast coherent X-ray pulses with a peak brightness approximately one billion times higher than previous X-ray sources. The Linac Coherent Light Source (LCLS) facility at the SLAC National Accelerator Laboratory, the world's first hard X-ray FEL, has already demonstrated a tremendous scientific impact across broad areas of science. This paper presents a few of the more recent representative highlights from LCLS in the areas of atomic, molecular, and optical science; chemistry; condensed matter physics; matter in extreme conditions; and biology. It also outlines the near-term upgrade (LCLS-II) and the motivating science opportunities for ultrafast X-rays in the 0.25–5 keV range at repetition rates up to 1 MHz. Future plans to extend the X-ray energy reach to beyond 13 keV (<1 Å) at high repetition rate (LCLS-II-HE) are also envisioned, motivated by compelling new science of structural dynamics at the atomic scale.
