Sample records for source terms application

  1. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2014-01-01 2014-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  2. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2012-01-01 2012-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  3. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2010-01-01 2010-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  4. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2013-01-01 2013-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  5. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2011-01-01 2011-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  6. 7 CFR 1942.111 - Applicant eligibility.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... applicant apply to commercial sources for financing. To provide a basis for referral of only those applicants who may be able to finance projects through commercial sources, approval officials should maintain... must determine whether financing from commercial sources at reasonable rates and terms is available. If...

  7. The influence of cross-order terms in interface mobilities for structure-borne sound source characterization

    NASA Astrophysics Data System (ADS)

    Bonhoff, H. A.; Petersson, B. A. T.

    2010-08-01

    For the characterization of structure-borne sound sources with multi-point or continuous interfaces, substantial simplifications and physical insight can be obtained by incorporating the concept of interface mobilities. The applicability of interface mobilities, however, relies upon the admissibility of neglecting the so-called cross-order terms. Hence, the objective of the present paper is to clarify the importance and significance of cross-order terms for the characterization of vibrational sources. From previous studies, four conditions have been identified for which the cross-order terms can become more influential. These are non-circular interface geometries, structures with distinctly differing transfer paths, suppression of the zero-order motion, and cases where the contact forces are either in phase or out of phase. In a theoretical study, these four conditions are investigated regarding the frequency range and magnitude of a possible strengthening of the cross-order terms. For an experimental analysis, two source-receiver installations are selected, suitably designed to obtain strong cross-order terms. The transmitted power and the source descriptors are predicted by the approximations of the interface mobility approach and compared with the complete calculations. Neglecting the cross-order terms can result in large misinterpretations at certain frequencies. On average, however, the cross-order terms are found to be insignificant and can be neglected with good approximation. The general applicability of interface mobilities for structure-borne sound source characterization and the description of the transmission process is thereby confirmed.

  8. A Well-Balanced Path-Integral f-Wave Method for Hyperbolic Problems with Source Terms

    PubMed Central

    2014-01-01

    Systems of hyperbolic partial differential equations with source terms (balance laws) arise in many applications where it is important to compute accurate time-dependent solutions modeling small perturbations of equilibrium solutions in which the source terms balance the hyperbolic part. The f-wave version of the wave-propagation algorithm is one approach, but requires the use of a particular averaged value of the source terms at each cell interface in order to be “well balanced” and exactly maintain steady states. A general approach to choosing this average is developed using the theory of path conservative methods. A scalar advection equation with a decay or growth term is introduced as a model problem for numerical experiments. PMID:24563581
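
    The abstract's model problem, a scalar advection equation with a decay source, makes the role of the interface source average easy to see. Below is a minimal, hedged sketch of an f-wave-style update for q_t + u q_x = -lam*q; the arithmetic interface average of the source used here is an illustrative assumption, whereas the paper's path-integral (path-conservative) choice of that average is what makes the scheme exactly well balanced.

```python
import numpy as np

# Minimal f-wave sketch for the model balance law  q_t + u q_x = -lam * q
# (advection with a decay source), assuming u > 0.  The source average at
# each interface is a plain arithmetic mean; the paper derives a specific
# path-integral average that makes the scheme exactly well balanced.

def fwave_step(q, u, lam, dx, dt):
    qm, qp = q[:-1], q[1:]                    # left/right states at interfaces
    psi = -lam * 0.5 * (qm + qp)              # interface source average (assumed)
    beta = u * (qp - qm) - dx * psi           # f-wave strength at each interface
    qnew = q.copy()
    qnew[1:] -= dt / dx * beta                # u > 0: whole f-wave moves right
    return qnew

# Steady-state test: q(x) = exp(-lam * x / u) should be (nearly) preserved.
u, lam, dx = 1.0, 1.0, 0.01
x = np.arange(0.0, 1.0, dx) + 0.5 * dx
q = np.exp(-lam * x / u)
for _ in range(100):
    q = fwave_step(q, u, lam, dx, 0.5 * dx / u)
print(np.max(np.abs(q - np.exp(-lam * x / u))))
```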

  9. 45 CFR 144.204 - Applicability of regulations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....204 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS REQUIREMENTS RELATING TO HEALTH INSURANCE COVERAGE Source: § 144.204 Applicability of regulations. The... qualified long-term care insurance policies to individuals under a qualified State long-term care insurance...

  10. Photovoltaic highway applications: Assessment of the near-term market

    NASA Technical Reports Server (NTRS)

    Rosenblum, L.; Scudder, L. R.; Bifano, W. J.; Poley, W. A.

    1977-01-01

    A preliminary assessment of the near-term market for photovoltaic highway applications is presented. Among the potential users, two market sectors are considered: government and commercial. Within these sectors, two possible application areas, signs and motorist aids, are discussed. Based on judgemental information, obtained by a brief survey of representatives of the two user sectors, the government sector appears more amenable to the introduction of photovoltaic power sources for highway applications in the near-term. However, considerable interest and potential opportunities were also found to exist in the commercial sector. Further studies to quantify the market for highway applications appear warranted.

  11. MUFFSgenMC: An Open Source MUon Flexible Framework for Spectral GENeration for Monte Carlo Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatzidakis, Stylianos; Greulich, Christopher

    A cosmic ray Muon Flexible Framework for Spectral GENeration for Monte Carlo Applications (MUFFSgenMC) has been developed to support state-of-the-art cosmic ray muon tomographic applications. The flexible framework allows for easy and fast creation of source terms for popular Monte Carlo applications like GEANT4 and MCNP. This code framework simplifies the process of simulations used for cosmic ray muon tomography.
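
    MUFFSgenMC's internals are not reproduced in this record, so the following is only a toy illustration of the general workflow it automates: draw muon energies and zenith angles from simplified sea-level distributions (a cos^2(theta) zenith law and a single power-law spectrum, both assumptions here) and collect them as source particles that could then be written out in whatever source format the chosen transport code expects.

```python
import numpy as np

# Toy generator of cosmic-ray muon source particles for a Monte Carlo code.
# Assumptions (not MUFFSgenMC's actual models): zenith angles follow the
# common sea-level cos^2(theta) law and energies follow a single power law
# E^-2.7 above 1 GeV.
rng = np.random.default_rng(0)

def sample_zenith(n):
    """Rejection-sample zenith angles (rad) from pdf ~ cos^2(theta)."""
    out = []
    while len(out) < n:
        theta = rng.uniform(0.0, np.pi / 2, n)
        keep = rng.uniform(0.0, 1.0, n) < np.cos(theta) ** 2
        out.extend(theta[keep].tolist())
    return np.array(out[:n])

def sample_energy(n, gamma=2.7, e_min=1.0):
    """Inverse-CDF sample energies (GeV) from pdf ~ E^-gamma, E >= e_min."""
    u = rng.uniform(0.0, 1.0, n)
    return e_min * (1.0 - u) ** (-1.0 / (gamma - 1.0))

# Each row (energy, zenith, azimuth) could then be exported in the source
# format expected by the transport code (GEANT4, MCNP, ...).
particles = np.column_stack([sample_energy(1000),
                             sample_zenith(1000),
                             rng.uniform(0.0, 2 * np.pi, 1000)])
print(particles[:3])
```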

  12. Building Assessment Survey and Evaluation Study: Summarized Data - Test Space Pollutant Sources

    EPA Pesticide Factsheets

    information collected regarding sources that may have a potential impact on the building in terms of indoor air quality, including sources such as past or current water damage, pesticide application practices, special use spaces, etc.

  13. 7 CFR 4284.510 - Application processing.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... to be provided, the expected impacts of that assistance, the sustainability of cooperative..., applicants should document future funding sources that will help achieve long-term sustainability of the...

  14. 7 CFR 4284.510 - Application processing.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... to be provided, the expected impacts of that assistance, the sustainability of cooperative..., applicants should document future funding sources that will help achieve long-term sustainability of the...

  15. 7 CFR 4284.510 - Application processing.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... to be provided, the expected impacts of that assistance, the sustainability of cooperative..., applicants should document future funding sources that will help achieve long-term sustainability of the...

  16. 7 CFR 4284.510 - Application processing.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... to be provided, the expected impacts of that assistance, the sustainability of cooperative..., applicants should document future funding sources that will help achieve long-term sustainability of the...

  17. Algorithm Development and Application of High Order Numerical Methods for Shocked and Rapid Changing Solutions

    DTIC Science & Technology

    2007-12-06

    high order well-balanced schemes to a class of hyperbolic systems with source terms, Boletin de la Sociedad Espanola de Matematica Aplicada, v34 (2006), pp. 69-80. 39. Y. Xu and C.-W

  18. Physical/chemical closed-loop water-recycling

    NASA Technical Reports Server (NTRS)

    Herrmann, Cal C.; Wydeven, Theodore

    1991-01-01

    Water needs, water sources, and means for recycling water are examined in terms appropriate to the water quality requirements of a small crew and spacecraft intended for long duration exploration missions. Inorganic, organic, and biological hazards are estimated for waste water sources. Sensitivities to these hazards for human uses are estimated. The water recycling processes considered are humidity condensation, carbon dioxide reduction, waste oxidation, distillation, reverse osmosis, pervaporation, electrodialysis, ion exchange, carbon sorption, and electrochemical oxidation. Limitations and applications of these processes are evaluated in terms of water quality objectives. Computerized simulation of some of these chemical processes is examined. Recommendations are made for development of new water recycling technology and improvement of existing technology for near term application to life support systems for humans in space. The technological developments are equally applicable to water needs on Earth, in regions where extensive water recycling is needed or where advanced water treatment is essential to meet EPA health standards.

  19. 77 FR 7568 - Freeport LNG Expansion, L.P. and FLNG Liquefaction, LLC; Application for Long-Term Authorization...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-13

    ... asserts that while Europe receives pipeline gas from various sources, the long supply chains and relative..., LLC; Application for Long-Term Authorization To Export Domestically Produced Liquefied Natural Gas to... authorization to export domestically produced liquefied natural gas (LNG) in an amount up to the equivalent of...

  20. [Effects of long-term fertilization on microbial biomass carbon and nitrogen and on carbon source utilization of microbes in a red soil].

    PubMed

    Sun, Feng-xia; Zhang, Wei-hua; Xu, Ming-gang; Zhang, Wen-ju; Li, Zhao-qiang; Zhang, Jing-ye

    2010-11-01

    In order to explore the effects of long-term fertilization on the microbiological characters of red soil, soil samples were collected from a 19-year long-term experimental field in Qiyang of Hunan, with their microbial biomass carbon (MBC) and nitrogen (MBN) and microbial utilization ratio of carbon sources analyzed. The results showed that after 19-year fertilization, the soil MBC and MBN under the application of organic manure and of organic manure plus inorganic fertilizers were 231 and 81 mg x kg(-1) soil, and 148 and 73 mg x kg(-1) soil, respectively, being significantly higher than those under non-fertilization, inorganic fertilization, and inorganic fertilization plus straw incorporation. The ratio of soil MBN to total N under the application of organic manure and of organic manure plus inorganic fertilizers was averagely 6.0%, significantly higher than that under non-fertilization and inorganic fertilization. Biolog-ECO analysis showed that the average well color development (AWCD) value was in the order of applying organic manure plus inorganic fertilizers = applying organic manure > non-fertilization > inorganic fertilization = inorganic fertilization plus straw incorporation. Under the application of organic manure or of organic manure plus inorganic fertilizers, the microbial utilization rate of carbon sources, including carbohydrates, carboxylic acids, amino acids, polymers, phenols, and amines increased; while under inorganic fertilization plus straw incorporation, the utilization rate of polymers was the highest, and that of carbohydrates was the lowest. Our results suggested that long-term application of organic manure could increase the red soil MBC, MBN, and microbial utilization rate of carbon sources, improve soil fertility, and maintain a better crop productivity.

  1. 8 CFR 210.1 - Definition of terms used in this part.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... processed together with an alien's photographs, fingerprints and signature, this form becomes the source... making a determination thereon. If fraud, willful misrepresentation of a material fact, a false writing... misrepresented facts in the application process. (m) Preliminary application. A preliminary application is...

  2. 8 CFR 210.1 - Definition of terms used in this part.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... processed together with an alien's photographs, fingerprints and signature, this form becomes the source... making a determination thereon. If fraud, willful misrepresentation of a material fact, a false writing... misrepresented facts in the application process. (m) Preliminary application. A preliminary application is...

  3. 8 CFR 210.1 - Definition of terms used in this part.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... processed together with an alien's photographs, fingerprints and signature, this form becomes the source... making a determination thereon. If fraud, willful misrepresentation of a material fact, a false writing... misrepresented facts in the application process. (m) Preliminary application. A preliminary application is...

  4. 8 CFR 210.1 - Definition of terms used in this part.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... processed together with an alien's photographs, fingerprints and signature, this form becomes the source... making a determination thereon. If fraud, willful misrepresentation of a material fact, a false writing... misrepresented facts in the application process. (m) Preliminary application. A preliminary application is...

  5. A well-balanced scheme for Ten-Moment Gaussian closure equations with source term

    NASA Astrophysics Data System (ADS)

    Meena, Asha Kumari; Kumar, Harish

    2018-02-01

    In this article, we consider the Ten-Moment equations with source term, which occurs in many applications related to plasma flows. We present a well-balanced second-order finite volume scheme. The scheme is well-balanced for general equation of state, provided we can write the hydrostatic solution as a function of the space variables. This is achieved by combining hydrostatic reconstruction with contact preserving, consistent numerical flux, and appropriate source discretization. Several numerical experiments are presented to demonstrate the well-balanced property and resulting accuracy of the proposed scheme.

  6. Innovative ceramic slab lasers for high power laser applications

    NASA Astrophysics Data System (ADS)

    Lapucci, Antonio; Ciofini, Marco

    2005-09-01

    Diode Pumped Solid State Lasers (DPSSL) are gaining increasing interest for high power industrial applications, given the continuous improvement in high power diode laser technology reliability and affordability. These sources open new windows in the parameter space for traditional applications such as cutting, welding, marking and engraving of high reflectance metallic materials. Other interesting applications for this kind of source include high speed thermal printing, precision drilling, selective soldering and thin film etching. In this paper we examine the most important DPSS laser source types for industrial applications and we describe in detail the performance of some slab laser configurations investigated at our facilities. The different architectures' advantages and drawbacks are briefly compared in terms of performance, system complexity and ease of scalability to the multi-kW level.

  7. Physical/chemical closed-loop water-recycling for long-duration missions

    NASA Technical Reports Server (NTRS)

    Herrmann, Cal C.; Wydeven, Ted

    1990-01-01

    Water needs, water sources, and means for recycling water are examined in terms appropriate to the water quality requirements of a small crew and spacecraft intended for long duration exploration missions. Inorganic, organic, and biological hazards are estimated for waste water sources. Sensitivities to these hazards for human uses are estimated. The water recycling processes considered are humidity condensation, carbon dioxide reduction, waste oxidation, distillation, reverse osmosis, pervaporation, electrodialysis, ion exchange, carbon sorption, and electrochemical oxidation. Limitations and applications of these processes are evaluated in terms of water quality objectives. Computerized simulation of some of these chemical processes is examined. Recommendations are made for development of new water recycling technology and improvement of existing technology for near term application to life support systems for humans in space. The technological developments are equally applicable to water needs on Earth, in regions where extensive water recycling is needed or where advanced water treatment is essential to meet EPA health standards.

  8. Solution of the equation of heat conduction with time dependent sources: Programmed application to planetary thermal history

    NASA Technical Reports Server (NTRS)

    Conel, J. E.

    1975-01-01

    A computer program (Program SPHERE) solving the inhomogeneous equation of heat conduction with a radiation boundary condition on a thermally homogeneous sphere is described. The source terms are taken to be exponential functions of time. Thermal properties are independent of temperature. The solutions are appropriate to studying certain classes of planetary thermal history. Special application to the Moon is discussed.
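
    As a rough, self-contained sketch of the kind of problem Program SPHERE solves, the snippet below integrates radial heat conduction in a homogeneous sphere with a heat source decaying exponentially in time. The material constants are placeholders, and the radiation boundary condition of the report is replaced by a fixed surface temperature for brevity.

```python
import numpy as np

# Minimal sketch of Program SPHERE's setting: radial heat conduction in a
# homogeneous sphere with a heat source that decays exponentially in time,
# H(t) = H0 * exp(-lam * t).  For brevity the surface is held at a fixed
# temperature instead of the radiation boundary condition used in the report.
R, N = 1.7e6, 100                 # radius (m) and number of radial nodes (assumed)
kappa = 1.0e-6                    # thermal diffusivity (m^2/s), assumed
H0, lam = 1.0e-13, 1.0e-17        # source amplitude (K/s) and decay constant (1/s), assumed
r = np.linspace(0.0, R, N)
dr = r[1] - r[0]
dt = 0.2 * dr**2 / kappa          # explicit stability limit
T = np.zeros(N)                   # initial temperature (relative to the surface)

def step(T, t):
    lap = np.zeros_like(T)
    # spherical Laplacian (1/r^2) d/dr (r^2 dT/dr) at interior nodes
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dr**2 \
              + (2.0 / r[1:-1]) * (T[2:] - T[:-2]) / (2 * dr)
    Tn = T + dt * (kappa * lap + H0 * np.exp(-lam * t))
    Tn[0] = Tn[1]                 # symmetry at the center
    Tn[-1] = 0.0                  # fixed surface temperature (simplification)
    return Tn

t = 0.0
for _ in range(1000):
    T = step(T, t)
    t += dt
print(T[0])                       # central temperature after the integration
```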

  9. A Semi-implicit Treatment of Porous Media in Steady-State CFD.

    PubMed

    Domaingo, Andreas; Langmayr, Daniel; Somogyi, Bence; Almbauer, Raimund

    There are many situations in computational fluid dynamics which require the definition of source terms in the Navier-Stokes equations. These source terms not only allow modelling of the physics of interest but also have a strong impact on the reliability, stability, and convergence of the numerics involved. Therefore, sophisticated numerical approaches exist for the description of such source terms. In this paper, we focus on the source terms present in the Navier-Stokes or Euler equations due to porous media, in particular the Darcy-Forchheimer equation. We introduce a method for the numerical treatment of the source term which is independent of the spatial discretization and based on linearization. In this description, the source term is treated in a fully implicit way whereas the other flow variables can be computed in an implicit or explicit manner. This leads to a more robust description in comparison with a fully explicit approach. The method is well suited to be combined with coarse-grid CFD on Cartesian grids, which makes it especially favorable for the accelerated solution of coupled 1D-3D problems. To demonstrate the applicability and robustness of the proposed method, a proof-of-concept example in 1D, as well as more complex examples in 2D and 3D, is presented.
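
    The core idea, linearizing the Darcy-Forchheimer sink about the current state and treating it implicitly, can be shown on a single velocity component. The coefficients and forcing below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Sketch of a semi-implicit (linearized) update for a Darcy-Forchheimer
# momentum sink S(u) = -(a*u + b*u*|u|), contrasted with a fully explicit one.
# The coefficients a, b and the constant driving force F are illustrative.
a, b, F = 50.0, 10.0, 100.0
dt, n_steps = 1.0e-2, 500

def source(u):
    return -(a * u + b * u * abs(u))

def dsource_du(u):
    return -(a + 2.0 * b * abs(u))

u_expl, u_impl = 0.0, 0.0
for _ in range(n_steps):
    # explicit: can overshoot or oscillate when the coefficients are stiff
    u_expl = u_expl + dt * (F + source(u_expl))
    # semi-implicit: linearize S about the current state and solve for the increment
    du = dt * (F + source(u_impl)) / (1.0 - dt * dsource_du(u_impl))
    u_impl = u_impl + du

print(u_expl, u_impl)   # both should approach the root of F + S(u) = 0
```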

  10. Next Generation Emission Measurements for Fugitive, Area Source, and Fence Line Applications?

    EPA Science Inventory

    Next generation emissions measurements (NGEM) is an EPA term for the rapidly advancing field of air pollutant sensor technologies, data integration concepts, and associated geospatial modeling strategies for source emissions measurements. Ranging from low-cost sensors to satelli...

  11. Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2005-01-01

    A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparing with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared to the two-dimensional plate using a steady mass flow boundary condition, which was used to simulate a steady micro jet. The model was also compared to two three-dimensional flat plate cases using a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
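
    The bookkeeping implied by the abstract, converting a jet's mass flow into per-cell continuity and momentum sources, is simple to sketch; the numbers below are illustrative and the actual OVERFLOW source-term implementation is not reproduced.

```python
# Rough sketch of turning a steady micro-jet's mass flow into per-cell source
# terms for the continuity and momentum equations.  Values are illustrative;
# the actual OVERFLOW source-term model is not reproduced here.
rho_jet = 1.2          # jet density, kg/m^3
v_jet = 200.0          # jet exit velocity, m/s
area_jet = 1.0e-6      # jet orifice area, m^2
cell_volume = 8.0e-9   # volume of the grid cell receiving the source, m^3

mass_flow = rho_jet * v_jet * area_jet          # kg/s injected by the jet
s_mass = mass_flow / cell_volume                # continuity source, kg/(m^3 s)
s_momentum = mass_flow * v_jet / cell_volume    # momentum source, kg/(m^2 s^2)

print(s_mass, s_momentum)
```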

  12. REVIEW OF METHODS FOR REMOTE SENSING OF ATMOSPHERIC EMISSIONS FROM STATIONARY SOURCES

    EPA Science Inventory

    The report reviews the commercially available and developing technologies for the application of remote sensing to the measurement of source emissions. The term 'remote sensing technology', as applied in the report, means the detection or concentration measurement of trace atmosp...

  13. Term amniotic fluid: an unexploited reserve of mesenchymal stromal cells for reprogramming and potential cell therapy applications.

    PubMed

    Moraghebi, Roksana; Kirkeby, Agnete; Chaves, Patricia; Rönn, Roger E; Sitnicka, Ewa; Parmar, Malin; Larsson, Marcus; Herbst, Andreas; Woods, Niels-Bjarne

    2017-08-25

    Mesenchymal stromal cells (MSCs) are currently being evaluated in numerous pre-clinical and clinical cell-based therapy studies. Furthermore, there is an increasing interest in exploring alternative uses of these cells in disease modelling, pharmaceutical screening, and regenerative medicine by applying reprogramming technologies. However, the limited availability of MSCs from various sources restricts their use. Term amniotic fluid has been proposed as an alternative source of MSCs. Previously, only low volumes of term fluid and its cellular constituents have been collected, and current knowledge of the MSCs derived from this fluid is limited. In this study, we collected amniotic fluid at term using a novel collection system and evaluated amniotic fluid MSC content and their characteristics, including their feasibility to undergo cellular reprogramming. Amniotic fluid was collected at term caesarean section deliveries using a closed catheter-based system. Following fluid processing, amniotic fluid was assessed for cellularity, MSC frequency, in-vitro proliferation, surface phenotype, differentiation, and gene expression characteristics. Cells were also reprogrammed to the pluripotent stem cell state and differentiated towards neural and haematopoietic lineages. The average volume of term amniotic fluid collected was approximately 0.4 litres per donor, containing an average of 7 million viable mononuclear cells per litre, and a CFU-F content of 15 per 100,000 MNCs. Expanded CFU-F cultures showed similar surface phenotype, differentiation potential, and gene expression characteristics to MSCs isolated from traditional sources, and showed extensive expansion potential and rapid doubling times. Given the high proliferation rates of these neonatal source cells, we assessed them in a reprogramming application, where the derived induced pluripotent stem cells showed multigerm layer lineage differentiation potential. The potentially large donor base from caesarean section deliveries, the high yield of term amniotic fluid MSCs obtainable, the properties of the MSCs identified, and the suitability of the cells to be reprogrammed into the pluripotent state demonstrated these cells to be a promising and plentiful resource for further evaluation in bio-banking, cell therapy, disease modelling, and regenerative medicine applications.

  14. Bayesian Inference for Source Term Estimation: Application to the International Monitoring System Radionuclide Network

    DTIC Science & Technology

    2014-10-01

    of accuracy and precision), compared with the simpler measurement model that does not use multipliers. Significance for defence...(3) Bayesian experimental design for receptor placement in order to maximize the expected information in the measured concentration data for...applications of the Bayesian inferential methodology for source reconstruction have used high-quality concentration data from well-designed atmospheric

  15. Developing Design Criteria and Scale Up Methods for Water-Stable Metal-Organic Frameworks for Adsorption Applications

    DTIC Science & Technology

    2014-09-08

    Figure 1.4: Number of publications containing the term “metal-organic frameworks” (Source: ISI Web of Science, retrieved April 14th, 2014)...recorded with a PerkinElmer Spectrum One 10 in the range 400-4000 cm^-1. To record the IR spectrum, an IR beam is passed through the sample (in

  16. Amnion-derived stem cells: in quest of clinical applications

    PubMed Central

    2011-01-01

    In the promising field of regenerative medicine, human perinatal stem cells are of great interest as potential stem cells with clinical applications. Perinatal stem cells could be isolated from normally discarded human placentae, which are an ideal cell source in terms of availability, fewer ethical concerns, less DNA damage, and so on. Numerous studies have demonstrated that some of the placenta-derived cells possess stem cell characteristics like pluripotent differentiation ability, particularly in amniotic epithelial (AE) cells. Term human amniotic epithelium contains a relatively large number of stem cell marker-positive cells as an adult stem cell source. In this review, we introduce a model theory of why so many AE cells possess stem cell characteristics. We also describe previous work concerning the therapeutic applications and discuss the pluripotency of the AE cells and potential pitfalls for amnion-derived stem cell research. PMID:21596003

  17. Ultra-Sensitive Elemental Analysis Using Plasmas 4. Application of Inductively Coupled Plasma Mass Spectrometry to the Study of Environmental Radioactivity

    NASA Astrophysics Data System (ADS)

    Yoshida, Satoshi

    Applications of inductively coupled plasma mass spectrometry (ICP-MS) to the determination of long-lived radionuclides in environmental samples are summarized. In order to predict the long-term behavior of the radionuclides, related stable elements were also determined. Compared with radioactivity measurements, the ICP-MS method has advantages in terms of its simple analytical procedures, prompt measurement time, and capability of determining isotope ratios such as 240Pu/239Pu, which cannot be resolved by radioactivity measurements. Concentrations of U and Th in Japanese surface soils were determined in order to establish the background level of the natural radionuclides. The 235U/238U ratio was successfully used to detect the release of enriched U from reconversion facilities to the environment and to understand the source term. The 240Pu/239Pu ratios in environmental samples varied widely depending on the Pu sources. Applications of ICP-MS to the measurement of I and Tc isotopes are also described.

  18. Atmospheric Tracer Inverse Modeling Using Markov Chain Monte Carlo (MCMC)

    NASA Astrophysics Data System (ADS)

    Kasibhatla, P.

    2004-12-01

    In recent years, there has been an increasing emphasis on the use of Bayesian statistical estimation techniques to characterize the temporal and spatial variability of atmospheric trace gas sources and sinks. The applications have been varied in terms of the particular species of interest, as well as in terms of the spatial and temporal resolution of the estimated fluxes. However, one common characteristic has been the use of relatively simple statistical models for describing the measurement and chemical transport model error statistics and prior source statistics. For example, multivariate normal probability distribution functions (pdfs) are commonly used to model these quantities and inverse source estimates are derived for fixed values of pdf parameters. While the advantage of this approach is that closed form analytical solutions for the a posteriori pdfs of interest are available, it is worth exploring Bayesian analysis approaches which allow for a more general treatment of error and prior source statistics. Here, we present an application of the Markov Chain Monte Carlo (MCMC) methodology to an atmospheric tracer inversion problem to demonstrate how more general statistical models for errors can be incorporated into the analysis in a relatively straightforward manner. The MCMC approach to Bayesian analysis, which has found wide application in a variety of fields, is a statistical simulation approach that involves computing moments of interest of the a posteriori pdf by efficiently sampling this pdf. The specific inverse problem that we focus on is the annual mean CO2 source/sink estimation problem considered by the TransCom3 project. TransCom3 was a collaborative effort involving various modeling groups and followed a common modeling and analysis protocol. As such, this problem provides a convenient case study to demonstrate the applicability of the MCMC methodology to atmospheric tracer source/sink estimation problems.
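
    A minimal sketch of the MCMC idea applied to a linear tracer inversion is shown below: a random-walk Metropolis sampler draws from the posterior of the source vector given synthetic observations. The dimensions, transport matrix, and noise level are invented for illustration; TransCom3 uses transport-model responses and real data.

```python
import numpy as np

# Toy Metropolis sampler for a linear tracer inversion y = H x + noise with a
# Gaussian prior on the sources x.  Dimensions, H, and the noise level are all
# made up for illustration.
rng = np.random.default_rng(1)
n_obs, n_src = 20, 3
H = rng.normal(size=(n_obs, n_src))
x_true = np.array([2.0, -1.0, 0.5])
sigma_obs, sigma_prior = 0.3, 5.0
y = H @ x_true + rng.normal(scale=sigma_obs, size=n_obs)

def log_post(x):
    resid = y - H @ x
    return -0.5 * (resid @ resid) / sigma_obs**2 - 0.5 * (x @ x) / sigma_prior**2

x = np.zeros(n_src)
samples = []
lp = log_post(x)
for _ in range(20000):
    prop = x + rng.normal(scale=0.05, size=n_src)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        x, lp = prop, lp_prop
    samples.append(x)
samples = np.array(samples[5000:])             # drop burn-in
print(samples.mean(axis=0), samples.std(axis=0))
```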

  19. The pyroelectric properties of TGS for application in infrared detection

    NASA Technical Reports Server (NTRS)

    Kroes, R. L.; Reiss, D.

    1981-01-01

    The pyroelectric property of triglycine sulfate and its application in the detection of infrared radiation are described. The detectivities of pyroelectric detectors and other types of infrared detectors are compared. The thermal response of a pyroelectric detector element and the resulting electrical response are derived in terms of the material parameters. The noise sources which limit the sensitivity of pyroelectric detectors are described, and the noise equivalent power for each noise source is given as a function of frequency and detector area.

  20. Bayesian source term determination with unknown covariance of measurements

    NASA Astrophysics Data System (ADS)

    Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav

    2017-04-01

    Determination of the source term of a release of hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimating the source term in the conventional linear inverse problem, y = Mx, where the relationship to the vector of observations y is described using the source-receptor-sensitivity (SRS) matrix M and the unknown source term x. Since the system is typically ill-conditioned, the problem is recast as an optimization problem: minimize (y - Mx)^T R^{-1} (y - Mx) + x^T B^{-1} x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization takes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix of the likelihood, R, is also unknown. We consider two potential choices of the structure of the matrix R. The first is a diagonal matrix and the second is a locally correlated structure using information on the topology of the measuring network. Since the inference of the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated on an application of the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
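
    For fixed covariances R and B, the cost function above is minimized by the standard regularized least-squares estimate x_hat = (M^T R^-1 M + B^-1)^-1 M^T R^-1 y. The sketch below shows that fixed-covariance step on synthetic data; the paper's contribution is to infer R and B as well via variational Bayes, which is not reproduced here.

```python
import numpy as np

# Regularized least-squares source estimate for fixed covariances R and B:
#     x_hat = (M^T R^-1 M + B^-1)^-1 M^T R^-1 y.
# The SRS matrix M and data y below are synthetic stand-ins.
rng = np.random.default_rng(2)
n_obs, n_src = 50, 10
M = rng.normal(size=(n_obs, n_src))              # stand-in for the SRS matrix
x_true = np.maximum(rng.normal(size=n_src), 0.0)
y = M @ x_true + rng.normal(scale=0.1, size=n_obs)

R = 0.1**2 * np.eye(n_obs)                       # measurement-error covariance
B = 1.0 * np.eye(n_src)                          # prior (regularization) covariance
Rinv, Binv = np.linalg.inv(R), np.linalg.inv(B)
x_hat = np.linalg.solve(M.T @ Rinv @ M + Binv, M.T @ Rinv @ y)
print(np.round(x_hat, 2))
print(np.round(x_true, 2))
```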

  1. REMEDIATION FLUID RECYCLING - APPLICATION OF PERVAPORATION TECHNOLOGY TO MATERIAL RECOVERY AND REUSE

    EPA Science Inventory

    In an effort to aggressively remove NAPL source areas, agents such as surfactants and alcohols have been added to in situ flushing systems to enhance the solubility of the NAPL components. Such an approach has the potential to reduce the risk posed by a long term source of ground...

  2. REMEDIATION FLUID RECYCLING: APPLICATION OF PERVAPORATION TECHNOLOGY TO MATERIAL RECOVERY AND REUSE

    EPA Science Inventory

    In an effort to aggressively remove NAPL source areas, agents such as surfactants and alcohols have been added to in situ flushing systems to enhance the solubility of the NAPL components. Such an approach has the potential to reduce the risk posed by a long term source of groun...

  3. Water infiltration and surface soil structural properties as influenced by animal traffic in the Southern Piedmont USA

    USDA-ARS?s Scientific Manuscript database

    Surface-soil structural condition in long-term perennial pastures is expected to be modified by how forage is (a) harvested through haying or grazing and (b) stimulated through source of nutrient application. We determined the effects of harvest management and nutrient source on macropore filling, ...

  4. REMEDIATION FLUID RECYCLING - APPLICATION OF PERVAPORATION TECHNOLOGY TO MATERIAL RECOVERY AND REUSEI

    EPA Science Inventory

    In an effort to aggressively remove NAPL source areas, agents such as surfactants and alcohols have been added to in situ flushing systems to enhance the solubility of the NAPL components. Such an approach has the potential to reduce the risk posed by a long term source of ground...

  5. 40 CFR 35.502 - Definitions of terms.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... in preparing a grant application. Tribal Environmental Agreement (TEA). A dynamic, strategic planning... program sources. Planning target. The amount of funds that the Regional Administrator suggests a grant...

  6. Photovoltaic village power application: Assessment of the near-term market

    NASA Technical Reports Server (NTRS)

    Rosenblum, L.; Bifano, W. J.; Poley, W. A.; Scudder, L. R.

    1978-01-01

    The village power application represents a potential market for photovoltaics. The price of energy for photovoltaic systems was compared to that of utility line extensions and diesel generators. The potential domestic demand was defined in both the government and commercial sectors. The foreign demand and sources of funding for village power systems in the developing countries were also discussed briefly. It was concluded that a near-term domestic market of at least 12 MW min and a foreign market of about 10 GW exist.

  7. Photovoltaic water pumping applications: Assessment of the near-term market

    NASA Technical Reports Server (NTRS)

    Rosenblum, L.; Bifano, W. J.; Scudder, L. R.; Poley, W. A.; Cusick, J. P.

    1978-01-01

    Water pumping applications represent a potential market for photovoltaics. The price of energy for photovoltaic systems was compared to that of utility line extensions and diesel generators. The potential domestic demand was defined in the government, commercial/institutional and public sectors. The foreign demand and sources of funding for water pumping systems in the developing countries were also discussed briefly. It was concluded that a near term domestic market of at least 240 megawatts and a foreign market of about 6 gigawatts exist.

  8. Monitor Network Traffic with Packet Capture (pcap) on an Android Device

    DTIC Science & Technology

    2015-09-01

    administrative privileges. Under the current design Android development requirement, an Android Graphical User Interface (GUI) application cannot directly...build an Android application to monitor network traffic using open source packet capture (pcap) libraries. 15. SUBJECT TERMS ELIDe, Android, pcap 16...Building Application with Native Codes 5 8.1 Calling Native Codes Using JNI 5 8.2 Calling Native Codes from an Android Application 8 9. Retrieve Live

  9. Numerical modeling of materials processing applications of a pulsed cold cathode electron gun

    NASA Astrophysics Data System (ADS)

    Etcheverry, J. I.; Martínez, O. E.; Mingolo, N.

    1998-04-01

    A numerical study of the application of a pulsed cold cathode electron gun to materials processing is performed. A simple semiempirical model of the discharge is used, together with backscattering and energy deposition profiles obtained by a Monte Carlo technique, in order to evaluate the energy source term inside the material. The numerical computation of the heat equation with the calculated source term is performed in order to obtain useful information on melting and vaporization thresholds, melted radius and depth, and on the dependence of these variables on processing parameters such as operating pressure, initial voltage of the discharge and cathode-sample distance. Numerical results for stainless steel are presented, which demonstrate the need for several modifications of the experimental design in order to achieve a better efficiency.

  10. The Application of Function Points to Predict Source Lines of Code for Software Development

    DTIC Science & Technology

    1992-09-01

    there are some disadvantages. Software estimating tools are expensive. A single tool may cost more than $15,000 due to the high market value of the...term and Lang variables simultaneously only added marginal improvements over models with these terms included singularly. Using all the available

  11. The exact calculation of quadrupole sources for some incompressible flows

    NASA Technical Reports Server (NTRS)

    Brentner, Kenneth S.

    1988-01-01

    This paper is concerned with the application of the acoustic analogy of Lighthill to the acoustic and aerodynamic problems associated with moving bodies. The Ffowcs Williams-Hawkings equation, which is an interpretation of the acoustic analogy for sound generation by moving bodies, manipulates the source terms into surface and volume sources. Quite often in practice the volume sources, or quadrupoles, are neglected for various reasons. Recently, Farassat, Long and others have attempted to use the FW-H equation with the quadrupole source and neglected to solve for the surface pressure on the body. The purpose of this paper is to examine the contribution of the quadrupole source to the acoustic pressure and body surface pressure for some problems for which the exact solution is known. The inviscid, incompressible, 2-D flow, calculated using the velocity potential, is used to calculate the individual contributions of the various surface and volume source terms in the FW-H equation. The relative importance of each of the sources is then assessed.

  12. 13 CFR 120.101 - Credit not available elsewhere.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... available on reasonable terms from non-Federal sources. SBA requires the Lender or CDC to certify or... periods of time. Submission of an application to SBA by a Lender or CDC constitutes certification by the Lender or CDC that it has examined the availability of credit to the applicant, has based its...

  13. 13 CFR 120.101 - Credit not available elsewhere.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... available on reasonable terms from non-Federal sources. SBA requires the Lender or CDC to certify or... periods of time. Submission of an application to SBA by a Lender or CDC constitutes certification by the Lender or CDC that it has examined the availability of credit to the applicant, has based its...

  14. 13 CFR 120.101 - Credit not available elsewhere.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... available on reasonable terms from non-Federal sources. SBA requires the Lender or CDC to certify or... periods of time. Submission of an application to SBA by a Lender or CDC constitutes certification by the Lender or CDC that it has examined the availability of credit to the applicant, has based its...

  15. 13 CFR 120.101 - Credit not available elsewhere.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... available on reasonable terms from non-Federal sources. SBA requires the Lender or CDC to certify or... periods of time. Submission of an application to SBA by a Lender or CDC constitutes certification by the Lender or CDC that it has examined the availability of credit to the applicant, has based its...

  16. 13 CFR 120.101 - Credit not available elsewhere.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... available on reasonable terms from non-Federal sources. SBA requires the Lender or CDC to certify or... periods of time. Submission of an application to SBA by a Lender or CDC constitutes certification by the Lender or CDC that it has examined the availability of credit to the applicant, has based its...

  17. Evaluation of actuator energy storage and power sources for spacecraft applications

    NASA Technical Reports Server (NTRS)

    Simon, William E.; Young, Fred M.

    1993-01-01

    The objective of this evaluation is to determine an optimum energy storage/power source combination for electrical actuation systems for existing (Solid Rocket Booster (SRB), Shuttle) and future (Advanced Launch System (ALS), Shuttle Derivative) vehicles. Characteristic of these applications is the requirement for high power pulses (50-200 kW) for short times (milliseconds to seconds), coupled with longer-term base or 'housekeeping' requirements (5-16 kW). Specific study parameters (e.g., weight, volume, etc.) as stated in the proposal and specified in the Statement of Work (SOW) are included.

  18. 40 CFR 142.307 - What terms and conditions must be included in a small system variance?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... improvements to comply with the small system variance technology, secure an alternative source of water, or... included in a small system variance? 142.307 Section 142.307 Protection of Environment ENVIRONMENTAL... IMPLEMENTATION Variances for Small System Review of Small System Variance Application § 142.307 What terms and...

  19. 40 CFR 142.307 - What terms and conditions must be included in a small system variance?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... improvements to comply with the small system variance technology, secure an alternative source of water, or... included in a small system variance? 142.307 Section 142.307 Protection of Environment ENVIRONMENTAL... IMPLEMENTATION Variances for Small System Review of Small System Variance Application § 142.307 What terms and...

  20. 40 CFR 142.307 - What terms and conditions must be included in a small system variance?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... improvements to comply with the small system variance technology, secure an alternative source of water, or... included in a small system variance? 142.307 Section 142.307 Protection of Environment ENVIRONMENTAL... IMPLEMENTATION Variances for Small System Review of Small System Variance Application § 142.307 What terms and...

  1. 40 CFR 142.307 - What terms and conditions must be included in a small system variance?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... improvements to comply with the small system variance technology, secure an alternative source of water, or... included in a small system variance? 142.307 Section 142.307 Protection of Environment ENVIRONMENTAL... IMPLEMENTATION Variances for Small System Review of Small System Variance Application § 142.307 What terms and...

  2. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regards to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  3. Wave Riemann description of friction terms in unsteady shallow flows: Application to water and mud/debris floods

    NASA Astrophysics Data System (ADS)

    Murillo, J.; García-Navarro, P.

    2012-02-01

    In this work, the source term discretization in hyperbolic conservation laws with source terms is considered using an approximate augmented Riemann solver. The technique is applied to the shallow water equations with bed slope and friction terms, with the focus on the friction discretization. The augmented Roe approximate Riemann solver provides a family of weak solutions for the shallow water equations that are the basis of the upwind treatment of the source term. This has proved successful in explaining and avoiding the appearance of instabilities and negative values of the thickness of the water layer in cases of variable bottom topography. Here, this strategy is extended to capture the peculiarities that may arise when defining more ambitious scenarios that may include relevant stresses in cases of mud/debris flow. The conclusions of this analysis lead to the definition of an accurate and robust first order finite volume scheme, able to handle correctly transient problems considering frictional stresses in both clean water and debris flow, including in this last case a correct modelling of stopping conditions.

  4. On the application of subcell resolution to conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Chang, Shih-Hung

    1989-01-01

    LeVeque and Yee recently investigated a one-dimensional scalar conservation law with stiff source terms modeling reacting flow problems and discovered that for the very stiff case most of the current finite difference methods developed for non-reacting flows would produce wrong solutions when there is a propagating discontinuity. A numerical scheme, essentially nonoscillatory/subcell resolution - characteristic direction (ENO/SRCD), is proposed for solving conservation laws with stiff source terms. This scheme is a modification of Harten's ENO scheme with subcell resolution, ENO/SR. The locations of the discontinuities and the characteristic directions are essential in the design. Strang's time-splitting method is used and time evolutions are done by advancing along the characteristics. Numerical experiments using this scheme show excellent results on the model problem of LeVeque and Yee. Comparisons of the results of ENO, ENO/SR, and ENO/SRCD are also presented.
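
    For context, a plain Strang-splitting treatment of an advection equation with a stiff source of the LeVeque-Yee type is sketched below. This is the baseline the abstract starts from, not the ENO/SRCD scheme itself; for very stiff coefficients this naive version exhibits exactly the wrong propagation speed that the proposed scheme is designed to cure.

```python
import numpy as np

# Strang-splitting skeleton for an advection equation with a stiff source of
# the LeVeque-Yee type:  q_t + q_x = -mu * q * (q - 1) * (q - 0.5).
# The paper's ENO/SRCD ingredients (subcell resolution, advancing along
# characteristics) are not reproduced here.
mu = 100.0
nx = 200
dx = 1.0 / nx
dt = 0.5 * dx
x = (np.arange(nx) + 0.5) * dx
q = np.where(x < 0.3, 1.0, 0.0)        # step initial data

def react(q, t, nsub=50):
    """Integrate dq/dt = -mu*q*(q-1)*(q-0.5) over time t with explicit substeps."""
    h = t / nsub
    for _ in range(nsub):
        q = q - h * mu * q * (q - 1.0) * (q - 0.5)
    return q

def advect(q, dt):
    """First-order upwind step for q_t + q_x = 0 (unit speed, inflow q = 1)."""
    qm = np.concatenate(([1.0], q[:-1]))
    return q - dt / dx * (q - qm)

for _ in range(200):
    q = react(q, 0.5 * dt)
    q = advect(q, dt)
    q = react(q, 0.5 * dt)
print(x[np.argmin(np.abs(q - 0.5))])   # approximate discontinuity location
```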

  5. Liquid-metal-ion source development for space propulsion at ARC.

    PubMed

    Tajmar, M; Scharlemann, C; Genovese, A; Buldrini, N; Steiger, W; Vasiljevich, I

    2009-04-01

    The Austrian Research Centers have a long history of developing indium Liquid-Metal-Ion Sources (LMIS) for space applications, including spacecraft charging compensators, SIMS and propulsion. Specifically, the application as a thruster requires long-term operation as well as high-current operation, which is very challenging. Recently, we demonstrated the operation of a cluster of single LMIS at an average current of 100 µA each for more than 4800 h and developed models for tip erosion and droplet deposition suggesting that such an LMIS can operate for 20,000 h or more. In order to drastically increase the current, a porous multi-tip source that allows operation up to several mA was developed. Our paper will highlight the problem areas and challenges from our LMIS development focusing on space propulsion applications.

  6. An imaging-based photometric and colorimetric measurement method for characterizing OLED panels for lighting applications

    NASA Astrophysics Data System (ADS)

    Zhu, Yiting; Narendran, Nadarajah; Tan, Jianchuan; Mou, Xi

    2014-09-01

    The organic light-emitting diode (OLED) has demonstrated its novelty in displays and certain lighting applications. Similar to white light-emitting diode (LED) technology, it also holds the promise of saving energy. Even though the luminous efficacy values of OLED products have been steadily growing, their longevity is still not well understood. Furthermore, there is currently no industry standard for photometric and colorimetric testing, short and long term, of OLEDs. Each OLED manufacturer tests its OLED panels under different electrical and thermal conditions using different measurement methods. In this study, an imaging-based photometric and colorimetric measurement method for OLED panels was investigated. Unlike an LED, which can be considered a point source, the OLED is a large-area source. Therefore, for an area source to satisfy lighting application needs, it is important that it maintain uniform light level and color properties across the emitting surface of the panel over a long period. This study intended to develop a measurement procedure that can be used to test long-term photometric and colorimetric properties of OLED panels. The objective was to better understand how test parameters such as drive current or luminance and temperature affect the degradation rate. In addition, this study investigated whether data interpolation could allow for determination of degradation and lifetime, L70, at application conditions based on the degradation rates measured at different operating conditions.
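
    One plausible way to do the interpolation mentioned at the end, not necessarily the authors' procedure, is to fit an exponential lumen-maintenance model L(t) = B*exp(-alpha*t) to the measured relative luminance and solve for the time at which the fit reaches 70%, as in the sketch below with synthetic data.

```python
import numpy as np

# Exponential lumen-maintenance fit and L70 extrapolation.  This mirrors the
# usual LED practice (a TM-21-style fit L(t) = B*exp(-alpha*t)); it is shown
# as one plausible way to interpolate OLED degradation data, not as the
# authors' exact procedure.  The hours/luminance values are synthetic.
hours = np.array([0, 500, 1000, 2000, 3000, 4000], dtype=float)
lum = np.array([1.00, 0.985, 0.972, 0.945, 0.918, 0.893])   # relative luminance

# Linear least squares on ln(L) = ln(B) - alpha * t
coef = np.polyfit(hours, np.log(lum), 1)
alpha, B = -coef[0], np.exp(coef[1])

t_L70 = np.log(B / 0.70) / alpha     # time at which the fit reaches 70%
print(f"alpha={alpha:.3e} 1/h, B={B:.4f}, estimated L70 ~ {t_L70:.0f} h")
```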

  7. PaaS for web applications with OpenShift Origin

    NASA Astrophysics Data System (ADS)

    Lossent, A.; Rodriguez Peon, A.; Wagner, A.

    2017-10-01

    The CERN Web Frameworks team has deployed OpenShift Origin to facilitate deployment of web applications and to improve efficiency in terms of computing resource usage. OpenShift leverages Docker containers and Kubernetes orchestration to provide a Platform-as-a-Service solution oriented toward web applications. We will review use cases and how OpenShift was integrated with other services such as source control, web site management and authentication services.

  8. Maximizing the spatial representativeness of NO2 monitoring data using a combination of local wind-based sectoral division and seasonal and diurnal correction factors.

    PubMed

    Donnelly, Aoife; Naughton, Owen; Misstear, Bruce; Broderick, Brian

    2016-10-14

    This article describes a new methodology for increasing the spatial representativeness of individual monitoring sites. Air pollution levels at a given point are influenced by emission sources in the immediate vicinity. Since emission sources are rarely uniformly distributed around a site, concentration levels will inevitably be most affected by the sources in the prevailing upwind direction. The methodology provides a means of capturing this effect and providing additional information regarding source/pollution relationships. The methodology allows for the division of the air quality data from a given monitoring site into a number of sectors or wedges based on wind direction and estimation of annual mean values for each sector, thus optimising the information that can be obtained from a single monitoring station. The method corrects for short-term data, diurnal and seasonal variations in concentrations (which can produce uneven weighting of data within each sector) and uneven frequency of wind directions. Significant improvements in correlations between the air quality data and the spatial air quality indicators were obtained after application of the correction factors. This suggests the application of these techniques would be of significant benefit in land-use regression modelling studies. Furthermore, the method was found to be very useful for estimating long-term mean values and wind direction sector values using only short-term monitoring data. The methods presented in this article can result in cost savings through minimising the number of monitoring sites required for air quality studies while also capturing a greater degree of variability in spatial characteristics. In this way, more reliable, but also more expensive monitoring techniques can be used in preference to a higher number of low-cost but less reliable techniques. The methods described in this article have applications in local air quality management, source receptor analysis, land-use regression mapping and modelling and population exposure studies.
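
    A compact sketch of the sector-division idea follows: normalize hourly concentrations by a diurnal correction factor, assign each hour to a wind-direction sector, and average within sectors so the sector means are not biased by uneven sampling of wind directions. The eight 45-degree sectors, the column names, and the synthetic data are assumptions for illustration only.

```python
import numpy as np
import pandas as pd

# Sketch of the wind-sector averaging idea: hourly NO2 values are first
# normalized by a diurnal correction factor (mean of that hour-of-day divided
# by the overall mean), then averaged within wind-direction sectors, so sector
# means are not biased by when the wind happened to blow from each sector.
rng = np.random.default_rng(3)
n = 24 * 365
df = pd.DataFrame({
    "hour": np.tile(np.arange(24), n // 24),
    "wd": rng.uniform(0, 360, n),                 # wind direction, degrees
    "no2": rng.gamma(shape=4.0, scale=5.0, size=n),
})
df["no2"] += 10 * np.sin(np.pi * df["hour"] / 24) # synthetic diurnal cycle

# diurnal correction: divide each value by its hour-of-day factor
hour_factor = df.groupby("hour")["no2"].mean() / df["no2"].mean()
df["no2_corr"] = df["no2"] / df["hour"].map(hour_factor)

# assign 45-degree wind sectors and compute corrected sector means
df["sector"] = (df["wd"] // 45).astype(int)
print(df.groupby("sector")["no2_corr"].mean())
```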

  9. Modeling of Radiotherapy Linac Source Terms Using ARCHER Monte Carlo Code: Performance Comparison for GPU and MIC Parallel Computing Devices

    NASA Astrophysics Data System (ADS)

    Lin, Hui; Liu, Tianyu; Su, Lin; Bednarz, Bryan; Caracappa, Peter; Xu, X. George

    2017-09-01

    Monte Carlo (MC) simulation is well recognized as the most accurate method for radiation dose calculations. For radiotherapy applications, accurate modelling of the source term, i.e., the clinical linear accelerator, is critical to the simulation. The purpose of this paper is to perform source modelling and examine the accuracy and performance of the models on Intel Many Integrated Core coprocessors (aka Xeon Phi) and Nvidia GPUs using ARCHER, and to explore potential optimization methods. Phase space-based source modelling has been implemented. Good agreement was found in a tomotherapy prostate patient case and a TrueBeam breast case. In terms of performance, the whole simulation for the prostate plan and the breast plan took about 173 s and 73 s, respectively, with 1% statistical error.

  10. Generation of optimal artificial neural networks using a pattern search algorithm: application to approximation of chemical systems.

    PubMed

    Ihme, Matthias; Marsden, Alison L; Pitsch, Heinz

    2008-02-01

    A pattern search optimization method is applied to the generation of optimal artificial neural networks (ANNs). Optimization is performed using a mixed-variable extension to the generalized pattern search method. This method offers the advantage that categorical variables, such as neural transfer functions and nodal connectivities, can be used as parameters in optimization. When used together with a surrogate, the resulting algorithm is highly efficient for expensive objective functions. Results demonstrate the effectiveness of this method in optimizing an ANN for the number of neurons, the type of transfer function, and the connectivity among neurons. The optimization method is applied to a chemistry approximation of practical relevance. In this application, temperature and a chemical source term are approximated as functions of two independent parameters using optimal ANNs. Comparison of the performance of optimal ANNs with conventional tabulation methods demonstrates equivalent accuracy with considerable savings in memory storage. The architecture of the optimal ANN for the approximation of the chemical source term consists of a fully connected feedforward network having four nonlinear hidden layers and 117 synaptic weights. An equivalent representation of the chemical source term using tabulation techniques would require a 500 x 500 grid point discretization of the parameter space.
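
    A minimal Python sketch of the memory trade-off noted above: a tabulation of a two-parameter scalar stores one value per grid point, whereas a small fully connected feedforward network stores only its weights. The stand-in source-term function, layer widths and (untrained) weights are illustrative assumptions, not the paper's optimized architecture.

        import numpy as np

        # Illustrative stand-in for a chemical source term S(z, c) of two parameters.
        def source_term(z, c):
            return np.exp(-5.0 * (z - 0.3) ** 2) * c * (1.0 - c)

        # Tabulation: a 500 x 500 grid of precomputed values (250,000 entries).
        z = np.linspace(0.0, 1.0, 500)
        c = np.linspace(0.0, 1.0, 500)
        table = source_term(*np.meshgrid(z, c, indexing="ij"))
        print("table entries:", table.size)

        # A small feedforward ANN with ~100 weights can represent a comparably
        # smooth function; only the forward pass is sketched here, with random
        # (untrained) weights, to show the parameter count.
        sizes = [2, 8, 8, 1]                       # illustrative layer widths
        rng = np.random.default_rng(0)
        weights = [rng.normal(0, 1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
        biases = [np.zeros(n) for n in sizes[1:]]

        def ann(x):
            for W, b in zip(weights[:-1], biases[:-1]):
                x = np.tanh(x @ W + b)             # nonlinear hidden layers
            return x @ weights[-1] + biases[-1]

        n_params = sum(W.size for W in weights) + sum(b.size for b in biases)
        print("ANN parameters:", n_params)
        print(ann(np.array([[0.3, 0.5]])))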

  11. Non-Randomized Studies as a Source of Complementary, Sequential or Replacement Evidence for Randomized Controlled Trials in Systematic Reviews on the Effects of Interventions

    ERIC Educational Resources Information Center

    Schünemann, Holger J.; Tugwell, Peter; Reeves, Barnaby C.; Akl, Elie A.; Santesso, Nancy; Spencer, Frederick A.; Shea, Beverley; Wells, George; Helfand, Mark

    2013-01-01

    The terms applicability, generalizability, external validity and transferability are related, sometimes used interchangeably and have in common that they lack a clear and consistent definition in the classic epidemiological literature. However, all of these terms generally describe one overarching theme: whether or not available research evidence…

  12. Passive Localization of Multiple Sources Using Widely-Spaced Arrays With Application to Marine Mammals

    DTIC Science & Technology

    2008-09-30

    …developing methods to simultaneously track multiple vocalizing marine mammals, we hope to contribute to the fields of marine mammal bioacoustics, ecology, and anthropogenic impact mitigation. (Award N00014-05-1-0074, OA Graduate Traineeship for E-M Nosal.) LONG-TERM GOALS: The long-term goal of our research is to develop algorithms that use widely…

  13. Photovoltaics as a terrestrial energy source. Volume 1: An introduction

    NASA Technical Reports Server (NTRS)

    Smith, J. L.

    1980-01-01

    Photovoltaic (PV) systems were examined for their potential for terrestrial application and future development. Photovoltaic technology, existing and potential photovoltaic applications, and the National Photovoltaics Program are reviewed. The competitive environment for this electrical source, affected by the presence or absence of utility-supplied power, is evaluated in terms of system prices. The roles of technological breakthroughs, directed research and technology development, learning curves, and commercial demonstrations in the National Program are discussed. The potential for photovoltaics to displace oil consumption is examined, as are the potential benefits of employing PV in either central-station or non-utility-owned, small, distributed systems.

  14. Estuarine turbidity, flushing, salinity, and circulation

    NASA Technical Reports Server (NTRS)

    Pritchard, D. W.

    1972-01-01

    The effects of estuarine turbidity, flushing, salinity, and circulation on the ecology of the Chesapeake Bay are discussed. The sources of fresh water, the variations in salinity, and the circulation patterns created by temperature and salinity changes are analyzed. The application of remote sensors for long-term observation of water temperatures is described. The sources of sediment and the biological effects resulting from increased sediments and siltation are identified.

  15. [The ethical reflection approach, a source of wellbeing at work].

    PubMed

    Bréhaux, Karine; Grésyk, Bénédicte

    2014-01-01

    Clinical nursing practice, beyond its application to care procedures, can be expressed in terms of ethical added value in the support of patients. In Reims university hospital, where a clinical ethics and care think-tank was created in June 2010, the ethical reflection approach is encouraged in order to reemphasise the global meaning of care as a source of wellbeing at work.

  16. Antimatter Requirements and Energy Costs for Near-Term Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Schmidt, G. R.; Gerrish, H. P.; Martin, J. J.; Smith, G. A.; Meyer, K. J.

    1999-01-01

    The superior energy density of antimatter annihilation has often been pointed to as the ultimate source of energy for propulsion. However, the limited capacity and very low efficiency of present-day antiproton production methods suggest that antimatter may be too costly to consider for near-term propulsion applications. We address this issue by assessing the antimatter requirements for six different types of propulsion concepts, including two in which antiprotons are used to drive energy release from combined fission/fusion. These requirements are compared against the capacity of both the current antimatter production infrastructure and the improved capabilities that could exist within the early part of next century. Results show that although it may be impractical to consider systems that rely on antimatter as the sole source of propulsive energy, the requirements for propulsion based on antimatter-assisted fission/fusion do fall within projected near-term production capabilities. In fact, a new facility designed solely for antiproton production but based on existing technology could feasibly support interstellar precursor missions and omniplanetary spaceflight with antimatter costs ranging up to $6.4 million per mission.

  17. Characterization and in vitro properties of potentially probiotic Bifidobacterium strains isolated from breast-milk.

    PubMed

    Arboleya, Silvia; Ruas-Madiedo, Patricia; Margolles, Abelardo; Solís, Gonzalo; Salminen, Seppo; de Los Reyes-Gavilán, Clara G; Gueimonde, Miguel

    2011-09-01

    Most of the current commercial probiotic strains have not been selected for specific applications, but rather on the basis of their technological potential for use in diverse applications. Therefore, by selecting them from appropriate sources, depending on the target population, it is likely that better performing strains may be identified. Few strains have been specifically selected for human neonates, where the applications of probiotics may have a great positive impact. Breast-milk constitutes an interesting source of potentially probiotic bifidobacteria for inclusion in infant formulas and foods targeted to both pre-term and full-term infants. In this study six Bifidobacterium strains isolated from breast-milk were phenotypically and genotypically characterised according to international guidelines for probiotics. In addition, different in vitro tests were used to assess the safety and probiotic potential of the strains. Although clinical data would be needed before drawing any conclusion on the probiotic properties of the strains, our results indicate that some of them may have probiotic potential for their inclusion in products targeting infants. Copyright © 2010 Elsevier B.V. All rights reserved.

  18. Genes2WordCloud: a quick way to identify biological themes from gene lists and free text.

    PubMed

    Baroukh, Caroline; Jenkins, Sherry L; Dannenfelser, Ruth; Ma'ayan, Avi

    2011-10-13

    Word-clouds recently emerged on the web as a solution for quickly summarizing text by maximizing the display of most relevant terms about a specific topic in the minimum amount of space. As biologists are faced with the daunting amount of new research data commonly presented in textual formats, word-clouds can be used to summarize and represent biological and/or biomedical content for various applications. Genes2WordCloud is a web application that enables users to quickly identify biological themes from gene lists and research relevant text by constructing and displaying word-clouds. It provides users with several different options and ideas for the sources that can be used to generate a word-cloud. Different options for rendering and coloring the word-clouds give users the flexibility to quickly generate customized word-clouds of their choice. Genes2WordCloud is a word-cloud generator and a word-cloud viewer that is based on WordCram implemented using Java, Processing, AJAX, mySQL, and PHP. Text is fetched from several sources and then processed to extract the most relevant terms with their computed weights based on word frequencies. Genes2WordCloud is freely available for use online; it is open source software and is available for installation on any web-site along with supporting documentation at http://www.maayanlab.net/G2W. Genes2WordCloud provides a useful way to summarize and visualize large amounts of textual biological data or to find biological themes from several different sources. The open source availability of the software enables users to implement customized word-clouds on their own web-sites and desktop applications.
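
    A minimal Python sketch of the term-weighting step described above: extract the most relevant terms and compute weights from word frequencies, which a renderer would then map to font sizes. The stop-word list, regular expression and scaling are illustrative assumptions; the actual Genes2WordCloud implementation is in Java/Processing/PHP.

        import re
        from collections import Counter

        STOP_WORDS = {"the", "of", "and", "in", "to", "a", "is", "for", "with", "are"}

        def term_weights(text, top_n=50):
            """Return the top_n terms and their frequency-based weights."""
            words = re.findall(r"[a-z]{3,}", text.lower())
            counts = Counter(w for w in words if w not in STOP_WORDS)
            most_common = counts.most_common(top_n)
            if not most_common:
                return {}
            max_count = most_common[0][1]
            # Scale weights to (0, 1]; a renderer maps these to font sizes.
            return {term: count / max_count for term, count in most_common}

        abstract = ("Word-clouds summarize text by maximizing the display of the "
                    "most relevant terms; term weights are computed from word "
                    "frequencies in the input text.")
        print(term_weights(abstract, top_n=10))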

  19. Genes2WordCloud: a quick way to identify biological themes from gene lists and free text

    PubMed Central

    2011-01-01

    Background Word-clouds recently emerged on the web as a solution for quickly summarizing text by maximizing the display of most relevant terms about a specific topic in the minimum amount of space. As biologists are faced with the daunting amount of new research data commonly presented in textual formats, word-clouds can be used to summarize and represent biological and/or biomedical content for various applications. Results Genes2WordCloud is a web application that enables users to quickly identify biological themes from gene lists and research relevant text by constructing and displaying word-clouds. It provides users with several different options and ideas for the sources that can be used to generate a word-cloud. Different options for rendering and coloring the word-clouds give users the flexibility to quickly generate customized word-clouds of their choice. Methods Genes2WordCloud is a word-cloud generator and a word-cloud viewer that is based on WordCram implemented using Java, Processing, AJAX, mySQL, and PHP. Text is fetched from several sources and then processed to extract the most relevant terms with their computed weights based on word frequencies. Genes2WordCloud is freely available for use online; it is open source software and is available for installation on any web-site along with supporting documentation at http://www.maayanlab.net/G2W. Conclusions Genes2WordCloud provides a useful way to summarize and visualize large amounts of textual biological data or to find biological themes from several different sources. The open source availability of the software enables users to implement customized word-clouds on their own web-sites and desktop applications. PMID:21995939

  20. Accelerated Stress Testing of Multi-Source LED Products: Horticulture Lamps and Tunable-White Modules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Lynn; Rountree, Kelley; Mills, Karmann

    This report discusses the use of accelerated stress testing (AST) to provide insights into the long-term behavior of commercial products utilizing different types of mid-power LEDs (MP-LEDs) integrated into the same LED module. Test results are presented from two commercial lamps intended for use in horticulture applications and one tunable-white LED module intended for use in educational and office lighting applications. Each of these products is designed to provide a custom spectrum for their targeted applications and each achieves this goal in different ways. Consequently, a comparison of the long-term stability of these devices will provide insights regarding approaches that could be used to possibly lengthen the lifetime of SSL products.

  1. Radiation from Directional Seismic Sources in Laterally Stratified Media with Application to Arctic Ice Cracking Noise

    DTIC Science & Technology

    1989-05-22

    …been pursued along two different lines [1]: first, in terms of body forces; second, in terms of discontinuities in displacement or strain across a…

  2. ESPC Coupled Global Prediction System

    DTIC Science & Technology

    2014-09-30

    …active, and cloud-nucleating aerosols into NAVGEM for use in long-term simulations and forecasts, for ESPC applications and for use in the full coupled system. We are relying on approaches, findings… For sea salt we follow NAAPS and use a source that depends on ocean surface winds and relative humidity. In lieu of the relevant…

  3. Derivation and application of the reciprocity relations for radiative transfer with internal illumination

    NASA Technical Reports Server (NTRS)

    Cogley, A. C.

    1975-01-01

    A Green's function formulation is used to derive basic reciprocity relations for planar radiative transfer in a general medium with internal illumination. Reciprocity (or functional symmetry) allows an explicit and generalized development of the equivalence between source and probability functions. Assuming similar symmetry in three-dimensional space, a general relationship is derived between planar-source intensity and point-source total directional energy. These quantities are expressed in terms of standard (universal) functions associated with the planar medium, while all results are derived from the differential equation of radiative transfer.

  4. Time-domain diffuse optics: towards next generation devices

    NASA Astrophysics Data System (ADS)

    Contini, Davide; Dalla Mora, Alberto; Arridge, Simon; Martelli, Fabrizio; Tosi, Alberto; Boso, Gianluca; Farina, Andrea; Durduran, Turgut; Martinenghi, Edoardo; Torricelli, Alessandro; Pifferi, Antonio

    2015-07-01

    Diffuse optics is a powerful tool for clinical applications ranging from oncology to neurology, but also for molecular imaging and quality assessment of food, wood and pharmaceuticals. We show that, ideally, time-domain diffuse optics can give higher contrast and a higher penetration depth with respect to standard technology. In order to fully exploit the advantages of a time-domain system, a distribution of sources and detectors with fast gating capabilities covering the whole sample surface is needed. Here, we present the building block for such a system. This basic component is made of a miniaturised source-detector pair embedded into the probe, based on pulsed Vertical-Cavity Surface-Emitting Lasers (VCSELs) as sources and Single-Photon Avalanche Diodes (SPADs) or Silicon Photomultipliers (SiPMs) as detectors. The possibility to miniaturise and dramatically increase the number of source-detector pairs opens the way to an advancement of diffuse optics in terms of improved performance and the exploration of new applications. Furthermore, the availability of compact devices with reduced size and cost can boost the application of this technique.

  5. Real Otto and Diesel Engine Cycles.

    ERIC Educational Resources Information Center

    Giedd, Ronald

    1983-01-01

    A thermodynamic analysis of the properties of Otto/Diesel engines during the time they operate with open chambers illustrates the applicability of thermodynamics to real systems, demonstrates how delivered power is controlled, and explains the source of air pollution in terms of thermodynamic laws. (Author/JN)

  6. Next Generation Air Measurements for Fugitive, Area Source, and Fence Line Applications

    EPA Science Inventory

    Next generation air measurements (NGAM) is an EPA term for the advancing field of air pollutant sensor technologies, data integration concepts, and geospatial modeling strategies. Ranging from personal sensors to satellite remote sensing, NGAM systems may provide revolutionary n...

  7. Generalized fractional supertrace identity for Hamiltonian structure of NLS-MKdV hierarchy with self-consistent sources

    NASA Astrophysics Data System (ADS)

    Dong, Huan He; Guo, Bao Yong; Yin, Bao Shu

    2016-06-01

    In this paper, based on the modified Riemann-Liouville fractional derivative and the Tu scheme, the fractional super NLS-MKdV hierarchy is derived; in particular, the self-consistent source terms are considered. Meanwhile, the generalized fractional supertrace identity is proposed, which is a beneficial supplement to the existing literature on integrable systems. As an application, the super Hamiltonian structure of the fractional super NLS-MKdV hierarchy is obtained.

  8. Unusual rainbows as auroral candidates: Another point of view

    NASA Astrophysics Data System (ADS)

    Carrasco, Víctor M. S.; Trigo, Ricardo M.; Vaquero, José M.

    2017-04-01

    Several auroral events that occurred in the past have not been cataloged as such due to the fact that they were described in the historical sources with different terminologies. Hayakawa et al. (2016, PASJ, 68, 33) have reviewed historical Oriental chronicles and proposed the terms “unusual rainbow” and “white rainbow” as candidates for auroras. In this work, we present three events that took place in the 18th century in two different settings (the Iberian Peninsula and Brazil) that were originally described with similar definitions or wording to that used by the Oriental chronicles, despite the inherent differences in terms associated with Oriental and Latin languages. We show that these terms are indeed applicable to the three case studies from Europe and South America. Thus, the auroral catalogs available can be extended to Occidental sources using this new terminology.

  9. Green materials for sustainable development

    NASA Astrophysics Data System (ADS)

    Purwasasmita, B. S.

    2017-03-01

    Sustainable development is an integrated, multidisciplinary concept combining ecological, social and economic aspects to construct a liveable human living system. Sustainable development can be supported through the development of green materials. Green materials offer unique characteristics and properties, including abundance in nature, low toxicity, economic affordability and versatility in terms of physical and chemical properties. Green materials can be applied in numerous fields of science and technology, including energy; building, construction and infrastructure; materials science and engineering; and pollution management and technology. For instance, green materials can be developed as a source for energy production: biomass-based sources can be used for biodiesel and bioethanol production. Biomass-based materials can also be transformed into advanced functionalised materials for bio-applications, such as the transformation of chitin into chitosan, which is further used in biomedicine, biomaterials and tissue engineering. Recently, cellulose-based and lignocellulose-based materials have attracted interest as sources for developing functional materials for biomaterials, reinforcing materials and nanotechnology. Furthermore, the development of pigment materials from green sources has gained interest due to their unique properties. Finally, Indonesia, as a large country with rich biodiversity, can advance the development of green materials to strengthen national competitiveness and develop materials technology for the future.

  10. 48 CFR 252.251-7000 - Ordering from Government supply sources.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Enterprise Software Agreements, the Contractor shall follow the terms of the applicable schedule or agreement... Enterprise Software Agreement contractor). (2) The following statement: Any price reductions negotiated as part of an Enterprise Software Agreement issued under a Federal Supply Schedule contract shall control...

  11. Antimatter Production for Near-Term Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Schmidt, G. R.; Gerrish, H. P.; Martin, J. J.; Smith, G. A.; Meyer, K. J.

    1999-01-01

    The superior energy density of antimatter annihilation has often been pointed to as the ultimate source of energy for propulsion. However, the limited capacity and very low efficiency of present-day antiproton production methods suggest that antimatter may be too costly to consider for near-term propulsion applications. We address this issue by assessing the antimatter requirements for six different types of propulsion concepts, including two in which antiprotons are used to drive energy release from combined fission/fusion. These requirements are compared against the capacity of both the current antimatter production infrastructure and the improved capabilities which could exist within the early part of next century. Results show that although it may be impractical to consider systems which rely on antimatter as the sole source of propulsive energy, the requirements for propulsion based on antimatter-assisted fission/fusion do fall within projected near-term production capabilities. In fact, such systems could feasibly support interstellar precursor missions and omniplanetary spaceflight with antimatter costs ranging up to $60 million per mission.

  12. On the application of ENO scheme with subcell resolution to conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Chang, Shih-Hung

    1991-01-01

    Two approaches are used to extend the essentially non-oscillatory (ENO) schemes to treat conservation laws with stiff source terms. One approach is the application of the Strang time-splitting method; here the basic ENO scheme and the Harten modification using subcell resolution (SR), the ENO/SR scheme, are extended in this way. The other approach is a direct method and a modification of the ENO/SR. Here the technique of ENO reconstruction with subcell resolution is used to locate the discontinuity within a cell, and the time evolution is then accomplished by solving the differential equation along characteristics locally and advancing in the characteristic direction. This scheme is denoted ENO/SRCD (subcell resolution - characteristic direction). All the schemes are tested on the equation of LeVeque and Yee (NASA-TM-100075, 1988) modeling reacting flow problems. Numerical results show that these schemes handle this intriguing model problem very well, especially ENO/SRCD, which produces perfect resolution at the discontinuity.
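
    A minimal Python sketch of the time-splitting idea on a LeVeque-Yee-type model problem, u_t + u_x = psi(u) with psi(u) = -mu*u*(u-1)*(u-0.5): advection and the stiff source are advanced in separate substeps (Strang splitting). First-order upwind advection stands in for the ENO reconstruction, and the parameter values are illustrative assumptions; this is not the paper's ENO/SR or ENO/SRCD scheme.

        import numpy as np

        mu = 1000.0                      # stiffness parameter (illustrative)
        nx, cfl = 200, 0.5
        x = np.linspace(0.0, 1.0, nx, endpoint=False)
        dx = x[1] - x[0]
        dt = cfl * dx
        u = np.where(x < 0.3, 1.0, 0.0)  # step initial data

        def advect(u, dt):
            return u - dt / dx * (u - np.roll(u, 1))   # upwind, unit wave speed

        def react(u, dt, substeps=50):
            for _ in range(substeps):                  # resolve the stiff source
                u = u + (dt / substeps) * (-mu * u * (u - 1.0) * (u - 0.5))
            return u

        for _ in range(100):             # Strang splitting: source, advection, source
            u = react(u, 0.5 * dt)
            u = advect(u, dt)
            u = react(u, 0.5 * dt)

        print(u.min(), u.max())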

  13. Exploiting semantic linkages among multiple sources for semantic information retrieval

    NASA Astrophysics Data System (ADS)

    Li, JianQiang; Yang, Ji-Jiang; Liu, Chunchen; Zhao, Yu; Liu, Bo; Shi, Yuliang

    2014-07-01

    The vision of the Semantic Web is to build a global Web of machine-readable data to be consumed by intelligent applications. As the first step to make this vision come true, the initiative of linked open data has fostered many novel applications aimed at improving data accessibility in the public Web. By comparison, the enterprise environment is very different from the public Web in that most potentially usable business information originates in an unstructured form (typically in free text), which poses a challenge for the adoption of semantic technologies in the enterprise environment. Considering that the business information in a company is highly specific and centred around a set of commonly used concepts, this paper describes a pilot study to migrate the concept of linked data into the development of a domain-specific application, i.e. a vehicle repair support system. The set of commonly used concepts, including car part names and repair phenomenon terms, is employed to build linkages between data and documents distributed among different sources, leading to the fusion of documents and data across source boundaries. Then, we describe approaches of semantic information retrieval that consume these linkages for value creation for companies. The experiments on two real-world data sets show that the proposed approaches outperform the best baseline by 6.3-10.8% and 6.4-11.1% in terms of top-5 and top-10 precision, respectively. We believe that our pilot study can serve as an important reference for the development of similar semantic applications in an enterprise environment.
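
    A minimal Python sketch of the linking idea: unstructured repair documents and structured data records are connected whenever they share a term from a domain-specific concept vocabulary. The concept list, sample data and matching rule are illustrative assumptions, not the paper's actual system.

        # Part / phenomenon terms playing the role of the shared concept set.
        CONCEPTS = {"brake pad", "engine stall", "oil leak"}

        documents = {
            "doc1": "Customer reports engine stall at idle; suspected oil leak near gasket.",
            "doc2": "Replaced worn brake pad on front axle.",
        }
        records = [
            {"id": 101, "part": "brake pad", "action": "replace"},
            {"id": 102, "part": "oil leak", "action": "inspect"},
        ]

        def concepts_in(text):
            text = text.lower()
            return {c for c in CONCEPTS if c in text}

        # Build links: a document and a record are linked when they share a concept.
        links = [(doc_id, rec["id"])
                 for doc_id, text in documents.items()
                 for rec in records
                 if rec["part"] in concepts_in(text)]
        print(links)   # e.g. [('doc1', 102), ('doc2', 101)]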

  14. Forcing scheme analysis for the axisymmetric lattice Boltzmann method under incompressible limit.

    PubMed

    Zhang, Liangqi; Yang, Shiliang; Zeng, Zhong; Chen, Jie; Yin, Linmao; Chew, Jia Wei

    2017-04-01

    Because the standard lattice Boltzmann (LB) method is proposed for Cartesian Navier-Stokes (NS) equations, additional source terms are necessary in the axisymmetric LB method for representing the axisymmetric effects. Therefore, the accuracy and applicability of the axisymmetric LB models depend on the forcing schemes adopted for discretization of the source terms. In this study, three forcing schemes, namely, the trapezium rule based scheme, the direct forcing scheme, and the semi-implicit centered scheme, are analyzed theoretically by investigating their derived macroscopic equations in the diffusive scale. Particularly, the finite difference interpretation of the standard LB method is extended to the LB equations with source terms, and then the accuracy of different forcing schemes is evaluated for the axisymmetric LB method. Theoretical analysis indicates that the discrete lattice effects arising from the direct forcing scheme are part of the truncation error terms and thus would not affect the overall accuracy of the standard LB method with general force term (i.e., only the source terms in the momentum equation are considered), but lead to incorrect macroscopic equations for the axisymmetric LB models. On the other hand, the trapezium rule based scheme and the semi-implicit centered scheme both have the advantage of avoiding the discrete lattice effects and recovering the correct macroscopic equations. Numerical tests applied for validating the theoretical analysis show that both the numerical stability and the accuracy of the axisymmetric LB simulations are affected by the direct forcing scheme, which indicate that forcing schemes free of the discrete lattice effects are necessary for the axisymmetric LB method.
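
    As a LaTeX sketch of the distinction discussed above, a generic discrete LB equation with a source term can be written (textbook notation, not necessarily the paper's) as

        f_i(\mathbf{x}+\mathbf{c}_i\,\delta t,\; t+\delta t) - f_i(\mathbf{x},t)
          = -\frac{1}{\tau}\left[ f_i(\mathbf{x},t) - f_i^{\mathrm{eq}}(\mathbf{x},t) \right]
            + \delta t\, \hat{S}_i ,

    where the forcing schemes differ in how the discrete forcing \hat{S}_i is built from the source S_i, e.g.

        \hat{S}_i^{\text{direct}} = S_i(\mathbf{x},t), \qquad
        \hat{S}_i^{\text{trapezium}} = \tfrac{1}{2}\left[ S_i(\mathbf{x},t)
            + S_i(\mathbf{x}+\mathbf{c}_i\,\delta t,\; t+\delta t) \right].

    The trapezium-rule (and semi-implicit centered) treatments average the source along the characteristic, which is what removes the discrete lattice effects the direct scheme leaves behind.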

  15. Photovoltaic village power application: assessment of the near-term market

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenblum, L.; Bifano, W.J.; Poley, W.A.

    1978-01-01

    A preliminary assessment of the near-term market for photovoltaic village power applications is presented. One of the objectives of the Department of Energy's (DOE) National Photovoltaic Program is to stimulate the demand for photovoltaic power systems so that appropriate markets will be developed in the near term to support the increasing photovoltaic production capacity also being developed by DOE. The village power application represents such a potential market for photovoltaics. The price of energy for photovoltaic systems is compared to that of utility line extensions and diesel generators. The potential "domestic" demand (including the 50 states of the union plus the areas under legal control of the U.S. government) is defined in both the government and commercial sectors. The foreign demand and sources of funding for village power systems in the developing countries are also discussed briefly. It is concluded that a near-term domestic market of at least 12 MW (peak) and a foreign market of about 10 GW (peak) exist and that significant market penetration should be possible beginning in the 1981-82 period.

  16. Single colloidal quantum dots as sources of single photons for quantum cryptography

    NASA Astrophysics Data System (ADS)

    Pisanello, Ferruccio; Qualtieri, Antonio; Leménager, Godefroy; Martiradonna, Luigi; Stomeo, Tiziana; Cingolani, Roberto; Bramati, Alberto; De Vittorio, Massimo

    2011-02-01

    Colloidal nanocrystals, i.e. quantum dots synthesized through wet-chemistry approaches, are promising nanoparticles for photonic applications and, remarkably, their quantum nature makes them very promising for single-photon emission at room temperature. In this work we describe two approaches to engineer the emission properties of these nanoemitters in terms of radiative lifetime and photon polarization, drawing a viable strategy for their exploitation as room-temperature single-photon sources for quantum information and quantum telecommunications.

  17. REVIEW OF VOLATILE ORGANIC COMPOUND SOURCE APPORTIONMENT BY CHEMICAL MASS BALANCE. (R826237)

    EPA Science Inventory

    The chemical mass balance (CMB) receptor model has apportioned volatile organic compounds (VOCs) in more than 20 urban areas, mostly in the United States. These applications differ in terms of the total fraction apportioned, the calculation method, the chemical compounds used ...

  18. A Comprehensive Probabilistic Tsunami Hazard Assessment: Multiple Sources and Short-Term Interactions

    NASA Astrophysics Data System (ADS)

    Anita, G.; Selva, J.; Laura, S.

    2011-12-01

    We develop a comprehensive and total probabilistic tsunami hazard assessment (TotPTHA), in which many different possible source types contribute to the definition of the total tsunami hazard at given target sites. In a multi-hazard and multi-risk perspective, such an innovative approach allows, in principle, all possible tsunamigenic sources to be considered, from seismic events to slides, asteroids, volcanic eruptions, etc. In this respect, we also formally introduce and discuss the treatment of interaction/cascade effects in the TotPTHA analysis. We demonstrate how external triggering events may induce significant temporary variations in the tsunami hazard. Because of this, such effects should always be considered, at least in short-term applications, to obtain unbiased analyses. Finally, we prove the feasibility of the TotPTHA and of the treatment of interaction/cascade effects by applying this methodology to an ideal region with realistic characteristics (Neverland).

  19. Mass transfer apparatus and method for separation of gases

    DOEpatents

    Blount, Gerald C.

    2015-10-13

    A process and apparatus for separating components of a source gas is provided in which more soluble components of the source gas are dissolved in an aqueous solvent at high pressure. The system can utilize hydrostatic pressure to increase solubility of the components of the source gas. The apparatus includes gas recycle throughout multiple mass transfer stages to improve mass transfer of the targeted components from the liquid to gas phase. Separated components can be recovered for use in a value added application or can be processed for long-term storage, for instance in an underwater reservoir.

  20. Mass transfer apparatus and method for separation of gases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blount, Gerald C.; Gorensek, Maximilian Boris; Hamm, Luther L.

    A process and apparatus for separating components of a source gas is provided in which more soluble components of the source gas are dissolved in an aqueous solvent at high pressure. The system can utilize hydrostatic pressure to increase solubility of the components of the source gas. The apparatus includes gas recycle throughout multiple mass transfer stages to improve mass transfer of the targeted components from the liquid to gas phase. Separated components can be recovered for use in a value added application or can be processed for long-term storage, for instance in an underwater reservoir.

  1. Proceedings of the international meeting on thermal nuclear reactor safety. Vol. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Separate abstracts are included for each of the papers presented concerning current issues in nuclear power plant safety; national programs in nuclear power plant safety; radiological source terms; probabilistic risk assessment methods and techniques; non LOCA and small-break-LOCA transients; safety goals; pressurized thermal shocks; applications of reliability and risk methods to probabilistic risk assessment; human factors and man-machine interface; and data bases and special applications.

  2. Emergency Preparedness technology support to the Health and Safety Executive (HSE), Nuclear Installations Inspectorate (NII) of the United Kingdom. Appendix A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Kula, K.R.

    1994-03-01

    The Nuclear Installations Inspectorate (NII) of the United Kingdom (UK) suggested the use of an accident progression logic model method developed by Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) for K Reactor to predict the magnitude and timing of radioactivity releases (the source term) based on an advanced logic model methodology. Predicted releases are output from the personal computer-based model in a level-of-confidence format. Additional technical discussions eventually led to a request from the NII to develop a proposal for assembling a similar technology to predict source terms for the UK's advanced gas-cooled reactor (AGR) type. To respond to this request, WSRC is submitting a proposal to provide contractual assistance as specified in the Scope of Work. The work will produce, document, and transfer technology associated with a Decision-Oriented Source Term Estimator for Emergency Preparedness (DOSE-EP) for the NII to apply to AGRs in the United Kingdom. This document, Appendix A, is a part of this proposal.

  3. Can fungi compete with marine sources for chitosan production?

    PubMed

    Ghormade, V; Pathan, E K; Deshpande, M V

    2017-11-01

    Chitosan, a β-1,4-linked glucosamine polymer, is formed by deacetylation of chitin. It has a wide range of applications, from agriculture to human health care products. Chitosan is commercially produced from shellfish, shrimp waste, and crab and lobster processing, using strong alkalis at high temperatures for long time periods. The production of chitin and chitosan from fungal sources has gained increased attention in recent years due to potential advantages over the current marine source in terms of homogeneous polymer length, high degree of deacetylation and solubility. Zygomycetous fungi such as Absidia coerulea, Benjaminiella poitrasii, Cunninghamella elegans, Gongrenella butleri, Mucor rouxii, Mucor racemosus and Rhizopus oryzae have been studied extensively. Isolation of chitosan has been reported from a few edible basidiomycetous fungi such as Agaricus bisporus, Lentinula edodes and Pleurotus sajor-caju. Other organisms from mycotech industries explored for chitosan production are Aspergillus niger, Penicillium chrysogenum, Saccharomyces cerevisiae and other wine yeasts. A number of aspects, such as value addition to existing applications of fungi, utilization of waste from the agriculture sector, the issues and challenges for fungal chitosan to compete with existing sources, metabolic engineering and novel applications, are discussed to assess the potential of fungal sources for commercial chitosan production. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Miniaturized pulsed laser source for time-domain diffuse optics routes to wearable devices

    NASA Astrophysics Data System (ADS)

    Di Sieno, Laura; Nissinen, Jan; Hallman, Lauri; Martinenghi, Edoardo; Contini, Davide; Pifferi, Antonio; Kostamovaara, Juha; Mora, Alberto Dalla

    2017-08-01

    We validate a miniaturized pulsed laser source for use in time-domain (TD) diffuse optics, following rigorous and shared protocols for performance assessment of this class of devices. This compact source (12×6 mm2) has been previously developed for range finding applications and is able to provide short, high energy (˜100 ps, ˜0.5 nJ) optical pulses at up to 1 MHz repetition rate. Here, we start with a basic level laser characterization with an analysis of suitability of this laser for the diffuse optics application. Then, we present a TD optical system using this source and its performances in both recovering optical properties of tissue-mimicking homogeneous phantoms and in detecting localized absorption perturbations. Finally, as a proof of concept of in vivo application, we demonstrate that the system is able to detect hemodynamic changes occurring in the arm of healthy volunteers during a venous occlusion. Squeezing the laser source in a small footprint removes a key technological bottleneck that has hampered so far the realization of a miniaturized TD diffuse optics system, able to compete with already assessed continuous-wave devices in terms of size and cost, but with wider performance potentialities, as demonstrated by research over the last two decades.

  5. Atomic processes and equation of state of high Z plasmas for EUV sources and their effects on the spatial and temporal evolution of the plasmas

    NASA Astrophysics Data System (ADS)

    Sasaki, Akira; Sunahara, Atushi; Furukawa, Hiroyuki; Nishihara, Katsunobu; Nishikawa, Takeshi; Koike, Fumihiro

    2016-03-01

    Laser-produced plasma (LPP) extreme ultraviolet (EUV) light sources have been intensively investigated due to their potential application to next-generation semiconductor technology. Current studies focus on the atomic processes and hydrodynamics of plasmas to develop shorter-wavelength sources at λ = 6.x nm as well as to improve the conversion efficiency (CE) of λ = 13.5 nm sources. This paper examines the atomic processes of mid-Z elements, which are potential candidates for a λ = 6.x nm source using n = 3-3 transitions. Furthermore, a method to calculate the hydrodynamics of the plasmas, in terms of the initial interaction with a relatively weak prepulse laser, is presented.

  6. A hybrid approach for nonlinear computational aeroacoustics predictions

    NASA Astrophysics Data System (ADS)

    Sassanis, Vasileios; Sescu, Adrian; Collins, Eric M.; Harris, Robert E.; Luke, Edward A.

    2017-01-01

    In many aeroacoustics applications involving nonlinear waves and obstructions in the far-field, approaches based on the classical acoustic analogy theory or the linearised Euler equations are unable to fully characterise the acoustic field. Therefore, computational aeroacoustics hybrid methods that incorporate nonlinear wave propagation have to be constructed. In this study, a hybrid approach coupling Navier-Stokes equations in the acoustic source region with nonlinear Euler equations in the acoustic propagation region is introduced and tested. The full Navier-Stokes equations are solved in the source region to identify the acoustic sources. The flow variables of interest are then transferred from the source region to the acoustic propagation region, where the full nonlinear Euler equations with source terms are solved. The transition between the two regions is made through a buffer zone where the flow variables are penalised via a source term added to the Euler equations. Tests were conducted on simple acoustic and vorticity disturbances, two-dimensional jets (Mach 0.9 and 2), and a three-dimensional jet (Mach 1.5), impinging on a wall. The method is proven to be effective and accurate in predicting sound pressure levels associated with the propagation of linear and nonlinear waves in the near- and far-field regions.
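
    As a LaTeX sketch of the buffer-zone treatment described above, in generic notation (not necessarily the authors'), the propagation-region equations take the form of the nonlinear Euler equations with an added penalisation source term,

        \frac{\partial \mathbf{Q}}{\partial t} + \nabla \cdot \mathbf{F}(\mathbf{Q})
            = \mathbf{S}_{\mathrm{ac}}
              \;-\; \sigma(\mathbf{x})\,\bigl(\mathbf{Q} - \mathbf{Q}_{\mathrm{target}}\bigr),

    where Q denotes the conservative variables, S_ac the source terms transferred from the Navier-Stokes (source) region, and σ(x) a penalisation coefficient that is nonzero only in the buffer zone and relaxes Q toward the target solution Q_target supplied by the source region.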

  7. Mapping water availability, projected use and cost in the western United States

    NASA Astrophysics Data System (ADS)

    Tidwell, Vincent C.; Moreland, Barbara D.; Zemlick, Katie M.; Roberts, Barry L.; Passell, Howard D.; Jensen, Daniel; Forsgren, Christopher; Sehlke, Gerald; Cook, Margaret A.; King, Carey W.; Larsen, Sara

    2014-05-01

    New demands for water can be satisfied through a variety of source options. In some basins surface and/or groundwater may be available through permitting with the state water management agency (termed unappropriated water), alternatively water might be purchased and transferred out of its current use to another (termed appropriated water), or non-traditional water sources can be captured and treated (e.g., wastewater). The relative availability and cost of each source are key factors in the development decision. Unfortunately, these measures are location dependent with no consistent or comparable set of data available for evaluating competing water sources. With the help of western water managers, water availability was mapped for over 1200 watersheds throughout the western US. Five water sources were individually examined, including unappropriated surface water, unappropriated groundwater, appropriated water, municipal wastewater and brackish groundwater. Also mapped was projected change in consumptive water use from 2010 to 2030. Associated costs to acquire, convey and treat the water, as necessary, for each of the five sources were estimated. These metrics were developed to support regional water planning and policy analysis with initial application to electric transmission planning in the western US.

  8. A Generalized Evolution Criterion in Nonequilibrium Convective Systems

    NASA Astrophysics Data System (ADS)

    Ichiyanagi, Masakazu; Nisizima, Kunisuke

    1989-04-01

    A general evolution criterion, applicable to transport processes such as the conduction of heat and mass diffusion, is obtained as a direct version of the Le Chatelier-Braun principle for stationary states. The present theory is not based on any radical departure from the conventional one. The generalized theory is made determinate by proposing the balance equations for extensive thermodynamic variables which will reflect the character of convective systems under the assumption of local equilibrium. As a consequence of the introduction of source terms in the balance equations, there appear additional terms in the expression of the local entropy production, which are bilinear in terms of the intensive variables and the sources. In the present paper, we show that we can construct a dissipation function for such general cases, in which the premises of the Glansdorff-Prigogine theory are accumulated. The new dissipation function permits us to formulate a generalized evolution criterion for convective systems.
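
    In generic nonequilibrium-thermodynamics notation (a sketch, not necessarily the authors' symbols), the local entropy production with the added source terms can be written as

        \sigma_s \;=\; \sum_k \mathbf{J}_k \cdot \mathbf{X}_k \;+\; \sum_k \Gamma_k\, s_k \;\ge\; 0,

    where J_k are the fluxes, X_k the conjugate thermodynamic forces (gradients of the intensive variables), and the additional bilinear terms Γ_k s_k pair the intensive variables Γ_k with the sources s_k introduced in the balance equations for the convective system.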

  9. The VISTA spacecraft: Advantages of ICF (Inertial Confinement Fusion) for interplanetary fusion propulsion applications

    NASA Technical Reports Server (NTRS)

    Orth, Charles D.; Klein, Gail; Sercel, Joel; Hoffman, Nate; Murray, Kathy; Chang-Diaz, Franklin

    1987-01-01

    Inertial Confinement Fusion (ICF) is an attractive engine power source for interplanetary manned spacecraft, especially for near-term missions requiring minimum flight duration, because ICF has inherent high power-to-mass ratios and high specific impulses. We have developed a new vehicle concept called VISTA that uses ICF and is capable of round-trip manned missions to Mars in 100 days using A.D. 2020 technology. We describe VISTA's engine operation, discuss associated plasma issues, and describe the advantages of DT fuel for near-term applications. Although ICF is potentially superior to non-fusion technologies for near-term interplanetary transport, the performance capabilities of VISTA cannot be meaningfully compared with those of magnetic-fusion systems because of the lack of a comparable study of the magnetic-fusion systems. We urge that such a study be conducted.

  10. Hiring in a Hobbesian World. Social Infrastructure and Employers' Use of Information.

    ERIC Educational Resources Information Center

    Miller, Shazia Rafiullah; Rosenbaum, James E.

    1997-01-01

    Interviews of 51 employers showed they do not use transcripts or teacher recommendations in hiring. They mistrust applicant information from most sources, emphasizing interviews and "gut instinct," which often gives invalid results. They tend to use information from other employees or long-term social networks. (SK)

  11. Hydrogen: A Future Energy Mediator?

    ERIC Educational Resources Information Center

    Environmental Science and Technology, 1975

    1975-01-01

    Hydrogen may be the fuel to help the United States move to a non-fossil energy source. Although hydrogen may not be widely used as a fuel until after the turn of the century, special applications may become feasible in the short term. Costs, uses, safety, and production methods are discussed. (BT)

  12. Modeling greenhouse gas emissions from dairy farms

    USDA-ARS?s Scientific Manuscript database

    Dairy farms have been identified as an important source of greenhouse gas emissions. Within the farm, important emissions include enteric methane (CH4) from the animals, CH4 and nitrous oxide (N2O) from manure in housing facilities, during long-term storage and during field application, and N2O from...

  13. A Blended Learning Scenario to Enhance Learners' Oral Production Skills

    ERIC Educational Resources Information Center

    Kim, Hee-Kyung

    2015-01-01

    This paper examines the effectiveness of a mobile assisted blended learning scenario for pronunciation in Korean language. In particular, we analyze how asynchronous oral communication between learners of Korean and native speakers via "kakaotalk" (an open source mobile phone application) may be beneficial to the learner in terms of…

  14. 78 FR 28184 - Inviting Applications for Rural Business Opportunity Grants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-14

    ... long-term planning that integrates targeted investments in workforce training, infrastructure, research.... Examples of acceptable documentation include: A signed letter from the source of funds stating the amount... CFR 4284.639(d)(2)). You must describe a structural change (for example, the loss of major employer or...

  15. 40 CFR 142.307 - What terms and conditions must be included in a small system variance?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... that may affect proper and effective operation and maintenance of the technology; (2) Monitoring... effective installation, operation and maintenance of the applicable small system variance technology in... health, which may include: (i) Public education requirements; and (ii) Source water protection...

  16. Source term evaluation for combustion modeling

    NASA Technical Reports Server (NTRS)

    Sussman, Myles A.

    1993-01-01

    A modification is developed for application to the source terms used in combustion modeling. The modification accounts for the error of the finite difference scheme in regions where chain-branching chemical reactions produce exponential growth of species densities. The modification is first applied to a one-dimensional scalar model problem. It is then generalized to multiple chemical species, and used in quasi-one-dimensional computations of shock-induced combustion in a channel. Grid refinement studies demonstrate the improved accuracy of the method using this modification. The algorithm is applied in two spatial dimensions and used in simulations of steady and unsteady shock-induced combustion. Comparisons with ballistic range experiments give confidence in the numerical technique and the 9-species hydrogen-air chemistry model.
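
    A minimal Python sketch of the error being addressed: in chain-branching regions a species density grows roughly as dn/dt = k n, and a plain explicit source update with a coarse time step badly underestimates that exponential growth. The comparison below is only an illustration of the error, with arbitrary numbers; it is not the paper's specific modification.

        import numpy as np

        k, dt, n0 = 50.0, 0.05, 1.0e-6      # k*dt = 2.5: growth under-resolved

        exact = n0 * np.exp(k * dt)                         # exponential growth
        euler_1step = n0 * (1.0 + k * dt)                   # explicit source update
        euler_substeps = n0 * (1.0 + k * dt / 100) ** 100   # resolved reference

        print(f"exact growth factor        {exact / n0:8.3f}")
        print(f"explicit, one coarse step  {euler_1step / n0:8.3f}")
        print(f"explicit, 100 substeps     {euler_substeps / n0:8.3f}")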

  17. HIGH-PRECISION ASTROMETRIC MILLIMETER VERY LONG BASELINE INTERFEROMETRY USING A NEW METHOD FOR ATMOSPHERIC CALIBRATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rioja, M.; Dodson, R., E-mail: maria.rioja@icrar.org

    2011-04-15

    We describe a new method which achieves high-precision very long baseline interferometry (VLBI) astrometry in observations at millimeter (mm) wavelengths. It combines fast frequency-switching observations, to correct for the dominant non-dispersive tropospheric fluctuations, with slow source-switching observations, for the remaining ionospheric dispersive terms. We call this method source-frequency phase referencing. Provided that the switching cycles match the properties of the propagation media, one can recover the source astrometry. We present an analytic description of the two-step calibration strategy, along with an error analysis to characterize its performance. Also, we provide observational demonstrations of a successful application with observations using the Very Long Baseline Array at 86 GHz of the pairs of sources 3C274 and 3C273 and 1308+326 and 1308+328 under various conditions. We conclude that this method is widely applicable to mm-VLBI observations of many target sources, and unique in providing bona fide astrometrically registered images and high-precision relative astrometric measurements in mm-VLBI using existing and newly built instruments, including space VLBI.
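
    A LaTeX sketch of the standard frequency scalings that motivate the two-step calibration (generic notation, not the paper's): the observed interferometric phase can be split as

        \phi_{\mathrm{obs}}(\nu) = \phi_{\mathrm{geo}}(\nu) + \phi_{\mathrm{trop}}(\nu)
            + \phi_{\mathrm{ion}}(\nu) + \phi_{\mathrm{inst}},
        \qquad
        \phi_{\mathrm{trop}} \propto \nu, \qquad \phi_{\mathrm{ion}} \propto \nu^{-1}.

    Because the tropospheric term is non-dispersive (linear in frequency), phase solutions from a lower reference frequency can be scaled by the frequency ratio and applied at the mm target frequency on the fast switching cycle, leaving the slowly varying dispersive ionospheric residual to be removed by switching to a nearby calibrator source.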

  18. Isotropic source terms of San Jacinto fault zone earthquakes based on waveform inversions with a generalized CAP method

    NASA Astrophysics Data System (ADS)

    Ross, Z. E.; Ben-Zion, Y.; Zhu, L.

    2015-02-01

    We analyse source tensor properties of seven Mw > 4.2 earthquakes in the complex trifurcation area of the San Jacinto Fault Zone, CA, with a focus on isotropic radiation that may be produced by rock damage in the source volumes. The earthquake mechanisms are derived with generalized `Cut and Paste' (gCAP) inversions of three-component waveforms typically recorded by >70 stations at regional distances. The gCAP method includes parameters ζ and χ representing, respectively, the relative strength of the isotropic and CLVD source terms. The possible errors in the isotropic and CLVD components due to station variability are quantified with bootstrap resampling for each event. The results indicate statistically significant explosive isotropic components for at least six of the events, corresponding to ˜0.4-8 per cent of the total potency/moment of the sources. In contrast, the CLVD components for most events are not found to be statistically significant. Trade-off and correlation between the isotropic and CLVD components are studied using synthetic tests with realistic station configurations. The associated uncertainties are found to be generally smaller than the observed isotropic components. Two different tests with velocity model perturbation are conducted to quantify the uncertainty due to inaccuracies in the Green's functions. Applications of the Mann-Whitney U test indicate statistically significant explosive isotropic terms for most events, consistent with brittle damage production at the source.
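
    For context, a LaTeX sketch of the standard moment-tensor decomposition underlying this kind of analysis (the exact parameterisation of ζ and χ in gCAP may differ):

        \mathbf{M} \;=\; \mathbf{M}_{\mathrm{ISO}} + \mathbf{M}_{\mathrm{DC}} + \mathbf{M}_{\mathrm{CLVD}},
        \qquad
        \mathbf{M}_{\mathrm{ISO}} = \tfrac{1}{3}\,\mathrm{tr}(\mathbf{M})\,\mathbf{I},

    where the isotropic part carries the volumetric (explosive or implosive) radiation and the deviatoric remainder is split into double-couple (DC) and CLVD parts; a statistically significant positive isotropic term is the signature of the damage-related volume change discussed above.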

  19. Open source software integrated into data services of Japanese planetary explorations

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.

    2015-12-01

    Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data through simple methods such as HTTP directory listings for long-term preservation, while also offering rich web applications for ease of access, built with modern web technologies based on open source software. This presentation showcases the use of open source software across our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS); as the WMS server, the open source software MapServer is adopted. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for the SELENE data; its main purpose is public outreach, and it was developed with the NASA World Wind Java SDK. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations; it uses Highcharts to draw graphs in web browsers. Flow is a tool to simulate the field of view of an instrument onboard a spacecraft; this tool itself is open source software developed by JAXA/ISAS under the BSD 3-Clause License, and the SPICE Toolkit is essential to compile Flow. The SPICE Toolkit is also open source software, developed by NASA/JPL, and its website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool for integrating DARTS services.

  20. Seeing the light: Applications of in situ optical measurements for understanding DOM dynamics in river systems (Invited)

    NASA Astrophysics Data System (ADS)

    Pellerin, B. A.; Bergamaschi, B. A.; Downing, B. D.; Saraceno, J.; Fleck, J.; Shanley, J. B.; Aiken, G.; Boss, E.; Fujii, R.

    2009-12-01

    A critical challenge for understanding the sources, character and cycling of dissolved organic matter (DOM) is making measurements at the time scales in which changes occur in aquatic systems. Traditional approaches for data collection (daily to monthly discrete sampling) are often limited by analytical and field costs, site access and logistical challenges, particularly for long-term sampling at a large number of sites. The ability to make optical measurements of DOM in situ has been known for more than 50 years, but much of the work on in situ DOM absorbance and fluorescence using commercially-available instruments has taken place in the last few years. Here we present several recent examples that highlight the application of in situ measurements for understanding DOM dynamics in riverine systems at intervals of minutes to hours. Examples illustrate the utility of in situ optical sensors for studies of DOM over short-duration events of days to weeks (diurnal cycles, tidal cycles, storm events and snowmelt periods) as well as longer-term continuous monitoring for months to years. We also highlight the application of in situ optical DOM measurements as proxies for constituents that are significantly more difficult and expensive to measure at high frequencies (e.g. methylmercury, trihalomethanes). Relatively simple DOM absorbance and fluorescence measurements made in situ could be incorporated into short and long-term ecological research and monitoring programs, resulting in advanced understanding of organic matter sources, character and cycling in riverine systems.

  1. Research on Geo-information Data Model for Preselected Areas of Geological Disposal of High-level Radioactive Waste

    NASA Astrophysics Data System (ADS)

    Gao, M.; Huang, S. T.; Wang, P.; Zhao, Y. A.; Wang, H. B.

    2016-11-01

    The geological disposal of high-level radioactive waste (hereinafter referred to as "geological disposal") is a long-term, complex, and systematic scientific project, whose data and information resources in the research and development (hereinafter referred to as "R&D") process provide significant support for the R&D of the geological disposal system and lay a foundation for the long-term stability and safety assessment of the repository site. However, the data related to research and engineering in the siting of geological disposal repositories are complicated (multi-source, multi-dimensional and changeable), and the requirements for data accuracy and comprehensive application have become much higher than before, so the data model design of a geo-information database for the disposal repository faces serious challenges. In this paper, the data resources of the pre-selected areas of the repository have been comprehensively compiled and systematically analyzed. Based on a deep understanding of the application requirements, the research work provides a solution for the key technical problems, including a reasonable classification system for multi-source data entities, complex logical relations and effective physical storage structures. The new solution goes beyond the data classification and conventional spatial data organization models applied in the traditional industry, and realizes data organization and integration with data entities and spatial relationships as the basic unit, which are independent, complete and of significant applicative value in HLW geological disposal. Reasonable, feasible and flexible conceptual, logical and physical data models have been established so as to ensure effective integration and to facilitate application development of multi-source data in pre-selected areas for geological disposal.

  2. Rainfall intensity and phosphorus source effects on phosphorus transport in surface runoff from soil trays.

    PubMed

    Shigaki, Francirose; Sharpley, Andrew; Prochnow, Luis Ignacio

    2007-02-01

    Phosphorus runoff from agricultural fields amended with mineral fertilizers and manures has been linked to freshwater eutrophication. A rainfall simulation study was conducted to evaluate the effects of different rainfall intensities and P sources differing in water soluble P (WSP) concentration on P transport in runoff from soil trays packed with a Berks loam and grassed with annual ryegrass (Lolium multiflorum Lam.). Triple superphosphate (TSP; 79% WSP), low-grade super single phosphate (LGSSP; 50% WSP), North Carolina rock phosphate (NCRP; 0.5% WSP) and swine manure (SM; 70% WSP) were broadcast (100 kg total P ha-1), and rainfall was applied at 25, 50, and 75 mm h-1 at 1, 7, 21, and 56 days after P source application. The concentration of dissolved reactive (DRP), particulate (PP), and total P (TP) was significantly (P<0.01) greater in runoff with a rainfall intensity of 75 than 25 mm h-1 for all P sources. Further, runoff DRP increased as P source WSP increased, with runoff from a 50 mm h-1 rain 1 day after source application having a DRP concentration of 0.25 mg L-1 for NCRP and 28.21 mg L-1 for TSP. In contrast, the proportion of runoff TP as PP was greater with low (39% PP for NCRP) than high WSP sources (4% PP for TSP) averaged for all rainfall intensities. The increased PP transport is attributed to the detachment and transport of undissolved P source particles during runoff. These results show that P source water solubility and rainfall intensity can influence P transport in runoff, which is important in evaluating the long-term risks of P source application on P transport in surface runoff.

  3. Alfalfa Responses to Gypsum Application Measured Using Undisturbed Soil Columns

    PubMed Central

    Tirado-Corbalá, Rebecca; Slater, Brian K.; Dick, Warren A.; Barker, Dave

    2017-01-01

    Gypsum is an excellent source of Ca and S, both of which are required for crop growth. Large amounts of by-product gypsum [flue gas desulfurization gypsum (FGDG)] are produced from coal combustion in the United States, but only 4% is used for agricultural purposes. The objective of this study was to evaluate the effects of (1) no gypsum (untreated), (2) short-term gypsum application (4-year annual applications totaling 6720 kg ha−1), and (3) long-term gypsum application (12-year annual applications totaling 20,200 kg ha−1) on alfalfa (Medicago sativa L.) growth and nutrient uptake, and on gypsum movement through soil. The study was conducted in a greenhouse using undisturbed soil columns of two non-sodic soils (Celina silt loam and Brookston loam). Aboveground growth of alfalfa was not affected by gypsum treatments when compared with the untreated control (p > 0.05). Total root biomass (0–75 cm) for both soil series was significantly increased by gypsum application (p = 0.04); however, the increased root growth was restricted to the 0–10 cm depth. Soil and plant analyses indicated no unfavorable environmental impact from the 4-year and 12-year annual application of FGDG. We concluded that under sufficient water supply, by-product gypsum is a viable source of Ca and S for land application that might benefit alfalfa root growth, but has less effect on aboveground alfalfa biomass production. Undisturbed soil columns were a useful adaptation of the lysimeter method that allowed detailed measurements of alfalfa nutrient uptake, root biomass, and yield and nutrient movement in soil. PMID:28696383

  4. Development and application of a screening model for evaluating bioenhanced dissolution in DNAPL source zones

    NASA Astrophysics Data System (ADS)

    Phelan, Thomas J.; Abriola, Linda M.; Gibson, Jenny L.; Smits, Kathleen M.; Christ, John A.

    2015-12-01

    In-situ bioremediation, a widely applied treatment technology for source zones contaminated with dense non-aqueous phase liquids (DNAPLs), has proven economical and reasonably efficient for long-term management of contaminated sites. Successful application of this remedial technology, however, requires an understanding of the complex interaction of transport, mass transfer, and biotransformation processes. The bioenhancement factor, which represents the ratio of DNAPL mass transfer under microbially active conditions to that which would occur under abiotic conditions, is commonly used to quantify the effectiveness of a particular bioremediation remedy. To date, little research has been directed towards the development and validation of methods to predict bioenhancement factors under conditions representative of real sites. This work extends an existing, first-order, bioenhancement factor expression to systems with zero-order and Monod kinetics, representative of many source-zone scenarios. The utility of this model for predicting the bioenhancement factor for previously published laboratory and field experiments is evaluated. This evaluation demonstrates the applicability of these simple bioenhancement factors for preliminary experimental design and analysis, and for assessment of dissolution enhancement in ganglia-contaminated source zones. For ease of application, a set of nomographs is presented that graphically depicts the dependence of bioenhancement factor on physicochemical properties. Application of these nomographs is illustrated using data from a well-documented field site. Results suggest that this approach can successfully capture field-scale, as well as column-scale, behavior. Sensitivity analyses reveal that bioenhanced dissolution will critically depend on in-situ biomass concentrations.
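
    A minimal numerical sketch of the bioenhancement-factor idea (not the authors' analytical expressions): steady one-dimensional advection through a source zone with linear-driving-force NAPL dissolution and optional first-order biodegradation, with the factor taken as the ratio of the integrated interphase mass-transfer rates. All parameter names and values are hypothetical.

        import numpy as np

        def dissolution_rate(v, k_la, Cs, lam, L, n=2000):
            """Integrate steady 1-D advection through a NAPL source zone of length L,
                v dC/dx = k_la*(Cs - C) - lam*C,
            and return the total interphase mass-transfer rate, integral of k_la*(Cs - C) dx.
            v: pore-water velocity, k_la: lumped mass-transfer coefficient,
            Cs: effective solubility, lam: first-order biodegradation rate (0 = abiotic)."""
            x = np.linspace(0.0, L, n)
            dx = x[1] - x[0]
            C = np.zeros(n)          # clean water enters the source zone
            rate = 0.0
            for i in range(1, n):
                dCdx = (k_la * (Cs - C[i - 1]) - lam * C[i - 1]) / v
                C[i] = C[i - 1] + dCdx * dx
                rate += k_la * (Cs - 0.5 * (C[i] + C[i - 1])) * dx
            return rate

        # Hypothetical parameter values, for illustration only
        v, k_la, Cs, L = 0.1, 0.5, 150.0, 2.0
        abiotic = dissolution_rate(v, k_la, Cs, lam=0.0, L=L)
        biotic = dissolution_rate(v, k_la, Cs, lam=1.0, L=L)
        print("bioenhancement factor ~", biotic / abiotic)

    Because biodegradation keeps aqueous concentrations further below solubility, the biotic mass-transfer rate exceeds the abiotic one, giving a factor greater than unity.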

  5. Readability of online patient education materials for velopharyngeal insufficiency.

    PubMed

    Xie, Deborah X; Wang, Ray Y; Chinnadurai, Sivakumar

    2018-01-01

    Evaluate the readability of online and mobile application health information about velopharyngeal insufficiency (VPI). Top website and mobile application results for search terms "velopharyngeal insufficiency", "velopharyngeal dysfunction", "VPI", and "VPD" were analyzed. Readability was determined using 10 algorithms with Readability Studio Professional Edition (Oleander Software Ltd; Vandalia, OH). Subgroup analysis was performed based on search term and article source (academic hospital, general online resource, peer-reviewed journal, or professional organization). 18 unique articles were identified. The overall mean reading grade level was 12.89 ± 2.9. The highest reading level among these articles was 15.47, approximately the level of a college senior. Articles from "velopharyngeal dysfunction" had the highest mean reading level (13.73 ± 2.11), above "velopharyngeal insufficiency" (12.30 ± 1.56) and "VPI" (11.66 ± 1.70). Articles from peer-reviewed journals had the highest mean reading level (15.35 ± 2.79), while articles from academic hospitals had the lowest (12.81 ± 1.66). There were statistically significant differences in reading levels between the different search terms (P < 0.01) and article source types (P < 0.05). Only one mobile application was identified with VPI information, with a readability of 10.68. Despite published reading level guidelines, online patient education materials for VPI are disseminated with language too complex for most readers. There is also a lack of VPI-related mobile application data available for patients. Patients will benefit if future updates to websites and disseminated patient information are undertaken with health literacy in mind. Future studies will investigate patient comprehension of these materials. Copyright © 2017 Elsevier B.V. All rights reserved.
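
    As an illustration of the kind of metric such software computes, the sketch below implements the Flesch-Kincaid grade level, one commonly used readability formula; the study's commercial package applies ten algorithms and dictionary-based syllable counts, so its results will differ from this rough heuristic. The sample text is invented.

        import re

        def count_syllables(word: str) -> int:
            """Rough vowel-group heuristic; commercial tools use dictionaries instead."""
            word = word.lower()
            groups = re.findall(r"[aeiouy]+", word)
            n = len(groups)
            if word.endswith("e") and n > 1:
                n -= 1          # crude silent-e correction
            return max(n, 1)

        def flesch_kincaid_grade(text: str) -> float:
            """FKGL = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
            sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
            words = re.findall(r"[A-Za-z]+", text)
            syllables = sum(count_syllables(w) for w in words)
            return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

        sample = ("Velopharyngeal insufficiency occurs when the soft palate does not "
                  "close completely against the back of the throat during speech.")
        print(round(flesch_kincaid_grade(sample), 1))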

  6. A modification of the Regional Nutrient Management model (ReNuMa) to identify long-term changes in riverine nitrogen sources

    NASA Astrophysics Data System (ADS)

    Hu, Minpeng; Liu, Yanmei; Wang, Jiahui; Dahlgren, Randy A.; Chen, Dingjiang

    2018-06-01

    Source apportionment is critical for guiding development of efficient watershed nitrogen (N) pollution control measures. The ReNuMa (Regional Nutrient Management) model, a semi-empirical, semi-process-oriented model with modest data requirements, has been widely used for riverine N source apportionment. However, the ReNuMa model contains limitations for addressing long-term N dynamics by ignoring temporal changes in atmospheric N deposition rates and N-leaching lag effects. This work modified the ReNuMa model by revising the source code to allow yearly changes in atmospheric N deposition and incorporation of N-leaching lag effects into N transport processes. The appropriate N-leaching lag time was determined from cross-correlation analysis between annual watershed individual N source inputs and riverine N export. Accuracy of the modified ReNuMa model was demonstrated through analysis of a 31-year water quality record (1980-2010) from the Yongan watershed in eastern China. The revisions considerably improved the accuracy (Nash-Sutcliffe coefficient increased by ∼0.2) of the modified ReNuMa model for predicting riverine N loads. The modified model explicitly identified annual and seasonal changes in contributions of various N sources (i.e., point vs. nonpoint source, surface runoff vs. groundwater) to riverine N loads as well as the fate of watershed anthropogenic N inputs. Model results were consistent with previously modeled or observed lag time length as well as changes in riverine chloride and nitrate concentrations during the low-flow regime and available N levels in agricultural soils of this watershed. The modified ReNuMa model is applicable for addressing long-term changes in riverine N sources, providing decision-makers with critical information for guiding watershed N pollution control strategies.
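
    A sketch of the lag-time identification step described above, selecting the lag that maximizes the Pearson correlation between annual N inputs and riverine N export; the series here are synthetic, whereas the real analysis uses the watershed input and export records.

        import numpy as np

        def best_lag(inputs, exports, max_lag=10):
            """Return the lag (in years) with the largest Pearson correlation between
            watershed N inputs and riverine N export, plus the correlation values."""
            corrs = []
            for lag in range(max_lag + 1):
                x = inputs[: len(inputs) - lag]
                y = exports[lag:]
                corrs.append(np.corrcoef(x, y)[0, 1])
            return int(np.argmax(corrs)), corrs

        # Synthetic 31-year series: export echoes input with a 3-year delay plus noise
        rng = np.random.default_rng(0)
        n_input = 100 + rng.normal(0, 5, 31).cumsum()
        n_export = np.roll(n_input, 3) * 0.3 + rng.normal(0, 1, 31)
        n_export[:3] = n_export[3]          # pad the first, undefined years
        lag, corrs = best_lag(n_input, n_export)
        print("estimated lag (years):", lag)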

  7. The technology application process as applied to a firefighter's breathing system

    NASA Technical Reports Server (NTRS)

    Mclaughlan, P. B.

    1974-01-01

    The FBS Program indicated that applications of advanced technology can result in an improved FBS that will satisfy the requirements defined by municipal fire departments. To accomplish this technology transfer, a substantial commitment of resources over an extended period of time has been required. This program has indicated that the ability of NASA in terms of program management such as requirement definition, system analysis, and industry coordination may play as important a role as specific sources of hardware technology. As a result of the FBS program, a sequence of milestones was passed that may have applications as generalized milestones and objectives for any technical application program.

  8. TE/TM decomposition of electromagnetic sources

    NASA Technical Reports Server (NTRS)

    Lindell, Ismo V.

    1988-01-01

    Three methods are given by which bounded EM sources can be decomposed into two parts radiating transverse electric (TE) and transverse magnetic (TM) fields with respect to a given constant direction in space. The theory applies source equivalence and nonradiating source concepts, which lead to decomposition methods based on a recursive formula or two differential equations for the determination of the TE and TM components of the original source. Decompositions for a dipole in terms of point, line, and plane sources are studied in detail. The planar decomposition is seen to match an earlier result given by Clemmow (1963). As an application of the point decomposition method, it is demonstrated that the general exact image expression for the Sommerfeld half-space problem, previously derived through heuristic reasoning, can be more straightforwardly obtained through the present decomposition method.
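
    For reference, the standard meaning of TE and TM with respect to a fixed unit vector û (this is the textbook definition, not the paper's specific decomposition formulas): the total field and the source split as

        \[
        \mathbf{E}=\mathbf{E}^{\mathrm{TE}}+\mathbf{E}^{\mathrm{TM}},\qquad
        \hat{\mathbf{u}}\cdot\mathbf{E}^{\mathrm{TE}}=0,\qquad
        \hat{\mathbf{u}}\cdot\mathbf{H}^{\mathrm{TM}}=0,\qquad
        \mathbf{J}=\mathbf{J}^{\mathrm{TE}}+\mathbf{J}^{\mathrm{TM}},
        \]

    where each source component radiates only the field of its own type with respect to û.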

  9. Long-term ferrocyanide application via deicing salts promotes the establishment of Actinomycetales assimilating ferrocyanide-derived carbon in soil.

    PubMed

    Gschwendtner, Silvia; Mansfeldt, Tim; Kublik, Susanne; Touliari, Evangelia; Buegger, Franz; Schloter, Michael

    2016-07-01

    Cyanides are highly toxic and produced by various microorganisms as defence strategy or to increase their competitiveness. As degradation is the most efficient way of detoxification, some microbes developed the capability to use cyanides as carbon and nitrogen source. However, it is not clear if this potential also helps to lower cyanide concentrations in roadside soils where deicing salt application leads to significant inputs of ferrocyanide. The question remains if biodegradation in soils can occur without previous photolysis. By conducting a microcosm experiment using soils with/without pre-exposition to road salts spiked with 13C-labelled ferrocyanide, we were able to confirm biodegradation and in parallel to identify bacteria using ferrocyanide as C source via DNA stable isotope probing (DNA-SIP), TRFLP fingerprinting and pyrosequencing. Bacteria assimilating 13C were highly similar in the pre-exposed soils, belonging mostly to Actinomycetales (Kineosporia, Mycobacterium, Micromonosporaceae). In the soil without pre-exposition, bacteria belonging to Acidobacteria (Gp3, Gp4, Gp6), Gemmatimonadetes (Gemmatimonas) and Gammaproteobacteria (Thermomonas, Xanthomonadaceae) used ferrocyanide as C source but not the present Actinomycetales. This indicated that (i) various bacteria are able to assimilate ferrocyanide-derived C and (ii) long-term exposition to ferrocyanide applied with deicing salts leads to Actinomycetales outcompeting other microorganisms for the use of ferrocyanide as C source. © 2016 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  10. Make Your Own Mashup Maps

    ERIC Educational Resources Information Center

    Lucking, Robert A.; Christmann, Edwin P.; Whiting, Mervyn J.

    2008-01-01

    "Mashup" is a new technology term used to describe a web application that combines data or technology from several different sources. You can apply this concept in your classroom by having students create their own mashup maps. Google Maps provides you with the simple tools, map databases, and online help you'll need to quickly master this…

  11. Nutrient cycling in an agroforestry alley cropping system receiving poultry litter or nitrogen fertilizer

    USDA-ARS?s Scientific Manuscript database

    Optimal utilization of animal manures as a plant nutrient source should also prevent adverse impacts on water quality. The objective of this study was to evaluate long-term poultry litter and N fertilizer application on nutrient cycling following establishment of an alley cropping system with easter...

  12. Energy for Development: Third World Options. Worldwatch Paper 15.

    ERIC Educational Resources Information Center

    Hayes, Denis

    Focusing on the need for energy to sustain economic development on a long-term basis, the document examines energy options of the post-petroleum era in developing nations. Nuclear power and solar power are the most important among proposed alternative energy sources. Limited applicability of nuclear technology to the Third World is discussed.…

  13. COMPARATIVE POTENCY METHOD FOR CANCER RISK ASSESSMENT: APPLICATION TO THE QUANTITATIVE ASSESSMENT OF THE CONTRIBUTION OF COMBUSTION EMISSIONS TO LUNG CANCER RISK

    EPA Science Inventory

    Combustion sources emit soot particles containing carcinogenic polycyclic organic compounds which are mutagenic in short-term genetic bioassays in microbial and mammalian cells and are tumorigenic in animals. Although soot is considered to be a human carcinogen, soots from differ...

  14. 76 FR 12715 - Magnuson-Stevens Act Provisions; General Provisions for Domestic Fisheries; Application for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-08

    ... and marketing evaluation and strategies; and outreach and implementation of the project results. The... devise strategies and means to efficiently harvest the redfish resource in the Gulf of Maine (GOM) while... in terms of their potential effects on results. Sources of variability include: Area fished; seasonal...

  15. NSRD-10: Leak Path Factor Guidance Using MELCOR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Louie, David; Humphries, Larry L.

    Estimating the source term from a U.S. Department of Energy (DOE) nuclear facility requires that analysts know how to apply the simulation tools used, such as the MELCOR code, particularly for a complicated facility that may include an air ventilation system and other active systems that can influence the environmental pathway of the materials released. DOE has designated MELCOR 1.8.5, an unsupported version, as a DOE ToolBox code in its Central Registry, which includes a leak-path-factor guidance report written in 2004 that did not include experimental validation data. Continuing to use this MELCOR version would require additional verification and validation, which may not be feasible from a project cost standpoint. Instead, the current MELCOR version should be used. Without developer support and experimental data validation, it is difficult to convince regulators that the calculated source term from the DOE facility is accurate and defensible. This research replaces the obsolete version in the 2004 DOE leak path factor guidance report with MELCOR 2.1 (the latest version of MELCOR, with continuing model development and user support) and includes applicable experimental data from the reactor safety arena and from DOE-HDBK-3010. It also provides best-practice values for use in MELCOR 2.1 specifically for leak path determination. With these enhancements, the revised leak-path-factor guidance report should give confidence to DOE safety analysts using MELCOR as a source-term determination tool for mitigated accident evaluations.
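
    As context for how a leak path factor enters a facility source-term estimate, a minimal sketch of the DOE-HDBK-3010 five-factor formula is given below; the numerical values are hypothetical, and in practice the leak path factor itself would come from a MELCOR transport calculation rather than being assumed.

        def mitigated_source_term(MAR, DR, ARF, RF, LPF):
            """Five-factor formula of DOE-HDBK-3010: respirable release to the environment.
            MAR: material at risk (grams); DR: damage ratio; ARF: airborne release fraction;
            RF: respirable fraction; LPF: leak path factor (all but MAR dimensionless)."""
            return MAR * DR * ARF * RF * LPF

        # Hypothetical illustration: 1 kg of powder at risk; the LPF value is assumed here
        # but would normally be computed by a MELCOR leak-path calculation.
        print(mitigated_source_term(MAR=1000.0, DR=1.0, ARF=1e-3, RF=0.1, LPF=0.05))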

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeze, R.A.; McWhorter, D.B.

    Many emerging remediation technologies are designed to remove contaminant mass from source zones at DNAPL sites in response to regulatory requirements. There is often concern in the regulated community as to whether mass removal actually reduces risk, or whether the small risk reductions achieved warrant the large costs incurred. This paper sets out a proposed framework for quantifying the degree to which risk is reduced as mass is removed from DNAPL source areas in shallow, saturated, low-permeability media. Risk is defined in terms of meeting an alternate concentration limit (ACL) at a compliance well in an aquifer underlying the source zone. The ACL is back-calculated from a carcinogenic health-risk characterization at a downgradient water-supply well. Source-zone mass-removal efficiencies are heavily dependent on the distribution of mass between media (fractures, matrix) and phase (aqueous, sorbed, NAPL). Due to the uncertainties in currently available technology performance data, the scope of the paper is limited to developing a framework for generic technologies rather than making specific risk-reduction calculations for individual technologies. Despite the qualitative nature of the exercise, results imply that very high total mass-removal efficiencies are required to achieve significant long-term risk reduction with technology applications of finite duration. This paper is not an argument for no action at contaminated sites. Rather, it provides support for the conclusions of Cherry et al. (1992) that the primary goal of current remediation should be short-term risk reduction through containment, with the aim to pass on to future generations site conditions that are well-suited to the future applications of emerging technologies with improved mass-removal capabilities.

  17. Analysis of Vibratory Excitation of Gear Systems as a Contributor to Aircraft Interior Noise. [helicopter cabin noise

    NASA Technical Reports Server (NTRS)

    Mark, W. D.

    1979-01-01

    Application of the transfer function approach to predict the resulting interior noise contribution requires gearbox vibration sources and paths to be characterized in the frequency domain. Tooth-face deviations from perfect involute surfaces were represented in terms of Legendre polynomials which may be directly interpreted in terms of tooth-spacing errors, mean and random deviations associated with involute slope and fullness, lead mismatch and crowning, and analogous higher-order components. The contributions of these components to the spectrum of the static transmission error is discussed and illustrated using a set of measurements made on a pair of helicopter spur gears. The general methodology presented is applicable to both spur and helical gears.
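
    A sketch of the Legendre-series representation described above, fitting a hypothetical tooth-profile deviation with numpy's Legendre tools; the profile, sample count and polynomial order are assumptions for illustration, and the loose interpretation of the low-order coefficients follows the abstract rather than the paper's exact definitions.

        import numpy as np
        from numpy.polynomial import legendre as L

        # Hypothetical deviation of a tooth surface from the perfect involute,
        # sampled at 50 points with the roll coordinate normalized to [-1, 1].
        xi = np.linspace(-1.0, 1.0, 50)
        deviation = (2e-6 * xi                      # slope-type (linear) component
                     + 1.5e-6 * (3 * xi**2 - 1) / 2  # fullness/crowning-type component
                     + 1e-7 * np.random.default_rng(1).normal(size=xi.size))

        coeffs = L.legfit(xi, deviation, deg=4)   # Legendre-series coefficients
        recon = L.legval(xi, coeffs)              # reconstructed smooth profile

        print(np.round(coeffs, 8))
        print("rms residual:", np.sqrt(np.mean((deviation - recon) ** 2)))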

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabrera-Palmer, Belkis

    Predicting the performance of radiation detection systems at field sites based on measured performance acquired under controlled conditions at test locations, e.g., the Nevada National Security Site (NNSS), remains an unsolved and standing issue within DNDO’s testing methodology. Detector performance can be defined in terms of the system’s ability to detect and/or identify a given source or set of sources, and depends on the signal generated by the detector for the given measurement configuration (i.e., source strength, distance, time, surrounding materials, etc.) and on the quality of the detection algorithm. Detector performance is usually evaluated in the performance and operational testing phases, where the measurement configurations are selected to represent radiation source and background configurations of interest to security applications.

  19. Miniaturized pulsed laser source for time-domain diffuse optics routes to wearable devices.

    PubMed

    Di Sieno, Laura; Nissinen, Jan; Hallman, Lauri; Martinenghi, Edoardo; Contini, Davide; Pifferi, Antonio; Kostamovaara, Juha; Mora, Alberto Dalla

    2017-08-01

    We validate a miniaturized pulsed laser source for use in time-domain (TD) diffuse optics, following rigorous and shared protocols for performance assessment of this class of devices. This compact source (12 × 6 mm²) has been previously developed for range finding applications and is able to provide short, high energy (∼100 ps, ∼0.5 nJ) optical pulses at up to 1 MHz repetition rate. Here, we start with a basic level laser characterization with an analysis of suitability of this laser for the diffuse optics application. Then, we present a TD optical system using this source and its performances in both recovering optical properties of tissue-mimicking homogeneous phantoms and in detecting localized absorption perturbations. Finally, as a proof of concept of in vivo application, we demonstrate that the system is able to detect hemodynamic changes occurring in the arm of healthy volunteers during a venous occlusion. Squeezing the laser source in a small footprint removes a key technological bottleneck that has hampered so far the realization of a miniaturized TD diffuse optics system, able to compete with already assessed continuous-wave devices in terms of size and cost, but with wider performance potentialities, as demonstrated by research over the last two decades. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  20. Numerical Analysis of 2-D and 3-D MHD Flows Relevant to Fusion Applications

    DOE PAGES

    Khodak, Andrei

    2017-08-21

    Here, the analysis of many fusion applications such as liquid-metal blankets requires application of computational fluid dynamics (CFD) methods for electrically conductive liquids in geometrically complex regions and in the presence of a strong magnetic field. A current state of the art general purpose CFD code allows modeling of the flow in complex geometric regions, with simultaneous conjugated heat transfer analysis in liquid and surrounding solid parts. Together with a magnetohydrodynamics (MHD) capability, the general purpose CFD code will be a valuable tool for the design and optimization of fusion devices. This paper describes an introduction of MHD capability into the general purpose CFD code CFX, part of the ANSYS Workbench. The code was adapted for MHD problems using a magnetic induction approach. CFX allows introduction of user-defined variables using transport or Poisson equations. For MHD adaptation of the code three additional transport equations were introduced for the components of the magnetic field, in addition to the Poisson equation for electric potential. The Lorentz force is included in the momentum transport equation as a source term. Fusion applications usually involve very strong magnetic fields, with values of the Hartmann number of up to tens of thousands. In this situation the system of MHD equations becomes very rigid, with very large source terms and very strong variable gradients. To increase system robustness, special measures were introduced during the iterative convergence process, such as linearization using source coefficient for momentum equations. The MHD implementation in the general purpose CFD code was tested against benchmarks specifically selected for liquid-metal blanket applications. Results of numerical simulations using the present implementation closely match analytical solutions for a Hartmann number of up to 1500 for a 2-D laminar flow in a duct of square cross section, with conducting and nonconducting walls. Results for a 3-D test case are also included.
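
    For reference, the magnetic induction formulation mentioned above is commonly written as follows (a standard textbook form; the exact CFX implementation may differ in detail):

        \[
        \frac{\partial \mathbf{B}}{\partial t}
        = \nabla\times(\mathbf{u}\times\mathbf{B})
        + \frac{1}{\mu_0\sigma}\nabla^{2}\mathbf{B},
        \qquad \nabla\cdot\mathbf{B}=0,
        \]

    with the Lorentz force entering the momentum equation as the source term

        \[
        \mathbf{F}_L=\mathbf{J}\times\mathbf{B}
        =\frac{1}{\mu_0}\bigl(\nabla\times\mathbf{B}\bigr)\times\mathbf{B},
        \qquad
        \mathrm{Ha}=B\,L\sqrt{\frac{\sigma}{\rho\nu}},
        \]

    where the Hartmann number Ha measures the ratio of electromagnetic to viscous forces and explains the stiffness noted for values in the thousands.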

  1. Numerical Analysis of 2-D and 3-D MHD Flows Relevant to Fusion Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khodak, Andrei

    Here, the analysis of many fusion applications such as liquid-metal blankets requires application of computational fluid dynamics (CFD) methods for electrically conductive liquids in geometrically complex regions and in the presence of a strong magnetic field. A current state of the art general purpose CFD code allows modeling of the flow in complex geometric regions, with simultaneous conjugated heat transfer analysis in liquid and surrounding solid parts. Together with a magnetohydrodynamics (MHD) capability, the general purpose CFD code will be a valuable tool for the design and optimization of fusion devices. This paper describes an introduction of MHD capability into the general purpose CFD code CFX, part of the ANSYS Workbench. The code was adapted for MHD problems using a magnetic induction approach. CFX allows introduction of user-defined variables using transport or Poisson equations. For MHD adaptation of the code three additional transport equations were introduced for the components of the magnetic field, in addition to the Poisson equation for electric potential. The Lorentz force is included in the momentum transport equation as a source term. Fusion applications usually involve very strong magnetic fields, with values of the Hartmann number of up to tens of thousands. In this situation the system of MHD equations becomes very rigid, with very large source terms and very strong variable gradients. To increase system robustness, special measures were introduced during the iterative convergence process, such as linearization using source coefficient for momentum equations. The MHD implementation in the general purpose CFD code was tested against benchmarks specifically selected for liquid-metal blanket applications. Results of numerical simulations using the present implementation closely match analytical solutions for a Hartmann number of up to 1500 for a 2-D laminar flow in a duct of square cross section, with conducting and nonconducting walls. Results for a 3-D test case are also included.

  2. A simple mass-conserved level set method for simulation of multiphase flows

    NASA Astrophysics Data System (ADS)

    Yuan, H.-Z.; Shu, C.; Wang, Y.; Shu, S.

    2018-04-01

    In this paper, a modified level set method is proposed for simulation of multiphase flows with large density ratio and high Reynolds number. The present method simply introduces a source or sink term into the level set equation to compensate for the mass loss or offset the mass increase. The source or sink term is derived analytically by applying the mass conservation principle with the level set equation and the continuity equation of the flow field. Since only a source term is introduced, the application of the present method is as simple as the original level set method, but it can guarantee the overall mass conservation. To validate the present method, the vortex flow problem is first considered. The simulation results are compared with those from the original level set method, which demonstrates that the modified level set method has the capability of accurately capturing the interface and maintaining mass conservation. Then, the proposed method is further validated by simulating the Laplace law, the merging of two bubbles, a bubble rising with high density ratio, and Rayleigh-Taylor instability with high Reynolds number. Numerical results show that mass is well conserved by the present method.
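
    A schematic of the mass-correction idea (our illustration; the paper derives its source term from the continuity equation and may differ in detail). Adding a source S to the level set transport equation and requiring the phase volume to stay constant gives

        \[
        \frac{\partial \phi}{\partial t}+\mathbf{u}\cdot\nabla\phi=S,\qquad
        \frac{d}{dt}\int_{\Omega} H(\phi)\,d\Omega
        =\int_{\Omega}\delta(\phi)\bigl(S-\mathbf{u}\cdot\nabla\phi\bigr)\,d\Omega=0,
        \]

    so that, for example, a correction taken uniform over the interface reads

        \[
        S=\frac{\int_{\Omega}\delta(\phi)\,\mathbf{u}\cdot\nabla\phi\,d\Omega}
               {\int_{\Omega}\delta(\phi)\,d\Omega},
        \]

    where H and δ are the (smoothed) Heaviside and delta functions marking one phase and its interface.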

  3. Sources and Nature of Cost Analysis Data Base Reference Manual.

    DTIC Science & Technology

    1983-07-01

    Interim report (update), USAAVRADCOM TM 83-F-3: Sources and Nature of Cost Analysis Data Base Reference Manual. The recoverable front matter of the scanned record lists sections on data for multiple applications, a glossary of cost analysis terms, references, and a bibliography.

  4. Parameterizing unresolved obstacles with source terms in wave modeling: A real-world application

    NASA Astrophysics Data System (ADS)

    Mentaschi, Lorenzo; Kakoulaki, Georgia; Vousdoukas, Michalis; Voukouvalas, Evangelos; Feyen, Luc; Besio, Giovanni

    2018-06-01

    Parameterizing the dissipative effects of small, unresolved coastal features is fundamental to improving the skill of wave models. The established technique to deal with this problem consists of reducing the amount of energy advected within the propagation scheme, and is currently available only for regular grids. To find a more general approach, Mentaschi et al. (2015b) formulated a technique based on source terms and validated it on synthetic case studies. This technique separates the parameterization of the unresolved features from the energy advection, and can therefore be applied to any numerical scheme and to any type of mesh. Here we developed an open-source library for the estimation of the transparency coefficients needed by this approach, from bathymetric data and for any type of mesh. The spectral wave model WAVEWATCH III was used to show that in a real-world domain, such as the Caribbean Sea, the proposed approach has skill comparable to, and sometimes better than, that of the established propagation-based technique.
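
    A rough sketch of what a "transparency coefficient" estimated from bathymetry can look like: the fraction of a coarse model cell that is not blocked by unresolved dry features on a finer bathymetry grid. The function name, grid sizes and threshold are our assumptions for illustration and are not the actual library's API.

        import numpy as np

        def transparency(depth, threshold=0.0):
            """Fraction of fine bathymetry sub-cells inside one coarse model cell that
            are wet (depth > threshold). 1.0 = fully open, 0.0 = fully blocked.
            `depth` is a 2-D array of fine-grid depths covering the coarse cell."""
            wet = depth > threshold
            return wet.mean()

        # Hypothetical fine bathymetry tile covering one coarse wave-model cell,
        # with a small unresolved island (negative depths = land).
        rng = np.random.default_rng(2)
        tile = rng.uniform(5.0, 60.0, size=(500, 500))
        tile[200:240, 300:330] = -1.0
        print("transparency coefficient:", round(float(transparency(tile)), 3))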

  5. The sound of moving bodies. Ph.D. Thesis - Cambridge Univ.

    NASA Technical Reports Server (NTRS)

    Brentner, Kenneth Steven

    1990-01-01

    The importance of the quadrupole source term in the Ffowcs Williams and Hawkings (FWH) equation was addressed. The quadrupole source contains fundamental components of the complete fluid mechanics problem, which are ignored only at the risk of error. The results made it clear that any application of the acoustic analogy should begin with all of the source terms in the FWH theory. The direct calculation of the acoustic field as part of the complete unsteady fluid mechanics problem using CFD is considered. It was shown that such calculations can indeed be made with CFD codes. The results indicate that the acoustic field is the component of the computation most susceptible to numerical error. Therefore, the ability to measure the damping of acoustic waves is absolutely essential to the development of acoustic computations. Essential groundwork for a new approach to the problem of sound generation by moving bodies is presented. This new computational acoustic approach holds the promise of solving many problems hitherto pushed aside.

  6. Gene and protein nomenclature in public databases

    PubMed Central

    Fundel, Katrin; Zimmer, Ralf

    2006-01-01

    Background Frequently, several alternative names are in use for biological objects such as genes and proteins. Applications like manual literature search, automated text-mining, named entity identification, gene/protein annotation, and linking of knowledge from different information sources require the knowledge of all used names referring to a given gene or protein. Various organism-specific or general public databases aim at organizing knowledge about genes and proteins. These databases can be used for deriving gene and protein name dictionaries. So far, little is known about the differences between databases in terms of size, ambiguities and overlap. Results We compiled five gene and protein name dictionaries for each of the five model organisms (yeast, fly, mouse, rat, and human) from different organism-specific and general public databases. We analyzed the degree of ambiguity of gene and protein names within and between dictionaries, to a lexicon of common English words and domain-related non-gene terms, and we compared different data sources in terms of size of extracted dictionaries and overlap of synonyms between those. The study shows that the number of genes/proteins and synonyms covered in individual databases varies significantly for a given organism, and that the degree of ambiguity of synonyms varies significantly between different organisms. Furthermore, it shows that, despite considerable efforts of co-curation, the overlap of synonyms in different data sources is rather moderate and that the degree of ambiguity of gene names with common English words and domain-related non-gene terms varies depending on the considered organism. Conclusion In conclusion, these results indicate that the combination of data contained in different databases allows the generation of gene and protein name dictionaries that contain significantly more used names than dictionaries obtained from individual data sources. Furthermore, curation of combined dictionaries considerably increases size and decreases ambiguity. The entries of the curated synonym dictionary are available for manual querying, editing, and PubMed- or Google-search via the ProThesaurus-wiki. For automated querying via custom software, we offer a web service and an exemplary client application. PMID:16899134

  7. Stem cell biobanks.

    PubMed

    Bardelli, Silvana

    2010-04-01

    Stem cells contribute to innate healing and harbor a promising role for regenerative medicine. Stem cell banking through long-term storage of different stem cell platforms represents a fundamental source to preserve original features of stem cells for patient-specific clinical applications. Stem cell research and clinical translation constitute fundamental and indivisible modules catalyzed through biobanking activity, generating a return of investment.

  8. The influence of legacy P on lake water quality in a Midwestern agricultural watershed

    USDA-ARS?s Scientific Manuscript database

    Decades of fertilizer and manure application have led to a buildup of phosphorus (P) in agricultural soils and stream and lake sediments, commonly referred to as legacy P. Legacy P can provide a long-term source of P to surface waters where it causes eutrophication. Using a suite of numerical model...

  9. DRUG ABUSE, A SOURCE BOOK AND GUIDE FOR TEACHERS.

    ERIC Educational Resources Information Center

    HILL, PATRICIA J.; KITZINGER, ANGELA

    THIS SOURCEBOOK CONTAINS INFORMATION TO HELP TEACHERS INSTRUCT ABOUT DRUGS AND DISCOURAGE DRUG ABUSE. THE INFORMATION IS APPLICABLE TO ANY GROUP OR GRADE LEVEL BUT IT IS PRIMARILY DIRECTED AT A K-12 PROGRAM. THE CONTENT HAS BEEN SELECTED, ORGANIZED, AND PRESENTED IN TERMS OF PRESUMED TEACHER NEED AND IS NOT INTENDED FOR DIRECT PUPIL USE.…

  10. Telling the story of tree species’ range shifts in a complex landscape

    Treesearch

    Sharon M. Stanton; Vicente J. Monleon; Heather E. Lintz; Joel Thompson

    2015-01-01

    The Forest Inventory and Analysis Program is the unrivaled source for long-term, spatially balanced, publicly available data. FIA will continue to be providers of data, but the program is growing and adapting, including a shift in how we communicate information and knowledge derived from those data. Online applications, interactive mapping, and infographics provide...

  11. Photovoltaics as a terrestrial energy source. Volume 3: An overview

    NASA Technical Reports Server (NTRS)

    Smith, J. L.

    1980-01-01

    Photovoltaic (PV) systems were evaluated in terms of their potential for terrestrial application. A comprehensive overview of important issues which bear on photovoltaic (PV) systems development is presented. Studies of PV system costs, the societal implications of PV system development, and strategies in PV research and development in relation to current energy policies are summarized.

  12. Biotechnological production and practical application of L-asparaginase enzyme.

    PubMed

    Vimal, Archana; Kumar, Awanish

    2017-04-01

    L-asparaginase is a vital enzyme of medical importance, renowned as a chemotherapeutic agent. Its relevance is not limited to anti-cancer use; it also has a wide range of medical applications, including antimicrobial activity and the treatment of infectious diseases, autoimmune diseases, and canine and feline cancer. Apart from the health care industry, its significance is also established in the food sector as a food processing agent to reduce acrylamide concentrations. L-asparaginase is known to be produced from various bacterial, fungal and plant sources. However, there is a huge market demand due to its wide range of applications. Therefore, industry is still searching for better-producing sources in terms of high yield and low immunogenicity. It can be produced by both submerged and solid-state fermentation, and each fermentation process has its own merits and demerits. This review focuses on improved production strategies, including statistical experimental optimization techniques, development of recombinant strains through mutagenesis, nanoparticle immobilization, and adoption of advanced, cost-effective purification techniques. The available research literature demonstrates the competence and therapeutic potential of this enzyme; research oriented toward the exploration of this clinically significant enzyme should therefore be accelerated. The objectives of this review are to discuss the high-yielding sources, current production strategies, improvement of production, effective downstream processing and therapeutic applications of L-asparaginase.

  13. Syphilis in immigrants and the Canadian immigration medical examination.

    PubMed

    MacPherson, Douglas W; Gushulak, Brian D

    2008-02-01

    Immigrants to Canada must undergo screening for syphilis. This study presents the results of syphilis screening from 2000 to 2004 and describes its impact on Canadian syphilis reporting and epidemiology. The study identifies migrant groups at risk of syphilis. All permanent resident applicants 15 years of age or older, younger individuals who have syphilis risk factors, and long-term temporary resident applicants are required to have non-treponemal syphilis screening done. Reactive results were confirmed. Immigration-related syphilis screening results were analyzed for year, migrant origin, migrant age and classification. A total of 2,209 individuals were found with positive syphilis serology from the screening of 2,001,417 applicants. The sex ratio of positive cases was M:F = 1.4. Rates per 100,000 applicants were: refugees 286, refugee claimants 267, family class 187, temporary residents 85, and economic class 63. Age and geographic distribution reflected sexual transmission, known international prevalence, and the Canadian processes of immigration. Certain immigration class applicants from syphilis high-prevalence source countries are a significant source of syphilis notifications in Canada. Identifiable populations and the immigration application medical processes represent global public health policy and program opportunities at the national level.

  14. Sewage sludge as fertiliser - environmental assessment of storage and land application options.

    PubMed

    Willén, A; Junestedt, C; Rodhe, L; Pell, M; Jönsson, H

    2017-03-01

    Sewage sludge (SS) contains beneficial plant nutrients and organic matter, and therefore application of SS on agricultural land helps close nutrient loops. However, spreading operations are restricted to certain seasons and hence the SS needs to be stored. Storage and land application of SS are both potential sources of greenhouse gases and ammonia, leading to global warming, acidification and eutrophication. Covering the stored SS, treating it with urea and choosing the correct time for land application all have the potential to reduce emissions from the system. Using life cycle assessment (LCA), this study compares storage and land application options of SS in terms of global warming potential (GWP), acidification potential, eutrophication potential and primary energy use. The system with covered storage has the lowest impact of all categories. Systems with autumn application are preferable to spring application for all impact categories but, when nitrate leaching is considered, spring application is preferable in terms of eutrophication and primary energy use and, for some SS treatments, GWP. Ammonia addition reduces nitrous oxide and ammonia emissions during storage, but increases these emissions after land application. Storage duration has a large impact on GWP, while amount of chemical nitrogen fertiliser substituted has a large impact on primary energy use.

  15. Ammonium as sole N source improves grain quality in wheat.

    PubMed

    Fuertes-Mendizábal, Teresa; González-Torralba, Jon; Arregui, Luis M; González-Murua, Carmen; González-Moro, M Begoña; Estavillo, José M

    2013-07-01

    The skilful handling of N fertilizer, including N source type and its timing, is necessary to obtain maximum profitability in wheat crops in terms of production and quality. Studies on grain yield and quality with ammonium as sole N source have not yet been conducted. The aim of this study was to evaluate the effect of N source management (nitrate vs. ammonium), and splitting it into two or three amendments during the wheat life cycle, on grain yield and quality under irrigated conditions. This experiment demonstrates that Cezanne wheat plants growing with ammonium as exclusive N source are able to achieve the same yield as plants growing with nitrate and that individual wheat plants grown in irrigated pots can efficiently use late N applied in GS37. Ammonium nutrition increased both types of grain reserve proteins (gliadins and glutenins) and also increased the ratio gli/glu with respect to nitrate nutrition. The splitting of the N rate enhanced the ammonium effect on grain protein composition. The application of ammonium N source, especially when split into three amendments, has an analogous effect on grain protein content and composition to applications at a higher N rate, leading to higher N use efficiency. © 2012 Society of Chemical Industry.

  16. Weak unique continuation property and a related inverse source problem for time-fractional diffusion-advection equations

    NASA Astrophysics Data System (ADS)

    Jiang, Daijun; Li, Zhiyuan; Liu, Yikan; Yamamoto, Masahiro

    2017-05-01

    In this paper, we first establish a weak unique continuation property for time-fractional diffusion-advection equations. The proof is mainly based on the Laplace transform and the unique continuation properties for elliptic and parabolic equations. The result is weaker than its parabolic counterpart in the sense that we additionally impose the homogeneous boundary condition. As a direct application, we prove the uniqueness for an inverse problem on determining the spatial component in the source term by interior measurements. Numerically, we reformulate our inverse source problem as an optimization problem, and propose an iterative thresholding algorithm. Finally, several numerical experiments are presented to show the accuracy and efficiency of the algorithm.
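
    For orientation, a schematic form of this problem class and of an iterative thresholding update is given below; the notation is ours, under the assumption of a separable source f(x)ϱ(t) with known temporal part and a linear forward map A from f to the interior data, and is not claimed to reproduce the paper's exact formulation.

        \[
        \partial_t^{\alpha}u-\Delta u+\mathbf{b}\cdot\nabla u=f(x)\,\varrho(t)\quad(0<\alpha<1),
        \qquad
        \min_{f}\;J(f)=\tfrac12\bigl\|Af-u^{\delta}\bigr\|^{2}+\tfrac{\beta}{2}\|f\|^{2},
        \]

    for which one common surrogate-functional (iterative thresholding) update reads

        \[
        f_{k+1}=\frac{M}{M+\beta}\,f_{k}-\frac{1}{M+\beta}\,A^{*}\bigl(Af_{k}-u^{\delta}\bigr),
        \qquad M\ge\|A\|^{2},
        \]

    where β is the regularization parameter and M a relaxation constant ensuring convergence.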

  17. Carbon-dependent alleviation of ammonia toxicity for algae cultivation and associated mechanisms exploration.

    PubMed

    Lu, Qian; Chen, Paul; Addy, Min; Zhang, Renchuan; Deng, Xiangyuan; Ma, Yiwei; Cheng, Yanling; Hussain, Fida; Chen, Chi; Liu, Yuhuan; Ruan, Roger

    2018-02-01

    Ammonia toxicity in wastewater is one of the factors that limit the application of algae technology in wastewater treatment. This work explored the correlation between carbon sources and ammonia assimilation and applied a glucose-assisted nitrogen starvation method to alleviate ammonia toxicity. In this study, ammonia toxicity to Chlorella sp. was observed when the NH3-N concentration reached 28.03 mM in artificial wastewater. Addition of alpha-ketoglutarate to wastewater promoted ammonia assimilation, but the low utilization efficiency and high cost of alpha-ketoglutarate limit its application in wastewater treatment. Comparison of three common carbon sources, glucose, citric acid, and sodium bicarbonate, indicates that in terms of ammonia assimilation, glucose is the best carbon source. Experimental results suggest that organic carbon with a good ability to generate energy and hydride donors may be critical to ammonia assimilation. Nitrogen starvation treatment assisted by glucose increased ammonia removal efficiencies and algal viabilities. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. High current liquid metal ion source using porous tungsten multiemitters.

    PubMed

    Tajmar, M; Vasiljevich, I; Grienauer, W

    2010-12-01

    We recently developed an indium Liquid-Metal-Ion-Source that can emit currents from sub-μA up to several mA. It is based on a porous tungsten crown structure with 28 individual emitters, which is manufactured using Micro-Powder Injection Molding (μPIM) and electrochemical etching. The emitter combines the advantages of internal capillary feeding with excellent emission properties due to micron-size tips. Significant progress was made on the homogeneity of the emission over its current-voltage characteristic as well as on investigating its long-term stability. This LMIS seems very suitable for space propulsion as well as for micro/nano manufacturing applications with greatly increased milling/drilling speeds. This paper summarizes the latest developments on our porous multiemitters with respect to manufacturing, emission properties and long-term testing. Copyright © 2010 Elsevier B.V. All rights reserved.

  19. Terrestrial Applications of Extreme Environment Stirling Space Power Systems

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger. W.

    2012-01-01

    NASA has been developing power systems capable of long-term operation in extreme environments such as the surface of Venus. This technology can use any external heat source to efficiently provide electrical power and cooling; and it is designed to be extremely efficient and reliable for extended space missions. Terrestrial applications include: use in electric hybrid vehicles; distributed home co-generation/cooling; and quiet recreational vehicle power generation. This technology can reduce environmental emissions, petroleum consumption, and noise while eliminating maintenance and environmental damage from automotive fluids such as oil lubricants and air conditioning coolant. This report will provide an overview of this new technology and its applications.

  20. A method for the development of disease-specific reference standards vocabularies from textual biomedical literature resources

    PubMed Central

    Wang, Liqin; Bray, Bruce E.; Shi, Jianlin; Fiol, Guilherme Del; Haug, Peter J.

    2017-01-01

    Objective Disease-specific vocabularies are fundamental to many knowledge-based intelligent systems and applications like text annotation, cohort selection, disease diagnostic modeling, and therapy recommendation. Reference standards are critical in the development and validation of automated methods for disease-specific vocabularies. The goal of the present study is to design and test a generalizable method for the development of vocabulary reference standards from expert-curated, disease-specific biomedical literature resources. Methods We formed disease-specific corpora from literature resources like textbooks, evidence-based synthesized online sources, clinical practice guidelines, and journal articles. Medical experts annotated and adjudicated disease-specific terms in four classes (i.e., causes or risk factors, signs or symptoms, diagnostic tests or results, and treatment). Annotations were mapped to UMLS concepts. We assessed source variation, the contribution of each source to build disease-specific vocabularies, the saturation of the vocabularies with respect to the number of used sources, and the generalizability of the method with different diseases. Results The study resulted in 2588 string-unique annotations for heart failure in four classes, and 193 and 425 respectively for pulmonary embolism and rheumatoid arthritis in treatment class. Approximately 80% of the annotations were mapped to UMLS concepts. The agreement among heart failure sources ranged between 0.28 and 0.46. The contribution of these sources to the final vocabulary ranged between 18% and 49%. With the sources explored, the heart failure vocabulary reached near saturation in all four classes with the inclusion of minimal six sources (or between four to seven sources if only counting terms occurred in two or more sources). It took fewer sources to reach near saturation for the other two diseases in terms of the treatment class. Conclusions We developed a method for the development of disease-specific reference vocabularies. Expert-curated biomedical literature resources are substantial for acquiring disease-specific medical knowledge. It is feasible to reach near saturation in a disease-specific vocabulary using a relatively small number of literature sources. PMID:26971304
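
    As a rough illustration of the saturation analysis described above, the sketch below accumulates unique terms over random orderings of per-source term sets and averages the resulting curve; the term sets are synthetic and do not reproduce the study's data.

        import random

        def saturation_curve(source_terms, n_orders=200, seed=0):
            """Average number of unique terms accumulated after including 1..N sources,
            over random orderings of the sources."""
            rng = random.Random(seed)
            n = len(source_terms)
            totals = [0.0] * n
            for _ in range(n_orders):
                order = list(source_terms)
                rng.shuffle(order)
                seen = set()
                for i, terms in enumerate(order):
                    seen |= set(terms)
                    totals[i] += len(seen)
            return [t / n_orders for t in totals]

        # Hypothetical term sets extracted from seven literature sources
        sources = [
            {"dyspnea", "edema", "orthopnea", "fatigue"},
            {"dyspnea", "edema", "rales", "jugular venous distension"},
            {"fatigue", "edema", "weight gain"},
            {"dyspnea", "orthopnea", "rales"},
            {"edema", "dyspnea", "nocturia"},
            {"rales", "edema", "fatigue"},
            {"dyspnea", "edema"},
        ]
        print([round(v, 1) for v in saturation_curve(sources)])

    A flattening curve indicates that adding further sources contributes few new terms, which is the sense in which the vocabulary "saturates".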

  1. Implementation issues of the nearfield equivalent source imaging microphone array

    NASA Astrophysics Data System (ADS)

    Bai, Mingsian R.; Lin, Jia-Hong; Tseng, Chih-Wen

    2011-01-01

    This paper revisits a previously proposed nearfield microphone array technique termed nearfield equivalent source imaging (NESI). In particular, various issues concerning the implementation of the NESI algorithm are examined. The NESI can be implemented in both the time domain and the frequency domain. Acoustical variables including sound pressure, particle velocity, active intensity and sound power are calculated by using multichannel inverse filters. Issues concerning sensor deployment are also investigated for the nearfield array. The uniform array outperformed a random array previously optimized for far-field imaging, which contradicts the conventional wisdom in far-field arrays. For applications in which only a patch array with few sensors is available, a virtual microphone approach is employed to ameliorate edge effects using extrapolation and to improve imaging resolution using interpolation. To enhance the processing efficiency of the time-domain NESI, an eigensystem realization algorithm (ERA) is developed. Several filtering methods are compared in terms of computational complexity. Significant savings in computation can be achieved using ERA and the frequency-domain NESI, as compared to the traditional method. The NESI technique was also experimentally validated using practical sources including a 125 cc scooter and a wooden box model with a loudspeaker fitted inside. The NESI technique proved effective in identifying the broadband and non-stationary noise produced by these sources.
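
    A minimal sketch of the equivalent-source inverse-filtering idea in the frequency domain (our simplification, not the authors' processing chain): measured pressures relate to equivalent source strengths through a matrix of free-field Green's functions, which is inverted with Tikhonov regularization. The geometry, frequency and regularization parameter below are assumptions for illustration.

        import numpy as np

        def greens_matrix(mic_pos, src_pos, k):
            """Free-field Green's functions G[m, s] = exp(-j*k*r)/(4*pi*r)."""
            r = np.linalg.norm(mic_pos[:, None, :] - src_pos[None, :, :], axis=-1)
            return np.exp(-1j * k * r) / (4 * np.pi * r)

        def reconstruct_sources(p, G, beta=1e-3):
            """Tikhonov-regularized inverse filter: q = (G^H G + beta I)^-1 G^H p."""
            GhG = G.conj().T @ G
            return np.linalg.solve(GhG + beta * np.eye(GhG.shape[0]), G.conj().T @ p)

        # Hypothetical setup: 8x8 microphone patch 5 cm above an 8x8 source plane, 2 kHz
        k = 2 * np.pi * 2000 / 343.0
        gx = np.linspace(0, 0.35, 8)
        mics = np.array([(x, y, 0.05) for x in gx for y in gx])
        srcs = np.array([(x, y, 0.0) for x in gx for y in gx])
        G = greens_matrix(mics, srcs, k)

        q_true = np.zeros(64, complex)
        q_true[27] = 1.0                                    # one active source point
        p = G @ q_true + 1e-4 * np.random.default_rng(3).normal(size=64)
        q_est = reconstruct_sources(p, G)
        print("peak located at source index", int(np.argmax(np.abs(q_est))))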

  2. Fusion or confusion: knowledge or nonsense?

    NASA Astrophysics Data System (ADS)

    Rothman, Peter L.; Denton, Richard V.

    1991-08-01

    The terms 'data fusion,' 'sensor fusion,' 'multi-sensor integration,' and 'multi-source integration' have been used widely in the technical literature to refer to a variety of techniques, technologies, systems, and applications which employ and/or combine data derived from multiple information sources. Applications of data fusion range from real-time fusion of sensor information for the navigation of mobile robots to the off-line fusion of both human and technical strategic intelligence data. The Department of Defense Critical Technologies Plan lists data fusion in the highest priority group of critical technologies, but just what is data fusion? The DoD Critical Technologies Plan states that data fusion involves 'the acquisition, integration, filtering, correlation, and synthesis of useful data from diverse sources for the purposes of situation/environment assessment, planning, detecting, verifying, diagnosing problems, aiding tactical and strategic decisions, and improving system performance and utility.' More simply stated, sensor fusion refers to the combination of data from multiple sources to provide enhanced information quality and availability over that which is available from any individual source alone. This paper presents a survey of the state-of-the-art in data fusion technologies, system components, and applications. A set of characteristics which can be utilized to classify data fusion systems is presented. Additionally, a unifying mathematical and conceptual framework within which to understand and organize fusion technologies is described. A discussion of often overlooked issues in the development of sensor fusion systems is also presented.

  3. China Naval Modernization: Implications for U.S. Navy Capabilities - Background and Issues for Congress

    DTIC Science & Technology

    2012-03-23

    [Scanned-record fragment: a force-structure table of carrier-based fighters and helicopters; text noting the strategic emphasis on the Asia-Pacific region and the Middle East while remaining applicable anywhere U.S. national security or vital interests are at stake; and a note on developing a near-term, long-range naval gunfire engagement capability for air, missile, and surface support applications against ground forces.]

  4. Long time stability of lamps with nanostructural carbon field emission cathodes

    NASA Astrophysics Data System (ADS)

    Kalenik, J.; Firek, P.; Szmidt, J.; Czerwosz, E.; Kozłowski, M.; Stepińska, I.; Wódka, T.

    2017-08-01

    A luminescent lamp with a field emission cathode was constructed and tested. Phosphor excited by electrons from the field emission cathode is the source of light. The cathode is covered with a nickel-carbon film containing multilayer carbon nanotubes that enhance electron emission from the cathode. Results of luminance stability measurements are presented. The luminance of the developed lamp is high enough for lighting applications. Long-term stability (1000 hours) is satisfactory for mass application of the lamp. The initial short-term decrease in luminance is still too high and needs to be reduced.

  5. Asymptotically and exactly energy balanced augmented flux-ADER schemes with application to hyperbolic conservation laws with geometric source terms

    NASA Astrophysics Data System (ADS)

    Navas-Montilla, A.; Murillo, J.

    2016-07-01

    In this work, an arbitrary order HLL-type numerical scheme is constructed using the flux-ADER methodology. The proposed scheme is based on an augmented Derivative Riemann solver that was used for the first time in Navas-Montilla and Murillo (2015) [1]. This solver, hereafter referred to as the Flux-Source (FS) solver, was conceived as a high order extension of the augmented Roe solver and led to a novel numerical scheme called the AR-ADER scheme. Here, we provide a general definition of the FS solver independently of the Riemann solver used in it. Moreover, a simplified version of the solver, referred to as the Linearized-Flux-Source (LFS) solver, is presented. This version of the FS solver allows the solution to be computed without requiring reconstruction of flux derivatives, although some drawbacks are evident. In contrast to other previously defined Derivative Riemann solvers, the proposed FS and LFS solvers take into account the presence of the source term in the resolution of the Derivative Riemann Problem (DRP), which is of particular interest when dealing with geometric source terms. When applied to the shallow water equations, the proposed HLLS-ADER and AR-ADER schemes can be constructed to fulfill the exactly well-balanced property, showing that an arbitrary quadrature of the integral of the source inside the cell does not ensure energy balanced solutions. As a result of this work, energy balanced flux-ADER schemes that provide the exact solution for steady cases and that converge to the exact solution with arbitrary order for transient cases are constructed.
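
    For context, the geometric source term referred to above can be illustrated with the one-dimensional shallow water equations over variable bed topography z(x) (a standard form, used here only as an example of the problem class, not as the paper's full system):

        \[
        \frac{\partial \mathbf{U}}{\partial t}+\frac{\partial \mathbf{F}(\mathbf{U})}{\partial x}=\mathbf{S}(\mathbf{U},x),
        \qquad
        \mathbf{U}=\begin{pmatrix}h\\ hu\end{pmatrix},\quad
        \mathbf{F}=\begin{pmatrix}hu\\ hu^{2}+\tfrac12 g h^{2}\end{pmatrix},\quad
        \mathbf{S}=\begin{pmatrix}0\\ -g h\,\partial_x z\end{pmatrix},
        \]

    and an exactly well-balanced scheme preserves discrete steady states such as the lake at rest (u = 0, h + z = const) by discretizing S consistently with the numerical flux.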

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeze, R.A.

    Many emerging remediation technologies are designed to remove contaminant mass from source zones at DNAPL sites in response to regulatory requirements. There is often concern in the regulated community as to whether mass removal actually reduces risk, or whether the small risk reductions achieved warrant the large costs incurred. This paper sets out a framework for quantifying the degree to which risk is reduced as mass is removed from shallow, saturated, low-permeability, dual-porosity, DNAPL source zones. Risk is defined in terms of meeting an alternate concentration level (ACL) at a compliance well in an aquifer underlying the source zone. The ACL is back-calculated from a carcinogenic health-risk characterization at a downstream water-supply well. Source-zone mass-removal efficiencies are heavily dependent on the distribution of mass between media (fractures, matrix) and phases (dissolved, sorbed, free product). Due to the uncertainties in currently-available technology performance data, the scope of the paper is limited to developing a framework for generic technologies rather than making risk-reduction calculations for specific technologies. Despite the qualitative nature of the exercise, results imply that very high mass-removal efficiencies are required to achieve significant long-term risk reduction with technology applications of finite duration. 17 refs., 7 figs., 6 tabs.
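
    The back-calculation described above can be illustrated with the standard carcinogenic risk equation for drinking-water ingestion; the exposure parameters, slope factor, and dilution factor below are generic illustrative values (assumptions), not numbers taken from the paper.

      # Hypothetical sketch: back-calculate an alternate concentration level (ACL)
      # from a target carcinogenic risk at a downstream water-supply well.
      def acl_from_risk(target_risk, slope_factor, intake_rate=2.0, exposure_freq=350,
                        exposure_dur=30, body_weight=70.0, avg_time=70 * 365,
                        dilution_factor=10.0):
          """Units: mg, L, kg, day. dilution_factor maps supply-well to compliance-well concentration."""
          # chronic daily intake per unit concentration (mg/kg-day per mg/L)
          cdi_per_conc = (intake_rate * exposure_freq * exposure_dur) / (body_weight * avg_time)
          c_supply = target_risk / (slope_factor * cdi_per_conc)   # allowable supply-well concentration
          return c_supply * dilution_factor                        # ACL at the compliance well

      # e.g. a one-in-a-million target risk and an assumed slope factor of 0.05 (mg/kg-day)^-1
      print(acl_from_risk(target_risk=1e-6, slope_factor=0.05))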

  7. ISS Ambient Air Quality: Updated Inventory of Known Aerosol Sources

    NASA Technical Reports Server (NTRS)

    Meyer, Marit

    2014-01-01

    Spacecraft cabin air quality is of fundamental importance to crew health, with concerns encompassing both gaseous contaminants and particulate matter. Little opportunity exists for direct measurement of aerosol concentrations on the International Space Station (ISS); however, an aerosol source model was developed for the purpose of filtration and ventilation system design. This model has been applied successfully, but since the initial effort an increase in the number of crewmembers from 3 to 6 and new processes on board the ISS necessitate an updated aerosol inventory to accurately reflect current ambient aerosol conditions. Results from recent analyses of dust samples from the ISS, combined with a literature review, provide new predicted aerosol emission rates in terms of size-segregated mass and number concentration. Some new aerosol sources have been considered and added to the existing array of materials. The goal of this work is to provide updated filtration model inputs that can verify that the current ISS filtration system is adequate and that filter lifetime targets are met. This inventory of aerosol sources is applicable to other spacecraft and becomes more important as NASA considers future long-term exploration missions, which will preclude the opportunity for resupply of filtration products.

  8. The implementation of the new Kentucky nitrogen and phosphorus index to reduce agricultural nonpoint source pollution

    USDA-ARS?s Scientific Manuscript database

    A new study released in September 2011 by the USDA found that all three best management practices (BMPs) for nitrogen, in terms of application rate, time, and method, are applied on only about a third of U.S. cropland (http://www.ers.usda.gov/Publications/ERR127/). Without BMPs, the potential for ni...

  9. 77 FR 72840 - CE FLNG, LLC; Application for Long-Term Authorization To Export Liquefied Natural Gas Produced...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-06

    ... States natural gas pipeline system. CE FLNG states that the source of natural gas supply will come from... purchase gas for export from any point in the U.S. interstate pipeline system. CE FLNG states that this... Authorization To Export Liquefied Natural Gas Produced From Domestic Natural Gas Resources to Non-Free Trade...

  10. 78 FR 75337 - Eos LNG LLC; Application for Long-Term Authorization To Export Liquefied Natural Gas Produced...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-11

    ... Authorization To Export Liquefied Natural Gas Produced From Domestic Natural Gas Resources to Non-Free Trade...- contract authorization to export LNG produced from domestic sources in a volume equivalent to approximately... treatment for trade in natural gas (non-FTA countries) with which trade is not prohibited by U.S. law or...

  11. 78 FR 75339 - Barca LNG LLC; Application for Long-Term Authorization To Export Liquefied Natural Gas Produced...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-11

    ... Authorization To Export Liquefied Natural Gas Produced From Domestic Natural Gas Resources to Non-Free Trade...- contract authorization to export LNG produced from domestic sources in a volume equivalent to approximately... treatment for trade in natural gas (non-FTA countries) with which trade is not prohibited by U.S. law or...

  12. A boundary element approach to optimization of active noise control sources on three-dimensional structures

    NASA Technical Reports Server (NTRS)

    Cunefare, K. A.; Koopmann, G. H.

    1991-01-01

    This paper presents the theoretical development of an approach to active noise control (ANC) applicable to three-dimensional radiators. The active noise control technique, termed ANC Optimization Analysis, is based on minimizing the total radiated power by adding secondary acoustic sources on the primary noise source. ANC Optimization Analysis determines the optimum magnitude and phase at which to drive the secondary control sources in order to achieve the best possible reduction in the total radiated power from the noise source/control source combination. For example, ANC Optimization Analysis predicts a 20 dB reduction in the total power radiated from a sphere of radius a at a dimensionless wavenumber ka of 0.125, for a single control source representing 2.5 percent of the total area of the sphere. ANC Optimization Analysis is based on a boundary element formulation of the Helmholtz Integral Equation; thus, the optimization analysis applies to a single frequency, while multiple frequencies can be treated through repeated analyses.
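
    The optimization at the core of such an approach is a Hermitian quadratic form in the complex strengths of the control sources, whose minimizer has a closed form; the sketch below uses a synthetic radiation matrix rather than the paper's boundary-element quantities.

      import numpy as np

      # Sketch: total radiated power as a quadratic form in the complex control-source
      # strengths q,  W(q) = q^H A q + q^H b + b^H q + c,  with A Hermitian positive definite.
      # The optimal drive is q_opt = -A^{-1} b (generic result, not the paper's BEM matrices).
      rng = np.random.default_rng(0)
      n = 3                                          # number of control sources (assumed)
      M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
      A = M @ M.conj().T + n * np.eye(n)             # synthetic positive-definite radiation matrix
      b = rng.normal(size=n) + 1j * rng.normal(size=n)
      c = float((b.conj() @ np.linalg.solve(A, b)).real) + 1.0   # primary-source power, chosen so W_min = 1

      q_opt = -np.linalg.solve(A, b)                 # optimal complex drive of the control sources
      W_off = c                                      # power with the control sources switched off
      W_min = float((q_opt.conj() @ A @ q_opt + 2 * (q_opt.conj() @ b).real + c).real)
      print("power reduction:", 10 * np.log10(W_off / W_min), "dB")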

  13. A Second Law Based Unstructured Finite Volume Procedure for Generalized Flow Simulation

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok

    1998-01-01

    An unstructured finite volume procedure has been developed for steady and transient thermo-fluid dynamic analysis of fluid systems and components. The procedure is applicable for a flow network consisting of pipes and various fittings where flow is assumed to be one dimensional. It can also be used to simulate flow in a component by modeling a multi-dimensional flow using the same numerical scheme. The flow domain is discretized into a number of interconnected control volumes located arbitrarily in space. The conservation equations for each control volume account for the transport of mass, momentum and entropy from the neighboring control volumes. In addition, they also include the sources of each conserved variable and time dependent terms. The source term of entropy equation contains entropy generation due to heat transfer and fluid friction. Thermodynamic properties are computed from the equation of state of a real fluid. The system of equations is solved by a hybrid numerical method which is a combination of simultaneous Newton-Raphson and successive substitution schemes. The paper also describes the application and verification of the procedure by comparing its predictions with the analytical and numerical solution of several benchmark problems.
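
    The simultaneous Newton-Raphson idea can be illustrated on a toy network. The sketch below solves a two-pipe series connection between reservoirs for the node pressure and flow rates; the pressures and resistance coefficients are assumed values, and the energy/entropy equations and real-fluid property calls of the actual procedure are omitted.

      import numpy as np

      # Toy flow network: two pipes in series between reservoirs at P1 and P2, joined at a
      # node with unknown pressure p. Unknowns x = [Q1, Q2, p]; residuals are the momentum
      # balance of each pipe (quadratic friction) and mass conservation at the node.
      P1, P2 = 200.0e3, 100.0e3            # reservoir pressures, Pa (assumed)
      K1, K2 = 5.0e7, 8.0e7                # pipe resistance coefficients, Pa/(m^3/s)^2 (assumed)

      def residual(x):
          Q1, Q2, p = x
          return np.array([P1 - p - K1 * Q1 * abs(Q1),   # momentum, pipe 1
                           p - P2 - K2 * Q2 * abs(Q2),   # momentum, pipe 2
                           Q1 - Q2])                      # mass balance at the node

      def jacobian(x, eps=1e-6):
          # finite-difference Jacobian, standing in for an analytic one
          J = np.zeros((3, 3))
          f0 = residual(x)
          for j in range(3):
              xp = x.copy()
              xp[j] += eps
              J[:, j] = (residual(xp) - f0) / eps
          return J

      x = np.array([1e-2, 1e-2, 0.5 * (P1 + P2)])        # initial guess
      for _ in range(30):                                 # Newton-Raphson iterations
          x = x + np.linalg.solve(jacobian(x), -residual(x))
          if np.linalg.norm(residual(x)) < 1e-3:
              break
      print("flow:", x[0], "m^3/s   node pressure:", x[2], "Pa")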

  14. Physical bases of the generation of short-term earthquake precursors: A complex model of ionization-induced geophysical processes in the lithosphere-atmosphere-ionosphere-magnetosphere system

    NASA Astrophysics Data System (ADS)

    Pulinets, S. A.; Ouzounov, D. P.; Karelin, A. V.; Davidenko, D. V.

    2015-07-01

    This paper describes the current understanding of the interaction between geospheres from a complex set of physical and chemical processes under the influence of ionization. The sources of ionization involve the Earth's natural radioactivity and its intensification before earthquakes in seismically active regions, anthropogenic radioactivity caused by nuclear weapon testing and accidents in nuclear power plants and radioactive waste storage, the impact of galactic and solar cosmic rays, and active geophysical experiments using artificial ionization equipment. This approach treats the environment as an open complex system with dissipation, where inherent processes can be considered in the framework of the synergistic approach. We demonstrate the synergy between the evolution of thermal and electromagnetic anomalies in the Earth's atmosphere, ionosphere, and magnetosphere. This makes it possible to determine the direction of the interaction process, which is especially important in applications related to short-term earthquake prediction. That is why the emphasis in this study is on the processes preceding the final stage of earthquake preparation; the effects of other ionization sources are used to demonstrate that the model is versatile and broadly applicable in geophysics.

  15. Energy harvesting concepts for small electric unmanned systems

    NASA Astrophysics Data System (ADS)

    Qidwai, Muhammad A.; Thomas, James P.; Kellogg, James C.; Baucom, Jared N.

    2004-07-01

    In this study, we identify and survey energy harvesting technologies for small electrically powered unmanned systems designed for long-term (>1 day) time-on-station missions. An environmental energy harvesting scheme will provide long-term, energy additions to the on-board energy source. We have identified four technologies that cover a broad array of available energy sources: solar, kinetic (wind) flow, autophagous structure-power (both combustible and metal air-battery systems) and electromagnetic (EM) energy scavenging. We present existing conceptual designs, critical system components, performance, constraints and state-of-readiness for each technology. We have concluded that the solar and autophagous technologies are relatively matured for small-scale applications and are capable of moderate power output levels (>1 W). We have identified key components and possible multifunctionalities in each technology. The kinetic flow and EM energy scavenging technologies will require more in-depth study before they can be considered for implementation. We have also realized that all of the harvesting systems require design and integration of various electrical, mechanical and chemical components, which will require modeling and optimization using hybrid mechatronics-circuit simulation tools. This study provides a starting point for detailed investigation into the proposed technologies for unmanned system applications under current development.

  16. Related Studies in Long Term Lithium Battery Stability

    NASA Technical Reports Server (NTRS)

    Horning, R. J.; Chua, D. L.

    1984-01-01

    The continuing growth of the use of lithium electrochemical systems in a wide variety of both military and industrial applications is primarily a result of the significant benefits associated with the technology such as high energy density, wide temperature operation and long term stability. The stability or long term storage capability of a battery is a function of several factors, each important to the overall storage life and, therefore, each potentially a problem area if not addressed during the design, development and evaluation phases of the product cycle. Design (e.g., reserve vs active), inherent material thermal stability, material compatibility and self-discharge characteristics are examples of factors key to the storability of a power source.

  17. Heat source reconstruction from noisy temperature fields using an optimised derivative Gaussian filter

    NASA Astrophysics Data System (ADS)

    Delpueyo, D.; Balandraud, X.; Grédiac, M.

    2013-09-01

    The aim of this paper is to present a post-processing technique based on a derivative Gaussian filter to reconstruct heat source fields from temperature fields measured by infrared thermography. Heat sources can be deduced from temperature variations thanks to the heat diffusion equation. Filtering and differentiating are key-issues which are closely related here because the temperature fields which are processed are unavoidably noisy. We focus here only on the diffusion term because it is the most difficult term to estimate in the procedure, the reason being that it involves spatial second derivatives (a Laplacian for isotropic materials). This quantity can be reasonably estimated using a convolution of the temperature variation fields with second derivatives of a Gaussian function. The study is first based on synthetic temperature variation fields corrupted by added noise. The filter is optimised in order to reconstruct at best the heat source fields. The influence of both the dimension and the level of a localised heat source is discussed. Obtained results are also compared with another type of processing based on an averaging filter. The second part of this study presents an application to experimental temperature fields measured with an infrared camera on a thin plate in aluminium alloy. Heat sources are generated with an electric heating patch glued on the specimen surface. Heat source fields reconstructed from measured temperature fields are compared with the imposed heat sources. Obtained results illustrate the relevancy of the derivative Gaussian filter to reliably extract heat sources from noisy temperature fields for the experimental thermomechanics of materials.
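
    The core filtering step described above can be sketched directly with a Gaussian derivative filter; the material properties, noise level, and filter width below are illustrative assumptions rather than the paper's optimized values, and a single steady snapshot is used so the time-derivative term is zero.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      # Sketch: estimate the Laplacian of a noisy temperature field by convolution with
      # second derivatives of a Gaussian, then form the heat source from the diffusion
      # equation  s = rho*c*dT/dt - k*Laplacian(T).
      ny, nx = 128, 128
      x = np.linspace(-0.05, 0.05, nx)                     # plate coordinates, m (assumed)
      X, Y = np.meshgrid(x, x)
      rng = np.random.default_rng(0)
      T_var = np.exp(-(X**2 + Y**2) / 2e-4) + 0.01 * rng.normal(size=(ny, nx))   # synthetic noisy field

      dx = x[1] - x[0]
      sigma = 3.0                                          # filter width in pixels, tuned against the noise
      d2_dx2 = gaussian_filter(T_var, sigma=sigma, order=(0, 2)) / dx**2
      d2_dy2 = gaussian_filter(T_var, sigma=sigma, order=(2, 0)) / dx**2
      laplacian = d2_dx2 + d2_dy2

      rho_c, k = 2.4e6, 160.0        # volumetric heat capacity (J m^-3 K^-1) and conductivity (W m^-1 K^-1), assumed
      dT_dt = np.zeros_like(T_var)   # single snapshot assumed; use successive frames in practice
      source = rho_c * dT_dt - k * laplacian               # reconstructed heat source field, W m^-3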

  18. Fiber Optic Raman Sensor to Monitor Concentration Ratio of Nitrogen and Oxygen in a Cryogenic Mixture

    NASA Technical Reports Server (NTRS)

    Tiwari, Vidhu S.; Kalluru, Rajamohan R.; Yueh, Fang-Yu; Singh, Jagdish P.; SaintCyr, William

    2007-01-01

    A spontaneous Raman scattering optical fiber sensor is developed for a specific need of NASA/SSC for long-term detection and monitoring of the quality of liquid oxygen (LOX) in the delivery line during ground testing of rocket engines. The sensor performance was tested in the laboratory and with different excitation light sources. To evaluate the sensor performance with different excitation light sources for the LOX quality application, we have used the various mixtures of liquid oxygen and liquid nitrogen as samples. The study of the sensor performance shows that this sensor offers a great deal of flexibility and provides a cost effective solution for the application. However, an improved system response time is needed for the real-time, quantitative monitoring of the quality of cryogenic fluids in harsh environment.

  19. A dictionary server for supplying context sensitive medical knowledge.

    PubMed

    Ruan, W; Bürkle, T; Dudeck, J

    2000-01-01

    The Giessen Data Dictionary Server (GDDS), developed at Giessen University Hospital, integrates clinical systems with on-line, context sensitive medical knowledge to help with making medical decisions. By "context" we mean the clinical information that is being presented at the moment the information need is occurring. The dictionary server makes use of a semantic network supported by a medical data dictionary to link terms from clinical applications to their proper information sources. It has been designed to analyze the network structure itself instead of knowing the layout of the semantic net in advance. This enables us to map appropriate information sources to various clinical applications, such as nursing documentation, drug prescription and cancer follow up systems. This paper describes the function of the dictionary server and shows how the knowledge stored in the semantic network is used in the dictionary service.

  1. Supercontinuum Fourier transform spectrometry with balanced detection on a single photodiode

    DOE PAGES

    Goncharov, Vasily; Hall, Gregory

    2016-08-25

    Here, we have developed phase-sensitive signal detection and processing algorithms for Fourier transform spectrometers fitted with supercontinuum sources for applications requiring ultimate sensitivity. Similar to the well-established approach of source noise cancellation through balanced detection of monochromatic light, our method is capable of reducing the relative intensity noise of polychromatic light by 40 dB. Unlike conventional balanced detection, which relies on differential absorption measured with a well-matched pair of photo-detectors, our algorithm utilizes phase-sensitive differential detection on a single photodiode and is capable of real-time correction for instabilities in the supercontinuum spectral structure over a broad range of wavelengths. The resulting method is universal in terms of applicable wavelengths and compatible with commercial spectrometers. We present a proof-of-principle experimental ...

  2. Bragg-Fresnel optics: New field of applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snigirev, A.

    Bragg-Fresnel Optics shows excellent compatibility with third generation synchrotron radiation sources such as the ESRF and is capable of producing monochromatic submicron focal spots with 10^8-10^9 photons/sec in an energy bandwidth of 10^-4-10^-6 and in a photon energy range between 2-100 keV. New types of Bragg-Fresnel lenses, such as modified, ion-implanted, bent and acoustically modulated lenses, were tested. Microprobe techniques like microdiffraction and microfluorescence based on Bragg-Fresnel optics were realised at the ESRF beamlines. The excellent parameters of the X-ray beam at the ESRF in terms of low emittance and quite small angular source size allow Bragg-Fresnel optics to occupy new fields of applications such as high resolution diffraction, holography, interferometry and phase contrast imaging.

  3. Scattering in infrared radiative transfer: A comparison between the spectrally averaging model JURASSIC and the line-by-line model KOPRA

    NASA Astrophysics Data System (ADS)

    Griessbach, Sabine; Hoffmann, Lars; Höpfner, Michael; Riese, Martin; Spang, Reinhold

    2013-09-01

    The viability of a spectrally averaging model to perform radiative transfer calculations in the infrared including scattering by atmospheric particles is examined for the application of infrared limb remote sensing measurements. Here we focus on the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) aboard the European Space Agency's Envisat. Various spectra for clear air and cloudy conditions were simulated with a spectrally averaging radiative transfer model and a line-by-line radiative transfer model for three atmospheric window regions (825-830, 946-951, 1224-1228 cm-1) and compared to each other. The results are rated in terms of the MIPAS noise equivalent spectral radiance (NESR). The clear air simulations generally agree within one NESR. The cloud simulations neglecting the scattering source term agree within two NESR. The differences between the cloud simulations including the scattering source term are generally below three and always below four NESR. We conclude that the spectrally averaging approach is well suited for fast and accurate infrared radiative transfer simulations including scattering by clouds. We found that the main source for the differences between the cloud simulations of both models is the cloud edge sampling. Furthermore we reasoned that this model comparison for clouds is also valid for atmospheric aerosol in general.

  4. Biofuels as an Alternative Energy Source for Aviation-A Survey

    NASA Technical Reports Server (NTRS)

    McDowellBomani, Bilal M.; Bulzan, Dan L.; Centeno-Gomez, Diana I.; Hendricks, Robert C.

    2009-01-01

    The use of biofuels has been gaining in popularity over the past few years because of their ability to reduce the dependence on fossil fuels. As a renewable energy source, biofuels can be a viable option for sustaining long-term energy needs if they are managed efficiently. We investigate past, present, and possible future biofuel alternatives currently being researched and applied around the world. More specifically, we investigate the use of ethanol, cellulosic ethanol, biodiesel (palm oil, algae, and halophytes), and synthetic fuel blends that can potentially be used as fuels for aviation and nonaerospace applications. We also investigate the processing of biomass via gasification, hydrolysis, and anaerobic digestion as a way to extract fuel oil from alternative biofuels sources.

  5. Pesticides exposure assessment of kettleman city using the industrial source complex short-term model version 3.

    PubMed

    Tao, Jing; Barry, Terrell; Segawa, Randy; Neal, Rosemary; Tuli, Atac

    2013-01-01

    Kettleman City, California, reported a higher than expected number of birth defect cases between 2007 and 2010, raising the concern of community and government agencies. A pesticide exposure evaluation was conducted as part of a complete assessment of community chemical exposure. Nineteen pesticides that potentially cause birth defects were investigated. The Industrial Source Complex Short-Term Model Version 3 (ISCST3) was used to estimate off-site air concentrations associated with pesticide applications within 8 km of the community from late 2006 to 2009. The health screening levels were designed to indicate potential health effects and used for preliminary health evaluations of estimated air concentrations. A tiered approach was conducted. The first tier modeled simple, hypothetical worst-case situations for each of 19 pesticides. The second tier modeled specific applications of the pesticides with estimated concentrations exceeding health screening levels in the first tier. The pesticide use report database of the California Department of Pesticide Regulation provided application information. Weather input data were summarized from the measurements of a local weather station in the California Irrigation Management Information System. The ISCST3 modeling results showed that during the target period, only two application days of one pesticide (methyl isothiocyanate) produced air concentration estimates above the health screening level for developmental effects at the boundary of Kettleman City. These results suggest that the likelihood of birth defects caused by pesticide exposure was low. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  6. Tabulation of asbestos-related terminology

    USGS Publications Warehouse

    Lowers, Heather; Meeker, Greg

    2002-01-01

    The term asbestos has been defined in numerous publications including many State and Federal regulations. The definition of asbestos often varies depending on the source or publication in which it is used. Differences in definitions also exist for the asbestos-related terms acicular, asbestiform, cleavage, cleavage fragment, fiber, fibril, fibrous, and parting. An inexperienced reader of the asbestos literature would have difficulty understanding these differences and grasping many of the subtleties that exist in the literature and regulatory language. Disagreement among workers from the industrial, medical, mineralogical, and regulatory communities regarding these definitions has fueled debate as to their applicability to various morphological structures and chemical compositions that exist in the amphibole and serpentine groups of minerals. This debate has significant public health, economic and legal implications. This report summarizes asbestos-related definitions taken from a variety of academic, industrial, and regulatory sources. This summary is by no means complete but includes the majority of significant definitions currently applied in the discipline.

  7. Volume Averaging Study of the Capacitive Deionization Process in Homogeneous Porous Media

    DOE PAGES

    Gabitto, Jorge; Tsouris, Costas

    2015-05-05

    Ion storage in porous electrodes is important in applications such as energy storage by supercapacitors, water purification by capacitive deionization, extraction of energy from a salinity difference and heavy ion purification. In this paper, a model is presented to simulate the charge process in homogeneous porous media comprising large pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Transport between the electrolyte solution and the charged wall is described using the Gouy-Chapman-Stern model. The effective transport parameters for isotropic porous media are calculated by solving the corresponding closure problems. Finally, the source terms that appear in the averaged equations are calculated using numerical computations. An alternative way to deal with the source terms is proposed.
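
    For reference, the Gouy-Chapman-Stern picture invoked above relates the stored surface charge to a potential drop split between the Stern and diffuse layers; the relations below are the standard textbook form, not the volume-averaged source terms derived in the paper.

      \sigma = \sqrt{8\,\varepsilon\varepsilon_0 k_B T\, n_\infty}\;\sinh\!\left(\frac{z e\,\psi_D}{2 k_B T}\right),
      \qquad
      \Delta\phi = \frac{\sigma}{C_{\mathrm{St}}} + \psi_D ,

    where \psi_D is the diffuse-layer potential drop, n_\infty the bulk ion concentration, and C_{\mathrm{St}} the Stern-layer capacitance per unit area.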

  8. Reprint of: High current liquid metal ion source using porous tungsten multiemitters.

    PubMed

    Tajmar, M; Vasiljevich, I; Grienauer, W

    2011-05-01

    We recently developed an indium Liquid-Metal-Ion-Source that can emit currents from sub-μA up to several mA. It is based on a porous tungsten crown structure with 28 individual emitters, which is manufactured using Micro-Powder Injection Molding (μPIM) and electrochemical etching. The emitter combines the advantages of internal capillary feeding with excellent emission properties due to micron-size tips. Significant progress was made on the homogeneity of the emission over its current-voltage characteristic as well as on investigating its long-term stability. This LMIS seems very suitable for space propulsion as well as for micro/nano manufacturing applications with greatly increased milling/drilling speeds. This paper summarizes the latest developments on our porous multiemitters with respect to manufacturing, emission properties and long-term testing. Copyright © 2010 Elsevier B.V. All rights reserved.

  9. On the role of mean flows in Doppler shifted frequencies

    NASA Astrophysics Data System (ADS)

    Gerkema, Theo; Maas, Leo R. M.; van Haren, Hans

    2013-04-01

    In the oceanographic literature, the term 'Doppler shift' often features in the context of mean flows and (internal) waves. Closer inspection reveals that the term is in fact used for two different things, which should be carefully distinguished, for their conflation results in incorrect interpretations. One refers to the difference in frequencies measured by two observers, one at a fixed position and one moving with the mean flow. The other definition is the one used in physics, where the frequency measured by an observer is compared to that of the source. In the latter sense, Doppler shifts occur only if the source and observer move with respect to each other; a steady mean flow cannot create a Doppler shift. We rehash the classical theory to straighten out some misconceptions and discuss how wave dispersion affects the classical relations and their application, for example on near-inertial internal waves.
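
    The distinction drawn above can be stated compactly for a plane wave with intrinsic frequency \omega_0 and wavenumber vector \mathbf{k} in a medium moving with uniform velocity \mathbf{U}; these are standard relations restated for clarity, not new results of the paper.

      \omega_{\mathrm{fixed}} = \omega_0 + \mathbf{k}\cdot\mathbf{U},

    i.e. an observer at a fixed position records the intrinsic frequency shifted by \mathbf{k}\cdot\mathbf{U} relative to an observer drifting with the mean flow, whereas a Doppler shift in the physics sense compares the observed frequency with that of the source, \omega_{\mathrm{obs}} - \omega_{\mathrm{src}}, and is nonzero only when source and observer move relative to each other; a steady mean flow alone leaves \omega_{\mathrm{obs}} = \omega_{\mathrm{src}}.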

  10. Power sources for search and rescue 406 MHz beacons

    NASA Technical Reports Server (NTRS)

    Attia, Alan I.; Perrone, David E.

    1987-01-01

    The results of a study directed at the selection of a commercially available, safe, low cost, light weight and long storage life battery for search and rescue (Sarsat) 406 MHz emergency beacons are presented. In the course of this work, five electrochemical systems (lithium-manganese dioxide, lithium-carbon monofluoride, lithium-silver vanadium oxide, alkaline cells, and cadmium-mercuric oxide) were selected for limited experimental studies to determine their suitability for this application. Two safe, commercially available batteries (lithium-manganese dioxide and lithium-carbon monofluoride) which meet the near term requirements and several alternatives for the long term were identified.

  11. Production Strategies and Applications of Microbial Single Cell Oils

    PubMed Central

    Ochsenreither, Katrin; Glück, Claudia; Stressler, Timo; Fischer, Lutz; Syldatk, Christoph

    2016-01-01

    Polyunsaturated fatty acids (PUFAs) of the ω-3 and ω-6 class (e.g., α-linolenic acid, linoleic acid) are essential for maintaining biofunctions in mammals such as humans. Because humans cannot synthesize these essential fatty acids, they must be taken up from different food sources. Classical sources for these fatty acids are porcine liver and fish oil. However, microbial lipids or single cell oils, produced by oleaginous microorganisms such as algae, fungi and bacteria, are a promising source as well. These single cell oils can be used for many valuable chemicals with applications not only in nutrition but also in fuels and are therefore an ideal basis for a bio-based economy. A crucial point for the establishment of microbial lipid utilization is the cost-effective production and purification of fuels or products of higher value. The fermentative production can be realized by submerged (SmF) or solid state fermentation (SSF). The yield and the composition of the obtained microbial lipids depend on the type of fermentation and the particular conditions (e.g., medium, pH value, temperature, aeration, nitrogen source). From an economical point of view, waste or by-product streams can be used as cheap and renewable carbon and nitrogen sources. In general, downstream processing costs are one of the major obstacles to be overcome for the full economic efficiency of microbial lipids. For the extraction of lipids from microbial biomass, cell disruption is most important, because the efficiency of cell disruption directly influences subsequent downstream operations and overall extraction efficiencies. A multitude of cell disruption and lipid extraction methods are available, conventional as well as newly emerging ones, which are described and discussed in terms of large scale applicability, their potential in a modern biorefinery and their influence on product quality. Furthermore, an overview is given of applications of microbial lipids or derived fatty acids, with emphasis on food applications. PMID:27761130

  12. PCB remediation in schools: a review.

    PubMed

    Brown, Kathleen W; Minegishi, Taeko; Cummiskey, Cynthia Campisano; Fragala, Matt A; Hartman, Ross; MacIntosh, David L

    2016-02-01

    Growing awareness of polychlorinated biphenyls (PCBs) in legacy caulk and other construction materials of schools has created a need for information on best practices to control human exposures and comply with applicable regulations. A concise review of approaches and techniques for management of building-related PCBs is the focus of this paper. Engineering and administrative controls that block pathways of PCB transport, dilute concentrations of PCBs in indoor air or other exposure media, or establish uses of building space that mitigate exposure can be effective initial responses to identification of PCBs in a building. Mitigation measures also provide time for school officials to plan a longer-term remediation strategy and to secure the necessary resources. These longer-term strategies typically involve removal of caulk or other primary sources of PCBs as well as nearby masonry or other materials contaminated with PCBs by the primary sources. The costs of managing PCB-containing building materials from assessment through ultimate disposal can be substantial. Optimizing the efficacy and cost-effectiveness of remediation programs requires aligning a thorough understanding of sources and exposure pathways with the most appropriate mitigation and abatement methods.

  13. Highlighting Uncertainty and Recommendations for Improvement of Black Carbon Biomass Fuel-Based Emission Inventories in the Indo-Gangetic Plain Region.

    PubMed

    Soneja, Sutyajeet I; Tielsch, James M; Khatry, Subarna K; Curriero, Frank C; Breysse, Patrick N

    2016-03-01

    Black carbon (BC) is a major contributor to hydrological cycle change and glacial retreat within the Indo-Gangetic Plain (IGP) and surrounding region. However, significant variability exists for estimates of BC regional concentration. Existing inventories within the IGP suffer from limited representation of rural sources, reliance on idealized point source estimates (e.g., utilization of emission factors or fuel-use estimates for cooking along with demographic information), and difficulty in distinguishing sources. Inventory development utilizes two approaches, termed top down and bottom up, which rely on various sources including transport models, emission factors, and remote sensing applications. Large discrepancies exist for BC source attribution throughout the IGP depending on the approach utilized. Cooking with biomass fuels, a major contributor to BC production has great source apportionment variability. Areas requiring attention tied to research of cookstove and biomass fuel use that have been recognized to improve emission inventory estimates include emission factors, particulate matter speciation, and better quantification of regional/economic sectors. However, limited attention has been given towards understanding ambient small-scale spatial variation of BC between cooking and non-cooking periods in low-resource environments. Understanding the indoor to outdoor relationship of BC emissions due to cooking at a local level is a top priority to improve emission inventories as many health and climate applications rely upon utilization of accurate emission inventories.

  14. Seismoelectric imaging of shallow targets

    USGS Publications Warehouse

    Haines, S.S.; Pride, S.R.; Klemperer, S.L.; Biondi, B.

    2007-01-01

    We have undertaken a series of controlled field experiments to develop seismoelectric experimental methods for near-surface applications and to improve our understanding of seismoelectric phenomena. In a set of off-line geometry surveys (source separated from the receiver line), we place seismic sources and electrode array receivers on opposite sides of a man-made target (two sand-filled trenches) to record separately two previously documented seismoelectric modes: (1) the electromagnetic interface response signal created at the target and (2) the coseismic electric fields located within a compressional seismic wave. With the seismic source point in the center of a linear electrode array, we identify the previously undocumented seismoelectric direct field, and the Lorentz field of the metal hammer plate moving in the earth's magnetic field. We place the seismic source in the center of a circular array of electrodes (radial and circumferential orientations) to analyze the source-related direct and Lorentz fields and to establish that these fields can be understood in terms of simple analytical models. Using an off-line geometry, we create a multifold, 2D image of our trenches as dipping layers, and we also produce a complementary synthetic image through numerical modeling. These images demonstrate that off-line geometry (e.g., crosswell) surveys offer a particularly promising application of the seismoelectric method because they effectively separate the interface response signal from the (generally much stronger) coseismic and source-related fields. ?? 2007 Society of Exploration Geophysicists.

  15. Bremsstrahlung versus Monoenergetic Photon Dose and Photonuclear Stimulation Comparisons at Long Standoff Distances

    NASA Astrophysics Data System (ADS)

    Jones, J. L.; Sterbentz, J. W.; Yoon, W. Y.; Norman, D. R.

    2009-12-01

    Energetic photon sources with energies greater than 6 MeV continue to be recognized as viable source for various types of inspection applications, especially those related to nuclear and/or explosive material detection. These energetic photons can be produced as a continuum of energies (i.e., bremsstrahlung) or as a set of one or more discrete photon energies (i.e., monoenergetic). This paper will provide a follow-on extension of the photon dose comparison presented at the 9th International Conference on Applications of Nuclear Techniques (June 2008). Our previous paper showed the comparative advantages and disadvantages of the photon doses provided by these two energetic interrogation sources and highlighted the higher energy advantage of the bremsstrahlung source, especially at long standoff distances (i.e., distance from source to the inspected object). This paper will pursue higher energy photon inspection advantage (up to 100 MeV) by providing dose and stimulated photonuclear interaction predictions in air and for an infinitely dilute interrogated material (used for comparative interaction rate assessments since it excludes material self-shielding) as the interrogation object positioned forward on the inspection beam axis at increasing standoff distances. In addition to the direct energetic photon-induced stimulation, the predictions will identify the importance of secondary downscattered/attenuated source-term effects arising from the photon transport in the intervening air environment.

  16. Poster — Thur Eve — 40: Automated Quality Assurance for Remote-Afterloading High Dose Rate Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Anthony; Ravi, Ananth

    2014-08-15

    High dose rate (HDR) remote afterloading brachytherapy involves sending a small, high-activity radioactive source attached to a cable to different positions within a hollow applicator implanted in the patient. It is critical that the source position within the applicator and the dwell time of the source are accurate. Daily quality assurance (QA) tests of the positional and dwell time accuracy are essential to ensure that the accuracy of the remote afterloader is not compromised prior to patient treatment. Our centre has developed an automated, video-based QA system for HDR brachytherapy that is dramatically superior to existing diode or film QA solutions in terms of cost, objectivity, and positional accuracy, with additional functionalities such as being able to determine the dwell time and transit time of the source. In our system, a video is taken of the brachytherapy source as it is sent out through a position check ruler, with the source visible through a clear window. Using a proprietary image analysis algorithm, the source position is determined with respect to time as it moves to different positions along the check ruler. The total material cost of the video-based system was under $20, consisting of a commercial webcam and adjustable stand. The accuracy of the position measurement is ±0.2 mm, and the time resolution is 30 msec. Additionally, our system is capable of robustly verifying the source transit time and velocity (a test required by the AAPM and CPQR recommendations), which is currently difficult to perform accurately.
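
    A minimal sketch of the image-analysis step (bright-spot centroid tracking frame by frame) is given below; OpenCV is assumed for video access, and the file name, pixel-to-millimetre scale, and intensity threshold are placeholders, not the centre's proprietary algorithm.

      import cv2
      import numpy as np

      # Hypothetical sketch: track the bright HDR source through a check-ruler video and
      # convert its centroid trajectory into position vs. time.
      MM_PER_PIXEL = 0.1          # calibration from the ruler markings (placeholder value)
      THRESH = 200                # intensity threshold isolating the source (placeholder)

      cap = cv2.VideoCapture("hdr_ruler_video.avi")        # hypothetical file name
      fps = cap.get(cv2.CAP_PROP_FPS)
      positions, times = [], []
      frame_idx = 0
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          _, mask = cv2.threshold(gray, THRESH, 255, cv2.THRESH_BINARY)
          m = cv2.moments(mask)
          if m["m00"] > 0:                                 # source visible in this frame
              positions.append(m["m10"] / m["m00"] * MM_PER_PIXEL)   # centroid x in mm
              times.append(frame_idx / fps)
          frame_idx += 1
      cap.release()

      # Dwell positions, dwell times, and transit speed can then be read off the step-like
      # position-vs-time trace, e.g. by grouping samples whose position changes by < 0.2 mm.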

  17. Automated source term and wind parameter estimation for atmospheric transport and dispersion applications

    NASA Astrophysics Data System (ADS)

    Bieringer, Paul E.; Rodriguez, Luna M.; Vandenberghe, Francois; Hurst, Jonathan G.; Bieberbach, George; Sykes, Ian; Hannan, John R.; Zaragoza, Jake; Fry, Richard N.

    2015-12-01

    Accurate simulations of the atmospheric transport and dispersion (AT&D) of hazardous airborne materials rely heavily on the source term parameters necessary to characterize the initial release and on the meteorological conditions that drive the downwind dispersion. In many cases the source parameters are not known and are consequently based on rudimentary assumptions. This is particularly true of accidental releases and the intentional releases associated with terrorist incidents. When available, meteorological observations are often not representative of the conditions at the location of the release, and the use of these non-representative meteorological conditions can result in significant errors in the hazard assessments downwind of the sensors, even when the other source parameters are accurately characterized. Here, we describe a computationally efficient methodology to characterize both the release source parameters and the low-level winds (e.g., winds near the surface) required to produce a refined downwind hazard. This methodology, known as the Variational Iterative Refinement Source Term Estimation (STE) Algorithm (VIRSA), consists of a combination of modeling systems. These systems include a back-trajectory based source inversion method, a forward Gaussian puff dispersion model, and a variational refinement algorithm that uses both a simple forward AT&D model that is a surrogate for the more complex Gaussian puff model and a formal adjoint of this surrogate model. The back-trajectory based method is used to calculate a 'first guess' source estimate based on the available observations of the airborne contaminant plume and atmospheric conditions. The variational refinement algorithm is then used to iteratively refine the first-guess STE parameters and meteorological variables. The algorithm has been evaluated across a wide range of scenarios of varying complexity. It has been shown to improve the source location estimate by several hundred percent (normalized by the distance from the source to the closest sampler) and to improve mass estimates by several orders of magnitude. Furthermore, it is also able to operate in scenarios with inconsistencies between the wind and airborne contaminant sensor observations and to adjust the wind to provide a better match between the hazard prediction and the observations.
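
    A highly simplified sketch of the source term estimation idea follows: a steady Gaussian plume forward model is fitted to sensor readings by least-squares refinement of a first guess. The plume parametrization, sampler layout, and the choice to hold the wind speed fixed are all illustrative assumptions; this is a conceptual stand-in, not the VIRSA algorithm.

      import numpy as np
      from scipy.optimize import least_squares

      def gaussian_plume(x, y, q, x0, y0, u=3.0, h=1.0):
          """Ground-level concentration from a steady point source at (x0, y0), wind along +x.
          Very simplified: linear plume growth and a fixed wind speed u (assumed)."""
          dx, dy = x - x0, y - y0
          c = np.zeros_like(x, dtype=float)
          down = dx > 0                                     # only downwind receptors see the plume
          sy, sz = 0.08 * dx[down], 0.06 * dx[down]         # crude dispersion coefficients (assumed)
          c[down] = (q / (np.pi * u * sy * sz)
                     * np.exp(-dy[down] ** 2 / (2 * sy ** 2))
                     * np.exp(-h ** 2 / (2 * sz ** 2)))
          return c

      # Synthetic "observations" from a true source, then refinement of a crude first guess.
      xs = np.array([200.0, 400.0, 400.0, 600.0, 600.0])    # sampler coordinates, m (assumed layout)
      ys = np.array([0.0, 30.0, -30.0, 60.0, -60.0])
      obs = gaussian_plume(xs, ys, q=5.0, x0=0.0, y0=10.0)

      def misfit(p):
          q, x0, y0 = p
          return gaussian_plume(xs, ys, q, x0, y0) - obs

      first_guess = [1.0, -100.0, 0.0]                      # e.g. from a back-trajectory step
      fit = least_squares(misfit, first_guess,
                          bounds=([0.0, -500.0, -200.0], [50.0, 500.0, 200.0]))
      print("refined (q, x0, y0):", fit.x)                  # should land near (5, 0, 10) for this synthetic case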

  18. Chips: A Tool for Developing Software Interfaces Interactively.

    DTIC Science & Technology

    1987-10-01

    of the application through the objects on the screen. Chips makes this easy by supplying simple and direct access to the source code and data ...object-oriented programming, user interface management systems, programming environments. Typographic Conventions Technical terms appearing in the...creating an environment in which we could do our work. This project could not have happened without him. Jeff Bonar started and managed the Chips

  19. Plant reestablishment after soil disturbance: Effects of soils, treatment, and time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandt, C.A.; Alford, K.; McIlveny, G.

    The Pacific Northwest Laboratory examined plant growth and establishment on 16 sites where severe land disturbance had taken place. The purpose of the study was to evaluate the relative effectiveness of the different methods in terms of their effects on establishment of native and alien plants. Disturbances ranged from 1 to 50 years in age. Revegetation using native plants had been attempted at 14 of the sites; the remainder were abandoned without any further management. Revegetation efforts variously included seeding, fertilizer application, mulching with various organic sources, compost application, application of Warden silt loam topsoil over sand and gravel soils, and moderate irrigation.

  1. Low Data Drug Discovery with One-Shot Learning.

    PubMed

    Altae-Tran, Han; Ramsundar, Bharath; Pappu, Aneesh S; Pande, Vijay

    2017-04-26

    Recent advances in machine learning have made significant contributions to drug discovery. Deep neural networks in particular have been demonstrated to provide significant boosts in predictive power when inferring the properties and activities of small-molecule compounds (Ma, J. et al. J. Chem. Inf. 2015, 55, 263-274). However, the applicability of these techniques has been limited by the requirement for large amounts of training data. In this work, we demonstrate how one-shot learning can be used to significantly lower the amounts of data required to make meaningful predictions in drug discovery applications. We introduce a new architecture, the iterative refinement long short-term memory, that, when combined with graph convolutional neural networks, significantly improves learning of meaningful distance metrics over small-molecules. We open source all models introduced in this work as part of DeepChem, an open-source framework for deep-learning in drug discovery (Ramsundar, B. deepchem.io. https://github.com/deepchem/deepchem, 2016).
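
    The flavor of one-shot prediction can be sketched generically: a query compound is scored against a tiny labelled support set through attention over embedding similarities. The toy embeddings below are random placeholders, and this matching-style sketch is not the paper's iterative-refinement LSTM nor DeepChem's API.

      import numpy as np

      def cosine(a, b):
          return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

      def one_shot_predict(query_emb, support_embs, support_labels):
          sims = np.array([cosine(query_emb, s) for s in support_embs])
          attn = np.exp(sims) / np.exp(sims).sum()          # softmax attention over the support set
          return attn @ support_labels                       # expected label (probability of "active")

      rng = np.random.default_rng(1)
      support = rng.normal(size=(4, 16))                     # 4 support molecules, 16-d embeddings (toy)
      labels = np.array([1.0, 1.0, 0.0, 0.0])                # active / inactive
      query = support[0] + 0.1 * rng.normal(size=16)         # query close to an active example
      print(one_shot_predict(query, support, labels))        # > 0.5, i.e. leans toward the active class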

  2. Realization of an omnidirectional source of sound using parametric loudspeakers.

    PubMed

    Sayin, Umut; Artís, Pere; Guasch, Oriol

    2013-09-01

    Parametric loudspeakers are often used in beam forming applications where a high directivity is required. Withal, in this paper it is proposed to use such devices to build an omnidirectional source of sound. An initial prototype, the omnidirectional parametric loudspeaker (OPL), consisting of a sphere with hundreds of ultrasonic transducers placed on it has been constructed. The OPL emits audible sound thanks to the parametric acoustic array phenomenon, and the close proximity and the large number of transducers results in the generation of a highly omnidirectional sound field. Comparisons with conventional dodecahedron loudspeakers have been made in terms of directivity, frequency response, and in applications such as the generation of diffuse acoustic fields in reverberant chambers. The OPL prototype has performed better than the conventional loudspeaker especially for frequencies higher than 500 Hz, its main drawback being the difficulty to generate intense pressure levels at low frequencies.

  3. Accuracy-preserving source term quadrature for third-order edge-based discretization

    NASA Astrophysics Data System (ADS)

    Nishikawa, Hiroaki; Liu, Yi

    2017-09-01

    In this paper, we derive a family of source term quadrature formulas for preserving third-order accuracy of the node-centered edge-based discretization for conservation laws with source terms on arbitrary simplex grids. A three-parameter family of source term quadrature formulas is derived, and as a subset, a one-parameter family of economical formulas is identified that does not require second derivatives of the source term. Among the economical formulas, a unique formula is then derived that does not require gradients of the source term at neighbor nodes, thus leading to a significantly smaller discretization stencil for source terms. All the formulas derived in this paper do not require a boundary closure, and therefore can be directly applied at boundary nodes. Numerical results are presented to demonstrate third-order accuracy at interior and boundary nodes for one-dimensional grids and linear triangular/tetrahedral grids over straight and curved geometries.

  4. Monitoring water quality by remote sensing

    NASA Technical Reports Server (NTRS)

    Brown, R. L. (Principal Investigator)

    1977-01-01

    The author has identified the following significant results. A limited study was conducted to determine the applicability of remote sensing for evaluating water quality conditions in the San Francisco Bay and delta. Considerable supporting data were available for the study area from other than overflight sources, but short-term temporal and spatial variability precluded their use. The study results were not sufficient to shed much light on the subject, but it did appear that, with the present state of the art in image analysis and the large amount of ground truth needed, remote sensing has only limited application in monitoring water quality.

  5. Sweet potato growth parameters, yield components and nutritive value for CELSS applications

    NASA Technical Reports Server (NTRS)

    Loretan, P. A.; Bonsi, C. K.; Hill, W. A.; Ogbuehi, C. R.; Mortley, D. G.

    1989-01-01

    Sweet potatoes have been grown hydroponically using the nutrient film technique (NFT) to provide a potential food source for long-term manned space missions. Experiments in both sand and NFT cultivars have produced up to 1790 g/plant of fresh storage root with an edible biomass index ranging from 60-89 percent and edible biomass linear growth rates of 39-66 g/sq m day in 105 to 130 days. Experiments with different cultivars, nutrient solution compositions, application rates, air and root temperatures, photoperiods, and light intensities indicate good potential for sweet potatoes in CELSS.

  6. A framework for emissions source apportionment in industrial areas: MM5/CALPUFF in a near-field application.

    PubMed

    Ghannam, K; El-Fadel, M

    2013-02-01

    This paper examines the relative source contribution to ground-level concentrations of carbon monoxide (CO), nitrogen dioxide (NO2), and PM10 (particulate matter with an aerodynamic diameter < 10 microm) in a coastal urban area due to emissions from an industrial complex with multiple stacks, quarrying activities, and a nearby highway. For this purpose, an inventory of CO, oxide of nitrogen (NO(x)), and PM10 emissions was coupled with the non-steady-state Mesoscale Model 5/California Puff Dispersion Modeling system to simulate individual source contributions under several spatial and temporal scales. As the contribution of a particular source to ground-level concentrations can be evaluated by simulating this single-source emissions or otherwise total emissions except that source, a set of emission sensitivity simulations was designed to examine if CALPUFF maintains a linear relationship between emission rates and predicted concentrations in cases where emitted plumes overlap and chemical transformations are simulated. Source apportionment revealed that ground-level releases (i.e., highway and quarries) extended over large areas dominated the contribution to exposure levels over elevated point sources, despite the fact that cumulative emissions from point sources are higher. Sensitivity analysis indicated that chemical transformations of NO(x) are insignificant, possibly due to short-range plume transport, with CALPUFF exhibiting a linear response to changes in emission rate. The current paper points to the significance of ground-level emissions in contributing to urban air pollution exposure and questions the viability of the prevailing paradigm of point-source emission reduction, especially that the incremental improvement in air quality associated with this common abatement strategy may not accomplish the desirable benefit in terms of lower exposure with costly emissions capping. The application of atmospheric dispersion models for source apportionment helps in identifying major contributors to regional air pollution. In industrial urban areas where multiple sources with different geometry contribute to emissions, ground-level releases extended over large areas such as roads and quarries often dominate the contribution to ground-level air pollution. Industrial emissions released at elevated stack heights may experience significant dilution, resulting in minor contribution to exposure at ground level. In such contexts, emission reduction, which is invariably the abatement strategy targeting industries at a significant investment in control equipment or process change, may result in minimal return on investment in terms of improvement in air quality at sensitive receptors.

  7. INEEL Subregional Conceptual Model Report Volume 3: Summary of Existing Knowledge of Natural and Anthropogenic Influences on the Release of Contaminants to the Subsurface Environment from Waste Source Terms at the INEEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul L. Wichlacz

    2003-09-01

    This source-term summary document is intended to describe the current understanding of contaminant source terms and the conceptual model for potential source-term release to the environment at the Idaho National Engineering and Environmental Laboratory (INEEL), as presented in published INEEL reports. The document presents a generalized conceptual model of the sources of contamination and describes the general categories of source terms, primary waste forms, and factors that affect the release of contaminants from the waste form into the vadose zone and Snake River Plain Aquifer. Where the information has previously been published and is readily available, summaries of the inventory of contaminants are also included. Uncertainties that affect the estimation of the source term release are also discussed where they have been identified by the Source Term Technical Advisory Group. Areas in which additional information is needed (i.e., research needs) are also identified.

  8. Towards next generation time-domain diffuse optics devices

    NASA Astrophysics Data System (ADS)

    Dalla Mora, Alberto; Contini, Davide; Arridge, Simon R.; Martelli, Fabrizio; Tosi, Alberto; Boso, Gianluca; Farina, Andrea; Durduran, Turgut; Martinenghi, Edoardo; Torricelli, Alessandro; Pifferi, Antonio

    2015-03-01

    Diffuse Optics is growing in terms of applications, ranging from oximetry to mammography, molecular imaging, quality assessment of food and pharmaceuticals, wood optics, and the physics of random media. Time-domain (TD) approaches, although appealing in terms of quantitation and depth sensitivity, are presently limited to large fiber-based systems with a limited number of source-detector pairs. We present a miniaturized TD source-detector probe embedding integrated laser sources and single-photon detectors. Some electronics are still external (e.g., power supply, pulse generators, timing electronics), yet full integration on-board using already proven technologies is feasible. The novel devices were successfully validated on heterogeneous phantoms, showing performances comparable to large state-of-the-art TD rack-based systems. Based on simulations, we provide numerical evidence that stacking many compact TD source-detector pairs in a dense, null source-detector distance arrangement could yield about one decade higher contrast on the brain cortex as compared to a continuous wave (CW) approach. Further, a 3-fold increase in the maximum depth (down to 6 cm) is estimated, opening accessibility to new organs such as the lung or the heart. Finally, these new technologies show the way towards compact and wearable TD probes with orders of magnitude reduction in size and cost, for a widespread use of TD devices in real life.

  9. Tunable white light source for medical applications

    NASA Astrophysics Data System (ADS)

    Blaszczak, Urszula J.; Gryko, Lukasz; Zajac, Andrzej

    2017-08-01

    Development of light-emitting diodes has brought new possibilities to many applications, especially in terms of flexible adjustment of light spectra. This feature is very useful in the construction of many devices, for example for medical diagnosis and treatment. It has been shown that in some cases LEDs can easily replace lasers in cancer therapy without reducing the efficiency of the process. On the other hand, during diagnosis, LED-based constructions can provide a unique ability to adjust the color temperature of the output light while maintaining high color rendering. This allows for optimum surface contrast and enhanced tissue differentiation at the operator site. In this paper we describe the construction of a tunable LED-based source designed for application in endoscopy. It was optimized from the point of view of color rendition for 5 different correlated color temperatures (illuminant A, D55, D65, 3500 K and 4500 K) with the restriction of very high (>90) values of the general and specific color rendering indexes (according to the Ra method). The source is composed of 13 light-emitting diodes from the visible region mounted on a common radiator and controlled by a dedicated system. The spectra of the components are mixed and the spectrum of the output light is analyzed. On the basis of the obtained spectra, colorimetric parameters are calculated and compared with the results of a theoretical analysis.
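
    The spectral-mixing step can be illustrated with a small least-squares sketch: given the measured spectra of the individual LEDs, non-negative drive weights are sought whose weighted sum best approximates a target illuminant spectrum (color rendering indexes would then be computed from the mixed spectrum). The Gaussian LED spectra and flat target below are synthetic placeholders, not the 13 channels of the actual device.

        # Hedged sketch of LED spectral mixing via non-negative least squares.
        # The LED spectra and target illuminant are synthetic placeholders.
        import numpy as np
        from scipy.optimize import nnls

        wavelengths = np.arange(400, 701, 5)              # nm
        peaks = np.linspace(420, 680, 13)                 # 13 hypothetical LED channels
        led_spectra = np.array([np.exp(-0.5 * ((wavelengths - p) / 15.0) ** 2) for p in peaks])

        target = np.ones_like(wavelengths, dtype=float)   # stand-in for a target illuminant

        # Solve min ||A w - target||_2 subject to w >= 0 (one drive weight per LED)
        weights, _ = nnls(led_spectra.T, target)
        mixed = led_spectra.T @ weights

        print("per-channel weights:", np.round(weights, 3))
        print("rms spectral error:", np.sqrt(np.mean((mixed - target) ** 2)))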

  10. Design of HIFU transducers for generating specified nonlinear ultrasound fields

    PubMed Central

    Rosnitskiy, Pavel B.; Yuldashev, Petr V.; Sapozhnikov, Oleg A.; Maxwell, Adam; Kreider, Wayne; Bailey, Michael R.; Khokhlova, Vera A.

    2016-01-01

    Various clinical applications of high intensity focused ultrasound (HIFU) have different requirements for the pressure levels and degree of nonlinear waveform distortion at the focus. The goal of this work was to determine transducer design parameters that produce either a specified shock amplitude in the focal waveform or specified peak pressures while still maintaining quasilinear conditions at the focus. Multi-parametric nonlinear modeling based on the KZK equation with an equivalent source boundary condition was employed. Peak pressures, shock amplitudes at the focus, and corresponding source outputs were determined for different transducer geometries and levels of nonlinear distortion. Results are presented in terms of the parameters of an equivalent single-element, spherically shaped transducer. The accuracy of the method and its applicability to cases of strongly focused transducers were validated by comparing the KZK modeling data with measurements and nonlinear full-diffraction simulations for a single-element source and arrays with 7 and 256 elements. The results provide look-up data for evaluating nonlinear distortions at the focus of existing therapeutic systems as well as for guiding the design of new transducers that generate specified nonlinear fields. PMID:27775904
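
    For reference, the KZK modeling mentioned above is based on the standard parabolic evolution equation of nonlinear acoustics, combining diffraction, thermoviscous absorption, and quadratic nonlinearity (written here in its textbook form, not transcribed from the paper):

        \frac{\partial^2 p}{\partial z\,\partial\tau}
          = \frac{c_0}{2}\,\Delta_{\perp} p
          + \frac{\delta}{2 c_0^{3}}\,\frac{\partial^3 p}{\partial\tau^{3}}
          + \frac{\beta}{2 \rho_0 c_0^{3}}\,\frac{\partial^2 p^{2}}{\partial\tau^{2}}

    where p is the acoustic pressure, z the propagation coordinate, \tau = t - z/c_0 the retarded time, \Delta_{\perp} the transverse Laplacian, \delta the sound diffusivity, \beta the coefficient of nonlinearity, and \rho_0, c_0 the ambient density and sound speed.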

  11. Diamond-based single-photon emitters

    NASA Astrophysics Data System (ADS)

    Aharonovich, I.; Castelletto, S.; Simpson, D. A.; Su, C.-H.; Greentree, A. D.; Prawer, S.

    2011-07-01

    The exploitation of emerging quantum technologies requires efficient fabrication of key building blocks. Sources of single photons are extremely important across many applications as they can serve as vectors for quantum information—thereby allowing long-range (perhaps even global-scale) quantum states to be made and manipulated for tasks such as quantum communication or distributed quantum computation. At the single-emitter level, quantum sources also afford new possibilities in terms of nanoscopy and bio-marking. Color centers in diamond are prominent candidates to generate and manipulate quantum states of light, as they are a photostable solid-state source of single photons at room temperature. In this review, we discuss the state of the art of diamond-based single-photon emitters and highlight their fabrication methodologies. We present the experimental techniques used to characterize the quantum emitters and discuss their photophysical properties. We outline a number of applications including quantum key distribution, bio-marking and sub-diffraction imaging, where diamond-based single emitters are playing a crucial role. We conclude with a discussion of the main challenges and perspectives for employing diamond emitters in quantum information processing.

  12. Estimation of errors in the inverse modeling of accidental release of atmospheric pollutant: Application to the reconstruction of the cesium-137 and iodine-131 source terms from the Fukushima Daiichi power plant

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Bocquet, Marc; Saunier, Olivier; Mathieu, Anne

    2012-03-01

    A major difficulty when inverting the source term of an atmospheric tracer dispersion problem is the estimation of the prior errors: those of the atmospheric transport model, those ascribed to the representativity of the measurements, those that are instrumental, and those attached to the prior knowledge on the variables one seeks to retrieve. In the case of an accidental release of pollutant, the reconstructed source is sensitive to these assumptions. This sensitivity makes the quality of the retrieval dependent on the methods used to model and estimate the prior errors of the inverse modeling scheme. We propose to use an estimation method for the errors' amplitude based on the maximum likelihood principle. Under semi-Gaussian assumptions, it takes into account, without approximation, the positivity assumption on the source. We apply the method to the estimation of the Fukushima Daiichi source term using activity concentrations in the air. The results are compared to an L-curve estimation technique and to Desroziers's scheme. The total reconstructed activities significantly depend on the chosen method. Because of the poor observability of the Fukushima Daiichi emissions, these methods provide lower bounds for cesium-137 and iodine-131 reconstructed activities. These lower bound estimates, 1.2 × 10^16 Bq for cesium-137, with an estimated standard deviation range of 15%-20%, and 1.9-3.8 × 10^17 Bq for iodine-131, with an estimated standard deviation range of 5%-10%, are of the same order of magnitude as those provided by the Japanese Nuclear and Industrial Safety Agency and about 5 to 10 times less than the Chernobyl atmospheric releases.
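
    As a hedged illustration of the inversion framework (standard notation for linear source-term inversion, not the exact formulation of the paper), the discretized source vector \sigma is retrieved by minimizing a regularized least-squares cost subject to positivity,

        J(\sigma) = \tfrac{1}{2}\,(\mu - H\sigma)^{\mathrm{T}} R^{-1} (\mu - H\sigma)
                  + \tfrac{1}{2}\,\sigma^{\mathrm{T}} B^{-1} \sigma,
        \qquad \sigma \ge 0,

    where \mu collects the activity-concentration observations, H is the source-receptor (Jacobian) matrix produced by the transport model, and R and B are the observation- and prior-error covariances whose amplitudes are the hyperparameters estimated by maximum likelihood; the semi-Gaussian assumption is what allows the positivity constraint to be accounted for without approximation.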

  13. Survey of university programs in remote sensing funded under grants from the NASA University-Space Applications program

    NASA Technical Reports Server (NTRS)

    Madigan, J. A.; Earhart, R. W.

    1978-01-01

    NASA's Office of Space and Terrestrial Applications (OSTA) is currently assessing approaches to transferring NASA technology to both the public and private sectors. As part of this assessment, NASA is evaluating the effectiveness of an ongoing program in remote sensing technology transfer conducted by 20 university contractors/grantees, each supported totally or partially by NASA funds. The University-Space Applications program has as its objective the demonstration of practical benefits from the use of remote sensing technology to a broad spectrum of new users, principally in state and local governments. To evaluate the University-Space Applications program, NASA has a near-term requirement for data on each university effort including total funding, funding sources, length of program, program description, and effectiveness measures.

  14. Extremozymes: A Potential Source for Industrial Applications.

    PubMed

    Dumorné, Kelly; Córdova, David Camacho; Astorga-Eló, Marcia; Renganathan, Prabhaharan

    2017-04-28

    Extremophilic microorganisms have developed a diversity of molecular strategies in order to survive in extreme conditions. Biocatalysts isolated from these organisms are termed extremozymes and possess extraordinary properties of salt tolerance, thermostability, and cold adaptivity. Extremozymes are very resistant to extreme conditions owing to their great stability, and they open new opportunities for biocatalysis and biotransformation, as well as for economic development and new lines of research, through their application. Thermophilic proteins, piezophilic proteins, acidophilic proteins, and halophilic proteins have been studied during the last few years. Amylases, proteases, lipases, pullulanases, cellulases, chitinases, xylanases, pectinases, isomerases, esterases, and dehydrogenases have great potential for application in agricultural, chemical, biomedical, and biotechnological processes. The study of extremozymes and their main applications has expanded in recent years.

  15. Leveraging terminological resources for mapping between rare disease information sources.

    PubMed

    Rance, Bastien; Snyder, Michelle; Lewis, Janine; Bodenreider, Olivier

    2013-01-01

    Rare disease information sources are incompletely and inconsistently cross-referenced to one another, making it difficult for information seekers to navigate across them. The development of such cross-references established manually by experts is generally labor intensive and costly. Our objective is to develop an automatic mapping between two of the major rare disease information sources, GARD and Orphanet, by leveraging terminological resources, especially the UMLS. We map the rare disease terms from Orphanet and ORDR to the UMLS. We use the UMLS as a pivot to bridge between the rare disease terminologies. We compare our results to a mapping obtained through manually established cross-references to OMIM. Our mapping has a precision of 94%, a recall of 63%, and an F1-score of 76%. Our automatic mapping should help facilitate the development of more complete and consistent cross-references between GARD and Orphanet, and is applicable to other rare disease information sources as well.
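
    The reported scores are mutually consistent: the F1-score is the harmonic mean of precision and recall, so with P = 0.94 and R = 0.63,

        F_1 = \frac{2PR}{P + R} = \frac{2 \times 0.94 \times 0.63}{0.94 + 0.63} \approx 0.75,

    in line with the reported 76% once the unrounded precision and recall values are used.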

  16. The evolution of methods for noise prediction of high speed rotors and propellers in the time domain

    NASA Technical Reports Server (NTRS)

    Farassat, F.

    1986-01-01

    Linear wave equation models which have been used over the years at NASA Langley for describing noise emissions from high speed rotating blades are summarized. The noise sources are assumed to lie on a moving surface, and analysis of the situation has been based on the Ffowcs Williams-Hawkings (FW-H) equation. Although the equation accounts for two surface and one volume source, the NASA analyses have considered only the surface terms. Several variations on the FW-H model are delineated for various types of applications, noting the computational benefits of removing the frequency dependence of the calculations. Formulations are also provided for compact and noncompact sources, and features of Long's subsonic integral equation and Farassat's high speed integral equation are discussed. The selection of subsonic or high speed models is dependent on the Mach number of the blade surface where the source is located.
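
    For orientation, the FW-H equation referred to above can be quoted in its standard form from the general aeroacoustics literature (not transcribed from this report), with two surface source terms and one volume term:

        \Box^2 p'(\mathbf{x}, t)
          = \frac{\partial}{\partial t}\bigl[\rho_0 v_n\,\delta(f)\bigr]
          - \frac{\partial}{\partial x_i}\bigl[\ell_i\,\delta(f)\bigr]
          + \frac{\partial^2}{\partial x_i\,\partial x_j}\bigl[T_{ij}\,H(f)\bigr]

    where f = 0 defines the moving blade surface, v_n is the surface normal velocity, \ell_i the local force per unit area exerted on the fluid, T_{ij} the Lighthill stress tensor, \delta and H the Dirac and Heaviside functions, and \Box^2 the wave operator. The NASA analyses summarized here retain only the two surface (delta-function) terms.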

  17. Design and characterization of electron beam focusing for X-ray generation in novel medical imaging architecture

    PubMed Central

    Bogdan Neculaes, V.; Zou, Yun; Zavodszky, Peter; Inzinna, Louis; Zhang, Xi; Conway, Kenneth; Caiafa, Antonio; Frutschy, Kristopher; Waters, William; De Man, Bruno

    2014-01-01

    A novel electron beam focusing scheme for medical X-ray sources is described in this paper. Most vacuum based medical X-ray sources today employ a tungsten filament operated in temperature limited regime, with electrostatic focusing tabs for limited range beam optics. This paper presents the electron beam optics designed for the first distributed X-ray source in the world for Computed Tomography (CT) applications. This distributed source includes 32 electron beamlets in a common vacuum chamber, with 32 circular dispenser cathodes operated in space charge limited regime, where the initial circular beam is transformed into an elliptical beam before being collected at the anode. The electron beam optics designed and validated here are at the heart of the first Inverse Geometry CT system, with potential benefits in terms of improved image quality and dramatic X-ray dose reduction for the patient. PMID:24826066

  18. Investigation of remote sensing techniques as inputs to operational resource management models. [South Dakota

    NASA Technical Reports Server (NTRS)

    Schmer, F. A. (Principal Investigator); Isakson, R. E.; Eidenshink, J. C.

    1977-01-01

    The author has identified the following significant results. Successful operational applications of LANDSAT data were found for level 1 land use mapping, drainage network delineation, and aspen mapping. Visual LANDSAT interpretation using 1:125,000 color composite imagery was the least expensive method of obtaining timely level 1 land use data. With an average agricultural/rangeland interpretation accuracy in excess of 80%, such a data source was considered the most cost effective of those sources available to state agencies. Costs do not compare favorably with those incurred using the present method of extracting land use data from historical tabular summaries. The cost increase in advancing from the present procedure to a satellite-based data source was justified in terms of expanded data content.

  19. Source reconstruction via the spatiotemporal Kalman filter and LORETA from EEG time series with 32 or fewer electrodes.

    PubMed

    Hamid, Laith; Al Farawn, Ali; Merlet, Isabelle; Japaridze, Natia; Heute, Ulrich; Stephani, Ulrich; Galka, Andreas; Wendling, Fabrice; Siniatchkin, Michael

    2017-07-01

    The clinical routine of non-invasive electroencephalography (EEG) is usually performed with 8-40 electrodes, especially in long-term monitoring, in infants, or in emergency care. There is a need in clinical and scientific brain imaging to develop inverse solution methods that can reconstruct brain sources from these low-density EEG recordings. In this proof-of-principle paper we investigate the performance of the spatiotemporal Kalman filter (STKF) in EEG source reconstruction with 9, 19, and 32 electrodes. We used simulated EEG data of epileptic spikes generated from lateral frontal and lateral temporal brain sources using state-of-the-art neuronal population models. For validation of source reconstruction, we compared STKF results to the location of the simulated source and to the results of the low-resolution brain electromagnetic tomography (LORETA) standard inverse solution. STKF consistently showed less localization bias compared to LORETA, especially when the number of electrodes was decreased. The results encourage further research into the application of the STKF in source reconstruction of brain activity from low-density EEG recordings.

  20. Derivation of the linear-logistic model and Cox's proportional hazard model from a canonical system description.

    PubMed

    Voit, E O; Knapp, R G

    1997-08-15

    The linear-logistic regression model and Cox's proportional hazard model are widely used in epidemiology. Their successful application leaves no doubt that they are accurate reflections of observed disease processes and their associated risks or incidence rates. In spite of their prominence, it is not a priori evident why these models work. This article presents a derivation of the two models from the framework of canonical modeling. It begins with a general description of the dynamics between risk sources and disease development, formulates this description in the canonical representation of an S-system, and shows how the linear-logistic model and Cox's proportional hazard model follow naturally from this representation. The article interprets the model parameters in terms of epidemiological concepts as well as in terms of general systems theory and explains the assumptions and limitations generally accepted in the application of these epidemiological models.

  1. Calculation and Visualization of Atomistic Mechanical Stresses in Nanomaterials and Biomolecules

    PubMed Central

    Gilson, Michael K.

    2014-01-01

    Many biomolecules have machine-like functions, and accordingly are discussed in terms of mechanical properties like force and motion. However, the concept of stress, a mechanical property that is of fundamental importance in the study of macroscopic mechanics, is not commonly applied in the biomolecular context. We anticipate that microscopical stress analyses of biomolecules and nanomaterials will provide useful mechanistic insights and help guide molecular design. To enable such applications, we have developed Calculator of Atomistic Mechanical Stress (CAMS), an open-source software package for computing atomic resolution stresses from molecular dynamics (MD) simulations. The software also enables decomposition of stress into contributions from bonded, nonbonded and Generalized Born potential terms. CAMS reads GROMACS topology and trajectory files, which are easily generated from AMBER files as well; and time-varying stresses may be animated and visualized in the VMD viewer. Here, we review relevant theory and present illustrative applications. PMID:25503996

  2. Calculation and visualization of atomistic mechanical stresses in nanomaterials and biomolecules.

    PubMed

    Fenley, Andrew T; Muddana, Hari S; Gilson, Michael K

    2014-01-01

    Many biomolecules have machine-like functions, and accordingly are discussed in terms of mechanical properties like force and motion. However, the concept of stress, a mechanical property that is of fundamental importance in the study of macroscopic mechanics, is not commonly applied in the biomolecular context. We anticipate that microscopical stress analyses of biomolecules and nanomaterials will provide useful mechanistic insights and help guide molecular design. To enable such applications, we have developed Calculator of Atomistic Mechanical Stress (CAMS), an open-source software package for computing atomic resolution stresses from molecular dynamics (MD) simulations. The software also enables decomposition of stress into contributions from bonded, nonbonded and Generalized Born potential terms. CAMS reads GROMACS topology and trajectory files, which are easily generated from AMBER files as well; and time-varying stresses may be animated and visualized in the VMD viewer. Here, we review relevant theory and present illustrative applications.
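
    As a rough indication of what an atomistic stress calculation involves, a textbook virial-type expression (not necessarily the exact estimator implemented in CAMS) assigns to atom i, over a local volume \Omega_i, the stress

        \sigma_i^{\alpha\beta} = -\frac{1}{\Omega_i}\Bigl( m_i v_i^{\alpha} v_i^{\beta}
          + \tfrac{1}{2}\sum_{j \ne i} r_{ij}^{\alpha} f_{ij}^{\beta} \Bigr)

    where m_i and v_i are the atomic mass and velocity, r_{ij} the separation vector, and f_{ij} the force on atom i due to atom j; splitting the pairwise sum by interaction type corresponds to the bonded, nonbonded, and Generalized Born decomposition reported by the software.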

  3. Study, optimization, and design of a laser heat engine. [for satellite applications

    NASA Technical Reports Server (NTRS)

    Taussig, R. T.; Cassady, P. E.; Zumdieck, J. F.

    1978-01-01

    Laser heat engine concepts, proposed for satellite applications, are analyzed to determine which engine concept best meets the requirements of high efficiency (50 percent or better), continuous operation in space using near-term technology. The analysis of laser heat engines includes the thermodynamic cycles, engine design, laser power sources, collector/concentrator optics, receiving windows, absorbers, working fluids, electricity generation, and heat rejection. Specific engine concepts, optimized according to thermal efficiency, are rated by their technological availability and scaling to higher powers. A near-term experimental demonstration of the laser heat engine concept appears feasible utilizing an Otto cycle powered by CO2 laser radiation coupled into the engine through a diamond window. Higher cycle temperatures, higher efficiencies, and scalability to larger sizes appear to be achievable from a laser heat engine design based on the Brayton cycle and powered by a CO laser.

  4. Tracing changes in soil N transformations to explain the doubling of N2O emissions under elevated CO2 in the Giessen FACE

    NASA Astrophysics Data System (ADS)

    Moser, Gerald; Brenzinger, Kristof; Gorenflo, Andre; Clough, Tim; Braker, Gesche; Müller, Christoph

    2017-04-01

    To reduce the emissions of greenhouse gases (CO2, CH4 and N2O) it is important to quantify the main sources and identify the respective ecosystem processes. While the main sources of N2O emissions in agro-ecosystems under current conditions are well known, the influence of a projected higher level of CO2 on the main ecosystem processes responsible for N2O emissions has not been investigated in detail. A major result of the Giessen FACE in a managed temperate grassland was that a +20% CO2 level caused a positive feedback, with N2O emissions increasing to 221% of those under control conditions. To be able to trace the sources of the additional N2O emissions, a 15N tracing study was conducted. We measured the N2O emission and its 15N signature, together with the 15N signatures of soil and plant samples. The results were analyzed using a 15N tracing model which quantified the main changes in N transformation rates under elevated CO2. Directly after 15N fertilizer application, N transformations were much more dynamic than in the long run. Absolute mineralisation and DNRA rates were lower under elevated CO2 in the short term but higher in the long term. During the one-year study period beginning with the 15N labelling, a 1.8-fold increase of N2O emissions occurred under elevated CO2. The source of the increased N2O was associated with NO3- in the first weeks after 15N application. Elevated CO2 affected denitrification rates, which resulted in increased N2O emissions due to a change of gene transcription rates (nosZ/(nirK+nirS)) and the resulting enzyme activity (see: Brenzinger et al.). Here we show that the enhanced N2O emissions reported for the first 8 FACE years prevail even in the long term (> 15 years). The effect of elevated CO2 on N2O production/emission can be explained by altered activity ratios within a stable microbial community.

  5. RAND Workshop on Antiproton Science and Technology, Annotated Executive Summary. (October 6-9, 1987)

    DTIC Science & Technology

    1988-10-01

    parity violation to condensed matter. A number of near-term important applications are possible using the source and portable storage devices...from charge parity violation studies to condensed matter studies. The CERN/LEAR facility will continue to only scratch the surface of important...technology programs. These technology programs include possible small tools to study extreme states of matter; a propulsion test facility for

  6. Acoustic Impact of Short-Term Ocean Variability in the Okinawa Trough

    DTIC Science & Technology

    2010-01-20

    nature run: Generalized Digital Environment Model (GDEM) 3.0 climatology[1], Modular Ocean Data Assimilation System (MODAS) synthetic profiles[2], Navy...potentially preferred for a particular class of applications, and thus a possible source of sound speed for estimates of acoustic transmission. Three, GDEM, MODAS, and NCODA, are statistical products, and the other three are dynamic forecasts from NCOM. GDEM is a climatology based solely on historical

  7. Application of the DG-1199 methodology to the ESBWR and ABWR.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalinich, Donald A.; Gauntt, Randall O.; Walton, Fotini

    2010-09-01

    Appendix A-5 of Draft Regulatory Guide DG-1199 'Alternative Radiological Source Term for Evaluating Design Basis Accidents at Nuclear Power Reactors' provides guidance - applicable to RADTRAD MSIV leakage models - for scaling containment aerosol concentration to the expected steam dome concentration in order to preserve the simplified use of the Accident Source Term (AST) in assessing containment performance under assumed design basis accident (DBA) conditions. In this study Economic and Safe Boiling Water Reactor (ESBWR) and Advanced Boiling Water Reactor (ABWR) RADTRAD models are developed using the DG-1199, Appendix A-5 guidance. The models were run using RADTRAD v3.03. Low Population Zone (LPZ), control room (CR), and worst-case 2-hr Exclusion Area Boundary (EAB) doses were calculated and compared to the relevant accident dose criteria in 10 CFR 50.67. For the ESBWR, the dose results were all lower than the MSIV leakage doses calculated by General Electric/Hitachi (GEH) in their licensing technical report. There are no comparable ABWR MSIV leakage doses; however, it should be noted that the ABWR doses are lower than the ESBWR doses. In addition, sensitivity cases were evaluated to ascertain the influence/importance of key input parameters/features of the models.

  8. A large eddy simulation scheme for turbulent reacting flows

    NASA Technical Reports Server (NTRS)

    Gao, Feng

    1993-01-01

    The recent development of the dynamic subgrid-scale (SGS) model has provided a consistent method for generating localized turbulent mixing models and has opened up great possibilities for applying the large eddy simulation (LES) technique to real world problems. Given the fact that the direct numerical simulation (DNS) can not solve for engineering flow problems in the foreseeable future (Reynolds 1989), the LES is certainly an attractive alternative. It seems only natural to bring this new development in SGS modeling to bear on the reacting flows. The major stumbling block for introducing LES to reacting flow problems has been the proper modeling of the reaction source terms. Various models have been proposed, but none of them has a wide range of applicability. For example, some of the models in combustion have been based on the flamelet assumption which is only valid for relatively fast reactions. Some other models have neglected the effects of chemical reactions on the turbulent mixing time scale, which is certainly not valid for fast and non-isothermal reactions. The probability density function (PDF) method can be usefully employed to deal with the modeling of the reaction source terms. In order to fit into the framework of LES, a new PDF, the large eddy PDF (LEPDF), is introduced. This PDF provides an accurate representation for the filtered chemical source terms and can be readily calculated in the simulations. The details of this scheme are described.
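
    The central closure object in the PDF treatment sketched above is the filtered chemical source term; in standard LES-PDF notation (not quoted verbatim from the report), it is the instantaneous source rate weighted by the large eddy PDF of the composition variables,

        \overline{S}(\mathbf{x}, t) = \int S(\boldsymbol{\psi})\, P_L(\boldsymbol{\psi}; \mathbf{x}, t)\, d\boldsymbol{\psi}

    where \boldsymbol{\psi} spans the sample space of the reactive scalars, S(\boldsymbol{\psi}) is the chemical source rate, and P_L is the large eddy PDF (LEPDF); because the chemistry enters only through S(\boldsymbol{\psi}), the filtered source term is obtained in closed form once P_L is known.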

  9. Influence of heat conducting substrates on explosive crystallization in thin layers

    NASA Astrophysics Data System (ADS)

    Schneider, Wilhelm

    2017-09-01

    Crystallization in a thin, initially amorphous layer is considered. The layer is in thermal contact with a substrate of very large dimensions. The energy equation of the layer contains source and sink terms. The source term is due to liberation of latent heat in the crystallization process, while the sink term is due to conduction of heat into the substrate. To determine the latter, the heat diffusion equation for the substrate is solved by applying Duhamel's integral. Thus, the energy equation of the layer becomes a heat diffusion equation with a time integral as an additional term. The latter term indicates that the heat loss due to the substrate depends on the history of the process. To complete the set of equations, the crystallization process is described by a rate equation for the degree of crystallization. The governing equations are then transformed to a moving co-ordinate system in order to analyze crystallization waves that propagate with invariant properties. Dual solutions are found by an asymptotic expansion for large activation energies of molecular diffusion. By introducing suitable variables, the results can be presented in a universal form that comprises the influence of all non-dimensional parameters that govern the process. Of particular interest for applications is the prediction of a critical heat loss parameter for the existence of crystallization waves with invariant properties.

  10. Maintaining Scientific Community Vocabularies in Drupal through Consumption of Linked Open Data and Web Services

    NASA Astrophysics Data System (ADS)

    Shepherd, A.; Arko, R. A.; Maffei, A. R.; Chandler, C. L.

    2012-12-01

    In the Summer of 2011, a one-year pilot project was funded by the National Science Foundation to build a pre-cruise planning application using the Drupal content management system (CMS). This application will be used to assist the individual operators of research vessels in the UNOLS fleet. A large portion of the operator's pre-cruise process revolves around a questionnaire presented to the principal investigator (PI) that is used to gather information about the nature of their upcoming cruise. The Drupal-based application will be delivered as a distribution for use by any operator of a UNOLS vessel to construct customized questionnaires and provide an interface for the PI to complete this questionnaire at their leisure. A major goal of the project is to develop an application that will require as little programming maintenance as possible after the initial development effort. One of the strategies employed is the reuse of existing controlled vocabularies and linked open data wherever possible for fields of the questionnaire - most notably to populate the concepts of Country, Organization, Port, and Ship. The Rolling Deck to Repository (R2R) program manages controlled vocabularies for these concepts and currently exposes these concepts as linked open data. Furthermore, R2R has identified the authoritative sources for pertinent oceanographic community vocabularies as ICES for Ship, UNOLS for Port, IANA for Organization, ISO for Country, ISO for Language, SeaDataNet for Device, FIPS for State, and IHO for Sea Area as described at http://www.rvdata.us/voc. The scope of the terms provided by these sources matches the scope of the operator's needs for these concepts, and so the application is being designed to automatically consume served information about these vocabulary terms to populate and update Drupal taxonomies for use in the questionnaire. Where newer terms are required for a PI to complete a questionnaire (before they appear in the vocabularies), the Drupal-based application employs features that provide extensibility to the Drupal taxonomies while striving for lower development and maintenance costs through the use of existing Drupal modules such as web_taxonomy, autocomplete field widgets and custom modules for consuming and managing data at SPARQL endpoints.

  11. Photoacoustic Spectroscopy with Quantum Cascade Lasers for Trace Gas Detection

    PubMed Central

    Elia, Angela; Di Franco, Cinzia; Lugarà, Pietro Mario; Scamarcio, Gaetano

    2006-01-01

    Various applications, such as pollution monitoring, toxic-gas detection, non-invasive medical diagnostics and industrial process control, require sensitive and selective detection of gas traces with concentrations in the parts in 10^9 (ppb) and sub-ppb range. The recent development of quantum-cascade lasers (QCLs) has given a new aspect to infrared laser-based trace gas sensors. In particular, single mode distributed feedback QCLs are attractive spectroscopic sources because of their excellent properties in terms of narrow linewidth, average power and room temperature operation. In combination with these laser sources, photoacoustic spectroscopy offers the advantages of high sensitivity and selectivity, a compact sensor platform, fast time-response and user friendly operation. This paper reports recent developments in quantum cascade laser-based photoacoustic spectroscopy for trace gas detection. In particular, different applications of a photoacoustic trace gas sensor employing a longitudinal resonant cell with a detection limit on the order of a hundred ppb of ozone and ammonia are discussed. We also report two QC laser-based photoacoustic sensors for the detection of nitric oxide, for environmental pollution monitoring and medical diagnostics, and hexamethyldisilazane, for applications in the semiconductor manufacturing process.

  12. Correcting STIS CCD Point-Source Spectra for CTE Loss

    NASA Technical Reports Server (NTRS)

    Goudfrooij, Paul; Bohlin, Ralph C.; Maiz-Apellaniz, Jesus

    2006-01-01

    We review the on-orbit spectroscopic observations that are being used to characterize the Charge Transfer Efficiency (CTE) of the STIS CCD in spectroscopic mode. We parameterize the CTE-related loss for spectrophotometry of point sources in terms of dependencies on the brightness of the source, the background level, the signal in the PSF outside the standard extraction box, and the time of observation. Primary constraints on our correction algorithm are provided by measurements of the CTE loss rates for simulated spectra (images of a tungsten lamp taken through slits oriented along the dispersion axis) combined with estimates of CTE losses for actual spectra of spectrophotometric standard stars in the first-order CCD modes. For point-source spectra at the standard reference position at the CCD center, CTE losses as large as 30% are corrected to within approximately 1% RMS after application of the algorithm presented here, rendering the Poisson noise associated with the source detection itself the dominant contributor to the total flux calibration uncertainty.

  13. The potential contribution of geothermal energy to electricity supply in Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Chandrasekharam, D.; Lashin, Aref; Al Arifi, Nassir

    2016-10-01

    With demand for electricity increasing at 7.5% per year, a major concern of Saudi Arabia is the amount of CO2 being emitted. The country has the potential of generating 200×10^6 kWh from hydrothermal sources and 120×10^6 terawatt hours from Enhanced Geothermal System (EGS) sources. In addition to electricity generation and desalination, the country has substantial sources for direct applications such as space cooling and heating, a sector that consumes 80% of the electricity generated from fossil fuels. Geothermal energy can easily offset the 17 million kWh of electricity that is being used for desalination. At least a part of the 181,000 Gg of CO2 emitted by conventional space cooling units can also be mitigated through ground-source heat pump technology immediately. Future development of EGS sources together with the wet geothermal systems will make the country stronger in terms of oil reserves saved and increased exports.

  14. Silver nanoparticles (AgNPs) as a contrast agent for imaging of animal tissue using swept-source optical coherence tomography (SSOCT)

    NASA Astrophysics Data System (ADS)

    Mondal, Indranil; Raj, Shipra; Roy, Poulomi; Poddar, Raju

    2018-01-01

    We present noninvasive three-dimensional depth-resolved imaging of animal tissue with a swept-source optical coherence tomography system at 1064 nm center wavelength and silver nanoparticles (AgNPs) as a potential contrast agent. A swept-source laser light source is used to enable an imaging rate of 100 kHz (100 000 A-scans s^-1). Swept-source optical coherence tomography is a new variant of the optical coherence tomography (OCT) technique, offering unique advantages in terms of sensitivity, reduction of motion artifacts, etc. To enhance the contrast of an OCT image, AgNPs are utilized as an exogenous contrast agent. AgNPs are synthesized using a modified Tollens method and characterization is done by UV-vis spectroscopy, dynamic light scattering, scanning electron microscopy and energy dispersive x-ray spectroscopy. In vitro imaging of chicken breast tissue, with and without the application of AgNPs, is performed. The effect of AgNPs is studied with different exposure times. A mathematical model is also built to calculate changes in the local scattering coefficient of tissue from OCT images. A quantitative estimation of scattering coefficient and contrast is performed for tissues with and without application of AgNPs. Significant improvement in contrast and increase in scattering coefficient with time is observed.
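
    The scattering-coefficient estimate mentioned above is commonly obtained by fitting the depth decay of the OCT signal with a single-scattering Beer-Lambert model; the sketch below uses that generic approach on synthetic data and does not reproduce the actual model or fitting ranges of the paper.

        # Hedged sketch: estimate a local scattering coefficient from an OCT A-scan
        # using a single-scattering model I(z) ~ I0 * exp(-2 * mu_s * z).
        # The A-scan is synthetic; real data would come from the SSOCT system.
        import numpy as np

        depth_mm = np.linspace(0.0, 1.0, 200)             # depth axis (mm)
        mu_s_true = 2.5                                   # mm^-1, assumed value
        rng = np.random.default_rng(0)
        a_scan = np.exp(-2.0 * mu_s_true * depth_mm) * (1 + 0.05 * rng.standard_normal(depth_mm.size))

        # Log-linear fit: ln I(z) = ln I0 - 2 * mu_s * z
        slope, intercept = np.polyfit(depth_mm, np.log(np.clip(a_scan, 1e-6, None)), 1)
        mu_s_est = -slope / 2.0
        print(f"estimated mu_s = {mu_s_est:.2f} mm^-1 (assumed true value {mu_s_true:.2f} mm^-1)")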

  15. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the estimates currently available are not yet satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from a broad information gathering. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.

  16. Chern-Simons Term: Theory and Applications.

    NASA Astrophysics Data System (ADS)

    Gupta, Kumar Sankar

    1992-01-01

    We investigate the quantization and applications of Chern-Simons theories to several systems of interest. Elementary canonical methods are employed for the quantization of abelian and nonabelian Chern-Simons actions using ideas from gauge theories and quantum gravity. When the spatial slice is a disc, it yields quantum states at the edge of the disc carrying a representation of the Kac-Moody algebra. We next include sources in this model and their quantum states are shown to be those of a conformal family. Vertex operators for both abelian and nonabelian sources are constructed. The regularized abelian Wilson line is proved to be a vertex operator. The spin-statistics theorem is established for Chern-Simons dynamics using purely geometrical techniques. Chern-Simons action is associated with exotic spin and statistics in 2 + 1 dimensions. We study several systems in which the Chern-Simons action affects the spin and statistics. The first class of systems we study consist of G/H models. The solitons of these models are shown to obey anyonic statistics in the presence of a Chern-Simons term. The second system deals with the effect of the Chern -Simons term in a model for high temperature superconductivity. The coefficient of the Chern-Simons term is shown to be quantized, one of its possible values giving fermionic statistics to the solitons of this model. Finally, we study a system of spinning particles interacting with 2 + 1 gravity, the latter being described by an ISO(2,1) Chern-Simons term. An effective action for the particles is obtained by integrating out the gauge fields. Next we construct operators which exchange the particles. They are shown to satisfy the braid relations. There are ambiguities in the quantization of this system which can be exploited to give anyonic statistics to the particles. We also point out that at the level of the first quantized theory, the usual spin-statistics relation need not apply to these particles.
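
    For context, the nonabelian Chern-Simons action referred to throughout this abstract has the standard form (conventional normalization, not transcribed from the thesis)

        S_{\mathrm{CS}} = \frac{k}{4\pi} \int_{M} \mathrm{tr}\Bigl( A \wedge dA + \tfrac{2}{3}\, A \wedge A \wedge A \Bigr)

    where A is the gauge connection on the three-manifold M and the level k is the coefficient whose quantization is discussed above; in the abelian case the cubic term is absent.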

  17. Mass discharge assessment at a brominated DNAPL site: Effects of known DNAPL source mass removal

    NASA Astrophysics Data System (ADS)

    Johnston, C. D.; Davis, G. B.; Bastow, T. P.; Woodbury, R. J.; Rao, P. S. C.; Annable, M. D.; Rhodes, S.

    2014-08-01

    Management and closure of contaminated sites is increasingly being proposed on the basis of mass flux of dissolved contaminants in groundwater. Better understanding of the links between source mass removal and contaminant mass fluxes in groundwater would allow greater acceptance of this metric in dealing with contaminated sites. Our objectives here were to show how measurements of the distribution of contaminant mass flux and the overall mass discharge emanating from the source under undisturbed groundwater conditions could be related to the processes and extent of source mass depletion. In addition, these estimates of mass discharge were sought in the application of agreed remediation targets set in terms of pumped groundwater quality from offsite wells. Results are reported from field studies conducted over a 5-year period at a brominated DNAPL (tetrabromoethane, TBA; and tribromoethene, TriBE) site located in suburban Perth, Western Australia. Groundwater fluxes (qw; L^3/L^2/T) and mass fluxes (Jc; M/L^2/T) of dissolved brominated compounds were simultaneously estimated by deploying Passive Flux Meters (PFMs) in wells in a heterogeneous layered aquifer. PFMs were deployed in control plane (CP) wells immediately down-gradient of the source zone, before (2006) and after (2011) 69-85% of the source mass was removed, mainly by groundwater pumping from the source zone. The high-resolution (26-cm depth interval) measures of qw and Jc along the source CP allowed investigation of the DNAPL source-zone architecture and impacts of source mass removal. Comparable estimates of total mass discharge (MD; M/T) across the source zone CP reduced from 104 g day^-1 to 24-31 g day^-1 (70-77% reductions). Importantly, this mass discharge reduction was consistent with the estimated proportion of source mass remaining at the site (15-31%). That is, a linear relationship between mass discharge and source mass is suggested. The spatial detail of groundwater and mass flux distributions also provided further evidence of the source zone architecture and DNAPL mass depletion processes. This was especially apparent in different mass-depletion rates from distinct parts of the CP. High mass fluxes and groundwater fluxes located near the base of the aquifer dominated in terms of the dissolved mass flux in the profile, although not in terms of concentrations. Reductions observed in Jc and MD were used to better target future remedial efforts. Integration of the observations from the PFM deployments and the source mass depletion provided a basis for establishing flux-based management criteria for the site.

  18. Mass discharge assessment at a brominated DNAPL site: Effects of known DNAPL source mass removal.

    PubMed

    Johnston, C D; Davis, G B; Bastow, T P; Woodbury, R J; Rao, P S C; Annable, M D; Rhodes, S

    2014-08-01

    Management and closure of contaminated sites is increasingly being proposed on the basis of mass flux of dissolved contaminants in groundwater. Better understanding of the links between source mass removal and contaminant mass fluxes in groundwater would allow greater acceptance of this metric in dealing with contaminated sites. Our objectives here were to show how measurements of the distribution of contaminant mass flux and the overall mass discharge emanating from the source under undisturbed groundwater conditions could be related to the processes and extent of source mass depletion. In addition, these estimates of mass discharge were sought in the application of agreed remediation targets set in terms of pumped groundwater quality from offsite wells. Results are reported from field studies conducted over a 5-year period at a brominated DNAPL (tetrabromoethane, TBA; and tribromoethene, TriBE) site located in suburban Perth, Western Australia. Groundwater fluxes (qw; L^3/L^2/T) and mass fluxes (Jc; M/L^2/T) of dissolved brominated compounds were simultaneously estimated by deploying Passive Flux Meters (PFMs) in wells in a heterogeneous layered aquifer. PFMs were deployed in control plane (CP) wells immediately down-gradient of the source zone, before (2006) and after (2011) 69-85% of the source mass was removed, mainly by groundwater pumping from the source zone. The high-resolution (26-cm depth interval) measures of qw and Jc along the source CP allowed investigation of the DNAPL source-zone architecture and impacts of source mass removal. Comparable estimates of total mass discharge (MD; M/T) across the source zone CP reduced from 104 g day^-1 to 24-31 g day^-1 (70-77% reductions). Importantly, this mass discharge reduction was consistent with the estimated proportion of source mass remaining at the site (15-31%). That is, a linear relationship between mass discharge and source mass is suggested. The spatial detail of groundwater and mass flux distributions also provided further evidence of the source zone architecture and DNAPL mass depletion processes. This was especially apparent in different mass-depletion rates from distinct parts of the CP. High mass fluxes and groundwater fluxes located near the base of the aquifer dominated in terms of the dissolved mass flux in the profile, although not in terms of concentrations. Reductions observed in Jc and MD were used to better target future remedial efforts. Integration of the observations from the PFM deployments and the source mass depletion provided a basis for establishing flux-based management criteria for the site.
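
    The mass-discharge bookkeeping used in these two records can be stated compactly in generic flux-integration notation (the discrete sum over depth intervals is an assumption about how the PFM data were combined, not a formula quoted from the papers):

        J_c = q_w\, C, \qquad
        M_D = \int_{\mathrm{CP}} J_c\, dA \;\approx\; \sum_{i} q_{w,i}\, C_i\, \Delta A_i

    where q_w is the groundwater (Darcy) flux, C the dissolved concentration, J_c the local contaminant mass flux, and the sum runs over the sampled intervals of area \Delta A_i along the control plane; the reported 70-77% drop in M_D then reflects the reduction in flux-weighted concentrations after source mass removal.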

  19. Advances in 193 nm excimer lasers for mass spectrometry applications

    NASA Astrophysics Data System (ADS)

    Delmdahl, Ralph; Esser, Hans-Gerd; Bonati, Guido

    2016-03-01

    Ongoing progress in mass analysis applications such as laser ablation inductively coupled plasma mass spectrometry of solid samples and ultraviolet photoionization mediated sequencing of peptides and proteins is to a large extent driven by ultrashort wavelength excimer lasers at 193 nm. This paper will introduce the latest improvements achieved in the development of compact high repetition rate excimer lasers and elaborate on their impact on mass spectrometry instrumentation. Various performance and lifetime measurements obtained in a long-term endurance test over the course of 18 months will be shown and discussed in view of the laser source requirements of different mass spectrometry tasks. These sampling-type applications are served by excimer lasers delivering pulsed 193 nm output of several mJ as well as fast repetition rates which are already approaching one kilohertz. In order to open up the pathway from the laboratory to broader industrial use, sufficient component lifetimes and long-term stable performance behavior have to be ensured. The long-term results which will be presented are based on diverse 193 nm excimer laser tube improvements, aiming at e.g. optimizing the gas flow dynamics, and have extended the operational life of the laser tube for the first time to several billion pulses even under high duty-cycle conditions.

  20. Integrated Global Observation Strategy - Ozone and Atmospheric Chemistry Project

    NASA Technical Reports Server (NTRS)

    Hilsenrath, Ernest; Readings, C. J.; Kaye, J.; Mohnen, V.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The "Long Term Continuity of Stratospheric Ozone Measurements and Atmospheric Chemistry" project was one of six established by the Committee on Earth Observing Satellites (CEOS) in response to the Integrated Global Observing Strategy (IGOS) initiative. IGOS links satellite and ground based systems for global environmental observations. The strategy of this project is to develop a consensus of user requirements including the scientific (SPARC, IGAC, WCRP) and the applications community (WMO, UNEP) and to develop a long-term international plan for ozone and atmospheric chemistry measurements. The major components of the observing system include operational and research (meeting certain criteria) satellite platforms planned by the space faring nations which are integrated with a well supported and sustained ground, aircraft, and balloon measurements program for directed observations as well satellite validation. Highly integrated and continuous measurements of ozone, validation, and reanalysis efforts are essential to meet the international scientific and applications goals. In order to understand ozone trends, climate change, and air quality, it is essential to conduct long term measurements of certain other atmospheric species. These species include key source, radical, and reservoir constituents.

  1. Overview of β-Glucans from Laminaria spp.: Immunomodulation Properties and Applications on Biologic Models

    PubMed Central

    Bonfim-Mendonça, Patrícia de Souza; Capoci, Isis Regina Grenier; Tobaldini-Valerio, Flávia Kelly; Negri, Melyssa; Svidzinski, Terezinha Inez Estivalet

    2017-01-01

    Glucans are a group of glucose polymers that are found in bacteria, algae, fungi, and plants. While their properties are well known, their biochemical and solubility characteristics vary considerably, and glucans obtained from different sources can have different applications. Research has described the bioactivity of β-glucans extracted from the algae of the Laminaria genus, including in vivo and in vitro studies assessing pro- and anti-inflammatory cytokines, vaccine production, inhibition of cell proliferation, and anti- and pro-oxidant activity. Thus, the objective of this article was to review the potential application of β-glucans from Laminaria spp. in terms of their immunomodulatory properties, microorganism host interaction, anti-cancer activity and vaccine development. PMID:28878139

  2. Study of efficient video compression algorithms for space shuttle applications

    NASA Technical Reports Server (NTRS)

    Poo, Z.

    1975-01-01

    Results are presented of a study on video data compression techniques applicable to space flight communication. This study is directed towards monochrome (black and white) picture communication, with special emphasis on feasibility of hardware implementation. The primary factors for such a communication system in space flight application are: picture quality, system reliability, power consumption, and hardware weight. In terms of hardware implementation, these are directly related to hardware complexity, effectiveness of the hardware algorithm, immunity of the source code to channel noise, and data transmission rate (or transmission bandwidth). A system is recommended, and its hardware requirements summarized. Simulations of the study were performed on the improved LIM video controller which is computer-controlled by the META-4 CPU.

  3. Exopolysaccharides enriched in rare sugars: bacterial sources, production, and applications.

    PubMed

    Roca, Christophe; Alves, Vitor D; Freitas, Filomena; Reis, Maria A M

    2015-01-01

    Microbial extracellular polysaccharides (EPS), produced by a wide range of bacteria, are high molecular weight biopolymers, presenting an extreme diversity in terms of chemical structure and composition. They may be used in many applications, depending on their chemical and physical properties. A rather unexplored aspect is the presence of rare sugars in the composition of some EPS. Rare sugars, such as rhamnose or fucose, may provide EPS with additional biological properties compared to those composed of more common sugar monomers. This review gives a brief overview of these specific EPS and their producing bacteria. Cultivation conditions are summarized, demonstrating their impact on the EPS composition, together with downstream processing. Finally, their use in different areas, including cosmetics, food products, pharmaceuticals, and biomedical applications, are discussed.

  4. Radiometric Calibration Techniques for Signal-of-Opportunity Reflectometers

    NASA Technical Reports Server (NTRS)

    Piepmeier, Jeffrey R.; Shah, Rashmi; Deshpande, Manohar; Johnson, Carey

    2014-01-01

    Bi-static reflection measurements utilizing global navigation satellite service (GNSS) or other signals of opportunity (SoOp) can be used to sense ocean and terrestrial surface properties. End-to-end calibration of GNSS-R has been performed using a well-characterized reflection surface (e.g., water), direct path antenna, and receiver gain characterization. We propose an augmented approach using on-board receiver electronics for radiometric calibration of SoOp reflectometers utilizing direct and reflected signal receiving antennas. The method calibrates receiver and correlator gains and offsets utilizing a reference switch and common noise source. On-board electronic calibration sources, such as reference switches, noise diodes and loop-back circuits, have shown great utility in stabilizing total power and correlation microwave radiometer and scatterometer receiver electronics in L-band spaceborne instruments. Application to SoOp instruments is likely to bring several benefits. For example, providing short and long time scale calibration stability of the direct path channel, especially in low signal-to-noise ratio configurations, is directly analogous to the microwave radiometer problem. The direct path channel is analogous to the loopback path in a scatterometer in providing a reference of the transmitted power, although the receiver is independent from the reflected path channel. Thus, a common noise source can be used to measure the gain ratio of the two paths. Using these techniques, long-term (days to weeks) calibration stability better than 0.1 has been achieved for spaceborne L-band scatterometers and radiometers. Similar long-term stability would likely be needed for a spaceborne reflectometer mission to measure terrestrial properties such as soil moisture.

  5. (abstract) An Assessment of Electric Propulsion Research, Development, and Application in the United States

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1995-01-01

    This paper will discuss the development of Electric Propulsion technology in the U.S. from the 1960's to the present. It will summarize the various activities related to arcjets, resistojets, pulsed plasma thrustors, magneto-plasma-dynamic thrustors, ion engines, and more recently the evaluation of Hall effect thrustors of the SPT or Anode Layer type developed in Russia. Also, demonstration test flights and actual mission applications will be summarized. Finally, the future application of electric propulsion to near-term commercial communications satellites and planetary missions will be projected. This history is rich in diversity, and has involved a succession of types of thrustors, propellants, and electric power sources. With the recent use of arcjets on commercial communication satellites and the flight tests of ion engines for this application, it appears that electric propulsion is finally on the verge of widespread application.

  6. Integrating diverse forage sources reduces feed gaps on mixed crop-livestock farms.

    PubMed

    Bell, L W; Moore, A D; Thomas, D T

    2017-12-04

    Highly variable climates induce large variability in the supply of forage for livestock, and so farmers must manage their livestock systems to reduce the risk of feed gaps (i.e. periods when livestock feed demand exceeds forage supply). However, mixed crop-livestock farmers can utilise a range of feed sources on their farms to help mitigate these risks. This paper reports on the development and application of a simple whole-farm feed-energy balance calculator which is used to evaluate the frequency and magnitude of feed gaps. The calculator matches long-term simulations of variation in forage and metabolisable energy supply from diverse sources against energy demand for different livestock enterprises. Scenarios of increasing the diversity of forage sources in livestock systems are investigated for six locations selected to span Australia's crop-livestock zone; a sketch of the underlying energy-balance calculation follows this abstract. We found that systems relying on only one feed source were prone to higher risk of feed gaps, and hence, would often have to reduce stocking rates to mitigate these risks or use supplementary feed. At all sites, adding more feed sources to the farm feedbase improved the continuity of supply of both fresh and carry-over forage, reducing the frequency and magnitude of feed deficits. However, there were diminishing returns from making the feedbase more complex, with combinations of two to three feed sources typically achieving the maximum benefits in terms of reducing the risk of feed gaps. Higher stocking rates could be maintained while limiting risk when combinations of other feed sources were introduced into the feedbase. For the same level of risk, a feedbase relying on a diversity of forage sources could support stocking rates 1.4 to 3 times higher than if it were using a single pasture source. This suggests that there is significant capacity to mitigate the risk of feed gaps while at the same time increasing 'safe' stocking rates through better integration of feed sources on mixed crop-livestock farms across diverse regions and climates.
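
    A minimal sketch of the whole-farm feed-energy balance idea follows: monthly metabolisable-energy supply summed over forage sources is compared with herd demand, and months in which demand exceeds supply are flagged as feed gaps. The figures are invented placeholders, not outputs of the calculator described above.

        # Hedged sketch of a whole-farm feed-energy balance (monthly time step).
        # All numbers are illustrative placeholders, not values from the cited study.
        monthly_me_supply = {                       # MJ ME/ha/month per feed source
            "pasture":      [900, 700, 400, 200, 150, 120, 150, 250, 500, 800, 950, 980],
            "forage_crop":  [  0,   0, 300, 450, 500, 400, 300, 200,   0,   0,   0,   0],
            "crop_stubble": [  0,   0,   0,   0,   0, 250, 350, 300, 150,   0,   0,   0],
        }
        monthly_me_demand = [700] * 12              # MJ ME/ha/month for the livestock enterprise

        def feed_gaps(supply_by_source, demand):
            """Return (month index, deficit) for months where demand exceeds total supply."""
            gaps = []
            for month, need in enumerate(demand):
                total = sum(source[month] for source in supply_by_source.values())
                if need > total:
                    gaps.append((month, need - total))
            return gaps

        print("feed gaps (month, MJ ME deficit):", feed_gaps(monthly_me_supply, monthly_me_demand))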

  7. Modularized multilevel and z-source power converter as renewable energy interface for vehicle and grid-connected applications

    NASA Astrophysics Data System (ADS)

    Cao, Dong

    Due to the energy crisis and increased oil prices, renewable energy sources such as photovoltaic panels, wind turbines, and thermoelectric generation modules are used more and more widely for vehicle and grid-connected applications. However, because the output of these renewable energy sources varies with solar radiation, wind speed, or temperature difference, a power converter interface is required for vehicle or grid-connected applications. The thermoelectric generation (TEG) module as a renewable energy source for the automotive industry has become very popular recently. Because of the inherent characteristics of TEG modules, low-input-voltage, high-input-current, high-voltage-gain dc-dc converters are needed for the automotive load. Traditional high-voltage-gain dc-dc converters are not suitable for automotive applications in terms of size and high-temperature operation, so switched-capacitor dc-dc converters are used for this application. However, traditional switched-capacitor dc-dc converters suffer from high voltage spikes and EMI problems, and huge capacitor banks have to be utilized to reduce the voltage ripple and achieve high efficiency. A series of zero-current-switching (ZCS) or zero-voltage-switching switched-capacitor dc-dc converters has been proposed to overcome these problems. With the proposed soft-switching strategy, voltage spikes are reduced, EMI noise is restricted, and the huge capacitor bank is eliminated. High-efficiency, high-power-density, high-temperature switched-capacitor dc-dc converters can thus be made for the TEG interface in vehicle applications. Several prototypes have been built to validate the proposed circuits and confirm their operation. In order to apply PV panels to grid-connected applications, a low-cost dc-ac inverter interface is required. Based on transformer use and safety concerns, two solutions can be implemented: a non-isolated or an isolated PV inverter. For the non-isolated, transformerless solution, a semi-Z-source inverter for single-phase photovoltaic systems has been proposed. The proposed semi-Z-source inverter utilizes only two switching devices with a doubly grounded feature; the total cost is reduced, and the safety and EMI issues caused by high-frequency ground current are resolved. For the transformer-isolated solution, a boost half-bridge dc-ac micro-inverter has been proposed. The proposed boost half-bridge dc-dc converter utilizes only two switching devices with zero-voltage-switching features, which reduces the total system cost and power loss.

  8. High-radiance LDP source for mask inspection and beam line applications (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Teramoto, Yusuke; Santos, Bárbara; Mertens, Guido; Kops, Ralf; Kops, Margarete; von Wezyk, Alexander; Bergmann, Klaus; Yabuta, Hironobu; Nagano, Akihisa; Ashizawa, Noritaka; Taniguchi, Yuta; Yamatani, Daiki; Shirai, Takahiro; Kasama, Kunihiko

    2017-04-01

    High-throughput actinic mask inspection tools are needed as EUVL begins to enter the volume-production phase. One of the key technologies to realize such inspection tools is a high-radiance EUV source whose radiance is expected to be as high as 100 W/mm2/sr. Ushio is developing laser-assisted discharge-produced plasma (LDP) sources. Ushio's LDP source is able to provide sufficient radiance as well as cleanliness, stability, and reliability. Radiance behind the debris mitigation system was confirmed to be 120 W/mm2/sr at 9 kHz, and peak radiance at the plasma was increased to over 200 W/mm2/sr in recent development, which supports high-throughput, high-precision mask inspection in the current and future technology nodes. One of the unique features of Ushio's LDP source is cleanliness. Cleanliness evaluation using both grazing-incidence Ru mirrors and normal-incidence Mo/Si mirrors showed no considerable damage to the mirrors other than smooth sputtering of the surface at a pace of a few nm per Gpulse. In order to prove the system reliability, several long-term tests were performed. Data recorded during the tests were analyzed to assess two-dimensional radiance stability. In addition, several operating parameters were monitored to determine which contribute to the radiance stability. The latest model, which features a large opening angle, was recently developed so that the tool can utilize a large number of debris-free photons behind the debris shield. The model was designed both for beam line applications and for high-throughput mask inspection applications. At the time of publication, the first product is expected to be in use at a customer site.

  9. International market assessment of stand-alone photovoltaic power systems for cottage industry applications

    NASA Astrophysics Data System (ADS)

    Philippi, T. M.

    1981-11-01

    The final result of an international assessment of the market for stand-alone photovoltaic systems in cottage industry applications is reported. Nonindustrialized countries without centrally planned economies were considered. Cottage industries were defined as small rural manufacturers, employing fewer than 50 people, producing consumer and simple products. The data to support this analysis were obtained from secondary and expert sources in the U.S. and from in-country field investigations of the Philippines and Mexico. The near-term market for photovoltaics for rural cottage industry applications appears to be limited to demonstration projects and pilot programs, based on an in-depth study of the nature of cottage industry, its role in the rural economy, the electric energy requirements of cottage industry, and a financial analysis of stand-alone photovoltaic systems as compared to their most viable competitor, diesel-driven generators. Photovoltaics are shown to be a better long-term option only for very low power requirements. Some of these uses would include clay mixers, grinders, centrifuges, lathes, power saws, and lighting of a workshop.

  10. International market assessment of stand-alone photovoltaic power systems for cottage industry applications

    NASA Technical Reports Server (NTRS)

    Philippi, T. M.

    1981-01-01

    The final result of an international assessment of the market for stand-alone photovoltaic systems in cottage industry applications is reported. Nonindustrialized countries without centrally planned economies were considered. Cottage industries were defined as small rural manufacturers, employing fewer than 50 people, producing consumer and simple products. The data to support this analysis were obtained from secondary and expert sources in the U.S. and from in-country field investigations of the Philippines and Mexico. The near-term market for photovoltaics for rural cottage industry applications appears to be limited to demonstration projects and pilot programs, based on an in-depth study of the nature of cottage industry, its role in the rural economy, the electric energy requirements of cottage industry, and a financial analysis of stand-alone photovoltaic systems as compared to their most viable competitor, diesel-driven generators. Photovoltaics are shown to be a better long-term option only for very low power requirements. Some of these uses would include clay mixers, grinders, centrifuges, lathes, power saws, and lighting of a workshop.

  11. Assessment of groundwater exploitation in an aquifer using the random walk on grid method: a case study at Ordos, China

    NASA Astrophysics Data System (ADS)

    Nan, Tongchao; Li, Kaixuan; Wu, Jichun; Yin, Lihe

    2018-04-01

    Sustainability has been one of the key criteria for effective water exploitation. Groundwater exploitation and water-table decline at the Haolebaoji water source site in the Ordos basin in NW China have drawn public attention due to concerns about potential threats to ecosystems and grazing land in the area. To better investigate the impact of production wells at Haolebaoji on the water table, an adapted algorithm called the random walk on grid method (WOG) is applied to simulate the hydraulic head in the unconfined and confined aquifers. This is the first attempt to apply WOG to a real groundwater problem. The method evaluates not only the head values but also the contributions made by each source/sink term, so one can analyze the impact of source/sink terms just as if one had an analytical solution. The head values evaluated by WOG match the values derived from the software Groundwater Modeling System (GMS). This suggests that WOG is effective and applicable to practical problems in a heterogeneous aquifer, and the resultant information is useful for groundwater management.
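
    To make the "contributions made by each source/sink term" idea concrete, here is an illustrative random-walk-on-grid sketch for a simple 2D confined aquifer, not the published WOG code: each walk accumulates source/sink increments node by node until it hits a Dirichlet boundary, so the head estimate splits naturally into a boundary part and a per-source part. Grid size, transmissivity, recharge, and pumping values are assumed for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    N = 21                      # hypothetical N x N finite-difference grid
    dx = 100.0                  # m, grid spacing
    T = 500.0                   # m^2/day, assumed transmissivity
    boundary_head = 50.0        # m, fixed head on all boundaries

    recharge = np.full((N, N), 1e-3)          # m/day, areal recharge (source term)
    pumping = np.zeros((N, N))
    pumping[10, 10] = -0.05                   # m/day equivalent sink at a well node

    def head_estimate(start, n_walks=20000):
        """Monte Carlo head estimate at one node, split by contributing term."""
        contrib = {"boundary": 0.0, "recharge": 0.0, "pumping": 0.0}
        for _ in range(n_walks):
            i, j = start
            while 0 < i < N - 1 and 0 < j < N - 1:
                # 5-point stencil: h0 = mean(neighbours) + dx^2 * W / (4 T)
                contrib["recharge"] += dx**2 * recharge[i, j] / (4.0 * T)
                contrib["pumping"] += dx**2 * pumping[i, j] / (4.0 * T)
                di, dj = [(1, 0), (-1, 0), (0, 1), (0, -1)][rng.integers(4)]
                i, j = i + di, j + dj
            contrib["boundary"] += boundary_head
        return {k: v / n_walks for k, v in contrib.items()}

    parts = head_estimate((5, 5))
    print(parts, "total head:", sum(parts.values()))
    ```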

  12. Enumeration, isolation and identification of diazotrophs from Korean wetland rice varieties grown with long-term application of N and compost and their short-term inoculation effect on rice plants.

    PubMed

    Muthukumarasamy, R; Kang, U G; Park, K D; Jeon, W-T; Park, C Y; Cho, Y S; Kwon, S-W; Song, J; Roh, D-H; Revathi, G

    2007-04-01

    This study aimed (i) to isolate and identify diazotrophs from Korean rice varieties; (ii) to examine the long-term effect of N and compost on the population dynamics of diazotrophs and (iii) to assess the short-term inoculation effect of these diazotrophs on rice seedlings. Diazotrophic and heterotrophic bacterial numbers were enumerated by the most probable number method, and the isolates were identified based on morphological, physiological, biochemical and 16S rDNA sequence analysis. Long-term application of fertilizer N with compost enhanced both these numbers in rice plants and their environment. Bacterial numbers were high when malate and azelaic acid were used as carbon sources, but lower when sucrose was used as a carbon substrate. The combined application promoted the association of diazotrophic bacteria like Azospirillum spp., Herbaspirillum spp., Burkholderia spp., Gluconacetobacter diazotrophicus and Pseudomonas spp. with wetland rice plants. Detection of nifD genes from different diazotrophic isolates indicated their nitrogen-fixing ability. Inoculation of a representative isolate from each group onto rice seedlings of the variety IR 36 grown in test tubes indicated a positive effect of these diazotrophs on the growth of rice seedlings, though the percentage of N present in the plants did not differ much. Application of compost with fertilizer N promoted diazotrophic and heterotrophic bacterial numbers and their association with wetland rice and its environment. Compost application in highly N-fertilized fields would avert the reduction of N2-fixing bacterial numbers, and their association was beneficial to the growth of rice plants. The inhibitory effect of high N fertilization on diazotrophic bacterial numbers could be reduced by the application of compost, and this observation would encourage greater usage of organic manure. This study also throws light on the wider geographic distribution of G. diazotrophicus with wetland rice in temperate regions where sugarcane (from which this bacterium was first reported, and thereafter from other plant species) is not cultivated.

  13. Feature extraction applied to agricultural crops as seen by LANDSAT

    NASA Technical Reports Server (NTRS)

    Kauth, R. J.; Lambeck, P. F.; Richardson, W.; Thomas, G. S.; Pentland, A. P. (Principal Investigator)

    1979-01-01

    The physical interpretation of the spectral-temporal structure of LANDSAT data can be conveniently described in terms of a graphic descriptive model called the Tasseled Cap. This model has been a source of development not only in crop-related feature extraction, but also for data screening and for haze effects correction. Following its qualitative description and an indication of its applications, the model is used to analyze several feature extraction algorithms.

  14. High Efficiency Variable Speed Versatile Power Air Conditioning System

    DTIC Science & Technology

    2013-08-08

    Design concept applicable to a wide range of HVAC and refrigeration systems: one TXV size can be used for a wide range of cooling capacities; versatile, able to run from AC and DC sources; cooling-load adaptive, variable speed; fully operable up to 140 degrees Fahrenheit.

  15. Integrated optics technology study

    NASA Technical Reports Server (NTRS)

    Chen, B.; Findakly, T.; Innarella, R.

    1982-01-01

    The status and near-term potential of materials and processes available for the fabrication of single-mode integrated electro-optical components are discussed. Issues discussed are host material and orientation, waveguide formation, optical loss mechanisms, wavelength selection, polarization effects and control, laser-to-integrated-optics coupling, fiber-optic-waveguide-to-integrated-optics coupling, sources, and detectors. Recommendations of the best materials, technology, and processes for fabrication of integrated optical components for communications and fiber gyro applications are given.

  16. Birth Cohort and the Black-White Achievement Gap: The Roles of Access and Health Soon After Birth. WP2008-20

    ERIC Educational Resources Information Center

    Chay, Kenneth Y.; Guryan, Jonathan; Mazumder, Bhashkar

    2009-01-01

    One literature documents a significant, black-white gap in average test scores, while another finds a substantial narrowing of the gap during the 1980's, and stagnation in convergence after. We use two data sources--the Long Term Trends NAEP and AFQT scores for the universe of applicants to the U.S. military between 1976 and 1991--to show: (1) the…

  17. Birth Cohort and the Black-White Achievement Gap: The Roles of Access and Health Soon after Birth. NBER Working Paper No. 15078

    ERIC Educational Resources Information Center

    Chay, Kenneth Y.; Guryan, Jonathan; Bhashkar, Mazumder

    2009-01-01

    One literature documents a significant, black-white gap in average test scores, while another finds a substantial narrowing of the gap during the 1980's, and stagnation in convergence after. We use two data sources -- the Long Term Trends NAEP and AFQT scores for the universe of applicants to the U.S. military between 1976 and 1991 -- to show: 1)…

  18. Passive Localization of Multiple Sources Using Widely-Spaced Arrays with Application to Marine Mammals

    DTIC Science & Technology

    2007-09-30

    LONG-TERM GOALS: The goal of our research is to develop systems that use a widely spaced hydrophone array. OBJECTIVES: To contribute to the behavioral ecology of marine mammals by simultaneously tracking multiple vocalizing individuals in space and time.

  19. Screening for Cellulase Encoding Clones in Metagenomic Libraries.

    PubMed

    Ilmberger, Nele; Streit, Wolfgang R

    2017-01-01

    For modern biotechnology there is a steady need to identify novel enzymes. In biotechnological applications, however, enzymes often must function under extreme and nonnatural conditions (i.e., in the presence of solvents, high temperature and/or at extreme pH values). Cellulases have many industrial applications from the generation of bioethanol, a realistic long-term energy source, to the finishing of textiles. These industrial processes require cellulolytic activity under a wide range of pH, temperature, and ionic conditions, and they are usually carried out by mixtures of cellulases. Investigation of the broad diversity of cellulolytic enzymes involved in the natural degradation of cellulose is necessary for optimizing these processes.

  20. On recontamination and directional-bias problems in Monte Carlo simulation of PDF turbulence models

    NASA Technical Reports Server (NTRS)

    Hsu, Andrew T.

    1991-01-01

    Turbulent combustion cannot be simulated adequately by conventional moment-closure turbulence models. The difficulty lies in the fact that the reaction rate is in general an exponential function of the temperature, and the higher-order correlations in conventional moment-closure models of the chemical source term cannot be neglected, making the application of such models impractical. The probability density function (pdf) method offers an attractive alternative: in a pdf model, the chemical source terms are closed and do not require additional models. A grid-dependent Monte Carlo scheme was studied, since it is a logical alternative in which the number of computer operations increases only linearly with the number of independent variables, as compared to the exponential increase in a conventional finite-difference scheme. A new algorithm was devised that satisfies a conservation restriction in the case of pure diffusion or uniform flow problems. Although absolute conservation seems impossible for nonuniform flows, the present scheme has reduced the error considerably.

  1. Application of Geodetic VLBI Data to Obtaining Long-Term Light Curves for Astrophysics

    NASA Technical Reports Server (NTRS)

    Kijima, Masachika

    2010-01-01

    The long-term light curve is important for research on binary black holes and disk instability in AGNs. Light curves have been drawn mainly using single-dish data provided by the University of Michigan Radio Observatory and the Metsahovi Radio Observatory; hence, thus far, research has been restricted to a limited set of sources. I attempt to draw light curves using VLBI data for those sources that have not been monitored by any single-dish observatory. I developed software, analyzed all geodetic VLBI data available at the IVS Data Centers, and drew the light curves at 8 GHz. In this report, I show tentative results for two AGNs. I compared two light curves of 4C39.25, one drawn from single-dish data and one from VLBI data, and confirmed that the two light curves were consistent. Furthermore, I succeeded in drawing the light curve of 0454-234 with VLBI data, a source that has not been monitored by any single-dish observatory. In this report, I suggest that the geodetic VLBI archive data are useful for obtaining long-term light curves at radio bands for astrophysics.

  2. PD5: a general purpose library for primer design software.

    PubMed

    Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

    Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third-party applications, and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third-party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software, and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantees access to source code and allows redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.

  3. Recent Accomplishments in Laser-Photovoltaic Wireless Power Transmission

    NASA Technical Reports Server (NTRS)

    Fikes, John C.; Henley, Mark W.; Mankins, John C.; Howell, Joe T.; Fork, Richard L.; Cole, Spencer T.; Skinner, Mark

    2003-01-01

    Wireless power transmission can be accomplished over long distances using laser power sources and photovoltaic receivers. Recent research at AMOS has improved our understanding of the use of this technology for practical applications. Research by NASA, Boeing, the University of Alabama-Huntsville, the University of Colorado, Harvey Mudd College, and the Naval Postgraduate School has tested various commercial lasers and photovoltaic receiver configurations. Lasers used in testing have included gaseous argon and krypton, solid-state diode, and fiber-optic sources, at wavelengths ranging from the visible to the near infrared. A variety of silicon and gallium arsenide photovoltaic receivers have been tested with these sources. Safe operating procedures have been established, and initial tests have been conducted in the open air at AMOS facilities. This research is progressing toward longer-distance ground demonstrations of the technology and practical near-term space demonstrations.

  4. 77 FR 19740 - Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant Accident

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2010-0249] Water Sources for Long-Term Recirculation Cooling... Regulatory Guide (RG) 1.82, ``Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant... regarding the sumps and suppression pools that provide water sources for emergency core cooling, containment...

  5. History of surgery for atrial fibrillation.

    PubMed

    Edgerton, Zachary J; Edgerton, James R

    2009-12-01

    There is a rich history of surgery for atrial fibrillation. Initial procedures were aimed at controlling the ventricular response rate. Later procedures were directed at converting atrial fibrillation to normal sinus rhythm, culminating in the Cox Maze III procedure. While highly effective, the complexity and morbidity of the cut-and-sew Maze III limited its adoption. Enabling technology has developed alternate energy sources designed to produce a transmural atrial scar without cutting and sewing. Termed the Maze IV, this lessened the morbidity of the procedure and widened its applicability. Further advances in minimal-access techniques are now being developed to allow totally thoracoscopic placement of all the left atrial lesions on the full, beating heart, using alternate energy sources.

  6. Instantaneous and time-averaged dispersion and measurement models for estimation theory applications with elevated point source plumes

    NASA Technical Reports Server (NTRS)

    Diamante, J. M.; Englar, T. S., Jr.; Jazwinski, A. H.

    1977-01-01

    Estimation theory, which originated in guidance and control research, is applied to the analysis of air quality measurements and atmospheric dispersion models to provide reliable area-wide air quality estimates. A method for low dimensional modeling (in terms of the estimation state vector) of the instantaneous and time-average pollutant distributions is discussed. In particular, the fluctuating plume model of Gifford (1959) is extended to provide an expression for the instantaneous concentration due to an elevated point source. Individual models are also developed for all parameters in the instantaneous and the time-average plume equations, including the stochastic properties of the instantaneous fluctuating plume.
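
    For orientation, the standard time-averaged Gaussian plume expression for an elevated point source (with ground reflection) is shown below; the fluctuating-plume model referred to in the abstract generalizes this picture by treating the instantaneous plume as a narrower distribution whose centerline meanders about the mean axis. This is textbook background, not the paper's specific low-dimensional state-vector model.

    ```latex
    % Time-averaged concentration downwind of an elevated point source at effective height H:
    % Q = emission rate, u = mean wind speed, sigma_y, sigma_z = lateral/vertical dispersion parameters.
    \bar{C}(x,y,z) = \frac{Q}{2\pi u \,\sigma_y \sigma_z}
      \exp\!\left(-\frac{y^2}{2\sigma_y^2}\right)
      \left[ \exp\!\left(-\frac{(z-H)^2}{2\sigma_z^2}\right)
           + \exp\!\left(-\frac{(z+H)^2}{2\sigma_z^2}\right) \right]
    ```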

  7. A generalized LSTM-like training algorithm for second-order recurrent neural networks

    PubMed Central

    Monner, Derek; Reggia, James A.

    2011-01-01

    The Long Short Term Memory (LSTM) is a second-order recurrent neural network architecture that excels at storing sequential short-term memories and retrieving them many time-steps later. LSTM's original training algorithm provides the important properties of spatial and temporal locality, which are missing from other training approaches, at the cost of limiting its applicability to a small set of network architectures. Here we introduce the Generalized Long Short-Term Memory (LSTM-g) training algorithm, which provides LSTM-like locality while being applicable without modification to a much wider range of second-order network architectures. With LSTM-g, all units have an identical set of operating instructions for both activation and learning, subject only to the configuration of their local environment in the network; this is in contrast to the original LSTM training algorithm, where each type of unit has its own activation and training instructions. When applied to LSTM architectures with peephole connections, LSTM-g takes advantage of an additional source of back-propagated error which can enable better performance than the original algorithm. Enabled by the broad architectural applicability of LSTM-g, we demonstrate that training recurrent networks engineered for specific tasks can produce better results than single-layer networks. We conclude that LSTM-g has the potential to both improve the performance and broaden the applicability of spatially and temporally local gradient-based training algorithms for recurrent neural networks. PMID:21803542
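
    To make the gating structure behind the "peephole connections" mentioned above concrete, here is a single forward step of a standard LSTM cell with peepholes. This is illustrative background only, not the LSTM-g training algorithm itself, and all shapes and weight names are placeholders.

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x, h_prev, c_prev, p):
        # p holds input weights W_*, recurrent weights U_*, peephole vectors w_*, biases b_*
        i = sigmoid(p["W_i"] @ x + p["U_i"] @ h_prev + p["w_i"] * c_prev + p["b_i"])  # input gate
        f = sigmoid(p["W_f"] @ x + p["U_f"] @ h_prev + p["w_f"] * c_prev + p["b_f"])  # forget gate
        g = np.tanh(p["W_c"] @ x + p["U_c"] @ h_prev + p["b_c"])                      # candidate cell
        c = f * c_prev + i * g                                                        # new cell state
        o = sigmoid(p["W_o"] @ x + p["U_o"] @ h_prev + p["w_o"] * c + p["b_o"])       # output gate peeks at new c
        h = o * np.tanh(c)
        return h, c

    rng = np.random.default_rng(0)
    n_in, n_hid = 4, 3
    params = {k: rng.normal(scale=0.1, size=(n_hid, n_in)) for k in ("W_i", "W_f", "W_c", "W_o")}
    params.update({k: rng.normal(scale=0.1, size=(n_hid, n_hid)) for k in ("U_i", "U_f", "U_c", "U_o")})
    params.update({k: rng.normal(scale=0.1, size=n_hid)
                   for k in ("w_i", "w_f", "w_o", "b_i", "b_f", "b_c", "b_o")})

    h, c = lstm_step(rng.normal(size=n_in), np.zeros(n_hid), np.zeros(n_hid), params)
    print(h, c)
    ```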

  8. Stratified flows with variable density: mathematical modelling and numerical challenges.

    NASA Astrophysics Data System (ADS)

    Murillo, Javier; Navas-Montilla, Adrian

    2017-04-01

    Stratified flows appear in a wide variety of fundamental problems in hydrological and geophysical sciences. They range from hyperconcentrated floods carrying sediment, which cause collapse, landslides and debris flows, to suspended material in turbidity currents where turbulence is a key process. Stratified flows also exhibit variable horizontal density: depending on the case, density varies according to the volumetric concentration of different components or species that can represent transported or suspended materials or soluble substances. Multilayer approaches based on the shallow water equations provide suitable models but are not free from difficulties when moving to the numerical resolution of the governing equations. Considering the variety of temporal and spatial scales, transfer of mass and energy among layers may strongly differ from one case to another. As a consequence, in order to provide accurate solutions, very high order methods of proved quality are demanded. Under these complex scenarios it is necessary to verify that the numerical solution not only provides the expected order of accuracy but also converges to the physically based solution, which is not an easy task. To this purpose, this work focuses on the use of energy-balanced augmented solvers, in particular the Augmented Roe Flux ADER scheme. References: J. Murillo, P. García-Navarro, Wave Riemann description of friction terms in unsteady shallow flows: Application to water and mud/debris floods. J. Comput. Phys. 231 (2012) 1963-2001. J. Murillo, B. Latorre, P. García-Navarro. A Riemann solver for unsteady computation of 2D shallow flows with variable density. J. Comput. Phys. 231 (2012) 4775-4807. A. Navas-Montilla, J. Murillo, Energy balanced numerical schemes with very high order. The Augmented Roe Flux ADER scheme. Application to the shallow water equations, J. Comput. Phys. 290 (2015) 188-218. A. Navas-Montilla, J. Murillo, Asymptotically and exactly energy balanced augmented flux-ADER schemes with application to hyperbolic conservation laws with geometric source terms, J. Comput. Phys. 317 (2016) 108-147. J. Murillo and A. Navas-Montilla, A comprehensive explanation and exercise of the source terms in hyperbolic systems using Roe type solutions. Application to the 1D-2D shallow water equations, Advances in Water Resources 98 (2016) 70-96.
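
    As a reminder of the prototypical hyperbolic system with geometric source terms that the cited energy-balanced augmented Roe/ADER solvers target, the 1D shallow water equations with bed-slope and friction source terms are written below in standard notation (h depth, u velocity, g gravity, z_b bed elevation, S_f friction slope). The multilayer, variable-density systems of the abstract extend this structure.

    ```latex
    \frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} = 0, \qquad
    \frac{\partial (hu)}{\partial t}
      + \frac{\partial}{\partial x}\!\left( h u^{2} + \tfrac{1}{2} g h^{2} \right)
      = -\, g h \frac{\partial z_b}{\partial x} \;-\; g h S_f
    ```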

  9. Simulation of dissolved nutrient export from the Dongjiang river basin with a grid-based NEWS model

    NASA Astrophysics Data System (ADS)

    Rong, Qiangqiang; Su, Meirong; Yang, Zhifeng; Cai, Yanpeng; Yue, Wencong; Dang, Zhi

    2018-06-01

    In this research, a grid-based NEWS model was developed by coupling a geographic information system (GIS) with the Global NEWS model framework. The model was then applied to the Dongjiang River basin to simulate the dissolved nutrient export from this area. The model results showed that the total amounts of dissolved nitrogen and phosphorus exported from the Dongjiang River basin were approximately 27154.87 and 1389.33 t, respectively. About 90% of both loads were in inorganic forms (i.e. dissolved inorganic nitrogen and phosphorus, DIN and DIP). The nutrient export loads were not evenly distributed across the basin: the main-stream watershed of the Dongjiang River basin had the largest DIN and DIP export loads, while the largest dissolved organic nitrogen and phosphorus (DON and DOP) loads were observed in the middle and upper stream watersheds of the basin, respectively. As for the nutrients exported from each subbasin, different sources had different influences on the output of each nutrient form. For the DIN load in each subbasin, fertilizer application, atmospheric deposition and biological fixation were the three main contributors, while eluviation was the most important source for DON. In terms of DIP load, fertilizer application and breeding wastewater were the main contributors, while eluviation and fertilizer application were the two main sources for DOP.

  10. New VLBI2010 scheduling strategies and implications on the terrestrial reference frames.

    PubMed

    Sun, Jing; Böhm, Johannes; Nilsson, Tobias; Krásná, Hana; Böhm, Sigrid; Schuh, Harald

    In connection with the work for the next generation VLBI2010 Global Observing System (VGOS) of the International VLBI Service for Geodesy and Astrometry, a new scheduling package (Vie_Sched) has been developed at the Vienna University of Technology as a part of the Vienna VLBI Software. In addition to the classical station-based approach it is equipped with a new scheduling strategy based on the radio sources to be observed. We introduce different configurations of source-based scheduling options and investigate the implications on present and future VLBI2010 geodetic schedules. By comparison to existing VLBI schedules of the continuous campaign CONT11, we find that the source-based approach with two sources has a performance similar to the station-based approach in terms of number of observations, sky coverage, and geodetic parameters. For an artificial 16 station VLBI2010 network, the source-based approach with four sources provides an improved distribution of source observations on the celestial sphere. Monte Carlo simulations yield slightly better repeatabilities of station coordinates with the source-based approach with two sources or four sources than the classical strategy. The new VLBI scheduling software with its alternative scheduling strategy offers a promising option with respect to applications of the VGOS.

  11. New VLBI2010 scheduling strategies and implications on the terrestrial reference frames

    NASA Astrophysics Data System (ADS)

    Sun, Jing; Böhm, Johannes; Nilsson, Tobias; Krásná, Hana; Böhm, Sigrid; Schuh, Harald

    2014-05-01

    In connection with the work for the next generation VLBI2010 Global Observing System (VGOS) of the International VLBI Service for Geodesy and Astrometry, a new scheduling package (Vie_Sched) has been developed at the Vienna University of Technology as a part of the Vienna VLBI Software. In addition to the classical station-based approach it is equipped with a new scheduling strategy based on the radio sources to be observed. We introduce different configurations of source-based scheduling options and investigate the implications on present and future VLBI2010 geodetic schedules. By comparison to existing VLBI schedules of the continuous campaign CONT11, we find that the source-based approach with two sources has a performance similar to the station-based approach in terms of number of observations, sky coverage, and geodetic parameters. For an artificial 16 station VLBI2010 network, the source-based approach with four sources provides an improved distribution of source observations on the celestial sphere. Monte Carlo simulations yield slightly better repeatabilities of station coordinates with the source-based approach with two sources or four sources than the classical strategy. The new VLBI scheduling software with its alternative scheduling strategy offers a promising option with respect to applications of the VGOS.

  12. Establishing Standards on Colors from Natural Sources.

    PubMed

    Simon, James E; Decker, Eric A; Ferruzzi, Mario G; Giusti, M Monica; Mejia, Carla D; Goldschmidt, Mark; Talcott, Stephen T

    2017-11-01

    Color additives are applied to many food, drug, and cosmetic products. With up to 85% of consumer buying decisions potentially influenced by color, appropriate application of color additives and their safety is critical. Color additives are defined by the U.S. Federal Food, Drug, and Cosmetic Act (FD&C Act) as any dye, pigment, or substance that can impart color to a food, drug, or cosmetic or to the human body. Under current U.S. Food and Drug Administration (FDA) regulations, colors fall into two categories: those subject to an FDA certification process and those exempt from certification, often referred to as "natural" colors by consumers because they are sourced from plants, minerals, and animals. Certified colors have been used for decades in food and beverage products, but consumer interest in natural colors is leading market applications. However, the popularity of natural colors has also opened a door for both unintentional and intentional economic adulteration. Whereas FDA certification of synthetic dyes and lakes involves strict quality control, natural colors are not evaluated by the FDA and often lack clear definitions and industry-accepted quality and safety specifications. A significant risk of adulteration of natural colors exists, ranging from simple misbranding or misuse of the term "natural" on a product label to potentially serious cases of physical, chemical, and/or microbial contamination from raw material sources, improper processing methods, or intentional postproduction adulteration. Consistent industry-wide safety standards are needed to address the manufacturing, processing, application, and international trade of colors from natural sources to ensure quality and safety throughout the supply chain. © 2017 Institute of Food Technologists®.

  13. Source-term development for a contaminant plume for use by multimedia risk assessment models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.

    1999-12-01

    Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world, Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.

  14. Open-Source Assisted Laboratory Automation through Graphical User Interfaces and 3D Printers: Application to Equipment Hyphenation for Higher-Order Data Generation.

    PubMed

    Siano, Gabriel G; Montemurro, Milagros; Alcaráz, Mirta R; Goicoechea, Héctor C

    2017-10-17

    Higher-order data generation implies some automation challenges, which are mainly related to the hidden programming languages and electronic details of the equipment. When techniques and/or equipment hyphenation are the key to obtaining higher-order data, the required simultaneous control of them demands funds for new hardware, software, and licenses, in addition to very skilled operators. In this work, we present Design of Inputs-Outputs with Sikuli (DIOS), a free and open-source program that provides a general framework for the design of automated experimental procedures without prior knowledge of programming or electronics. Basically, instruments and devices are considered as nodes in a network, and every node is associated with both physical and virtual inputs and outputs. Virtual components, such as the graphical user interfaces (GUIs) of equipment, are handled by means of the image recognition tools provided by the Sikuli scripting language, while their physical counterparts are handled using an adapted open-source three-dimensional (3D) printer. Two previously reported experiments of our research group, related to fluorescence matrices derived from kinetics and high-performance liquid chromatography, were adapted to be carried out in a more automated fashion. Satisfactory results, in terms of analytical performance, were obtained. Similarly, the advantages derived from open-source tool assistance could be appreciated, mainly in terms of less operator intervention and cost savings.
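
    The image-recognition style of GUI control that Sikuli scripting enables can be illustrated with a short sketch. It runs inside the SikuliX/Sikuli environment (not as plain CPython), and the PNG screenshots and instrument GUI referenced below are hypothetical placeholders, not DIOS itself.

    ```python
    # Wait for the instrument's acquisition button to appear on screen, then click it.
    wait("acquire_button.png", 30)        # block up to 30 s for the button to become visible
    click("acquire_button.png")           # click located via image recognition

    # Fill in an exposure time in a text field identified by its on-screen appearance.
    click("exposure_field.png")
    type("2.5")                           # Sikuli's type() sends keystrokes to the focused field

    # Only export data if the "run finished" banner shows up within 10 minutes.
    if exists("run_finished_banner.png", 600):
        click("export_csv_button.png")
    ```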

  15. UV fatigue investigations with non-destructive tools in silica

    NASA Astrophysics Data System (ADS)

    Natoli, Jean-Yves; Beaudier, Alexandre; Wagner, Frank R.

    2017-08-01

    A fatigue effect is often observed under multiple laser irradiations, especially in the UV. This decrease of the laser-induced damage threshold (LIDT) is a critical parameter for laser sources with high repetition rates and a need for long operating life, as in space applications at 355 nm. A further challenge is to replace excimer lasers by solid-state laser sources, which requires drastically improving the lifetime of optical materials at 266 nm. The main applications of these sources are material surface nanostructuration, spectroscopy, and medical surgery. In this work we focus on understanding the laser-matter interaction at 266 nm in silica in order to predict the lifetime of components, and we study the parameters linked to these lifetimes to give keys for improvement to material suppliers. In order to study the mechanisms involved in the case of multiple irradiations, an interesting approach is to follow the evolution of fluorescence, so as to observe the first stages of material change just before breakdown. We show that it is sometimes possible to estimate the lifetime of a component from the fluorescence measurement alone, saving time and materials. Moreover, the data from the diagnostics give relevant information for highlighting the "defects" induced by multiple laser irradiations.

  16. Design of HIFU Transducers for Generating Specified Nonlinear Ultrasound Fields.

    PubMed

    Rosnitskiy, Pavel B; Yuldashev, Petr V; Sapozhnikov, Oleg A; Maxwell, Adam D; Kreider, Wayne; Bailey, Michael R; Khokhlova, Vera A

    2017-02-01

    Various clinical applications of high-intensity focused ultrasound have different requirements for the pressure levels and degree of nonlinear waveform distortion at the focus. The goal of this paper is to determine transducer design parameters that produce either a specified shock amplitude in the focal waveform or specified peak pressures while still maintaining quasi-linear conditions at the focus. Multiparametric nonlinear modeling based on the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation with an equivalent source boundary condition was employed. Peak pressures, shock amplitudes at the focus, and corresponding source outputs were determined for different transducer geometries and levels of nonlinear distortion. The results are presented in terms of the parameters of an equivalent single-element spherically shaped transducer. The accuracy of the method and its applicability to cases of strongly focused transducers were validated by comparing the KZK modeling data with measurements and nonlinear full diffraction simulations for a single-element source and arrays with 7 and 256 elements. The results provide look-up data for evaluating nonlinear distortions at the focus of existing therapeutic systems as well as for guiding the design of new transducers that generate specified nonlinear fields.
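
    For reference, the KZK parabolic evolution equation on which the abstract's multiparametric modeling is based is written below in its standard form in retarded time; the three right-hand-side terms represent diffraction, thermoviscous absorption, and quadratic nonlinearity. Symbols follow common usage (p acoustic pressure, c0 sound speed, ρ0 ambient density, δ sound diffusivity, β nonlinearity coefficient) and are given here only as textbook background.

    ```latex
    % KZK equation in retarded time tau = t - z/c_0:
    \frac{\partial^{2} p}{\partial z\,\partial \tau}
      = \frac{c_0}{2}\,\Delta_{\perp} p
      + \frac{\delta}{2 c_0^{3}}\,\frac{\partial^{3} p}{\partial \tau^{3}}
      + \frac{\beta}{2 \rho_0 c_0^{3}}\,\frac{\partial^{2} p^{2}}{\partial \tau^{2}}
    ```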

  17. MultiElec: A MATLAB Based Application for MEA Data Analysis.

    PubMed

    Georgiadis, Vassilis; Stephanou, Anastasis; Townsend, Paul A; Jackson, Thomas R

    2015-01-01

    We present MultiElec, an open source MATLAB based application for data analysis of microelectrode array (MEA) recordings. MultiElec displays an extremely user-friendly graphic user interface (GUI) that allows the simultaneous display and analysis of voltage traces for 60 electrodes and includes functions for activation-time determination, the production of activation-time heat maps with activation time and isoline display. Furthermore, local conduction velocities are semi-automatically calculated along with their corresponding vector plots. MultiElec allows ad hoc signal suppression, enabling the user to easily and efficiently handle signal artefacts and for incomplete data sets to be analysed. Voltage traces and heat maps can be simply exported for figure production and presentation. In addition, our platform is able to produce 3D videos of signal progression over all 60 electrodes. Functions are controlled entirely by a single GUI with no need for command line input or any understanding of MATLAB code. MultiElec is open source under the terms of the GNU General Public License as published by the Free Software Foundation, version 3. Both the program and source code are available to download from http://www.cancer.manchester.ac.uk/MultiElec/.

  18. PLOCAN glider portal: a gateway for useful data management and visualization system

    NASA Astrophysics Data System (ADS)

    Morales, Tania; Lorenzo, Alvaro; Viera, Josue; Barrera, Carlos; José Rueda, María

    2014-05-01

    Nowadays, monitoring ocean behavior and its characteristics involves a wide range of sources able to gather and provide a vast amount of data at different spatio-temporal scales. Multiplatform infrastructures like PLOCAN operate a variety of autonomous Lagrangian and Eulerian devices that collect information and transfer it to land in near-real time. Managing all this data collection in an efficient way is a major issue. Advances in ocean observation technologies, where underwater autonomous gliders play a key role, have improved spatio-temporal resolution, which offers a deeper understanding of the ocean but requires a bigger effort in the data management process. There are general requirements in terms of data management in this kind of environment, such as processing raw data at different levels to obtain valuable information, storing data coherently and providing accurate products to final users according to their specific needs. Managing large amounts of data can certainly be tedious and complex without the right tools and operational procedures; hence automating these tasks through software applications saves time and reduces errors. Moreover, data distribution is highly relevant since scientists tend to assimilate different sources for comparison and validation. The use of web applications has boosted the necessary scientific dissemination. Within this context, PLOCAN has implemented a set of independent but compatible applications to process, store and disseminate information gathered through different oceanographic platforms. These applications have been implemented using open standards, such as HTML and CSS, and open-source software, with Python as the programming language and Django as the web framework. More specifically, a glider application has been developed within the framework of the FP7-GROOM project. Regarding data management, this project focuses on collecting and making available consistent and quality-controlled datasets as well as fostering open access to glider data.

  19. Radiological analysis of plutonium glass batches with natural/enriched boron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rainisch, R.

    2000-06-22

    The disposition of surplus plutonium inventories by the US Department of Energy (DOE) includes the immobilization of certain plutonium materials in a borosilicate glass matrix, also referred to as vitrification. This paper addresses source terms of plutonium masses immobilized in a borosilicate glass matrix where the glass components include both natural boron and enriched boron. The calculated source terms pertain to neutron and gamma source strength (particles per second) and source spectrum changes. The calculated source terms corresponding to natural boron and enriched boron are compared to determine the benefits (decrease in radiation source terms) of using enriched boron. The analysis of plutonium glass source terms shows that a large component of the neutron source terms is due to (a, n) reactions. The Americium-241 and plutonium present in the glass emit alpha particles (a). These alpha particles interact with low-Z nuclides like B-11, B-10, and O-17 in the glass to produce neutrons. The low-Z nuclides are referred to as target particles. The reference glass contains 9.4 wt percent B2O3. Boron-11 was found to strongly support the (a, n) reactions in the glass matrix. B-11 has a natural abundance of over 80 percent. The (a, n) reaction rates for B-10 are lower than for B-11, and the analysis shows that the plutonium glass neutron source terms can be reduced by artificially enriching natural boron with B-10. The natural abundance of B-10 is 19.9 percent. Boron enriched to 96 wt percent B-10 or above can be obtained commercially. Since lower source terms imply lower dose rates to radiation workers handling the plutonium glass materials, it is important to know the achievable decrease in source terms as a result of boron enrichment. Plutonium materials are normally handled in glove boxes with shielded glass windows, and the work entails both extremity and whole-body exposures. Lowering the source terms of the plutonium batches will make the handling of these materials less difficult and will reduce radiation exposure to operating workers.
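
    The abundance-weighting arithmetic behind the enrichment argument can be sketched as follows. The relative per-alpha (a, n) yields used here are placeholder numbers chosen only to show the calculation's structure; they are NOT evaluated nuclear data and do not come from the abstract.

    ```python
    # Illustrative only: boron-driven (alpha,n) source scales with the abundance-weighted
    # per-alpha yield, so raising the B-10 fraction lowers the source when B-11 yields more.
    YIELD_B11 = 1.0     # neutrons per alpha on B-11 (normalized, hypothetical)
    YIELD_B10 = 0.4     # neutrons per alpha on B-10 (normalized, hypothetical)

    def relative_alpha_n_source(b10_fraction):
        """Abundance-weighted (alpha,n) yield of the boron in the glass, per alpha particle."""
        return b10_fraction * YIELD_B10 + (1.0 - b10_fraction) * YIELD_B11

    natural = relative_alpha_n_source(0.199)   # natural boron: 19.9 at.% B-10
    enriched = relative_alpha_n_source(0.96)   # commercially available 96 at.% B-10

    print(f"boron-driven (alpha,n) source reduced by a factor of {natural / enriched:.2f}")
    ```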

  20. Reuse of Winery Wastewater by Application to Vineyard Soils

    NASA Astrophysics Data System (ADS)

    Mosse, K. P.; Patti, A. F.; Parikh, S.; Steenwerth, K. L.; Buelow, M. C.; Cavagnaro, T. R.

    2010-12-01

    The ability to reuse winery wastewater (WWW) has potential benefits both with respect to treatment of a waste stream and in providing a beneficial water resource in water-limited regions such as south-eastern Australia, California and South Africa. Our study in south-eastern Australia and California has focused on characterizing the physicochemical properties and microbial communities of soils following WWW application. Studies in the Yarra Valley, Victoria, Australia considered the effect of a single WWW application on paired soil sites, one of which was acclimatized to WWW application via 30 years of this practice, and the other of which was not. Soils that had received WWW appear to have a primed microbial population, with soil respiration showing a significantly greater spike following the single WWW application. In addition, the nitrate and ammonium spikes were altered at the acclimatized site. Taken together, this information suggests that long-term WWW application causes an alteration to the microbial community, which may be more readily able to assimilate the carbon and nitrogen sources present in WWW. Studies are currently underway to assess the impacts of the application of a synthetic WWW on vineyard soils in Davis, California. In this study, four different synthetic WWWs are being applied as irrigation water, and soil will be sampled at the time of grape harvest. Results from this ongoing work will be presented with a view to informing long-term vineyard management for sustainability.

  1. SU-E-T-254: Development of a HDR-BT QA Tool for Verification of Source Position with Oncentra Applicator Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumazaki, Y; Miyaura, K; Hirai, R

    2015-06-15

    Purpose: To develop a High Dose Rate Brachytherapy (HDR-BT) quality assurance (QA) tool for verification of source position with Oncentra applicator modeling, and to report the results of radiation source positions with this tool. Methods: We developed a HDR-BT QA phantom and automated analysis software for verification of source position with Oncentra applicator modeling for the Fletcher applicator used in the MicroSelectron HDR system. This tool is intended for end-to-end tests that mimic the clinical 3D image-guided brachytherapy (3D-IGBT) workflow. The phantom is a 30 x 30 x 3 cm cuboid with radiopaque markers, which are inserted into the phantom to evaluate applicator tips and reference source positions; positions are laterally shifted 10 mm from the applicator axis. The markers are lead-based and scatter radiation to expose the films. Gafchromic RTQA2 films are placed on the applicators. The phantom includes spaces to embed the applicators. The source position is determined as the distance between the exposed source position and the center position of the two pairs of the first radiopaque markers. We generated a 3D-IGBT plan with applicator modeling. The first source position was 6 mm from the applicator tips, and the second source position was 10 mm from the first source position. Results: All source positions were consistent with the exposed positions within 1 mm for all Fletcher applicators using the in-house software. Moreover, the distance between source positions was in good agreement with the reference distance. The applicator offset, determined as the distance from the applicator tips at the first source position in the treatment planning system, was accurate. Conclusion: Source position accuracy of the applicator modeling used in 3D-IGBT was acceptable. This phantom and software will be useful as a HDR-BT QA tool for verification of source position with Oncentra applicator modeling.
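
    A minimal sketch of the geometric check described above: compare each exposed dwell position digitized from the film with the planned position derived from the midpoint of the first radiopaque marker pair, using the 10 mm planned spacing and a 1 mm tolerance. The coordinates below are hypothetical and the sketch is not the in-house software.

    ```python
    import numpy as np

    TOLERANCE_MM = 1.0
    PLANNED_STEP_MM = 10.0                                  # planned spacing between dwell positions

    marker_pair = np.array([[0.0, 10.0], [0.0, -10.0]])     # first marker pair in film coordinates (mm)
    reference = marker_pair.mean(axis=0)                    # midpoint = reference dwell position

    exposed_positions = np.array([[0.3, 0.2],               # digitized film exposures (mm), hypothetical
                                  [10.4, -0.1],
                                  [20.1, 0.5]])

    for k, pos in enumerate(exposed_positions):
        planned = reference + np.array([k * PLANNED_STEP_MM, 0.0])
        deviation = np.linalg.norm(pos - planned)
        status = "PASS" if deviation <= TOLERANCE_MM else "FAIL"
        print(f"dwell {k + 1}: deviation {deviation:.2f} mm -> {status}")
    ```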

  2. Where to Publish and Find Ontologies? A Survey of Ontology Libraries

    PubMed Central

    d'Aquin, Mathieu; Noy, Natalya F.

    2011-01-01

    One of the key promises of the Semantic Web is its potential to enable and facilitate data interoperability. The ability of data providers and application developers to share and reuse ontologies is a critical component of this data interoperability: if different applications and data sources use the same set of well defined terms for describing their domain and data, it will be much easier for them to “talk” to one another. Ontology libraries are the systems that collect ontologies from different sources and facilitate the tasks of finding, exploring, and using these ontologies. Thus ontology libraries can serve as a link in enabling diverse users and applications to discover, evaluate, use, and publish ontologies. In this paper, we provide a survey of the growing—and surprisingly diverse—landscape of ontology libraries. We highlight how the varying scope and intended use of the libraries affects their features, content, and potential exploitation in applications. From reviewing eleven ontology libraries, we identify a core set of questions that ontology practitioners and users should consider in choosing an ontology library for finding ontologies or publishing their own. We also discuss the research challenges that emerge from this survey, for the developers of ontology libraries to address. PMID:22408576

  3. A computational geometry approach to pore network construction for granular packings

    NASA Astrophysics Data System (ADS)

    van der Linden, Joost H.; Sufian, Adnan; Narsilio, Guillermo A.; Russell, Adrian R.; Tordesillas, Antoinette

    2018-03-01

    Pore network construction provides the ability to characterize and study the pore space of inhomogeneous and geometrically complex granular media in a range of scientific and engineering applications. Various approaches to the construction have been proposed, however subtle implementational details are frequently omitted, open access to source code is limited, and few studies compare multiple algorithms in the context of a specific application. This study presents, in detail, a new pore network construction algorithm, and provides a comprehensive comparison with two other, well-established Delaunay triangulation-based pore network construction methods. Source code is provided to encourage further development. The proposed algorithm avoids the expensive non-linear optimization procedure in existing Delaunay approaches, and is robust in the presence of polydispersity. Algorithms are compared in terms of structural, geometrical and advanced connectivity parameters, focusing on the application of fluid flow characteristics. Sensitivity of the various networks to permeability is assessed through network (Stokes) simulations and finite-element (Navier-Stokes) simulations. Results highlight strong dependencies of pore volume, pore connectivity, throat geometry and fluid conductance on the degree of tetrahedra merging and the specific characteristics of the throats targeted by the merging algorithm. The paper concludes with practical recommendations on the applicability of the three investigated algorithms.
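
    A minimal sketch of the Delaunay-based style of pore network construction discussed above (illustrative only, not the authors' merging algorithm): each tetrahedron of the triangulation of sphere centres becomes a pore body, and each face shared by two neighbouring tetrahedra becomes a throat. The random point cloud stands in for real sphere centres.

    ```python
    import numpy as np
    from scipy.spatial import Delaunay

    rng = np.random.default_rng(1)
    centres = rng.random((60, 3))            # hypothetical sphere centres in a unit cube

    tri = Delaunay(centres)
    pores = tri.simplices                    # (n_tet, 4) vertex indices: one pore per tetrahedron

    throats = set()
    for tet, nbrs in enumerate(tri.neighbors):   # neighbors[i, j] = tet sharing the face opposite vertex j
        for nbr in nbrs:
            if nbr != -1:                        # -1 marks a face on the convex hull (no throat)
                throats.add(tuple(sorted((tet, nbr))))

    coordination = np.zeros(len(pores), int)
    for a, b in throats:
        coordination[a] += 1
        coordination[b] += 1

    print(f"{len(pores)} pores, {len(throats)} throats, "
          f"mean pore coordination {coordination.mean():.2f}")
    ```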

  4. Solar-powered irrigation systems. Technical progress report, July 1977--January 1978

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1978-02-28

    Dispersed solar thermal power systems applied to farm irrigation energy needs are analyzed. The 17 western states, containing 84% of nationwide irrigated croplands and consuming 93% of nationwide irrigation energy, have been selected to determine where solar irrigation systems can compete most favorably with conventional energy sources. Financial analysis of farms, according to size and ownership, was accomplished to permit realistic comparative analyses of system lifetime costs. Market potential of optimized systems has been estimated for the 17-state region for near-term (1985) and intermediate-term (2000) applications. Technical, economic, and institutional factors bearing on penetration and capture of this market are being identified.

  5. [Personalization in the medicine of the future : Opportunities and risks].

    PubMed

    Malek, N P

    2017-07-01

    Personalized medicine is not a new concept. The renaissance of the term is due to the enormous progress in gene sequencing technology and functional imaging, as well as the development of targeted therapies. Application of these technologies in clinical medicine will necessitate infrastructural as well as organizational and educational changes in the healthcare system. An important change required already in the short-term is the introduction of centralized structures, preferably in university clinics, which adopt these innovations and incorporate them into clinical care. Simultaneously, the collation and use of large quantities of relevant data from highly variable sources must be successfully mastered, in order to pave the way for disruptive technologies such as artificial intelligence.

  6. Toward a common language for biobanking.

    PubMed

    Fransson, Martin N; Rial-Sebbag, Emmanuelle; Brochhausen, Mathias; Litton, Jan-Eric

    2015-01-01

    To encourage the process of harmonization, the biobank community should support and use a common terminology. Relevant terms may be found in general thesauri for medicine, legal instruments or specific glossaries for biobanking. A comparison of the use of these sources has so far not been conducted and would be a useful instrument to further promote harmonization and data sharing. Thus, the purpose of the present study was to investigate preferences among definitions important for sharing biological samples and data. Definitions for 10 terms ([human] biobank, sample/specimen, sample collection, study, aliquot, coded, identifying information, anonymised, personal data and informed consent) were collected from several sources. A web-based questionnaire was sent to 560 European individuals working with biobanks, asking them to select their preferred definition for each term. A total of 123 people participated in the survey, giving a response rate of 23%. The results were evaluated from four aspects, guided by comments from responders: scope of definitions, potential regional differences, differences in semantics, and definitions in the context of ontologies. Indicative from the survey is the risk that definitions focus only on the research aspect of biobanking. Hence, it is recommended that important terms be formulated in such a way that all areas of biobanking are covered, to improve the bridges between research and clinical application. Since several of the terms investigated here can also be found in legal contexts, which may differ between countries, establishing how a definition adheres to the law is also crucial.

  7. Radiation dosimetry properties of smartphone CMOS sensors.

    PubMed

    Van Hoey, Olivier; Salavrakos, Alexia; Marques, Antonio; Nagao, Alexandre; Willems, Ruben; Vanhavere, Filip; Cauwels, Vanessa; Nascimento, Luana F

    2016-03-01

    During the past years, several smartphone applications have been developed for radiation detection. These applications measure radiation using the smartphone camera complementary metal-oxide-semiconductor sensor. They are potentially useful for data collection and personal dose assessment in case of a radiological incident. However, it is important to assess these applications. Six applications were tested by means of irradiations with calibrated X-ray and gamma sources. It was shown that the measurement stabilises only after at least 10-25 min. All applications exhibited a flat dose rate response in the studied ambient dose equivalent range from 2 to 1000 μSv/h. Most applications significantly over- or underestimate the dose rate or are not calibrated in terms of dose rate. A considerable energy dependence was observed below 100 keV but not for the higher energy range more relevant for incident scenarios. Photon impact angle variation gave a measured signal variation of only about 10%. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  8. Fission Product Appearance Rate Coefficients in Design Basis Source Term Determinations - Past and Present

    NASA Astrophysics Data System (ADS)

    Perez, Pedro B.; Hamawi, John N.

    2017-09-01

    Nuclear power plant radiation protection design features are based on radionuclide source terms derived from conservative assumptions that envelope expected operating experience. Two parameters that significantly affect the radionuclide concentrations in the source term are the failed fuel fraction and the effective fission product appearance rate coefficients. The failed fuel fraction may be a regulatory assumption, as in the U.S. Appearance rate coefficients are not specified in regulatory requirements, but have been referenced to experimental data that is over 50 years old. No doubt the source terms are conservative, as demonstrated by operating experience that has included failed fuel, but they may be too conservative, leading, for example, to over-designed shielding for normal operations. Design basis source term methodologies for normal operations had not advanced until EPRI published an updated ANSI/ANS 18.1 source term basis document in 2015. Our paper revisits the fission product appearance rate coefficients as applied in the derivation of source terms following the original U.S. NRC NUREG-0017 methodology. New coefficients have been calculated based on recent EPRI results, which demonstrate the conservatism in nuclear power plant shielding design.

  9. LPE grown LSO:Tb scintillator films for high-resolution X-ray imaging applications at synchrotron light sources

    NASA Astrophysics Data System (ADS)

    Cecilia, A.; Rack, A.; Douissard, P.-A.; Martin, T.; Dos Santos Rolo, T.; Vagovič, P.; Hamann, E.; van de Kamp, T.; Riedel, A.; Fiederle, M.; Baumbach, T.

    2011-08-01

    Within the project ScinTAX of the 6th framework program (FP6) of the European Commission (SCINTAX—STRP 033 427) we have developed a new thin single-crystal scintillator for high-resolution X-ray imaging. The scintillator is based on a Tb-doped Lu2SiO5 (LSO) film epitaxially grown on an adapted substrate. The high density, effective atomic number and light yield of the scintillating LSO significantly improve the efficiency of the X-ray imaging detectors currently used in synchrotron micro-imaging applications. In this work we present the characterization of the scintillating LSO films in terms of their spatial resolution performance and we provide two examples of high spatial and high temporal resolution applications.

  10. Exopolysaccharides enriched in rare sugars: bacterial sources, production, and applications

    PubMed Central

    Roca, Christophe; Alves, Vitor D.; Freitas, Filomena; Reis, Maria A. M.

    2015-01-01

    Microbial extracellular polysaccharides (EPS), produced by a wide range of bacteria, are high molecular weight biopolymers, presenting an extreme diversity in terms of chemical structure and composition. They may be used in many applications, depending on their chemical and physical properties. A rather unexplored aspect is the presence of rare sugars in the composition of some EPS. Rare sugars, such as rhamnose or fucose, may provide EPS with additional biological properties compared to those composed of more common sugar monomers. This review gives a brief overview of these specific EPS and their producing bacteria. Cultivation conditions are summarized, demonstrating their impact on the EPS composition, together with downstream processing. Finally, their use in different areas, including cosmetics, food products, pharmaceuticals, and biomedical applications, are discussed. PMID:25914689

  11. Aerosol Mapping From Space: Strengths, Limitations, and Applications

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph

    2010-01-01

    The aerosol data products from the NASA Earth Observing System's MISR and MODIS instruments provide significant advances in regional and global aerosol optical depth (AOD) mapping, aerosol type measurement, and source plume characterization from space. These products have been and are being used for many applications, ranging from regional air quality assessment, to aerosol air mass type identification and evolution, to wildfire smoke injection height and aerosol transport model validation. However, retrieval uncertainties and coverage gaps still limit the quantitative constraints these satellite data place on some important questions, such as global-scale long-term trends and direct aerosol radiative forcing. Major advances in these areas seem to require a different paradigm, involving the integration of satellite with suborbital data and with models. This presentation will briefly summarize where we stand, and what incremental improvements we can expect, with the current MISR and MODIS aerosol products, and will then elaborate on some initial steps aimed at the necessary integration of satellite data with data from other sources and with chemical transport models.

  12. Innovative Alternative Technologies to Extract Carotenoids from Microalgae and Seaweeds

    PubMed Central

    Poojary, Mahesha M.; Barba, Francisco J.; Aliakbarian, Bahar; Donsì, Francesco; Pataro, Gianpiero; Dias, Daniel A.; Juliano, Pablo

    2016-01-01

    Marine microalgae and seaweeds (macroalgae) represent a sustainable source of various bioactive natural carotenoids, including β-carotene, lutein, astaxanthin, zeaxanthin, violaxanthin and fucoxanthin. Recently, the large-scale production of carotenoids from algal sources has gained significant interest with respect to commercial and industrial applications for health, nutrition, and cosmetic applications. Although conventional processing technologies, based on solvent extraction, offer a simple approach to isolating carotenoids, they suffer several inherent limitations, including low efficiency (extraction yield), selectivity (purity), high solvent consumption, and long treatment times, which have led to advancements in the search for innovative extraction technologies. This comprehensive review summarizes the recent trends in the extraction of carotenoids from microalgae and seaweeds through the assistance of different innovative techniques, such as pulsed electric fields, liquid pressurization, supercritical fluids, subcritical fluids, microwaves, ultrasounds, and high-pressure homogenization. In particular, the review critically analyzes technologies, characteristics, advantages, and shortcomings of the different innovative processes, highlighting the differences in terms of yield, selectivity, and economic and environmental sustainability. PMID:27879659

  13. Biosurfactant production through Bacillus sp. MTCC 5877 and its multifarious applications in food industry.

    PubMed

    Anjum, Farhan; Gautam, Gunjan; Edgard, Gnansounou; Negi, Sangeeta

    2016-08-01

    In this study, Bacillus sp. MTCC 5877 was explored for the production of biosurfactants (BSs); various carbon sources (1%, w/v) and nitrogen sources (0.5%, w/v) were tested at different pH values and temperatures. Yield was measured in terms of the emulsification index (EI), oil displacement area (ODA) and drop collapse area (DCA); the maximum emulsification activities (E24) of the BSs were 50%, 76% and 46%, respectively, and the maximum ODA values were 5.0, 6.2 and 4.7 cm, respectively. The BS was able to reduce the surface tension of water from 72 to 30 mN/m and from 72 to 32 mN/m. The structural composition of the BS was confirmed by FTIR, GC-MS and NMR. The anti-adhesive property of the BS was determined and found effective against biofilm formation. It could remove 73% of Cd from vegetables, which supports its application in the food industry. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Low Data Drug Discovery with One-Shot Learning

    PubMed Central

    2017-01-01

    Recent advances in machine learning have made significant contributions to drug discovery. Deep neural networks in particular have been demonstrated to provide significant boosts in predictive power when inferring the properties and activities of small-molecule compounds (Ma, J. et al. J. Chem. Inf. Model. 2015, 55, 263–274). However, the applicability of these techniques has been limited by the requirement for large amounts of training data. In this work, we demonstrate how one-shot learning can be used to significantly lower the amounts of data required to make meaningful predictions in drug discovery applications. We introduce a new architecture, the iterative refinement long short-term memory, that, when combined with graph convolutional neural networks, significantly improves learning of meaningful distance metrics over small-molecules. We open source all models introduced in this work as part of DeepChem, an open-source framework for deep-learning in drug discovery (Ramsundar, B. deepchem.io. https://github.com/deepchem/deepchem, 2016). PMID:28470045
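
    The record above relies on a learned distance metric over molecular embeddings so that a handful of labelled compounds can classify new ones. The toy numpy sketch below is a hedged stand-in for that core inference step (it is not the paper's iterative refinement LSTM or the DeepChem API): a query is labelled by similarity-weighted votes from a small support set. The random embeddings are placeholders for learned molecular representations.

        # Toy one-shot prediction from a tiny support set (illustrative only).
        # A query embedding is classified by cosine-similarity-weighted votes over support embeddings.
        import numpy as np

        def cosine(a, b):
            return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

        rng = np.random.default_rng(1)
        support_x = rng.normal(size=(4, 16))   # 4 labelled molecules, 16-dim embeddings (placeholders)
        support_y = np.array([0, 0, 1, 1])     # binary activity labels
        query_x = rng.normal(size=16)          # one unlabelled molecule

        sims = np.array([cosine(query_x, s) for s in support_x])
        weights = np.exp(sims) / np.exp(sims).sum()    # softmax attention over the support set
        p_active = weights[support_y == 1].sum()       # probability mass assigned to label 1
        print(f"predicted P(active) = {p_active:.2f}")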

  15. Innovative Alternative Technologies to Extract Carotenoids from Microalgae and Seaweeds.

    PubMed

    Poojary, Mahesha M; Barba, Francisco J; Aliakbarian, Bahar; Donsì, Francesco; Pataro, Gianpiero; Dias, Daniel A; Juliano, Pablo

    2016-11-22

    Marine microalgae and seaweeds (macroalgae) represent a sustainable source of various bioactive natural carotenoids, including β-carotene, lutein, astaxanthin, zeaxanthin, violaxanthin and fucoxanthin. Recently, the large-scale production of carotenoids from algal sources has gained significant interest with respect to commercial and industrial applications for health, nutrition, and cosmetic applications. Although conventional processing technologies, based on solvent extraction, offer a simple approach to isolating carotenoids, they suffer several inherent limitations, including low efficiency (extraction yield), selectivity (purity), high solvent consumption, and long treatment times, which have led to advancements in the search for innovative extraction technologies. This comprehensive review summarizes the recent trends in the extraction of carotenoids from microalgae and seaweeds through the assistance of different innovative techniques, such as pulsed electric fields, liquid pressurization, supercritical fluids, subcritical fluids, microwaves, ultrasounds, and high-pressure homogenization. In particular, the review critically analyzes technologies, characteristics, advantages, and shortcomings of the different innovative processes, highlighting the differences in terms of yield, selectivity, and economic and environmental sustainability.

  16. Direct design of aspherical lenses for extended non-Lambertian sources in three-dimensional rotational geometry

    PubMed Central

    Wu, Rengmao; Hua, Hong

    2016-01-01

    Illumination design used to redistribute the spatial energy distribution of a light source is a key technique in lighting applications. However, there is still no effective illumination design method for extended sources, especially for extended non-Lambertian sources. What we present here is, to our knowledge, the first direct method for extended non-Lambertian sources in three-dimensional (3D) rotational geometry. In this method, both meridional rays and skew rays of the extended source are taken into account to tailor the lens profile in the meridional plane. A set of edge rays and interior rays emitted from the extended source which will take a given direction after refraction by the aspherical lens are found using Snell's law, and the output intensity in this direction is then calculated as the integral of the luminance function of the outgoing rays in this direction. This direct method is effective for both extended non-Lambertian sources and extended Lambertian sources in 3D rotational symmetry, and can directly find a solution to the prescribed design problem without cumbersome iterative illuminance compensation. Two examples are presented to demonstrate the effectiveness of the proposed method in terms of performance and capacity for tackling complex designs. PMID:26832484
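
    The design above traces edge and interior rays through the lens surface using Snell's law. The short Python sketch below shows the vector form of that refraction step, which is a generic ingredient of such ray tracing rather than the authors' design code; the incident direction, surface normal and PMMA-like refractive index are illustrative assumptions.

        # Vector form of Snell's law used when tracing a ray through a refracting surface
        # (illustrative ingredient only; the indices and geometry below are assumed).
        import numpy as np

        def refract(d, n, n1, n2):
            """Refract unit direction d at a surface with unit normal n pointing toward the incident side."""
            mu = n1 / n2
            cos_i = -d @ n
            sin2_t = mu**2 * (1.0 - cos_i**2)
            if sin2_t > 1.0:
                return None                      # total internal reflection
            cos_t = np.sqrt(1.0 - sin2_t)
            return mu * d + (mu * cos_i - cos_t) * n

        d = np.array([0.0, np.sin(np.radians(20)), -np.cos(np.radians(20))])  # incident ray direction
        n = np.array([0.0, 0.0, 1.0])                                         # surface normal
        print(refract(d, n, n1=1.0, n2=1.49))    # air into a PMMA-like lens material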

  17. Towards an advanced e-Infrastructure for Civil Protection applications: Research Strategies and Innovation Guidelines

    NASA Astrophysics Data System (ADS)

    Mazzetti, P.; Nativi, S.; Verlato, M.; Angelini, V.

    2009-04-01

    In the context of the EU co-funded project CYCLOPS (http://www.cyclops-project.eu) the problem of designing an advanced e-Infrastructure for Civil Protection (CP) applications has been addressed. As a preliminary step, studies of European CP systems and operational applications were performed in order to define their specific system requirements. At a higher level it was verified that CP applications are usually conceived to map CP Business Processes involving different levels of processing, including data access, data processing, and output visualization. At their core they usually run one or more Earth Science models for information extraction. The traditional approach based on the development of monolithic applications presents some limitations related to flexibility (e.g. the possibility of running the same models with different input data sources, or different models with the same data sources) and scalability (e.g. launching several runs for different scenarios, or implementing more accurate and computing-demanding models). Flexibility can be addressed by adopting a modular design based on a SOA and standard services and models, such as OWS and ISO for geospatial services. Distributed computing and storage solutions can improve scalability. Based on these considerations, an architectural framework has been defined. It consists of a Web Service layer, providing advanced services for CP applications (e.g. standard geospatial data sharing and processing services), working on an underlying Grid platform. This framework has been tested through the development of prototypes as proofs of concept. These theoretical studies and proofs of concept demonstrated that, although Grid and geospatial technologies can provide significant benefits to CP applications in terms of scalability and flexibility, current platforms are designed around requirements different from those of CP. In particular, CP applications have strict requirements in terms of: a) real-time capabilities, privileging time of response over accuracy; b) security services to support complex data policies and trust relationships; c) interoperability with existing or planned infrastructures (e.g. e-Government, INSPIRE-compliant, etc.). These requirements are, in fact, the main reason why CP applications differ from Earth Science applications. Therefore further research is required to design and implement an advanced e-Infrastructure satisfying these specific requirements. In particular, five themes requiring further research were identified: Grid Infrastructure Enhancement, Advanced Middleware for CP Applications, Security and Data Policies, CP Applications Enablement, and Interoperability. For each theme several research topics were proposed and detailed. They are targeted at solving specific problems for the implementation of an effective operational European e-Infrastructure for CP applications.

  18. Wartime Tracking of Class I Surface Shipments from Production or Procurement to Destination

    DTIC Science & Technology

    1992-04-01


  19. Large Eddy Simulations of Supercritical Mixing Layers for Air Force Applications

    DTIC Science & Technology

    2010-05-01

    Only fragments of this report are recoverable from the extracted record: a table of species properties (molar mass, critical temperature, critical pressure) listing N2 (28.013 g/mol, 126.3 K, 3.399 MPa) and C7H16 (100.205 g/mol, ...); a statement that the source term of the entropy equation, called the irreversible entropy production, is by definition the dissipation [3] and contains the full extent ...; and a remark that similar or even larger gradient magnitudes occur under fully turbulent conditions (the experimental data were for Re = O(10^4)-O(10^5)).

  20. Exploring the Fundamental of Fatigue in Composites: Opportunities using X-Ray Computed Tomography Imaging

    DTIC Science & Technology

    2012-10-01

    Only fragments of this report are recoverable from the extracted record: matrix composite materials are employed in aerospace applications [1] and increasingly in other sectors such as sustainable energy (e.g. wind turbines) ...; the difference in terms of source has important advantages for synchrotron radiation CT, one being high spatial resolution ...; emery cloth was placed between the specimen faces and the grip jaws (grit side toward the specimen) to avoid or reduce slip of the specimen in the grips.

  1. European Science Notes. Volume 40, Number 4.

    DTIC Science & Technology

    1986-04-01


  2. Force Identification from Structural Response

    DTIC Science & Technology

    1999-12-01


  3. Design and realization of disaster assessment algorithm after forest fire

    NASA Astrophysics Data System (ADS)

    Xu, Aijun; Wang, Danfeng; Tang, Lihua

    2008-10-01

    Based on GIS technology, this paper focuses on the application of a disaster assessment algorithm after forest fire and on the design and realization of GIS-based disaster assessment. Through the analysis and processing of multi-source, heterogeneous data collected after a forest fire, the paper combines the foundations laid by domestic and foreign research on forest fire loss assessment with related knowledge of assessment, accounting and forest resources appraisal, in order to develop a theoretical framework and assessment indices for forest fire loss assessment. Technologies for boundary extraction, overlay analysis and partition processing of multi-source spatial data are used to implement the investigation of the burnt forest area and the computation of the fire area. The assessment provides evidence for clearing burnt areas and for new restoration policies, in terms of the direct and indirect economic losses and the ecological and environmental damage caused by forest fire under different fire danger classes and different amounts of forest stock, and thus helps forest resources protection operate in a faster, more efficient and more economical way. Finally, the paper takes Lin'an city of Zhejiang province as a test area to validate the key technologies of the proposed method.

  4. Application of a Persistent Dissolved-phase Reactive Treatment Zone for Mitigation of Mass Discharge from Sources Located in Lower-Permeability Sediments

    PubMed Central

    Marble, J.C.; Brusseau, M.L.; Carroll, K.C.; Plaschke, M.; Fuhrig, L.; Brinker, F.

    2015-01-01

    The purpose of this study is to examine the development and effectiveness of a persistent dissolved-phase treatment zone, created by injecting potassium permanganate solution, for mitigating discharge of contaminant from a source zone located in a relatively deep, low-permeability formation. A localized 1,1-dichloroethene (DCE) source zone comprising dissolved- and sorbed-phase mass is present in lower permeability strata adjacent to a sand/gravel unit in a section of the Tucson International Airport Area (TIAA) Superfund Site. The results of bench-scale studies conducted using core material collected from boreholes drilled at the site indicated that natural oxidant demand was low, which would promote permanganate persistence. The reactive zone was created by injecting a permanganate solution into multiple wells screened across the interface between the lower-permeability and higher-permeability units. The site has been monitored for nine years to characterize the spatial distribution of DCE and permanganate. Permanganate continues to persist at the site, and a substantial and sustained decrease in DCE concentrations in groundwater has occurred after the permanganate injection. These results demonstrate successful creation of a long-term, dissolved-phase reactive-treatment zone that reduced mass discharge from the source. This project illustrates the application of in-situ chemical oxidation as a persistent dissolved-phase reactive-treatment system for lower-permeability source zones, which appears to effectively mitigate persistent mass discharge into groundwater. PMID:26300570

  5. Processing Uav and LIDAR Point Clouds in Grass GIS

    NASA Astrophysics Data System (ADS)

    Petras, V.; Petrasova, A.; Jeziorska, J.; Mitasova, H.

    2016-06-01

    Today's methods of acquiring Earth surface data, namely lidar and unmanned aerial vehicle (UAV) imagery, non-selectively collect or generate large amounts of points. Point clouds from different sources vary in their properties such as number of returns, density, or quality. We present a set of tools with applications for different types of point clouds obtained by a lidar scanner, the structure from motion technique (SfM), and a low-cost 3D scanner. To take advantage of the vertical structure of multiple-return lidar point clouds, we demonstrate tools to process them using 3D raster techniques which allow, for example, the development of custom vegetation classification methods. Dense point clouds obtained from UAV imagery, often containing redundant points, can be decimated using various techniques before further processing. We implemented and compared several decimation techniques in regard to their performance and the final digital surface model (DSM). Finally, we describe the processing of a point cloud from a low-cost 3D scanner, namely Microsoft Kinect, and its application for interaction with physical models. All the presented tools are open source and integrated in GRASS GIS, a multi-purpose open source GIS with remote sensing capabilities. The tools integrate with other open source projects, specifically the Point Data Abstraction Library (PDAL), the Point Cloud Library (PCL), and the OpenKinect libfreenect2 library, to benefit from the open source point cloud ecosystem. The implementation in GRASS GIS ensures long-term maintenance and reproducibility by the scientific community but also by the original authors themselves.

  6. Katharsis of the skin: Peeling applications and agents of chemical peelings in Greek medical textbooks of Graeco-Roman antiquity.

    PubMed

    Ursin, F; Steger, F; Borelli, C

    2018-04-28

    Recipes for peelings date back to medical texts of ancient Egypt. The oldest medical papyri contain recipes for "improving beauty of the skin" and "removing wrinkles" by use of agents like salt and soda. The Egyptian Queen Cleopatra (69-30 BC) is said to have taken baths in donkey's milk in order to improve the beauty of her skin. However, little is known about other agents and peeling applications in later Greek medical textbooks. We will discover new agents and describe ancient peeling applications. First, we will have to identify ancient Greek medical terms for the modern terms "peeling" and "chemical peeling". Second, based on the identified terms we will perform a systematic full-text search for agents in original sources. Third, we will categorize the results into three peeling applications: (1) cleansing, (2) aesthetic improvement of the skin, and (3) therapy of dermatological diseases. We performed a full systematic keyword search with the identified Greek terms in databases of ancient Greek texts. Our keywords for peeling and chemical peeling are "smēxis" and "trīpsis". Our keywords for agents of peeling and chemical peeling are "smégmata", "rhýmmata", "kathartiká", and "trímmata". Diocles (4 th century BC) was the first one who mentioned "smēxis" and "trīpsis" as parts of daily cleansing routine. Criton (2 nd century AD) wrote about peeling applications, but any reference to the agents is lost. Antyllos (2 nd century AD) composed three lists of peeling applications including agents. Greek medical textbooks of Graeco-Roman antiquity report several peeling applications like cleansing, brightening, darkening, softening, and aesthetic improvement of the skin by use of peeling and chemical peeling, as well as therapy of dermatological diseases. There are 27 ancient agents for what is contemporarily called peeling and chemical peeling. We discovered more specific agents than hitherto known to research. This article is protected by copyright. All rights reserved.

  7. Fiber optic systems in the UV region

    NASA Astrophysics Data System (ADS)

    Huebner, Michael; Meyer, H.; Klein, Karl-Friedrich; Hillrichs, G.; Ruetting, Martin; Veidemanis, M.; Spangenberg, Bernd; Clarkin, James P.; Nelson, Gary W.

    2000-05-01

    Mainly due to the unexpected progress in the manufacturing of solarization-reduced all-silica fibers, new fiber-optic applications in the UV region are feasible. However, the other components, such as the UV sources and the detector systems, have to be improved, too. In particular, miniaturization is very important in order to fit the small-sized fiber-optic assemblies, leading to compact and mobile UV analytical systems. Based on independent improvements in the preform and fiber processing, UV-improved fibers with different properties have been developed. The best UV fiber for the proposed applications can be selected on the basis of its short- and long-term spectral behavior, especially in the region from 190 to 350 nm. The spectrum of the UV source and the power density in the fiber have an influence on the nonlinear transmission and the damage level; however, hydrogen can reduce the UV-defect concentration. After determining the diffusion processes in the fiber, the UV lifetime of commercially available all-silica fibers can be predicted. The newest results with light from deuterium lamps, excimer lasers and the 5th harmonic of the Nd:YAG laser will be shown. Many activities are in the field of UV sources. In addition to new UV lasers such as the Nd:YAG laser at 213 nm, a new low-power deuterium lamp with smaller dimensions was introduced last year. Properties of this lamp will be discussed, taking into account some of the application requirements. Finally, some new applications of UV fiber optics will be shown; in particular, the TLC method can be improved significantly by combining a 2-row fiber array with a diode-array spectrometer optimized for fiber optics.

  8. Study of travelling wave solutions for some special-type nonlinear evolution equations

    NASA Astrophysics Data System (ADS)

    Song, Junquan; Hu, Lan; Shen, Shoufeng; Ma, Wen-Xiu

    2018-07-01

    The tanh-function expansion method has been improved and used to construct travelling wave solutions of the form U = \sum_{j=0}^{n} a_j \tanh^j \xi for some special-type nonlinear evolution equations, which have a variety of physical applications. The positive integer n can be determined by balancing the highest order linear term with the nonlinear term in the evolution equations. We improve the tanh-function expansion method with n = 0 by introducing a new transform U = -W'(\xi)/W^2. A nonlinear wave equation with source terms, and mKdV-type equations, are considered in order to show the effectiveness of the improved scheme. We also propose the tanh-function expansion method of implicit function form, and apply it to a Harry Dym-type equation as an example.
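
    As a worked illustration of the balancing step mentioned above (a generic textbook example, not taken from this paper), consider the KdV equation with the expansion U = \sum_{j=0}^{n} a_j \tanh^j \xi: the nonlinear term produces the highest power \tanh^{2n+1}\xi, while the linear dispersive term produces \tanh^{n+3}\xi, and equating the two exponents fixes n.

        % Balancing the highest powers of \tanh\xi for u_t + 6 u u_x + u_{xxx} = 0
        % with u = \sum_{j=0}^{n} a_j \tanh^j\xi (generic illustration of the method):
        \[
          \underbrace{2n+1}_{\text{from } u u_x} \;=\; \underbrace{n+3}_{\text{from } u_{xxx}}
          \quad\Longrightarrow\quad n = 2 .
        \]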

  9. Modeling Interactions Among Turbulence, Gas-Phase Chemistry, Soot and Radiation Using Transported PDF Methods

    NASA Astrophysics Data System (ADS)

    Haworth, Daniel

    2013-11-01

    The importance of explicitly accounting for the effects of unresolved turbulent fluctuations in Reynolds-averaged and large-eddy simulations of chemically reacting turbulent flows is increasingly recognized. Transported probability density function (PDF) methods have emerged as one of the most promising modeling approaches for this purpose. In particular, PDF methods provide an elegant and effective resolution to the closure problems that arise from averaging or filtering terms that correspond to nonlinear point processes, including chemical reaction source terms and radiative emission. PDF methods traditionally have been associated with studies of turbulence-chemistry interactions in laboratory-scale, atmospheric-pressure, nonluminous, statistically stationary nonpremixed turbulent flames; and Lagrangian particle-based Monte Carlo numerical algorithms have been the predominant method for solving modeled PDF transport equations. Recent advances and trends in PDF methods are reviewed and discussed. These include advances in particle-based algorithms, alternatives to particle-based algorithms (e.g., Eulerian field methods), treatment of combustion regimes beyond low-to-moderate-Damköhler-number nonpremixed systems (e.g., premixed flamelets), extensions to include radiation heat transfer and multiphase systems (e.g., soot and fuel sprays), and the use of PDF methods as the basis for subfilter-scale modeling in large-eddy simulation. Examples are provided that illustrate the utility and effectiveness of PDF methods for physics discovery and for applications to practical combustion systems. These include comparisons of results obtained using the PDF method with those from models that neglect unresolved turbulent fluctuations in composition and temperature in the averaged or filtered chemical source terms and/or the radiation heat transfer source terms. In this way, the effects of turbulence-chemistry-radiation interactions can be isolated and quantified.
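
    The record above stresses that averaging nonlinear chemical source terms over unresolved turbulent fluctuations is the central closure problem that transported PDF methods resolve exactly. The toy numpy sketch below (an assumption-laden illustration with a made-up Arrhenius rate, not a PDF-method solver) shows how strongly the mean of a nonlinear source term can differ from the source term evaluated at the mean temperature.

        # Toy illustration of the chemical-source-term closure problem: for a nonlinear
        # Arrhenius rate, <S(T)> over temperature fluctuations differs from S(<T>).
        # The rate constants and the temperature PDF below are illustrative assumptions.
        import numpy as np

        def arrhenius(T, A=1.0e9, Ta=15000.0):   # made-up pre-exponential factor and activation temperature
            return A * np.exp(-Ta / T)

        rng = np.random.default_rng(2)
        T = rng.normal(loc=1500.0, scale=150.0, size=100_000)   # fluctuating temperature samples [K]

        mean_of_source = arrhenius(T).mean()     # what a PDF method effectively evaluates (no closure needed)
        source_of_mean = arrhenius(T.mean())     # what a model ignoring fluctuations would compute

        print(f"<S(T)> / S(<T>) = {mean_of_source / source_of_mean:.2f}")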

  10. Leveraging Terminological Resources for Mapping between Rare Disease Information Sources

    PubMed Central

    Rance, Bastien; Snyder, Michelle; Lewis, Janine; Bodenreider, Olivier

    2015-01-01

    Background: Rare disease information sources are incompletely and inconsistently cross-referenced to one another, making it difficult for information seekers to navigate across them. The development of such cross-references established manually by experts is generally labor intensive and costly. Objectives: To develop an automatic mapping between two of the major rare diseases information sources, GARD and Orphanet, by leveraging terminological resources, especially the UMLS. Methods: We map the rare disease terms from Orphanet and ORDR to the UMLS. We use the UMLS as a pivot to bridge between the rare disease terminologies. We compare our results to a mapping obtained through manually established cross-references to OMIM. Results: Our mapping has a precision of 94%, a recall of 63% and an F1-score of 76%. Our automatic mapping should help facilitate the development of more complete and consistent cross-references between GARD and Orphanet, and is applicable to other rare disease information sources as well. PMID:23920611
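
    As a quick check of the reported figures, the F1-score is the harmonic mean of precision and recall; the snippet below simply restates that arithmetic with the values given in the record.

        # F1 is the harmonic mean of precision and recall; with the reported values this
        # lands close to the stated 76% (the small gap is presumably due to rounding of the inputs).
        precision, recall = 0.94, 0.63
        f1 = 2 * precision * recall / (precision + recall)
        print(f"F1 = {f1:.2f}")   # -> 0.75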

  11. Gamma Ray Astronomy

    NASA Technical Reports Server (NTRS)

    Wu, S. T.

    2000-01-01

    The project has progressed successfully during this period of performance. The highlights of the Gamma Ray Astronomy team's efforts are: (1) Support of daily BATSE data operations, including receipt, archival and dissemination of data, quick-look science analysis, rapid gamma-ray burst and transient monitoring and response efforts, instrument state-of-health monitoring, and instrument commanding and configuration; (2) On-going scientific analysis, including production and maintenance of gamma-ray burst, pulsed source and occultation source catalogs, gamma-ray burst spectroscopy, studies of the properties of pulsars and black holes, and long-term monitoring of hard X-ray sources; (3) Maintenance and continuous improvement of BATSE instrument response and calibration databases; (4) Investigation of the use of solid state detectors for eventual application in an instrument to perform all-sky monitoring of X-ray and gamma-ray sources with high sensitivity; and (5) Support of BATSE outreach activities, including seminars, colloquia and World Wide Web pages. The highlights of this effort are summarized in the publications and presentations list.

  12. Terahertz quantum-cascade lasers as high-power and wideband, gapless sources for spectroscopy.

    PubMed

    Röben, Benjamin; Lü, Xiang; Hempel, Martin; Biermann, Klaus; Schrottke, Lutz; Grahn, Holger T

    2017-07-10

    Terahertz (THz) quantum-cascade lasers (QCLs) are powerful radiation sources for high-resolution and high-sensitivity spectroscopy with a discrete spectrum between 2 and 5 THz as well as a continuous coverage of several GHz. However, for many applications, a radiation source with a continuous coverage of a substantially larger frequency range is required. We employed a multi-mode THz QCL operated with a fast ramped injection current, which leads to a collective tuning of equally-spaced Fabry-Pérot laser modes exceeding their separation. A continuous coverage over 72 GHz at about 4.7 THz was achieved. We demonstrate that the QCL is superior to conventional sources used in Fourier transform infrared spectroscopy in terms of the signal-to-noise ratio as well as the dynamic range by one to two orders of magnitude. Our results pave the way for versatile THz spectroscopic systems with unprecedented resolution and sensitivity across a wide frequency range.

  13. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  14. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  15. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  16. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  17. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  18. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  19. What is What in the Nanoworld: A Handbook on Nanoscience and Nanotechnology

    NASA Astrophysics Data System (ADS)

    Borisenko, Victor E.; Ossicini, Stefano

    2004-10-01

    This introductory reference handbook summarizes the terms and definitions, most important phenomena, and regularities discovered in the physics, chemistry, technology, and application of nanostructures. These nanostructures are typically inorganic and organic structures at the atomic scale. Fast-progressing nanoelectronics and optoelectronics, molecular electronics and spintronics, nanotechnology and quantum processing of information are of strategic importance for the information society of the 21st century. The condensed information, taken from textbooks, special encyclopedias, and recent original books and papers, provides fast support in understanding "old" and new terms of nanoscience and technology widely used in the scientific literature on recent developments. Such support is indeed important when one reads a scientific paper presenting new results in nanoscience. A representative collection of fundamental terms and definitions from quantum physics, quantum chemistry, special mathematics, organic and inorganic chemistry, solid state physics, and materials science and technology is accompanied by recommended secondary sources (books, reviews, websites) for extended study of a subject. Each entry interprets the term or definition under consideration and briefly presents the main features of the phenomena behind it. Additional information in the form of notes ("First described in: ...", "Recognition: ...", "More details in: ...") supplements the entries and gives a historical retrospective of the subject with references to further sources. The handbook is ideal for answering questions about unfamiliar terms and definitions for undergraduate and Ph.D. students studying the physics of low-dimensional structures, nanoelectronics, and nanotechnology. It provides fast support when one wants to know or recall the essence of a scientific term, especially when the term contains a personal name in its title, as in "Anderson localization", "Aharonov-Bohm effect", or "Bose-Einstein condensate". It contains more than 1000 entries, from a few sentences to a page in length.

  20. Catchment-scale herbicides transport: Theory and application

    NASA Astrophysics Data System (ADS)

    Bertuzzo, E.; Thomet, M.; Botter, G.; Rinaldo, A.

    2013-02-01

    This paper proposes and tests a model which couples the description of hydrologic flow and transport of herbicides at catchment scales. The model accounts for the age of streamflow components to characterize short- and long-term fluctuations of herbicide flux concentrations in stream waters, whose peaks exceeding a toxic threshold are key to the exposure risk of aquatic ecosystems. The model is based on a travel time formulation of transport embedding a source zone that describes near-surface herbicide dynamics. To this aim we generalize a recently proposed scheme for the analytical derivation of travel time distributions to the case of solutes that can be partially taken up by transpiration and undergo chemical degradation. The framework developed is evaluated by comparing modeled hydrographs and atrazine chemographs with those measured in the Aabach agricultural catchment (Switzerland). The model proves reliable in defining complex transport features shaped by the interplay of long-term processes, related to the persistence of solute components in soils, and short-term dynamics related to storm inter-arrivals. The effects of stochasticity in rainfall patterns and application dates on concentrations and loads in runoff are assessed via Monte Carlo simulations, highlighting the crucial role played by the first rainfall event occurring after herbicide application. A probabilistic framework for critical determinants of exposure risk to aquatic communities is defined. Modeling of herbicide circulation at catchment scale thus emerges as an essential tool for ecological risk assessment.
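
    The model above couples a travel time formulation of transport with first-order losses along the flow path. The minimal numpy sketch below is a generic illustration of that idea, not the catchment model itself: the herbicide mass flux leaving the catchment is obtained by convolving the input flux with a travel time distribution attenuated by degradation. The exponential travel time distribution, the rate constants and the application date are all assumptions chosen for brevity.

        # Generic travel-time transport sketch: outflow flux = input flux convolved with a
        # travel time distribution attenuated by first-order degradation (all parameters assumed).
        import numpy as np

        dt = 1.0                                   # time step [d]
        t = np.arange(0, 365, dt)
        J_in = np.zeros_like(t)
        J_in[120] = 1.0                            # unit mass flux applied on a hypothetical application day

        tau_mean = 40.0                            # mean travel time [d]
        k_deg = 0.03                               # first-order degradation rate [1/d]
        p = np.exp(-t / tau_mean) / tau_mean       # exponential travel time distribution
        survival = np.exp(-k_deg * t)              # fraction not degraded after travel time t

        J_out = np.convolve(J_in, p * survival)[:t.size] * dt
        print(f"fraction of applied mass reaching the stream: {J_out.sum() * dt:.2f}")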

  1. Drinking water studies: a review on heavy metal, application of biomarker and health risk assessment (a special focus in Malaysia).

    PubMed

    Ab Razak, Nurul Hafiza; Praveena, Sarva Mangala; Aris, Ahmad Zaharin; Hashim, Zailina

    2015-12-01

    Malaysia has abundant sources of drinking water from rivers and groundwater. However, rapid development has deteriorated the quality of drinking water sources in Malaysia. Studies on heavy metals in drinking water, applications of health risk assessment, and bio-monitoring in Malaysia from 2003 to 2013 are reviewed. Studies of heavy metals in drinking water showed that levels are below the permissible limits suggested by the World Health Organization and the Malaysian Ministry of Health. Future studies applying health risk assessment are crucial in order to understand the risk of heavy metal exposure through drinking water to the Malaysian population. Among the biomarkers reviewed, toenails are the most useful tool for evaluating the body burden of heavy metals. Toenails are easy to collect, store, transport and analyse. This review provides clear guidance for future studies of Malaysian drinking water. In this way, it will help risk managers minimize exposure to an optimal level, and help the government formulate policies to safeguard the population. Copyright © 2015 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.

  2. Lithium-Ion Batteries for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Surampudi, S.; Halpert, G.; Marsh, R. A.; James, R.

    1999-01-01

    This presentation reviews: (1) the goals and objectives, (2) the NASA and Air Force requirements, (3) the potential near-term missions, (4) the management approach, (5) the technical approach and (6) the program road map. The objectives of the program include: (1) develop high-specific-energy and long-life lithium-ion cells and smart batteries for aerospace and defense applications, (2) establish domestic production sources, and (3) demonstrate technological readiness for various missions. The management approach is to encourage the teaming of universities, R&D organizations, and battery manufacturing companies, to build on existing commercial and government technology, and to develop two sources for manufacturing cells and batteries. The technical approach includes: (1) develop advanced electrode materials and electrolytes to achieve improved low-temperature performance and long cycle life, (2) optimize cell design to improve specific energy, cycle life and safety, (3) establish manufacturing processes to ensure predictable performance, (4) develop aerospace lithium-ion cells in various Ah sizes and voltages, (5) develop electronics for smart battery management, (6) develop a performance database required for various applications, and (7) demonstrate technology readiness for the various missions. Charts reviewing the requirements for the Li-ion battery development program are presented.

  3. Real-Time N2O Gas Detection System for Agricultural Production Using a 4.6-μm-Band Laser Source Based on a Periodically Poled LiNbO3 Ridge Waveguide

    PubMed Central

    Tokura, Akio; Asobe, Masaki; Enbutsu, Koji; Yoshihara, Toshihiro; Hashida, Shin-nosuke; Takenouchi, Hirokazu

    2013-01-01

    This article describes a gas monitoring system for detecting nitrous oxide (N2O) gas using a compact mid-infrared laser source based on difference-frequency generation in a quasi-phase-matched LiNbO3 waveguide. We obtained a stable output power of 0.62 mW from a 4.6-μm-band continuous-wave laser source operating at room temperature. This laser source enabled us to detect atmospheric N2O gas at a concentration as low as 35 parts per billion. Using this laser source, we constructed a new real-time in-situ monitoring system for detecting N2O gas emitted from potted plants. A few weeks of monitoring with the developed detection system revealed a strong relationship between nitrogen fertilization and N2O emission. This system is promising for the in-situ long-term monitoring of N2O in agricultural production, and it is also applicable to the detection of other greenhouse gases. PMID:23921829
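
    The laser source in the record above is based on difference-frequency generation (DFG), in which the generated idler frequency is the difference of the pump and signal frequencies. The tiny snippet below illustrates only that energy-conservation relation; the example near-infrared wavelengths are illustrative guesses chosen to land near 4.6 µm and are not the wavelengths used in the paper.

        # Energy conservation in difference-frequency generation:
        # 1/lambda_idler = 1/lambda_pump - 1/lambda_signal.
        # The pump and signal wavelengths below are illustrative assumptions.
        lam_pump, lam_signal = 1.064, 1.385        # micrometres
        lam_idler = 1.0 / (1.0 / lam_pump - 1.0 / lam_signal)
        print(f"idler wavelength ~ {lam_idler:.2f} um")   # ~4.6 um, in the N2O absorption band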

  4. Real-time N2O gas detection system for agricultural production using a 4.6-µm-band laser source based on a periodically poled LiNbO3 ridge waveguide.

    PubMed

    Tokura, Akio; Asobe, Masaki; Enbutsu, Koji; Yoshihara, Toshihiro; Hashida, Shin-nosuke; Takenouchi, Hirokazu

    2013-08-05

    This article describes a gas monitoring system for detecting nitrous oxide (N2O) gas using a compact mid-infrared laser source based on difference-frequency generation in a quasi-phase-matched LiNbO3 waveguide. We obtained a stable output power of 0.62 mW from a 4.6-μm-band continuous-wave laser source operating at room temperature. This laser source enabled us to detect atmospheric N2O gas at a concentration as low as 35 parts per billion. Using this laser source, we constructed a new real-time in-situ monitoring system for detecting N2O gas emitted from potted plants. A few weeks of monitoring with the developed detection system revealed a strong relationship between nitrogen fertilization and N2O emission. This system is promising for the in-situ long-term monitoring of N2O in agricultural production, and it is also applicable to the detection of other greenhouse gases.

  5. Quantitative proteomics reveals the importance of nitrogen source to control glucosinolate metabolism in Arabidopsis thaliana and Brassica oleracea

    PubMed Central

    Marino, Daniel; Ariz, Idoia; Lasa, Berta; Santamaría, Enrique; Fernández-Irigoyen, Joaquín; González-Murua, Carmen; Aparicio Tejo, Pedro M.

    2016-01-01

    Accessing different nitrogen (N) sources involves a profound adaptation of plant metabolism. In this study, a quantitative proteomic approach was used to further understand how the model plant Arabidopsis thaliana adjusts to different N sources when grown exclusively under nitrate or ammonium nutrition. Proteome data evidenced that glucosinolate metabolism was differentially regulated by the N source and that both TGG1 and TGG2 myrosinases were more abundant under ammonium nutrition, which is generally considered to be a stressful situation. Moreover, Arabidopsis plants displayed glucosinolate accumulation and induced myrosinase activity under ammonium nutrition. Interestingly, these results were also confirmed in the economically important crop broccoli (Brassica oleracea var. italica). Moreover, these metabolic changes were correlated in Arabidopsis with the differential expression of genes from the aliphatic glucosinolate metabolic pathway. This study underlines the importance of nitrogen nutrition and the potential of using ammonium as the N source in order to stimulate glucosinolate metabolism, which may have important applications not only in terms of reducing pesticide use, but also for increasing plants’ nutritional value. PMID:27085186

  6. Survey on the Performance of Source Localization Algorithms.

    PubMed

    Fresno, José Manuel; Robles, Guillermo; Martínez-Tarifa, Juan Manuel; Stewart, Brian G

    2017-11-18

    The localization of emitters using an array of sensors or antennas is a prevalent issue approached in several applications. There exist different techniques for source localization, which can be classified into multilateration, received signal strength (RSS) and proximity methods. The performance of multilateration techniques relies on measured time variables: the time of flight (ToF) of the emission from the emitter to the sensor, the time differences of arrival (TDoA) of the emission between sensors and the pseudo-time of flight (pToF) of the emission to the sensors. The multilateration algorithms presented and compared in this paper can be classified as iterative and non-iterative methods. Both standard least squares (SLS) and hyperbolic least squares (HLS) are iterative and based on the Newton-Raphson technique to solve the non-linear equation system. The metaheuristic technique particle swarm optimization (PSO) used for source localisation is also studied. This optimization technique estimates the source position as the optimum of an objective function based on HLS and is also iterative in nature. Three non-iterative algorithms, namely the hyperbolic positioning algorithms (HPA), the maximum likelihood estimator (MLE) and Bancroft algorithm, are also presented. A non-iterative combined algorithm, MLE-HLS, based on MLE and HLS, is further proposed in this paper. The performance of all algorithms is analysed and compared in terms of accuracy in the localization of the position of the emitter and in terms of computational time. The analysis is also undertaken with three different sensor layouts since the positions of the sensors affect the localization; several source positions are also evaluated to make the comparison more robust. The analysis is carried out using theoretical time differences, as well as including errors due to the effect of digital sampling of the time variables. It is shown that the most balanced algorithm, yielding better results than the other algorithms in terms of accuracy and short computational time, is the combined MLE-HLS algorithm.
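
    The survey above compares iterative multilateration solvers that minimise TDoA residuals. The short scipy sketch below is a generic hyperbolic least-squares set-up in 2-D, not the survey's implementation; the sensor layout, propagation speed and true source position are made up for illustration.

        # Generic TDoA (hyperbolic) least-squares localization in 2-D, solved iteratively.
        import numpy as np
        from scipy.optimize import least_squares

        c = 343.0                                            # propagation speed [m/s] (acoustic, for illustration)
        sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
        source_true = np.array([3.0, 7.0])

        dist = np.linalg.norm(sensors - source_true, axis=1)
        tdoa = (dist - dist[0]) / c                          # time differences of arrival w.r.t. sensor 0

        def residuals(p):
            d = np.linalg.norm(sensors - p, axis=1)
            return (d - d[0]) / c - tdoa

        est = least_squares(residuals, x0=np.array([5.0, 5.0])).x
        print(f"estimated source position: {est.round(3)}")  # close to [3, 7]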

  7. Survey on the Performance of Source Localization Algorithms

    PubMed Central

    2017-01-01

    The localization of emitters using an array of sensors or antennas is a prevalent issue approached in several applications. There exist different techniques for source localization, which can be classified into multilateration, received signal strength (RSS) and proximity methods. The performance of multilateration techniques relies on measured time variables: the time of flight (ToF) of the emission from the emitter to the sensor, the time differences of arrival (TDoA) of the emission between sensors and the pseudo-time of flight (pToF) of the emission to the sensors. The multilateration algorithms presented and compared in this paper can be classified as iterative and non-iterative methods. Both standard least squares (SLS) and hyperbolic least squares (HLS) are iterative and based on the Newton–Raphson technique to solve the non-linear equation system. The metaheuristic technique particle swarm optimization (PSO) used for source localisation is also studied. This optimization technique estimates the source position as the optimum of an objective function based on HLS and is also iterative in nature. Three non-iterative algorithms, namely the hyperbolic positioning algorithms (HPA), the maximum likelihood estimator (MLE) and Bancroft algorithm, are also presented. A non-iterative combined algorithm, MLE-HLS, based on MLE and HLS, is further proposed in this paper. The performance of all algorithms is analysed and compared in terms of accuracy in the localization of the position of the emitter and in terms of computational time. The analysis is also undertaken with three different sensor layouts since the positions of the sensors affect the localization; several source positions are also evaluated to make the comparison more robust. The analysis is carried out using theoretical time differences, as well as including errors due to the effect of digital sampling of the time variables. It is shown that the most balanced algorithm, yielding better results than the other algorithms in terms of accuracy and short computational time, is the combined MLE-HLS algorithm. PMID:29156565

  8. SU-D-19A-05: The Dosimetric Impact of Using Xoft Axxent® Electronic Brachytherapy Source TG-43 Dosimetry Parameters for Treatment with the Xoft 30 mm Diameter Vaginal Applicator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simiele, S; Micka, J; Culberson, W

    2014-06-01

    Purpose: A full TG-43 dosimetric characterization has not been performed for the Xoft Axxent® electronic brachytherapy source (Xoft, a subsidiary of iCAD, San Jose, CA) within the Xoft 30 mm diameter vaginal applicator. Currently, dose calculations are performed using the bare-source TG-43 parameters and do not account for the presence of the applicator. This work focuses on determining the difference between the bare-source and source-in-applicator TG-43 parameters. Both the radial dose function (RDF) and polar anisotropy function (PAF) were computationally determined for the source-in-applicator and bare-source models to determine the impact of using the bare-source dosimetry data. Methods: MCNP5 was used to model the source and the Xoft 30 mm diameter vaginal applicator. All simulations were performed using 0.84p and 0.03e cross-section libraries. All models were developed based on specifications provided by Xoft. The applicator is made of a proprietary polymer material and simulations were performed using the most conservative chemical composition. An F6 collision-kerma tally was used to determine the RDF and PAF values in water at various dwell positions. The RDF values were normalized to 2.0 cm from the source to accommodate the applicator radius. Source-in-applicator results were compared with bare-source results from this work as well as published bare-source results. Results: For a 0 mm source pullback distance, the updated bare-source model and source-in-applicator RDF values differ by 2% at 3 cm and 4% at 5 cm. The largest PAF disagreements were observed at the distal end of the source and applicator, with up to 17% disagreement at 2 cm and 8% at 8 cm. The bare-source model had RDF values within 2.6% of the published TG-43 data and PAF results within 7.2% at 2 cm. Conclusion: Results indicate that notable differences exist between the bare-source and source-in-applicator TG-43 simulated parameters. Xoft Inc. provided partial funding for this work.
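    As a rough illustration of how the radial dose function discussed above is formed from tallied dose rates and renormalized at 2.0 cm, the sketch below uses the TG-43 point-source geometry function G(r) = 1/r². The point-source simplification and the function name are assumptions for illustration, not the authors' MCNP post-processing.

```python
import numpy as np

def radial_dose_function(r, dose_rate, r0=2.0):
    """TG-43 radial dose function with a point-source geometry function.

    g(r) = [D(r) / G(r)] / [D(r0) / G(r0)],  with  G(r) = 1 / r**2
    r         : transverse-axis distances (cm)
    dose_rate : simulated dose rates (e.g. from an F6 tally) at those distances
    r0        : normalization radius; 2.0 cm here, to sit outside the applicator
    """
    r = np.asarray(r, float)
    dose_rate = np.asarray(dose_rate, float)
    G = 1.0 / r**2
    D0 = np.interp(r0, r, dose_rate)
    G0 = 1.0 / r0**2
    return (dose_rate / G) / (D0 / G0)
```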

  9. Estimation of the Cesium-137 Source Term from the Fukushima Daiichi Power Plant Using Air Concentration and Deposition Data

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne

    2013-04-01

    A major difficulty when inverting the source term of an atmospheric tracer dispersion problem is the estimation of the prior errors: those of the atmospheric transport model, those ascribed to the representativeness of the measurements, the instrumental errors, and those attached to the prior knowledge on the variables one seeks to retrieve. In the case of an accidental release of pollutant, and especially in a situation of sparse observability, the reconstructed source is sensitive to these assumptions. This sensitivity makes the quality of the retrieval dependent on the methods used to model and estimate the prior errors of the inverse modeling scheme. In Winiarek et al. (2012), we proposed to use an estimation method for the error amplitudes based on the maximum likelihood principle. Under semi-Gaussian assumptions, it takes into account, without approximation, the positivity assumption on the source. We applied the method to the estimation of the Fukushima Daiichi cesium-137 and iodine-131 source terms using activity concentrations in the air. The results were compared to an L-curve estimation technique, and to Desroziers's scheme. In addition to the estimates of released activities, we provided the related uncertainties (12 PBq with a std. of 15 - 20 % for cesium-137 and 190 - 380 PBq with a std. of 5 - 10 % for iodine-131). We also showed that, because of the low number of available observations (a few hundred), the reconstructed activities depended significantly on the method used to estimate the prior errors, even though the orders of magnitude were consistent. In order to use more data, we propose to extend the methods to the use of several data types, such as activity concentrations in the air and fallout measurements. The idea is to simultaneously estimate the prior errors related to each dataset, in order to fully exploit the information content of each one. Using the activity concentration measurements, but also daily fallout data from prefectures and cumulated deposition data over a region lying approximately 150 km around the nuclear power plant, we can use a few thousand data points in our inverse modeling algorithm to reconstruct the cesium-137 source term. To improve the parameterization of removal processes, rainfall fields have also been corrected using outputs from the mesoscale meteorological model WRF and ground station rainfall data. As expected, the different methods yield closer results as the number of data increases. Reference: Winiarek, V., M. Bocquet, O. Saunier, A. Mathieu (2012), Estimation of errors in the inverse modeling of accidental release of atmospheric pollutant: Application to the reconstruction of the cesium-137 and iodine-131 source terms from the Fukushima Daiichi power plant, J. Geophys. Res., 117, D05122, doi:10.1029/2011JD016932.
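    A heavily simplified sketch of the kind of regularized, positivity-constrained linear inversion underlying the work above (observation model y = Mx): the Tikhonov-plus-non-negativity formulation, scalar error amplitudes and function name below are illustrative stand-ins for the maximum-likelihood estimation of prior errors described in the abstract.

```python
import numpy as np
from scipy.optimize import nnls

def estimate_source_term(M, y, sigma_obs, sigma_prior, x_prior=None):
    """Non-negative, Tikhonov-regularized estimate of a release-rate vector x.

    Solves  min_x ||(y - M x) / sigma_obs||^2 + ||(x - x_prior) / sigma_prior||^2
    subject to x >= 0, by stacking the two terms into one least-squares system.
    sigma_obs and sigma_prior are scalar error amplitudes (the quantities the
    paper estimates by maximum likelihood).
    """
    n = M.shape[1]
    x_prior = np.zeros(n) if x_prior is None else np.asarray(x_prior, float)
    A = np.vstack([M / sigma_obs, np.eye(n) / sigma_prior])
    b = np.concatenate([y / sigma_obs, x_prior / sigma_prior])
    x, _ = nnls(A, b)
    return x
```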

  10. Review of clinical brachytherapy uncertainties: Analysis guidelines of GEC-ESTRO and the AAPM☆

    PubMed Central

    Kirisits, Christian; Rivard, Mark J.; Baltas, Dimos; Ballester, Facundo; De Brabandere, Marisol; van der Laarse, Rob; Niatsetski, Yury; Papagiannis, Panagiotis; Hellebust, Taran Paulsen; Perez-Calatayud, Jose; Tanderup, Kari; Venselaar, Jack L.M.; Siebert, Frank-André

    2014-01-01

    Background and purpose A substantial reduction of uncertainties in clinical brachytherapy should result in improved outcome in terms of increased local control and reduced side effects. Types of uncertainties have to be identified, grouped, and quantified. Methods A detailed literature review was performed to identify uncertainty components and their relative importance to the combined overall uncertainty. Results Very few components (e.g., source strength and afterloader timer) are independent of clinical disease site and location of administered dose. While the influence of medium on dose calculation can be substantial for low energy sources or non-deeply seated implants, the influence of medium is of minor importance for high-energy sources in the pelvic region. The level of uncertainties due to target, organ, applicator, and/or source movement in relation to the geometry assumed for treatment planning is highly dependent on fractionation and the level of image guided adaptive treatment. Most studies to date report the results in a manner that allows no direct reproduction and further comparison with other studies. Often, no distinction is made between variations, uncertainties, and errors or mistakes. The literature review facilitated the drafting of recommendations for uniform uncertainty reporting in clinical BT, which are also provided. The recommended comprehensive uncertainty investigations are key to obtain a general impression of uncertainties, and may help to identify elements of the brachytherapy treatment process that need improvement in terms of diminishing their dosimetric uncertainties. It is recommended to present data on the analyzed parameters (distance shifts, volume changes, source or applicator position, etc.), and also their influence on absorbed dose for clinically-relevant dose parameters (e.g., target parameters such as D90 or OAR doses). Publications on brachytherapy should include a statement of total dose uncertainty for the entire treatment course, taking into account the fractionation schedule and level of image guidance for adaptation. Conclusions This report on brachytherapy clinical uncertainties represents a working project developed by the Brachytherapy Physics Quality Assurances System (BRAPHYQS) subcommittee to the Physics Committee within GEC-ESTRO. Further, this report has been reviewed and approved by the American Association of Physicists in Medicine. PMID:24299968

  11. Evolution of air pollution source contributions over one decade, derived by PM10 and PM2.5 source apportionment in two metropolitan urban areas in Greece

    NASA Astrophysics Data System (ADS)

    Diapouli, E.; Manousakas, M.; Vratolis, S.; Vasilatou, V.; Maggos, Th; Saraga, D.; Grigoratos, Th; Argyropoulos, G.; Voutsa, D.; Samara, C.; Eleftheriadis, K.

    2017-09-01

    Metropolitan urban areas in Greece have long been known to suffer from poor air quality, due to a variety of emission sources and to topography and climatic conditions favouring the accumulation of pollution. While a number of control measures have been implemented since the 1990s, resulting in reductions of atmospheric pollution and changes in emission source contributions, the financial crisis which started in 2009 has significantly altered this picture. The present study is the first effort to assess the contribution of emission sources to PM10 and PM2.5 concentration levels and their long-term variability (over 5-10 years), in the two largest metropolitan urban areas in Greece (Athens and Thessaloniki). Intensive measurement campaigns were conducted during 2011-2012 at suburban, urban background and urban traffic sites in these two cities. In addition, available datasets from previous measurements in Athens and Thessaloniki were used in order to assess the long-term variability of concentrations and sources. Chemical composition analysis of the 2011-2012 samples showed that carbonaceous matter was the most abundant component for both PM size fractions. A significant increase of carbonaceous particle concentrations and of the OC/EC ratio during the cold period, especially at the residential urban background sites, pointed towards domestic heating, and more particularly wood (biomass) burning, as a significant source. PMF analysis further supported this finding. Biomass burning was the largest contributing source at the two urban background sites (with mean contributions for the two size fractions in the range of 24-46%). Secondary aerosol formation (sulphate, nitrate & organics) was also a major contributing source for both size fractions at the suburban and urban background sites. At the urban traffic site, vehicular traffic (exhaust and non-exhaust emissions) was the source with the highest contributions, accounting for 44% of PM10 and 37% of PM2.5. The long-term variability of emission sources in the two cities (over 5-10 years), assessed through a harmonized application of the PMF technique on recent and past year data, clearly demonstrates the effective reduction in emissions during the last decade due to control measures and technological development; however, it also reflects the effects of the financial crisis in Greece during these years, which has led to decreased economic activities and the adoption of more polluting practices by the local population in an effort to reduce living costs.
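    For readers unfamiliar with PMF, the sketch below runs the closely related non-negative matrix factorization on a samples-by-species concentration matrix and reports each factor's share of the reconstructed mass. Real PMF additionally weights every entry by its measurement uncertainty; the random data and factor count here are placeholders.

```python
import numpy as np
from sklearn.decomposition import NMF

# X: (samples x species) matrix of PM chemical concentrations (placeholder data).
rng = np.random.default_rng(0)
X = rng.gamma(shape=2.0, scale=1.0, size=(200, 15))

model = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)        # factor contributions per sample
F = model.components_             # factor chemical profiles

# Share of the reconstructed mass attributed to each factor
total = (G @ F).sum()
share = (G * F.sum(axis=1)).sum(axis=0) / total
print(np.round(share, 3))
```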

  12. Multi-watt, multi-octave, mid-infrared femtosecond source

    PubMed Central

    Hussain, Syed A.; Hartung, Alexander; Zawilski, Kevin T.; Schunemann, Peter G.; Habel, Florian; Pervak, Vladimir

    2018-01-01

    Spectroscopy in the wavelength range from 2 to 11 μm (900 to 5000 cm−1) implies a multitude of applications in fundamental physics, chemistry, as well as environmental and life sciences. The related vibrational transitions, which all infrared-active small molecules, the most common functional groups, as well as biomolecules like proteins, lipids, nucleic acids, and carbohydrates exhibit, reveal information about molecular structure and composition. However, light sources and detectors in the mid-infrared have been inferior to those in the visible or near-infrared, in terms of power, bandwidth, and sensitivity, severely limiting the performance of infrared experimental techniques. This article demonstrates the generation of femtosecond radiation with up to 5 W at 4.1 μm and 1.3 W at 8.5 μm, corresponding to an order-of-magnitude average power increase for ultrafast light sources operating at wavelengths longer than 5 μm. The presented concept is based on power-scalable near-infrared lasers emitting at a wavelength near 1 μm, which pump optical parametric amplifiers. In addition, both wavelength tunability and supercontinuum generation are reported, resulting in spectral coverage from 1.6 to 10.2 μm with power densities exceeding state-of-the-art synchrotron sources over the entire range. The flexible frequency conversion scheme is highly attractive for both up-conversion and frequency comb spectroscopy, as well as for a variety of time-domain applications. PMID:29713685

  13. Levels and distributions of organochlorine pesticides in the soil-groundwater system of vegetable planting area in Tianjin City, Northern China.

    PubMed

    Pan, Hong-Wei; Lei, Hong-Jun; He, Xiao-Song; Xi, Bei-Dou; Han, Yu-Ping; Xu, Qi-Gong

    2017-04-01

    To study the influence of long-term pesticide application on the distribution of organochlorine pesticides (OCPs) in the soil-groundwater system, 19 soil samples and 19 groundwater samples were collected from an agricultural area with a long-term pesticide application history in Northern China. Results showed that the composition of OCPs changed significantly from soil to groundwater. For example, ∑DDT, ∑HCH, and ∑heptachlor had high levels in the soil and low levels in the groundwater; in contrast, endrin had low levels in the soil and high levels in the groundwater. Further study showed that OCP distribution in the soil was significantly influenced by its residue time, soil organic carbon level, and small soil particle contents (i.d. <0.0002 mm). Correlation analysis also indicates that the distribution of OCPs in the groundwater was closely related to the levels of OCPs in the soil layer, which may act as a pollution source.

  14. Extremum seeking with bounded update rates

    DOE PAGES

    Scheinker, Alexander; Krstić, Miroslav

    2013-11-16

    In this work, we present a form of extremum seeking (ES) in which the unknown function being minimized enters the system's dynamics as the argument of a cosine or sine term, thereby guaranteeing known bounds on update rates and control efforts. We present general n-dimensional optimization and stabilization results as well as 2D vehicle control, with bounded velocity and control efforts. For application to autonomous vehicles, tracking a source in a GPS-denied environment with unknown orientation, this ES approach allows for smooth heading angle actuation, with constant velocity, and in application to a unicycle-type vehicle results in control ability as if the vehicle is fully actuated. Our stability analysis is made possible by the classic results of Kurzweil, Jarnik, Sussmann, and Liu, regarding systems with highly oscillatory terms. In our stability analysis, we combine the averaging results with a semi-global practical stability result under small parametric perturbations developed by Moreau and Aeyels.
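    A loose discrete-time sketch of a bounded-update-rate ES law of the type described above, in which the objective enters the dynamics only through the argument of a cosine, so each coordinate's rate is bounded by sqrt(alpha*omega) regardless of the objective. On average the iterate descends the objective; all parameter values are illustrative.

```python
import numpy as np

def bounded_es_minimize(J, x0, omega=50.0, alpha=1.0, k=2.0, dt=1e-3, steps=20000):
    """Each coordinate follows dx_i/dt = sqrt(alpha*omega_i)*cos(omega_i*t + k*J(x)),
    so its update rate is bounded by sqrt(alpha*omega_i) by construction."""
    x = np.asarray(x0, float).copy()
    omegas = omega * (1.0 + 0.3 * np.arange(x.size))   # distinct dither frequencies
    for step in range(steps):
        t = step * dt
        x += dt * np.sqrt(alpha * omegas) * np.cos(omegas * t + k * J(x))
    return x

# Example: seek the minimizer of a quadratic without using its gradient.
J = lambda x: float(np.sum((x - np.array([1.0, -2.0]))**2))
print(bounded_es_minimize(J, x0=[4.0, 4.0]))   # near [1, -2], up to a small dither offset
```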

  15. Full range line-field parallel swept source imaging utilizing digital refocusing

    NASA Astrophysics Data System (ADS)

    Fechtig, Daniel J.; Kumar, Abhishek; Drexler, Wolfgang; Leitgeb, Rainer A.

    2015-12-01

    We present geometric optics-based refocusing applied to a novel off-axis line-field parallel swept source imaging (LPSI) system. LPSI is an imaging modality based on line-field swept source optical coherence tomography, which permits 3-D imaging at acquisition speeds of up to 1 MHz. The digital refocusing algorithm applies a defocus-correcting phase term to the Fourier representation of complex-valued interferometric image data, which is based on the geometrical optics information of the LPSI system. We introduce the off-axis LPSI system configuration, the digital refocusing algorithm and demonstrate the effectiveness of our method for refocusing volumetric images of technical and biological samples. An increase of effective in-focus depth range from 255 μm to 4.7 mm is achieved. The recovery of the full in-focus depth range might be especially valuable for future high-speed and high-resolution diagnostic applications of LPSI in ophthalmology.
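    A minimal sketch of defocus correction by a phase term applied to the Fourier representation of the complex image, using the paraxial (Fresnel) transfer function as a stand-in; the LPSI work derives its phase term from the geometrical optics of its specific off-axis configuration, so the sign convention and parameters below are assumptions.

```python
import numpy as np

def digital_refocus(field, dz, wavelength, dx):
    """Apply a paraxial defocus-correcting phase in the Fourier domain.

    field      : 2-D complex interferometric image at one depth plane
    dz         : defocus distance to correct (same units as wavelength and dx)
    wavelength : central wavelength of the swept source
    dx         : lateral pixel pitch
    """
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    phase = np.exp(1j * np.pi * wavelength * dz * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * phase)
```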

  16. Modeling and observations of an elevated, moving infrasonic source: Eigenray methods.

    PubMed

    Blom, Philip; Waxler, Roger

    2017-04-01

    The acoustic ray tracing relations are extended by the inclusion of auxiliary parameters describing variations in the spatial ray coordinates and eikonal vector due to changes in the initial conditions. Computation of these parameters allows one to define the geometric spreading factor along individual ray paths and assists in identification of caustic surfaces so that phase shifts can be easily identified. A method is developed leveraging the auxiliary parameters to identify propagation paths connecting specific source-receiver geometries, termed eigenrays. The newly introduced method is found to be highly efficient in cases where propagation is non-planar due to horizontal variations in the propagation medium or the presence of cross winds. The eigenray method is utilized in analysis of infrasonic signals produced by a multi-stage sounding rocket launch with promising results for applications of tracking aeroacoustic sources in the atmosphere and specifically to analysis of motor performance during dynamic tests.
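    A toy sketch of the eigenray idea described above: iterate on the launch angle until the traced ray lands on the receiver. The range derivative that the paper obtains analytically from the auxiliary (variational) parameters is replaced here by a finite difference, and the straight-ray test case is purely illustrative.

```python
import numpy as np

def find_eigenray(trace_ray, target_range, theta0, iters=25, tol=1e-6, dtheta=1e-4):
    """Newton iteration on the launch angle theta so the ray reaches target_range.
    trace_ray(theta) must return the horizontal range of the ray (a user-supplied
    ray integrator)."""
    theta = theta0
    for _ in range(iters):
        miss = trace_ray(theta) - target_range
        if abs(miss) < tol:
            break
        slope = (trace_ray(theta + dtheta) - trace_ray(theta - dtheta)) / (2 * dtheta)
        theta -= miss / slope
    return theta

# Toy case: straight rays from a source at height h to a ground receiver at
# range R; the exact launch angle (from vertical) is arctan(R / h).
h, R = 2000.0, 5000.0
trace = lambda th: h * np.tan(th)
print(find_eigenray(trace, R, theta0=1.0), np.arctan(R / h))
```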

  17. Design, production, and testing of field effect transistors. [cryogenic MOSFETS

    NASA Technical Reports Server (NTRS)

    Sclar, N.

    1982-01-01

    Cryogenic MOSFETs (CRYOFETs), specifically designed for low-temperature preamplifier applications with infrared extrinsic detectors, were produced and comparatively tested against p-channel MOSFETs under matched conditions. The CRYOFETs exhibit lower voltage thresholds, high source-follower gains at lower bias voltage, and lower dc offset source voltage. The noise of the CRYOFET is found to be 2 to 4 times greater than that of the MOSFET, with a correspondingly lower figure of merit (which is established for source-follower amplifiers). The device power dissipation at a gain of 0.98 is some two orders of magnitude lower than for the MOSFET. Further, CRYOFETs are free of low-temperature I-V characteristic hysteresis and balky conduction turn-on effects and operate effectively in the 2.4 to 20 K range. These devices show promise for use on long-duration sensor missions and for on-focal-plane signal processing at low temperatures.

  18. Work Life Stress and Career Resilience of Licensed Nursing Facility Administrators.

    PubMed

    Myers, Dennis R; Rogers, Rob; LeCrone, Harold H; Kelley, Katherine; Scott, Joel H

    2018-04-01

    Career resilience provided a frame for understanding how Licensed Nursing Facility Administrators (LNFAs) sustain role performance and even thrive in stressful skilled nursing facility work environments. Quantitative and qualitative analyses of in-depth interviews with 18 LNFAs, averaging 24 years of experience, were conducted by a five-member research team. Analysis was informed by evidence-based frameworks for career resilience in the health professions as well as the National Association of Long-Term Care Administrator Boards' (NAB) five domains of competent administrative practice. Findings included six sources of work stressors and six sources of professional satisfaction. Also, participants identified seven strategic principles and 10 administrative practices for addressing major sources of stress. Recommendations are provided for research and evidence-based application of the career resilience perspective to LNFA practice aimed at reducing role abandonment and energizing the delivery of the quality of care that each resident deserves.

  19. A Monte Carlo simulation study for the gamma-ray/neutron dual-particle imager using rotational modulation collimator (RMC).

    PubMed

    Kim, Hyun Suk; Choi, Hong Yeop; Lee, Gyemin; Ye, Sung-Joon; Smith, Martin B; Kim, Geehyun

    2018-03-01

    The aim of this work is to develop a gamma-ray/neutron dual-particle imager, based on rotational modulation collimators (RMCs) and pulse shape discrimination (PSD)-capable scintillators, for possible applications for radioactivity monitoring as well as nuclear security and safeguards. A Monte Carlo simulation study was performed to design an RMC system for the dual-particle imaging, and modulation patterns were obtained for gamma-ray and neutron sources in various configurations. We applied an image reconstruction algorithm utilizing the maximum-likelihood expectation-maximization method based on the analytical modeling of source-detector configurations, to the Monte Carlo simulation results. Both gamma-ray and neutron source distributions were reconstructed and evaluated in terms of signal-to-noise ratio, showing the viability of developing an RMC-based gamma-ray/neutron dual-particle imager using PSD-capable scintillators.
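    A compact sketch of the maximum-likelihood expectation-maximization (MLEM) update used for the reconstructions above, written for a generic system matrix linking image pixels to modulation bins; variable names, the flat initialization and the iteration count are assumptions.

```python
import numpy as np

def mlem(A, counts, n_iter=100):
    """MLEM image reconstruction for Poisson count data.

    A      : (n_bins, n_pixels) system matrix; A[i, j] is the modeled response
             of measurement bin i (e.g. one RMC rotation step) to pixel j.
    counts : (n_bins,) measured modulation pattern.
    """
    x = np.ones(A.shape[1])                 # flat initial image
    sensitivity = A.sum(axis=0)             # column sums
    for _ in range(n_iter):
        proj = np.maximum(A @ x, 1e-12)     # forward projection, guarded
        x *= (A.T @ (counts / proj)) / np.maximum(sensitivity, 1e-12)
    return x
```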

  20. Current switching ratio optimization using dual pocket doping engineering

    NASA Astrophysics Data System (ADS)

    Dash, Sidhartha; Sahoo, Girija Shankar; Mishra, Guru Prasad

    2018-01-01

    This paper presents a smart idea to maximize the current switching ratio of a cylindrical gate tunnel FET (CGT) by growing pocket layers in both the source and channel regions. The pocket layers positioned in the source and channel of the device provide significant improvements in the ON-state and OFF-state currents, respectively. The dual pocket doped cylindrical gate TFET (DP-CGT) exhibits much superior performance in terms of drain current, transconductance and current ratio compared with the conventional CGT, the channel pocket doped CGT (CP-CGT) and the source pocket doped CGT (SP-CGT). Further, the current ratio has been optimized with respect to the width and position of both pocket layers. The much improved current ratio and low power consumption make the proposed device suitable for low-power and high-speed applications. The simulation work on the DP-CGT is done using the 3D Sentaurus TCAD device simulator from Synopsys.

  1. Bayesian estimation of a source term of radiation release with approximately known nuclide ratios

    NASA Astrophysics Data System (ADS)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek

    2016-04-01

    We are concerned with estimation of a source term in the case of an accidental release from a known location, e.g. a power plant. Usually, the source term of an accidental release of radiation comprises a mixture of nuclides. The gamma dose rate measurements do not provide direct information on the source term composition. However, physical properties of the respective nuclides (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g. from a known reactor inventory. The proposed method is based on a linear inverse model where the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned and further regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to uncertainty in the ratios, the diagonal elements of the covariance matrix are considered to be unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following a Bayesian approach, we estimate all parameters of the model from the data, so that y, M, and the known ratios are the only inputs of the method. Since the inference of the model is intractable, we follow the Variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on a simulated 6-hour power plant release where 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with the method assuming unknown nuclide ratios is given to prove the usefulness of the proposed approach. This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
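    A heavily simplified, Gaussian-only sketch of how approximately known nuclide ratios can shape the prior of the linear model y = Mx: the prior mean follows the ratios and the prior variances grow with the ratio uncertainty. The closed-form update and the final clipping at zero are crude stand-ins for the truncated-Gaussian, Variational Bayes machinery described above; all names and values are illustrative.

```python
import numpy as np

def posterior_source_term(M, y, ratios, total_prior, rel_std, obs_var):
    """Gaussian-posterior sketch for a multi-nuclide source term.

    M           : (n_obs, n_nuclides) source-receptor-sensitivity matrix
    ratios      : approximately known nuclide ratios (length n_nuclides)
    total_prior : prior guess of the total released activity
    rel_std     : relative uncertainty assigned to the ratio-based prior
    obs_var     : observation error variance
    """
    ratios = np.asarray(ratios, float)
    x_prior = total_prior * ratios / ratios.sum()
    P = np.diag((rel_std * x_prior) ** 2)            # prior covariance
    R = obs_var * np.eye(len(y))                     # observation covariance
    K = P @ M.T @ np.linalg.inv(M @ P @ M.T + R)     # gain
    x_post = x_prior + K @ (y - M @ x_prior)
    cov_post = (np.eye(len(x_prior)) - K @ M) @ P
    return np.clip(x_post, 0.0, None), cov_post
```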

  2. Euthanasia: an overview and the jewish perspective.

    PubMed

    Gesundheit, Benjamin; Steinberg, Avraham; Glick, Shimon; Or, Reuven; Jotkovitz, Alan

    2006-10-01

    End-of-life care poses fundamental ethical problems to clinicians. Defining euthanasia is a difficult and complex task, which causes confusion in its practical clinical application. Over the course of history, abuse of the term has led to medical atrocities. Familiarity with the relevant bioethical issues and the development of practical guidelines might improve clinical performance. To define philosophical concepts, to present historical events, to discuss the relevant attitudes in modern bioethics and law that may be helpful in elaborating practical guidelines for clinicians regarding euthanasia and end-of-life care. Concepts found in the classic sources of Jewish tradition might shed additional light on the issue and help clinicians in their decision-making process. An historical overview defines the concepts of active versus passive euthanasia, physician-assisted suicide and related terms. Positions found in classical Jewish literature are presented and analyzed with their later interpretations. The relevance and application in modern clinical medicine of both the general and Jewish approaches are discussed. The overview of current bioethical concepts demonstrates the variety of approaches in western culture and legal systems. Philosophically and conceptually, there is a crucial distinction between active and passive euthanasia. The legitimacy of active euthanasia has been the subject of major controversy in recent times in various countries and religious traditions. The historical overview and the literature review demonstrate the need to provide clearer definitions of the concepts relating to euthanasia, for in the past the term has led to major confusion and uncontrolled abuse. Bioethical topics should, therefore, be included in medical training and continuing education. There are major debates and controversies regarding the current clinical and legal approaches. We trust that classical Jewish sources might contribute to the establishment of clinical definitions, meaningful approaches and practical guidelines for clinicians.

  3. Recent Advances in Omega-3: Health Benefits, Sources, Products and Bioavailability

    PubMed Central

    Nichols, Peter D.; McManus, Alexandra; Krail, Kevin; Sinclair, Andrew J.; Miller, Matt

    2014-01-01

    The joint symposium of The Omega-3 Centre and the Australasian Section American Oil Chemists Society; Recent Advances in Omega-3: Health Benefits, Sources, Products and Bioavailability, was held November 7, 2013 in Newcastle, NSW, Australia. Over 115 attendees received new information on a range of health benefits, aquaculture as a sustainable source of supply, and current and potential new and novel sources of these essential omega-3 long-chain (LC, ≥C20) polyunsaturated fatty acid nutrients (also termed LC omega-3). The theme of “Food versus Fuel” was an inspired way to present a vast array of emerging and ground breaking Omega-3 research that has application across many disciplines. Eleven papers submitted following the Omega-3 Symposium are published in this Special Issue volume, with topics covered including: an update on the use of the Omega-3 Index (O3I), the effects of dosage and concurrent intake of vitamins/minerals on omega-3 incorporation into red blood cells, the possible use of the O3I as a measure of risk for adiposity, the need for and progress with new land plant sources of docosahexaenoic acid (DHA, 22:6ω3), the current status of farmed Australian and New Zealand fish, and also supplements, in terms of their LC omega-3 and persistent organic pollutants (POP) content, progress with cheap carbon sources in the culture of DHA-producing single cell organisms, a detailed examination of the lipids of the New Zealand Greenshell mussel, and a pilot investigation of the purification of New Zealand hoki liver oil by short path distillation. The selection of papers in this Special Issue collectively highlights a range of forward-looking, new and positive scientific outcomes occurring in the omega-3 field. PMID:25255830

  4. Predicting Near-Term Water Quality from Satellite Observations of Watershed Conditions

    NASA Astrophysics Data System (ADS)

    Weiss, W. J.; Wang, L.; Hoffman, K.; West, D.; Mehta, A. V.; Lee, C.

    2017-12-01

    Despite the strong influence of watershed conditions on source water quality, most water utilities and water resource agencies do not currently have the capability to monitor watershed sources of contamination with great temporal or spatial detail. Typically, knowledge of source water quality is limited to periodic grab sampling; automated monitoring of a limited number of parameters at a few select locations; and/or monitoring relevant constituents at a treatment plant intake. While important, such observations are not sufficient to inform proactive watershed or source water management at a monthly or seasonal scale. Satellite remote sensing data on the other hand can provide a snapshot of an entire watershed at regular, sub-monthly intervals, helping analysts characterize watershed conditions and identify trends that could signal changes in source water quality. Accordingly, the authors are investigating correlations between satellite remote sensing observations of watersheds and source water quality, at a variety of spatial and temporal scales and lags. While correlations between remote sensing observations and direct in situ measurements of water quality have been well described in the literature, there are few studies that link remote sensing observations across a watershed with near-term predictions of water quality. In this presentation, the authors will describe results of statistical analyses and discuss how these results are being used to inform development of a desktop decision support tool to support predictive application of remote sensing data. Predictor variables under evaluation include parameters that describe vegetative conditions; parameters that describe climate/weather conditions; and non-remote sensing, in situ measurements. Water quality parameters under investigation include nitrogen, phosphorus, organic carbon, chlorophyll-a, and turbidity.

  5. Recent advances in omega-3: Health Benefits, Sources, Products and Bioavailability.

    PubMed

    Nichols, Peter D; McManus, Alexandra; Krail, Kevin; Sinclair, Andrew J; Miller, Matt

    2014-09-16

    The joint symposium of The Omega-3 Centre and the Australasian Section American Oil Chemists Society; Recent Advances in Omega-3: Health Benefits, Sources, Products and Bioavailability, was held November 7, 2013 in Newcastle, NSW, Australia. Over 115 attendees received new information on a range of health benefits, aquaculture as a sustainable source of supply, and current and potential new and novel sources of these essential omega-3 long-chain (LC, ≥ C20) polyunsaturated fatty acid nutrients (also termed LC omega-3). The theme of "Food versus Fuel" was an inspired way to present a vast array of emerging and ground breaking Omega-3 research that has application across many disciplines. Eleven papers submitted following the Omega-3 Symposium are published in this Special Issue volume, with topics covered including: an update on the use of the Omega-3 Index (O3I), the effects of dosage and concurrent intake of vitamins/minerals on omega-3 incorporation into red blood cells, the possible use of the O3I as a measure of risk for adiposity, the need for and progress with new land plant sources of docosahexaenoic acid (DHA, 22:6ω3), the current status of farmed Australian and New Zealand fish, and also supplements, in terms of their LC omega-3 and persistent organic pollutants (POP) content, progress with cheap carbon sources in the culture of DHA-producing single cell organisms, a detailed examination of the lipids of the New Zealand Greenshell mussel, and a pilot investigation of the purification of New Zealand hoki liver oil by short path distillation. The selection of papers in this Special Issue collectively highlights a range of forward-looking, new and positive scientific outcomes occurring in the omega-3 field.

  6. Investigation of Magnetotelluric Source Effect Based on Twenty Years of Telluric and Geomagnetic Observation

    NASA Astrophysics Data System (ADS)

    Kis, A.; Lemperger, I.; Wesztergom, V.; Menvielle, M.; Szalai, S.; Novák, A.; Hada, T.; Matsukiyo, S.; Lethy, A. M.

    2016-12-01

    The magnetotelluric method is widely applied for investigation of subsurface structures by imaging the spatial distribution of electric conductivity. The method is based on the experimental determination of the surface electromagnetic impedance tensor (Z) from surface geomagnetic and telluric registrations in two perpendicular orientations. In practical explorations, accurate estimation of Z necessitates the application of robust statistical methods for two reasons: 1) the geomagnetic and telluric time series are contaminated by man-made noise components, and 2) the non-homogeneous behavior of ionospheric current systems in the period range of interest (ELF-ULF and longer periods) results in systematic deviation of the impedance of individual time windows. Robust statistics mitigate both effects on Z for the purpose of subsurface investigations. However, accurate analysis of the long-term temporal variation of the first and second statistical moments of Z may provide valuable information about the characteristics of the ionospheric source current systems. Temporal variation of the extent, spatial variability and orientation of the ionospheric source currents has specific effects on the surface impedance tensor. The twenty-year-long geomagnetic and telluric recordings of the Nagycenk Geophysical Observatory provide a unique opportunity to reconstruct the so-called magnetotelluric source effect and obtain information about the spatial and temporal behavior of ionospheric source currents at mid-latitudes. A detailed investigation of the time series of the surface electromagnetic impedance tensor has been carried out in different frequency classes of the ULF range. The presentation aims to provide a brief review of our results related to long-term periodic modulations, up to the solar-cycle scale, and to eventual deviations of the electromagnetic impedance and hence of the reconstructed equivalent ionospheric source effects.
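    For context, a minimal per-frequency estimate of the impedance tensor Z from windowed Fourier coefficients of the telluric (E) and geomagnetic (B) fields; observatory practice replaces this ordinary least squares with the robust or remote-reference estimators the abstract refers to.

```python
import numpy as np

def estimate_impedance(Ex, Ey, Bx, By):
    """Least-squares estimate of the 2x2 impedance tensor Z at one frequency,
    from Fourier coefficients of many time windows obeying
        [Ex, Ey]^T = Z [Bx, By]^T   (per window).
    Inputs are complex arrays of shape (n_windows,)."""
    B = np.column_stack([Bx, By])               # (n_windows, 2)
    E = np.column_stack([Ex, Ey])               # (n_windows, 2)
    Zt, *_ = np.linalg.lstsq(B, E, rcond=None)  # solve B Z^T = E
    return Zt.T                                 # rows (Zxx, Zxy), (Zyx, Zyy)
```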

  7. Reuse: A knowledge-based approach

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui

    1992-01-01

    This paper describes our research in automating the reuse process through the use of application domain models. Application domain models are explicit formal representations of the application knowledge necessary to understand, specify, and generate application programs. Furthermore, they provide a unified repository for the operational structure, rules, policies, and constraints of a specific application area. In our approach, domain models are expressed in terms of a transaction-based meta-modeling language. This paper has described in detail the creation and maintenance of hierarchical structures. These structures are created through a process that includes reverse engineering of data models with supplementary enhancement from application experts. Source code is also reverse engineered but is not a major source of domain model instantiation at this time. In the second phase of the software synthesis process, program specifications are interactively synthesized from an instantiated domain model. These specifications are currently integrated into a manual programming process but will eventually be used to derive executable code with mechanically assisted transformations. This research is performed within the context of programming-in-the-large types of systems. Although our goals are ambitious, we are implementing the synthesis system in an incremental manner through which we can realize tangible results. The client/server architecture is capable of supporting 16 simultaneous X/Motif users and tens of thousands of attributes and classes. Domain models have been partially synthesized from five different application areas. As additional domain models are synthesized and additional knowledge is gathered, we will inevitably add to and modify our representation. However, our current experience indicates that it will scale and expand to meet our modeling needs.

  8. The myths of 'big data' in health care.

    PubMed

    Jacofsky, D J

    2017-12-01

    'Big data' is a term for data sets that are so large or complex that traditional data processing applications are inadequate. Billions of dollars have been spent on attempts to build predictive tools from large sets of poorly controlled healthcare metadata. Companies often sell reports at a physician or facility level based on various flawed data sources, and comparative websites of 'publicly reported data' purport to educate the public. Physicians should be aware of concerns and pitfalls seen in such data definitions, data clarity, data relevance, data sources and data cleaning when evaluating analytic reports from metadata in health care. Cite this article: Bone Joint J 2017;99-B:1571-6. ©2017 The British Editorial Society of Bone & Joint Surgery.

  9. Biomass burning contributions estimated by synergistic coupling of daily and hourly aerosol composition records.

    PubMed

    Nava, S; Lucarelli, F; Amato, F; Becagli, S; Calzolai, G; Chiari, M; Giannoni, M; Traversi, R; Udisti, R

    2015-04-01

    Biomass burning (BB) is a significant source of particulate matter (PM) in many parts of the world. Whereas numerous studies demonstrate the relevance of BB emissions in central and northern Europe, the quantification of this source has been assessed only in few cities in southern European countries. In this work, the application of Positive Matrix Factorisation (PMF) allowed a clear identification and quantification of an unexpected very high biomass burning contribution in Tuscany (central Italy), in the most polluted site of the PATOS project. In this urban background site, BB accounted for 37% of the mass of PM10 (particulate matter with aerodynamic diameter<10 μm) as annual average, and more than 50% during winter, being the main cause of all the PM10 limit exceedances. Due to the chemical complexity of BB emissions, an accurate assessment of this source contribution is not always easily achievable using just a single tracer. The present work takes advantage of the combination of a long-term daily data-set, characterized by an extended chemical speciation, with a short-term high time resolution (1-hour) and size-segregated data-set, obtained by PIXE analyses of streaker samples. The hourly time pattern of the BB source, characterised by a periodic behaviour with peaks starting at about 6 p.m. and lasting all the evening-night, and its strong seasonality, with higher values in the winter period, clearly confirmed the hypothesis of a domestic heating source (also excluding important contributions from wildfires and agricultural wastes burning). Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Fabrication and In Situ Testing of Scalable Nitrate-Selective Electrodes for Distributed Observations

    NASA Astrophysics Data System (ADS)

    Harmon, T. C.; Rat'ko, A.; Dietrich, H.; Park, Y.; Wijsboom, Y. H.; Bendikov, M.

    2008-12-01

    Inorganic nitrogen (nitrate (NO3-) and ammonium (NH4+)) from chemical fertilizer and livestock waste is a major source of pollution in groundwater, surface water and the air. While some sources of these chemicals, such as waste lagoons, are well-defined, their application as fertilizer has the potential to create distributed or non-point source pollution problems. Scalable nitrate sensors (small and inexpensive) would enable us to better assess non-point source pollution processes in agronomic soils, groundwater and rivers subject to non-point source inputs. This work describes the fabrication and testing of inexpensive PVC-membrane-based ion-selective electrodes (ISEs) for monitoring nitrate levels in soil water environments. ISE-based sensors have the advantages of being easy to fabricate and use, but suffer several shortcomings, including limited sensitivity, poor precision, and calibration drift. However, modern materials have begun to yield more robust ISE types in laboratory settings. This work emphasizes the in situ behavior of commercial and fabricated sensors in soils subject to irrigation with dairy manure water. Results are presented in the context of deployment techniques (in situ versus soil lysimeters), temperature compensation, and uncertainty analysis. Observed temporal responses of the nitrate sensors exhibited diurnal cycling with elevated nitrate levels at night and depressed levels during the day. Conventional samples collected via lysimeters validated this response. It is concluded that while modern ISEs are not yet ready for long-term, unattended deployment, short-term installations (on the order of 2 to 4 days) are viable and may provide valuable insights into nitrogen dynamics in complex soil systems.
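    A small sketch of the Nernstian calibration such ion-selective electrodes rely on, including the rescaling of the slope with absolute temperature that a temperature-compensation step would apply; the slope, intercept and example numbers are illustrative, not those of the sensors tested here.

```python
def nitrate_activity(E_mV, E0_mV, slope_mV_per_decade=-56.0, temp_C=25.0):
    """Convert an ISE potential to nitrate activity via  E = E0 + S*log10(a).

    The anion slope is negative; it is rescaled from 25 C to the measurement
    temperature in proportion to absolute temperature."""
    slope = slope_mV_per_decade * (273.15 + temp_C) / 298.15
    return 10.0 ** ((E_mV - E0_mV) / slope)

# Example: a reading of 150 mV against a 94 mV calibration intercept, at 18 C.
print(nitrate_activity(E_mV=150.0, E0_mV=94.0, temp_C=18.0))
```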

  11. A Cloud-Based Infrastructure for Near-Real-Time Processing and Dissemination of NPP Data

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Valente, E. G.; Chettri, S. S.

    2011-12-01

    We are building a scalable cloud-based infrastructure for generating and disseminating near-real-time data products from a variety of geospatial and meteorological data sources, including the new National Polar-Orbiting Environmental Satellite System (NPOESS) Preparatory Project (NPP). Our approach relies on linking Direct Broadcast and other data streams to a suite of scientific algorithms coordinated by NASA's International Polar-Orbiter Processing Package (IPOPP). The resulting data products are directly accessible to a wide variety of end-user applications, via industry-standard protocols such as OGC Web Services, Unidata Local Data Manager, or OPeNDAP, using open source software components. The processing chain employs on-demand computing resources from Amazon.com's Elastic Compute Cloud and NASA's Nebula cloud services. Our current prototype targets short-term weather forecasting, in collaboration with NASA's Short-term Prediction Research and Transition (SPoRT) program and the National Weather Service. Direct Broadcast is especially crucial for NPP, whose current ground segment is unlikely to deliver data quickly enough for short-term weather forecasters and other near-real-time users. Direct Broadcast also allows full local control over data handling, from the receiving antenna to end-user applications: this provides opportunities to streamline processes for data ingest, processing, and dissemination, and thus to make interpreted data products (Environmental Data Records) available to practitioners within minutes of data capture at the sensor. Cloud computing lets us grow and shrink computing resources to meet large and rapid fluctuations in data availability (twice daily for polar orbiters) - and similarly large fluctuations in demand from our target (near-real-time) users. This offers a compelling business case for cloud computing: the processing or dissemination systems can grow arbitrarily large to sustain near-real time data access despite surges in data volumes or user demand, but that computing capacity (and hourly costs) can be dropped almost instantly once the surge passes. Cloud computing also allows low-risk experimentation with a variety of machine architectures (processor types; bandwidth, memory, and storage capacities, etc.) and of system configurations (including massively parallel computing patterns). Finally, our service-based approach (in which user applications invoke software processes on a Web-accessible server) facilitates access into datasets of arbitrary size and resolution, and allows users to request and receive tailored products on demand. To maximize the usefulness and impact of our technology, we have emphasized open, industry-standard software interfaces. We are also using and developing open source software to facilitate the widespread adoption of similar, derived, or interoperable systems for processing and serving near-real-time data from NPP and other sources.

  12. Computation of nonlinear ultrasound fields using a linearized contrast source method.

    PubMed

    Verweij, Martin D; Demi, Libertario; van Dongen, Koen W A

    2013-08-01

    Nonlinear ultrasound is important in medical diagnostics because imaging of the higher harmonics improves resolution and reduces scattering artifacts. Second harmonic imaging is currently standard, and higher harmonic imaging is under investigation. The efficient development of novel imaging modalities and equipment requires accurate simulations of nonlinear wave fields in large volumes of realistic (lossy, inhomogeneous) media. The Iterative Nonlinear Contrast Source (INCS) method has been developed to deal with spatiotemporal domains measuring hundreds of wavelengths and periods. This full wave method considers the nonlinear term of the Westervelt equation as a nonlinear contrast source, and solves the equivalent integral equation via the Neumann iterative solution. Recently, the method has been extended with a contrast source that accounts for spatially varying attenuation. The current paper addresses the problem that the Neumann iterative solution converges badly for strong contrast sources. The remedy is linearization of the nonlinear contrast source, combined with application of more advanced methods for solving the resulting integral equation. Numerical results show that linearization in combination with a Bi-Conjugate Gradient Stabilized method allows the INCS method to deal with fairly strong, inhomogeneous attenuation, while the error due to the linearization can be eliminated by restarting the iterative scheme.
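    A one-dimensional analogue of the idea above: once the contrast source is linearized, the field satisfies a linear integral equation that can be handed to a Krylov solver (here BiCGSTAB) instead of the Neumann series. The Green's function, contrast profile and grid are illustrative and far simpler than the 3-D lossy media treated by INCS.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, bicgstab

# Grid, wavenumber and a contrast (e.g. attenuation) region -- all illustrative.
n, dx = 400, 1e-3
k = 2 * np.pi / 10e-3                               # 10 mm wavelength
x = np.arange(n) * dx
chi = np.where((x > 0.15) & (x < 0.25), 0.2, 0.0)   # contrast profile
u_inc = np.exp(1j * k * x)                          # incident plane wave

# 1-D Helmholtz Green's function, discretized as a dense kernel matrix.
g = np.exp(1j * k * np.abs(x[:, None] - x[None, :])) / (2j * k) * dx

def apply_operator(u):
    # (I - G k^2 chi) u : the linearized contrast-source integral operator
    return u - g @ (k**2 * chi * u)

A = LinearOperator((n, n), matvec=apply_operator, dtype=complex)
u, info = bicgstab(A, u_inc)
print(info)   # 0 indicates convergence
```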

  13. An experimental study on the near-source region of lazy turbulent plumes

    NASA Astrophysics Data System (ADS)

    Ciriello, Francesco; Hunt, Gary R.

    2017-11-01

    The near-source region of a `lazy' turbulent buoyant plume issuing from a circular source is examined for source Richardson numbers in the range of 10¹ to 10⁷. New data is acquired for the radial contraction and streamwise variation of volume flux through an experimental programme of dye visualisations and particle image velocimetry. This data reveals the limited applicability of traditional entrainment laws used in integral modelling approaches for the description of the near-source region for these source Richardson numbers. A revised entrainment function is proposed, based on which we introduce a classification of plume behaviour whereby the degree of `laziness' may be expressed in terms of the excess dilution that occurs compared to a `pure' constant Richardson number plume. The increased entrainment measured in lazy plumes is attributed to Rayleigh-Taylor instabilities developing along the contraction of the plume which promote the additional engulfment of ambient fluid into the plume. This work was funded by an EPSRC Industrial Case Award sponsored by Dyson Technology Ltd. Special thanks go to the members of the Dyson Environmental Control Group that regularly visit us in Cambridge for discussions about our work.

  14. Block 4 solar cell module design and test specification for residential applications

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Near-term design, qualification and acceptance requirements are provided for terrestrial solar cell modules suitable for incorporation in photovoltaic power sources (2 kW to 10 kW) applied to single family residential installations. Requirement levels and recommended design limits for selected performance criteria are specified for modules intended principally for rooftop installations. Modules satisfying the requirements of this specification fall into one of two categories, residential panel or residential shingle, both meeting general performance requirements plus additional category peculiar constraints.

  15. The NASA Space Radiation Health Program

    NASA Technical Reports Server (NTRS)

    Schimmerling, W.; Sulzman, F. M.

    1994-01-01

    The NASA Space Radiation Health Program is a part of the Life Sciences Division in the Office of Space Science and Applications (OSSA). The goal of the Space Radiation Health Program is development of scientific bases for assuring adequate radiation protection in space. A proposed research program will determine long-term health risks from exposure to cosmic rays and other radiation. Ground-based animal models will be used to predict risk of exposures at varying levels from various sources and the safe levels for manned space flight.

  16. Validating the Usefulness of Combined Japanese GMS Data For Long-Term Global Change Studies

    NASA Technical Reports Server (NTRS)

    Simpson, James J.; Dodge, James C. (Technical Monitor)

    2001-01-01

    The primary objectives of the Geostationary Meteorological Satellite (GMS)-5 Pathfinder Project were the following: (1) to evaluate GMS-5 data for sources of error and develop methods for minimizing any such errors in GMS-5 data; (2) to prepare a GMS-5 Pathfinder data set for the GMS-5 Pathfinder Benchmark Period (1 July 95 - 30 June 96); and (3) show the usefulness of the improved Pathfinder data set in at least one geophysical application. All objectives were met.

  17. Analysis of the Performance of Heat Pipes and Phase-Change Materials with Multiple Localized Heat Sources for Space Applications

    DTIC Science & Technology

    1989-05-01

    [Garbled report excerpt - recoverable fragments: the chapter heading "Numerical Analysis of Stefan Problems for Generalized Multi-Dimensional Phase-Change Structures Using the Enthalpy Transforming Model" (with a "4.1 Summary" subsection) and part of the report nomenclature, including the Stefan number St = c_s(T_m - T_w)/H or c_s(T_m - T_i)/H, the circumferential distance coordinate s (m), a dimensionless interface position, the fluid density (kg/m^3), the viscous dissipation term in the energy equation (1.4), the dimensionless time tau = t*alpha/L^2, and the Stefan-Boltzmann constant.]

  18. Ray-optical theory of broadband partially coherent emission

    NASA Astrophysics Data System (ADS)

    Epstein, Ariel; Tessler, Nir; Einziger, Pinchas D.

    2013-04-01

    We present a rigorous formulation of the effects of spectral broadening on emission of partially coherent source ensembles embedded in multilayered formations with arbitrarily shaped interfaces, provided geometrical optics is valid. The resulting ray-optical theory, applicable to a variety of optical systems from terahertz lenses to photovoltaic cells, quantifies the fundamental interplay between bandwidth and layer dimensions, and sheds light on common practices in optical analysis of statistical fields, e.g., disregarding multiple reflections or neglecting interference cross terms.

  19. Application of Electrical Resistivity Method (ERM) in Groundwater Exploration

    NASA Astrophysics Data System (ADS)

    Izzaty Riwayat, Akhtar; Nazri, Mohd Ariff Ahmad; Hazreek Zainal Abidin, Mohd

    2018-04-01

    Geophysical methods, once the domain of geophysicists, have become some of the most popular methods applied by engineers in civil engineering fields. The Electrical Resistivity Method (ERM) is a geophysical tool that offers a very attractive technique for subsurface profile characterization over large areas. An applicable alternative technique in groundwater exploration such as ERM, complementing existing conventional methods, may produce comprehensive and convincing output and is therefore effective in terms of cost, time, data coverage and sustainability. ERM has been applied in a variety of groundwater exploration applications. Over the years, conventional methods such as excavation and test boring have been the tools used to obtain information on the earth layers, especially during site investigation. There are several problems with these conventional techniques, as they provide information only at the actual drilling points. This review paper was carried out to present the application of ERM in groundwater exploration. Results from ERM can provide additional information to the respective experts for their problem solving, such as information on groundwater pollution, leachate, and underground sources of water supply.
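    As a pointer to how ERM field readings are reduced, the sketch below converts a single Wenner-array measurement into an apparent resistivity using the standard geometric factor 2*pi*a; the example numbers are illustrative.

```python
import numpy as np

def wenner_apparent_resistivity(a, delta_v, current):
    """Apparent resistivity (ohm-m) for a Wenner array:
    rho_a = 2 * pi * a * (delta_v / current), with electrode spacing a in m,
    measured potential difference delta_v in V and injected current in A."""
    return 2.0 * np.pi * a * delta_v / current

# Example: 10 m spacing, 0.25 V measured at 0.1 A  ->  about 157 ohm-m.
print(wenner_apparent_resistivity(10.0, 0.25, 0.1))
```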

  20. Long range laser propagation: power scaling and beam quality issues

    NASA Astrophysics Data System (ADS)

    Bohn, Willy L.

    2010-09-01

    This paper will address long range laser propagation applications where power and, in particular, beam quality issues play a major role. Hereby the power level is defined by the specific mission under consideration. I restrict myself to the following application areas: (1) remote sensing/space-based LIDAR, (2) space debris removal, (3) energy transmission, and (4) directed energy weapons. Typical examples for space based LIDARs are the ADM Aeolus ESA mission using the ALADIN Nd:YAG laser with its third harmonic at 355 nm and the NASA 2 μm Tm:Ho:LuLiF convectively cooled solid state laser. Space debris removal has attracted more attention in the last years due to the dangerous accumulation of debris in orbit, which has become a threat to satellites and the ISS space station. High power, high brightness lasers may contribute to solving this problem by partially ablating the debris material and hence generating an impulse which will eventually de-orbit the debris, leading to their subsequent disintegration in the lower atmosphere. Energy transmission via laser beam from space to earth has long been discussed as a novel long-term approach to solving the energy problem on earth. In addition, orbital transfer and station-keeping are among the more mid-term applications of high power laser beams. Finally, directed energy weapons are becoming closer to reality as corresponding laser sources have matured due to recent efforts in the JHPSSL program. All of this can only be realized if the laser sources fulfill the necessary power requirements while keeping the beam quality as close as possible to the diffraction-limited value. This is the rationale and motivation of this paper.

  1. Design of a Nutrient Reclamation System for the Cultivation of Microalgae for Biofuel Production and Other Industrial Applications

    NASA Astrophysics Data System (ADS)

    Sandefur, Heather Nicole

    Microalgal biomass has been identified as a promising feedstock for a number of industrial applications, including the synthesis of new pharmaceutical and biofuel products. However, there are several economic limitations associated with the scale up of existing algal production processes. Critical economic studies of algae-based industrial processes highlight the high cost of supplying essential nutrients to microalgae cultures. With microalgae cells having relatively high nitrogen contents (4 to 8%), the N fertilizer cost in industrial-scale production is significant. In addition, the disposal of the large volumes of cell residuals that are generated during product extraction stages can pose other economic challenges. While waste streams can provide a concentrated source of nutrients, concerns about the presence of biological contaminants and the expense of heat treatment pose challenges to processes that use wastewater as a nutrient source in microalgae cultures. The goal of this study was to evaluate the potential application of ultrafiltration technology to aid in the utilization of agricultural wastewater in the cultivation of a high-value microalgae strain. An ultrafiltration system was used to remove inorganic solids and biological contaminants from wastewater taken from a swine farm in Savoy, Arkansas. The permeate from the system was then used as the nutrient source for the cultivation of the marine microalgae Porphyridium cruentum. During the ultrafiltration system operation, little membrane fouling was observed, and permeate fluxes remained relatively constant during both short-term and long-term tests. The complete rejection of E. coli and coliforms from the wastewater was also observed, in addition to a 75% reduction in total solids, including inorganic materials. The processed permeate was shown to have very high concentrations of total nitrogen (695.6 mg L-1) and total phosphorus (69.1 mg L-1 ). In addition, the growth of P. cruentum was analyzed in a medium containing swine waste permeate, and was compared to P. cruentum growth in a control medium. A higher biomass productivity, lipid productivity, and lipid content were observed in the microalgae cultivated in the swine waste medium compared to that of the control medium. These results suggest that, through the use of ultrafiltration technology as an alternative to traditional heat treatment, agricultural wastewaters could be effectively utilized as a nutrient source for microalgae cultivation.

  2. Evaluation of the performance of small diode pumped UV solid state (DPSS) Nd:YAG lasers as new radiation sources for atmospheric pressure laser ionization mass spectrometry (APLI-MS).

    PubMed

    Kersten, Hendrik; Lorenz, Matthias; Brockmann, Klaus J; Benter, Thorsten

    2011-06-01

    The performance of a KrF* bench-top excimer laser and a compact diode pumped UV solid state (DPSS) Nd:YAG laser as the photo-ionizing source in LC-APLI MS is compared. The commonly applied bench-top excimer laser, operating at 248 nm, provides power densities in the low MW/cm² range on an illuminated area of 0.5 cm² (8 mJ/pulse, 5 ns pulse duration, beam waist area 0.5 cm², 3 MW/cm²). The DPSS laser, operating at 266 nm, provides higher power densities, however, on a two orders of magnitude smaller illuminated area (60 μJ/pulse, 1 ns pulse duration, beam waist area 2 × 10⁻³ cm², 30 MW/cm²). In a common LC-APLI MS setup with direct infusion of a 10 nM pyrene solution, the DPSS laser yields a significantly smaller ion signal (0.9%) and signal-to-noise ratio (1.4%) compared with the excimer laser. With respect to the determined low detection limits (LODs) for PAHs of 0.1 fmol using an excimer laser, LODs in DPSS laser LC-APLI MS in the low pmol regime are expected. The advantages of the DPSS laser with respect to applicability (size, cost, simplicity) may render this light source the preferred one for APLI applications not focusing on ultimately high sensitivities. Furthermore, the impact of adjustable ion source parameters on the performance of both laser systems is discussed in terms of the spatial sensitivity distribution described by the distribution of ion acceptance (DIA) measurements. Perspectives concerning the impact on future APLI-MS applications are given.

  3. SU-E-T-547: Rotating Shield Brachytherapy (RSBT) for Cervical Cancer.

    PubMed

    Yang, W; Kim, Y; Liu, Y; Wu, X; Flynn, R

    2012-06-01

    To assess rotating shield brachytherapy (RSBT) delivered with an electronic brachytherapy (eBT) source, compared to intracavitary (IC) and intracavitary plus supplemental interstitial brachytherapy (IC+IS BT) delivered with a conventional isotope radiation source. IC, IC+IS, and RSBT plans were simulated for 5 patients with advanced cervical cancer (>40 cc). One BT plan for each patient (fraction 1) guided by magnetic resonance imaging (MRI) was used in our treatment planning system (TPS). A bio- and MRI-compatible polycarbonate (Makrolon Rx3158) intrauterine applicator was simulated for IC and RSBT, and the Vienna applicator was simulated for IC+IS BT. 192Ir was used as the radiation source for IC and IC+IS BT; the Xoft Axxent™ eBT source was used for RSBT. A 0.5 mm thick tungsten shield was used for RSBT with different azimuthal and zenith angles. The total dose for each plan was escalated as the external beam radiation therapy (EBRT) dose plus the BT dose times the fraction number (5 in our case). RSBT and IC+IS BT had higher dose conformity in terms of D90 than IC BT for all the patients. The advantage of RSBT over IC+IS BT was dependent on the shield emission angle, tumor shape, and tandem applicator location. The delivery time of RSBT increased as a finer emission angle was selected. RSBT is a less-invasive potential alternative to conventional IC and IC+IS BT for treating bulky (>40 cc) cervical cancer. RSBT can provide better treatment outcomes with a clinically acceptable increase in delivery time if a proper emission angle is selected based on the tumor shape and tandem applicator location. Supported in part by NSF grants CCF-0830402 and CCF-0844765, the NIH grant K25-CA123112, and an American Cancer Society seed grant (IRG-77-004-31). © 2012 American Association of Physicists in Medicine.

  4. Sensitivity Analysis Tailored to Constrain 21st Century Terrestrial Carbon-Uptake

    NASA Astrophysics Data System (ADS)

    Muller, S. J.; Gerber, S.

    2013-12-01

    The long-term fate of terrestrial carbon (C) in response to climate change remains a dominant source of uncertainty in Earth-system model projections. Increasing atmospheric CO2 could be mitigated by long-term net uptake of C, through processes such as increased plant productivity due to "CO2-fertilization". Conversely, atmospheric conditions could be exacerbated by long-term net release of C, through processes such as increased decomposition due to higher temperatures. This balance is an important area of study, and a major source of uncertainty in long-term (>year 2050) projections of planetary response to climate change. We present results from an innovative application of sensitivity analysis to LM3V, a dynamic global vegetation model (DGVM), intended to identify observed/observable variables that are useful for constraining long-term projections of C-uptake. We analyzed the sensitivity of cumulative C-uptake by 2100, as modeled by LM3V in response to IPCC AR4 scenario climate data (1860-2100), to perturbations in over 50 model parameters. We concurrently analyzed the sensitivity of over 100 observable model variables, during the extant record period (1970-2010), to the same parameter changes. By correlating the sensitivities of observable variables with the sensitivity of long-term C-uptake we identified model calibration variables that would also constrain long-term C-uptake projections. LM3V employs a coupled carbon-nitrogen cycle to account for N-limitation, and we find that N-related variables have an important role to play in constraining long-term C-uptake. This work has implications for prioritizing field campaigns to collect global data that can help reduce uncertainties in the long-term land-atmosphere C-balance. Though results of this study are specific to LM3V, the processes that characterize this model are not completely divorced from other DGVMs (or reality), and our approach provides valuable insights into how data can be leveraged to better constrain projections for the land carbon sink.
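    The correlation step described above can be summarized in a few lines. The following sketch assumes precomputed sensitivity arrays (random stand-ins here) rather than actual LM3V runs; the array names and sizes are illustrative only.

```python
import numpy as np

# Minimal sketch of the screening idea: perturb each model parameter, record
# (a) the change in cumulative C-uptake by 2100 and (b) the change in each
# observable variable over the historical period, then rank observables by how
# well their sensitivities track the C-uptake sensitivity across parameters.
# The arrays below are hypothetical stand-ins for real model output.

rng = np.random.default_rng(0)
n_params, n_obs = 50, 100
dC_uptake = rng.normal(size=n_params)               # sensitivity of 2100 C-uptake per parameter
dObservables = rng.normal(size=(n_params, n_obs))   # sensitivities of each observable per parameter

corr = np.array([np.corrcoef(dC_uptake, dObservables[:, j])[0, 1] for j in range(n_obs)])
ranking = np.argsort(-np.abs(corr))
print("observables most informative for constraining long-term C-uptake:", ranking[:5])
```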

  5. Piecewise synonyms for enhanced UMLS source terminology integration.

    PubMed

    Huang, Kuo-Chuan; Geller, James; Halper, Michael; Cimino, James J

    2007-10-11

    The UMLS contains more than 100 source vocabularies and is growing via the integration of others. When integrating a new source, the source terms already in the UMLS must first be found. The easiest approach to this is simple string matching. However, string matching usually does not find all concepts that should be found. A new methodology, based on the notion of piecewise synonyms, for enhancing the process of concept discovery in the UMLS is presented. This methodology is supported by first creating a general synonym dictionary based on the UMLS. Each multi-word source term is decomposed into its component words, allowing for the generation of separate synonyms for each word from the general synonym dictionary. The recombination of these synonyms into new terms creates an expanded pool of matching candidates for terms from the source. The methodology is demonstrated with respect to an existing UMLS source. It shows a 34% improvement over simple string matching.
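    As a rough illustration of the recombination idea (not the authors' implementation), the sketch below decomposes a multi-word term into words, substitutes word-level synonyms from a toy dictionary, and enumerates the candidate strings that would then be matched against the UMLS.

```python
from itertools import product

# Piecewise-synonym candidate generation, illustrated with a toy synonym
# dictionary; the real dictionary in the paper is derived from the UMLS itself.

synonyms = {
    "kidney": {"kidney", "renal"},
    "failure": {"failure", "insufficiency"},
    "acute": {"acute"},
}

def piecewise_candidates(term):
    """Expand a multi-word term into all word-by-word synonym recombinations."""
    words = term.lower().split()
    choices = [sorted(synonyms.get(w, {w})) for w in words]
    return {" ".join(combo) for combo in product(*choices)}

print(piecewise_candidates("acute kidney failure"))
# Four candidate strings, including "acute renal insufficiency", which simple
# string matching against the original term would never have produced.
```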

  6. Enriching semantic knowledge bases for opinion mining in big data applications.

    PubMed

    Weichselbraun, A; Gindl, S; Scharl, A

    2014-10-01

    This paper presents a novel method for contextualizing and enriching large semantic knowledge bases for opinion mining with a focus on Web intelligence platforms and other high-throughput big data applications. The method is not only applicable to traditional sentiment lexicons, but also to more comprehensive, multi-dimensional affective resources such as SenticNet. It comprises the following steps: (i) identify ambiguous sentiment terms, (ii) provide context information extracted from a domain-specific training corpus, and (iii) ground this contextual information to structured background knowledge sources such as ConceptNet and WordNet. A quantitative evaluation shows a significant improvement when using an enriched version of SenticNet for polarity classification. Crowdsourced gold standard data in conjunction with a qualitative evaluation sheds light on the strengths and weaknesses of the concept grounding, and on the quality of the enrichment process.
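    Step (i) above can be pictured with a small corpus-statistics sketch; the mini-corpus, threshold and scoring rule here are invented for illustration and are not the authors' method.

```python
from collections import defaultdict
from statistics import pstdev, mean

# Toy illustration of flagging ambiguous sentiment terms: a term whose observed
# polarity swings strongly with context in the training corpus is a candidate for
# contextualization and grounding (steps ii-iii). Corpus and threshold are invented.

observations = [            # (term, context word, document-level polarity in [-1, 1])
    ("cold", "beer", +0.8), ("cold", "beer", +0.6),
    ("cold", "service", -0.7), ("cold", "room", -0.4),
    ("excellent", "food", +0.9), ("excellent", "staff", +0.8),
]

by_term = defaultdict(list)
for term, _ctx, polarity in observations:
    by_term[term].append(polarity)

AMBIGUITY_THRESHOLD = 0.5    # assumed cut-off on the spread of observed polarities
ambiguous = {t for t, vals in by_term.items()
             if pstdev(vals) > AMBIGUITY_THRESHOLD or abs(mean(vals)) < 0.2}
print("candidate ambiguous sentiment terms:", ambiguous)   # {'cold'}
```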

  7. Two Mechanisms Determine Quantum Dot Blinking.

    PubMed

    Yuan, Gangcheng; Gómez, Daniel E; Kirkwood, Nicholas; Boldt, Klaus; Mulvaney, Paul

    2018-04-24

    Many potential applications of quantum dots (QDs) can only be realized once the luminescence from single nanocrystals (NCs) is understood. These applications include the development of quantum logic devices, single-photon sources, long-life LEDs, and single-molecule biolabels. At the single-nanocrystal level, random fluctuations in the QD photoluminescence occur, a phenomenon termed blinking. There are two competing models to explain this blinking: Auger recombination and surface trap induced recombination. Here we use lifetime scaling on core-shell chalcogenide NCs to demonstrate that both types of blinking occur in the same QDs. We prove that Auger-blinking can yield single-exponential on/off times in contrast to earlier work. The surface passivation strategy determines which blinking mechanism dominates. This study summarizes earlier studies on blinking mechanisms and provides some clues that stable single QDs can be engineered for optoelectronic applications.

  8. ALARA implementation throughout project life cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haynes, M.J.

    1995-03-01

    A strength of radiation protection programs generally has been endorsement and application of the ALARA principle. In Ontario Hydro, which currently operates 20 commercial size nuclear units, great strides have been made in the last three decades in reducing occupational radiation exposure per unit of electricity generated. This paper will discuss specific applications of elements of the overall ALARA program which have most contributed to dose reduction as the nuclear program has expanded. This includes such things as management commitment, ALARA application in the design phase and major rehabilitation work, the benefits of the self protection concept, a specific example of elimination (or reduction) of the source term and the importance of dose targets. Finally, it is concluded that the major opportunities for further improvements may lie in the area of information management.

  9. Status of the advanced Stirling conversion system project for 25 kW dish Stirling applications

    NASA Technical Reports Server (NTRS)

    Shaltens, Richard K.; Schreiber, Jeffrey G.

    1991-01-01

    Technology development for Stirling convertors directed toward a dynamic power source for space applications is discussed. Space power requirements include high reliability with very long life, low vibration, and high system efficiency. The free-piston Stirling engine has the potential for future high power space conversion systems, either nuclear or solar powered. Although these applications appear to be quite different, their requirements complement each other. The advanced Stirling conversion system (ASCS) project at NASA Lewis Research Center is described. Each system design features a solar receiver/liquid metal heat transport system and a free-piston Stirling convertor with a means to provide nominally 25 kW of electric power to utility grid while meeting the US Department of Energy (DOE) performance and long term cost goals. The design is compared with other ASCS designs.

  10. Long-term nitrogen fertilization decreases bacterial diversity and favors the growth of Actinobacteria and Proteobacteria in agro-ecosystems across the globe.

    PubMed

    Dai, Zhongmin; Su, Weiqin; Chen, Huaihai; Barberán, Albert; Zhao, Haochun; Yu, Mengjie; Yu, Lu; Brookes, Philip C; Schadt, Christopher W; Chang, Scott X; Xu, Jianming

    2018-04-12

    Long-term elevated nitrogen (N) input from anthropogenic sources may cause soil acidification and decrease crop yield, yet the response of the belowground microbial community to long-term N input alone or in combination with phosphorus (P) and potassium (K) is poorly understood. We explored the effect of long-term N and NPK fertilization on soil bacterial diversity and community composition using meta-analysis of a global dataset. Nitrogen fertilization decreased soil pH, and increased soil organic carbon (C) and available N contents. Bacterial taxonomic diversity was decreased by N fertilization alone, but was increased by NPK fertilization. The effect of N fertilization on bacterial diversity varied with soil texture and water management, but was independent of crop type or N application rate. Changes in bacterial diversity were positively related to both soil pH and organic C content under N fertilization alone, but only to soil organic C under NPK fertilization. Microbial biomass C decreased with decreasing bacterial diversity under long-term N fertilization. Nitrogen fertilization increased the relative abundance of Proteobacteria and Actinobacteria, but reduced the abundance of Acidobacteria, consistent with the general life history strategy theory for bacteria. The positive correlation between N application rate and the relative abundance of Actinobacteria indicates that increased N availability favored the growth of Actinobacteria. This first global analysis of long-term N and NPK fertilization that differentially affects bacterial diversity and community composition provides a reference for nutrient management strategies for maintaining belowground microbial diversity in agro-ecosystems worldwide. © 2018 John Wiley & Sons Ltd.

  11. Frequency standard stability for Doppler measurements on-board the shuttle

    NASA Technical Reports Server (NTRS)

    Harton, P. L.

    1974-01-01

    The short and long term stability characteristics of crystal and atomic standards are described. Emphasis is placed on crystal oscillators because of the selection which was made for the shuttle baseline and the complexities which are introduced by the shuttle environment. Attention is given, first, to the definitions of stability and the application of these definitions to the shuttle system and its mission. Data from time domain measurements are used to illustrate the definitions. Results of a literature survey to determine environmental effects on frequency reference sources are then presented. Finally, methods of standard frequency dissemination over radio frequency carriers are noted as a possible means of measuring absolute accuracy and long term stability characteristics of the one-way Doppler equipment.
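    The time-domain stability definitions referred to above are conventionally expressed through the Allan (two-sample) variance. The sketch below uses synthetic white-frequency noise and an assumed 1 s sampling interval purely for illustration.

```python
import numpy as np

# Non-overlapping Allan deviation from fractional-frequency samples y_k,
# sigma_y^2(tau) = 0.5 * <(ybar_{k+1} - ybar_k)^2>, with tau = m * tau0.
# The synthetic noise below stands in for real oscillator measurements.

def allan_deviation(y, m):
    """Allan deviation at averaging time tau = m * tau0 (tau0 = sample interval)."""
    n = len(y) // m
    y_avg = y[: n * m].reshape(n, m).mean(axis=1)     # block averages over tau
    return np.sqrt(0.5 * np.mean(np.diff(y_avg) ** 2))

rng = np.random.default_rng(1)
y = 1e-11 * rng.normal(size=100_000)                  # synthetic fractional-frequency data
for m in (1, 10, 100, 1000):
    print(f"tau = {m:5d} s   sigma_y(tau) ~ {allan_deviation(y, m):.2e}")
# For white frequency noise sigma_y(tau) falls roughly as tau**(-1/2).
```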

  12. Chemistry and combustion of fit-for-purpose biofuels.

    PubMed

    Rothamer, David A; Donohue, Timothy J

    2013-06-01

    From the inception of internal combustion engines, biologically derived fuels (biofuels) have played a role. Nicolaus Otto ran a predecessor to today's spark-ignition engine with an ethanol fuel blend in 1860. At the 1900 Paris world's fair, Rudolf Diesel ran his engine on peanut oil. Over 100 years of petroleum production has led to consistency and reliability of engines that demand standardized fuels. New biofuels can displace petroleum-based fuels and produce positive impacts on the environment, the economy, and the use of local energy sources. This review discusses the combustion, performance and other requirements of biofuels that will impact their near-term and long-term ability to replace petroleum fuels in transportation applications. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Evaluation of a Consistent LES/PDF Method Using a Series of Experimental Spray Flames

    NASA Astrophysics Data System (ADS)

    Heye, Colin; Raman, Venkat

    2012-11-01

    A consistent method for the evolution of the joint-scalar probability density function (PDF) transport equation is proposed for application to large eddy simulation (LES) of turbulent reacting flows containing evaporating spray droplets. PDF transport equations provide the benefit of including the chemical source term in closed form; however, additional terms describing LES subfilter mixing must be modeled. The recent availability of detailed experimental measurements provides model validation data for a wide range of evaporation rates and combustion regimes, as is well known to occur in spray flames. In this work, the experimental data will be used to investigate the impact of droplet mass loading and evaporation rates on the subfilter scalar PDF shape in comparison with conventional flamelet models. In addition, existing model term closures in the PDF transport equations are evaluated with a focus on their validity in the presence of regime changes.

  14. Seismic interferometry by crosscorrelation and by multidimensional deconvolution: a systematic comparison

    NASA Astrophysics Data System (ADS)

    Wapenaar, Kees; van der Neut, Joost; Ruigrok, Elmer; Draganov, Deyan; Hunziker, Juerg; Slob, Evert; Thorbecke, Jan; Snieder, Roel

    2010-05-01

    In recent years, seismic interferometry (or Green's function retrieval) has led to many applications in seismology (exploration, regional and global), underwater acoustics and ultrasonics. One of the explanations for this broad interest lies in the simplicity of the methodology. In passive data applications a simple crosscorrelation of responses at two receivers gives the impulse response (Green's function) at one receiver as if there were a source at the position of the other. In controlled-source applications the procedure is similar, except that it involves in addition a summation along the sources. It has also been recognized that the simple crosscorrelation approach has its limitations. From the various theoretical models it follows that there are a number of underlying assumptions for retrieving the Green's function by crosscorrelation. The most important assumptions are that the medium is lossless and that the waves are equipartitioned. In heuristic terms the latter condition means that the receivers are illuminated isotropically from all directions, which is for example achieved when the sources are regularly distributed along a closed surface, the sources are mutually uncorrelated and their power spectra are identical. Despite the fact that in practical situations these conditions are at most only partly fulfilled, the results of seismic interferometry are generally quite robust, but the retrieved amplitudes are unreliable and the results are often blurred by artifacts. Several researchers have proposed to address some of the shortcomings by replacing the correlation process by deconvolution. In most cases the employed deconvolution procedure is essentially 1-D (i.e., trace-by-trace deconvolution). This compensates the anelastic losses, but it does not account for the anisotropic illumination of the receivers. To obtain more accurate results, seismic interferometry by deconvolution should acknowledge the 3-D nature of the seismic wave field. Hence, from a theoretical point of view, the trace-by-trace process should be replaced by a full 3-D wave field deconvolution process. Interferometry by multidimensional deconvolution is more accurate than the trace-by-trace correlation and deconvolution approaches but the processing is more involved. In the presentation we will give a systematic analysis of seismic interferometry by crosscorrelation versus multi-dimensional deconvolution and discuss applications of both approaches.
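    As a minimal numerical illustration of the crosscorrelation principle described above (a toy 1-D model, not one of the cited applications), the stacked crosscorrelation over many mutually uncorrelated sources peaks at the inter-receiver traveltime, i.e. it approximates the causal part of the Green's function between the two receivers.

```python
import numpy as np

# Toy 1-D illustration of Green's function retrieval by crosscorrelation.
# All numbers are invented; sources sit on one side of the receivers only, so the
# illumination is one-sided and only the causal arrival is retrieved.

rng = np.random.default_rng(0)
fs, c = 1000.0, 2000.0              # sampling rate (Hz), propagation velocity (m/s)
xA, xB = 100.0, 400.0               # receiver positions (m)
nt, nfft = 4096, 8192
stack = np.zeros(nfft)

for _ in range(200):
    xs = -rng.uniform(500.0, 5000.0)            # random source position, left of both receivers
    src = rng.normal(size=nt)                   # uncorrelated noise source signature
    dA = int(round((xA - xs) / c * fs))         # traveltimes in samples
    dB = int(round((xB - xs) / c * fs))
    uA = np.roll(src, dA)                       # circular shift stands in for propagation delay
    uB = np.roll(src, dB)
    UA, UB = np.fft.rfft(uA, nfft), np.fft.rfft(uB, nfft)
    stack += np.fft.irfft(UB * np.conj(UA), nfft)   # crosscorrelation via FFT, stacked over sources

lag = int(np.argmax(stack))
lag = lag - nfft if lag > nfft // 2 else lag    # fold wrapped negative lags
print(f"retrieved traveltime ~ {lag / fs * 1e3:.0f} ms;"
      f" direct arrival expected at {(xB - xA) / c * 1e3:.0f} ms")
```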

  15. Nitrogen Source and Rate Management Improve Maize Productivity of Smallholders under Semiarid Climates.

    PubMed

    Amanullah; Iqbal, Asif; Ali, Ashraf; Fahad, Shah; Parmar, Brajendra

    2016-01-01

    Nitrogen is one of the most important factors affecting maize (Zea mays L.) yield and income of smallholders under semiarid climates. Field experiments were conducted to investigate the impact of different N-fertilizer sources [urea, calcium ammonium nitrate (CAN), and ammonium sulfate (AS)] and rates (50, 100, 150, and 200 kg ha-1) on number of rows ear-1 (NOR ear-1), number of seeds row-1 (NOS row-1), number of seeds ear-1 (NOS ear-1), number of ears per 100 plants (NOEP 100 plants-1), grain yield plant-1, stover yield (kg ha-1), and shelling percentage (%) of maize genotypes "Local cultivars (Azam and Jalal) vs. hybrid (Pioneer-3025)." The experiment was conducted at the Agronomy Research Farm of the University of Agriculture Peshawar during summers of 2008 (year one) and 2010 (year two). The results revealed that the N treated (rest) plots (the average of all the experimental plots treated with N) had produced higher yield and yield components, and shelling percentage over N-control plots (plots where N was not applied). Application of nitrogen at the higher rate increased yield and yield components in maize (200 > 150 > 100 > 50 kg N ha-1). Application of AS and CAN had more beneficial impact on yield and yield components of maize as compared to urea (AS > CAN > urea). Hybrid maize (P-3025) produced significantly higher yield and yield components as well as higher shelling percentage than the two local cultivars (P-3025 > Jalal = Azam). Application of ammonium sulfate at the rate of 200 kg N ha-1 to hybrid maize was found most beneficial in terms of higher productivity and grower's income in the study area. For the two local cultivars, application of 150 kg N ha-1 was found more beneficial than 120 kg N ha-1 (recommended N rate) in terms of greater productivity and grower's income.

  16. Nitrogen Source and Rate Management Improve Maize Productivity of Smallholders under Semiarid Climates

    PubMed Central

    Amanullah; Iqbal, Asif; Ali, Ashraf; Fahad, Shah; Parmar, Brajendra

    2016-01-01

    Nitrogen is one of the most important factors affecting maize (Zea mays L.) yield and income of smallholders under semiarid climates. Field experiments were conducted to investigate the impact of different N-fertilizer sources [urea, calcium ammonium nitrate (CAN), and ammonium sulfate (AS)] and rates (50, 100, 150, and 200 kg ha−1) on number of rows ear−1 (NOR ear−1), number of seeds row−1 (NOS row−1), number of seeds ear−1 (NOS ear−1), number of ears per 100 plants (NOEP 100 plants−1), grain yield plant−1, stover yield (kg ha−1), and shelling percentage (%) of maize genotypes “Local cultivars (Azam and Jalal) vs. hybrid (Pioneer-3025).” The experiment was conducted at the Agronomy Research Farm of the University of Agriculture Peshawar during summers of 2008 (year one) and 2010 (year two). The results revealed that the N treated (rest) plots (the average of all the experimental plots treated with N) had produced higher yield and yield components, and shelling percentage over N-control plots (plots where N was not applied). Application of nitrogen at the higher rate increased yield and yield components in maize (200 > 150 > 100 > 50 kg N ha−1). Application of AS and CAN had more beneficial impact on yield and yield components of maize as compared to urea (AS > CAN > urea). Hybrid maize (P-3025) produced significantly higher yield and yield components as well as higher shelling percentage than the two local cultivars (P-3025 > Jalal = Azam). Application of ammonium sulfate at the rate of 200 kg N ha−1 to hybrid maize was found most beneficial in terms of higher productivity and grower's income in the study area. For the two local cultivars, application of 150 kg N ha−1 was found more beneficial than 120 kg N ha−1 (recommended N rate) in terms of greater productivity and grower's income. PMID:27965685

  17. Low birth weight and air pollution in California: Which sources and components drive the risk?

    PubMed

    Laurent, Olivier; Hu, Jianlin; Li, Lianfa; Kleeman, Michael J; Bartell, Scott M; Cockburn, Myles; Escobedo, Loraine; Wu, Jun

    2016-01-01

    Intrauterine growth restriction has been associated with exposure to air pollution, but there is a need to clarify which sources and components are most likely responsible. This study investigated the associations between low birth weight (LBW, <2500g) in term born infants (≥37 gestational weeks) and air pollution by source and composition in California, over the period 2001-2008. Complementary exposure models were used: an empirical Bayesian kriging model for the interpolation of ambient pollutant measurements, a source-oriented chemical transport model (using California emission inventories) that estimated fine and ultrafine particulate matter (PM2.5 and PM0.1, respectively) mass concentrations (4km×4km) by source and composition, a line-source roadway dispersion model at fine resolution, and traffic index estimates. Birth weight was obtained from California birth certificate records. A case-cohort design was used. Five controls per term LBW case were randomly selected (without covariate matching or stratification) from among term births. The resulting datasets were analyzed by logistic regression with a random effect by hospital, using generalized additive mixed models adjusted for race/ethnicity, education, maternal age and household income. In total 72,632 singleton term LBW cases were included. Term LBW was positively and significantly associated with interpolated measurements of ozone but not total fine PM or nitrogen dioxide. No significant association was observed between term LBW and primary PM from all sources grouped together. A positive significant association was observed for secondary organic aerosols. Exposure to elemental carbon (EC), nitrates and ammonium were also positively and significantly associated with term LBW, but only for exposure during the third trimester of pregnancy. Significant positive associations were observed between term LBW risk and primary PM emitted by on-road gasoline and diesel or by commercial meat cooking sources. Primary PM from wood burning was inversely associated with term LBW. Significant positive associations were also observed between term LBW and ultrafine particle numbers modeled with the line-source roadway dispersion model, traffic density and proximity to roadways. This large study based on complementary exposure metrics suggests that not only primary pollution sources (traffic and commercial meat cooking) but also EC and secondary pollutants are risk factors for term LBW. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. 7 CFR 1.412 - Institution of proceedings.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Adjudication of Sourcing Area Applications and Formal Review of Sourcing Areas Pursuant to the Forest Resources...) Sourcing area applications. The proceeding for determining sourcing areas shall be instituted by receipt of a sourcing area application by the Office of Administrative Law Judges, pursuant to 36 CFR 223.190...

  19. 7 CFR 1.411 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Sourcing area applicant means a person who submits a sourcing area application pursuant to these rules, or a person whose sourcing area is subject to formal review pursuant to 36 CFR 223.191(e). (b) Decision... Sourcing Area Applications and Formal Review of Sourcing Areas Pursuant to the Forest Resources...

  20. System alignment using the Talbot effect

    NASA Astrophysics Data System (ADS)

    Chevallier, Raymond; Le Falher, Eric; Heggarty, Kevin

    1990-08-01

    The Talbot effect is utilized to correct an alignment problem related to a neural network used for image recognition, which required the alignment of a spatial light modulator (SLM) with the input module. A mathematical model which employs the Fresnel diffraction theory is presented to describe the method. The calculation of the diffracted amplitude describes the wavefront sphericity and the original object transmittance function in order to qualify the lateral shift of the Talbot image. Another explanation is set forth in terms of plane-wave illumination in the neural network. Using a Fourier series and by describing planes where all the harmonics are in phase, the reconstruction of Talbot images is explained. The alignment is effective when the lenslet array is aligned on the even Talbot images of the SLM pixels and the incident wave is a plane wave. The alignment is evaluated in terms of source and periodicity errors, tilt of the incident plane waves, and finite object dimensions. The effects of the error sources are concluded to be negligible, the lenslet array is shown to be successfully aligned with the SLM, and other alignment applications are shown to be possible.
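    For reference (this relation is standard Fresnel-diffraction theory rather than a statement from the abstract), the self-imaging planes exploited above recur at multiples of the Talbot distance. For a periodic object of period p under plane-wave illumination of wavelength \lambda, the usual paraxial result is

```latex
z_T = \frac{2p^2}{\lambda},
```

    with exact replicas of the object at integer multiples of z_T and half-period-shifted copies at half-integer multiples, which is why alignment on selected ("even") Talbot images of the SLM pixels is meaningful.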

  1. High-resolution synchrotron-based Fourier transform spectroscopy of [image omitted] in the 120-350 cm-1 far-infrared region

    NASA Astrophysics Data System (ADS)

    Moruzzi, G.; Murphy, R. J.; Lees, R. M.; Predoi-Cross, A.; Billinghurst, B. E.

    2010-09-01

    The Fourier transform spectrum of the ? isotopologue of methanol has been recorded in the 120-350 cm-1 far-infrared region at a resolution of 0.00096 cm-1 using synchrotron source radiation at the Canadian Light Source. The study, motivated by astrophysical applications, is aimed at generating a sufficiently accurate set of energy level term values for the ground vibrational state to allow prediction of the centres of the quadrupole hyperfine multiplets for astronomically observable sub-millimetre transitions to within an uncertainty of a few MHz. To expedite transition identification, a new function was added to the Ritz program in which predicted spectral line positions were generated by an adjustable interpolation between the known assignments for the ? and ? isotopologues. By displaying the predictions along with the experimental spectrum on the computer monitor and adjusting the predictions to match observed features, rapid assignment of numerous ? sub-bands was possible. The least squares function of the Ritz program was then used to generate term values for the identified levels. For each torsion-K-rotation substate, the term values were fitted to a Taylor-series expansion in powers of J(J + 1) to determine the substate origin energy and effective B-value. In this first phase of the study we did not attempt a full global fit to the assigned transitions, but instead fitted the sub-band J-independent origins to a restricted Hamiltonian containing the principal torsional and K-dependent terms. These included structural and torsional potential parameters plus quartic distortional and torsion-rotation interaction terms.

  2. High-order scheme for the source-sink term in a one-dimensional water temperature model

    PubMed Central

    Jing, Zheng; Kang, Ling

    2017-01-01

    The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method was assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in an excellent agreement with measured data. PMID:28264005

  3. High-order scheme for the source-sink term in a one-dimensional water temperature model.

    PubMed

    Jing, Zheng; Kang, Ling

    2017-01-01

    The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method was assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in an excellent agreement with measured data.
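    A compact sketch of the two-step splitting described in these two records is given below. The paper's high-order source-sink discretization is replaced here by a plain explicit update, and all grid sizes and coefficients are invented for illustration only.

```python
import numpy as np

# Operator splitting for a vertical 1-D heat equation dT/dt = kappa d2T/dz2 + S(z,t):
# step 1 advances the source-sink term, step 2 advances diffusion with Crank-Nicolson.
# Insulated boundaries are handled crudely; everything numeric is illustrative.

nz, dz, dt, kappa = 50, 0.2, 60.0, 1.5e-4        # grid points, spacing (m), step (s), diffusivity (m^2/s)
T = np.full(nz, 15.0)                            # initial temperature profile (deg C)
z = np.arange(nz) * dz

def source_term(z, t):
    """Net heating rate (deg C per s): absorbed solar radiation decaying with depth."""
    return 5e-5 * np.exp(-0.5 * z)

# Crank-Nicolson matrices for the diffusion step
r = kappa * dt / (2 * dz**2)
main = np.full(nz, 1 + 2 * r); main[0] = main[-1] = 1 + r
A = np.diag(main) + np.diag(np.full(nz - 1, -r), 1) + np.diag(np.full(nz - 1, -r), -1)
mainB = np.full(nz, 1 - 2 * r); mainB[0] = mainB[-1] = 1 - r
B = np.diag(mainB) + np.diag(np.full(nz - 1, r), 1) + np.diag(np.full(nz - 1, r), -1)

t = 0.0
for _ in range(1440):                            # one day of 60 s steps
    T = T + dt * source_term(z, t)               # step 1: source-sink update
    T = np.linalg.solve(A, B @ T)                # step 2: Crank-Nicolson diffusion
    t += dt
print(f"surface warmed to {T[0]:.2f} deg C, bottom at {T[-1]:.2f} deg C")
```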

  4. Microseismic source locations with deconvolution migration

    NASA Astrophysics Data System (ADS)

    Wu, Shaojiang; Wang, Yibo; Zheng, Yikang; Chang, Xu

    2018-03-01

    Identifying and locating microseismic events are critical problems in hydraulic fracturing monitoring for unconventional resources exploration. In contrast to active seismic data, microseismic data are usually recorded with unknown source excitation time and source location. In this study, we introduce deconvolution migration by combining deconvolution interferometry with interferometric cross-correlation migration (CCM). This method avoids the need for the source excitation time and enhances both the spatial resolution and robustness by eliminating the square term of the source wavelets from CCM. The proposed algorithm is divided into the following three steps: (1) generate the virtual gathers by deconvolving the master trace with all other traces in the microseismic gather to remove the unknown excitation time; (2) migrate the virtual gather to obtain a single image of the source location and (3) stack all of these images together to get the final estimation image of the source location. We test the proposed method on complex synthetic and field data set from the surface hydraulic fracturing monitoring, and compare the results with those obtained by interferometric CCM. The results demonstrate that the proposed method can obtain a 50 per cent higher spatial resolution image of the source location, and more robust estimation with smaller errors of the localization especially in the presence of velocity model errors. This method is also beneficial for source mechanism inversion and global seismology applications.
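    Step (1) of the workflow above amounts to a regularized spectral division of every trace by a chosen master trace. The sketch below uses a synthetic gather and a simple water-level regularization; it illustrates the general idea, not the authors' implementation, and steps (2)-(3), migration and stacking, are not shown.

```python
import numpy as np

# Deconvolution interferometry on a toy gather: D_i(f) = U_i(f) conj(U_m(f)) / (|U_m(f)|^2 + eps),
# which removes the unknown excitation time and the squared source wavelet that a
# plain crosscorrelation would retain. All numbers are invented.

def deconvolve_gather(gather, master_index, eps=1e-3):
    """Return virtual traces obtained by water-level spectral division by the master trace."""
    G = np.fft.rfft(gather, axis=1)
    M = G[master_index]
    denom = np.abs(M) ** 2
    denom = denom + eps * denom.max()            # water-level regularization
    virtual = G * np.conj(M) / denom
    return np.fft.irfft(virtual, n=gather.shape[1], axis=1)

# Toy gather: the same wavelet arriving with different delays at 8 receivers
rng = np.random.default_rng(0)
nt = 1024
wavelet = np.convolve(rng.normal(size=nt), np.hanning(21), mode="same")
delays = np.array([0, 12, 25, 37, 50, 62, 75, 87])
gather = np.array([np.roll(wavelet, d) for d in delays])

virtual = deconvolve_gather(gather, master_index=0)
print("virtual-trace peak lags (samples):", [int(np.argmax(v)) for v in virtual])
# Peaks sit at the delays relative to the master trace: 0, 12, 25, ...
```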

  5. Design of 1 MHz Solid State High Frequency Power Supply

    NASA Astrophysics Data System (ADS)

    Parmar, Darshan; Singh, N. P.; Gajjar, Sandip; Thakar, Aruna; Patel, Amit; Raval, Bhavin; Dhola, Hitesh; Dave, Rasesh; Upadhay, Dishang; Gupta, Vikrant; Goswami, Niranjan; Mehta, Kush; Baruah, Ujjwal

    2017-04-01

    High Frequency Power Supplies (HFPS) are used for various applications such as AM transmitters, metallurgical applications, wireless power transfer, RF ion sources, etc. The ion source for a neutral beam injector at ITER-India uses an inductively coupled power source at high frequency (∼1 MHz). A switching converter based topology used to generate the 1 MHz sinusoidal output is expected to have advantages in efficiency and reliability compared to traditional RF tetrode tube based oscillators. In terms of power electronics, thermal and power coupling issues are major challenges at such a high frequency. A conceptual design for a 200 kW, 1 MHz power supply and a prototype design for a 600 W source have been done. The prototype design is attempted with a Class-E amplifier topology where a MOSFET is switched resonantly. The prototype uses two low power modules and a ferrite combiner to add the voltage and power at the output. Subsequently, a solution with a Class-D H-bridge configuration has been evaluated through simulation, where the module design is stable since the switching device does not participate in resonance; furthermore, the switching device voltage rating is substantially reduced. The rating of the modules is essentially driven by the maximum power handling capacity of the MOSFETs and ferrites in the combiner circuit. The output passive network, comprising a resonance tuned network and an impedance matching network, caters for soft switching and matches the load impedance to 50 Ω, respectively. This paper describes the conceptual design of a 200 kW high frequency power supply and experimental results of the prototype 600 W, 1 MHz source.
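    The impedance-matching role of the output network can be illustrated with the textbook L-network design equations. The 12.5 Ω load and the resulting component values below are assumptions for illustration, not figures from the ITER-India design.

```python
import math

# Low-pass L-network stepping a purely resistive load up to 50 ohm at 1 MHz.
# Standard design equations: Q = sqrt(R_high/R_low - 1), series reactance Q*R_low
# on the low-resistance side, shunt reactance R_high/Q on the 50 ohm side.

f = 1.0e6                      # operating frequency (Hz)
R_load, R_target = 12.5, 50.0  # assumed load resistance and desired input resistance (ohm)
w = 2 * math.pi * f

Q = math.sqrt(R_target / R_load - 1)     # loaded Q fixed by the transformation ratio
L_series = Q * R_load / w                # series inductor next to the low-R side
C_shunt = Q / (w * R_target)             # shunt capacitor across the 50 ohm side

print(f"Q = {Q:.2f},  L = {L_series*1e6:.2f} uH,  C = {C_shunt*1e9:.2f} nF")
# With these assumptions: Q ~ 1.73, L ~ 3.45 uH, C ~ 5.51 nF
```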

  6. Second-order singular perturbative theory for gravitational lenses

    NASA Astrophysics Data System (ADS)

    Alard, C.

    2018-03-01

    The extension of the singular perturbative approach to the second order is presented in this paper. The general expansion to the second order is derived. The second-order expansion is considered as a small correction to the first-order expansion. Using this approach, it is demonstrated that in practice the second-order expansion is reducible to a first-order expansion via a re-definition of the first-order perturbative fields. Even if in usual applications the second-order correction is small, the reducibility of the second-order expansion to the first-order expansion indicates a potential degeneracy issue. In general, this degeneracy is hard to break. A useful and simple second-order approximation is the thin source approximation, which offers a direct estimation of the correction. The practical application of the corrections derived in this paper is illustrated by using an elliptical NFW lens model. The second-order perturbative expansion provides a noticeable improvement, even for the simplest case of the thin source approximation. To conclude, it is clear that for accurate modelling of gravitational lenses using the perturbative method the second-order perturbative expansion should be considered. In particular, an evaluation of the degeneracy due to the second-order term should be performed, for which the thin source approximation is particularly useful.

  7. Beamed Energy Propulsion: Research Status And Needs--Part 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birkan, Mitat

    One promising solution to operationally responsive space is the application of remote electromagnetic energy to propel a launch vehicle into orbit. With beamed energy propulsion, one can leave the power source stationary on the ground or in space, and directly heat propellant on the spacecraft with a beam from a fixed station. This permits the spacecraft to leave its power source at home, saving significant amounts of mass and greatly improving performance. This concept, which removes the mass penalty of carrying the propulsion energy source on board the vehicle, was first proposed by Arthur Kantrowitz in 1972; he invoked an extremely powerful ground based laser. The same year Michael Minovich suggested a conceptually similar 'in-space' laser rocket system utilizing a remote laser power station. In the late 1980s, the Air Force Office of Scientific Research (AFOSR) funded continuous, double pulse laser and microwave propulsion while the Strategic Defense Initiative Office (SDIO) funded ablative laser rocket propulsion. Currently AFOSR has been funding the concept initiated by Leik Myrabo, repetitively pulsed laser propulsion, which has been universally perceived, arguably, to be the closest for mid-term applications. This 2-part paper examines the investment strategies in beamed energy propulsion and the technical challenges to be overcome; Part 2 covers the present research status and needs.

  8. Development of a novel low-flow ion source/sampling cone geometry for inductively coupled plasma mass spectrometry and application in hyphenated techniques

    NASA Astrophysics Data System (ADS)

    Pfeifer, Thorben; Janzen, Rasmus; Steingrobe, Tobias; Sperling, Michael; Franze, Bastian; Engelhard, Carsten; Buscher, Wolfgang

    2012-10-01

    A novel ion source/sampling cone device for inductively coupled plasma mass spectrometry (ICP-MS) especially operated in the hyphenated mode as a detection system coupled with different separation modules is presented. Its technical setup is described in detail. Its main feature is the very low total argon consumption of less than 1.5 L min- 1, leading to significant reduction of operational costs especially when time-consuming speciation analysis is performed. The figures of merit of the new system with respect to sensitivity, detection power, long-term stability and working range were explored. Despite the profound differences of argon consumption of the new system in comparison to the conventional ICP-MS system, many of the characteristic features of the conventional ICP-MS could be maintained to a great extent. To demonstrate the ion source's capabilities, it was used as an element-selective detector for gas (GC) and high performance liquid chromatography (HPLC) where organic compounds of mercury and cobalt, respectively, were separated and detected with the new low-flow ICP-MS detection system. The corresponding chromatograms are shown. The applicability for trace element analysis has been validated with the certified reference material NIST 1643e.

  9. Nonlinear derating of high-intensity focused ultrasound beams using Gaussian modal sums.

    PubMed

    Dibaji, Seyed Ahmad Reza; Banerjee, Rupak K; Soneson, Joshua E; Myers, Matthew R

    2013-11-01

    A method is introduced for using measurements made in water of the nonlinear acoustic pressure field produced by a high-intensity focused ultrasound transducer to compute the acoustic pressure and temperature rise in a tissue medium. The acoustic pressure harmonics generated by nonlinear propagation are represented as a sum of modes having a Gaussian functional dependence in the radial direction. While the method is derived in the context of Gaussian beams, final results are applicable to general transducer profiles. The focal acoustic pressure is obtained by solving an evolution equation in the axial variable. The nonlinear term in the evolution equation for tissue is modeled using modal amplitudes measured in water and suitably reduced using a combination of "source derating" (experiments in water performed at a lower source acoustic pressure than in tissue) and "endpoint derating" (amplitudes reduced at the target location). Numerical experiments showed that, with proper combinations of source derating and endpoint derating, direct simulations of acoustic pressure and temperature in tissue could be reproduced by derating within 5% error. Advantages of the derating approach presented include applicability over a wide range of gains, ease of computation (a single numerical quadrature is required), and readily obtained temperature estimates from the water measurements.

  10. Compact atomic clocks and stabilised laser for space applications

    NASA Astrophysics Data System (ADS)

    Mileti, Gaetano; Affolderbach, Christoph; Matthey-de-l'Endroit, Renaud

    2016-07-01

    We present our developments towards next generation compact vapour-cell based atomic frequency standards using a tunable laser diode instead of a traditional discharge lamp. The realisation of two types of Rubidium clocks addressing specific applications is in progress: high performance frequency standards for demanding applications such as satellite navigation, and chip-scale atomic clocks, allowing further miniaturisation of the system. The stabilised laser source constitutes the main technological novelty of these new standards, allowing a more efficient preparation and interrogation of the atoms and hence an improvement of the clock performances. However, before this key component may be employed in a commercial and ultimately in a space-qualified instrument, further studies are necessary to demonstrate their suitability, in particular concerning their reliability and long-term operation. The talk will present our preliminary investigations on this subject. The stabilised laser diode technology developed for our atomic clocks has several other applications on ground and in space. We will conclude our talk by illustrating this for the example of a recently completed ESA project on a 1.6 microns wavelength reference for a future space-borne Lidar. This source is based on a Rubidium vapour cell providing the necessary stability and accuracy, while a second harmonic generator and a compact optical comb generated from an electro-optic modulator allow to transfer these properties from the Rubidium wavelength (780nm) to the desired spectral range.

  11. Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2004-01-01

    A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.

  12. Spectral radiance source based on supercontinuum laser and wavelength tunable bandpass filter: the spectrally tunable absolute irradiance and radiance source.

    PubMed

    Levick, Andrew P; Greenwell, Claire L; Ireland, Jane; Woolliams, Emma R; Goodman, Teresa M; Bialek, Agnieszka; Fox, Nigel P

    2014-06-01

    A new spectrally tunable source for calibration of radiometric detectors in radiance, irradiance, or power mode has been developed and characterized. It is termed the spectrally tunable absolute irradiance and radiance source (STAIRS). It consists of a supercontinuum laser, wavelength tunable bandpass filter, power stabilization feedback control scheme, and output coupling optics. It has the advantages of relative portability and a collimated beam (low étendue), and is an alternative to conventional sources such as tungsten lamps, blackbodies, or tunable lasers. The supercontinuum laser is a commercial Fianium SC400-6-02, which has a wavelength range between 400 and 2500 nm and a total power of 6 W. The wavelength tunable bandpass filter, a PhotonEtc laser line tunable filter (LLTF), is tunable between 400 and 1000 nm and has a bandwidth of 1 or 2 nm depending on the wavelength selected. The collimated laser beam from the LLTF filter is converted to an appropriate spatial and angular distribution for the application considered (i.e., for radiance, irradiance, or power mode calibration of a radiometric sensor) with the output coupling optics, for example, an integrating sphere, and the spectral radiance/irradiance/power of the source is measured using a calibration optical sensor. A power stabilization feedback control scheme has been incorporated that stabilizes the source to better than 0.01% for averaging times longer than 100 s. The out-of-band transmission of the LLTF filter is estimated to be < -65 dB (0.00003%), and is sufficiently low for many end-user applications, for example the spectral radiance calibration of earth observation imaging radiometers and the stray light characterization of array spectrometers (the end-user optical sensor). We have made initial measurements of two end-user instruments with the STAIRS source, an array spectrometer and ocean color radiometer.

  13. [Using the CAS (computer-assisted surgery) system in arthroscopic cruciate ligament surgery--adaptation and application in clinical practice].

    PubMed

    Bernsmann, K; Rosenthal, A; Sati, M; Ansari, B; Wiese, M

    2001-01-01

    The anterior cruciate ligament (ACL) is of great importance for knee joint function. In the case of a complete ligament injury there is hardly any chance of complete recovery. The clear advantages of operative reconstruction by replacing the ACL have been shown in many trials. The accurate placement of the graft's insertions has a significant effect on the mid- and probably long-term outcome of this procedure. Reviewing the literature, there are poor long-term results of ACL replacement in 5 to 52% of all cases, depending on the score system. One of the main reasons for unacceptable results is graft misplacement. This led to the construction of a CAS system for ACL replacement. The system assists this surgical procedure by navigating the exact position of the drilling holes. The potential deformation quantity of the transplant can be controlled by this system in real time. 40 computer-assisted ACL replacements have been performed under active use of the CAS system. The short-term results are encouraging; no special complications have been seen so far. Prospective long-term follow-up studies are ongoing. ACL reconstruction by manual devices has many sources of error. The CAS system is able to give the surgeon reasonable views that are unachievable by conventional surgery. The surgeon is therefore able to control a source of error and to optimise the results. The feasibility of this device in routine clinical use has been proven.

  14. Recent Progress on Spherical Torus Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ono, Masayuki; Kaita, Robert

    2014-01-01

    The spherical torus or spherical tokamak (ST) is a member of the tokamak family with its aspect ratio (A = R0/a) reduced to A ~ 1.5, well below the normal tokamak operating range of A ≥ 2.5. As the aspect ratio is reduced, the ideal tokamak beta β (ratio of plasma to magnetic pressure) stability limit increases rapidly, approximately as β ~ 1/A. The plasma current it can sustain for a given edge safety factor q95 also increases rapidly. Because of the above, as well as the natural elongation κ, which makes its plasma shape appear spherical, the ST configuration can yield exceptionally high tokamak performance in a compact geometry. Due to its compactness and high performance, the ST configuration has various near term applications, including a compact fusion neutron source with low tritium consumption, in addition to its longer term goal of an attractive fusion energy power source. Since the start of the two megaampere class ST facilities in 2000, the National Spherical Torus Experiment (NSTX) in the US and the Mega Ampere Spherical Tokamak (MAST) in the UK, active ST research has been conducted worldwide. More than sixteen ST research facilities operating during this period have achieved remarkable advances in all fusion science areas, involving fundamental fusion energy science as well as innovation. These results suggest exciting future prospects for ST research both near term and longer term. The present paper reviews the scientific progress made by the worldwide ST research community during this new mega-ampere-ST era.

  15. Coupling Aggressive Mass Removal with Microbial Reductive Dechlorination for Remediation of DNAPL Source Zones: A Review and Assessment

    PubMed Central

    Christ, John A.; Ramsburg, C. Andrew; Abriola, Linda M.; Pennell, Kurt D.; Löffler, Frank E.

    2005-01-01

    The infiltration of dense non-aqueous-phase liquids (DNAPLs) into the saturated subsurface typically produces a highly contaminated zone that serves as a long-term source of dissolved-phase groundwater contamination. Applications of aggressive physical–chemical technologies to such source zones may remove > 90% of the contaminant mass under favorable conditions. The remaining contaminant mass, however, can create a rebounding of aqueous-phase concentrations within the treated zone. Stimulation of microbial reductive dechlorination within the source zone after aggressive mass removal has recently been proposed as a promising staged-treatment remediation technology for transforming the remaining contaminant mass. This article reviews available laboratory and field evidence that supports the development of a treatment strategy that combines aggressive source-zone removal technologies with subsequent promotion of sustained microbial reductive dechlorination. Physical–chemical source-zone treatment technologies compatible with posttreatment stimulation of microbial activity are identified, and studies examining the requirements and controls (i.e., limits) of reductive dechlorination of chlorinated ethenes are investigated. Illustrative calculations are presented to explore the potential effects of source-zone management alternatives. Results suggest that, for the favorable conditions assumed in these calculations (i.e., statistical homogeneity of aquifer properties, known source-zone DNAPL distribution, and successful bioenhancement in the source zone), source longevity may be reduced by as much as an order of magnitude when physical–chemical source-zone treatment is coupled with reductive dechlorination. PMID:15811838

  16. Trends for Electron Beam Accelerator Applications in Industry

    NASA Astrophysics Data System (ADS)

    Machi, Sueo

    2011-02-01

    Electron beam (EB) accelerators are major pieces of industrial equipment used for many commercial radiation processing applications. The industrial use of EB accelerators has a history of more than 50 years and is still growing in terms of both its economic scale and new applications. Major applications involve the modification of polymeric materials to create value-added products, such as heat-resistant wires, heat-shrinkable sheets, automobile tires, foamed plastics, battery separators and hydrogel wound dressings. The surface curing of coatings and printing inks is a growing application for low energy electron accelerators, resulting in an environmentally friendly and energy-saving process. Recently there has been acceptance of the use of EB accelerators in lieu of the radioactive isotope cobalt-60 as a source for sterilizing disposable medical products. Environmental protection by the use of EB accelerators is a new and important field of application. A commercial plant for the cleaning of flue gases from a coal-burning power plant is in operation in Poland, employing high power EB accelerators. In Korea, a commercial plant uses EB to clean waste water from a dye factory.

  17. 26 CFR 1.737-1 - Recognition of precontribution gain.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Property A1 and Property A2 is long-term, U.S.-source capital gain or loss. The character of gain on Property A3 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real... long-term, U.S.-source capital gain ($10,000 gain on Property A1 and $8,000 loss on Property A2) and $1...

  18. Source term model evaluations for the low-level waste facility performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yim, M.S.; Su, S.I.

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  19. Observation-based source terms in the third-generation wave model WAVEWATCH

    NASA Astrophysics Data System (ADS)

    Zieger, Stefan; Babanin, Alexander V.; Erick Rogers, W.; Young, Ian R.

    2015-12-01

    Measurements collected during the AUSWEX field campaign, at Lake George (Australia), resulted in new insights into the processes of wind wave interaction and whitecapping dissipation, and consequently new parameterizations of the input and dissipation source terms. The new nonlinear wind input term accounts for the dependence of the growth on wave steepness, airflow separation, and negative growth rates under adverse winds. The new dissipation terms feature an inherent breaking term, a cumulative dissipation term, and a term due to production of turbulence by waves, which is particularly relevant for decaying seas and for swell. The latter is consistent with the observed decay rate of ocean swell. This paper describes these source terms implemented in WAVEWATCH III® and evaluates their performance against existing source terms in academic duration-limited tests, against buoy measurements for windsea-dominated conditions, under conditions of extreme wind forcing (Hurricane Katrina), and against altimeter data in global hindcasts. Results show agreement in terms of growth curves as well as integral and spectral parameters in both the simulations and the hindcasts.
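
    For context, third-generation wave models such as WAVEWATCH III evolve the wave action density through a balance equation whose right-hand side collects source terms of the kind discussed above; a schematic form in standard notation (not reproduced from the paper) is

    ```latex
    \frac{\partial N}{\partial t} + \nabla_{\!x}\cdot(\dot{\mathbf{x}} N)
      + \frac{\partial}{\partial k}(\dot{k} N) + \frac{\partial}{\partial \theta}(\dot{\theta} N)
      = \frac{S_{in} + S_{nl} + S_{ds} + S_{swl}}{\sigma},
    ```

    where S_in is the wind input, S_nl the nonlinear four-wave interactions, S_ds the breaking/whitecapping dissipation (including the cumulative term), and S_swl the swell dissipation associated with wave-induced turbulence.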

  20. SU-E-T-459: Impact of Source Position and Traveling Time On HDR Skin Surface Applicator Dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeong, J; Barker, C; Zaider, M

    Purpose: Observed dosimetric discrepancies between measured and treatment planning system (TPS) predicted values during applicator commissioning were traced to source position uncertainty in the applicator. We quantify the dosimetric impact of this geometric uncertainty, and of the source traveling time inside the applicator, and propose corrections for clinical use. Methods: We measured the dose profiles from the Varian Leipzig-style (horizontal) HDR skin applicator using EBT3 film, a photon diode, and optically stimulated luminescence dosimeters (OSLD), with three different GammaMed HDR afterloaders. The dose profiles and depth dose of each aperture were measured at several depths (up to about 10 mm, depending on the dosimeter). The measured dose profiles were compared with Acuros-calculated profiles in the BrachyVision TPS. For the impact of the source position, EBT3 film measurements were performed with the applicator in facing-down and facing-up orientations. The dose with and without source traveling was measured with the diode detector using the HDR timer and the electrometer timer, respectively. Results: Depth doses measured using the three dosimeters were in good agreement, but were consistently higher than the Acuros dose calculations. Measurements with the applicator facing up were significantly lower than those in the facing-down position, with a maximum difference of about 18% at the surface, due to source sag inside the applicator. Based on the inverse-square law, the effective source sag was evaluated to be about 0.5 mm from the planned position. The additional dose from source traveling was about 2.8% for 30 seconds with a 10 Ci source, decreasing with increased dwell time and decreased source activity. Conclusion: Due to the short source-to-surface distance of the applicator, the small source sag inside the applicator has a significant dosimetric impact, which should be considered before clinical use of the applicator. Investigation of the effect for other applicators that have a relatively large source lumen inner diameter may be warranted. Christopher Barker and Gil’ad Cohen are receiving research support for a study of skin surface brachytherapy from Elekta.
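
    An illustrative inverse-square estimate of the reported effect; the nominal source-to-surface distance d used below is an assumed round number, not a value quoted in the abstract:

    ```latex
    \frac{D_{\text{facing-down}}}{D_{\text{facing-up}}}
      \approx \left(\frac{d + s}{d - s}\right)^{2},
    \qquad
    \left(\frac{12\,\mathrm{mm} + 0.5\,\mathrm{mm}}{12\,\mathrm{mm} - 0.5\,\mathrm{mm}}\right)^{2} \approx 1.18 .
    ```

    A sag s of roughly 0.5 mm thus reproduces a surface dose difference of about 18% for a nominal distance on the order of 12 mm, which illustrates why sub-millimeter positional uncertainty matters at such short source-to-surface distances.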

  1. QuakeSim 2.0

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay W.; Lyzenga, Gregory A.; Granat, Robert A.; Norton, Charles D.; Rundle, John B.; Pierce, Marlon E.; Fox, Geoffrey C.; McLeod, Dennis; Ludwig, Lisa Grant

    2012-01-01

    QuakeSim 2.0 improves understanding of earthquake processes by providing modeling tools and integrating model applications and various heterogeneous data sources within a Web services environment. QuakeSim is a multisource, synergistic, data-intensive environment for modeling the behavior of earthquake faults individually and as part of complex interacting systems. Remotely sensed geodetic data products may be explored, compared with faults and landscape features, mined by pattern analysis applications, and integrated with models and pattern analysis applications in a rich Web-based and visualization environment. Integration of heterogeneous data products with pattern informatics tools enables efficient development of models. Federated database components and visualization tools allow rapid exploration of large datasets, while pattern informatics enables identification of subtle, but important, features in large data sets. QuakeSim is valuable for earthquake investigations and modeling in its current state, and also serves as a prototype and nucleus for broader systems under development. The framework provides access to physics-based simulation tools that model the earthquake cycle and related crustal deformation. Spaceborne GPS and Interferometric Synthetic Aperture Radar (InSAR) data provide information on near-term crustal deformation, while paleoseismic geologic data provide longer-term information on earthquake fault processes. These data sources are integrated into QuakeSim's QuakeTables database system, and are accessible by users or various model applications. UAVSAR repeat-pass interferometry data products are added to the QuakeTables database, and are available through a browseable map interface or Representational State Transfer (REST) interfaces. Model applications can retrieve data from QuakeTables, or from third-party GPS velocity data services; alternatively, users can manually input parameters into the models. Pattern analysis of GPS and seismicity data has proved useful for mid-term forecasting of earthquakes, and for detecting subtle changes in crustal deformation. The GPS time series analysis has also proved useful as a data-quality tool, enabling the discovery of station anomalies and data processing and distribution errors. Improved visualization tools enable more efficient data exploration and understanding. Tools provide flexibility to science users for exploring data in new ways through download links, but also facilitate standard, intuitive, and routine uses for science users and end users such as emergency responders.

  2. Useful integral function and its application in thermal radiation calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, S.L.; Rhee, K.T.

    1983-07-01

    In applying the Planck formula for computing the energy radiated from an isothermal source, the emissivity of the source must be found. This emissivity is expressed in terms of its spectral emissivity. The spectral emissivity of an isothermal volume with a given optical length, containing radiating gases and/or soot, is computed through a relation (Sparrow and Cess, 1978) that contains the optical length and the spectral volume absorption coefficient. An exact solution is then offered to the equation that results from introducing the equation for the spectral emissivity into the equation for the emissivity. The function obtained is shown to be useful in computing the spectral emissivity of an isothermal volume containing either soot or gaseous species, or both. Examples are presented.
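
    In standard notation (not reproduced from the paper), the spectral emissivity of an isothermal volume of optical path length L and the total emissivity obtained from it are

    ```latex
    \varepsilon_{\lambda} = 1 - e^{-\kappa_{\lambda} L},
    \qquad
    \varepsilon(T) = \frac{1}{\sigma T^{4}} \int_{0}^{\infty} \varepsilon_{\lambda}\, e_{b\lambda}(T)\, d\lambda ,
    ```

    where κ_λ is the spectral volume absorption coefficient, e_bλ(T) is the Planck blackbody spectral emissive power, and σ is the Stefan-Boltzmann constant; the exact solution discussed above addresses the integral that results when the first expression is substituted into the second.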

  3. Monitoring the Low-Energy Gamma-Ray Sky Using Earth Occultation with GLAST GBM

    NASA Technical Reports Server (NTRS)

    Case, G.; Wilson-Hodge, C.; Cherry, M.; Kippen, M.; Ling, J.; Radocinski, R.; Wheaton, W.

    2007-01-01

    Long-term all-sky monitoring of the 20 keV - 2 MeV gamma-ray sky using the Earth occultation technique was demonstrated by the BATSE instrument on the Compton Gamma Ray Observatory. The principles and techniques used for the development of an end-to-end Earth occultation data analysis system for BATSE can be extended to the GLAST Gamma-ray Burst Monitor (GBM), resulting in multiband light curves and time-resolved spectra in the energy range 8 keV to above 1 MeV for known gamma-ray sources and transient outbursts, as well as the discovery of new sources of gamma-ray emission. In this paper we describe the application of the technique to the GBM. We also present the expected sensitivity for the GBM.

  4. 42: An Open-Source Simulation Tool for Study and Design of Spacecraft Attitude Control Systems

    NASA Technical Reports Server (NTRS)

    Stoneking, Eric

    2018-01-01

    Simulation is an important tool in the analysis and design of spacecraft attitude control systems. The speaker will discuss the simulation tool, called simply 42, that he has developed over the years to support his own work as an engineer in the Attitude Control Systems Engineering Branch at NASA Goddard Space Flight Center. 42 was intended from the outset to be high-fidelity and powerful, but also fast and easy to use. 42 has been publicly available as open source since 2014. The speaker will describe some of 42's models and features, and discuss its applicability to studies ranging from early concept studies through the design cycle, integration, and operations. He will outline 42's architecture and share some thoughts on simulation development as a long-term project.

  5. Non-Gaussian limit fluctuations in active swimmer suspensions

    NASA Astrophysics Data System (ADS)

    Kurihara, Takashi; Aridome, Msato; Ayade, Heev; Zaid, Irwin; Mizuno, Daisuke

    2017-03-01

    We investigate the hydrodynamic fluctuations in suspensions of swimming microorganisms (Chlamydomonas) by observing probe particles dispersed in the media. Short-term fluctuations of the probe particles were superdiffusive and displayed heavy-tailed non-Gaussian distributions. An analytical theory that explains the observed distribution was derived by summing the power-law-decaying hydrodynamic interactions from spatially distributed field sources (here, swimming microorganisms). The summing procedure, which we refer to as the physical limit operation, is applicable to a variety of physical fluctuations to which the classical central limit theorem does not apply. Extending the analytical formula to compare with experiments in active swimmer suspensions, we show that the non-Gaussian shape of the observed distribution obeys the analytic theory concomitantly with independently determined parameters such as the strength of force generation and the concentration of Chlamydomonas. The time evolution of the distributions collapsed onto a single master curve, except for their extreme tails, for which our theory presents a qualitative explanation. Investigations thereof and the complete agreement with theoretical predictions revealed the broad applicability of the formula to dispersions of active sources of fluctuations.

  6. The impact of circulation control on rotary aircraft controls systems

    NASA Technical Reports Server (NTRS)

    Kingloff, R. F.; Cooper, D. E.

    1987-01-01

    Application of circulation control to rotary wing systems is a new development. Efforts to determine the near and far field flow patterns and to analytically predict those flow patterns have been underway for some years. Rotary wing applications present a new set of challenges in circulation control technology. Rotary wing sections must accommodate substantial Mach number, free stream dynamic pressure, and section angle of attack variation at each flight condition within the design envelope. They must also be capable of short-term circulation blowing modulation to produce control moments and vibration alleviation in addition to a lift augmentation function. The control system design must provide this primary control moment, vibration alleviation, and lift augmentation function. To accomplish this, one must simultaneously control the compressed air source and its distribution. The control law algorithm must therefore address the compressor as the air source, the plenum as the air pressure storage, and the pneumatic flow gates or valves that distribute and meter the stored pressure to the rotating blades. Also, mechanical collective blade pitch, rotor shaft angle of attack, and engine power control must be maintained.

  7. Application of the modified transient plane source technique for early detection of liquid explosives

    NASA Astrophysics Data System (ADS)

    Bateman, Robert; Harris, Adam; Lee, Linda; Howle, Christopher R.; Ackermann, Sarah L. G.

    2016-05-01

    The paper reviews the feasibility of adapting the Modified Transient Plane Source (MTPS) method as a screening tool for early detection of explosives and hazardous materials. Materials can be distinguished from one another based on their inherent thermal properties (e.g., thermal effusivity) when tested through different types of barrier materials. A complementary advantage of this technique relative to traditional detection technologies is that it can easily penetrate reflective barrier materials, such as aluminum. A strong proof of principle is presented for the application of MTPS transient thermal property measurement to the early screening of liquid explosives. The work demonstrates significant sensitivity in distinguishing a wide range of fluids based on their thermal properties through a barrier material. The work covers various factors complicating the longer-term adoption of such a method, including the impact of carbonization and viscosity. While some technical challenges remain, the technique offers significant advantages in complementing existing detection methods, being able to penetrate reflective metal containers (e.g., aluminum soft drink cans) with ease.
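
    For reference, the thermal effusivity sensed by transient plane source methods is the standard material property (not a formula specific to this paper)

    ```latex
    e = \sqrt{k\,\rho\,c_{p}} \qquad \left[\mathrm{W\,s^{1/2}\,m^{-2}\,K^{-1}}\right],
    ```

    where k is the thermal conductivity, ρ the density, and c_p the specific heat capacity; fluids of similar density can still differ markedly in effusivity, which is what allows them to be distinguished through a barrier material.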

  8. Utilizing Radioisotope Power Systems for Human Lunar Exploration

    NASA Technical Reports Server (NTRS)

    Schreiner, Timothy M.

    2005-01-01

    The Vision for Space Exploration has a goal of sending crewed missions to the lunar surface as early as 2015 and no later than 2020. The use of nuclear power sources could aid in assisting crews in exploring the surface and performing In-Situ Resource Utilization (ISRU) activities. Radioisotope Power Systems (RPS) provide constant sources of electrical power and thermal energy for space applications. RPSs were carried on six of the crewed Apollo missions to power surface science packages, five of which still remain on the lunar surface. Future RPS designs may be able to play a more active role in supporting a long-term human presence. Due to its lower thermal and radiation output, the planned Stirling Radioisotope Generator (SRG) appears particularly attractive for manned applications. The MCNPX particle transport code has been used to model the current SRG design to assess its use in proximity with astronauts operating on the surface. Concepts of mobility and ISRU infrastructure were modeled using MCNPX to analyze the impact of RPSs on crewed mobility systems. Strategies for lowering the radiation dose were studied to determine methods of shielding the crew from the RPSs.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barry, Kenneth

    The Nuclear Energy Institute (NEI) Small Modular Reactor (SMR) Licensing Task Force (TF) has been evaluating licensing issues unique and important to iPWRs, ranking these issues, and developing NEI position papers for submittal to the U.S. Nuclear Regulatory Commission (NRC) during the past three years. Papers have been developed and submitted to the NRC in a range of areas including: Price-Anderson Act, NRC annual fees, security, modularity, and staffing. In December 2012, NEI completed a draft position paper on SMR source terms and participated in an NRC public meeting presenting a summary of this paper, which was subsequently submitted to the NRC. One important conclusion of the source term paper was the evaluation and selection of high-importance areas where additional research would have a significant impact on source terms. The highest-ranked research area was iPWR containment aerosol natural deposition. The NRC accepts the use of existing aerosol deposition correlations in Regulatory Guide 1.183, but these were developed for large light water reactor (LWR) containments. Application of these correlations to an iPWR design has resulted in greater than a ten-fold reduction of containment airborne aerosol inventory as compared to large LWRs. Development and experimental justification of containment aerosol natural deposition correlations specifically for the unique iPWR containments is expected to result in a large reduction of design-basis and beyond-design-basis accident source terms with concomitantly smaller doses to workers and the public. Therefore, NRC acceptance of iPWR containment aerosol natural deposition correlations will directly support the industry’s goal of reducing the Emergency Planning Zone (EPZ) for SMRs. Based on the results in this work, it is clear that thermophoresis is relatively unimportant for iPWRs. Gravitational settling is well understood, and may be the dominant process for a dry environment. Diffusiophoresis and enhanced settling by particle growth are the dominant processes for determining DFs for expected conditions in an iPWR containment. These processes are dependent on the area-to-volume (A/V) ratio, which should benefit iPWR designs because these reactors have higher A/Vs compared to existing LWRs.
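
    A minimal sketch of how gravitational settling enters such deposition estimates, assuming Stokes-regime particles and a well-mixed containment atmosphere; the expressions below are standard aerosol physics, not the iPWR-specific correlations called for in the paper:

    ```latex
    v_{s} = \frac{\rho_{p}\, d_{p}^{2}\, g\, C_{c}}{18\,\mu},
    \qquad
    \lambda_{\text{settle}} = v_{s}\,\frac{A_{\text{floor}}}{V},
    \qquad
    \frac{dC}{dt} = -\left(\lambda_{\text{settle}} + \lambda_{\text{diff}} + \dots\right) C ,
    ```

    where v_s is the terminal settling velocity (particle density ρ_p, diameter d_p, Cunningham slip factor C_c, gas viscosity μ) and C is the airborne concentration; the removal rate scales with the surface-area-to-volume ratio, which is why the higher A/V of iPWR containments is expected to benefit the decontamination factor.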

  10. Radioisotope Power Sources for MEMS Devices,

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanchard, J.P.

    2001-06-17

    Microelectromechanical systems (MEMS) comprise a rapidly expanding research field with potential applications varying from sensors in airbags to more recent optical applications. Depending on the application, these devices often require an on-board power source for remote operation, especially in cases requiring operation for an extended period of time. Previously suggested power sources include fossil fuels and solar energy, but nuclear power sources may provide significant advantages for certain applications. Hence, the objective of this study is to establish the viability of using radioisotopes to power realistic MEMS devices. A junction-type battery was constructed using silicon and a 63Ni liquid source. A source volume containing 64 µCi provided a power of approximately 0.07 nW. A more novel application of nuclear sources for MEMS applications involves the creation of a resonator that is driven by charge collection in a cantilever beam. Preliminary results have established the feasibility of this concept, and future work will optimize the design for various applications.

  11. Improvement of nutritional quality of greenhouse-grown lettuce by arbuscular mycorrhizal fungi is conditioned by the source of phosphorus nutrition.

    PubMed

    Baslam, Marouane; Pascual, Inmaculada; Sánchez-Díaz, Manuel; Erro, Javier; García-Mina, José María; Goicoechea, Nieves

    2011-10-26

    The improvement of the nutritional quality of lettuce by its association with arbuscular mycorrhizal fungi (AMF) has been recently reported in a previous study. The aim of this research was to evaluate if the fertilization with three P sources differing in water solubility affects the effectiveness of AMF for improving lettuce growth and nutritional quality. The application of either water-soluble P sources (Hewitt's solution and single superphosphate) or the water-insoluble (WI) fraction of a "rhizosphere-controlled fertilizer" did not exert negative effects on the establishment of the mycorrhizal symbiosis. AMF improved lettuce growth and nutritional quality. Nevertheless, the effect was dependent on the source of P and cultivar. Batavia Rubia Munguía (green cultivar) benefited more than Maravilla de Verano (red cultivar) in terms of mineral nutrients, total soluble sugars, and ascorbate contents. The association of lettuce with AMF resulted in greater quantities of anthocyanins in plants fertilized with WI, carotenoids when plants received either Hewitt's solution or WI, and phenolics regardless of the P fertilizer applied.

  12. Frequency tunable electronic sources working at room temperature in the 1 to 3 THz band

    NASA Astrophysics Data System (ADS)

    Maestrini, Alain; Mehdi, Imran; Siles, José V.; Lin, Robert; Lee, Choonsup; Chattopadhyay, Goutam; Pearson, John; Siegel, Peter

    2012-10-01

    Compact, room-temperature terahertz sources are much needed in the 1 to 3 THz band for developing multi-pixel heterodyne receivers for astrophysics and planetary science, or for building short-range, high-spatial-resolution THz imaging systems able to see through low-water-content and non-metallic materials, smoke or dust for a variety of applications ranging from the inspection of art artifacts to the detection of masked or concealed objects. All-solid-state electronic sources based on a W-band synthesizer followed by a high-power W-band amplifier and a cascade of Schottky-diode-based THz frequency multipliers are now capable of producing more than 1 mW at 0.9 THz, 50 μW at 2 THz and 18 μW at 2.6 THz without the need for any cryogenic system. These sources are frequency agile and have a relative bandwidth of 10 to 15%, limited by the high-power W-band amplifiers. The paper presents the latest developments of this technology and its prospects in terms of frequency range, bandwidth and power.

  13. Analysis and Design of Symmetrical Capacitor Diode Voltage Multiplier Driven by LCL-T Resonant Converter

    NASA Astrophysics Data System (ADS)

    Malviya, Devesh; Borage, Mangesh Balkrishna; Tiwari, Sunil

    2017-12-01

    This paper investigates the possibility of applying Resonant Immittance Converters (RICs) as a current source for the current-fed symmetrical Capacitor-Diode Voltage Multiplier (CDVM), with the LCL-T Resonant Converter (RC) as an example. First, a detailed characterization of the current-fed symmetrical CDVM is carried out using repeated simulations, followed by normalization of the simulation results in order to derive closed-form curve-fit equations that predict the operating modes, output voltage and ripple in terms of the operating parameters. RICs, due to their ability to convert a voltage source into a current source, are a possible candidate for the realization of a current source for the current-fed symmetrical CDVM. A detailed analysis, optimization and design of the LCL-T RC with CDVM is performed in this paper. A step-by-step procedure for the design of the CDVM and the converter is proposed. A 5-stage prototype symmetrical CDVM driven by an LCL-T RC to produce a 2.5 kV, 50 mA dc output is designed, built and tested to validate the findings of the analysis and simulation.

  14. Energy issues in microwave food processing: A review of developments and the enabling potentials of solid-state power delivery.

    PubMed

    Atuonwu, J C; Tassou, S A

    2018-01-23

    The enormous magnitude and variety of microwave applications in household, commercial and industrial food processing create a strong motivation for improving the energy efficiency, and hence sustainability, of the process. This review critically assesses key energy issues associated with microwave food processing, focusing on previous energy performance studies, energy performance metrics, standards and regulations. Factors affecting energy efficiency are categorised into source, load and source-load matching factors. This highlights the need for highly flexible and controllable power sources capable of receiving real-time feedback on load properties and effecting rapid control actions to minimise reflections, heating non-uniformities and other imperfections that lead to energy losses. A case is made for the use of solid-state amplifiers as alternatives to the conventional power source, the magnetron. A full-scale techno-economic analysis, including energy aspects, shows that the use of solid-state amplifiers as replacements for magnetrons is promising, not only from an energy and overall technical perspective, but also in terms of economics.

  15. Parameterized source term in the diffusion approximation for enhanced near-field modeling of collimated light

    NASA Astrophysics Data System (ADS)

    Jia, Mengyu; Wang, Shuang; Chen, Xueying; Gao, Feng; Zhao, Huijuan

    2016-03-01

    Most analytical methods for describing light propagation in turbid media exhibit low effectiveness in the near field of a collimated source. Motivated by the Charge Simulation Method in electromagnetic theory as well as established discrete-source-based modeling, we report on an improved explicit model, referred to as the "Virtual Source" (VS) diffusion approximation (DA), which inherits the mathematical simplicity of the DA while considerably extending its validity in modeling near-field photon migration in low-albedo media. In this model, the collimated light in the standard DA is analogously approximated as multiple isotropic point sources (VS) distributed along the incident direction. For performance enhancement, a fitting procedure between the calculated and realistic reflectances is adopted in the near field to optimize the VS parameters (intensities and locations). To be practically applicable, an explicit 2VS-DA model is established based on closed-form derivations of the VS parameters for the typical ranges of the optical parameters. The proposed VS-DA model is validated by comparison with Monte Carlo simulations, and is further applied to image reconstruction in a Laminar Optical Tomography system.
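
    Schematically, replacing the collimated beam by a few isotropic virtual sources leads to a fluence of the form below; the infinite-medium diffusion Green's function is shown only for illustration, and the paper's actual boundary treatment and fitted parameters are not reproduced here:

    ```latex
    \Phi(\mathbf{r}) \approx \sum_{i=1}^{N} \frac{w_{i}}{4\pi D}\,
    \frac{\exp\!\left(-\mu_{\mathrm{eff}}\,\lvert\mathbf{r}-\mathbf{r}_{i}\rvert\right)}{\lvert\mathbf{r}-\mathbf{r}_{i}\rvert},
    \qquad
    \mu_{\mathrm{eff}} = \sqrt{\mu_{a}/D},
    \quad
    D = \frac{1}{3\,(\mu_{a}+\mu_{s}')},
    ```

    with virtual-source weights w_i and positions r_i along the incident direction obtained from the near-field fitting procedure; N = 2 corresponds to the explicit 2VS-DA model.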

  16. Analysis of field of view limited by a multi-line X-ray source and its improvement for grating interferometry.

    PubMed

    Du, Yang; Huang, Jianheng; Lin, Danying; Niu, Hanben

    2012-08-01

    X-ray phase-contrast imaging based on grating interferometry is a technique with the potential to provide absorption, differential phase contrast, and dark-field signals simultaneously. The multi-line X-ray source used recently in grating interferometry has the advantage of high-energy X-rays for imaging of thick samples for most clinical and industrial investigations. However, it has a drawback of limited field of view (FOV), because of the axial extension of the X-ray emission area. In this paper, we analyze the effects of axial extension of the multi-line X-ray source on the FOV and its improvement in terms of Fresnel diffraction theory. Computer simulation results show that the FOV limitation can be overcome by use of an alternative X-ray tube with a specially designed multi-step anode. The FOV of this newly designed X-ray source can be approximately four times larger than that of the multi-line X-ray source in the same emission area. This might be beneficial for the applications of X-ray phase contrast imaging in materials science, biology, medicine, and industry.

  17. Cancellation of spurious arrivals in Green's function extraction and the generalized optical theorem

    USGS Publications Warehouse

    Snieder, R.; Van Wijk, K.; Haney, M.; Calvert, R.

    2008-01-01

    The extraction of the Green's function by cross correlation of waves recorded at two receivers nowadays finds much application. We show that for an arbitrary small scatterer, the cross terms of scattered waves give an unphysical wave with an arrival time that is independent of the source position. This constitutes an apparent inconsistency because theory predicts that such spurious arrivals do not arise, after integration over a complete source aperture. This puzzling inconsistency can be resolved for an arbitrary scatterer by integrating the contribution of all sources in the stationary phase approximation to show that the stationary phase contributions to the source integral cancel the spurious arrival by virtue of the generalized optical theorem. This work constitutes an alternative derivation of this theorem. When the source aperture is incomplete, the spurious arrival is not canceled and could be misinterpreted to be part of the Green's function. We give an example of how spurious arrivals provide information about the medium complementary to that given by the direct and scattered waves; the spurious waves can thus potentially be used to better constrain the medium. © 2008 The American Physical Society.

  18. Promising features of Moringa oleifera oil: recent updates and perspectives.

    PubMed

    Nadeem, Muhammad; Imran, Muhammad

    2016-12-08

    Lipids are a concentrated source of energy, fat-soluble vitamins, essential fatty acids, carriers of flavours and many bioactive compounds with an important role in maintaining the physiological functions of the body. Moringa oleifera is native to the Himalayas and widely grown in many Asian and African countries, with a seed oil content of 35-40%. Moringa oleifera oil (MOO) has a light yellow colour with a mild nutty flavour, and its fatty acid composition suggests that MOO is highly suitable for both edible and non-edible applications. MOO is extremely resistant to autoxidation and can be used as an antioxidant for the long-term stabilization of commercial edible oils. The thermal stability of MOO is greater than that of soybean, sunflower, canola and cottonseed oils. The high oleic content of MOO is believed to be capable of increasing beneficial HDL cholesterol and decreasing serum cholesterol and triglycerides. MOO applications have also been explored in cosmetics, folk medicines and skin care formulations. Overall, this review focuses on the commercial production status, food applications, antioxidant characteristics, health benefits, thermal stability, fractionation, cholesterol contents, medicinal and nutraceutical action, toxicological evaluation, biodiesel production, personal care formulations and future perspectives of MOO, for stakeholders to process and utilize MOO as a new source of edible oil for industrial purposes.

  19. Cellulose nanocrystals in nanocomposite approach: Green and high-performance materials for industrial, biomedical and agricultural applications

    NASA Astrophysics Data System (ADS)

    Fortunati, E.; Torre, L.

    2016-05-01

    The need to both avoid wastes and find new renewable resources has led to new and promising research based on the possibility of revalorizing biomass to produce sustainable chemicals and/or materials that may play a major role in replacing systems traditionally obtained from non-renewable sources. Most of the low-value biomass is termed lignocellulosic, referring to its main constituent biopolymers: cellulose, hemicelluloses and lignin. In this context, nanocellulose, and in particular cellulose nanocrystals (CNC), have gained considerable attention as nanoreinforcement for polymer matrices, mainly biodegradable ones. Derived from the most abundant polymeric resource in nature and with inherent biodegradability, nanocellulose is an interesting nanofiller for the development of nanocomposites for industrial, biomedical and agricultural applications. Due to the high amount of hydroxyl groups on their surface, cellulose nanocrystals are easy to functionalize. Well-dispersed CNC are in fact able to enhance several properties of polymers, e.g. thermal, mechanical, barrier and surface wettability properties, as well as controlled release of active compounds and/or drugs. The main objective here is to give a general overview of CNC applications, summarizing our recent developments of bio-based nanocomposite formulations reinforced with cellulose nanocrystals extracted from different natural sources and/or wastes for the food packaging, medical and agricultural sectors.

  20. On the relevance of source effects in geomagnetic pulsations for induction soundings

    NASA Astrophysics Data System (ADS)

    Neska, Anne; Tadeusz Reda, Jan; Leszek Neska, Mariusz; Petrovich Sumaruk, Yuri

    2018-03-01

    This study is an attempt to close a gap between recent research on geomagnetic pulsations and their usage as source signals in electromagnetic induction soundings (i.e., magnetotellurics, geomagnetic depth sounding, and magnetovariational sounding). The plane-wave assumption, a precondition for the proper performance of these methods, is partly violated by the local nature of field line resonances, which cause a considerable portion of pulsations at mid latitudes. It is demonstrated that, in spite of this, the application of remote reference stations at quasi-global distances for the suppression of local correlated-noise effects in induction arrows is possible in the geomagnetic pulsation range, and it is explained why. The important role of upstream waves and of the magnetic equatorial region for such applications is emphasized. Furthermore, the principal difference between the application of reference stations for local transfer functions (which result in sounding curves and induction arrows) and for inter-station transfer functions is considered. The preconditions for the latter are much stricter than for the former. Hence a failure to estimate an inter-station transfer function to be interpreted in terms of electromagnetic induction, e.g., because of field line resonances, does not necessarily prohibit use of the station pair for a remote reference estimation of the impedance tensor.

  1. Development challenges for Low Temperature Plasma Sources ``from Idea to Prototype''

    NASA Astrophysics Data System (ADS)

    Gerling, T.; Baudler, J.-S.; Horn, S.; Schmidt, M.; Weltmann, K.-D.

    2015-09-01

    While plasma medicine is a well-motivated and intensively investigated topic, the requirements on plasma sources change with the individual application. For example, in dermatology a large-scale treatment is favored, while in dentistry a localized application of plasma sources is required. Plasma source development, meanwhile, is often driven by feasibility rather than by the application: when a source is developed, it is usually steered towards an application afterwards, instead of starting from an application and designing a plasma source to fit its needs. Each approach has its advantages and can lead to advances in the field. With this contribution, we present an approach from idea to prototype and show challenges in plasma source development, for example the consideration of legal regulations, the adaptation of the plasma source for a specific field of application, and the interplay of gas flow dynamics with the electric field distribution. The solution was developed over several iterations to optimize it for different requirements. The obstacles that occurred during the development process are highlighted and discussed. Afterwards, the final source is characterized for a potential medical application and compared directly with a plasma source certified as a medical product. Acknowledging grants: AU 11 038; ESF/IV-BM-B35-0010/13.

  2. An assessment and comparison of fuel cells for transportation applications

    NASA Astrophysics Data System (ADS)

    Krumpelt, M.; Christianson, C. C.

    1989-09-01

    Fuel cells offer the potential of a clean, efficient power source for buses, cars, and other transportation applications. When the fuel cell is run on methanol, refueling would be as rapid as with gasoline-powered internal combustion engines, providing a virtually unlimited range while still maintaining the smooth and quiet acceleration that is typical for electric vehicles. The advantages and disadvantages of five types of fuel cells are reviewed and analyzed for a transportation application: alkaline, phosphoric acid, proton exchange membrane, molten carbonate, and solid oxide. The status of each technology is discussed, system designs are reviewed, and preliminary comparisons of power densities, start-up times, and dynamic response capabilities are made. To test the concept, a fuel cell/battery powered urban bus appears to be a good first step that can be realized today with phosphoric acid cells. In the longer term, the proton exchange membrane and solid oxide fuel cells appear to be superior.

  3. Self-assembled peptide nanostructures for functional materials

    NASA Astrophysics Data System (ADS)

    Sardan Ekiz, Melis; Cinar, Goksu; Aref Khalily, Mohammad; Guler, Mustafa O.

    2016-10-01

    Nature is an important inspirational source for scientists, and presents complex and elegant examples of adaptive and intelligent systems created by self-assembly. Significant effort has been devoted to understanding these sophisticated systems. The self-assembly process enables us to create supramolecular nanostructures with high order and complexity, and peptide-based self-assembling building blocks can serve as suitable platforms to construct nanostructures showing diverse features and applications. In this review, peptide-based supramolecular assemblies will be discussed in terms of their synthesis, design, characterization and application. Peptide nanostructures are categorized based on their chemical and physical properties and will be examined by rationalizing the influence of peptide design on the resulting morphology and the methods employed to characterize these high order complex systems. Moreover, the application of self-assembled peptide nanomaterials as functional materials in information technologies and environmental sciences will be reviewed by providing examples from recently published high-impact studies.

  4. Towards on-chip time-resolved thermal mapping with micro-/nanosensor arrays

    PubMed Central

    2012-01-01

    In recent years, thin-film thermocouple (TFTC) arrays have emerged as a versatile candidate for micro-/nanoscale local temperature sensing because of their high resolution, passive working mode, and easy fabrication. However, some key issues need to be taken into consideration before real instrumentation and industrial applications of TFTC arrays. In this work, we demonstrate that TFTC arrays can be highly scalable from micrometers to nanometers and that there are potential applications of TFTC arrays in integrated circuits, including time-resolvable two-dimensional thermal mapping and tracing the heat source of a device. Some potential problems and relevant solutions from the viewpoint of industrial applications are discussed in terms of material selection, multiplexer reading, pattern design, and cold-junction compensation. We show that the TFTC array is a powerful tool for research fields such as chip thermal management, lab-on-a-chip, and other novel electrical, optical, or thermal devices. PMID:22931306

  5. Web 2.0 Applications in China

    NASA Astrophysics Data System (ADS)

    Zhai, Dongsheng; Liu, Chen

    Since 2005, the term Web 2.0 has gradually become a hot topic on the Internet. Web 2.0 lets users create web content, as distinct from webmasters or web coders. Web 2.0 has entered our work and our lives, and has even become an indispensable part of our web life. Its applications are already widespread in many fields on the Internet. So far, China has about 137 million netizens [1]; its Web 2.0 market is therefore so attractive that much venture capital has flowed into the Chinese Web 2.0 market, and there are also many new Web 2.0 companies in China. However, the development of Web 2.0 in China is accompanied by some problems and obstacles. In this paper, we mainly discuss Web 2.0 applications in China, with their current problems and future development trends.

  6. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
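
    A minimal sketch of the kind of model fitting the Standard describes, comparing linear, quadratic, and exponential trend fits to a short time series; the data values are hypothetical and the Standard itself prescribes no particular code:

    ```python
    import numpy as np

    # Hypothetical time series (e.g., anomaly counts per quarter).
    t = np.arange(12, dtype=float)
    y = np.array([3., 4., 4., 6., 5., 7., 9., 8., 11., 12., 15., 17.])

    # Linear and quadratic trends via least-squares polynomial fits.
    lin = np.polyfit(t, y, 1)
    quad = np.polyfit(t, y, 2)

    # Exponential trend y = a * exp(b * t), fitted as a straight line in log space.
    b, log_a = np.polyfit(t, np.log(y), 1)
    a = np.exp(log_a)

    for name, pred in [("linear", np.polyval(lin, t)),
                       ("quadratic", np.polyval(quad, t)),
                       ("exponential", a * np.exp(b * t))]:
        rss = np.sum((y - pred) ** 2)
        print(f"{name:>11s} fit: residual sum of squares = {rss:.2f}")
    ```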

  7. A review of the use of handheld computers in medical nutrition.

    PubMed

    Holubar, Stefan; Harvey-Banchik, Lillian

    2007-08-01

    Handheld computers, or personal digital assistants (PDAs), have been used to assist clinicians in medical nutrition since the early 1980s. The term PDA was originally applied to programmable calculators; over time, the capabilities of these devices were expanded to allow for the use of more complicated programs such as databases, spreadsheets, and electronic books. Slowly, the device evolved into what is more commonly thought of as a PDA, that is, a device such as a PalmOS (PalmSource, Inc, Tokyo, Japan) or PocketPC (Microsoft, Redmond, WA) unit. We present a review of the literature about the use of PDAs in medical nutrition, followed by a discussion of the different types of PDAs and mobile technologies that are commercially available. This is followed by a discussion of software applications that are currently available for use by nutrition clinicians, focusing on freeware applications. Finally, future technologies and applications are discussed.

  8. Enhancing resolution in coherent x-ray diffraction imaging.

    PubMed

    Noh, Do Young; Kim, Chan; Kim, Yoonhee; Song, Changyong

    2016-12-14

    Achieving a resolution near 1 nm is a critical issue in coherent x-ray diffraction imaging (CDI) for applications in materials and biology. Despite the various advantages of CDI based on synchrotrons and newly developed x-ray free electron lasers, its applications will be limited unless the resolution is improved to well below 10 nm. Here, we review the issues and efforts in improving CDI resolution, including various methods for resolution determination. Enhancing the diffraction signal at large diffraction angles, with the aid of interference between neighboring strong scatterers or templates, is reviewed and discussed in terms of increasing the signal-to-noise ratio. In addition, we discuss errors in image reconstruction algorithms, caused by the discreteness of the Fourier transformations involved, which degrade the spatial resolution, and suggest ways to correct them. We expect this review to be useful for applications of CDI in imaging weakly scattering soft matter using coherent x-ray sources, including x-ray free electron lasers.

  9. A genetic algorithm-based job scheduling model for big data analytics.

    PubMed

    Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei

    Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework; it implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. Existing work mainly focuses on executing jobs in sequence, which is often inefficient and consumes considerable energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
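
    A toy sketch of a genetic algorithm over job orderings, included only to illustrate the general approach; the job durations, fitness function, and GA parameters below are hypothetical, whereas the paper's model drives its fitness from a cluster performance-estimation module:

    ```python
    import random

    # Hypothetical per-job execution-time estimates (stand-in for an estimation module).
    job_times = [8.0, 3.0, 5.0, 2.0, 7.0, 4.0]

    def fitness(order):
        """Toy objective: total completion time summed over jobs (lower is better)."""
        elapsed, total = 0.0, 0.0
        for j in order:
            elapsed += job_times[j]
            total += elapsed
        return total

    def crossover(p1, p2):
        """Order crossover (OX): keep a slice of p1, fill the rest in p2's order."""
        a, b = sorted(random.sample(range(len(p1)), 2))
        child = [None] * len(p1)
        child[a:b] = p1[a:b]
        rest = [j for j in p2 if j not in child]
        for i in range(len(child)):
            if child[i] is None:
                child[i] = rest.pop(0)
        return child

    def mutate(order, rate=0.2):
        if random.random() < rate:
            i, j = random.sample(range(len(order)), 2)
            order[i], order[j] = order[j], order[i]
        return order

    def evolve(pop_size=30, generations=100):
        pop = [random.sample(range(len(job_times)), len(job_times)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness)
            parents = pop[: pop_size // 2]            # truncation selection
            children = [mutate(crossover(*random.sample(parents, 2)))
                        for _ in range(pop_size - len(parents))]
            pop = parents + children
        return min(pop, key=fitness)

    best = evolve()
    print("best job order:", best, "fitness:", fitness(best))
    ```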

  10. Enriching semantic knowledge bases for opinion mining in big data applications

    PubMed Central

    Weichselbraun, A.; Gindl, S.; Scharl, A.

    2014-01-01

    This paper presents a novel method for contextualizing and enriching large semantic knowledge bases for opinion mining with a focus on Web intelligence platforms and other high-throughput big data applications. The method is not only applicable to traditional sentiment lexicons, but also to more comprehensive, multi-dimensional affective resources such as SenticNet. It comprises the following steps: (i) identify ambiguous sentiment terms, (ii) provide context information extracted from a domain-specific training corpus, and (iii) ground this contextual information to structured background knowledge sources such as ConceptNet and WordNet. A quantitative evaluation shows a significant improvement when using an enriched version of SenticNet for polarity classification. Crowdsourced gold standard data in conjunction with a qualitative evaluation sheds light on the strengths and weaknesses of the concept grounding, and on the quality of the enrichment process. PMID:25431524

  11. 40 CFR 74.16 - Application requirements for combustion sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... combustion sources. 74.16 Section 74.16 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... for combustion sources. (a) Opt-in permit application. Each complete opt-in permit application for a combustion source shall contain the following elements in a format prescribed by the Administrator: (1...

  12. 40 CFR 74.16 - Application requirements for combustion sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... combustion sources. 74.16 Section 74.16 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... for combustion sources. (a) Opt-in permit application. Each complete opt-in permit application for a combustion source shall contain the following elements in a format prescribed by the Administrator: (1...

  13. 40 CFR 74.16 - Application requirements for combustion sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... combustion sources. 74.16 Section 74.16 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... for combustion sources. (a) Opt-in permit application. Each complete opt-in permit application for a combustion source shall contain the following elements in a format prescribed by the Administrator: (1...

  14. 40 CFR 74.16 - Application requirements for combustion sources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... combustion sources. 74.16 Section 74.16 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... for combustion sources. (a) Opt-in permit application. Each complete opt-in permit application for a combustion source shall contain the following elements in a format prescribed by the Administrator: (1...

  15. 40 CFR 74.16 - Application requirements for combustion sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... combustion sources. 74.16 Section 74.16 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... for combustion sources. (a) Opt-in permit application. Each complete opt-in permit application for a combustion source shall contain the following elements in a format prescribed by the Administrator: (1...

  16. Comparison of a new integrated current source with the modified Howland circuit for EIT applications.

    PubMed

    Hong, Hongwei; Rahal, Mohamad; Demosthenous, Andreas; Bayford, Richard H

    2009-10-01

    Multi-frequency electrical impedance tomography (MF-EIT) systems require current sources that are accurate over a wide frequency range (up to 1 MHz) and over large load impedance variations. The most commonly employed current source design in EIT systems is the modified Howland circuit (MHC). The MHC requires tight matching of resistors to achieve high output impedance and may suffer from instability over a wide frequency range in an integrated solution. In this paper, we introduce a new integrated current source design in CMOS technology and compare its performance with the MHC. The new integrated design has advantages over the MHC in terms of power consumption and area. The output current and the output impedance of both circuits were determined through simulations and measurements over the frequency range of 10 kHz to 1 MHz. For frequencies up to 1 MHz, the measured maximum variation of the output current for the integrated current source is 0.8%, whereas for the MHC the corresponding value is 1.5%. Although the integrated current source has an output impedance greater than 1 MΩ up to 1 MHz in simulations, in practice the impedance is greater than 160 kΩ up to 1 MHz due to the presence of stray capacitance.

  17. Short-term emergency response planning and risk assessment via an integrated modeling system for nuclear power plants in complex terrain

    NASA Astrophysics Data System (ADS)

    Chang, Ni-Bin; Weng, Yu-Chi

    2013-03-01

    Short-term predictions of potential impacts from accidental releases of various radionuclides at nuclear power plants are acutely needed, especially after the Fukushima accident in Japan. Integrated modeling systems that provide expert services to assess the consequences of accidental or intentional releases of radioactive materials to the atmosphere have received wide attention. These scenarios can be initiated either by accident, due to human, software, or mechanical failures, or by intentional acts such as sabotage and radiological dispersal devices. Stringent action might be required just minutes after the occurrence of an accidental or intentional release. Previous studies, however, seldom consider the suitability of air pollutant dispersion models, or the connectivity between source term, dispersion, and exposure assessment models, in a holistic decision-support context when fulfilling the basic functions of emergency preparedness and response systems. As a result, the Gaussian plume and puff models, which are only suitable for describing neutral air pollutants over flat terrain under limited meteorological situations, are frequently used to predict the impact of accidental releases from industrial sources. In situations with complex terrain or special meteorological conditions, the proposed emergency response actions might then be questionable and even intractable for decision-makers responsible for maintaining public health and environmental quality. This study is a preliminary effort to integrate source term, dispersion, and exposure assessment models into a Spatial Decision Support System (SDSS) to tackle the complex issues of short-term emergency response planning and risk assessment at nuclear power plants. Through a series of model screening procedures, we found that the diagnostic (objective) wind field model, with the aid of sufficient on-site meteorological monitoring data, was the most applicable model for promptly capturing local wind field patterns. However, most of the hazardous materials released into the environment from nuclear power plants are not neutral pollutants, so the particle and multi-segment puff models can be regarded as the most suitable models to couple with the output of the diagnostic wind field model in a modern emergency preparedness and response system. The proposed SDSS illustrates a state-of-the-art system design based on the complex terrain of southern Taiwan. This system design, with 3-dimensional animation capability and a tailored source term model in connection with ArcView® Geographical Information System map layers and remote sensing images, is useful for meeting the design goal for nuclear power plants located in complex terrain.
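
    For reference, the Gaussian plume model mentioned above, valid only for neutrally buoyant pollutants over flat terrain under steady conditions, has the standard textbook form (this is not the particle or multi-segment puff formulation ultimately favored for the SDSS):

    ```latex
    C(x,y,z) = \frac{Q}{2\pi\, u\, \sigma_{y}(x)\, \sigma_{z}(x)}
    \exp\!\left(-\frac{y^{2}}{2\sigma_{y}^{2}}\right)
    \left[\exp\!\left(-\frac{(z-H)^{2}}{2\sigma_{z}^{2}}\right)
        + \exp\!\left(-\frac{(z+H)^{2}}{2\sigma_{z}^{2}}\right)\right],
    ```

    where Q is the release rate, u the mean wind speed, H the effective release height, and σ_y, σ_z the downwind-distance-dependent dispersion parameters; its assumptions are exactly the ones that break down in complex terrain, which motivates coupling a diagnostic wind field model with puff or particle dispersion models.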

  18. Weighted Regressions on Time, Discharge, and Season (WRTDS), with an application to Chesapeake Bay River inputs

    USGS Publications Warehouse

    Hirsch, Robert M.; Moyer, Douglas; Archfield, Stacey A.

    2010-01-01

    A new approach to the analysis of long-term surface water-quality data is proposed and implemented. The goal of this approach is to increase the amount of information that is extracted from the types of rich water-quality datasets that now exist. The method is formulated to allow for maximum flexibility in representations of the long-term trend, seasonal components, and discharge-related components of the behavior of the water-quality variable of interest. It is designed to provide internally consistent estimates of the actual history of concentrations and fluxes as well as histories that eliminate the influence of year-to-year variations in streamflow. The method employs the use of weighted regressions of concentrations on time, discharge, and season. Finally, the method is designed to be useful as a diagnostic tool regarding the kinds of changes that are taking place in the watershed related to point sources, groundwater sources, and surface-water nonpoint sources. The method is applied to datasets for the nine large tributaries of Chesapeake Bay from 1978 to 2008. The results show a wide range of patterns of change in total phosphorus and in dissolved nitrate plus nitrite. These results should prove useful in further examination of the causes of changes, or lack of changes, and may help inform decisions about future actions to reduce nutrient enrichment in the Chesapeake Bay and its watershed.
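
    The weighted-regression surface at the heart of this approach is of the form below (standard notation; the weighting details are given in the paper):

    ```latex
    \ln(c) = \beta_{0} + \beta_{1}\, t + \beta_{2}\, \ln(Q)
           + \beta_{3}\, \sin(2\pi t) + \beta_{4}\, \cos(2\pi t) + \varepsilon ,
    ```

    where c is concentration, Q is daily mean discharge, and t is time in decimal years; the coefficients are re-estimated at every estimation point using weights that decrease with distance in time, discharge, and season, which is what lets the trend, seasonal, and discharge-related components vary flexibly rather than being fixed for the whole record.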

  19. Integrating new Storage Technologies into EOS

    NASA Astrophysics Data System (ADS)

    Peters, Andreas J.; van der Ster, Dan C.; Rocha, Joaquim; Lensing, Paul

    2015-12-01

    The EOS[1] storage software was designed to cover CERN disk-only storage use cases in the medium-term trading scalability against latency. To cover and prepare for long-term requirements the CERN IT data and storage services group (DSS) is actively conducting R&D and open source contributions to experiment with a next generation storage software based on CEPH[3] and ethernet enabled disk drives. CEPH provides a scale-out object storage system RADOS and additionally various optional high-level services like S3 gateway, RADOS block devices and a POSIX compliant file system CephFS. The acquisition of CEPH by Redhat underlines the promising role of CEPH as the open source storage platform of the future. CERN IT is running a CEPH service in the context of OpenStack on a moderate scale of 1 PB replicated storage. Building a 100+PB storage system based on CEPH will require software and hardware tuning. It is of capital importance to demonstrate the feasibility and possibly iron out bottlenecks and blocking issues beforehand. The main idea behind this R&D is to leverage and contribute to existing building blocks in the CEPH storage stack and implement a few CERN specific requirements in a thin, customisable storage layer. A second research topic is the integration of ethernet enabled disks. This paper introduces various ongoing open source developments, their status and applicability.

  20. A real-time laser feedback control method for the three-wave laser source used in the polarimeter-interferometer diagnostic on Joint-TEXT tokamak

    NASA Astrophysics Data System (ADS)

    Xiong, C. Y.; Chen, J.; Li, Q.; Liu, Y.; Gao, L.

    2014-12-01

    A three-wave laser polarimeter-interferometer, equipped with three independent far-infrared laser sources, has been developed on Joint-TEXT (J-TEXT) tokamak. The diagnostic system is capable of high-resolution temporal and phase measurement of the Faraday angle and line-integrated density. However, for long-term operation (>10 min), the free-running lasers can lead to large drifts of the intermediate frequencies (˜100-˜500 kHz/10 min) and decay of laser power (˜10%-˜20%/10 min), which act to degrade diagnostic performance. In addition, these effects lead to increased maintenance cost and limit measurement applicability to long pulse/steady state experiments. To solve this problem, a real-time feedback control method of the laser source is proposed. By accurately controlling the length of each laser cavity, both the intermediate frequencies and laser power can be simultaneously controlled: the intermediate frequencies are controlled according to the pre-set values, while the laser powers are maintained at an optimal level. Based on this approach, a real-time feedback control system has been developed and applied on J-TEXT polarimeter-interferometer. Long-term (theoretically no time limit) feedback of intermediate frequencies (maximum change less than ±12 kHz) and laser powers (maximum relative power change less than ±7%) has been successfully achieved.

  1. A real-time laser feedback control method for the three-wave laser source used in the polarimeter-interferometer diagnostic on Joint-TEXT tokamak.

    PubMed

    Xiong, C Y; Chen, J; Li, Q; Liu, Y; Gao, L

    2014-12-01

    A three-wave laser polarimeter-interferometer, equipped with three independent far-infrared laser sources, has been developed on Joint-TEXT (J-TEXT) tokamak. The diagnostic system is capable of high-resolution temporal and phase measurement of the Faraday angle and line-integrated density. However, for long-term operation (>10 min), the free-running lasers can lead to large drifts of the intermediate frequencies (∼100-∼500 kHz/10 min) and decay of laser power (∼10%-∼20%/10 min), which act to degrade diagnostic performance. In addition, these effects lead to increased maintenance cost and limit measurement applicability to long pulse/steady state experiments. To solve this problem, a real-time feedback control method of the laser source is proposed. By accurately controlling the length of each laser cavity, both the intermediate frequencies and laser power can be simultaneously controlled: the intermediate frequencies are controlled according to the pre-set values, while the laser powers are maintained at an optimal level. Based on this approach, a real-time feedback control system has been developed and applied on J-TEXT polarimeter-interferometer. Long-term (theoretically no time limit) feedback of intermediate frequencies (maximum change less than ±12 kHz) and laser powers (maximum relative power change less than ±7%) has been successfully achieved.
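
    A hypothetical sketch of the kind of feedback loop described in the two records above: a proportional-integral controller that nudges a cavity-length actuator so the measured intermediate frequency stays at its pre-set value. The gains, time step and class interface are illustrative assumptions; the actual J-TEXT controller also regulates laser power, which is omitted here.

```python
class CavityLengthController:
    """Hypothetical PI feedback loop: drive a cavity-length actuator so that the
    measured intermediate (beat) frequency stays at its pre-set value."""

    def __init__(self, f_if_setpoint_hz, kp=1e-9, ki=1e-10, dt=0.01):
        self.setpoint = f_if_setpoint_hz   # pre-set intermediate frequency [Hz]
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def step(self, f_if_measured_hz):
        """Return a cavity-length correction (metres) for one control cycle."""
        error = self.setpoint - f_if_measured_hz
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

# Illustrative use: hold the beat note near 1 MHz with one correction every 10 ms.
ctrl = CavityLengthController(f_if_setpoint_hz=1.0e6)
correction = ctrl.step(f_if_measured_hz=0.95e6)    # actuator then moves the mirror
```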

  2. LS-APC v1.0: a tuning-free method for the linear inverse problem and its application to source-term determination

    NASA Astrophysics Data System (ADS)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek; Stohl, Andreas

    2016-11-01

    Estimation of pollutant releases into the atmosphere is an important problem in the environmental sciences. It is typically formalized as an inverse problem using a linear model that can explain observable quantities (e.g., concentrations or deposition values) as a product of the source-receptor sensitivity (SRS) matrix obtained from an atmospheric transport model multiplied by the unknown source-term vector. Since this problem is typically ill-posed, current state-of-the-art methods are based on regularization of the problem and solution of a formulated optimization problem. This procedure depends on manual settings of uncertainties that are often very poorly quantified, effectively making them tuning parameters. We formulate a probabilistic model that has the same maximum likelihood solution as the conventional method using pre-specified uncertainties. Replacement of the maximum likelihood solution by full Bayesian estimation also allows estimation of all tuning parameters from the measurements. The estimation procedure is based on the variational Bayes approximation which is evaluated by an iterative algorithm. The resulting method is thus very similar to the conventional approach, but with the possibility to also estimate all tuning parameters from the observations. The proposed algorithm is tested and compared with the standard methods on data from the European Tracer Experiment (ETEX) where advantages of the new method are demonstrated. A MATLAB implementation of the proposed algorithm is available for download.
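
    For contrast with the tuning-free approach, the numpy fragment below sketches the conventional regularized (Tikhonov-type) least-squares estimate of the source-term vector x in y ≈ Mx; the hand-chosen values sigma_obs and sigma_prior stand in for the poorly quantified uncertainties that LS-APC instead estimates from the measurements. This is an illustrative sketch, not the authors' MATLAB code.

```python
import numpy as np

def tikhonov_source_term(M, y, sigma_obs, sigma_prior, x_prior=None):
    """Conventional regularized estimate of x in y ≈ M x:
    minimize ||y - Mx||^2 / sigma_obs^2 + ||x - x_prior||^2 / sigma_prior^2.
    sigma_obs and sigma_prior are the manually chosen uncertainties (the
    "tuning parameters" that a tuning-free method would infer from the data)."""
    n = M.shape[1]
    if x_prior is None:
        x_prior = np.zeros(n)
    A = M.T @ M / sigma_obs**2 + np.eye(n) / sigma_prior**2
    b = M.T @ y / sigma_obs**2 + x_prior / sigma_prior**2
    return np.linalg.solve(A, b)       # closed-form minimizer of the quadratic cost
```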

  19. Ramses-GPU: Second order MUSCL-Hancock finite volume fluid solver

    NASA Astrophysics Data System (ADS)

    Kestener, Pierre

    2017-10-01

    RamsesGPU is a reimplementation of RAMSES (ascl:1011.007) which drops the adaptive mesh refinement (AMR) features to optimize 3D uniform grid algorithms for modern graphics processing units (GPUs), providing an efficient software package for astrophysics applications that do not need AMR features but do require a very large number of integration time steps. RamsesGPU provides a very efficient C++/CUDA/MPI software implementation of a second-order MUSCL-Hancock finite volume fluid solver for compressible hydrodynamics, as well as a magnetohydrodynamics solver based on the constrained transport technique. Other useful modules include static gravity, dissipative terms (viscosity, resistivity), and a forcing source term for turbulence studies; special care was taken to enhance parallel input/output performance by using state-of-the-art libraries such as HDF5 and parallel-netcdf.

  4. Singular values behaviour optimization in the diagnosis of feed misalignments in radioastronomical reflectors

    NASA Astrophysics Data System (ADS)

    Capozzoli, Amedeo; Curcio, Claudio; Liseno, Angelo; Savarese, Salvatore; Schipani, Pietro

    2016-07-01

    The communication presents an innovative method for the diagnosis of reflector antennas in radio astronomical applications. The approach is based on the optimization of the number and the distribution of the far-field sampling points exploited to retrieve the antenna status in terms of feed misalignments, in order to drastically reduce the duration of the measurement process, minimize the effects of variable environmental conditions, and simplify the tracking of the source. The feed misplacement is modeled in terms of an aberration function of the aperture field. The relationship between the unknowns and the far-field pattern samples is linearized thanks to a Principal Component Analysis. The number and the positions of the field samples are then determined by optimizing the singular values behaviour of the relevant operator.

  5. Applicability of existing C3 (command, control and communications) vulnerability and hardness analyses to sentry system issues. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, R.C.

    1983-01-13

    This report is a compilation of abstracts resulting from a literature search of reports relevant to Sentry ballistic missile system C3 vulnerability and hardness. Primary sources consulted were the DOD Nuclear Information Analysis Center (DASIAC) and the Defense Technical Information Center (DTIC). Approximately 175 reports were reviewed and abstracted, including several related to computer programs for estimating nuclear effects on electromagnetic propagation. The reports surveyed were ranked in terms of their importance for Sentry C3 vulnerability and hardness (V&H) issues.

  6. Urban stormwater runoff study at Davenport, Iowa

    USGS Publications Warehouse

    Schaap, Bryan D.

    1995-01-01

    Urban storm water runoff is being investigated as a nonpoint source of pollution across the country as urban areas with populations over 100,000 conduct studies designed to meet U.S. Environmental Protection Agency guidelines for National Pollutant Discharge Elimination System permits for their stormwater discharges. From 1991 through 1994, the City of Davenport, Iowa (fig. 1), and the U.S. Geological Survey cooperatively conducted a study designed to meet technical conditions of the permit application and to develop the criteria for ongoing monitoring during the term of the permit. 

  7. Xanthos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-05-30

    Xanthos is a Python package designed to quantify and analyze historical and future global water availability at 0.5° × 0.5° spatial resolution and a monthly time step under a changing climate. Its performance has also been tested through real applications. It is open source, extendable and convenient for researchers who work on long-term climate data for studies of global water supply and for the Global Change Assessment Model (GCAM). The package integrates built-in global gridded data maps, I/O modules, water-balance model modules and diagnostics modules through a user-defined configuration.

  8. Unit with Fluidized Bed for Gas-Vapor Activation of Different Carbonaceous Materials for Various Purposes: Design, Computation, Implementation

    NASA Astrophysics Data System (ADS)

    Strativnov, Eugene

    2017-02-01

    We propose a technology for obtaining a promising material with a wide spectrum of applications: activated nanostructured carbon. In terms of technical indicators, it stands alongside materials produced by complex procedures involving costly chemical operations. It can be used for the following needs: as a sorbent for hemosorption and enterosorption, for the creation of new sources of electric current (lithium and zinc-air batteries, supercapacitors), and for processes of short-cycle adsorption gas separation.

  9. Aircraft gas-turbine engines: Noise reduction and vibration control. (Latest citations from Information Services in Mechanical Engineering data base). Published Search

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-06-01

    The bibliography contains citations concerning the design and analysis of aircraft gas turbine engines with respect to noise and vibration control. Included are studies regarding the measurement and reduction of noise at its source, within the aircraft, and on the ground. Inlet, nozzle and core aerodynamic studies are cited. Propfan, turbofan, turboprop engines, and applications in short take-off and landing (STOL) aircraft are included. (Contains a minimum of 202 citations and includes a subject term index and title list.)

  10. Comparison of digital signal processing modules in gamma-ray spectrometry.

    PubMed

    Lépy, Marie-Christine; Cissé, Ousmane Ibrahima; Pierre, Sylvie

    2014-05-01

    Commercial digital signal-processing modules have been tested for their applicability to gamma-ray spectrometry. The tests were based on the same n-type high purity germanium detector. The spectrum quality was studied in terms of energy resolution and peak area versus shaping parameters, using a Eu-152 point source. The stability of a reference peak count rate versus the total count rate was also examined. The reliability of the quantitative results is discussed for their use in measurement at the metrological level. © 2013 Published by Elsevier Ltd.
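
    As a reminder of how the quoted spectrum-quality figures are typically obtained, the sketch below fits a Gaussian photopeak on a linear background to a region of interest and returns the FWHM (energy resolution, in channels) and the net peak area. The peak model and helper names are generic assumptions, not the analysis chain used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss_lin(x, amp, mu, sigma, b0, b1):
    """Gaussian photopeak on a linear background."""
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + b0 + b1 * x

def peak_metrics(channels, counts, p0):
    """Fit a region of interest and return (FWHM in channels, net peak area).
    p0 is the initial guess (amp, mu, sigma, b0, b1)."""
    popt, _ = curve_fit(gauss_lin, channels, counts, p0=p0)
    amp, mu, sigma, b0, b1 = popt
    fwhm = 2.3548 * abs(sigma)                      # FWHM of a Gaussian
    area = amp * abs(sigma) * np.sqrt(2 * np.pi)    # net counts under the peak
    return fwhm, area
```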

  11. Assessment Methods of Groundwater Overdraft Area and Its Application

    NASA Astrophysics Data System (ADS)

    Dong, Yanan; Xing, Liting; Zhang, Xinhui; Cao, Qianqian; Lan, Xiaoxun

    2018-05-01

    Groundwater is an important source of water, and long-term heavy demand has made groundwater over-exploited. Over-exploitation causes many environmental and geological problems. This paper explores the concept of the over-exploitation area, summarizes the natural and social attributes of over-exploitation areas, and expounds its evaluation methods, including single-factor evaluation, multi-factor system analysis and numerical methods. At the same time, the different methods are compared and analyzed. Then, taking Northern Weifang as an example, this paper demonstrates the practicality of the appraisal methods.

  12. Influence of Factors of Cryopreservation and Hypothermic Storage on Survival and Functional Parameters of Multipotent Stromal Cells of Placental Origin

    PubMed Central

    Pogozhykh, Olena; Mueller, Thomas; Prokopyuk, Olga

    2015-01-01

    Human placenta is a highly promising source of multipotent stromal cells (MSCs), both for the purposes of patient-specific auto-banking and for allogeneic application in regenerative medicine. Implementation of new GMP standards into clinical practice necessitates the search for relevant methods of cryopreservation and short-term hypothermic storage of placental MSCs. In this paper we analyze the effect of different temperature regimes and individual components of cryoprotective media on viability, metabolic and culture properties of placental MSCs. We demonstrate (I) the possibility of short-term hypothermic storage of these cells; (II) determine DMSO and propanediol as the most appropriate cryoprotective agents; (III) show the possibility of application of volume expanders (plasma-substituting solutions based on dextran or polyvinylpyrrolidone); (IV) reveal the priority of ionic composition over the serum content in cryopreservation media; (V) determine a cooling rate of 1°C/min down to -40°C followed by immersion into liquid nitrogen as the optimal cryopreservation regime for this type of cells. This study demonstrates prospects for the creation of new defined cryopreservation methods oriented towards GMP standards. PMID:26431528

  13. A new Caputo time fractional model for heat transfer enhancement of water based graphene nanofluid: An application to solar energy

    NASA Astrophysics Data System (ADS)

    Aman, Sidra; Khan, Ilyas; Ismail, Zulkhibri; Salleh, Mohd Zuki; Tlili, I.

    2018-06-01

    In this article the idea of Caputo time fractional derivatives is applied to MHD mixed convection Poiseuille flow of nanofluids with graphene nanoparticles in a vertical channel. The applications of nanofluids in solar energy are discussed for various solar thermal systems, and it is argued that using nanofluids is an alternative means of producing solar energy in thermal engineering and in solar energy devices in industry. The problem is modelled in terms of PDEs with initial and boundary conditions and solved analytically via the Laplace transform method. The obtained solutions for velocity, temperature and concentration are expressed in terms of the Wright function. These solutions are significantly controlled by the variations of parameters including the thermal Grashof number, solutal Grashof number and nanoparticle volume fraction. Expressions for skin friction, Nusselt and Sherwood numbers are also determined on the left and right walls of the vertical channel, with important numerical results in tabular form. It is found that the rate of heat transfer increases with increasing nanoparticle volume fraction and Caputo time-fractional parameter.
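
    For reference, the Caputo time-fractional derivative of order α invoked by such models is conventionally defined, for 0 < α < 1, as shown below; it reduces to the ordinary first derivative in the limit α → 1.

```latex
{}^{C}\!D_{t}^{\alpha} f(t)
  \;=\; \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{f'(\tau)}{(t-\tau)^{\alpha}}\,\mathrm{d}\tau ,
  \qquad 0 < \alpha < 1 .
```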

  14. Fast space-varying convolution using matrix source coding with applications to camera stray light reduction.

    PubMed

    Wei, Jianing; Bouman, Charles A; Allebach, Jan P

    2014-05-01

    Many imaging applications require the implementation of space-varying convolution for accurate restoration and reconstruction of images. Here, we use the term space-varying convolution to refer to linear operators whose impulse response has slow spatial variation. In addition, these space-varying convolution operators are often dense, so direct implementation of the convolution operator is typically computationally impractical. One such example is the problem of stray light reduction in digital cameras, which requires the implementation of a dense space-varying deconvolution operator. However, other inverse problems, such as iterative tomographic reconstruction, can also depend on the implementation of dense space-varying convolution. While space-invariant convolution can be efficiently implemented with the fast Fourier transform, this approach does not work for space-varying operators. So direct convolution is often the only option for implementing space-varying convolution. In this paper, we develop a general approach to the efficient implementation of space-varying convolution, and demonstrate its use in the application of stray light reduction. Our approach, which we call matrix source coding, is based on lossy source coding of the dense space-varying convolution matrix. Importantly, by coding the transformation matrix, we not only reduce the memory required to store it; we also dramatically reduce the computation required to implement matrix-vector products. Our algorithm is able to reduce computation by approximately factoring the dense space-varying convolution operator into a product of sparse transforms. Experimental results show that our method can dramatically reduce the computation required for stray light reduction while maintaining high accuracy.
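
    The scaling problem the paper addresses can be seen from a toy numpy sketch of direct space-varying convolution, where every output pixel has its own impulse response over the whole input; psf_bank here simply stands for the rows of the dense operator and is purely illustrative. Matrix source coding replaces this O(N^2) product with a product of sparse factors.

```python
import numpy as np

def space_varying_blur(x, psf_bank):
    """Direct (dense) space-varying convolution: output pixel i gets its own
    impulse response psf_bank[i] applied over the whole input vector x.
    Cost is O(N^2) in the number of pixels N, which is exactly what a sparse
    factorization of the operator tries to avoid."""
    N = x.size
    y = np.zeros(N)
    for i in range(N):
        y[i] = psf_bank[i] @ x          # row i of the dense operator
    return y

# Illustrative scale: even a small 256x256 image gives N = 65536, so the dense
# operator has N^2 ≈ 4.3e9 entries -- hence the need for a sparse factorization.
```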

  15. Field mappers for laser material processing

    NASA Astrophysics Data System (ADS)

    Blair, Paul; Currie, Matthew; Trela, Natalia; Baker, Howard J.; Murphy, Eoin; Walker, Duncan; McBride, Roy

    2016-03-01

    The native shape of the single-mode laser beam used for high-power material processing applications is circular with a Gaussian intensity profile. Manufacturers are now demanding the ability to transform the intensity profile and shape to be compatible with a new generation of advanced processing applications that require much higher precision and control. We describe the design, fabrication and application of a dual-optic, beam-shaping system for single-mode laser sources that transforms a Gaussian laser beam by remapping - hence field mapping - the intensity profile to create a wide variety of spot shapes, including discs, donuts, and XY-separable and rotationally symmetric profiles. The pair of optics transform the intensity distribution and subsequently flatten the phase of the beam, with spot sizes and depth of focus close to those of a diffraction-limited beam. The field mapping approach to beam shaping is a refractive solution that does not add speckle to the beam, making it ideal for use with single-mode laser sources and moving beyond the limits of conventional field mapping in terms of spot size and achievable shapes. We describe a manufacturing process for refractive optics in fused silica that uses a freeform direct-write process especially suited for the fabrication of this type of freeform optic. The beam shaper described above was manufactured in conventional UV-fused silica using this process. The fabrication process generates a smooth surface (<1 nm RMS), leading to laser damage thresholds of greater than 100 J/cm2, which is well matched to high-power laser sources. Experimental verification of the dual-optic field mapper is presented.

  16. Microbiological assessment of the application of quicklime and limestone as a measure to stabilize the structure of compaction-prone soils

    NASA Astrophysics Data System (ADS)

    Deltedesco, Evi; Bauer, Lisa-Maria; Unterfrauner, Hans; Peticzka, Robert; Zehetner, Franz; Keiblinger, Katharina Maria

    2014-05-01

    Compaction of soils is caused by increasing mechanization of agriculture and forestry, construction of pipelines, surface mining and land recultivation. This results in degradation of aggregate stability and a decrease of pore space, especially of macropores. It further impairs the water and air permeability, and restricts the habitat of soil organisms. A promising approach to stabilize the structure and improve the permeability of soils is the addition of polyvalent ions such as Ca2+, which can be added in the form of quicklime (CaO) or limestone (CaCO3). In this study, we conducted a greenhouse pot experiment using these two different sources of calcium ions in order to evaluate their effect over time on physical properties and soil microbiology. We sampled silty and clayey soils from three different locations in Austria and incubated them with and without the liming materials (application of 12.5 g) for 3 months in four replicates. In order to assess short-term and medium-term effects, soil samples were taken 2 days, 1 month and 3 months after application of quicklime and limestone. For these samples, we determined pH, bulk density, aggregate stability and water retention characteristics. Further, we measured microbiological parameters, such as potential enzyme activities (cellulase, phosphatase, chitinase, protease, phenoloxidase and peroxidase activity), PLFAs, microbial biomass carbon and nitrogen, dissolved organic carbon and nitrogen, nitrate nitrogen and ammonium nitrogen. In contrast to limestone, quicklime significantly improved soil aggregate stability in all tested soils only 2 days after application. Initially, soil pH was strongly increased by quicklime; however, after the second sampling (one month) the pH values of all tested soils returned to levels comparable to those of the soils treated with limestone. Our preliminary microbiological results show an immediate inhibitory effect of quicklime on most potential hydrolytic enzyme activities and an increase in oxidative enzyme activities. These effects seem to be less pronounced in the medium term. In summary, our results indicate that the application of quicklime is a feasible measure for immediate stabilization of the structure of compaction-prone soils, with only a short-term impact on most microbial parameters.

  17. Terms used by nurses to describe patient problems: can SNOMED III represent nursing concepts in the patient record?

    PubMed Central

    Henry, S B; Holzemer, W L; Reilly, C A; Campbell, K E

    1994-01-01

    OBJECTIVE: To analyze the terms used by nurses in a variety of data sources and to test the feasibility of using SNOMED III to represent nursing terms. DESIGN: Prospective research design with manual matching of terms to the SNOMED III vocabulary. MEASUREMENTS: The terms used by nurses to describe patient problems during 485 episodes of care for 201 patients hospitalized for Pneumocystis carinii pneumonia were identified. Problems from four data sources (nurse interview, intershift report, nursing care plan, and nurse progress note/flowsheet) were classified based on the substantive area of the problem and on the terminology used to describe the problem. A test subset of the 25 most frequently used terms from the two written data sources (nursing care plan and nurse progress note/flowsheet) were manually matched to SNOMED III terms to test the feasibility of using that existing vocabulary to represent nursing terms. RESULTS: Nurses most frequently described patient problems as signs/symptoms in the verbal nurse interview and intershift report. In the written data sources, problems were recorded as North American Nursing Diagnosis Association (NANDA) terms and signs/symptoms with similar frequencies. Of the nursing terms in the test subset, 69% were represented using one or more SNOMED III terms. PMID:7719788

  18. On multidisciplinary research on the application of remote sensing to water resources problems. [Wisconsin

    NASA Technical Reports Server (NTRS)

    Clapp, J. L.

    1973-01-01

    Research objectives during 1972-73 were to: (1) Ascertain the extent to which special aerial photography can be operationally used in monitoring water pollution parameters. (2) Ascertain the effectiveness of remote sensing in the investigation of nearshore mixing and coastal entrapment in large water bodies. (3) Develop an explicit relationship of the extent of the mixing zone in terms of the outfall, effluent and water body characteristics. (4) Develop and demonstrate the use of the remote sensing method as an effective legal implement through which administrative agencies and courts can not only investigate possible pollution sources but also legally prove the source of water pollution. (5) Evaluate the field potential of remote sensing techniques in monitoring algal blooms and aquatic macrophytes, and the use of these as indicators of lake eutrophication level. (6) Develop a remote sensing technique for the determination of the location and extent of hydrologically active source areas in a watershed.

  19. Landscape genomic prediction for restoration of a Eucalyptus foundation species under climate change.

    PubMed

    Supple, Megan Ann; Bragg, Jason G; Broadhurst, Linda M; Nicotra, Adrienne B; Byrne, Margaret; Andrew, Rose L; Widdup, Abigail; Aitken, Nicola C; Borevitz, Justin O

    2018-04-24

    As species face rapid environmental change, we can build resilient populations through restoration projects that incorporate predicted future climates into seed sourcing decisions. Eucalyptus melliodora is a foundation species of a critically endangered community in Australia that is a target for restoration. We examined genomic and phenotypic variation to make empirically based recommendations for seed sourcing. We examined isolation by distance and isolation by environment, determining high levels of gene flow extending for 500 km and correlations with climate and soil variables. Growth experiments revealed extensive phenotypic variation both within and among sampling sites, but no site-specific differentiation in phenotypic plasticity. Model predictions suggest that seed can be sourced broadly across the landscape, providing ample diversity for adaptation to environmental change. Application of our landscape genomic model to E. melliodora restoration projects can identify genomic variation suitable for predicted future climates, thereby increasing the long-term probability of successful restoration. © 2018, Supple et al.

  20. The effect of barriers on wave propagation phenomena: With application for aircraft noise shielding

    NASA Technical Reports Server (NTRS)

    Mgana, C. V. M.; Chang, I. D.

    1982-01-01

    The frequency spectrum was divided into high and low frequency regimes, and two separate methods were developed and applied to account for physical factors associated with flight conditions. For long-wave propagation, the acoustic field due to a point source near a solid obstacle was treated in terms of an inner region, where the fluid motion is essentially incompressible, and an outer region, which is a linear acoustic field generated by hydrodynamic disturbances in the inner region. This method was applied to the case of a finite slotted plate modelled to represent a wing with extended flap, for both stationary and moving media. Ray acoustics, the Kirchhoff integral formulation, and the stationary phase approximation were combined to study short-wavelength propagation in many limiting cases as well as in the case of a semi-infinite plate in a uniform flow velocity with a point source above the plate and embedded in a different flow velocity to simulate an engine exhaust jet stream surrounding the source.

  1. Thermophoresis on boundary layer heat and mass transfer flow of Walters-B fluid past a radiate plate with heat sink/source

    NASA Astrophysics Data System (ADS)

    Vasu, B.; Gorla, Rama Subba Reddy; Murthy, P. V. S. N.

    2017-05-01

    The Walters-B liquid model is employed to simulate medical creams and other rheological liquids encountered in biotechnology and chemical engineering. This rheological model introduces supplementary terms into the momentum conservation equation. The combined effects of thermal radiation and heat sink/source on transient free convective, laminar flow and mass transfer in a viscoelastic fluid past a vertical plate are presented, taking the thermophoresis effect into account. The transformed conservation equations are solved using a stable, robust finite difference method. A parametric study is conducted illustrating the influence of the viscoelasticity parameter (Γ), thermophoretic parameter (τ), thermal radiation parameter (F), heat sink/source (ϕ), Prandtl number (Pr), Schmidt number (Sc), thermal Grashof number (Gr) and solutal Grashof number (Gm) on the temperature and concentration profiles as well as the local skin friction, Nusselt and Sherwood numbers. The results of this parametric study are shown graphically and in tabular form. The study has applications in polymer materials processing.

  2. Biomass burning source characterization requirements in air quality models with and without data assimilation: challenges and opportunities

    NASA Astrophysics Data System (ADS)

    Hyer, E. J.; Zhang, J. L.; Reid, J. S.; Curtis, C. A.; Westphal, D. L.

    2007-12-01

    Quantitative models of the transport and evolution of atmospheric pollution have graduated from the laboratory to become a part of the operational activity of forecast centers. Scientists studying the composition and variability of the atmosphere put great efforts into developing methods for accurately specifying sources of pollution, including natural and anthropogenic biomass burning. These methods must be adapted for use in operational contexts, which impose additional strictures on input data and methods. First, only input data sources available in near real-time are suitable for use in operational applications. Second, operational applications must make use of redundant data sources whenever possible. This is a shift in philosophy: in a research context, the most accurate and complete data set will be used, whereas in an operational context, the system must be designed with maximum redundancy. The goal in an operational context is to produce, to the extent possible, consistent and timely output, given sometimes inconsistent inputs. The Naval Aerosol Analysis and Prediction System (NAAPS), a global operational aerosol analysis and forecast system, recently began incorporating assimilation of satellite-derived aerosol optical depth. Assimilation of satellite AOD retrievals has dramatically improved aerosol analyses and forecasts from this system. The use of aerosol data assimilation also changes the strategy for improving the smoke source function. The absolute magnitude of emissions events can be refined through feedback from the data assimilation system, both in real-time operations and in post-processing analysis of data assimilation results. In terms of the aerosol source functions, the largest gains in model performance are now to be achieved by reducing data latency and minimizing missed detections. In this presentation, recent model development work on the Fire Locating and Monitoring of Burning Emissions (FLAMBE) system that provides smoke aerosol boundary conditions for NAAPS is described, including redundant integration of multiple satellite platforms and development of feedback loops between the data assimilation system and smoke source.

  3. Post-Test Analysis of a 10-Year Sodium Heat Pipe Life Test

    NASA Technical Reports Server (NTRS)

    Rosenfeld, John H.; Locci, Ivan E.; Sanzi, James L.; Hull, David R.; Geng, Steven M.

    2011-01-01

    High-temperature heat pipes are being evaluated for use in energy conversion applications such as fuel cells, gas turbine re-combustors, Stirling cycle heat sources; and with the resurgence of space nuclear power both as reactor heat removal elements and as radiator elements. Long operating life and reliable performance are critical requirements for these applications. Accordingly, long-term materials compatibility is being evaluated through the use of high-temperature life test heat pipes. Thermacore, Inc., has carried out a sodium heat pipe 10-year life test to establish long-term operating reliability. Sodium heat pipes have demonstrated favorable materials compatibility and heat transport characteristics at high operating temperatures in air over long time periods. A representative one-tenth segment Stirling Space Power Converter heat pipe with an Inconel 718 envelope and a stainless steel screen wick has operated for over 87,000 hr (10 years) at nearly 700 C. These life test results have demonstrated the potential for high-temperature heat pipes to serve as reliable energy conversion system components for power applications that require long operating lifetime with high reliability. Detailed design specifications, operating history, and post-test analysis of the heat pipe and sodium working fluid are described. Lessons learned and future life test plans are also discussed.

  4. Ten Year Operating Test Results and Post-Test Analysis of a 1/10 Segment Stirling Sodium Heat Pipe, Phase III

    NASA Technical Reports Server (NTRS)

    Rosenfeld, John H.; Minnerly, Kenneth G.; Dyson, Christopher M.

    2012-01-01

    High-temperature heat pipes are being evaluated for use in energy conversion applications such as fuel cells, gas turbine re-combustors, Stirling cycle heat sources; and with the resurgence of space nuclear power both as reactor heat removal elements and as radiator elements. Long operating life and reliable performance are critical requirements for these applications. Accordingly, long-term materials compatibility is being evaluated through the use of high-temperature life test heat pipes. Thermacore, Inc., has carried out a sodium heat pipe 10-year life test to establish long-term operating reliability. Sodium heat pipes have demonstrated favorable materials compatibility and heat transport characteristics at high operating temperatures in air over long time periods. A representative one-tenth segment Stirling Space Power Converter heat pipe with an Inconel 718 envelope and a stainless steel screen wick has operated for over 87,000 hr (10 yr) at nearly 700 C. These life test results have demonstrated the potential for high-temperature heat pipes to serve as reliable energy conversion system components for power applications that require long operating lifetime with high reliability. Detailed design specifications, operating history, and post-test analysis of the heat pipe and sodium working fluid are described.

  5. Estimation of the caesium-137 source term from the Fukushima Daiichi nuclear power plant using a consistent joint assimilation of air concentration and deposition observations

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne

    2014-01-01

    Inverse modelling techniques can be used to estimate the amount of radionuclides and the temporal profile of the source term released in the atmosphere during the accident of the Fukushima Daiichi nuclear power plant in March 2011. In Winiarek et al. (2012b), the lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques, using activity concentration measurements. The importance of an objective assessment of prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such critical context where the meteorological conditions can make the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, the retrieved source term being very sensitive to this estimation. We propose to extend the use of these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to compute an estimate of the caesium-137 source term jointly using all available data about this radionuclide, such as activity concentrations in the air, but also daily fallout measurements and total cumulated fallout measurements. It is crucial to properly and simultaneously estimate the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition to reliably estimate the a posteriori uncertainty of the estimated source term. Using such techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6-19.3 PBq with an estimated standard deviation range of 15-20% depending on the method and the data sets. The “blind” time intervals of the source term have also been strongly mitigated compared to the first estimations with only activity concentration data.
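
    Schematically, inversions of this kind minimize a cost function of the following form, where μ is the observation vector, H the source-receptor (Jacobian) matrix, σ the source term, σ_b the first guess, and R and B the observation-error and background-error covariance matrices whose objective estimation the abstract emphasises. This is a generic statement of the variational formulation, not a formula quoted from the paper.

```latex
J(\sigma) \;=\; \left(\mu - H\sigma\right)^{\mathsf T} R^{-1} \left(\mu - H\sigma\right)
          \;+\; \left(\sigma - \sigma_b\right)^{\mathsf T} B^{-1} \left(\sigma - \sigma_b\right)
```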

  6. Radionuclides in the Arctic seas from the former Soviet Union: Potential health and ecological risks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Layton, D W; Edson, R; Varela, M

    1999-11-15

    The primary goal of the assessment reported here is to evaluate the health and environmental threat to coastal Alaska posed by radioactive-waste dumping in the Arctic and Northwest Pacific Oceans by the FSU. In particular, the FSU discarded 16 nuclear reactors from submarines and an icebreaker in the Kara Sea near the island of Novaya Zemlya, of which 6 contained spent nuclear fuel (SNF); disposed of liquid and solid wastes in the Sea of Japan; lost a 90Sr-powered radioisotope thermoelectric generator at sea in the Sea of Okhotsk; and disposed of liquid wastes at several sites in the Pacific Ocean, east of the Kamchatka Peninsula. In addition to these known sources in the oceans, the RAIG evaluated FSU waste-disposal practices at inland weapons-development sites that have contaminated major rivers flowing into the Arctic Ocean. The RAIG evaluated these sources for the potential for release to the environment, transport, and impact to Alaskan ecosystems and peoples through a variety of scenarios, including a worst-case total instantaneous and simultaneous release of the sources under investigation. The risk-assessment process described in this report is applicable to and can be used by other circumpolar countries, with the addition of information about specific ecosystems and human life-styles. They can use the ANWAP risk-assessment framework and approach used by ONR to establish potential doses for Alaska, but add their own specific data sets about human and ecological factors. The ANWAP risk assessment addresses the following Russian wastes, media, and receptors: dumped nuclear submarines and icebreaker in Kara Sea--marine pathways; solid reactor parts in Sea of Japan and Pacific Ocean--marine pathways; thermoelectric generator in Sea of Okhotsk--marine pathways; current known aqueous wastes in Mayak reservoirs and Asanov Marshes--riverine to marine pathways; and Alaska as receptor. For these waste and source terms addressed, other pathways, such as atmospheric transport, could be considered under future-funded research efforts for impacts to Alaska. The ANWAP risk assessment does not address the following wastes, media, and receptors: radioactive sources in Alaska (except to add perspective for Russian source term); radioactive wastes associated with Russian naval military operations and decommissioning; Russian production reactor and spent-fuel reprocessing facilities nonaqueous source terms; atmospheric, terrestrial and nonaqueous pathways; and dose calculations for any circumpolar locality other than Alaska. These other, potentially serious sources of radioactivity to the Arctic environment, while outside the scope of the current ANWAP mandate, should be considered for future funded research efforts.

  7. SU-F-T-28: Evaluation of BEBIG HDR Co-60 After-Loading System for Skin Cancer Treatment Using Conical Surface Applicator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Safigholi, H; Soliman, A; Song, W Y

    Purpose: To evaluate the possibility of utilizing the BEBIG HDR 60Co remote after-loading system for malignant skin surface treatment using the Monte Carlo (MC) simulation technique. Methods: First, TG-43 parameters of the BEBIG Co-60 and Nucletron Ir-192 mHDR-V2 brachytherapy sources were simulated using the MCNP6 code to benchmark the sources against the literature. Second, a conical tungsten-alloy applicator with a 3-cm-diameter planning target volume (PTV) at the surface was designed for use with a single stepping HDR source. The HDR source is modeled parallel to the treatment plane at the center of the conical applicator, with a source-surface distance (SSD) of 1.5 cm and a removable plastic end-cap with a 1-mm thickness. Third, MC-calculated dose distributions from the HDR Co-60 source in the conical surface applicator were compared with the simulated data for the HDR Ir-192 source. The initial calculations were made with the same conical surface applicator (standard applicator) dimensions as the ones used with the Ir-192 system. Fourth, the applicator wall thickness for the Co-60 system was increased (doubled) to diminish leakage dose to the levels received when using the Ir-192 system. With this geometry, the percentage depth dose (PDD) and relative 2D dose profiles in the transverse/coronal planes, normalized at the 3-mm prescription depth, were evaluated along the central axis. Results: PDDs for Ir-192 and Co-60 were similar with both the standard and the thick-walled applicator. The 2D relative dose distribution of Co-60 inside the standard conical applicator generated a larger penumbra (7.6%). With the thick-walled applicator, it produced a smaller penumbra (<4%) compared to the Ir-192 source in the standard conical applicator. Dose leakage outside the thick-walled applicator with the Co-60 source was approximately equal (≤3%) to that of the standard applicator with the Ir-192 source. Conclusion: Skin cancer treatment of equal quality can be performed with a Co-60 source and thick-walled conical applicators instead of Ir-192 with standard applicators. These conical surface applicators must be used with a protective plastic end-cap to eliminate electron contamination and over-dosage of the skin.

  8. Development of a Persistent Reactive Treatment Zone for Containment of Sources Located in Lower-Permeability Strata

    NASA Astrophysics Data System (ADS)

    Marble, J.; Carroll, K. C.; Brusseau, M. L.; Plaschke, M.; Brinker, F.

    2013-12-01

    Source zones located in relatively deep, low-permeability formations provide special challenges for remediation. Application of permeable reactive barriers, in-situ thermal, or electrokinetic methods would be expensive and generally impractical. In addition, the use of enhanced mass-removal approaches based on reagent injection (e.g., ISCO, enhanced-solubility reagents) is likely to be ineffective. One possible approach for such conditions is to create a persistent treatment zone for purposes of containment. This study examines the efficacy of this approach for containment and treatment of contaminants in a lower permeability zone using potassium permanganate (KMnO4) as the reactant. A localized 1,1-dichloroethene (DCE) source zone is present in a section of the Tucson International Airport Area (TIAA) Superfund Site. Characterization studies identified the source of DCE to be located in lower-permeability strata adjacent to the water table. Bench-scale studies were conducted using core material collected from boreholes drilled at the site to measure DCE concentrations and determine natural oxidant demand. The reactive zone was created by injecting ~1.7% KMnO4 solution into multiple wells screened within the lower-permeability unit. The site has been monitored for ~8 years to characterize the spatial distribution of DCE and permanganate. KMnO4 continues to persist at the site, demonstrating successful creation of a long-term reactive zone. Additionally, the footprint of the DCE contaminant plume in groundwater has decreased continuously with time. This project illustrates the application of ISCO as a reactive-treatment system for lower-permeability source zones, which appears to effectively mitigate persistent mass flux into groundwater.

  9. Carbon Nanotube Based Nanotechnology for NASA Mission Needs and Societal Applications

    NASA Technical Reports Server (NTRS)

    Li, Jing; Meyyappan, M.

    2011-01-01

    Carbon nanotubes (CNTs) exhibit extraordinary mechanical properties and unique electronic properties and have therefore received much attention for more than a decade for a variety of applications ranging from nanoelectronics and composites to meeting needs in the energy, environmental and other sectors. In this talk, we focus on some near-term potential CNT applications for both NASA and other agency/societal needs. The most promising and successful application to date is a nano chem sensor at TRL 6 that uses a 16-256 sensor array in the construction of an electronic nose. Pristine, doped, functionalized and metal-loaded SWCNTs are used as conducting materials to provide chemical variation across the individual elements of the sensor array. This miniaturized sensor has been incorporated in an iPhone for homeland security applications. Detection of gases and vapors relevant to leak detection in crew vehicles, biomedical applications, mining, chemical threats, industrial spills and others has been demonstrated. SWCNTs also respond to radiation exposure via a change in conductivity, and therefore a similar strategy is being pursued to construct a radiation nose to identify radiation sources (gamma, protons, neutrons, X-ray, etc.) and their energy levels. Carbon nanofibers (CNFs) grown using plasma-enhanced CVD typically are vertical, individual, freestanding structures and therefore are ideal for the construction of nanoelectrodes. A nanoelectrode array (NEA) can be the basis for an affinity-based biosensor to meet needs in applications such as lab-on-a-chip, environmental monitoring, cancer diagnostics, biothreat monitoring, water and food safety and others. A couple of demonstrations, including detection of E. coli and ricin, will be discussed. The NEA is also useful for implantation in the brain for deep brain stimulation and neuroengineering applications. Miniaturization of payloads such as science instrumentation and power sources is critical to reduce launch costs. High-current-density (greater than 100 mA per square centimeter) field emission capabilities of CNTs can be exploited for the construction of electron guns for electron microscopy and X-ray tubes for spectrometers and baggage screening. A CNT pillar array configuration has been demonstrated, not only meeting the high current density needs but, more importantly, providing long-term emitter stability. Finally, supercapacitors hold the promise of combining the high energy density of a battery with the high power density of capacitors. Traditional graphite electrodes have not yet delivered this promise. A novel design and processing approach using MWCNTs has shown a record 550 F/g capacitance along with significant device endurance. This supercapacitor is suitable for railgun launch applications for NASA, powering rovers and robots, consumer electronics and future hybrid vehicles.

  10. Long Term 2 Second Round Source Water Monitoring and Bin Placement Memo

    EPA Pesticide Factsheets

    The Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) applies to all public water systems served by a surface water source or public water systems served by a ground water source under the direct influence of surface water.

  11. User Interface Design in Medical Distributed Web Applications.

    PubMed

    Serban, Alexandru; Crisan-Vida, Mihaela; Mada, Leonard; Stoicu-Tivadar, Lacramioara

    2016-01-01

    User interfaces are important to facilitate easy learning and operation of an IT application, especially in the medical world. An easy-to-use interface has to be simple and to accommodate the user's needs and mode of operation. The technology in the background is an important tool to accomplish this. The present work aims at creating a web interface using specific technology (HTML table design combined with CSS3) to provide an optimized, responsive interface for a complex web application. In the first phase, the current icMED web medical application layout is analyzed, and its structure is designed using specific tools on source files. In the second phase, a new graphical interface adaptable to different mobile terminals is proposed (using the HTML table design (TD) and CSS3 method) that uses no source files, just lines of code for layout design, improving the interaction in terms of speed and simplicity. For a complex medical software application, a new prototype layout was designed and developed using HTML tables. The method uses CSS code with CSS classes that can be applied to one or multiple HTML table elements, instead of CSS styles that can be applied to just one DIV tag at a time. The technique has the advantage of simplified CSS code and better adaptability to different media resolutions compared to the DIV-CSS style method. The presented work is proof that adaptive web interfaces can be developed just by using and combining different types of design methods and technologies, using HTML table design, resulting in an interface that is simpler to learn and use, suitable for healthcare services.

  12. Uncertainty, variability, and earthquake physics in ground‐motion prediction equations

    USGS Publications Warehouse

    Baltay, Annemarie S.; Hanks, Thomas C.; Abrahamson, Norm A.

    2017-01-01

    Residuals between ground‐motion data and ground‐motion prediction equations (GMPEs) can be decomposed into terms representing earthquake source, path, and site effects. These terms can be cast in terms of repeatable (epistemic) residuals and the random (aleatory) components. Identifying the repeatable residuals leads to a GMPE with reduced uncertainty for a specific source, site, or path location, which in turn can yield a lower hazard level at small probabilities of exceedance. We illustrate a schematic framework for this residual partitioning with a dataset from the ANZA network, which straddles the central San Jacinto fault in southern California. The dataset consists of more than 3200 1.15≤M≤3 earthquakes and their peak ground accelerations (PGAs), recorded at close distances (R≤20  km). We construct a small‐magnitude GMPE for these PGA data, incorporating VS30 site conditions and geometrical spreading. Identification and removal of the repeatable source, path, and site terms yield an overall reduction in the standard deviation from 0.97 (in ln units) to 0.44, for a nonergodic assumption, that is, for a single‐source location, single site, and single path. We give examples of relationships between independent seismological observables and the repeatable terms. We find a correlation between location‐based source terms and stress drops in the San Jacinto fault zone region; an explanation of the site term as a function of kappa, the near‐site attenuation parameter; and a suggestion that the path component can be related directly to elastic structure. These correlations allow the repeatable source location, site, and path terms to be determined a priori using independent geophysical relationships. Those terms could be incorporated into location‐specific GMPEs for more accurate and precise ground‐motion prediction.
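
    In the usual mixed-effects notation (which may differ from the paper's exact symbols), the decomposition described above can be written as below, where δB_e is the repeatable source (event) term, δS2S_s the repeatable site term, δP_es the repeatable path term, and ε_es the remaining aleatory residual; removing the three repeatable terms is what reduces the standard deviation from 0.97 to 0.44 in the nonergodic case.

```latex
\ln Y_{es} \;=\; \mu_{\mathrm{GMPE}}\!\left(M_e, R_{es}, V_{S30,s}\right)
          \;+\; \delta B_{e} \;+\; \delta S2S_{s} \;+\; \delta P_{es} \;+\; \varepsilon_{es}
```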

  13. Systematically biological prioritizing remediation sites based on datasets of biological investigations and heavy metals in soil

    NASA Astrophysics Data System (ADS)

    Lin, Wei-Chih; Lin, Yu-Pin; Anthony, Johnathen

    2015-04-01

    Heavy metal pollution has adverse effects not only on the focal invertebrate species of this study, such as reduction in pupa weight and increased larval mortality, but also on the higher trophic level organisms which feed on them, either directly or indirectly, through the process of biomagnification. Despite this, few studies regarding remediation prioritization take species distribution or biological conservation priorities into consideration. This study develops a novel approach for delineating sites which are both contaminated by any of 5 readily bioaccumulated heavy metal soil contaminants and are of high ecological importance for the highly mobile, low trophic level focal species. The conservation priority of each site was based on the projected distributions of 6 moth species simulated via the presence-only maximum entropy species distribution model followed by the subsequent application of a systematic conservation tool. In order to increase the number of available samples, we also integrated crowd-sourced data with professionally-collected data via a novel optimization procedure based on a simulated annealing algorithm. This integration procedure was important because, while crowd-sourced data can drastically increase the number of data samples available to ecologists, the quality or reliability of such data can be called into question, adding yet another source of uncertainty in projecting species distributions. The optimization method screens crowd-sourced data in terms of the environmental variables which correspond to professionally-collected data. The sample distribution data were derived from two sources: the EnjoyMoths project in Taiwan (crowd-sourced data) and the Global Biodiversity Information Facility (GBIF) field data (professional data). The distributions of heavy metal concentrations were generated via 1000 iterations of a geostatistical co-simulation approach. The uncertainties in distributions of the heavy metals were then quantified based on the overall consistency between realizations. Finally, Information-Gap Decision Theory (IGDT) was applied to rank the remediation priorities of contaminated sites in terms of both spatial consensus of multiple heavy metal realizations and the priority of specific conservation areas. Our results show that the crowd-sourced optimization algorithm developed in this study is effective at selecting suitable records from the crowd-sourced data. By using this technique, the available sample sizes increased to 96, 162, 72, 62, 69 and 62, that is, 2.6, 1.6, 2.5, 1.6, 1.2 and 1.8 times those originally available through the professionally assembled GBIF database. Additionally, for all species considered, models based on the combination of both data sources outperformed, in terms of test-AUC values, models based on a single data source. Furthermore, the additional optimization-selected data lowered the overall variability, and therefore uncertainty, of model outputs. Based on the projected species distributions, our results revealed that around 30% of high species hotspot areas were also identified as contaminated. The decision-making tool, IGDT, successfully yielded remediation plans in terms of specific ecological value requirements, false positive tolerance rates of contaminated areas, and expected decision robustness.
    The proposed approach can be applied both to identify high conservation priority sites contaminated by heavy metals, based on the combination of screened crowd-sourced and professionally-collected data, and to make robust remediation decisions.
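
    A schematic Python sketch of one way the simulated-annealing screening could work: choose a subset of crowd-sourced records whose environmental covariates best match those of the professional records. The objective function, cooling schedule and function name are assumptions made for illustration, not the authors' algorithm.

```python
import numpy as np

def screen_crowd_records(crowd_env, pro_env, k, n_iter=5000, t0=1.0, seed=0):
    """Toy simulated-annealing screen: pick k crowd-sourced records whose
    environmental covariates (rows of crowd_env) best match the mean covariates
    of the professional records (rows of pro_env)."""
    rng = np.random.default_rng(seed)
    target = pro_env.mean(axis=0)

    def cost(idx):
        return np.linalg.norm(crowd_env[idx].mean(axis=0) - target)

    idx = rng.choice(len(crowd_env), size=k, replace=False)
    best, best_cost = idx.copy(), cost(idx)
    for step in range(n_iter):
        temp = t0 * (1.0 - step / n_iter) + 1e-9        # linear cooling schedule
        cand = idx.copy()
        cand[rng.integers(k)] = rng.integers(len(crowd_env))  # swap one record
        if len(set(cand)) < k:
            continue                                     # skip duplicate picks
        d = cost(cand) - cost(idx)
        if d < 0 or rng.random() < np.exp(-d / temp):    # Metropolis acceptance
            idx = cand
            if cost(idx) < best_cost:
                best, best_cost = idx.copy(), cost(idx)
    return best
```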

  14. A Systematic Review of Techniques and Sources of Big Data in the Healthcare Sector.

    PubMed

    Alonso, Susel Góngora; de la Torre Díez, Isabel; Rodrigues, Joel J P C; Hamrioui, Sofiane; López-Coronado, Miguel

    2017-10-14

    The main objective of this paper is to present a review of existing research in the literature referring to Big Data sources and techniques in the health sector and to identify which of these techniques are the most used in the prediction of chronic diseases. Academic databases and systems such as IEEE Xplore, Scopus, PubMed and Science Direct were searched, considering the date of publication from 2006 until the present time. Several search criteria were established, such as 'techniques' OR 'sources' AND 'Big Data' AND 'medicine' OR 'health', 'techniques' AND 'Big Data' AND 'chronic diseases', etc., selecting the papers considered of interest regarding the description of the techniques and sources of Big Data in healthcare. The search found a total of 110 articles on techniques and sources of Big Data in health, of which only 32 were identified as relevant work. Many of the articles describe the Big Data platforms, sources and databases used and identify the techniques most used in the prediction of chronic diseases. From the review of the analyzed research articles, it can be noticed that the sources and techniques of Big Data used in the health sector represent a relevant factor in terms of effectiveness, since they allow the application of predictive analysis techniques in tasks such as identification of patients at risk of readmission, prevention of hospital-acquired infections or chronic diseases, and obtaining predictive models of quality.

  15. Automated mapping of clinical terms into SNOMED-CT. An application to codify procedures in pathology.

    PubMed

    Allones, J L; Martinez, D; Taboada, M

    2014-10-01

    Clinical terminologies are considered a key technology for capturing clinical data in a precise and standardized manner, which is critical to accurately exchange information among different applications, medical records and decision support systems. An important step to promote the real use of clinical terminologies, such as SNOMED-CT, is to facilitate the process of finding mappings between local terms of medical records and concepts of the terminologies. In this paper, we propose a mapping tool to discover text-to-concept mappings in SNOMED-CT. Name-based techniques were combined with a query expansion system to generate alternative search terms, and with a strategy to analyze and take advantage of the semantic relationships of the SNOMED-CT concepts. The developed tool was evaluated and compared to the search services provided by two SNOMED-CT browsers. Our tool automatically mapped clinical terms from a Spanish glossary of procedures in pathology with 88.0% precision and 51.4% recall, providing a substantial improvement in recall (28% and 60%) over other publicly accessible mapping services. The improvements reached by the mapping tool are encouraging. Our results demonstrate the feasibility of accurately mapping clinical glossaries to SNOMED-CT concepts by means of a combination of structural, query expansion and name-based techniques. We have shown that SNOMED-CT is a great source of knowledge from which to infer synonyms for the medical domain. Results show that an automated query expansion system partially overcomes the challenge of vocabulary mismatch.
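
    A toy Python sketch of name-based matching with query expansion of the kind described: normalize the local term, look it up directly, and if that fails retry with synonym substitutions. The index and synonym entries shown are made-up placeholders, not SNOMED-CT content.

```python
import re

def normalize(term):
    """Lowercase, strip punctuation, and tokenize a clinical term."""
    return re.sub(r"[^a-z0-9 ]+", " ", term.lower()).split()

def map_term(term, concept_index, synonyms):
    """Name-based mapping with query expansion: try the local term as-is, then
    retry with each token replaced by its synonyms.  concept_index maps
    normalized strings to concept IDs; both inputs are illustrative stand-ins
    for SNOMED-CT descriptions and a domain synonym resource."""
    tokens = normalize(term)
    key = " ".join(tokens)
    if key in concept_index:                      # direct name-based match
        return concept_index[key]
    for i, tok in enumerate(tokens):              # query expansion step
        for alt in synonyms.get(tok, []):
            alt_key = " ".join(tokens[:i] + [alt] + tokens[i + 1:])
            if alt_key in concept_index:
                return concept_index[alt_key]
    return None                                   # unmapped

# Illustrative use with made-up entries:
index = {"excision of skin lesion": "SCTID-hypothetical-1"}
syn = {"removal": ["excision"]}
print(map_term("Removal of skin lesion", index, syn))
```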

  16. Source term identification in atmospheric modelling via sparse optimization

    NASA Astrophysics Data System (ADS)

    Adam, Lukas; Branda, Martin; Hamburger, Thomas

    2015-04-01

    Inverse modelling plays an important role in identifying the amount of harmful substances released into the atmosphere during major incidents such as power plant accidents or volcano eruptions. Another possible application of inverse modelling lies in monitoring CO2 emission limits, where only observations at certain places are available and the task is to estimate the total releases at given locations. This gives rise to minimizing the discrepancy between the observations and the model predictions. There are two standard ways of solving such problems. In the first, the discrepancy is regularized by adding additional terms; such terms may include Tikhonov regularization, a distance from a priori information or a smoothing term. The resulting, usually quadratic, problem is then solved via standard optimization solvers. The second approach assumes that the error term has a (normal) distribution and makes use of Bayesian modelling to identify the source term. Instead of following the above-mentioned approaches, we utilize techniques from the field of compressive sensing. Such techniques look for the sparsest solution (the solution with the smallest number of nonzeros) of a linear system, where a maximal allowed error term may be added to this system. Even though this field is well developed, with many possible solution techniques, most of them do not consider even the simplest constraints which are naturally present in atmospheric modelling; one such example is the nonnegativity of release amounts. We believe that the concept of a sparse solution is natural both in identifying the source location and in identifying the time profile of the source release. In the first case, it is usually assumed that there are only a few release points and the task is to find them. In the second case, the time window is usually much longer than the duration of the actual release. In both cases, the optimal solution should contain a large number of zeros, giving rise to the concept of sparsity. In the paper, we summarize several optimization techniques which are used for finding sparse solutions and propose modifications to handle selected constraints such as nonnegativity constraints and simple linear constraints, for example a minimal or maximal total release. These techniques range from successive convex approximations to the solution of a single nonconvex problem. On simple examples, we explain these techniques and compare them in terms of implementation simplicity, approximation capability and convergence properties. Finally, these methods are applied to the European Tracer Experiment (ETEX) data and the results are compared with current state-of-the-art techniques such as regularized least squares or the Bayesian approach. The obtained results show surprisingly good performance of these techniques. This research is supported by the EEA/Norwegian Financial Mechanism under project 7F14287 STRADI.
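
    The sparsity-promoting formulation with a nonnegativity constraint can be illustrated with a minimal projected-gradient sketch on synthetic data (this is not the authors' solver or the ETEX setup): it minimizes 0.5·||Ax − y||² + λ·Σx subject to x ≥ 0, where for nonnegative x the l1 penalty reduces to the sum of the entries.

```python
import numpy as np

# Minimal sketch (synthetic data, not the ETEX setup): recover a sparse,
# nonnegative release vector x from observations y = A @ x + noise by
# minimizing 0.5 * ||A x - y||^2 + lam * sum(x) subject to x >= 0.
# For x >= 0 the l1 penalty is just sum(x), so projected gradient suffices.

rng = np.random.default_rng(0)
m, n = 40, 200                               # few observations, many candidates
A = rng.standard_normal((m, n))              # stand-in for the dispersion model
x_true = np.zeros(n)
x_true[[10, 87, 150]] = [2.0, 1.0, 3.0]      # sparse nonnegative source term
y = A @ x_true + 0.01 * rng.standard_normal(m)

lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1 / Lipschitz constant of gradient
x = np.zeros(n)
for _ in range(5000):
    grad = A.T @ (A @ x - y)                 # gradient of the data-fit term
    x = np.maximum(x - step * (grad + lam), 0.0)   # gradient step + projection

print("recovered nonzero release indices:", np.flatnonzero(x > 0.1))
```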

  17. Brain-computer interface controlled gaming: evaluation of usability by severely motor restricted end-users.

    PubMed

    Holz, Elisa Mira; Höhne, Johannes; Staiger-Sälzer, Pit; Tangermann, Michael; Kübler, Andrea

    2013-10-01

    Connect-Four, a new sensorimotor rhythm (SMR) based brain-computer interface (BCI) gaming application, was evaluated by four severely motor restricted end-users; two were in the locked-in state and had unreliable eye movement. Following the user-centred approach, usability of the BCI prototype was evaluated in terms of effectiveness (accuracy), efficiency (information transfer rate (ITR) and subjective workload) and users' satisfaction. Online performance varied strongly across users and sessions (median accuracy (%) of end-users: A=.65; B=.60; C=.47; D=.77). Our results thus yielded low to medium effectiveness in three end-users and high effectiveness in one end-user. Consequently, ITR was low (0.05-1.44 bits/min). Only two end-users were able to play the game in free-mode. Total workload was moderate but varied strongly across sessions. Main sources of workload were mental and temporal demand. Furthermore, frustration contributed to the subjective workload of two end-users. Nevertheless, most end-users accepted the BCI application well and rated satisfaction medium to high. Sources of dissatisfaction were (1) electrode gel and cap, (2) low effectiveness, (3) time-consuming adjustment and (4) BCI equipment that was not easy to use. All four end-users indicated ease of use as being one of the most important aspects of BCI. Effectiveness and efficiency were lower than in applications using the event-related potential as the input channel. Nevertheless, the SMR-BCI application was satisfactorily accepted by the end-users and two of four could imagine using the BCI application in their daily life. Thus, despite moderate effectiveness and efficiency, BCIs might be an option when controlling an application for entertainment. Copyright © 2013 Elsevier B.V. All rights reserved.
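
    For reference, the information transfer rate of a BCI is commonly computed with the Wolpaw formula; the sketch below is a worked example of that formula, with a hypothetical rate of 6 selections per minute (the abstract does not state the selection rate actually used).

```python
import math

def wolpaw_itr(accuracy, n_classes, selections_per_minute):
    """Information transfer rate in bits/min using the Wolpaw formula."""
    p, n = accuracy, n_classes
    if p <= 1.0 / n:
        return 0.0                    # at or below chance level: no information
    bits = math.log2(n) + p * math.log2(p)
    if p < 1.0:
        bits += (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * selections_per_minute

# Example: a binary (2-class) selection at 77% accuracy, 6 selections/min
# (the selection rate here is a hypothetical value for illustration).
print(round(wolpaw_itr(0.77, 2, 6), 2), "bits/min")
```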

  18. Interlaboratory study of the ion source memory effect in 36Cl accelerator mass spectrometry

    NASA Astrophysics Data System (ADS)

    Pavetich, Stefan; Akhmadaliev, Shavkat; Arnold, Maurice; Aumaître, Georges; Bourlès, Didier; Buchriegler, Josef; Golser, Robin; Keddadouche, Karim; Martschini, Martin; Merchel, Silke; Rugel, Georg; Steier, Peter

    2014-06-01

    Understanding and minimization of contaminations in the ion source due to cross-contamination and long-term memory effect is one of the key issues for accurate accelerator mass spectrometry (AMS) measurements of volatile elements. The focus of this work is on the investigation of the long-term memory effect for the volatile element chlorine, and the minimization of this effect in the ion source of the Dresden accelerator mass spectrometry facility (DREAMS). For this purpose, one of the two original HVE ion sources at the DREAMS facility was modified, allowing the use of larger sample holders having individual target apertures. Additionally, a more open geometry was used to improve the vacuum level. To evaluate this improvement in comparison to other up-to-date ion sources, an interlaboratory comparison had been initiated. The long-term memory effect of the four Cs sputter ion sources at DREAMS (two sources: original and modified), ASTER (Accélérateur pour les Sciences de la Terre, Environnement, Risques) and VERA (Vienna Environmental Research Accelerator) had been investigated by measuring samples of natural 35Cl/37Cl-ratio and samples highly-enriched in 35Cl (35Cl/37Cl ∼ 999). Besides investigating and comparing the individual levels of long-term memory, recovery time constants could be calculated. The tests show that all four sources suffer from long-term memory, but the modified DREAMS ion source showed the lowest level of contamination. The recovery times of the four ion sources were widely spread between 61 and 1390 s, where the modified DREAMS ion source with values between 156 and 262 s showed the fastest recovery in 80% of the measurements.
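
    A recovery time constant of the kind reported can be estimated by fitting an exponential relaxation to the measured isotope ratio after switching back to natural material; the sketch below uses synthetic data, not the published DREAMS/ASTER/VERA measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal sketch with synthetic data (not the published measurements):
# after switching from a 35Cl-enriched sample back to natural material,
# assume the measured 35Cl/37Cl ratio relaxes exponentially to its baseline
# and estimate the recovery time constant tau by least-squares fitting.

def relaxation(t, baseline, amplitude, tau):
    return baseline + amplitude * np.exp(-t / tau)

rng = np.random.default_rng(1)
t = np.linspace(0, 1200, 60)                      # seconds after switching
ratio = relaxation(t, 3.13, 5.0, 200.0)           # natural 35Cl/37Cl ratio ~3.13
ratio += 0.05 * rng.standard_normal(t.size)       # measurement scatter

popt, _ = curve_fit(relaxation, t, ratio, p0=(3.0, 4.0, 300.0))
print(f"estimated recovery time constant: {popt[2]:.0f} s")
```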

  19. Term Coverage of Dietary Supplements Ingredients in Product Labels.

    PubMed

    Wang, Yefeng; Adam, Terrence J; Zhang, Rui

    2016-01-01

    As the clinical application and consumption of dietary supplements have grown, their side effects and possible interactions with prescribed medications have become a serious issue. Extraction of dietary supplement-related information is a critical need to support dietary supplement research. However, there is currently no existing terminology for dietary supplements, which places a barrier to informatics research in this field. The terms related to dietary supplement ingredients should be collected and normalized before a terminology can be established, in order to facilitate convenient searching of safety information and control of possible adverse effects of dietary supplements. In this study, the Dietary Supplement Label Database (DSLD) was chosen as the data source from which the ingredient information was extracted and normalized. The distributions of the dietary supplements by product type and ingredient type were analyzed. The ingredient terms were then mapped to existing terminologies, including UMLS, RxNorm and NDF-RT, by using MetaMap and RxMix. A large gap between existing terminologies and the ingredient terms was found: only 14.67%, 19.65%, and 12.88% of ingredient terms were covered by UMLS, RxNorm and NDF-RT, respectively.
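
    Once mapping results are available, the coverage percentages reported above are simple set statistics; the sketch below uses entirely hypothetical terms and placeholder identifiers, not the DSLD data.

```python
# Minimal sketch (hypothetical data): compute the fraction of ingredient
# terms that received at least one mapping in each target terminology.

ingredient_terms = ["ginkgo biloba", "fish oil", "vitamin d3", "ashwagandha"]

# Hypothetical mapping results: term -> list of placeholder concept IDs.
mappings = {
    "UMLS":   {"ginkgo biloba": ["CUI_A"], "fish oil": ["CUI_B"], "vitamin d3": ["CUI_C"]},
    "RxNorm": {"fish oil": ["RXCUI_X"]},
}

for terminology, hits in mappings.items():
    covered = sum(1 for term in ingredient_terms if hits.get(term))
    print(f"{terminology}: {100 * covered / len(ingredient_terms):.1f}% of terms covered")
```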

  20. Data collection and storage in long-term ecological and evolutionary studies: The Mongoose 2000 system.

    PubMed

    Marshall, Harry H; Griffiths, David J; Mwanguhya, Francis; Businge, Robert; Griffiths, Amber G F; Kyabulima, Solomon; Mwesige, Kenneth; Sanderson, Jennifer L; Thompson, Faye J; Vitikainen, Emma I K; Cant, Michael A

    2018-01-01

    Studying ecological and evolutionary processes in the natural world often requires research projects to follow multiple individuals in the wild over many years. These projects have provided significant advances but may also be hampered by the need to accurately and efficiently collect and store multiple streams of data from multiple individuals concurrently. The increase in the availability and sophistication of portable computers (smartphones and tablets) and the applications that run on them has the potential to address many of these data collection and storage issues. In this paper we describe the challenges faced by one such long-term, individual-based research project: the Banded Mongoose Research Project in Uganda. We describe a system we have developed called Mongoose 2000 that utilises the potential of apps and portable computers to meet these challenges. We discuss the benefits and limitations of employing such a system in a long-term research project. The app and source code for the Mongoose 2000 system are freely available and we detail how it might be used to aid data collection and storage in other long-term individual-based projects.

  1. Development of Approach for Long-Term Management of Disused Sealed Radioactive Sources - 13630

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kinker, M.; Reber, E.; Mansoux, H.

    Radioactive sources are used widely throughout the world in a variety of medical, industrial, research and military applications. When such radioactive sources are no longer used and are not intended to be used for the practice for which an authorization was granted, they are designated as 'disused sources'. Whether appropriate controls are in place during the useful life of a source or not, the end of this useful life is often a turning point after which it is more difficult to ensure the safety and security of the source over time. For various reasons, many disused sources cannot be returned to the manufacturer or the supplier for reuse or recycling. When these attempts fail, disused sources should be declared as radioactive waste and should be managed as such, in compliance with relevant international legal instruments and safety standards. However, disposal remains an unresolved issue in many countries, due in part to limited public acceptance, insufficient funding, and a lack of practical examples of strategies for determining suitable disposal options. As a result, disused sources are often stored indefinitely at the facilities where they were once used. In order to prevent disused sources from becoming orphan sources, each country must develop and implement a comprehensive waste management strategy that includes disposal of disused sources. The International Atomic Energy Agency (IAEA) fosters international cooperation between countries and encourages the development of a harmonized 'cradle to grave' approach to managing sources consistent with international legal instruments, IAEA safety standards, and international good practices. This 'cradle to grave' approach requires the development of a national policy and implementing strategy, an adequate legal and regulatory framework, and adequate resources and infrastructure that cover the entire life cycle, from production and use of radioactive sources to disposal. (authors)

  2. 7 CFR 1.413 - Submission of a sourcing area application.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... sourced is located. Where the sourcing area application will cover purchases from more than one agency, application is to be made to the agency from which the applicant expects to purchase the preponderance of its... agency concerned. The lead agency shall make the decision in consultation with, and upon co-signature of...

  3. Integrating data types to enhance shoreline change assessments

    NASA Astrophysics Data System (ADS)

    Long, J.; Henderson, R.; Plant, N. G.; Nelson, P. R.

    2016-12-01

    Shorelines represent the variable boundary between terrestrial and marine environments. Assessment of geographic and temporal variability in shoreline position, and of the related variability in shoreline change rates, is an important part of studies and applications related to impacts from sea-level rise and storms. The results from these assessments are used to quantify future ecosystem services and coastal resilience and guide selection of appropriate coastal restoration and protection designs. But existing assessments typically fail to incorporate all available shoreline observations because they are derived from multiple data types and have different or unknown biases and uncertainties. Shoreline-change research and assessments often focus on either the long-term trajectory using sparse data over multiple decades or shorter-term evolution using data collected more frequently but over a shorter period of time. The combination of data collected with significantly different temporal resolution is not often considered. Also, differences in the definition of the shoreline metric itself can occur, whether using a single or multiple data source(s), due to variation in the signal being detected in the data (e.g. instantaneous land/water interface, swash zone, wrack line, or topographic contours). Previous studies have not explored whether more robust shoreline change assessments are possible if all available data are utilized and all uncertainties are considered. In this study, we test the hypothesis that incorporating all available shoreline data will both improve historical assessments and enhance the predictive capability of shoreline-change forecasts. Using over 250 observations of shoreline position at Dauphin Island, Alabama over the last century, we compare shoreline-change rates derived from individual data sources (airborne lidar, satellite, aerial photographs) with an assessment using the combination of all available data. Biases or simple uncertainties in the shoreline metric from different data types and varying temporal/spatial resolution of the data are examined. As part of this test, we also demonstrate application of data assimilation techniques to predict shoreline position by accurately including the uncertainty in each type of data.
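
    One simple way to combine observations of different types with different uncertainties is an inverse-variance-weighted fit of the long-term change rate; the sketch below uses synthetic data (not the Dauphin Island record) and assumed per-data-type uncertainty values.

```python
import numpy as np

# Minimal sketch (synthetic data, not the Dauphin Island record): estimate a
# shoreline-change rate from observations of different types by weighted
# least squares, weighting each point by 1/sigma for its assumed data source
# uncertainty (older maps vs. modern lidar, here hypothetical values).

rng = np.random.default_rng(2)
years = np.sort(rng.uniform(1920, 2020, 100))
true_rate = -0.8                                    # metres per year (erosion)
sigma = np.where(years < 1990, 10.0, 1.0)           # assumed per-point uncertainty
position = 100 + true_rate * (years - 1920) + sigma * rng.standard_normal(100)

# Weighted linear fit: position = a + b * (years - 1920)
X = np.column_stack([np.ones_like(years), years - 1920])
w = 1.0 / sigma                                     # sqrt of inverse-variance weights
coef, *_ = np.linalg.lstsq(X * w[:, None], position * w, rcond=None)
print(f"weighted rate estimate: {coef[1]:.2f} m/yr (true {true_rate} m/yr)")
```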

  4. Combined optical gain and degradation measurements in DCM2 doped Tris-(8-hydroxyquinoline)aluminum thin-films

    NASA Astrophysics Data System (ADS)

    Čehovski, Marko; Döring, Sebastian; Rabe, Torsten; Caspary, Reinhard; Kowalsky, Wolfgang

    2016-04-01

    Organic laser sources offer the opportunity to integrate flexible and widely tunable lasers in polymer waveguide circuits, e.g. for Lab-on-Foil applications. It is therefore necessary to understand gain and degradation processes for long-term operation. In this paper we address the challenge of lifetime (degradation) measurements of photoluminescence (PL) and optical gain in thin-film lasers. The well-known guest-host system of the aluminum chelate Alq3 (Tris-(8-hydroxyquinoline)aluminum) as host material and the laser dye DCM2 (4-(Dicyanomethylene)-2-methyl-6-julolidyl-9-enyl-4H-pyran) as guest material is employed as the laser-active material. Sample layers were built up by co-evaporation in an ultrahigh vacuum (UHV) chamber. 200 nm thick films of Alq3:DCM2 with different doping concentrations were processed onto glass and thermally oxidized silicon substrates. The gain measurements were performed by the variable stripe length (VSL) method. This measurement technique allows the thin-film waveguide gain and loss to be determined. For the measurements the samples were excited with UV irradiation (λ = 355 nm) under nitrogen atmosphere by a passively Q-switched laser source. PL degradation measurements with regard to the optical gain were performed at laser threshold (approximately 3 μJ/cm²), five times above laser threshold and ten times above laser threshold. A t50 PL lifetime of > 10^7 pulses could be measured at a maximum excitation energy density of 32 μJ/cm². This allows a detailed analysis of the gain degradation mechanism and therefore of the stimulated emission cross section. Depending on the DCM2 doping concentration, the stimulated emission cross section was reduced by 35%. Nevertheless, the results emphasize the necessity of investigating degradation processes in organic laser sources for long-term applications.
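
    In the variable stripe length method, the net modal gain is typically obtained by fitting the edge-emitted intensity versus stripe length to I(l) = (A/g)(exp(gl) − 1); the sketch below fits synthetic data with an assumed gain value, not the measured Alq3:DCM2 results.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal sketch (synthetic data, not the measured Alq3:DCM2 results): in the
# variable stripe length (VSL) method the edge-emitted intensity grows with
# excitation stripe length l as I(l) = (A / g) * (exp(g * l) - 1), so the net
# modal gain g can be fitted from an intensity-versus-stripe-length curve.

def vsl(l, A, g):
    return (A / g) * (np.exp(g * l) - 1.0)

rng = np.random.default_rng(3)
length_cm = np.linspace(0.01, 0.15, 30)          # stripe lengths in cm
intensity = vsl(length_cm, 1.0, 25.0)            # assumed ~25 cm^-1 net gain
intensity *= 1 + 0.03 * rng.standard_normal(length_cm.size)   # measurement noise

popt, _ = curve_fit(vsl, length_cm, intensity, p0=(1.0, 10.0))
print(f"fitted net gain: {popt[1]:.1f} cm^-1")
```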

  5. Semi-implicit and fully implicit shock-capturing methods for hyperbolic conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Shinn, J. L.

    1986-01-01

    Some numerical aspects of finite-difference algorithms for nonlinear multidimensional hyperbolic conservation laws with stiff nonhomogeneous (source) terms are discussed. If the stiffness is entirely dominated by the source term, a semi-implicit shock-capturing method is proposed, provided that the Jacobian of the source terms possesses certain properties. The proposed semi-implicit method can be viewed as a variant of the Bussing and Murman point-implicit scheme with a more appropriate numerical dissipation for the computation of strong shock waves. However, if the stiffness is not solely dominated by the source terms, a fully implicit method would be a better choice. The situation is more complicated for problems in more than one dimension, and the presence of stiff source terms further complicates the solution procedures for alternating direction implicit (ADI) methods. Several alternatives are discussed. The primary motivation for constructing these schemes was to address thermally and chemically nonequilibrium flows in the hypersonic regime. Due to the unique structure of the eigenvalues and eigenvectors for fluid flows of this type, the computation can be simplified, thus providing a more efficient solution procedure than one might have anticipated.
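
    The point-implicit idea can be illustrated on a scalar model equation (a simplified sketch, not the full scheme of the paper): the source Jacobian is folded into the update so that the step remains stable even when the time step greatly exceeds the source time scale.

```python
# Scalar sketch (not the full scheme of the paper): integrate
#   du/dt = L(u) + S(u),  with a stiff relaxation source S(u) = (u_eq - u)/eps.
# Linearizing S about u^n gives the point-implicit (semi-implicit) update
#   (1 - dt * dS/du) * (u^{n+1} - u^n) = dt * (L(u^n) + S(u^n)),
# which remains stable even when dt >> eps.

eps, u_eq = 1.0e-4, 1.0            # relaxation time scale and equilibrium state

def L(u):                          # stand-in for the non-stiff flux terms
    return -0.5 * u

def S(u):                          # stiff relaxation source term
    return (u_eq - u) / eps

dSdu = -1.0 / eps                  # source Jacobian (constant for this S)

u, dt, nsteps = 0.0, 1.0e-2, 200   # dt is 100x larger than eps
for _ in range(nsteps):
    rhs = dt * (L(u) + S(u))
    u += rhs / (1.0 - dt * dSdu)   # point-implicit update
print(f"u after {nsteps * dt:.1f} time units: {u:.4f}")   # approaches ~1.0
```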

  6. Common Calibration Source for Monitoring Long-term Ozone Trends

    NASA Technical Reports Server (NTRS)

    Kowalewski, Matthew

    2004-01-01

    Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and the limited lifetimes of satellite monitoring instruments demand that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), the Ozone Monitoring Instrument (OMI), the Global Ozone Monitoring Experiment (GOME) (ESA), the Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.

  7. Watershed nitrogen and phosphorus balance: The upper Potomac River basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaworski, N.A.; Groffman, P.M.; Keller, A.A.

    1992-01-01

    Nitrogen and phosphorus mass balances were estimated for the portion of the Potomac River basin watershed located above Washington, D.C. The total nitrogen (N) balance included seven input source terms, six sinks, and one 'change-in-storage' term, but was simplified to five input terms and three output terms. The phosphorus (P) balance had four input and three output terms. The estimated balances are based on watershed data from seven information sources. Major sources of nitrogen are animal waste and atmospheric deposition. The major sources of phosphorus are animal waste and fertilizer. The major sink for nitrogen is combined denitrification, volatilization, and change-in-storage. The major sink for phosphorus is change-in-storage. River exports of N and P were 17% and 8%, respectively, of the total N and P inputs. Over 60% of the N and P were volatilized or stored. The major input and output terms on the budget are estimated from direct measurements, but the change-in-storage term is calculated by difference. The factors regulating retention and storage processes are discussed and research needs are identified.
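
    The closure-by-difference step can be illustrated with a toy budget; the numbers below are hypothetical and are not the Potomac estimates.

```python
# Minimal sketch with hypothetical numbers (not the Potomac estimates):
# in a watershed nutrient budget the change-in-storage (plus denitrification
# and volatilization) term is typically closed by difference between the
# measured inputs and the measured outputs.

inputs_kt_per_yr = {            # hypothetical nitrogen inputs, kt N / yr
    "atmospheric deposition": 30.0,
    "animal waste": 45.0,
    "fertilizer": 25.0,
    "point sources": 8.0,
    "biological fixation": 12.0,
}
outputs_kt_per_yr = {           # hypothetical measured outputs, kt N / yr
    "river export": 20.0,
    "crop/forest harvest": 35.0,
}

total_in = sum(inputs_kt_per_yr.values())
total_out = sum(outputs_kt_per_yr.values())
residual = total_in - total_out     # denitrification + volatilization + storage

print(f"inputs {total_in:.0f}, outputs {total_out:.0f}, "
      f"residual closed by difference {residual:.0f} kt N/yr "
      f"({100 * residual / total_in:.0f}% of inputs)")
```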

  8. Micropollutants in urban watersheds : substance flow analysis as management tool

    NASA Astrophysics Data System (ADS)

    Rossi, L.; Copin, P. J.; Barry, A. D.; Bader, H.-P.; Scheidegger, R.; Chèvre, N.

    2009-04-01

    Micropollutants released by cities into water are of increasing concern as they are suspected of inducing long-term effects on both aquatic organisms and humans (e.g., hormonally active substances). Substances found in the urban water cycle have different sources in the urban area and different fates in this cycle. For example, pollutants emitted by traffic, such as copper or PAHs, reach surface water during rain events, often without any treatment. Pharmaceuticals resulting from human medical treatments reach surface water mainly through wastewater treatment plants, where they are only partly treated and eliminated. Another source of contamination by these compounds in urban areas is combined sewer overflows (CSOs). Once in the receiving waters (lakes, rivers, groundwater), these substances may re-enter the cycle through drinking water. It is therefore crucial to study the behaviour of micropollutants in the urban water cycle and to develop flexible tools for urban water management. Substance flow analysis (SFA) has recently been proposed as an instrument for water pollution management in urban water systems. This kind of analysis is an extension of material flow analysis (MFA), originally developed in the economic sector and later adapted to regional investigations. In this study, we propose to test the application of SFA for a large number of classes of micropollutants to evaluate its use for urban water management. We chose the city of Lausanne as a case study since the receiving water of this city (Lake Geneva) is an important source of drinking water for the surrounding population. Moreover, detailed system knowledge and extensive data were available, both on the sewer system and on water quality. We focus our study on one heavy metal (copper) and four pharmaceuticals (diclofenac, ibuprofen, carbamazepine and naproxen). Results for copper reveal that around 1500 kg of copper enter the aquatic compartment yearly. This amount contributes to sediment enrichment, which may pose a long-term risk for benthic organisms. The major sources (73% in total) of copper in receiving surface water are roofs and the contact lines of trolleybuses. Thus, technical solutions have to be found to manage this specific source of contamination. Application of the SFA approach to the four pharmaceuticals reveals that CSOs represent an important source of contamination: between 14% (carbamazepine) and 61% (ibuprofen) of the total annual loads from the city of Lausanne to the lake are due to CSOs. These results will help in defining the best management strategy to limit Lake Geneva contamination. SFA is thus a promising tool for integrated urban water management.

  9. Preamplifiers for non-contact capacitive biopotential measurements.

    PubMed

    Peng, GuoChen; Ignjatovic, Zeljko; Bocko, Mark F

    2013-01-01

    Non-contact biopotential sensing is an attractive measurement strategy for a number of health monitoring applications, primarily the ECG and the EEG. In all such applications a key technical challenge is the design of a low-noise trans-impedance preamplifier for the typically low-capacitance, high source impedance sensing electrodes. In this paper, we compare voltage and charge amplifier designs in terms of their common mode rejection ratio, noise performance, and frequency response. Both amplifier types employ the same operational-transconductance amplifier (OTA), which was fabricated in a 0.35 um CMOS process. The results show that a charge amplifier configuration has advantages for small electrode-to-subject coupling capacitance values (less than 10 pF--typical of noncontact electrodes) and that the voltage amplifier configuration has advantages for electrode capacitances above 10 pF.

  10. New applications and developments in the neutron shielding

    NASA Astrophysics Data System (ADS)

    Uğur, Fatma Aysun

    2017-09-01

    Shielding neutrons involves three steps: slowing the neutrons, absorbing the neutrons, and attenuating the gamma rays that are produced. Neutrons are slowed to thermal energies by hydrogen-containing materials such as water, paraffin and plastic. Hydrogenated materials are also very effective for the absorption of neutrons. Gamma rays are produced by neutron capture in the shield, by inelastic scattering, and by the decay of activation products. If a source emits gamma rays at various energies, the high-energy gamma rays sometimes determine the shielding requirements. Among multipurpose materials for neutron shields, concrete, especially with barium mixed in, can slow and absorb the neutrons and shield against the gamma rays. Plastic with boron is also a good multipurpose shielding material. In this study, new applications and developments in the area of neutron shielding are discussed in terms of different materials.

  11. Beamed Energy Propulsion: Research Status And Needs--Part 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birkan, Mitat

    One promising solution for operationally responsive space is the application of remote electromagnetic energy to propel a launch vehicle into orbit. With beamed energy propulsion, the power source remains stationary on the ground or in space, and a beam from a fixed station heats the propellant on the spacecraft. This permits the spacecraft to leave its power source at home, saving significant amounts of mass and greatly improving performance. This concept, which removes the mass penalty of carrying the propulsion energy source on board the vehicle, was first proposed by Arthur Kantrowitz in 1972; he invoked an extremely powerful ground-based laser. The same year, Michael Minovich suggested a conceptually similar 'in-space' laser rocket system utilizing a remote laser power station. In the late 1980s, the Air Force Office of Scientific Research (AFOSR) funded continuous and double-pulse laser and microwave propulsion, while the Strategic Defense Initiative Office (SDIO) funded ablative laser rocket propulsion. Currently, AFOSR has been funding the concept initiated by Leik Myrabo, repetitively pulsed laser propulsion, which is widely, if arguably, perceived to be the closest to mid-term application. This two-part paper examines investment strategies in beamed energy propulsion and the technical challenges to be overcome. Part 1 presents a worldwide review of beamed energy propulsion research, covering both the laser and microwave arenas.

  12. ADDRESSING EMERGING ISSUES IN WATER QUALITY ...

    EPA Pesticide Factsheets

    Public concern over the cleanliness and safety of source and recreational waters has prompted researchers to look for indicators of water quality. Giving public water authorities multiple tools to measure and monitor levels of chemical contaminants, as well as chemical markers of contamination, simply and rapidly would enhance public protection. The goals of water quality are outlined in the Water Quality Multi-year Plan [http://intranet.epa.gov/ospintra/Planning/wq.pdf] and the research in this task falls under GPRA Goal 2, 2.3.2, Long Term Goals 1, 2, and 4. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. The subtasks describe the various research projects being performed in support of this task, with more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at concentrations less than parts per billion, ppb). IAG

  13. Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations

    NASA Astrophysics Data System (ADS)

    Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang

    2017-09-01

    Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI) conducted by the International Atomic Energy Agency (IAEA) on 18-20 November, 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from this workshop as well as several new contributions. A total of 17 papers have been selected on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's functions approaches) to the engineering application of simulated ground motion for the analysis of the seismic response of structures. These contributions include applications to real earthquakes and descriptions of current practice for assessing seismic hazard in terms of nuclear safety in low-seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers of this volume highlight the usefulness of physics-based models to evaluate and understand the physical causes of observed and empirical data, as well as to predict ground motion beyond the range of recorded data. Particular importance is given to the validation and verification of the models by comparing synthetic results with observed data and empirical models.

  14. Design of a plastic minicolpostat applicator with shields.

    PubMed

    Weeks, K J; Montana, G S; Bentel, G C

    1991-09-01

    A plastic intracavitary applicator system for the treatment of cancer of the uterine cervix is described. This applicator has a minicolpostat and a mechanism for affixing the tandem to the colpostats. Traditional afterloading refers only to the radioactive source. Both the source and the ovoid shield are afterloaded together in this applicator in contrast to traditional afterloading systems which afterload the source alone. A potential advantage of our applicator system is that it allows high quality CT localization because the sources and shields can be removed and the applicator is made of plastic. The advantages and disadvantages of this variation to the Fletcher system as well as other aspects of applicator design are discussed. An experimentally verified dose calculation method for shielded sources is applied to the design problems associated with this applicator. The dose distribution calculated for a source-shield configuration of the plastic applicator is compared to that obtained with a commercial Fletcher-Suit-Delclos (FSD) applicator. Significant shielding improvements can be achieved for the smallest diameter ovoid, that is, in the minicolpostat. The plastic minicolpostat dose distributions are similar to those produced by the conventional larger diameter colpostats. In particular, the colpostat shielding for rectum and bladder, which is reduced in the metal applicator's minicolpostat configuration, is maintained for the plastic minicolpostat. Further, it is shown that, if desired, relative to the FSD minicolpostat, the mucosa dose can be reduced by a suitable change of the minicolpostat source position.

  15. Material from the Internal Surface of Squid Axon Exhibits Excess Noise

    PubMed Central

    Fishman, Harvey M.

    1981-01-01

    A fluid material from a squid (Loligo pealei) axon was isolated by mechanical application of two types of microcapillary (1-3 μm diameter) to the internal surface of intact and cut-axon preparations. Current noise in the isolated material exceeded thermal levels, and power spectra were 1/f in form in the frequency range 1.25-500 Hz with voltage-dependent intensities that were unrelated to specific ion channels. Whether conduction in this material is a significant source of excess noise during axon conduction remains to be determined. Nevertheless, a source of excess noise external to or within an ion channel may not be properly represented solely as an additive term to the spectrum of ion channel noise; a deconvolution of these spectral components may be required for modeling purposes. PMID:6266542
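
    The "1/f in form" characterization corresponds to fitting a power law S(f) = A/f^α to the noise power spectrum; the sketch below performs such a fit on a synthetic spectrum, not the squid-axon data.

```python
import numpy as np

# Minimal sketch (synthetic spectrum, not the squid-axon data): fit a power
# law S(f) = A / f**alpha to a noise power spectrum by linear regression in
# log-log space; alpha close to 1 corresponds to "1/f" (flicker) noise.

rng = np.random.default_rng(4)
f = np.linspace(1.25, 500, 400)                              # Hz, reported range
S = 1e-18 / f * np.exp(0.1 * rng.standard_normal(f.size))    # noisy 1/f spectrum

slope, intercept = np.polyfit(np.log(f), np.log(S), 1)
print(f"fitted exponent alpha = {-slope:.2f}  (1.0 means pure 1/f)")
```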

  16. Mushrooms: A Potential Natural Source of Anti-Inflammatory Compounds for Medical Applications

    PubMed Central

    Elsayed, Elsayed A.; El Enshasy, Hesham; Wadaan, Mohammad A. M.; Aziz, Ramlan

    2014-01-01

    For centuries, macrofungi have been used as food and medicine in different parts of the world. This is mainly attributed to their nutritional value as a potential source of carbohydrates, proteins, amino acids, and minerals. In addition, they also include many bioactive metabolites which make mushrooms and truffles common components in folk medicine, especially in Africa, the Middle East, China, and Japan. The reported medicinal effects of mushrooms include anti-inflammatory effects, with anti-inflammatory compounds of mushrooms comprising a highly diversified group in terms of their chemical structure. They include polysaccharides, terpenoids, phenolic compounds, and many other low molecular weight molecules. The aims of this review are to report the different types of bioactive metabolites and their relevant producers, as well as the different mechanisms of action of mushroom compounds as potent anti-inflammatory agents. PMID:25505823

  17. Solar radiation data sources, applications, and network design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    A prerequisite to considering solar energy projects is to determine the requirements for information about solar radiation to apply to possible projects. This report offers techniques to help the reader specify requirements in terms of the solar radiation data and information currently available, describes past and present programs to record and present information to be used for most requirements, presents courses of action to help the user meet his needs for information, lists sources of solar radiation data, and presents the problems, costs, benefits and responsibilities of programs to acquire additional solar radiation data. Extensive background information is provided about solar radiation data and its use. Specialized information about recording, collecting, processing, storing and disseminating solar radiation data is given. Several appendices are included which provide reference material for special situations.

  18. OntoBrowser: a collaborative tool for curation of ontologies by subject matter experts.

    PubMed

    Ravagli, Carlo; Pognan, Francois; Marc, Philippe

    2017-01-01

    The lack of controlled terminology and ontology usage leads to incomplete search results and poor interoperability between databases. One of the major underlying challenges of data integration is curating data to adhere to controlled terminologies and/or ontologies. Finding subject matter experts with the time and skills required to perform data curation is often problematic. In addition, existing tools are not designed for continuous data integration and collaborative curation. This results in time-consuming curation workflows that often become unsustainable. The primary objective of OntoBrowser is to provide an easy-to-use online collaborative solution for subject matter experts to map reported terms to preferred ontology (or code list) terms and facilitate ontology evolution. Additional features include web service access to data, visualization of ontologies in hierarchical/graph format and a peer review/approval workflow with alerting. The source code is freely available under the Apache v2.0 license. Source code and installation instructions are available at http://opensource.nibr.com. This software is designed to run on a Java EE application server and store data in a relational database. philippe.marc@novartis.com. © The Author 2016. Published by Oxford University Press.

  19. OntoBrowser: a collaborative tool for curation of ontologies by subject matter experts

    PubMed Central

    Ravagli, Carlo; Pognan, Francois

    2017-01-01

    Summary: The lack of controlled terminology and ontology usage leads to incomplete search results and poor interoperability between databases. One of the major underlying challenges of data integration is curating data to adhere to controlled terminologies and/or ontologies. Finding subject matter experts with the time and skills required to perform data curation is often problematic. In addition, existing tools are not designed for continuous data integration and collaborative curation. This results in time-consuming curation workflows that often become unsustainable. The primary objective of OntoBrowser is to provide an easy-to-use online collaborative solution for subject matter experts to map reported terms to preferred ontology (or code list) terms and facilitate ontology evolution. Additional features include web service access to data, visualization of ontologies in hierarchical/graph format and a peer review/approval workflow with alerting. Availability and implementation: The source code is freely available under the Apache v2.0 license. Source code and installation instructions are available at http://opensource.nibr.com. This software is designed to run on a Java EE application server and store data in a relational database. Contact: philippe.marc@novartis.com PMID:27605099

  20. Surveillance system for air pollutants by combination of the decision support system COMPAS and optical remote sensing systems

    NASA Astrophysics Data System (ADS)

    Flassak, Thomas; de Witt, Helmut; Hahnfeld, Peter; Knaup, Andreas; Kramer, Lothar

    1995-09-01

    COMPAS is a decision support system designed to assist in the assessment of the consequences of accidental releases of toxic and flammable substances. One of the key elements of COMPAS is a feedback algorithm which allows the source term to be calculated with the aid of concentration measurements. Up to now, the feedback technique has been applied to concentration measurements made with test tubes or conventional point sensors. In this paper, an extension of the current method is presented: the combination of COMPAS with an optical remote sensing system such as the KAYSER-THREDE K300 FTIR system. Active remote sensing methods based on FTIR are, among other applications, ideal for so-called fence-line monitoring of diffuse emissions and accidental releases from industrial facilities, since averaged concentration levels along the measurement path can be obtained from the FTIR spectra. These line-averaged concentrations are ideally suited as on-line input for COMPAS' feedback technique. Uncertainties in the assessment of the source term, related both to shortcomings of the dispersion model itself and to problems of a feedback strategy based on point measurements, are thereby reduced.
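
    The feedback idea with line-averaged measurements can be sketched as follows (this is not the COMPAS algorithm): each open-path measurement is modelled as the path average of a per-unit-emission dispersion prediction, so the source strength follows from a least-squares fit; the dispersion kernel and path geometry below are entirely hypothetical.

```python
import numpy as np

# Minimal sketch (not the COMPAS feedback algorithm): estimate a single source
# strength Q from line-averaged concentration measurements. Each open-path
# measurement is modelled as the mean of the per-unit-emission dispersion
# prediction along the path, so c_i = a_i * Q and Q follows by least squares.

def unit_concentration(x, y):
    """Toy dispersion kernel: concentration at (x, y) for a unit emission rate."""
    r2 = x**2 + y**2 + 1.0
    return np.exp(-y**2 / 50.0) / r2

paths = [  # hypothetical open-path lines, each sampled at discrete points
    (np.linspace(50, 150, 50), np.full(50, 10.0)),
    (np.linspace(50, 150, 50), np.full(50, -20.0)),
    (np.full(50, 120.0), np.linspace(-30, 30, 50)),
]

Q_true = 4.2                                           # "unknown" release rate
a = np.array([unit_concentration(x, y).mean() for x, y in paths])
c_measured = Q_true * a * (1 + 0.05 * np.random.default_rng(5).standard_normal(3))

Q_hat = float(a @ c_measured / (a @ a))                # least-squares estimate
print(f"estimated source strength: {Q_hat:.2f} (true {Q_true})")
```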
