Sample records for source term based

  1. Accuracy-preserving source term quadrature for third-order edge-based discretization

    NASA Astrophysics Data System (ADS)

    Nishikawa, Hiroaki; Liu, Yi

    2017-09-01

    In this paper, we derive a family of source term quadrature formulas for preserving third-order accuracy of the node-centered edge-based discretization for conservation laws with source terms on arbitrary simplex grids. A three-parameter family of source term quadrature formulas is derived, and as a subset, a one-parameter family of economical formulas is identified that does not require second derivatives of the source term. Among the economical formulas, a unique formula is then derived that does not require gradients of the source term at neighbor nodes, thus leading to a significantly smaller discretization stencil for source terms. All the formulas derived in this paper do not require a boundary closure, and therefore can be directly applied at boundary nodes. Numerical results are presented to demonstrate third-order accuracy at interior and boundary nodes for one-dimensional grids and linear triangular/tetrahedral grids over straight and curved geometries.
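
    The paper's third-order claims are demonstrated by grid-refinement studies. As a hedged illustration of that verification pattern (not the authors' edge-based scheme or their quadrature formulas), the sketch below measures the observed order of accuracy of a stand-in 1D discretization with a pointwise source quadrature; substituting a higher-order source quadrature is exactly the kind of change that would raise the observed order.

    ```python
    # Generic order-of-accuracy test harness (hypothetical stand-in scheme,
    # not the paper's edge-based discretization or its quadrature formulas).
    import numpy as np

    def source_quadrature_error(n):
        """L2 residual of du/dx = s(x) for u = sin(x), s = cos(x) on [0, 2*pi]."""
        x = np.linspace(0.0, 2.0 * np.pi, n + 1)
        h = x[1] - x[0]
        u = np.sin(x)
        dudx = (u[2:] - u[:-2]) / (2.0 * h)   # 2nd-order central difference
        residual = dudx - np.cos(x[1:-1])     # pointwise source quadrature
        return np.sqrt(h * np.sum(residual ** 2))

    errors = [(n, source_quadrature_error(n)) for n in (32, 64, 128, 256)]
    for (_, e1), (n2, e2) in zip(errors, errors[1:]):
        order = np.log(e1 / e2) / np.log(2.0)
        print(f"n={n2:4d}  error={e2:.3e}  observed order={order:.2f}")
    ```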

  2. Fission Product Appearance Rate Coefficients in Design Basis Source Term Determinations - Past and Present

    NASA Astrophysics Data System (ADS)

    Perez, Pedro B.; Hamawi, John N.

    2017-09-01

    Nuclear power plant radiation protection design features are based on radionuclide source terms derived from conservative assumptions that envelope expected operating experience. Two parameters that significantly affect the radionuclide concentrations in the source term are the failed fuel fraction and the effective fission product appearance rate coefficients. The failed fuel fraction may be a regulatory assumption, as in the U.S. Appearance rate coefficients are not specified in regulatory requirements, but have been referenced to experimental data that is over 50 years old. No doubt the source terms are conservative, as demonstrated by operating experience that has included failed fuel, but they may be too conservative, leading, for example, to over-designed shielding for normal operations. Design basis source term methodologies for normal operations had not advanced until EPRI published an updated ANSI/ANS 18.1 source term basis document in 2015. Our paper revisits the fission product appearance rate coefficients as applied in the derivation of source terms following the original U.S. NRC NUREG-0017 methodology. New coefficients have been calculated based on recent EPRI results, which demonstrate the conservatism in nuclear power plant shielding design.

  3. Piecewise synonyms for enhanced UMLS source terminology integration.

    PubMed

    Huang, Kuo-Chuan; Geller, James; Halper, Michael; Cimino, James J

    2007-10-11

    The UMLS contains more than 100 source vocabularies and is growing via the integration of others. When integrating a new source, the source terms already in the UMLS must first be found. The easiest approach to this is simple string matching. However, string matching usually does not find all concepts that should be found. A new methodology, based on the notion of piecewise synonyms, for enhancing the process of concept discovery in the UMLS is presented. This methodology is supported by first creating a general synonym dictionary based on the UMLS. Each multi-word source term is decomposed into its component words, allowing for the generation of separate synonyms for each word from the general synonym dictionary. The recombination of these synonyms into new terms creates an expanded pool of matching candidates for terms from the source. The methodology is demonstrated with respect to an existing UMLS source. It shows a 34% improvement over simple string matching.
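
    A minimal sketch of the piecewise-synonym expansion described above, with a toy synonym dictionary standing in for one derived from the UMLS; the decompose, substitute, and recombine steps are the ones the abstract names.

    ```python
    # Piecewise-synonym candidate generation (toy dictionary, illustrative only).
    from itertools import product

    synonyms = {  # hypothetical general synonym dictionary
        "kidney": ["kidney", "renal"],
        "stones": ["stones", "calculi"],
        "acute":  ["acute"],
    }

    def candidate_terms(term):
        """Decompose a multi-word term, substitute per-word synonyms, recombine."""
        words = term.lower().split()
        choices = [synonyms.get(w, [w]) for w in words]
        return {" ".join(combo) for combo in product(*choices)}

    print(candidate_terms("acute kidney stones"))
    # -> {'acute kidney stones', 'acute kidney calculi',
    #     'acute renal stones', 'acute renal calculi'}   (order may vary)
    ```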

  4. Evaluation of Long-term Performance of Enhanced Anaerobic Source Zone Bioremediation using mass flux

    NASA Astrophysics Data System (ADS)

    Haluska, A.; Cho, J.; Hatzinger, P.; Annable, M. D.

    2017-12-01

    Chlorinated ethene DNAPL source zones in groundwater act as potential long-term sources of contamination: as they dissolve they yield concentrations well above MCLs, posing an on-going public health risk. Enhanced bioremediation has been applied to treat many source zones with significant promise, but the long-term sustainability of this technology has not been thoroughly assessed. This study evaluated the long-term effectiveness of enhanced anaerobic source zone bioremediation at chloroethene-contaminated sites to determine whether the treatment prevented contaminant rebound and removed NAPL from the source zone. Long-term performance was evaluated based on achieving MCL-based contaminant mass fluxes in parent compound concentrations during different monitoring periods. Groundwater concentration versus time data were compiled for six sites, and post-remedial contaminant mass flux data were then measured using passive flux meters at wells both within and down-gradient of the source zone. Post-remedial mass flux data were then combined with pre-remedial water quality data to estimate pre-remedial mass flux. This information was used to characterize a DNAPL dissolution source strength function, such as the Power Law Model and the Equilibrium Streamtube Model. The six sites characterized for this study were (1) Former Charleston Air Force Base, Charleston, SC; (2) Dover Air Force Base, Dover, DE; (3) Treasure Island Naval Station, San Francisco, CA; (4) Former Raritan Arsenal, Edison, NJ; (5) Naval Air Station, Jacksonville, FL; and (6) Former Naval Air Station, Alameda, CA. Contaminant mass fluxes decreased for all the sites by the end of the post-treatment monitoring period, and rebound was limited within the source zone. Post-remedial source strength function estimates suggest that decreases in contaminant mass flux will continue to occur at these sites, but a mass flux consistent with MCLs may never be achieved. Thus, site clean-up goals should be evaluated as order-of-magnitude reductions. Additionally, sites may require monitoring for a minimum of five years in order to sufficiently evaluate remedial performance. The study shows that enhanced anaerobic source zone bioremediation contributed to a modest reduction of source zone contaminant mass discharge and appears to have mitigated rebound of chlorinated ethenes.

  5. Source term model evaluations for the low-level waste facility performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yim, M.S.; Su, S.I.

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  6. Prioritized packet video transmission over time-varying wireless channel using proactive FEC

    NASA Astrophysics Data System (ADS)

    Kumwilaisak, Wuttipong; Kim, JongWon; Kuo, C.-C. Jay

    2000-12-01

    Quality of video transmitted over time-varying wireless channels relies heavily on the coordinated effort to cope with both channel and source variations dynamically. Given the priority of each source packet and the estimated channel condition, an adaptive protection scheme based on joint source-channel criteria is investigated via proactive forward error correction (FEC). With proactive FEC in Reed Solomon (RS)/Rate-compatible punctured convolutional (RCPC) codes, we study a practical algorithm to match the relative priority of source packets and instantaneous channel conditions. The channel condition is estimated to capture the long-term fading effect in terms of the averaged SNR over a preset window. Proactive protection is performed for each packet based on the joint source-channel criteria with special attention to the accuracy, time-scale match, and feedback delay of channel status estimation. The overall gain of the proposed protection mechanism is demonstrated in terms of the end-to-end wireless video performance.

  7. Sample Based Unit Liter Dose Estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JENSEN, L.

    The Tank Waste Characterization Program has taken many core samples, grab samples, and auger samples from the single-shell and double-shell tanks during the past 10 years. Consequently, the amount of sample data available has increased, both in terms of the quantity of sample results and the number of tanks characterized. More and better data are available than when the current radiological and toxicological source terms used in the Basis for Interim Operation (BIO) (FDH 1999a) and the Final Safety Analysis Report (FSAR) (FDH 1999b) were developed. The Nuclear Safety and Licensing (NS and L) organization wants to use the new data to upgrade the radiological and toxicological source terms used in the BIO and FSAR. The NS and L organization requested assistance in producing a statistically based process for developing the source terms. This report describes the statistical techniques used and the assumptions made to support the development of a new radiological source term for liquid and solid wastes stored in single-shell and double-shell tanks. The results given in this report are a revision to similar results given in an earlier version of the document (Jensen and Wilmarth 1999). The main difference between the results in this document and the earlier version is that the dose conversion factors (DCF) for converting μCi/g or μCi/L to Sv/L (sieverts per liter) have changed. There are now two DCFs, one based on ICRP-68 and one based on ICRP-71 (Brevick 2000).

  8. NuSTAR view of the central region of M31

    NASA Astrophysics Data System (ADS)

    Stiele, H.; Kong, A. K. H.

    2018-04-01

    Our neighbouring large spiral galaxy, the Andromeda galaxy (M31 or NGC 224), is an ideal target for studying the X-ray source population of a nearby galaxy. NuSTAR observed the central region of M31 in 2015, allowing the population of X-ray point sources to be studied at energies above 10 keV. Based on the source catalogue of the large XMM-Newton survey of M31, we identified counterparts to the XMM-Newton sources in the NuSTAR data. The NuSTAR data only contain sources with a brightness comparable to (or even greater than) that of the selected sources detected in the XMM-Newton data. We investigate hardness ratios, spectra, and long-term light curves of individual sources obtained from the NuSTAR data. Based on our spectral studies, we suggest four sources as possible X-ray binary candidates. The long-term light curves of the seven sources that have been observed more than once show low (but significant) variability.

  9. Observation-based source terms in the third-generation wave model WAVEWATCH

    NASA Astrophysics Data System (ADS)

    Zieger, Stefan; Babanin, Alexander V.; Erick Rogers, W.; Young, Ian R.

    2015-12-01

    Measurements collected during the AUSWEX field campaign at Lake George (Australia) resulted in new insights into the processes of wind-wave interaction and whitecapping dissipation, and consequently in new parameterizations of the input and dissipation source terms. The new nonlinear wind input term accounts for the dependence of the growth on wave steepness, airflow separation, and negative growth rate under adverse winds. The new dissipation terms feature an inherent breaking term, a cumulative dissipation term, and a term due to production of turbulence by waves, which is particularly relevant for decaying seas and for swell. The latter is consistent with the observed decay rate of ocean swell. This paper describes these source terms as implemented in WAVEWATCH III® and evaluates their performance against existing source terms in academic duration-limited tests, against buoy measurements for windsea-dominated conditions, under conditions of extreme wind forcing (Hurricane Katrina), and against altimeter data in global hindcasts. Results show agreement in growth curves as well as in integral and spectral parameters across the simulations and hindcasts.

  10. Time-frequency approach to underdetermined blind source separation.

    PubMed

    Xie, Shengli; Yang, Liu; Yang, Jun-Mei; Zhou, Guoxu; Xiang, Yong

    2012-02-01

    This paper presents a new time-frequency (TF) underdetermined blind source separation approach based on the Wigner-Ville distribution (WVD) and the Khatri-Rao product to separate N non-stationary sources from M (M < N) mixtures. First, an improved method is proposed for estimating the mixing matrix, in which the possibly negative values of the auto WVDs of the sources are fully considered. Then, after extracting all the auto-term TF points, the auto WVD values of the sources at every auto-term TF point can be determined exactly with the proposed approach, no matter how many active sources there are, as long as N ≤ 2M-1. The extraction of auto-term TF points is discussed further, and numerical simulation results are presented that show the superiority of the proposed algorithm over existing ones.
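
    The key algebraic step can be illustrated in a few lines. At an auto-term TF point the spatial WVD matrix is D_xx = A diag(d) A^H, so vectorizing turns the unknown auto-WVD values d into the solution of a linear system whose matrix is the Khatri-Rao product of A with itself. The sketch below is a hedged toy with real numbers and an assumed-known mixing matrix, not the paper's full algorithm.

    ```python
    # Khatri-Rao step at one auto-term TF point: vec(D_xx) = (conj(A) kr A) d.
    # Real-valued toy; a complex mixing matrix would use conj() explicitly.
    import numpy as np

    M, N = 2, 3                                    # more sources than mixtures
    rng = np.random.default_rng(0)
    A = rng.standard_normal((M, N))                # mixing matrix (assumed known)
    d_true = np.array([1.5, 0.0, -0.4])            # auto-WVD values (can be negative)

    D_xx = A @ np.diag(d_true) @ A.T               # observed mixture WVD matrix
    KR = np.vstack([np.kron(A[:, j], A[:, j]) for j in range(N)]).T  # M^2 x N
    d_est, *_ = np.linalg.lstsq(KR, D_xx.reshape(-1), rcond=None)
    print(d_est)                                   # recovers d_true for N <= 2M-1
    ```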

  11. Flowsheets and source terms for radioactive waste projections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forsberg, C.W.

    1985-03-01

    Flowsheets and source terms used to generate radioactive waste projections in the Integrated Data Base (IDB) Program are given. Volumes of each waste type generated per unit product throughput have been determined for the following facilities: uranium mining, UF/sub 6/ conversion, uranium enrichment, fuel fabrication, boiling-water reactors (BWRs), pressurized-water reactors (PWRs), and fuel reprocessing. Source terms for DOE/defense wastes have been developed. Expected wastes from typical decommissioning operations for each facility type have been determined. All wastes are also characterized by isotopic composition at time of generation and by general chemical composition. 70 references, 21 figures, 53 tables.

  12. Source inventory for Department of Energy solid low-level radioactive waste disposal facilities: What it means and how to get one of your own

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, M.A.

    1991-12-31

    In conducting a performance assessment for a low-level waste (LLW) disposal facility, one of the important considerations for determining the source term, which is defined as the amount of radioactivity being released from the facility, is the quantity of radioactive material present. This quantity, which will be referred to as the source inventory, is generally estimated through a review of historical records and waste tracking systems at the LLW facility. In theory, estimating the total source inventory for Department of Energy (DOE) LLW disposal facilities should be possible by reviewing the national data base maintained for LLW operations, the Solid Waste Information Management System (SWIMS), or through the annual report that summarizes the SWIMS data, the Integrated Data Base (IDB) report. However, in practice, there are some difficulties in making this estimate. This is not unexpected, since the SWIMS and the IDB were not developed with the goal of developing a performance assessment source term in mind. The practical shortcomings of using the existing data to develop a source term for DOE facilities will be discussed in this paper.

  13. An extension of the Lighthill theory of jet noise to encompass refraction and shielding

    NASA Technical Reports Server (NTRS)

    Ribner, Herbert S.

    1995-01-01

    A formalism for jet noise prediction is derived that includes the refractive 'cone of silence' and other effects; outside the cone it approximates the simple Lighthill format. A key step is deferral of the simplifying assumption of uniform density in the dominant 'source' term. The result is conversion to a convected wave equation retaining the basic Lighthill source term. The main effect is to amend the Lighthill solution to allow for refraction by mean flow gradients, achieved via a frequency-dependent directional factor. A general formula for the power spectral density emitted from unit volume is developed as the Lighthill-based value multiplied by a squared 'normalized' Green's function (the directional factor), referred to a stationary point source. The convective motion of the sources, with its powerful amplifying effect, also directional, is already accounted for in the Lighthill format: wave convection and source convection are decoupled. The normalized Green's function appears to be near unity outside the refraction-dominated 'cone of silence'; this validates our long-term practice of using Lighthill-based approaches outside the cone, with extension inside via the Green's function. The function is obtained either experimentally (injected 'point' source) or numerically (computational aeroacoustics). Approximation by unity seems adequate except near the cone and except when there are shrouding jets: in that case the difference from unity quantifies the shielding effect. Further extension yields dipole and monopole source terms (cf. Morfey, Mani, and others) when the mean flow possesses density gradients (e.g., hot jets).

  14. Watershed nitrogen and phosphorus balance: The upper Potomac River basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaworski, N.A.; Groffman, P.M.; Keller, A.A.

    1992-01-01

    Nitrogen and phosphorus mass balances were estimated for the portion of the Potomac River basin watershed located above Washington, D.C. The total nitrogen (N) balance included seven input source terms, six sinks, and one 'change-in-storage' term, but was simplified to five input terms and three output terms. The phosphorus (P) balance had four input and three output terms. The estimated balances are based on watershed data from seven information sources. Major sources of nitrogen are animal waste and atmospheric deposition. The major sources of phosphorus are animal waste and fertilizer. The major sink for nitrogen is combined denitrification, volatilization, and change-in-storage. The major sink for phosphorus is change-in-storage. River exports of N and P were 17% and 8%, respectively, of the total N and P inputs. Over 60% of the N and P were volatilized or stored. The major input and output terms in the budget are estimated from direct measurements, but the change-in-storage term is calculated by difference. The factors regulating retention and storage processes are discussed and research needs are identified.

  15. Unprecedented long-term frequency stability with a microwave resonator oscillator.

    PubMed

    Grop, Serge; Schafer, Wolfgang; Bourgeois, Pierre-Yves; Kersale, Yann; Oxborrow, Mark; Rubiola, Enrico; Giordano, Vincent

    2011-08-01

    This article reports on the long-term frequency stability characterization of a new type of cryogenic sapphire oscillator using an autonomous pulse-tube cryocooler as its cold source. This new design enables a relative frequency stability of better than 4.5 × 10⁻¹⁵ over one day of integration. To the best of our knowledge, this represents the best long-term frequency stability ever obtained with a signal source based on a macroscopic resonator.
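
    For context, long-term frequency stability figures such as the 4.5 × 10⁻¹⁵ quoted above are conventionally reported as Allan deviations. A minimal sketch of the (non-overlapping) Allan deviation of fractional-frequency data, run here on synthetic white noise:

    ```python
    # Non-overlapping Allan deviation: sigma_y(tau)^2 = 0.5 <(ybar_{k+1} - ybar_k)^2>.
    import numpy as np

    def allan_deviation(y, m):
        """Allan deviation of fractional-frequency samples y at averaging factor m."""
        n = len(y) // m
        yb = y[: n * m].reshape(n, m).mean(axis=1)   # tau-averaged frequencies
        return np.sqrt(0.5 * np.mean(np.diff(yb) ** 2))

    rng = np.random.default_rng(1)
    y = 1e-15 * rng.standard_normal(100_000)          # toy white-frequency noise
    for m in (1, 10, 100, 1000):
        print(m, allan_deviation(y, m))               # ~ tau^(-1/2) slope for white FM
    ```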

  16. Localization of sound sources in a room with one microphone

    NASA Astrophysics Data System (ADS)

    Peić Tukuljac, Helena; Lissek, Hervé; Vandergheynst, Pierre

    2017-08-01

    Estimation of the location of sound sources is usually done using microphone arrays. Such settings provide an environment where we know the differences between the received signals at different microphones in terms of phase or attenuation, which enables localization of the sound sources. In our solution we exploit the properties of the room transfer function in order to localize a sound source inside a room with only one microphone. The shape of the room and the position of the microphone are assumed to be known. The design guidelines and limitations of the sensing matrix are given. The implementation is based on sparsity in terms of the voxels of the room that are occupied by a source. What is especially interesting about our solution is that it localizes the sound sources not only in the horizontal plane, but in terms of full 3D coordinates inside the room.
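
    A hedged sketch of the sparse-recovery step only: assuming a known sensing matrix A mapping voxel occupancy to the single-microphone observation (in the paper, built from the room transfer function), a generic sparse solver picks out the occupied voxel. Orthogonal matching pursuit is used here purely as a stand-in for whatever solver was actually used.

    ```python
    # Sparse voxel recovery with a random stand-in sensing matrix.
    import numpy as np

    def omp(A, y, k):
        """Orthogonal matching pursuit for y = A @ x with at most k nonzeros."""
        residual, support = y.copy(), []
        for _ in range(k):
            support.append(int(np.argmax(np.abs(A.T @ residual))))
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coef
        x = np.zeros(A.shape[1])
        x[support] = coef
        return x

    rng = np.random.default_rng(2)
    A = rng.standard_normal((64, 512))        # 512 voxels, 64 observation samples
    x_true = np.zeros(512)
    x_true[137] = 1.0                         # one occupied voxel (the source)
    x_hat = omp(A, A @ x_true, k=1)
    print(np.nonzero(x_hat))                  # -> voxel 137
    ```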

  17. Bayesian estimation of a source term of radiation release with approximately known nuclide ratios

    NASA Astrophysics Data System (ADS)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek

    2016-04-01

    We are concerned with estimation of the source term in the case of an accidental release from a known location, e.g. a power plant. Usually, the source term of an accidental release of radiation comprises a mixture of nuclides. Gamma dose rate measurements do not provide direct information on the source term composition. However, the physical properties of the respective nuclides (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g. from a known reactor inventory. The proposed method is based on a linear inverse model where the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned and regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to uncertainty in the ratios, the diagonal elements of the covariance matrix are considered to be unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following the Bayesian approach, we estimate all parameters of the model from the data, so that y, M, and the known ratios are the only inputs of the method. Since the inference of the model is intractable, we follow the Variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. The performance of the method is studied on a simulated 6-hour power plant release in which 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with a method using unknown nuclide ratios is given to demonstrate the usefulness of the proposed approach. This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014, Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
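
    A simplified sketch of the inversion described above: a MAP-style estimate of x in y = Mx with a zero-mean prior whose scales come from assumed-known nuclide ratios, and positivity enforced by non-negative least squares. This deliberately replaces the paper's Variational Bayes machinery and truncated Gaussian posterior with ordinary regularized NNLS; all numbers are toy values.

    ```python
    # Regularized non-negative inversion of y = M x (toy stand-in for the
    # paper's Variational Bayes scheme; M, x, and the ratios are invented).
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(3)
    n_obs, n_src = 40, 6
    M = np.abs(rng.standard_normal((n_obs, n_src)))    # toy SRS matrix
    x_true = np.array([4.0, 2.0, 1.0, 0.0, 0.0, 0.0])  # toy source term
    y = M @ x_true + 0.05 * rng.standard_normal(n_obs)

    ratios = np.array([4.0, 2.0, 1.0, 1.0, 1.0, 1.0])  # assumed-known nuclide ratios
    L = np.diag(1.0 / ratios)      # prior scales: larger ratio, weaker pull to zero
    lam = 0.1
    A_aug = np.vstack([M, lam * L])                    # min ||y-Mx||^2 + lam^2||Lx||^2
    y_aug = np.concatenate([y, np.zeros(n_src)])
    x_map, _ = nnls(A_aug, y_aug)                      # positivity via NNLS
    print(x_map)
    ```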

  18. Terms used by nurses to describe patient problems: can SNOMED III represent nursing concepts in the patient record?

    PubMed Central

    Henry, S B; Holzemer, W L; Reilly, C A; Campbell, K E

    1994-01-01

    OBJECTIVE: To analyze the terms used by nurses in a variety of data sources and to test the feasibility of using SNOMED III to represent nursing terms. DESIGN: Prospective research design with manual matching of terms to the SNOMED III vocabulary. MEASUREMENTS: The terms used by nurses to describe patient problems during 485 episodes of care for 201 patients hospitalized for Pneumocystis carinii pneumonia were identified. Problems from four data sources (nurse interview, intershift report, nursing care plan, and nurse progress note/flowsheet) were classified based on the substantive area of the problem and on the terminology used to describe the problem. A test subset of the 25 most frequently used terms from the two written data sources (nursing care plan and nurse progress note/flowsheet) was manually matched to SNOMED III terms to test the feasibility of using that existing vocabulary to represent nursing terms. RESULTS: Nurses most frequently described patient problems as signs/symptoms in the verbal nurse interview and intershift report. In the written data sources, problems were recorded as North American Nursing Diagnosis Association (NANDA) terms and signs/symptoms with similar frequencies. Of the nursing terms in the test subset, 69% were represented using one or more SNOMED III terms. PMID:7719788

  19. A Semi-implicit Treatment of Porous Media in Steady-State CFD.

    PubMed

    Domaingo, Andreas; Langmayr, Daniel; Somogyi, Bence; Almbauer, Raimund

    There are many situations in computational fluid dynamics which require the definition of source terms in the Navier-Stokes equations. These source terms not only allow the physics of interest to be modelled but also have a strong impact on the reliability, stability, and convergence of the numerics involved. Therefore, sophisticated numerical approaches exist for the description of such source terms. In this paper, we focus on the source terms present in the Navier-Stokes or Euler equations due to porous media, in particular the Darcy-Forchheimer equation. We introduce a method for the numerical treatment of the source term which is independent of the spatial discretization and based on linearization. In this description, the source term is treated in a fully implicit way, whereas the other flow variables can be computed in an implicit or explicit manner. This leads to a more robust description in comparison with a fully explicit approach. The method is well suited to be combined with coarse-grid CFD on Cartesian grids, which makes it especially favorable for the accelerated solution of coupled 1D-3D problems. To demonstrate the applicability and robustness of the proposed method, a proof-of-concept example in 1D, as well as more complex examples in 2D and 3D, is presented.
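
    The benefit of linearizing the source term can be seen in a scalar caricature. Below, a Darcy-Forchheimer-type sink S(u) = -(a + b|u|)u is integrated either fully explicitly or with the coefficient frozen at the old state and the source solved implicitly; the coefficients are made up, and this is a sketch of the idea rather than the authors' finite-volume implementation.

    ```python
    # Explicit vs. semi-implicit treatment of S(u) = -(a + b|u|) u (toy numbers).
    a, b, g = 50.0, 10.0, 100.0   # hypothetical Darcy/Forchheimer coefficients, forcing
    dt = 0.05
    u_exp = u_imp = 0.0

    for _ in range(40):
        # fully explicit source: overshoots and diverges (to NaN) at this time step
        u_exp = u_exp + dt * (g - (a + b * abs(u_exp)) * u_exp)
        # semi-implicit: coefficient (a + b|u|) frozen at the old state, source
        # solved implicitly -- the update is damping regardless of dt
        u_imp = (u_imp + dt * g) / (1.0 + dt * (a + b * abs(u_imp)))

    print(u_exp, u_imp)   # nan vs. ~1.53, the steady state of g = (a + b|u|) u
    ```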

  20. Efficient Development of High Fidelity Structured Volume Grids for Hypersonic Flow Simulations

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    2003-01-01

    A new technique for the control of grid line spacing and intersection angles of a structured volume grid, using elliptic partial differential equations (PDEs) is presented. Existing structured grid generation algorithms make use of source term hybridization to provide control of grid lines, imposing orthogonality implicitly at the boundary and explicitly on the interior of the domain. A bridging function between the two types of grid line control is typically used to blend the different orthogonality formulations. It is shown that utilizing such a bridging function with source term hybridization can result in the excessive use of computational resources and diminishes robustness. A new approach, Anisotropic Lagrange Based Trans-Finite Interpolation (ALBTFI), is offered as a replacement to source term hybridization. The ALBTFI technique captures the essence of the desired grid controls while improving the convergence rate of the elliptic PDEs when compared with source term hybridization. Grid generation on a blunt cone and a Shuttle Orbiter is used to demonstrate and assess the ALBTFI technique, which is shown to be as much as 50% faster, more robust, and produces higher quality grids than source term hybridization.

  1. Source Term Estimation of Radioxenon Released from the Fukushima Dai-ichi Nuclear Reactors Using Measured Air Concentrations and Atmospheric Transport Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eslinger, Paul W.; Biegalski, S.; Bowyer, Ted W.

    2014-01-01

    Systems designed to monitor airborne radionuclides released from underground nuclear explosions detected radioactive fallout from the Fukushima Daiichi nuclear accident in March 2011. Atmospheric transport modeling (ATM) of plumes of noble gases and particulates was performed soon after the accident to determine plausible detection locations of any radioactive releases to the atmosphere. We combine sampling data from multiple International Monitoring System (IMS) locations in a new way to estimate the magnitude and time sequence of the releases. Dilution factors from the modeled plume at five different detection locations were combined with 57 atmospheric concentration measurements of 133-Xe taken from March 18 to March 23 to estimate the source term. This approach estimates that 59% of the 1.24×10¹⁹ Bq of 133-Xe present in the reactors at the time of the earthquake was released to the atmosphere over a three-day period. Source term estimates from combinations of detection sites have lower spread than estimates based on measurements at single detection sites. Sensitivity cases based on data from four or more detection locations bound the source term between 35% and 255% of the available xenon inventory.

  2. Boundary control of bidomain equations with state-dependent switching source functions in the ionic model

    NASA Astrophysics Data System (ADS)

    Chamakuri, Nagaiah; Engwer, Christian; Kunisch, Karl

    2014-09-01

    Optimal control for cardiac electrophysiology based on the bidomain equations in conjunction with the Fenton-Karma ionic model is considered. This generic ventricular model approximates well the restitution properties and spiral wave behavior of more complex ionic models of cardiac action potentials. However, it is challenging due to the appearance of state-dependent discontinuities in the source terms. A computational framework for the numerical realization of optimal control problems is presented. Essential ingredients are a shape-calculus-based treatment of the sensitivities of the discontinuous source terms and a marching cubes algorithm to track iso-surfaces of excitation wavefronts. Numerical results exhibit successful defibrillation by applying an optimally controlled extracellular stimulus.

  3. Development of surrogate models for the prediction of the flow around an aircraft propeller

    NASA Astrophysics Data System (ADS)

    Salpigidou, Christina; Misirlis, Dimitris; Vlahostergios, Zinon; Yakinthos, Kyros

    2018-05-01

    In the present work, the derivation of two surrogate models (SMs) for modelling the flow around a propeller for small aircraft is presented. Both methodologies use derived functions based on computations with the detailed propeller geometry. The computations were performed using the k-ω shear stress transport model for turbulence. In the SMs, the propeller was modelled in a computational domain of disk-like geometry, where source terms were introduced into the momentum equations. In the first SM, the source terms were polynomial functions of swirl and thrust, mainly related to the propeller radius. In the second SM, regression analysis was used to correlate the source terms with the velocity distribution through the propeller. The proposed SMs achieved faster convergence relative to the detailed model, while also providing results closer to the available operational data. The regression-based model was the most accurate and required less computational time for convergence.

  4. Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2004-01-01

    A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.

  5. Coral proxy record of decadal-scale reduction in base flow from Moloka'i, Hawaii

    USGS Publications Warehouse

    Prouty, Nancy G.; Jupiter, Stacy D.; Field, Michael E.; McCulloch, Malcolm T.

    2009-01-01

    Groundwater is a major resource in Hawaii and is the principal source of water for municipal, agricultural, and industrial use. With a growing population, a long-term downward trend in rainfall, and the need for proper groundwater management, a better understanding of the hydroclimatological system is essential. Proxy records from corals can supplement long-term observational networks, offering an accessible source of hydrologic and climate information. To develop a qualitative proxy for historic groundwater discharge to coastal waters, a suite of rare earth elements and yttrium (REYs) were analyzed from coral cores collected along the south shore of Moloka'i, Hawaii. The coral REY to calcium (Ca) ratios were evaluated against hydrological parameters, yielding the strongest relationship to base flow. Dissolution of REYs from labradorite and olivine in the basaltic rock aquifers is likely the primary source of coastal ocean REYs. There was a statistically significant downward trend (−40%) in subannually resolved REY/Ca ratios over the last century. This is consistent with long-term records of stream discharge from Moloka'i, which imply a downward trend in base flow since 1913. A decrease in base flow is observed statewide, consistent with the long-term downward trend in annual rainfall over much of the state. With greater demands on freshwater resources, it is appropriate for withdrawal scenarios to consider long-term trends and short-term climate variability. It is possible that coral paleohydrological records can be used to conduct model-data comparisons in groundwater flow models used to simulate changes in groundwater level and coastal discharge.

  6. Entropy-Based Bounds On Redundancies Of Huffman Codes

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic J.

    1992-01-01

    Report presents extension of theory of redundancy of binary prefix codes of Huffman type, including derivation of variety of bounds expressed in terms of entropy of source and size of alphabet. Recent developments yielded bounds on redundancy of Huffman code in terms of probabilities of various components in source alphabet. In practice, redundancies of optimal prefix codes are often closer to 0 than to 1.
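
    A worked example of the quantity being bounded: redundancy is the expected Huffman codeword length minus the source entropy, in bits per symbol.

    ```python
    # Build a binary Huffman code and compute its redundancy L - H.
    import heapq, math

    def huffman_lengths(probs):
        """Codeword lengths of a binary Huffman code for the given distribution."""
        heap = [(p, [i]) for i, p in enumerate(probs)]
        lengths = [0] * len(probs)
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, s1 = heapq.heappop(heap)   # merge the two least probable groups
            p2, s2 = heapq.heappop(heap)
            for i in s1 + s2:
                lengths[i] += 1            # every merge adds one bit to members
            heapq.heappush(heap, (p1 + p2, s1 + s2))
        return lengths

    probs = [0.5, 0.25, 0.15, 0.1]
    L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
    H = -sum(p * math.log2(p) for p in probs)
    print(f"avg length {L:.3f}, entropy {H:.3f}, redundancy {L - H:.3f}")
    ```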

  7. On volume-source representations based on the representation theorem

    NASA Astrophysics Data System (ADS)

    Ichihara, Mie; Kusakabe, Tetsuya; Kame, Nobuki; Kumagai, Hiroyuki

    2016-01-01

    We discuss different ways to characterize a moment tensor associated with an actual volume change ΔV_C, which has been represented in terms of either the stress glut or the corresponding stress-free volume change ΔV_T. Eshelby's virtual operation provides a conceptual model relating ΔV_C to ΔV_T and the stress glut, where non-elastic processes such as phase transitions allow ΔV_T to be introduced and subsequent elastic deformation of -ΔV_T is assumed to produce the stress glut. While it is true that ΔV_T correctly represents the moment tensor of an actual volume source with volume change ΔV_C, an explanation as to why such an operation relating ΔV_C to ΔV_T exists has not previously been given. This study presents a comprehensive explanation of the relationship between ΔV_C and ΔV_T based on the representation theorem. The displacement field is represented using Green's function, which consists of two integrals over the source surface: one for displacement and the other for traction. Both integrals are necessary for representing volumetric sources, whereas the representation of seismic faults includes only the first term, as the second integral over the two adjacent fault surfaces, across which the traction balances, always vanishes. Therefore, in a seismological framework, the contribution from the second term should be included as an additional surface displacement. We show that the seismic moment tensor of a volume source is directly obtained from the actual state of the displacement and stress at the source without considering any virtual non-elastic operations. A purely mathematical procedure based on the representation theorem enables us to specify the additional imaginary displacement necessary for representing a volume source only by the displacement term, which links ΔV_C to ΔV_T. It also specifies the additional imaginary stress necessary for representing a moment tensor solely by the traction term, which gives the "stress glut." The imaginary displacement-stress approach clarifies the mathematical background to the classical theory.
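
    For reference, the two source-surface integrals the abstract refers to have the schematic Aki-Richards form below; sign and convolution conventions vary between texts, so this is a reminder of the structure, not a statement of the paper's own notation.

    ```latex
    % Representation theorem: displacement term (first integral) plus
    % traction term (second integral); * denotes time convolution.
    u_n(\mathbf{x},t) = \int_{\Sigma} u_i(\boldsymbol{\xi},t) * c_{ijpq}\, n_j\,
        \frac{\partial G_{np}(\mathbf{x},t;\boldsymbol{\xi})}{\partial \xi_q}\, d\Sigma
      \;-\; \int_{\Sigma} t_i(\boldsymbol{\xi},t) * G_{ni}(\mathbf{x},t;\boldsymbol{\xi})\, d\Sigma
    ```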

  8. A controlled variation scheme for convection treatment in pressure-based algorithm

    NASA Technical Reports Server (NTRS)

    Shyy, Wei; Thakur, Siddharth; Tucker, Kevin

    1993-01-01

    Convection effects and source terms are two primary sources of difficulty in computing the turbulent reacting flows typically encountered in propulsion devices. The present work intends to elucidate the individual as well as the collective roles of convection and source terms in the fluid flow equations, and to devise appropriate treatments and implementations to improve our current capability of predicting such flows. A controlled variation scheme (CVS) has been under development in the context of a pressure-based algorithm; it adaptively regulates the amount of numerical diffusivity, relative to the central difference scheme, according to the variation in the local flow field. Both the basic concepts and a pragmatic assessment will be presented to highlight the status of this work.

  9. Bayesian source term estimation of atmospheric releases in urban areas using LES approach.

    PubMed

    Xue, Fei; Kikumoto, Hideki; Li, Xiaofeng; Ooka, Ryozo

    2018-05-05

    The estimation of source information from limited measurements of a sensor network is a challenging inverse problem, which can be viewed as an assimilation process of the observed concentration data and the predicted concentration data. When dealing with releases in built-up areas, the predicted data are generally obtained by the Reynolds-averaged Navier-Stokes (RANS) equations, which yields building-resolving results; however, RANS-based models are outperformed by large-eddy simulation (LES) in the predictions of both airflow and dispersion. Therefore, it is important to explore the possibility of improving the estimation of the source parameters by using the LES approach. In this paper, a novel source term estimation method is proposed based on the LES approach using Bayesian inference. The source-receptor relationship is obtained by solving the adjoint equations constructed using the time-averaged flow field simulated by the LES approach based on the gradient diffusion hypothesis. A wind tunnel experiment with a constant point source downwind of a single building model is used to evaluate the performance of the proposed method, which is compared with that of the existing method using a RANS model. The results show that the proposed method reduces the errors of source location and releasing strength by 77% and 28%, respectively.
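
    A toy sketch of the Bayesian step only (not an LES or adjoint computation): given any forward model predicting sensor concentrations for a candidate source, a Gaussian likelihood scores candidates over a grid, and the best-scoring location/strength is the estimate. The decay-law forward model and all numbers below are placeholders.

    ```python
    # Grid-search MAP estimate of source location/strength under a Gaussian
    # likelihood; the forward model is a made-up decay law, illustrative only.
    import numpy as np

    rng = np.random.default_rng(4)
    sensors = rng.uniform(0, 10, size=(8, 2))          # sensor x,y positions

    def predict(src_xy, q):
        d = np.linalg.norm(sensors - src_xy, axis=1)
        return q / (1.0 + d ** 2)                      # placeholder decay law

    true_xy, true_q = np.array([3.0, 7.0]), 5.0
    obs = predict(true_xy, true_q) + 0.01 * rng.standard_normal(8)

    xs = ys = np.linspace(0, 10, 51)
    qs = np.linspace(1, 10, 19)
    best, best_lp = None, -np.inf
    for x in xs:
        for y in ys:
            for q in qs:
                r = obs - predict(np.array([x, y]), q)
                lp = -0.5 * np.sum((r / 0.01) ** 2)    # log-likelihood, flat prior
                if lp > best_lp:
                    best, best_lp = (x, y, q), lp
    print(best)   # close to the true (3.0, 7.0, 5.0)
    ```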

  10. The Effect of Bilingual Term List Size on Dictionary-Based Cross-Language Information Retrieval

    DTIC Science & Technology

    2006-01-01

    The Effect of Bilingual Term List Size on Dictionary-Based Cross-Language Information Retrieval. Dina Demner-Fushman, Department of Computer Science... dictionary-based Cross-Language Information Retrieval (CLIR), in which the goal is to find documents written in one natural language based on queries that... in which the documents are written. In dictionary-based CLIR techniques, the principal source of translation knowledge is a translation lexicon

  11. Treating convection in sequential solvers

    NASA Technical Reports Server (NTRS)

    Shyy, Wei; Thakur, Siddharth

    1992-01-01

    The treatment of the convection terms in sequential solvers, a standard procedure found in virtually all pressure-based algorithms, for computing flow problems with sharp gradients and source terms is investigated. Both scalar model problems and the one-dimensional gas dynamics equations have been used to study the various issues involved. Different approaches, including the use of nonlinear filtering techniques and the adoption of TVD-type schemes, have been investigated. Special treatments of the source terms, such as pressure gradients and heat release, have also been devised, yielding insight and improved accuracy of the numerical procedure adopted.
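
    One of the treatments mentioned above can be shown in miniature. The sketch below applies a minmod-limited (TVD-type) second-order upwind update to a 1D advected square wave; it illustrates the class of schemes studied, not the paper's actual tests.

    ```python
    # Minmod-limited second-order upwind scheme for u_t + a u_x = 0, periodic BC.
    import numpy as np

    def minmod(p, q):
        return np.where(p * q > 0, np.sign(p) * np.minimum(np.abs(p), np.abs(q)), 0.0)

    n, a, cfl = 200, 1.0, 0.5
    x = np.linspace(0, 1, n, endpoint=False)
    dx = x[1] - x[0]
    dt = cfl * dx / a
    u = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)          # sharp-gradient initial data

    for _ in range(200):
        du = minmod(np.roll(u, -1) - u, u - np.roll(u, 1))  # limited slope
        uface = u + 0.5 * (1 - a * dt / dx) * du            # face value at i+1/2
        u = u - (a * dt / dx) * (uface - np.roll(uface, 1)) # conservative update
    print(u.min(), u.max())   # stays within [0, 1]: no spurious oscillations
    ```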

  12. Low birth weight and air pollution in California: Which sources and components drive the risk?

    PubMed

    Laurent, Olivier; Hu, Jianlin; Li, Lianfa; Kleeman, Michael J; Bartell, Scott M; Cockburn, Myles; Escobedo, Loraine; Wu, Jun

    2016-01-01

    Intrauterine growth restriction has been associated with exposure to air pollution, but there is a need to clarify which sources and components are most likely responsible. This study investigated the associations between low birth weight (LBW, <2500 g) in term born infants (≥37 gestational weeks) and air pollution by source and composition in California, over the period 2001-2008. Complementary exposure models were used: an empirical Bayesian kriging model for the interpolation of ambient pollutant measurements, a source-oriented chemical transport model (using California emission inventories) that estimated fine and ultrafine particulate matter (PM2.5 and PM0.1, respectively) mass concentrations (4 km × 4 km) by source and composition, a line-source roadway dispersion model at fine resolution, and traffic index estimates. Birth weight was obtained from California birth certificate records. A case-cohort design was used. Five controls per term LBW case were randomly selected (without covariate matching or stratification) from among term births. The resulting datasets were analyzed by logistic regression with a random effect by hospital, using generalized additive mixed models adjusted for race/ethnicity, education, maternal age and household income. In total, 72,632 singleton term LBW cases were included. Term LBW was positively and significantly associated with interpolated measurements of ozone but not total fine PM or nitrogen dioxide. No significant association was observed between term LBW and primary PM from all sources grouped together. A positive significant association was observed for secondary organic aerosols. Exposure to elemental carbon (EC), nitrates and ammonium were also positively and significantly associated with term LBW, but only for exposure during the third trimester of pregnancy. Significant positive associations were observed between term LBW risk and primary PM emitted by on-road gasoline and diesel or by commercial meat cooking sources. Primary PM from wood burning was inversely associated with term LBW. Significant positive associations were also observed between term LBW and ultrafine particle numbers modeled with the line-source roadway dispersion model, traffic density and proximity to roadways. This large study based on complementary exposure metrics suggests that not only primary pollution sources (traffic and commercial meat cooking) but also EC and secondary pollutants are risk factors for term LBW.

  13. Fukushima Daiichi reactor source term attribution using cesium isotope ratios from contaminated environmental samples

    DOE PAGES

    Snow, Mathew S.; Snyder, Darin C.; Delmore, James E.

    2016-01-18

    Source term attribution of environmental contamination following the Fukushima Daiichi Nuclear Power Plant (FDNPP) disaster is complicated by a large number of possible similar emission source terms (e.g. FDNPP reactor cores 1–3 and spent fuel ponds 1–4). Cesium isotopic analyses can be utilized to discriminate between environmental contamination from different FDNPP source terms and, if samples are sufficiently temporally resolved, potentially provide insights into the extent of reactor core damage at a given time. Rice, soil, mushroom, and soybean samples taken 100–250 km from the FDNPP site were dissolved using microwave digestion. Radiocesium was extracted and purified using two sequential ammonium molybdophosphate-polyacrylonitrile columns, following which 135Cs/137Cs isotope ratios were measured using thermal ionization mass spectrometry (TIMS). Results were compared with data reported previously from locations to the northwest of FDNPP and 30 km to the south of FDNPP. 135Cs/137Cs isotope ratios from samples 100–250 km to the southwest of the FDNPP site show a consistent value of 0.376 ± 0.008. 135Cs/137Cs versus 134Cs/137Cs correlation plots suggest that radiocesium to the southwest is derived from a mixture of FDNPP reactor cores 1, 2, and 3. Conclusions from the cesium isotopic data are in agreement with those derived independently based upon the event chronology combined with meteorological conditions at the time of the disaster. In conclusion, cesium isotopic analyses provide a powerful tool for source term discrimination of environmental radiocesium contamination at the FDNPP site. For higher precision source term attribution and forensic determination of the FDNPP core conditions based upon cesium, analyses of a larger number of samples from locations to the north and south of the FDNPP site (particularly time-resolved air filter samples) are needed.

  14. Fukushima Daiichi reactor source term attribution using cesium isotope ratios from contaminated environmental samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snow, Mathew S.; Snyder, Darin C.; Delmore, James E.

    Source term attribution of environmental contamination following the Fukushima Daiichi Nuclear Power Plant (FDNPP) disaster is complicated by a large number of possible similar emission source terms (e.g. FDNPP reactor cores 1–3 and spent fuel ponds 1–4). Cesium isotopic analyses can be utilized to discriminate between environmental contamination from different FDNPP source terms and, if samples are sufficiently temporally resolved, potentially provide insights into the extent of reactor core damage at a given time. Rice, soil, mushroom, and soybean samples taken 100–250 km from the FDNPP site were dissolved using microwave digestion. Radiocesium was extracted and purified using two sequential ammonium molybdophosphate-polyacrylonitrile columns, following which 135Cs/137Cs isotope ratios were measured using thermal ionization mass spectrometry (TIMS). Results were compared with data reported previously from locations to the northwest of FDNPP and 30 km to the south of FDNPP. 135Cs/137Cs isotope ratios from samples 100–250 km to the southwest of the FDNPP site show a consistent value of 0.376 ± 0.008. 135Cs/137Cs versus 134Cs/137Cs correlation plots suggest that radiocesium to the southwest is derived from a mixture of FDNPP reactor cores 1, 2, and 3. Conclusions from the cesium isotopic data are in agreement with those derived independently based upon the event chronology combined with meteorological conditions at the time of the disaster. In conclusion, cesium isotopic analyses provide a powerful tool for source term discrimination of environmental radiocesium contamination at the FDNPP site. For higher precision source term attribution and forensic determination of the FDNPP core conditions based upon cesium, analyses of a larger number of samples from locations to the north and south of the FDNPP site (particularly time-resolved air filter samples) are needed.

  15. Fukushima Daiichi reactor source term attribution using cesium isotope ratios from contaminated environmental samples.

    PubMed

    Snow, Mathew S; Snyder, Darin C; Delmore, James E

    2016-02-28

    Source term attribution of environmental contamination following the Fukushima Daiichi Nuclear Power Plant (FDNPP) disaster is complicated by a large number of possible similar emission source terms (e.g. FDNPP reactor cores 1-3 and spent fuel ponds 1-4). Cesium isotopic analyses can be utilized to discriminate between environmental contamination from different FDNPP source terms and, if samples are sufficiently temporally resolved, potentially provide insights into the extent of reactor core damage at a given time. Rice, soil, mushroom, and soybean samples taken 100-250 km from the FDNPP site were dissolved using microwave digestion. Radiocesium was extracted and purified using two sequential ammonium molybdophosphate-polyacrylonitrile columns, following which (135)Cs/(137)Cs isotope ratios were measured using thermal ionization mass spectrometry (TIMS). Results were compared with data reported previously from locations to the northwest of FDNPP and 30 km to the south of FDNPP. (135)Cs/(137)Cs isotope ratios from samples 100-250 km to the southwest of the FDNPP site show a consistent value of 0.376 ± 0.008. (135)Cs/(137)Cs versus (134)Cs/(137)Cs correlation plots suggest that radiocesium to the southwest is derived from a mixture of FDNPP reactor cores 1, 2, and 3. Conclusions from the cesium isotopic data are in agreement with those derived independently based upon the event chronology combined with meteorological conditions at the time of the disaster. Cesium isotopic analyses provide a powerful tool for source term discrimination of environmental radiocesium contamination at the FDNPP site. For higher precision source term attribution and forensic determination of the FDNPP core conditions based upon cesium, analyses of a larger number of samples from locations to the north and south of the FDNPP site (particularly time-resolved air filter samples) are needed.
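
    A hedged sketch of the mixing arithmetic implied by the correlation plots: if each core's (135)Cs/(137)Cs and (134)Cs/(137)Cs endmember ratios were known, the fraction of (137)Cs attributable to each core would follow from a 3×3 linear solve. The endmember values below are invented for illustration only; the abstract does not report them.

    ```python
    # Three-endmember isotope mixing: solve for per-core 137Cs fractions.
    import numpy as np

    r135 = np.array([0.33, 0.38, 0.42])   # hypothetical core 1-3 135Cs/137Cs ratios
    r134 = np.array([1.04, 1.08, 1.02])   # hypothetical core 1-3 134Cs/137Cs ratios
    A = np.vstack([r135, r134, np.ones(3)])
    obs = np.array([0.376, 1.05, 1.0])    # sample ratios + closure (fractions sum to 1)
    fractions = np.linalg.solve(A, obs)
    print(fractions)                      # 137Cs fraction attributed to each core
    ```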

  16. Automated source term and wind parameter estimation for atmospheric transport and dispersion applications

    NASA Astrophysics Data System (ADS)

    Bieringer, Paul E.; Rodriguez, Luna M.; Vandenberghe, Francois; Hurst, Jonathan G.; Bieberbach, George; Sykes, Ian; Hannan, John R.; Zaragoza, Jake; Fry, Richard N.

    2015-12-01

    Accurate simulations of the atmospheric transport and dispersion (AT&D) of hazardous airborne materials rely heavily on the source term parameters necessary to characterize the initial release and the meteorological conditions that drive the downwind dispersion. In many cases the source parameters are not known and are consequently based on rudimentary assumptions. This is particularly true of accidental releases and the intentional releases associated with terrorist incidents. When available, meteorological observations are often not representative of the conditions at the location of the release, and the use of these non-representative meteorological conditions can result in significant errors in the hazard assessments downwind of the sensors, even when the other source parameters are accurately characterized. Here, we describe a computationally efficient methodology to characterize both the release source parameters and the low-level winds (e.g., winds near the surface) required to produce a refined downwind hazard. This methodology, known as the Variational Iterative Refinement Source Term Estimation (STE) Algorithm (VIRSA), consists of a combination of modeling systems. These systems include a back-trajectory based source inversion method, a forward Gaussian puff dispersion model, and a variational refinement algorithm that uses both a simple forward AT&D model that is a surrogate for the more complex Gaussian puff model and a formal adjoint of this surrogate model. The back-trajectory based method is used to calculate a "first guess" source estimate based on the available observations of the airborne contaminant plume and atmospheric conditions. The variational refinement algorithm is then used to iteratively refine the first-guess STE parameters and meteorological variables. The algorithm has been evaluated across a wide range of scenarios of varying complexity. It has been shown to improve the estimated source location by several hundred percent (normalized by the distance from the source to the closest sampler) and to improve mass estimates by several orders of magnitude. Furthermore, it is able to operate in scenarios with inconsistencies between the wind and airborne contaminant sensor observations, adjusting the wind to provide a better match between the hazard prediction and the observations.
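
    A conceptual sketch of the first-guess-then-refine flow, with a crude Gaussian-plume stand-in for the forward model and a black-box optimizer standing in for the adjoint-based variational refinement; the real system also adjusts the low-level winds, which this toy omits.

    ```python
    # First-guess source estimate refined against sampler observations
    # (toy plume model; all parameters and geometry are invented).
    import numpy as np
    from scipy.optimize import minimize

    def plume(theta, xy):
        """Toy steady plume surrogate, wind assumed along +x."""
        x0, y0, q = theta
        dx = np.maximum(xy[:, 0] - x0, 0.1)            # downwind distance
        dy = xy[:, 1] - y0
        return q / dx * np.exp(-dy ** 2 / (0.2 * dx))

    rng = np.random.default_rng(5)
    samplers = rng.uniform([2, -2], [10, 2], size=(20, 2))   # sampler locations
    obs = plume((0.0, 0.0, 2.0), samplers)                   # synthetic observations

    first_guess = np.array([1.0, 0.5, 1.0])   # crude back-trajectory-style estimate
    cost = lambda th: np.sum((plume(th, samplers) - obs) ** 2)
    refined = minimize(cost, first_guess, method="Nelder-Mead").x
    print(refined)                            # should land near the true (0, 0, 2)
    ```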

  17. Uncertainty, variability, and earthquake physics in ground‐motion prediction equations

    USGS Publications Warehouse

    Baltay, Annemarie S.; Hanks, Thomas C.; Abrahamson, Norm A.

    2017-01-01

    Residuals between ground‐motion data and ground‐motion prediction equations (GMPEs) can be decomposed into terms representing earthquake source, path, and site effects. These terms can be cast in terms of repeatable (epistemic) residuals and the random (aleatory) components. Identifying the repeatable residuals leads to a GMPE with reduced uncertainty for a specific source, site, or path location, which in turn can yield a lower hazard level at small probabilities of exceedance. We illustrate a schematic framework for this residual partitioning with a dataset from the ANZA network, which straddles the central San Jacinto fault in southern California. The dataset consists of more than 3200 earthquakes (1.15≤M≤3) and their peak ground accelerations (PGAs), recorded at close distances (R≤20 km). We construct a small‐magnitude GMPE for these PGA data, incorporating VS30 site conditions and geometrical spreading. Identification and removal of the repeatable source, path, and site terms yield an overall reduction in the standard deviation from 0.97 (in ln units) to 0.44, for a nonergodic assumption, that is, for a single‐source location, single site, and single path. We give examples of relationships between independent seismological observables and the repeatable terms. We find a correlation between location‐based source terms and stress drops in the San Jacinto fault zone region; an explanation of the site term as a function of kappa, the near‐site attenuation parameter; and a suggestion that the path component can be related directly to elastic structure. These correlations allow the repeatable source location, site, and path terms to be determined a priori using independent geophysical relationships. Those terms could be incorporated into location‐specific GMPEs for more accurate and precise ground‐motion prediction.
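
    The partitioning itself is simple to sketch: group-mean (repeatable) event and station terms are removed from the total residuals, and the standard deviation of what remains is the reduced, nonergodic aleatory sigma. The synthetic numbers below mimic the 0.97-to-0.44 reduction in miniature; a mixed-effects regression would be used in practice.

    ```python
    # Partition GMPE residuals into repeatable event/site terms (synthetic data).
    import numpy as np

    rng = np.random.default_rng(6)
    n_eq, n_sta, n_rec = 50, 30, 5000
    event_term = 0.5 * rng.standard_normal(n_eq)    # repeatable source effects
    site_term  = 0.4 * rng.standard_normal(n_sta)   # repeatable site effects
    eq  = rng.integers(0, n_eq,  n_rec)
    sta = rng.integers(0, n_sta, n_rec)
    resid = event_term[eq] + site_term[sta] + 0.3 * rng.standard_normal(n_rec)

    print("total sigma:", resid.std())
    for _ in range(5):                               # alternate group de-meaning
        ev_mean  = np.array([resid[eq == k].mean()  for k in range(n_eq)])
        resid   -= ev_mean[eq]
        sta_mean = np.array([resid[sta == k].mean() for k in range(n_sta)])
        resid   -= sta_mean[sta]
    print("nonergodic sigma:", resid.std())          # drops toward the 0.3 aleatory part
    ```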

  18. Trends and drivers of marine debris on the Atlantic coast of the United States 1997-2007

    USGS Publications Warehouse

    Ribic, C.A.; Sheavly, S.B.; Rugg, D.J.; Erdmann, Eric S.

    2010-01-01

    For the first time, we documented regional differences in amounts and long-term trends of marine debris along the US Atlantic coast. The Southeast Atlantic had low land-based and general-source debris loads as well as no increases despite a 19% increase in coastal population. The Northeast (8% population increase) also had low land-based and general-source debris loads and no increases. The Mid-Atlantic (10% population increase) fared the worst, with heavy land-based and general-source debris loads that increased over time. Ocean-based debris did not change in the Northeast where the fishery is relatively stable; it declined over the Mid-Atlantic and Southeast and was correlated with declining regional fisheries. Drivers, including human population, land use status, fishing activity, and oceanic current systems, had complex relationships with debris loads at local and regional scales. Management challenges remain undeniably large but solid information from long-term programs is one key to addressing this pressing pollution issue.

  19. Trends and drivers of marine debris on the Atlantic coast of the United States 1997-2007.

    PubMed

    Ribic, Christine A; Sheavly, Seba B; Rugg, David J; Erdmann, Eric S

    2010-08-01

    For the first time, we documented regional differences in amounts and long-term trends of marine debris along the US Atlantic coast. The Southeast Atlantic had low land-based and general-source debris loads as well as no increases despite a 19% increase in coastal population. The Northeast (8% population increase) also had low land-based and general-source debris loads and no increases. The Mid-Atlantic (10% population increase) fared the worst, with heavy land-based and general-source debris loads that increased over time. Ocean-based debris did not change in the Northeast where the fishery is relatively stable; it declined over the Mid-Atlantic and Southeast and was correlated with declining regional fisheries. Drivers, including human population, land use status, fishing activity, and oceanic current systems, had complex relationships with debris loads at local and regional scales. Management challenges remain undeniably large but solid information from long-term programs is one key to addressing this pressing pollution issue. Published by Elsevier Ltd.

  20. Forcing scheme analysis for the axisymmetric lattice Boltzmann method under incompressible limit.

    PubMed

    Zhang, Liangqi; Yang, Shiliang; Zeng, Zhong; Chen, Jie; Yin, Linmao; Chew, Jia Wei

    2017-04-01

    Because the standard lattice Boltzmann (LB) method is formulated for the Cartesian Navier-Stokes (NS) equations, additional source terms are necessary in the axisymmetric LB method to represent the axisymmetric effects. The accuracy and applicability of axisymmetric LB models therefore depend on the forcing schemes adopted for discretization of the source terms. In this study, three forcing schemes, namely the trapezium-rule-based scheme, the direct forcing scheme, and the semi-implicit centered scheme, are analyzed theoretically by investigating their derived macroscopic equations in the diffusive scaling. In particular, the finite-difference interpretation of the standard LB method is extended to LB equations with source terms, and the accuracy of the different forcing schemes is then evaluated for the axisymmetric LB method. Theoretical analysis indicates that the discrete lattice effects arising from the direct forcing scheme are part of the truncation error terms and thus do not affect the overall accuracy of the standard LB method with a general force term (i.e., when only the source terms in the momentum equation are considered), but they lead to incorrect macroscopic equations for the axisymmetric LB models. On the other hand, the trapezium-rule-based scheme and the semi-implicit centered scheme both avoid the discrete lattice effects and recover the correct macroscopic equations. Numerical tests for validating the theoretical analysis show that both the numerical stability and the accuracy of axisymmetric LB simulations are affected by the direct forcing scheme, indicating that forcing schemes free of discrete lattice effects are necessary for the axisymmetric LB method.
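
    The bookkeeping behind these forcing schemes can be illustrated on a much simpler problem. The sketch below solves a 1D diffusion equation with a uniform volumetric source on a D1Q3 lattice, once with direct forcing and once with a trapezium-rule (half-step) treatment in which the macroscopic moment is redefined and the source is rescaled by (1 - 1/(2τ)). This generic scalar example, with assumed parameter values, only shows the mechanics; it is not the axisymmetric model analyzed in the paper.

        import numpy as np

        nx, tau, S = 64, 0.8, 1e-4       # grid size, relaxation time, source
        w = np.array([2/3, 1/6, 1/6])    # D1Q3 weights
        c = [0, 1, -1]                   # lattice velocities

        def run(nsteps, trapezium):
            f = np.outer(w, np.ones(nx))                 # start at rho = 1
            for _ in range(nsteps):
                if trapezium:
                    rho = f.sum(0) + S / 2               # redefined moment
                    src = (1 - 1 / (2 * tau)) * S        # rescaled source
                else:
                    rho = f.sum(0)                       # raw moment,
                    src = S                              # raw source
                f += (np.outer(w, rho) - f) / tau + np.outer(w, src * np.ones(nx))
                for i in (1, 2):
                    f[i] = np.roll(f[i], c[i])           # periodic streaming
            return f.sum(0) + (S / 2 if trapezium else 0.0)

        for trap in (False, True):
            print("trapezium" if trap else "direct   ", run(1000, trap).mean())
        # Both recover the mean growth rho ~ 1 + S*t in this uniform case;
        # the schemes differ (as the paper shows) once the source varies in
        # space or time or couples to velocity gradients.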

  1. Sparse reconstruction for quantitative bioluminescence tomography based on the incomplete variables truncated conjugate gradient method.

    PubMed

    He, Xiaowei; Liang, Jimin; Wang, Xiaorui; Yu, Jingjing; Qu, Xiaochao; Wang, Xiaodong; Hou, Yanbin; Chen, Duofang; Liu, Fang; Tian, Jie

    2010-11-22

    In this paper, we present an incomplete variables truncated conjugate gradient (IVTCG) method for bioluminescence tomography (BLT). Considering the sparse characteristic of the light source and the insufficient surface measurement in BLT scenarios, we combine a sparseness-inducing (ℓ1 norm) regularization term with a quadratic error term in the IVTCG-based framework for solving the inverse problem. By limiting the number of variables updated at each iteration and combining a variable-splitting strategy to find the search direction more efficiently, the method obtains fast and stable source reconstruction, even without a priori information on the permissible source region or multispectral measurements. Numerical experiments on a mouse atlas validate the effectiveness of the method. In vivo mouse experimental results further indicate its potential for a practical BLT system.
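
    The objective class named here, a quadratic error term plus an ℓ1 penalty, can be minimized by several schemes. The sketch below uses plain ISTA (iterative soft-thresholding) on a synthetic underdetermined system as a stand-in; it illustrates the sparse-recovery setting but is not the IVTCG algorithm itself.

        import numpy as np

        def ista(A, y, lam, steps=500):
            L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of grad
            x = np.zeros(A.shape[1])
            for _ in range(steps):
                z = x - A.T @ (A @ x - y) / L    # gradient step on 0.5||Ax-y||^2
                x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0)  # soft threshold
            return x

        rng = np.random.default_rng(2)
        A = rng.normal(size=(40, 200))           # few boundary measurements,
        x_true = np.zeros(200)                   # many candidate source voxels
        x_true[[5, 60, 150]] = [1.0, -0.5, 2.0]
        x_hat = ista(A, A @ x_true, lam=0.05)
        print(np.flatnonzero(np.abs(x_hat) > 0.1))  # recovered support ~ [5 60 150]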

  2. Systematically biological prioritizing remediation sites based on datasets of biological investigations and heavy metals in soil

    NASA Astrophysics Data System (ADS)

    Lin, Wei-Chih; Lin, Yu-Pin; Anthony, Johnathen

    2015-04-01

    Heavy metal pollution has adverse effects not only on the focal invertebrate species of this study, such as reduced pupa weight and increased larval mortality, but also on the higher-trophic-level organisms which feed on them, directly or indirectly, through biomagnification. Despite this, few studies of remediation prioritization take species distributions or biological conservation priorities into consideration. This study develops a novel approach for delineating sites which are both contaminated by any of five readily bioaccumulated heavy metal soil contaminants and of high ecological importance for the highly mobile, low-trophic-level focal species. The conservation priority of each site was based on the projected distributions of six moth species, simulated via the presence-only maximum entropy species distribution model followed by the application of a systematic conservation tool. To increase the number of available samples, we also integrated crowd-sourced data with professionally collected data via a novel optimization procedure based on a simulated annealing algorithm. This integration is important because, while crowd-sourced data can drastically increase the number of samples available to ecologists, their quality and reliability can be called into question, adding yet another source of uncertainty in projecting species distributions. The optimization method screens crowd-sourced data in terms of the environmental variables that correspond to the professionally collected data (see the sketch below). The sample distribution data were derived from two sources: the EnjoyMoths project in Taiwan (crowd-sourced data) and Global Biodiversity Information Facility (GBIF) field data (professional data). The distributions of heavy metal concentrations were generated via 1000 iterations of a geostatistical co-simulation approach, and the uncertainties in the heavy metal distributions were quantified from the overall consistency between realizations. Finally, Information-Gap Decision Theory (IGDT) was applied to rank the remediation priorities of contaminated sites in terms of both the spatial consensus of multiple heavy metal realizations and the priority of specific conservation areas. Our results show that the crowd-sourced optimization algorithm developed in this study is effective at selecting suitable records from crowd-sourced data. Using this technique, the available sample data increased to totals of 96, 162, 72, 62, 69 and 62, that is, 2.6, 1.6, 2.5, 1.6, 1.2 and 1.8 times the numbers originally available through the professionally assembled GBIF database. Additionally, for all species considered, models based on the combination of both data sources outperformed, in terms of test-AUC values, models based on a single data source. Furthermore, the additional optimization-selected data lowered the overall variability, and therefore the uncertainty, of the model outputs. Based on the projected species distributions, our results revealed that around 30% of high species hotspot areas were also identified as contaminated. The decision-making tool, IGDT, successfully yielded remediation plans in terms of specific ecological value requirements, false positive tolerance rates for contaminated areas, and expected decision robustness. The proposed approach can be applied both to identify high conservation priority sites contaminated by heavy metals, based on the combination of screened crowd-sourced and professionally collected data, and to make robust remediation decisions.
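
    A toy version of the simulated-annealing screening step might look like the following: select the subset of crowd-sourced records whose environmental covariates best match the distribution of the professionally collected data. The acceptance criterion, cooling schedule, and synthetic data are all assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(3)
        pro   = rng.normal(0.0, 1.0, size=(60, 4))          # professional records
        crowd = np.vstack([rng.normal(0.0, 1.0, (150, 4)),  # plausible crowd records
                           rng.normal(3.0, 1.0, (50, 4))])  # off-distribution records

        def score(mask):
            if not mask.any():
                return np.inf                # never accept an empty selection
            sel = crowd[mask]
            return (np.linalg.norm(sel.mean(0) - pro.mean(0))
                    + np.linalg.norm(sel.std(0) - pro.std(0)))

        mask, T = rng.random(len(crowd)) < 0.5, 1.0
        cur = score(mask)
        for _ in range(5000):
            i = rng.integers(len(crowd))
            mask[i] = ~mask[i]               # propose keeping/dropping a record
            new = score(mask)
            if new < cur or rng.random() < np.exp((cur - new) / T):
                cur = new                    # accept the move
            else:
                mask[i] = ~mask[i]           # revert it
            T *= 0.999                       # geometric cooling schedule
        print("kept:", mask.sum(), "| off-distribution kept:", mask[150:].sum())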

  3. Source-term development for a contaminant plume for use by multimedia risk assessment models

    NASA Astrophysics Data System (ADS)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.

    2000-02-01

    Multimedia modelers from the US Environmental Protection Agency (EPA) and US Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: MEPAS, MMSOILS, PRESTO, and RESRAD. These models represent typical analytically based tools that are used in human-risk and endangerment assessments at installations containing radioactive and hazardous contaminants. The objective is to demonstrate an approach for developing an adequate source term by simplifying an existing, real-world, 90Sr plume at DOE's Hanford installation in Richland, WA, for use in a multimedia benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. Source characteristics and a release mechanism are developed and described; also described is a typical process and procedure that an analyst would follow in developing a source term for using this class of analytical tool in a preliminary assessment.

  4. New VLBI2010 scheduling strategies and implications on the terrestrial reference frames.

    PubMed

    Sun, Jing; Böhm, Johannes; Nilsson, Tobias; Krásná, Hana; Böhm, Sigrid; Schuh, Harald

    In connection with the work for the next generation VLBI2010 Global Observing System (VGOS) of the International VLBI Service for Geodesy and Astrometry, a new scheduling package (Vie_Sched) has been developed at the Vienna University of Technology as a part of the Vienna VLBI Software. In addition to the classical station-based approach it is equipped with a new scheduling strategy based on the radio sources to be observed. We introduce different configurations of source-based scheduling options and investigate the implications on present and future VLBI2010 geodetic schedules. By comparison to existing VLBI schedules of the continuous campaign CONT11, we find that the source-based approach with two sources has a performance similar to the station-based approach in terms of number of observations, sky coverage, and geodetic parameters. For an artificial 16 station VLBI2010 network, the source-based approach with four sources provides an improved distribution of source observations on the celestial sphere. Monte Carlo simulations yield slightly better repeatabilities of station coordinates with the source-based approach with two sources or four sources than the classical strategy. The new VLBI scheduling software with its alternative scheduling strategy offers a promising option with respect to applications of the VGOS.

  5. New VLBI2010 scheduling strategies and implications on the terrestrial reference frames

    NASA Astrophysics Data System (ADS)

    Sun, Jing; Böhm, Johannes; Nilsson, Tobias; Krásná, Hana; Böhm, Sigrid; Schuh, Harald

    2014-05-01

    In connection with the work for the next generation VLBI2010 Global Observing System (VGOS) of the International VLBI Service for Geodesy and Astrometry, a new scheduling package (Vie_Sched) has been developed at the Vienna University of Technology as a part of the Vienna VLBI Software. In addition to the classical station-based approach it is equipped with a new scheduling strategy based on the radio sources to be observed. We introduce different configurations of source-based scheduling options and investigate the implications on present and future VLBI2010 geodetic schedules. By comparison to existing VLBI schedules of the continuous campaign CONT11, we find that the source-based approach with two sources has a performance similar to the station-based approach in terms of number of observations, sky coverage, and geodetic parameters. For an artificial 16 station VLBI2010 network, the source-based approach with four sources provides an improved distribution of source observations on the celestial sphere. Monte Carlo simulations yield slightly better repeatabilities of station coordinates with the source-based approach with two sources or four sources than the classical strategy. The new VLBI scheduling software with its alternative scheduling strategy offers a promising option with respect to applications of the VGOS.

  6. Long-term trends in California mobile source emissions and ambient concentrations of black carbon and organic aerosol.

    PubMed

    McDonald, Brian C; Goldstein, Allen H; Harley, Robert A

    2015-04-21

    A fuel-based approach is used to assess long-term trends (1970-2010) in mobile source emissions of black carbon (BC) and organic aerosol (OA, including both primary emissions and secondary formation). The main focus of this analysis is the Los Angeles Basin, where a long record of measurements is available to infer trends in ambient concentrations of BC and organic carbon (OC), with OC used here as a proxy for OA. Mobile source emissions and ambient concentrations have decreased similarly, reflecting the importance of on- and off-road engines as sources of BC and OA in urban areas. In 1970, the on-road sector accounted for ∼90% of total mobile source emissions of BC and OA (primary + secondary). Over time, as on-road engine emissions have been controlled, the relative importance of off-road sources has grown. By 2010, off-road engines were estimated to account for 37 ± 20% and 45 ± 16% of total mobile source contributions to BC and OA, respectively, in the Los Angeles area. This study highlights both the success of efforts to control on-road emission sources, and the importance of considering off-road engine and other VOC source contributions when assessing long-term emission and ambient air quality trends.

  7. A study of electron density profiles in relation to ionization sources and ground-based radio wave absorption measurements, part 1

    NASA Technical Reports Server (NTRS)

    Gnanalingam, S.; Kane, J. A.

    1973-01-01

    An extensive set of ground-based measurements of the diurnal variation of medium frequency radio wave absorption and virtual height is analyzed in terms of the current understanding of D- and lower E-region ion production and loss processes. When this is done, a gross discrepancy arises, the source of which is not known.

  8. 2005 Defense Base Closure and Realignment Commission Report. Volume 2

    DTIC Science & Technology

    2005-01-01

    known as the Superfund, is the legal framework for the identification, restoration, and transfer of contaminated property. In 1986, CERCLA was revised...place and operating pursuant to an approved remedial design. This allows the transfer prior to complete remediation of contamination, (source: 2005...uncontaminated parcels at closing bases while long-term cleanup of contaminated parcels continues, (source: 2005 BRAC Commission) CERTIFIED DATA P.L

  9. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regards to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  10. Heat source reconstruction from noisy temperature fields using an optimised derivative Gaussian filter

    NASA Astrophysics Data System (ADS)

    Delpueyo, D.; Balandraud, X.; Grédiac, M.

    2013-09-01

    The aim of this paper is to present a post-processing technique based on a derivative Gaussian filter to reconstruct heat source fields from temperature fields measured by infrared thermography. Heat sources can be deduced from temperature variations thanks to the heat diffusion equation. Filtering and differentiating are key-issues which are closely related here because the temperature fields which are processed are unavoidably noisy. We focus here only on the diffusion term because it is the most difficult term to estimate in the procedure, the reason being that it involves spatial second derivatives (a Laplacian for isotropic materials). This quantity can be reasonably estimated using a convolution of the temperature variation fields with second derivatives of a Gaussian function. The study is first based on synthetic temperature variation fields corrupted by added noise. The filter is optimised in order to reconstruct at best the heat source fields. The influence of both the dimension and the level of a localised heat source is discussed. Obtained results are also compared with another type of processing based on an averaging filter. The second part of this study presents an application to experimental temperature fields measured with an infrared camera on a thin plate in aluminium alloy. Heat sources are generated with an electric heating patch glued on the specimen surface. Heat source fields reconstructed from measured temperature fields are compared with the imposed heat sources. Obtained results illustrate the relevancy of the derivative Gaussian filter to reliably extract heat sources from noisy temperature fields for the experimental thermomechanics of materials.
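
    The core operation, estimating a Laplacian by convolving a noisy field with second derivatives of a Gaussian, is directly available in scipy. A minimal sketch on a synthetic field (field values and filter width are illustrative):

        import numpy as np
        from scipy.ndimage import gaussian_filter

        x, y = np.meshgrid(np.linspace(-1, 1, 256), np.linspace(-1, 1, 256))
        theta = np.exp(-(x**2 + y**2) / 0.1)      # smooth synthetic field
        noisy = theta + np.random.default_rng(4).normal(0, 0.01, theta.shape)

        sigma = 5.0                               # filter width, in pixels
        lap = (gaussian_filter(noisy, sigma, order=(2, 0))   # d2/dy2 (axis 0)
               + gaussian_filter(noisy, sigma, order=(0, 2)))  # d2/dx2 (axis 1)
        # `lap` approximates the diffusion term up to the grid-spacing factor
        # 1/h^2; tuning sigma trades noise rejection against spatial
        # resolution, which is what the paper's optimisation addresses.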

  11. The Iterative Reweighted Mixed-Norm Estimate for Spatio-Temporal MEG/EEG Source Reconstruction.

    PubMed

    Strohmeier, Daniel; Bekhti, Yousra; Haueisen, Jens; Gramfort, Alexandre

    2016-10-01

    Source imaging based on magnetoencephalography (MEG) and electroencephalography (EEG) allows for the non-invasive analysis of brain activity with high temporal and good spatial resolution. As the bioelectromagnetic inverse problem is ill-posed, constraints are required. For the analysis of evoked brain activity, spatial sparsity of the neuronal activation is a common assumption. It is often taken into account using convex constraints based on the ℓ1-norm. The resulting source estimates are however biased in amplitude and often suboptimal in terms of source selection due to high correlations in the forward model. In this work, we demonstrate that an inverse solver based on a block-separable penalty with a Frobenius norm per block and an ℓ0.5-quasinorm over blocks addresses both of these issues. For solving the resulting non-convex optimization problem, we propose the iterative reweighted Mixed Norm Estimate (irMxNE), an optimization scheme based on iterative reweighted convex surrogate optimization problems, which are solved efficiently using a block coordinate descent scheme and an active set strategy. We compare the proposed sparse imaging method to the dSPM and the RAP-MUSIC approach based on two MEG data sets. We provide empirical evidence based on simulations and analysis of MEG data that the proposed method improves on the standard Mixed Norm Estimate (MxNE) in terms of amplitude bias, support recovery, and stability.
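
    The reweighting idea behind irMxNE can be shown in its simplest scalar-ℓ1 form: repeatedly solve a weighted convex problem, with weights taken from the previous solution so that small coefficients are penalized harder. The sketch below uses that generic scheme (Candes-Wakin-Boyd reweighting) on a toy problem; the actual irMxNE uses Frobenius-norm blocks over source time courses and a block coordinate descent solver.

        import numpy as np

        def weighted_ista(A, y, lam, w, steps=300):
            L = np.linalg.norm(A, 2) ** 2
            x = np.zeros(A.shape[1])
            for _ in range(steps):
                z = x - A.T @ (A @ x - y) / L
                x = np.sign(z) * np.maximum(np.abs(z) - lam * w / L, 0)
            return x

        def irl1(A, y, lam, outer=5, eps=1e-3):
            w = np.ones(A.shape[1])
            for _ in range(outer):
                x = weighted_ista(A, y, lam, w)
                w = 1.0 / (np.abs(x) + eps)   # small coefficients penalized harder
            return x

        rng = np.random.default_rng(5)
        A = rng.normal(size=(30, 120))
        x0 = np.zeros(120); x0[[7, 70]] = [1.5, -1.0]
        x = irl1(A, A @ x0, lam=0.01)
        print(np.flatnonzero(np.abs(x) > 0.1))   # ideally recovers {7, 70}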

  12. Term amniotic fluid: an unexploited reserve of mesenchymal stromal cells for reprogramming and potential cell therapy applications.

    PubMed

    Moraghebi, Roksana; Kirkeby, Agnete; Chaves, Patricia; Rönn, Roger E; Sitnicka, Ewa; Parmar, Malin; Larsson, Marcus; Herbst, Andreas; Woods, Niels-Bjarne

    2017-08-25

    Mesenchymal stromal cells (MSCs) are currently being evaluated in numerous pre-clinical and clinical cell-based therapy studies. Furthermore, there is an increasing interest in exploring alternative uses of these cells in disease modelling, pharmaceutical screening, and regenerative medicine by applying reprogramming technologies. However, the limited availability of MSCs from various sources restricts their use. Term amniotic fluid has been proposed as an alternative source of MSCs. Previously, only low volumes of term fluid and its cellular constituents have been collected, and current knowledge of the MSCs derived from this fluid is limited. In this study, we collected amniotic fluid at term using a novel collection system and evaluated amniotic fluid MSC content and their characteristics, including their feasibility to undergo cellular reprogramming. Amniotic fluid was collected at term caesarean section deliveries using a closed catheter-based system. Following fluid processing, amniotic fluid was assessed for cellularity, MSC frequency, in-vitro proliferation, surface phenotype, differentiation, and gene expression characteristics. Cells were also reprogrammed to the pluripotent stem cell state and differentiated towards neural and haematopoietic lineages. The average volume of term amniotic fluid collected was approximately 0.4 litres per donor, containing an average of 7 million viable mononuclear cells per litre, and a CFU-F content of 15 per 100,000 MNCs. Expanded CFU-F cultures showed similar surface phenotype, differentiation potential, and gene expression characteristics to MSCs isolated from traditional sources, and showed extensive expansion potential and rapid doubling times. Given the high proliferation rates of these neonatal source cells, we assessed them in a reprogramming application, where the derived induced pluripotent stem cells showed multigerm layer lineage differentiation potential. The potentially large donor base from caesarean section deliveries, the high yield of term amniotic fluid MSCs obtainable, the properties of the MSCs identified, and the suitability of the cells to be reprogrammed into the pluripotent state demonstrated these cells to be a promising and plentiful resource for further evaluation in bio-banking, cell therapy, disease modelling, and regenerative medicine applications.

  13. Long-term variability in bright hard X-ray sources: 5+ years of BATSE data

    NASA Technical Reports Server (NTRS)

    Robinson, C. R.; Harmon, B. A.; McCollough, M. L.; Paciesas, W. S.; Sahi, M.; Scott, D. M.; Wilson, C. A.; Zhang, S. N.; Deal, K. J.

    1997-01-01

    The operation of the Compton Gamma Ray Observatory (CGRO) Burst and Transient Source Experiment (BATSE) continues to provide data for inclusion in a database for the analysis of long-term variability in bright, hard X-ray sources. The all-sky capability of BATSE provides up to 30 flux measurements per day for each source. The long baseline and the various rising and setting occultation flux measurements allow searches for periodic and quasi-periodic signals with periods of between several hours and hundreds of days. Preliminary results from an analysis of the hard X-ray variability of 24 of the brightest BATSE sources are presented. Power density spectra are computed for each source, and profiles are presented of the hard X-ray orbital modulations in some X-ray binaries, together with amplitude modulations and variations in outburst durations and intensities in recurrent X-ray transients.
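
    For unevenly sampled flux series of the kind described here, a standard periodicity-search tool is the Lomb-Scargle periodogram. The sketch below applies astropy's implementation to synthetic occultation-style data; it illustrates the type of analysis, not the authors' pipeline.

        import numpy as np
        from astropy.timeseries import LombScargle

        rng = np.random.default_rng(6)
        t = np.sort(rng.uniform(0, 500, 900))      # irregular sampling, in days
        flux = (1.0 + 0.3 * np.sin(2 * np.pi * t / 33.5)   # hidden 33.5 d period
                + rng.normal(0, 0.2, t.size))              # measurement noise

        freq, power = LombScargle(t, flux).autopower(maximum_frequency=1.0)
        print("best period ~ %.1f d" % (1 / freq[np.argmax(power)]))  # ~33.5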

  14. Sol-gel coated ion sources for liquid chromatography-direct electron ionization mass spectrometry.

    PubMed

    Riboni, Nicolò; Magrini, Laura; Bianchi, Federica; Careri, Maria; Cappiello, Achille

    2017-07-25

    Advances in interfacing liquid chromatography and electron ionization mass spectrometry are presented. New ion source coatings synthesized by sol-gel technology were developed and tested as vaporization surfaces in terms of peak intensity, peak width and peak delay for the liquid chromatography-direct electron ionization mass spectrometry (Direct-EI) determination of environmental pollutants like polycyclic aromatic hydrocarbons and steroids. Silica-, titania-, and zirconia-based coatings were sprayed inside the stainless steel ion source and characterized in terms of thermal stability, film thickness and morphology. Negligible weight losses up to 350-400 °C were observed for all the materials, with coating thicknesses in the 6 (±1)-11 (±2) μm range for an optimal ionization process. The best performances in terms of both peak intensity and peak width were obtained by using the silica-based coating: the detection of the investigated compounds was feasible at low ng μl⁻¹ levels with good precision (RSD < 9% for polycyclic aromatic hydrocarbons and <11% for hormones). Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Emergency Preparedness technology support to the Health and Safety Executive (HSE), Nuclear Installations Inspectorate (NII) of the United Kingdom. Appendix A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Kula, K.R.

    1994-03-01

    The Nuclear Installations Inspectorate (NII) of the United Kingdom (UK) suggested the use of an accident progression logic model method developed by Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) for K Reactor to predict the magnitude and timing of radioactivity releases (the source term) based on an advanced logic model methodology. Predicted releases are output from the personal computer-based model in a level-of-confidence format. Additional technical discussions eventually led to a request from the NII to develop a proposal for assembling a similar technology to predict source terms for the UK's advanced gas-cooled reactor (AGR) type. To respond to this request, WSRC is submitting a proposal to provide contractual assistance as specified in the Scope of Work. The work will produce, document, and transfer technology associated with a Decision-Oriented Source Term Estimator for Emergency Preparedness (DOSE-EP) for the NII to apply to AGRs in the United Kingdom. This document, Appendix A, is a part of this proposal.

  16. Fiber-based polarization-sensitive Mueller matrix optical coherence tomography with continuous source polarization modulation.

    PubMed

    Jiao, Shuliang; Todorović, Milos; Stoica, George; Wang, Lihong V

    2005-09-10

    We report on a new configuration of fiber-based polarization-sensitive Mueller matrix optical coherence tomography that permits the acquisition of the round-trip Jones matrix of a biological sample using only one light source and a single depth scan. In this new configuration, a polarization modulator is used in the source arm to continuously modulate the incident polarization state for both the reference and the sample arms. The Jones matrix of the sample can be calculated from the two frequency terms in the two detection channels. The first term is modulated by the carrier frequency, which is determined by the longitudinal scanning mechanism, whereas the other term is modulated by the beat frequency between the carrier frequency and the second harmonic of the modulation frequency of the polarization modulator. One important feature of this system is that, for the first time to our knowledge, the Jones matrix of the sample can be calculated with a single detection channel and a single measurement when diattenuation is negligible. The system was successfully tested by imaging both standard polarization elements and biological samples.

  17. Open-source hardware for medical devices

    PubMed Central

    2016-01-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reduced costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device. PMID:27158528

  18. Open-source hardware for medical devices.

    PubMed

    Niezen, Gerrit; Eslambolchilar, Parisa; Thimbleby, Harold

    2016-04-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reduced costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device.

  19. Spent fuel radionuclide source-term model for assessing spent fuel performance in geological disposal. Part I: Assessment of the instant release fraction

    NASA Astrophysics Data System (ADS)

    Johnson, Lawrence; Ferry, Cécile; Poinssot, Christophe; Lovera, Patrick

    2005-11-01

    A source-term model for the short-term release of radionuclides from spent nuclear fuel (SNF) has been developed. It provides quantitative estimates of the fraction of various radionuclides that are expected to be released rapidly (the instant release fraction, or IRF) when water contacts the UO2 or MOX fuel after container breaching in a geological repository. The estimates are based on correlation of leaching data for radionuclides with fuel burnup and fission gas release. Extrapolation of the data to higher fuel burnup values is based on examination of data on fuel restructuring, such as rim development, and on fission gas release data, which permits bounding IRF values to be estimated assuming that radionuclide releases will be less than fission gas release. The consideration of long-term solid-state changes influencing the IRF prior to canister breaching is addressed by evaluating alpha self-irradiation enhanced diffusion, which may gradually increase the accumulation of fission products at grain boundaries.

  20. Regularized Dual Averaging Image Reconstruction for Full-Wave Ultrasound Computed Tomography.

    PubMed

    Matthews, Thomas P; Wang, Kun; Li, Cuiping; Duric, Neb; Anastasio, Mark A

    2017-05-01

    Ultrasound computed tomography (USCT) holds great promise for breast cancer screening. Waveform inversion-based image reconstruction methods account for higher order diffraction effects and can produce high-resolution USCT images, but are computationally demanding. Recently, a source encoding technique has been combined with stochastic gradient descent (SGD) to greatly reduce image reconstruction times. However, this method bundles the stochastic data fidelity term with the deterministic regularization term. This limitation can be overcome by replacing SGD with a structured optimization method, such as the regularized dual averaging method, that exploits knowledge of the composition of the cost function. In this paper, the dual averaging method is combined with source encoding techniques to improve the effectiveness of regularization while maintaining the reduced reconstruction times afforded by source encoding. It is demonstrated that each iteration can be decomposed into a gradient descent step based on the data fidelity term and a proximal update step corresponding to the regularization term. Furthermore, the regularization term is never explicitly differentiated, allowing nonsmooth regularization penalties to be naturally incorporated. The wave equation is solved by the use of a time-domain method. The effectiveness of this approach is demonstrated through computer simulation and experimental studies. The results suggest that the dual averaging method can produce images with less noise and comparable resolution to those obtained by the use of SGD.
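
    The structure described, a gradient step on the data-fidelity term plus a proximal update for the regularizer, can be sketched with a generic ℓ1-regularized dual averaging loop (after Xiao's RDA method) on a toy linear problem. In the paper the stochastic gradient comes from a full-wave, source-encoded solver; here it comes from random rows of a matrix, and all parameter values are illustrative.

        import numpy as np

        def rda_l1(A, y, lam=0.05, gamma=5.0, iters=2000, batch=8, seed=7):
            rng = np.random.default_rng(seed)
            x = np.zeros(A.shape[1])
            gbar = np.zeros_like(x)              # running average of gradients
            for t in range(1, iters + 1):
                idx = rng.integers(0, A.shape[0], batch)  # stochastic rows,
                g = A[idx].T @ (A[idx] @ x - y[idx]) / batch  # ~ encoded sources
                gbar += (g - gbar) / t
                # Proximal step of the averaged linear model + l1 penalty;
                # it has the closed-form soft-threshold solution below.
                x = -(np.sqrt(t) / gamma) * np.sign(gbar) \
                    * np.maximum(np.abs(gbar) - lam, 0)
            return x

        A = np.random.default_rng(8).normal(size=(100, 50))
        x0 = np.zeros(50); x0[[3, 30]] = [1.0, -2.0]
        x = rda_l1(A, A @ x0)
        print(np.argsort(-np.abs(x))[:4])        # true support {3, 30} should lead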

  1. Analysis of jet-airfoil interaction noise sources by using a microphone array technique

    NASA Astrophysics Data System (ADS)

    Fleury, Vincent; Davy, Renaud

    2016-03-01

    The paper is concerned with the characterization of jet noise sources and jet-airfoil interaction sources by using microphone array data. The measurements were carried out in the anechoic open test section wind tunnel of Onera, Cepra19. The microphone array technique relies on the convected Lighthill's and Ffowcs-Williams and Hawkings' acoustic analogy equation. The cross-spectrum of the source term of the analogy equation is sought. It is defined as the optimal solution to a minimal-error equation using the measured microphone cross-spectra as reference. This inverse problem is, however, ill-posed. A penalty term based on a localization operator is therefore added to improve the recovery of jet noise sources (a generic analogue is sketched below). The analysis of isolated jet noise data in the subsonic regime shows the contribution of the conventional mixing noise source in the low frequency range, as expected, and of uniformly distributed, uncorrelated noise sources in the jet flow at higher frequencies. In the underexpanded supersonic regime, a shock-associated noise source is clearly identified, too. An additional source is detected in the vicinity of the nozzle exit in both supersonic and subsonic regimes. In the presence of the airfoil, the distribution of the noise sources is deeply modified. In particular, a strong noise source is localized on the flap. For Strouhal numbers higher than about 2 (based on the jet mixing velocity and diameter), a significant contribution from the shear layer near the flap is observed, too. Indications of acoustic reflections on the airfoil are also discerned.
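
    A generic quadratic analogue of such a penalized inversion has a closed form, which makes the role of the localization operator easy to see. In the sketch below, the propagation matrix, the diagonal localization weights, and the regularization strength are all hypothetical stand-ins.

        import numpy as np

        rng = np.random.default_rng(9)
        # Hypothetical complex propagators from 40 source points to 64 mics.
        G = rng.normal(size=(64, 40)) + 1j * rng.normal(size=(64, 40))
        q_true = np.zeros(40); q_true[18:22] = [1, 2, 2, 1]   # compact source
        y = G @ q_true + 0.05 * (rng.normal(size=64) + 1j * rng.normal(size=64))

        w = np.ones(40); w[15:25] = 0.1   # cheap inside the expected jet region
        L = np.diag(w)                    # localization operator (assumed form)
        mu = 1.0
        # Regularized least squares min ||Gq - y||^2 + mu*||Lq||^2, solved
        # in closed form via the normal equations:
        q = np.linalg.solve(G.conj().T @ G + mu * L.T @ L, G.conj().T @ y)
        print(np.round(np.abs(q[16:24]), 2))  # energy concentrates near truth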

  2. Source terms, shielding calculations and soil activation for a medical cyclotron.

    PubMed

    Konheiser, J; Naumann, B; Ferrari, A; Brachem, C; Müller, S E

    2016-12-01

    Calculations of the shielding and estimates of soil activation for a medical cyclotron are presented in this work. Based on the neutron source term from the 18O(p,n)18F reaction produced by a 28 MeV proton beam, neutron and gamma dose rates outside the building were estimated with the Monte Carlo code MCNP6 (Goorley et al 2012 Nucl. Technol. 180 298-315). The neutron source term was calculated with the MCNP6 code and FLUKA (Ferrari et al 2005 INFN/TC_05/11, SLAC-R-773) code as well as with supplied data by the manufacturer. MCNP and FLUKA calculations yielded comparable results, while the neutron yield obtained using the manufacturer-supplied information is about a factor of 5 smaller. The difference is attributed to the missing channels in the manufacturer-supplied neutron source terms which considers only the 18O(p,n)18F reaction, whereas the MCNP and FLUKA calculations include additional neutron reaction channels. Soil activation was performed using the FLUKA code. The estimated dose rate based on MCNP6 calculations in the public area is about 0.035 µSv h⁻¹ and thus significantly below the reference value of 0.5 µSv h⁻¹ (2011 Strahlenschutzverordnung, 9 Auflage vom 01.11.2011, Bundesanzeiger Verlag). After 5 years of continuous beam operation and a subsequent decay time of 30 d, the activity concentration of the soil is about 0.34 Bq g⁻¹.

  3. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  4. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  5. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  6. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  7. 10 CFR 960.3-1-5 - Basis for site evaluations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... comparative evaluations of sites in terms of the capabilities of the natural barriers for waste isolation and.... Comparative site evaluations shall place primary importance on the natural barriers of the site. In such... only to the extent necessary to obtain realistic source terms for comparative site evaluations based on...

  8. 40 CFR 35.3555 - Intended Use Plan (IUP).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... description of the financial planning process undertaken for the Fund and the impact of funding decisions on the long-term financial health of the Fund. (4) Financial status. The IUP must describe the sources... project; the expected terms of financial assistance based on the best information available at the time...

  9. Golden Ratio Versus Pi as Random Sequence Sources for Monte Carlo Integration

    NASA Technical Reports Server (NTRS)

    Sen, S. K.; Agarwal, Ravi P.; Shaykhian, Gholam Ali

    2007-01-01

    We discuss here the relative merits of these numbers as possible random sequence sources. The quality of these sequences is not judged directly based on the outcome of all known tests for the randomness of a sequence. Instead, it is determined implicitly by the accuracy of the Monte Carlo integration in a statistical sense. Since our main motive of using a random sequence is to solve real world problems, it is more desirable if we compare the quality of the sequences based on their performances for these problems in terms of quality/accuracy of the output. We also compare these sources against those generated by a popular pseudo-random generator, viz., the Matlab rand, and the quasi-random generator halton, both in terms of error and time complexity. Our study demonstrates that consecutive blocks of digits of each of these numbers produce a good random sequence source. It is observed that randomly chosen blocks of digits do not have any remarkable advantage over consecutive blocks for the accuracy of the Monte Carlo integration. Also, it reveals that pi is a better source of a random sequence than the golden ratio where the accuracy of the integration is concerned.
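
    The experiment is easy to reproduce in miniature: take consecutive blocks of the decimal digits of pi as uniform(0,1) samples and Monte-Carlo-integrate a test function. The sketch below (block length and sample count are arbitrary choices) estimates the quarter-circle integral, whose exact value is pi/4:

        from mpmath import mp
        import numpy as np

        mp.dps = 60_010                        # working precision, in digits
        s = mp.nstr(mp.pi, 60_006)             # "3.141592..." as a string
        frac = s[2:60_002]                     # 60,000 fractional digits of pi
        u = np.array([int(frac[i:i + 6]) / 1e6    # 6-digit blocks -> U(0,1)
                      for i in range(0, len(frac), 6)])

        est = 4 * np.mean(np.sqrt(1 - u**2))   # 4 * integral of sqrt(1-x^2)
        print(est)                             # should land near 3.14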

  10. Effect of Americium-241 Content on Plutonium Radiation Source Terms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rainisch, R.

    1998-12-28

    The management of excess plutonium by the US Department of Energy includes a number of storage and disposition alternatives. Savannah River Site (SRS) is supporting DOE with plutonium disposition efforts, including the immobilization of certain plutonium materials in a borosilicate glass matrix. Surplus plutonium inventories slated for vitrification include materials with elevated levels of Americium-241. The Am-241 content of plutonium materials generally reflects in-growth of the isotope due to decay of plutonium and is age-dependent. However, select plutonium inventories have Am-241 levels considerably above the age-based levels. Elevated levels of americium significantly impact the radiation source terms of plutonium materials and will make handling of the materials more difficult. Plutonium materials are normally handled in shielded glove boxes, and the work entails both extremity and whole-body exposures. This paper reports results of an SRS analysis of plutonium material source terms versus the Americium-241 content of the materials. Data on the dependence and magnitude of source terms versus Am-241 levels are presented and discussed. The investigation encompasses both vitrified and un-vitrified plutonium oxide (PuO2) batches.

  11. 47 CFR 24.5 - Terms and definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... in the National Geodetic Survey (NGS) data base. (Source: National Geodetic Survey, U.S. Department... antenna site. Base Station. A land station in the land mobile service. Broadband PCS. PCS services.... Fixed Station. A station in the fixed service. Land Mobile Service. A mobile service between base...

  12. 47 CFR 24.5 - Terms and definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... in the National Geodetic Survey (NGS) data base. (Source: National Geodetic Survey, U.S. Department... antenna site. Base Station. A land station in the land mobile service. Broadband PCS. PCS services.... Fixed Station. A station in the fixed service. Land Mobile Service. A mobile service between base...

  13. 47 CFR 24.5 - Terms and definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... in the National Geodetic Survey (NGS) data base. (Source: National Geodetic Survey, U.S. Department... antenna site. Base Station. A land station in the land mobile service. Broadband PCS. PCS services.... Fixed Station. A station in the fixed service. Land Mobile Service. A mobile service between base...

  14. 47 CFR 24.5 - Terms and definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... in the National Geodetic Survey (NGS) data base. (Source: National Geodetic Survey, U.S. Department... antenna site. Base Station. A land station in the land mobile service. Broadband PCS. PCS services.... Fixed Station. A station in the fixed service. Land Mobile Service. A mobile service between base...

  15. Characterizing SRAM Single Event Upset in Terms of Single and Double Node Charge Collection

    NASA Technical Reports Server (NTRS)

    Black, J. D.; Ball, D. R., II; Robinson, W. H.; Fleetwood, D. M.; Schrimpf, R. D.; Reed, R. A.; Black, D. A.; Warren, K. M.; Tipton, A. D.; Dodd, P. E.; hide

    2008-01-01

    A well-collapse source-injection mode for SRAM SEU is demonstrated through TCAD modeling. The recovery of the SRAM's state is shown to be based upon the resistive path from the p+-sources in the SRAM to the well. Multiple cell upset patterns for direct charge collection and the well-collapse source-injection mechanisms are then predicted and compared to recent SRAM test data.

  16. Open source acceleration of wave optics simulations on energy efficient high-performance computing platforms

    NASA Astrophysics Data System (ADS)

    Beck, Jeffrey; Bos, Jeremy P.

    2017-05-01

    We compare several modifications to the open-source wave optics package, WavePy, intended to improve execution time. Specifically, we compare the relative performance of the Intel MKL, a CPU-based OpenCV distribution, and a GPU-based version. Performance is compared between distributions both on the same compute platform and between a fully-featured computing workstation and the NVIDIA Jetson TX1 platform. Comparisons are drawn in terms of both execution time and power consumption. We have found that substituting the Fast Fourier Transform operation from OpenCV provides a marked improvement on all platforms. In addition, we show that embedded platforms offer some possibility for extensive improvement in terms of efficiency compared to a fully featured workstation.
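
    The kind of backend substitution examined here can be benchmarked in a few lines. The sketch below times a 2D FFT, the dominant cost in split-step wave-optics propagation, under two interchangeable backends, with numpy and scipy standing in for the MKL, OpenCV, and GPU builds compared in the paper.

        import time
        import numpy as np
        import scipy.fft

        # A complex field of the size typical for wave-optics phase screens.
        x = (np.random.default_rng(10).normal(size=(2048, 2048))
             + 1j * np.random.default_rng(11).normal(size=(2048, 2048)))

        for name, fft2 in [("numpy", np.fft.fft2), ("scipy", scipy.fft.fft2)]:
            t0 = time.perf_counter()
            for _ in range(10):
                y = fft2(x)
            print(f"{name}: {(time.perf_counter() - t0) / 10:.3f} s per 2048^2 FFT")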

  17. Equivalent charge source model based iterative maximum neighbor weight for sparse EEG source localization.

    PubMed

    Xu, Peng; Tian, Yin; Lei, Xu; Hu, Xiao; Yao, Dezhong

    2008-12-01

    How to localize neural electric activity within the brain effectively and precisely from scalp electroencephalogram (EEG) recordings is a critical issue in clinical neurology and cognitive neuroscience. In this paper, based on the charge source model and the iterative re-weighted strategy, a new maximum-neighbor-weight-based iterative sparse source imaging method is proposed, termed CMOSS (Charge source model based Maximum neighbOr weight Sparse Solution). Different from the weight used in the focal underdetermined system solver (FOCUSS), where the weight for each point in the discrete solution space is updated independently across iterations, the newly designed weight for each point in each iteration is determined by the previous iteration's source solution at both the point and its neighbors. Using such a weight, the next iteration has a better chance of rectifying the local source location bias present in the previous iteration's solution. Simulation studies comparing CMOSS with FOCUSS and LORETA for various source configurations were conducted on a realistic 3-shell head model, and the results confirmed the validity of CMOSS for sparse EEG source localization. Finally, CMOSS was applied to localize sources elicited in a visual stimuli experiment, and the result was consistent with the source areas involved in visual processing reported in previous studies.

  18. Estimating Differences in Area-Level Impacts of Various Recruiting Resources: Can Different Recruiting Areas and Years Be Pooled?

    DTIC Science & Technology

    1983-08-01

    Local Leads (Qualified and Interested) from LAMS Advertising (Based on FY82 Experience) Table 7 - Long Term Elasticities for Navy-Sourced NOIC Leads...Area Level Elasticities for Total NOIC Leads (Regardless of Source of Advertising) for FY79, FY80 (FY80: 146,465) Appendix - Table la - Comparison of...of national leads (e.g., NOIC leads from a Navy source or from Joint DOD advertising (JADOR) sources), and for local leads. An Appendix

  19. Sources for Developing a Theory of Visual Literacy.

    ERIC Educational Resources Information Center

    Hortin, John A.

    Organized as a bibliographic essay, this paper examines the many sources available for developing a theory of visual literacy. Several definitions are offered in order to clarify the meaning of the term "visual literacy" so that meaningful research can be conducted on the topic. Based on the review of resources, three recommendations are offered…

  20. Assessment of macroseismic intensity in the Nile basin, Egypt

    NASA Astrophysics Data System (ADS)

    Fergany, Elsayed

    2018-01-01

    This work assesses deterministic seismic hazard and risk along the Egyptian Nile basin in terms of the maximum expected intensity map. A seismic source zone model of Egypt was delineated based on an updated compatible earthquake catalog from 2015, focal mechanisms, and common tectonic elements. Four effective seismic source zones were identified along the Nile basin. The observed macroseismic intensity data along the basin were used to develop an intensity prediction equation defined in terms of moment magnitude. The expected maximum intensity map was then derived from the developed intensity prediction equation, the identified effective seismic source zones, and the maximum expected magnitude for each zone along the basin. The earthquake hazard and risk were discussed and analyzed in view of the maximum expected moment magnitude and the maximum expected intensity values for each effective source zone. Moderate expected magnitudes are likely to pose high risk to the Cairo and Aswan regions. The results of this study may serve as a recommendation for the planners in charge of mitigating seismic risk in these strategic zones of Egypt.

  1. Towards next generation time-domain diffuse optics devices

    NASA Astrophysics Data System (ADS)

    Dalla Mora, Alberto; Contini, Davide; Arridge, Simon R.; Martelli, Fabrizio; Tosi, Alberto; Boso, Gianluca; Farina, Andrea; Durduran, Turgut; Martinenghi, Edoardo; Torricelli, Alessandro; Pifferi, Antonio

    2015-03-01

    Diffuse optics is growing in terms of applications, ranging from oximetry to mammography, molecular imaging, quality assessment of food and pharmaceuticals, wood optics, and the physics of random media. Time-domain (TD) approaches, although appealing in terms of quantitation and depth sensitivity, are presently limited to large fiber-based systems with a limited number of source-detector pairs. We present a miniaturized TD source-detector probe embedding integrated laser sources and single-photon detectors. Some electronics are still external (e.g. power supply, pulse generators, timing electronics), yet full integration on-board using already proven technologies is feasible. The novel devices were successfully validated on heterogeneous phantoms, showing performances comparable to large state-of-the-art TD rack-based systems. With an investigation based on simulations, we provide numerical evidence that the possibility to stack many TD compact source-detector pairs in a dense, null source-detector distance arrangement could yield about one decade higher contrast on the brain cortex as compared to a continuous wave (CW) approach. Further, a 3-fold increase in the maximum depth (down to 6 cm) is estimated, opening accessibility to new organs such as the lung or the heart. Finally, these new technologies show the way towards compact and wearable TD probes with orders-of-magnitude reductions in size and cost, for widespread use of TD devices in real life.

  2. Comparison of the landslide susceptibility models in Taipei Water Source Domain, Taiwan

    NASA Astrophysics Data System (ADS)

    WU, C. Y.; Yeh, Y. C.; Chou, T. H.

    2017-12-01

    The Taipei Water Source Domain, located southeast of the Taipei metropolis, is the main source of water in this region. Recently, downstream turbidity has often soared significantly during typhoon periods because of upstream landslides. Landslide susceptibilities should therefore be analysed to assess the zones of influence of different rainfall events and to ensure the ability of this domain to supply sufficient, high-quality water. Generally, landslide susceptibility models can be established based on either a long-term landslide inventory or a specified landslide event. In some areas there is no long-term landslide inventory, so event-based landslide susceptibility models are widely established. However, inventory-based and event-based landslide susceptibility models may produce dissimilar susceptibility maps for the same area. The purposes of this study were therefore to compare the landslide susceptibility maps derived from inventory-based and event-based models, and to interpret how to select a representative event for inclusion in the susceptibility model. The landslide inventory from Typhoon Tim (July 1994) and Typhoon Soudelor (August 2015) was collected and used to establish the inventory-based landslide susceptibility model. The landslides caused by Typhoon Nari and the corresponding rainfall data were used to establish the event-based model. The results indicated that the high-susceptibility slope units were located in the middle and upper Nan-Shih Stream basin.

  3. A Parameter Identification Method for Helicopter Noise Source Identification and Physics-Based Semi-Empirical Modeling

    NASA Technical Reports Server (NTRS)

    Greenwood, Eric, II; Schmitz, Fredric H.

    2010-01-01

    A new physics-based parameter identification method for rotor harmonic noise sources is developed using an acoustic inverse simulation technique. This new method allows for the identification of individual rotor harmonic noise sources and allows them to be characterized in terms of their individual non-dimensional governing parameters. This new method is applied to both wind tunnel measurements and ground noise measurements of two-bladed rotors. The method is shown to match the parametric trends of main rotor Blade-Vortex Interaction (BVI) noise, allowing accurate estimates of BVI noise to be made for operating conditions based on a small number of measurements taken at different operating conditions.

  4. A method for the development of disease-specific reference standards vocabularies from textual biomedical literature resources

    PubMed Central

    Wang, Liqin; Bray, Bruce E.; Shi, Jianlin; Fiol, Guilherme Del; Haug, Peter J.

    2017-01-01

    Objective Disease-specific vocabularies are fundamental to many knowledge-based intelligent systems and applications such as text annotation, cohort selection, disease diagnostic modeling, and therapy recommendation. Reference standards are critical in the development and validation of automated methods for disease-specific vocabularies. The goal of the present study is to design and test a generalizable method for developing vocabulary reference standards from expert-curated, disease-specific biomedical literature resources. Methods We formed disease-specific corpora from literature resources such as textbooks, evidence-based synthesized online sources, clinical practice guidelines, and journal articles. Medical experts annotated and adjudicated disease-specific terms in four classes (i.e., causes or risk factors, signs or symptoms, diagnostic tests or results, and treatment). Annotations were mapped to UMLS concepts. We assessed source variation, the contribution of each source to the disease-specific vocabularies, the saturation of the vocabularies with respect to the number of sources used, and the generalizability of the method across diseases. Results The study resulted in 2588 string-unique annotations for heart failure in four classes, and 193 and 425, respectively, for pulmonary embolism and rheumatoid arthritis in the treatment class. Approximately 80% of the annotations were mapped to UMLS concepts. The agreement among heart failure sources ranged between 0.28 and 0.46. The contribution of these sources to the final vocabulary ranged between 18% and 49%. With the sources explored, the heart failure vocabulary reached near saturation in all four classes with the inclusion of as few as six sources (or between four and seven sources if counting only terms that occurred in two or more sources). It took fewer sources to reach near saturation for the other two diseases in the treatment class. Conclusions We developed a method for the development of disease-specific reference vocabularies. Expert-curated biomedical literature resources are a substantial resource for acquiring disease-specific medical knowledge. It is feasible to reach near saturation in a disease-specific vocabulary using a relatively small number of literature sources. PMID:26971304
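
    The saturation analysis lends itself to a compact sketch: add sources one at a time and track how many new unique terms each contributes. The data below are synthetic placeholders; only the bookkeeping mirrors the study.

        import numpy as np

        rng = np.random.default_rng(12)
        vocab = [f"term{i}" for i in range(500)]        # hypothetical concept pool
        sources = [set(rng.choice(vocab, size=rng.integers(80, 200),
                                  replace=False))
                   for _ in range(8)]                   # 8 expert-curated sources

        seen = set()
        for k, terms in enumerate(sources, 1):
            new = terms - seen
            seen |= terms
            print(f"after {k} sources: {len(seen)} terms (+{len(new)} new)")
        # Saturation is reached when the "+new" increments become negligible.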

  5. Perspectives on Performance-Based Incentive Plans.

    ERIC Educational Resources Information Center

    Duttweiler, Patricia Cloud; Ramos-Cancel, Maria L.

    This document is a synthesis of the current literature on performance-based incentive systems for teachers and administrators. Section one provides an introduction to the reform movement and to performance-based pay initiatives; a definition of terms; a brief discussion of funding sources; a discussion of compensation strategies; a description of…

  6. Modeling of Radiotherapy Linac Source Terms Using ARCHER Monte Carlo Code: Performance Comparison for GPU and MIC Parallel Computing Devices

    NASA Astrophysics Data System (ADS)

    Lin, Hui; Liu, Tianyu; Su, Lin; Bednarz, Bryan; Caracappa, Peter; Xu, X. George

    2017-09-01

    Monte Carlo (MC) simulation is well recognized as the most accurate method for radiation dose calculations. For radiotherapy applications, accurate modelling of the source term, i.e., the clinical linear accelerator, is critical to the simulation. The purpose of this paper is to perform source modelling, examine the accuracy and performance of the models on Intel Many Integrated Core coprocessors (aka Xeon Phi) and Nvidia GPUs using ARCHER, and explore potential optimization methods. Phase-space-based source modelling has been implemented. Good agreement was found in a tomotherapy prostate patient case and a TrueBeam breast case. In terms of performance, the whole simulation took about 173 s for the prostate plan and 73 s for the breast plan at 1% statistical error.
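
    As a concrete illustration of what a phase-space source means in practice, the sketch below resamples pre-recorded particle records (energy, position, direction cosines, statistical weight) at a plane below the linac head to feed a transport loop. The record layout and distributions here are hypothetical, not ARCHER's actual file format.

```python
import numpy as np

# Hypothetical phase-space records at a scoring plane below the linac head.
rng = np.random.default_rng(seed=42)
n_records = 100_000
phase_space = {
    "energy": rng.gamma(2.0, 1.5, n_records),   # MeV, illustrative spectrum
    "x":      rng.normal(0.0, 2.0, n_records),  # cm
    "y":      rng.normal(0.0, 2.0, n_records),  # cm
    "u":      rng.normal(0.0, 0.05, n_records), # direction cosine, x
    "v":      rng.normal(0.0, 0.05, n_records), # direction cosine, y
    "weight": np.ones(n_records),
}

def sample_source(n_histories):
    """Draw source particles for transport by resampling phase-space
    records with probability proportional to their statistical weights."""
    p = phase_space["weight"] / phase_space["weight"].sum()
    idx = rng.choice(n_records, size=n_histories, p=p)
    return {key: values[idx] for key, values in phase_space.items()}

batch = sample_source(10_000)
print(f"mean source energy: {batch['energy'].mean():.2f} MeV")
```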

  7. Midfield wireless powering of subwavelength autonomous devices.

    PubMed

    Kim, Sanghoek; Ho, John S; Poon, Ada S Y

    2013-05-17

    We obtain an analytical bound on the efficiency of wireless power transfer to a weakly coupled device. The optimal source is solved for a multilayer geometry in terms of a representation based on the field equivalence principle. The theory reveals that optimal power transfer exploits the properties of the midfield to achieve efficiencies far greater than conventional coil-based designs. As a physical realization of the source, we present a slot array structure whose performance closely approaches the theoretical bound.

  8. Management of Ultimate Risk of Nuclear Power Plants by Source Terms - Lessons Learned from the Chernobyl Accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genn Saji

    2006-07-01

    The term 'ultimate risk' is used here to describe the probabilities and radiological consequences that should be incorporated in siting, containment design, and accident management of nuclear power plants for hypothetical accidents. It is closely related to the source terms specified in siting criteria, which assure an adequate separation of the radioactive inventories of the plants from the public in the event of a hypothetical and severe accident. The author would like to point out that current source terms, which are based on information from the Windscale accident (1957) through TID-14844, are very outdated and do not incorporate lessons learned from either the Three Mile Island (TMI, 1979) or the Chernobyl (1986) accident, two of the most severe accidents ever experienced. As a result of the observations of benign radionuclides released at TMI, the technical community in the US felt that a more realistic evaluation of severe reactor accident source terms was necessary. Against this background, the 'source term research project' was organized in 1984 to respond to these challenges. Unfortunately, soon after the final report from this project was released, the Chernobyl accident occurred. Due to the enormous consequences of that accident, the once-optimistic prospects of establishing a more realistic source term were completely shattered. The Chernobyl accident, with its human death toll and the dispersion of a large part of the fission fragment inventories into the environment, significantly degraded the public's acceptance of nuclear energy throughout the world. In spite of this, nuclear communities have been prudent in responding to the public's anxiety about the ultimate safety of nuclear plants, since many unknowns still remained around the mechanism of the Chernobyl accident. In order to resolve some of these mysteries, the author has performed a scoping study of the dispersion and deposition mechanisms of fuel particles and fission fragments during the initial phase of the Chernobyl accident. Through this study, it is now possible to generally reconstruct the radiological consequences by using a dispersion calculation technique combined with the meteorological data at the time of the accident and the land contamination densities of ¹³⁷Cs measured and reported around the Chernobyl area. Although it is challenging to incorporate lessons learned from the Chernobyl accident into the source term issues, the author has already developed an example of safety goals incorporating the radiological consequences of the accident. The example provides safety goals by specifying source term releases in a graded approach in combination with probabilities, i.e., risks. The author believes that future source term specifications should be directly linked with safety goals. (author)

  9. A hemispherical Langmuir probe array detector for angular resolved measurements on droplet-based laser-produced plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gambino, Nadia, E-mail: gambinon@ethz.ch; Brandstätter, Markus; Rollinger, Bob

    2014-09-15

    In this work, a new diagnostic tool for laser-produced plasmas (LPPs) is presented. The detector is based on an array of six motorized Langmuir probes. It allows the dynamics of an LPP to be measured in terms of charged-particle detection, with particular attention to droplet-based LPP sources for EUV lithography. The system design permits temporal resolution of the angular and radial plasma charge distribution and a hemispherical mapping of the ions and electrons around the droplet plasma. Understanding these dynamics is fundamental to improving debris mitigation techniques for droplet-based LPP sources. The device has been developed, built, and employed at the Laboratory for Energy Conversion, ETH Zürich. The experimental results have been obtained on the droplet-based LPP source ALPS II. For the first time, 2D mappings of the ion kinetic energy distribution around the droplet plasma have been obtained with an array of multiple Langmuir probes. These measurements show an anisotropic expansion of the ions in terms of kinetic energy and amount of ion charge around the droplet target. First estimates of the plasma density and electron temperature were also obtained from the analysis of the probe current signals.

  10. Unsupervised Segmentation of Head Tissues from Multi-modal MR Images for EEG Source Localization.

    PubMed

    Mahmood, Qaiser; Chodorowski, Artur; Mehnert, Andrew; Gellermann, Johanna; Persson, Mikael

    2015-08-01

    In this paper, we present and evaluate an automatic unsupervised segmentation method, the hierarchical segmentation approach (HSA) with Bayesian-based adaptive mean shift (BAMS), for use in the construction of a patient-specific head conductivity model for electroencephalography (EEG) source localization. It is based on an HSA and BAMS for segmenting the tissues from multi-modal magnetic resonance (MR) head images. The proposed method was evaluated both directly, in terms of segmentation accuracy, and indirectly, in terms of source localization accuracy. The direct evaluation was performed relative to a commonly used reference method, brain extraction tool (BET)-FMRIB's automated segmentation tool (FAST), and four variants of the HSA, using both synthetic data and real data from ten subjects. The synthetic data include multiple realizations of four different noise levels and several realizations of typical noise with a 20% bias field level. The Dice index and Hausdorff distance were used to measure segmentation accuracy. The indirect evaluation was performed relative to the reference method BET-FAST using synthetic two-dimensional (2D) multimodal MR data with 3% noise and synthetic EEG (generated for a prescribed source). Source localization accuracy was determined in terms of localization error and relative error of potential. The experimental results demonstrate the efficacy of HSA-BAMS, its robustness to noise and the bias field, and that it provides better segmentation accuracy than the reference method and the variants of the HSA. They also show that it leads to better localization accuracy than the commonly used reference method and suggest that it has potential as a surrogate for expert manual segmentation in the EEG source localization problem.

  11. Source and long-term behavior of transuranic aerosols in the WIPP environment.

    PubMed

    Thakur, P; Lemons, B G

    2016-10-01

    The source and long-term behavior of transuranic aerosols ((239+240)Pu, (238)Pu, and (241)Am) in ambient air samples collected at and near the Waste Isolation Pilot Plant (WIPP) deep geologic repository site were investigated using historical data from an independent monitoring program conducted by the Carlsbad Environmental Monitoring and Research Center and an oversight monitoring program conducted by the management and operating contractor for WIPP at and near the facility. An analysis of historical data indicates frequent detections of (239+240)Pu and (241)Am, whereas (238)Pu is detected infrequently. Peaks in (239+240)Pu and (241)Am concentrations in ambient air generally occur in the March-to-June timeframe, when strong and gusty winds in the area frequently give rise to blowing dust. Long-term measurements of plutonium isotopes (1985-2015) in the WIPP environment suggest that resuspension of previously contaminated soils is likely the primary source of plutonium in ambient air samples from WIPP and its vicinity. There is no evidence that WIPP is a source of environmental contamination that can be considered significant by any health-based standard.

  12. Parameterizing unresolved obstacles with source terms in wave modeling: A real-world application

    NASA Astrophysics Data System (ADS)

    Mentaschi, Lorenzo; Kakoulaki, Georgia; Vousdoukas, Michalis; Voukouvalas, Evangelos; Feyen, Luc; Besio, Giovanni

    2018-06-01

    Parameterizing the dissipative effects of small, unresolved coastal features is fundamental to improving the skill of wave models. The established technique for dealing with this problem consists of reducing the amount of energy advected within the propagation scheme, and is currently available only for regular grids. To find a more general approach, Mentaschi et al. (2015b) formulated a technique based on source terms and validated it on synthetic case studies. This technique separates the parameterization of the unresolved features from the energy advection, and can therefore be applied to any numerical scheme and to any type of mesh. Here we developed an open-source library for estimating the transparency coefficients needed by this approach from bathymetric data, for any type of mesh. The spectral wave model WAVEWATCH III was used to show that in a real-world domain, such as the Caribbean Sea, the proposed approach has skill comparable to, and sometimes better than, the established propagation-based technique.
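
    To make the idea concrete, the sketch below shows one plausible shape such a source term can take (a simplified illustration, not the exact formulation of the paper or its library): spectral energy crossing a grid cell is drained at a rate set by the cell's opaque fraction (1 - alpha), with the transparency coefficient alpha assumed to come from high-resolution bathymetry.

```python
# Illustrative sink term mimicking sub-grid blocking by unresolved
# obstacles: energy E crossing a cell of size dx at group velocity cg
# is attenuated in proportion to the opaque fraction (1 - alpha).
def obstacle_sink(E, cg, alpha, dx):
    """Energy sink rate (per unit time); alpha = 1 means fully open."""
    return -cg * (1.0 - alpha) / dx * E

E, cg, dx = 1.0, 10.0, 5000.0       # J/m^2, m/s, m (illustrative values)
for alpha in (1.0, 0.8, 0.5):       # fully transparent through half-blocked
    print(alpha, obstacle_sink(E, cg, alpha, dx))
```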

  13. Certification of Public Librarians in the United States. A Detailed Summary of Legally Mandated and Voluntary Certification Plans for Public Librarians Based on Information Supplied by the Various Certificating State Agencies or Other Appropriate Sources. 3rd Edition.

    ERIC Educational Resources Information Center

    Coe, Mary J., Ed.

    This report contains summaries of legally mandated and voluntary certification plans for public librarians in the United States based on information supplied by the various certifying state agencies or other appropriate sources in April 1979. Each plan is identified by the descriptive terms "mandatory" (certification required by law--23 states),…

  14. An Ultradeep Chandra Catalog of X-Ray Point Sources in the Galactic Center Star Cluster

    NASA Astrophysics Data System (ADS)

    Zhu, Zhenlin; Li, Zhiyuan; Morris, Mark R.

    2018-04-01

    We present an updated catalog of X-ray point sources in the inner 500″ (∼20 pc) of the Galactic center (GC), where the nuclear star cluster (NSC) stands, based on a total of ∼4.5 Ms of Chandra observations taken from 1999 September to 2013 April. This ultradeep data set offers unprecedented sensitivity for detecting X-ray sources in the GC, down to an intrinsic 2–10 keV luminosity of 1.0 × 10³¹ erg s⁻¹. A total of 3619 sources are detected in the 2–8 keV band, among which ∼3500 are probable GC sources and ∼1300 are new identifications. The GC sources collectively account for ∼20% of the total 2–8 keV flux from the inner 250″ region where detection sensitivity is the greatest. Taking advantage of this unprecedented sample of faint X-ray sources that primarily traces the old stellar populations in the NSC, we revisit global source properties, including long-term variability, cumulative spectra, luminosity function, and spatial distribution. Based on the equivalent width and relative strength of the iron lines, we suggest that in addition to the arguably predominant population of magnetic cataclysmic variables (CVs), nonmagnetic CVs contribute substantially to the detected sources, especially in the lower-luminosity group. On the other hand, the X-ray sources have a radial distribution closely following the stellar mass distribution in the NSC, but much flatter than that of the known X-ray transients, which are presumably low-mass X-ray binaries (LMXBs) caught in outburst. This, together with the very modest long-term variability of the detected sources, strongly suggests that quiescent LMXBs are a minor (less than a few percent) population.

  15. Characterization of the Multi-Blade 10B-based detector at the CRISP reflectometer at ISIS for neutron reflectometry at ESS

    NASA Astrophysics Data System (ADS)

    Piscitelli, F.; Mauri, G.; Messi, F.; Anastasopoulos, M.; Arnold, T.; Glavic, A.; Höglund, C.; Ilves, T.; Lopez Higuera, I.; Pazmandi, P.; Raspino, D.; Robinson, L.; Schmidt, S.; Svensson, P.; Varga, D.; Hall-Wilton, R.

    2018-05-01

    The Multi-Blade is a Boron-10-based gaseous thermal neutron detector developed to face the challenges arising in neutron reflectometry at neutron sources. Neutron reflectometers are challenging instruments in terms of instantaneous counting rate and spatial resolution. This detector has been designed according to the requirements given by the reflectometers at the European Spallation Source (ESS) in Sweden. The Multi-Blade has been installed and tested on the CRISP reflectometer at the ISIS neutron and muon source in the UK. The results of the detailed detector characterization are discussed in this manuscript.

  16. The Relationship between Censorship and the Emotional and Critical Tone of Television News Coverage of the Persian Gulf War.

    ERIC Educational Resources Information Center

    Newhagen, John E.

    1994-01-01

    Analyzes television news stories broadcast during the Persian Gulf War for censorship disclaimers, the censoring source, and the producing network. Discusses results in terms of both production- and viewer-based differences. Considers the question of whether censorship "works" in terms of unanticipated results related to story…

  17. Revisiting the radionuclide atmospheric dispersion event of the Chernobyl disaster - modelling sensitivity and data assimilation

    NASA Astrophysics Data System (ADS)

    Roustan, Yelva; Duhanyan, Nora; Bocquet, Marc; Winiarek, Victor

    2013-04-01

    This paper presents both a sensitivity study of the numerical model and an inverse modelling approach applied to atmospheric dispersion after the Chernobyl disaster. On the one hand, the robustness of source term reconstruction through advanced data assimilation techniques was tested. On the other hand, classical approaches to sensitivity analysis were enhanced by the use of an optimised forcing field, which is otherwise known to be strongly uncertain. The POLYPHEMUS air quality system was used to perform the simulations of radionuclide dispersion. Activity concentrations of iodine-131, caesium-137, and caesium-134, both in air and deposited on the ground, were considered. The impacts of the implemented parameterizations of the physical processes (dry and wet deposition, vertical turbulent diffusion), of the forcing fields (meteorology and source terms), and of the numerical configuration (horizontal resolution) were investigated in the sensitivity study of the model. A four-dimensional variational scheme (4D-Var), based on the approximate adjoint of the chemistry transport model, was used to invert the source term. The data assimilation is performed with measurements of activity concentrations in air extracted from the Radioactivity Environmental Monitoring (REM) database. For most of the investigated configurations (sensitivity study), the statistics comparing model results to field measurements of air concentrations are clearly improved when using a reconstructed source term. For ground-deposited concentrations, an improvement is seen only for satisfactorily modelled episodes. Through these studies, the source term and the meteorological fields are shown to have a major impact on the activity concentrations in air. These studies also reinforce the use of a reconstructed source term instead of the usual estimated one. A more detailed parameterization of the deposition process also seems able to improve the simulation results. For deposited activities the results are more complex, probably due to a strong sensitivity to some of the meteorological fields, which remain quite uncertain.
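
    The core of such an inversion can be illustrated with a linear-Gaussian toy problem (a minimal sketch under assumed dimensions and error statistics, not the POLYPHEMUS/4D-Var implementation): observations y relate to the unknown time-resolved release rates s through a source-receptor matrix H supplied by the transport model, and the variational cost J(s) = (y - Hs)ᵀR⁻¹(y - Hs) + (s - s_b)ᵀB⁻¹(s - s_b) is minimized by solving the normal equations.

```python
import numpy as np

# Minimal sketch of variational source-term inversion (illustrative
# dimensions and error statistics, not the POLYPHEMUS/4D-Var setup).
rng = np.random.default_rng(0)

n_obs, n_src = 50, 10
H = rng.random((n_obs, n_src))              # source-receptor sensitivities
s_true = rng.random(n_src) * 1e15           # true release rates, Bq per step
y = H @ s_true + rng.normal(0.0, 1e13, n_obs)  # noisy air concentrations

s_b = np.full(n_src, 5e14)                  # first-guess (background) source
R_inv = np.eye(n_obs) / 1e13**2             # observation-error precision
B_inv = np.eye(n_src) / 5e14**2             # background-error precision

# Normal equations of the cost J(s); the solution is its unique minimizer.
lhs = H.T @ R_inv @ H + B_inv
rhs = H.T @ R_inv @ y + B_inv @ s_b
s_hat = np.linalg.solve(lhs, rhs)
print(np.round(s_hat / s_true, 2))          # near 1.0 when well constrained
```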

  18. Performance evaluation of a permanent ring magnet based helicon plasma source for negative ion source research

    NASA Astrophysics Data System (ADS)

    Pandey, Arun; Bandyopadhyay, M.; Sudhir, Dass; Chakraborty, A.

    2017-10-01

    Helicon wave heated plasmas are much more efficient in terms of ionization per unit power consumed. A permanent magnet based compact helicon wave heated plasma source has been developed at the Institute for Plasma Research, after careful optimization of the geometry, the frequency of the RF power, and the magnetic field conditions. The HELicon Experiment for Negative ion-I source is the single-driver helicon plasma source being studied for the development of a large, multi-driver negative hydrogen ion source. In this paper, the details of the single-driver machine and the results from the characterization of the device are presented. A parametric study at different pressures and magnetic field values using a 13.56 MHz RF source has been carried out in argon plasma as an initial step towards source characterization. A theoretical model is also presented for the particle and power balance in the plasma. The ambipolar diffusion process taking place in a magnetized helicon plasma is also discussed.

  19. Performance evaluation of a permanent ring magnet based helicon plasma source for negative ion source research.

    PubMed

    Pandey, Arun; Bandyopadhyay, M; Sudhir, Dass; Chakraborty, A

    2017-10-01

    Helicon wave heated plasmas are much more efficient in terms of ionization per unit power consumed. A permanent magnet based compact helicon wave heated plasma source has been developed at the Institute for Plasma Research, after careful optimization of the geometry, the frequency of the RF power, and the magnetic field conditions. The HELicon Experiment for Negative ion-I source is the single-driver helicon plasma source being studied for the development of a large, multi-driver negative hydrogen ion source. In this paper, the details of the single-driver machine and the results from the characterization of the device are presented. A parametric study at different pressures and magnetic field values using a 13.56 MHz RF source has been carried out in argon plasma as an initial step towards source characterization. A theoretical model is also presented for the particle and power balance in the plasma. The ambipolar diffusion process taking place in a magnetized helicon plasma is also discussed.

  20. Martian methane plume models for defining Mars rover methane source search strategies

    NASA Astrophysics Data System (ADS)

    Nicol, Christopher; Ellery, Alex; Lynch, Brian; Cloutis, Ed

    2018-07-01

    The detection of atmospheric methane on Mars implies an active methane source. This introduces the possibility of a biotic source with the implied need to determine whether the methane is indeed biotic in nature or geologically generated. There is a clear need for robotic algorithms which are capable of manoeuvring a rover through a methane plume on Mars to locate its source. We explore aspects of Mars methane plume modelling to reveal complex dynamics characterized by advection and diffusion. A statistical analysis of the plume model has been performed and compared to analyses of terrestrial plume models. Finally, we consider a robotic search strategy to find a methane plume source. We find that gradient-based techniques are ineffective, but that more sophisticated model-based search strategies are unlikely to be available in near-term rover missions.

  1. Method and system of filtering and recommending documents

    DOEpatents

    Patton, Robert M.; Potok, Thomas E.

    2016-02-09

    Disclosed is a method and system for discovering documents using a computer and providing a small set of the most relevant documents to the attention of a human observer. Using the method, the computer obtains a seed document from the user and generates a seed document vector using term frequency-inverse corpus frequency weighting. A keyword index for a plurality of source documents can be compared with the weighted terms of the seed document vector. The comparison is then filtered to reduce the number of documents, which define an initial subset of the source documents. Initial subset vectors are generated and compared to the seed document vector to obtain a similarity value for each comparison. Based on the similarity value, the method then recommends one or more of the source documents.
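
    The ranking pipeline described here can be sketched in a few lines (an illustrative reading of the patent's recipe; the corpus, tokenization, and exact weighting details are assumptions): weight the seed document's terms by term frequency times inverse corpus frequency, then score each source document by cosine similarity against the seed vector and recommend the top matches.

```python
import math
from collections import Counter

# Toy corpus and seed document (illustrative strings, not the patent's data).
corpus = {
    "doc1": "reactor source term release fraction analysis",
    "doc2": "groundwater plume source zone mass flux",
    "doc3": "reactor containment design source term siting",
}
seed = "reactor accident source term evaluation"

def tokens(text):
    return text.lower().split()

n_docs = len(corpus)
doc_freq = Counter(t for text in corpus.values() for t in set(tokens(text)))

def tf_icf(text):
    """Term frequency times (smoothed) inverse corpus frequency."""
    tf = Counter(tokens(text))
    return {t: c * math.log((1 + n_docs) / (1 + doc_freq[t]))
            for t, c in tf.items()}

def cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

seed_vec = tf_icf(seed)
ranked = sorted(corpus, key=lambda d: cosine(seed_vec, tf_icf(corpus[d])),
                reverse=True)
print(ranked)  # source documents most similar to the seed come first
```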

  2. Modeling Interactions Among Turbulence, Gas-Phase Chemistry, Soot and Radiation Using Transported PDF Methods

    NASA Astrophysics Data System (ADS)

    Haworth, Daniel

    2013-11-01

    The importance of explicitly accounting for the effects of unresolved turbulent fluctuations in Reynolds-averaged and large-eddy simulations of chemically reacting turbulent flows is increasingly recognized. Transported probability density function (PDF) methods have emerged as one of the most promising modeling approaches for this purpose. In particular, PDF methods provide an elegant and effective resolution to the closure problems that arise from averaging or filtering terms that correspond to nonlinear point processes, including chemical reaction source terms and radiative emission. PDF methods traditionally have been associated with studies of turbulence-chemistry interactions in laboratory-scale, atmospheric-pressure, nonluminous, statistically stationary nonpremixed turbulent flames; and Lagrangian particle-based Monte Carlo numerical algorithms have been the predominant method for solving modeled PDF transport equations. Recent advances and trends in PDF methods are reviewed and discussed. These include advances in particle-based algorithms, alternatives to particle-based algorithms (e.g., Eulerian field methods), treatment of combustion regimes beyond low-to-moderate-Damköhler-number nonpremixed systems (e.g., premixed flamelets), extensions to include radiation heat transfer and multiphase systems (e.g., soot and fuel sprays), and the use of PDF methods as the basis for subfilter-scale modeling in large-eddy simulation. Examples are provided that illustrate the utility and effectiveness of PDF methods for physics discovery and for applications to practical combustion systems. These include comparisons of results obtained using the PDF method with those from models that neglect unresolved turbulent fluctuations in composition and temperature in the averaged or filtered chemical source terms and/or the radiation heat transfer source terms. In this way, the effects of turbulence-chemistry-radiation interactions can be isolated and quantified.

  3. Survey on the Performance of Source Localization Algorithms.

    PubMed

    Fresno, José Manuel; Robles, Guillermo; Martínez-Tarifa, Juan Manuel; Stewart, Brian G

    2017-11-18

    The localization of emitters using an array of sensors or antennas is a prevalent issue approached in several applications. There exist different techniques for source localization, which can be classified into multilateration, received signal strength (RSS), and proximity methods. The performance of multilateration techniques relies on measured time variables: the time of flight (ToF) of the emission from the emitter to the sensor, the time differences of arrival (TDoA) of the emission between sensors, and the pseudo-time of flight (pToF) of the emission to the sensors. The multilateration algorithms presented and compared in this paper can be classified as iterative and non-iterative methods. Both standard least squares (SLS) and hyperbolic least squares (HLS) are iterative and based on the Newton-Raphson technique to solve the non-linear equation system. The metaheuristic technique particle swarm optimization (PSO), used for source localization, is also studied. This optimization technique estimates the source position as the optimum of an objective function based on HLS and is also iterative in nature. Three non-iterative algorithms, namely the hyperbolic positioning algorithms (HPA), the maximum likelihood estimator (MLE), and the Bancroft algorithm, are also presented. A non-iterative combined algorithm, MLE-HLS, based on MLE and HLS, is further proposed in this paper. The performance of all algorithms is analysed and compared in terms of accuracy in localizing the emitter and in terms of computational time. The analysis is also undertaken with three different sensor layouts, since the positions of the sensors affect the localization; several source positions are also evaluated to make the comparison more robust. The analysis is carried out using theoretical time differences, as well as including errors due to the effect of digital sampling of the time variables. It is shown that the most balanced algorithm, yielding better results than the other algorithms in terms of accuracy and short computational time, is the combined MLE-HLS algorithm.
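
    A minimal HLS-style sketch of TDoA localization follows (illustrative sensor layout and timing noise; solved with SciPy's generic least-squares solver rather than the paper's Newton-Raphson iteration): the residuals are the differences between modelled and measured arrival-time differences referenced to sensor 0.

```python
import numpy as np
from scipy.optimize import least_squares

c = 3e8                                    # propagation speed, m/s
sensors = np.array([[0., 0.], [10., 0.], [0., 10.], [10., 10.]])
source = np.array([3.0, 7.0])              # true emitter position, m

dists = np.linalg.norm(sensors - source, axis=1)
tdoa = (dists - dists[0]) / c              # arrival times relative to sensor 0
tdoa += np.random.default_rng(1).normal(0, 1e-10, tdoa.shape)  # timing noise

def residuals(p):
    """Hyperbolic residuals: modelled minus measured TDoA."""
    d = np.linalg.norm(sensors - p, axis=1)
    return (d - d[0]) / c - tdoa

est = least_squares(residuals, x0=np.array([5.0, 5.0])).x
print(np.round(est, 2))                    # close to the true source (3, 7)
```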

  4. Survey on the Performance of Source Localization Algorithms

    PubMed Central

    2017-01-01

    The localization of emitters using an array of sensors or antennas is a prevalent issue approached in several applications. There exist different techniques for source localization, which can be classified into multilateration, received signal strength (RSS), and proximity methods. The performance of multilateration techniques relies on measured time variables: the time of flight (ToF) of the emission from the emitter to the sensor, the time differences of arrival (TDoA) of the emission between sensors, and the pseudo-time of flight (pToF) of the emission to the sensors. The multilateration algorithms presented and compared in this paper can be classified as iterative and non-iterative methods. Both standard least squares (SLS) and hyperbolic least squares (HLS) are iterative and based on the Newton–Raphson technique to solve the non-linear equation system. The metaheuristic technique particle swarm optimization (PSO), used for source localization, is also studied. This optimization technique estimates the source position as the optimum of an objective function based on HLS and is also iterative in nature. Three non-iterative algorithms, namely the hyperbolic positioning algorithms (HPA), the maximum likelihood estimator (MLE), and the Bancroft algorithm, are also presented. A non-iterative combined algorithm, MLE-HLS, based on MLE and HLS, is further proposed in this paper. The performance of all algorithms is analysed and compared in terms of accuracy in localizing the emitter and in terms of computational time. The analysis is also undertaken with three different sensor layouts, since the positions of the sensors affect the localization; several source positions are also evaluated to make the comparison more robust. The analysis is carried out using theoretical time differences, as well as including errors due to the effect of digital sampling of the time variables. It is shown that the most balanced algorithm, yielding better results than the other algorithms in terms of accuracy and short computational time, is the combined MLE-HLS algorithm. PMID:29156565

  5. Influences on and Obstacles to K-12 Administrators' Support for Environment-Based Education

    ERIC Educational Resources Information Center

    Ernst, Julie

    2012-01-01

    The term environment-based education (EBE) describes a form of school-based environmental education that uses the environment as a context for integrating subjects and a source of real-world learning experiences. Because of the importance of administrator support in teachers' use of EBE (Ernst, 2009), survey research was conducted to explore…

  6. A new traffic model with a lane-changing viscosity term

    NASA Astrophysics Data System (ADS)

    Ko, Hung-Tang; Liu, Xiao-He; Guo, Ming-Min; Wu, Zheng

    2015-09-01

    In this paper, a new continuum traffic flow model is proposed, with a lane-changing source term in the continuity equation and a lane-changing viscosity term in the acceleration equation. Based on previous literature, the source term addresses the impact of the speed difference and density difference between adjacent lanes, which provides better precision for free lane-changing simulation; the viscosity term turns lane-changing behavior into a "force" that may influence the speed distribution. Using a flux-splitting scheme for the model discretization, two cases are investigated numerically. The case with a homogeneous initial condition shows that the numerical results from our model agree well with the analytical ones; the case with a small initial disturbance shows that our model can simulate the evolution of perturbations, including propagation, dissipation, the cluster effect, and the stop-and-go phenomenon. Project supported by the National Natural Science Foundation of China (Grant Nos. 11002035 and 11372147) and the Hui-Chun Chin and Tsung-Dao Lee Chinese Undergraduate Research Endowment (Grant No. CURE 14024).
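
    The role of a lane-changing source term in the continuity equation can be seen in a stripped-down two-lane toy model (an illustrative sketch with constant speed and a simple upwind update, not the paper's model or its flux-splitting scheme): the net inter-lane flow is taken proportional to the density difference, entering one lane's continuity equation as a gain and the other's as a loss.

```python
import numpy as np

# Two-lane continuity update with a lane-changing source term (toy model).
nx, dx, dt, v = 200, 10.0, 0.4, 20.0   # cells, m, s, m/s (constant speed)
k_lc = 0.02                            # lane-changing rate coefficient, 1/s

rho1 = np.full(nx, 0.02); rho1[80:120] = 0.05   # veh/m, lane 1 with a jam
rho2 = np.full(nx, 0.02)                        # veh/m, lane 2

for _ in range(100):
    s = k_lc * (rho2 - rho1)           # source: gain for lane 1, loss for 2
    # first-order upwind advection plus the lane-changing source term;
    # the +s / -s pairing conserves total vehicles across both lanes
    rho1[1:] += -v * dt / dx * (rho1[1:] - rho1[:-1]) + dt * s[1:]
    rho2[1:] += -v * dt / dx * (rho2[1:] - rho2[:-1]) - dt * s[1:]

# the jam advects downstream while the two lanes relax toward each other
print(round(float(rho1.max()), 4), round(float(rho2.max()), 4))
```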

  7. Identification of Spurious Signals from Permeable Ffowcs Williams and Hawkings Surfaces

    NASA Technical Reports Server (NTRS)

    Lopes, Leonard V.; Boyd, David D., Jr.; Nark, Douglas M.; Wiedemann, Karl E.

    2017-01-01

    Integral forms of the permeable surface formulation of the Ffowcs Williams and Hawkings (FW-H) equation often require an input in the form of a near field Computational Fluid Dynamics (CFD) solution to predict noise in the near or far field from various types of geometries. The FW-H equation involves three source terms: two surface terms (monopole and dipole) and a volume term (quadrupole). Many solutions to the FW-H equation, such as several of Farassat's formulations, neglect the quadrupole term. Neglecting the quadrupole term in permeable surface formulations leads to inaccuracies called spurious signals. This paper explores the concept of spurious signals, explains how they are generated by specifying the acoustic and hydrodynamic surface properties individually, and provides methods to determine their presence, regardless of whether a correction algorithm is employed. A potential approach based on the equivalent sources method (ESM) and the sensitivity of Formulation 1A (Formulation S1A) is also discussed for the removal of spurious signals.

  8. Comparison of advanced rechargeable batteries for autonomous underwater vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Descroix, J.P.; Chagnon, G.

    1994-12-31

    For AUVs to be promising in the field of military, oceanic, and scientific missions, it is of great importance that their power sources meet the system needs. In view of this, this article addresses the present and near-term options for electric power sources. The evaluation is based on a hypothetical AUV, with attention to the possible options and the costs involved in manufacturing such power sources. 5 refs.

  9. The numerical dynamic for highly nonlinear partial differential equations

    NASA Technical Reports Server (NTRS)

    Lafon, A.; Yee, H. C.

    1992-01-01

    Problems associated with the numerical computation of highly nonlinear equations in computational fluid dynamics are set forth and analyzed in terms of the potential ranges of spurious behaviors. A reaction-convection equation with a nonlinear source term is employed to evaluate the effects related to spatial and temporal discretizations. The discretization of the source term is described according to several methods, and the various techniques are shown to have a significant effect on the stability of the spurious solutions. Traditional linearized stability analyses cannot provide the level of confidence required for accurate fluid dynamics computations, and the incorporation of nonlinear analysis is proposed. Nonlinear analysis based on nonlinear dynamical systems complements the conventional linear approach and is valuable in the analysis of hypersonic aerodynamics and combustion phenomena.
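
    A toy version of the source-term discretization question (an illustrative scheme with assumed parameters, not the paper's test problem) is sketched below for u_t + a u_x = S(u) with the nonlinear source S(u) = -k u (1 - u): advection is treated with first-order upwinding, while the stiff source is treated either explicitly or point-implicitly via linearization; with this stiff k the explicit variant can exhibit exactly the kind of spurious behavior the paper analyzes.

```python
import numpy as np

# u_t + a u_x = S(u), S(u) = -k u (1 - u); illustrative stiff test case.
nx, a, dx = 100, 1.0, 0.01
k = 500.0                                  # stiff reaction rate
dt = 0.8 * dx / a                          # advective CFL-limited step

def S(u):
    return -k * u * (1.0 - u)

def step(u, implicit):
    un = u.copy()
    adv = -a * dt / dx * (u[1:] - u[:-1])  # first-order upwind advection
    if implicit:
        # point-implicit source, from S(u_new) ~ S(u) + S'(u)(u_new - u):
        #   u_new = u + (adv + dt*S(u)) / (1 - dt*S'(u))
        Sp = -k * (1.0 - 2.0 * u[1:])      # dS/du
        un[1:] = u[1:] + (adv + dt * S(u[1:])) / (1.0 - dt * Sp)
    else:
        # fully explicit source; with dt*k = 4 here this variant can show
        # spurious oscillations and negative undershoots near the front
        un[1:] = u[1:] + adv + dt * S(u[1:])
    return un

u = np.where(np.arange(nx) < nx // 2, 1.0, 0.0)   # step initial condition
for _ in range(50):
    u = step(u, implicit=True)
print(float(u.min()), float(u.max()))      # stays bounded in the implicit form
```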

  10. Monitoring Knowledge Base (MKB)

    EPA Pesticide Factsheets

    The Monitoring Knowledge Base (MKB) is a compilation of emissions measurement and monitoring techniques associated with air pollution control devices, industrial process descriptions, and permitting techniques, including flexible permit development. Using MKB, one can gain a comprehensive understanding of emissions sources, control devices, and monitoring techniques, enabling one to determine appropriate permit terms and conditions.

  11. Source-term development for a contaminant plume for use by multimedia risk assessment models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.

    1999-12-01

    Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools utilized in human risk and endangerment assessments at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.

  12. Seismic hazard assessment over time: Modelling earthquakes in Taiwan

    NASA Astrophysics Data System (ADS)

    Chan, Chung-Han; Wang, Yu; Wang, Yu-Ju; Lee, Ya-Ting

    2017-04-01

    To assess the seismic hazard in Taiwan, including its temporal change, we develop a new approach combining the Brownian Passage Time (BPT) model and the Coulomb stress change, and implement the seismogenic source parameters of the Taiwan Earthquake Model (TEM). The BPT model was adopted to describe the rupture recurrence intervals of the specific fault sources, together with the time elapsed since the last fault rupture, to derive their long-term rupture probability. We also evaluate the short-term seismicity rate change based on the static Coulomb stress interaction between seismogenic sources. By considering these time-dependent factors, our new combined model suggests increased long-term seismic hazard in the vicinity of active faults along the western Coastal Plain and the Longitudinal Valley, where active faults have short recurrence intervals and long elapsed times since their last ruptures, and/or short-term elevated hazard levels right after the occurrence of large earthquakes due to the stress triggering effect. The stress enhanced by the February 6th, 2016, Meinong ML 6.6 earthquake also significantly increased the rupture probabilities of several neighbouring seismogenic sources in southwestern Taiwan and raised the hazard level for the near future. Our approach draws on the advantage of incorporating long- and short-term models to provide time-dependent earthquake probability constraints. Our time-dependent model considers more detailed information than other published models, and thus offers decision-makers and public officials an adequate basis for rapid evaluation of and response to future emergency scenarios such as victim relocation and sheltering.
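
    The long-term, time-dependent part of such a model reduces to a conditional probability under the BPT renewal distribution, sketched below with illustrative parameters (not TEM's fault values): given a mean recurrence interval mu, aperiodicity alpha, and elapsed time t_e since the last rupture, the probability of rupture within the next dt years is P(t_e < T <= t_e + dt | T > t_e).

```python
import numpy as np
from scipy import integrate

def bpt_pdf(t, mu, alpha):
    """BPT (inverse Gaussian) density with mean mu and aperiodicity alpha."""
    return np.sqrt(mu / (2 * np.pi * alpha**2 * t**3)) * \
           np.exp(-(t - mu)**2 / (2 * mu * alpha**2 * t))

def conditional_prob(mu, alpha, t_e, dt):
    """P(rupture within dt | no rupture for t_e since the last event)."""
    num, _ = integrate.quad(bpt_pdf, t_e, t_e + dt, args=(mu, alpha))
    den, _ = integrate.quad(bpt_pdf, t_e, np.inf, args=(mu, alpha))
    return num / den

# e.g. a fault with 200-yr mean recurrence, alpha = 0.5, quiet for 150 yr:
print(round(conditional_prob(mu=200.0, alpha=0.5, t_e=150.0, dt=50.0), 3))
```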

  13. A boundary element approach to optimization of active noise control sources on three-dimensional structures

    NASA Technical Reports Server (NTRS)

    Cunefare, K. A.; Koopmann, G. H.

    1991-01-01

    This paper presents the theoretical development of an approach to active noise control (ANC) applicable to three-dimensional radiators. The active noise control technique, termed ANC Optimization Analysis, is based on minimizing the total radiated power by adding secondary acoustic sources on the primary noise source. ANC Optimization Analysis determines the optimum magnitude and phase at which to drive the secondary control sources in order to achieve the best possible reduction in the total power radiated from the noise source/control source combination. For example, ANC Optimization Analysis predicts a 20 dB reduction in the total power radiated from a sphere of radius a at a dimensionless wavenumber ka of 0.125, for a single control source representing 2.5 percent of the total area of the sphere. ANC Optimization Analysis is based on a boundary element formulation of the Helmholtz Integral Equation; the optimization analysis therefore applies to a single frequency, while multiple frequencies can be treated through repeated analyses.
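
    The optimization step has a closed form, illustrated below with an assumed 2x2 radiation-resistance matrix standing in for the boundary element computation: writing the total radiated power as the Hermitian quadratic form W = qᴴRq over the primary and secondary source strengths q = [q_p, q_s], minimizing over the complex q_s gives q_s = -R_ss⁻¹ R_sp q_p.

```python
import numpy as np

# Assumed radiation-resistance matrix for two closely spaced monopoles at
# low ka (a toy stand-in for the boundary element formulation).
R = np.array([[1.0, 0.9],
              [0.9, 1.0]])
q_p = 1.0 + 0.0j               # primary source strength (volume velocity)

R_sp, R_ss = R[1, 0], R[1, 1]
q_s = -R_sp / R_ss * q_p       # optimum secondary magnitude and phase

def power(qs):
    """Total radiated power W = q^H R q for q = [q_p, qs]."""
    q = np.array([q_p, qs])
    return float(np.real(np.conj(q) @ R @ q))

W0, W_opt = power(0.0), power(q_s)
print(f"reduction: {10 * np.log10(W0 / W_opt):.1f} dB")
```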

  14. INEEL Subregional Conceptual Model Report Volume 3: Summary of Existing Knowledge of Natural and Anthropogenic Influences on the Release of Contaminants to the Subsurface Environment from Waste Source Terms at the INEEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul L. Wichlacz

    2003-09-01

    This source-term summary document is intended to describe the current understanding of contaminant source terms and the conceptual model for potential source-term release to the environment at the Idaho National Engineering and Environmental Laboratory (INEEL), as presented in published INEEL reports. The document presents a generalized conceptual model of the sources of contamination and describes the general categories of source terms, primary waste forms, and factors that affect the release of contaminants from the waste form into the vadose zone and Snake River Plain Aquifer. Where the information has previously been published and is readily available, summaries of the inventory of contaminants are also included. Uncertainties that affect the estimation of the source term release are also discussed where they have been identified by the Source Term Technical Advisory Group. Areas in which additional information is needed (i.e., research needs) are also identified.

  15. Audio visual speech source separation via improved context dependent association model

    NASA Astrophysics Data System (ADS)

    Kazemi, Alireza; Boostani, Reza; Sobhanmanesh, Fariborz

    2014-12-01

    In this paper, we exploit the non-linear relation between a speech source and its associated lip video as a source of extra information to propose an improved audio-visual speech source separation (AVSS) algorithm. The audio-visual association is modeled using a neural associator which estimates the visual lip parameters from a temporal context of acoustic observation frames. We define an objective function based on the mean square error (MSE) between estimated and target visual parameters. This function is minimized to estimate the de-mixing vector/filters that separate the relevant source from linear instantaneous or time-domain convolutive mixtures. We have also proposed a hybrid criterion which uses AV coherency together with kurtosis as a non-Gaussianity measure. Experimental results are presented and compared in terms of visually relevant speech detection accuracy and the output signal-to-interference ratio (SIR) of source separation. The suggested audio-visual model significantly improves relevant speech classification accuracy compared to an existing GMM-based model, and the proposed AVSS algorithm improves speech separation quality compared to reference ICA- and AVSS-based methods.

  16. Benefits and hazards of dietary carbohydrate.

    PubMed

    Connor, William E; Duell, P Barton; Connor, Sonja L

    2005-11-01

    Since the dawn of civilization, carbohydrate has comprised the largest source of energy in the diet for most populations. The source of the carbohydrate has been from plants in the form of complex carbohydrate high in fiber. Only in affluent cultures has sugar contributed so much of the total energy. When carbohydrate is consumed as a major component of a plant-based diet, a high-carbohydrate, low-fat diet is associated with low plasma levels of total and low-density lipoprotein cholesterol, less coronary heart disease, less diabetes, and less obesity. Very low-carbohydrate (ketogenic) diets may provide short-term solutions but do not lead to a long-term solution for most people.

  17. Managing multicentre clinical trials with open source.

    PubMed

    Raptis, Dimitri Aristotle; Mettler, Tobias; Fischer, Michael Alexander; Patak, Michael; Lesurtel, Mickael; Eshmuminov, Dilmurodjon; de Rougemont, Olivier; Graf, Rolf; Clavien, Pierre-Alain; Breitenstein, Stefan

    2014-03-01

    Multicentre clinical trials are challenged by a high administrative burden, data management pitfalls, and costs. This leads to reduced enthusiasm and commitment of the physicians involved and thus to a reluctance to conduct multicentre clinical trials. The purpose of this study was to develop a web-based open source platform to support a multicentre clinical trial. Using Drupal, open source software distributed under the terms of the General Public License, we developed a web-based, multicentre clinical trial management system following the design science research approach. This system was evaluated by user testing, has well supported several completed and ongoing clinical trials, and is available for free download. Open source clinical trial management systems are capable of supporting multicentre clinical trials by enhancing efficiency, quality of data management, and collaboration.

  18. Quantum connectivity optimization algorithms for entanglement source deployment in a quantum multi-hop network

    NASA Astrophysics Data System (ADS)

    Zou, Zhen-Zhen; Yu, Xu-Tao; Zhang, Zai-Chen

    2018-04-01

    First, the entanglement source deployment problem is studied in a quantum multi-hop network; this problem has a significant influence on quantum connectivity. This paper introduces two optimization algorithms for a limited number of entanglement sources. A deployment algorithm based on node position (DNP) improves connectivity by guaranteeing that all overlapping areas of the distribution ranges of the entanglement sources contain nodes. In addition, a deployment algorithm based on an improved genetic algorithm (DIGA) is implemented by dividing the region into grids. From the simulation results, DNP and DIGA improve quantum connectivity by 213.73% and 248.83%, respectively, compared to random deployment, and the latter performs better in terms of connectivity. However, DNP is more flexible and adaptive to change, as it stops running when all nodes are covered.

  19. Mass discharge assessment at a brominated DNAPL site: Effects of known DNAPL source mass removal

    NASA Astrophysics Data System (ADS)

    Johnston, C. D.; Davis, G. B.; Bastow, T. P.; Woodbury, R. J.; Rao, P. S. C.; Annable, M. D.; Rhodes, S.

    2014-08-01

    Management and closure of contaminated sites is increasingly being proposed on the basis of mass flux of dissolved contaminants in groundwater. Better understanding of the links between source mass removal and contaminant mass fluxes in groundwater would allow greater acceptance of this metric in dealing with contaminated sites. Our objectives here were to show how measurements of the distribution of contaminant mass flux and the overall mass discharge emanating from the source under undisturbed groundwater conditions could be related to the processes and extent of source mass depletion. In addition, these estimates of mass discharge were sought in the application of agreed remediation targets set in terms of pumped groundwater quality from offsite wells. Results are reported from field studies conducted over a 5-year period at a brominated DNAPL (tetrabromoethane, TBA; and tribromoethene, TriBE) site located in suburban Perth, Western Australia. Groundwater fluxes (qw; L³/L²/T) and mass fluxes (Jc; M/L²/T) of dissolved brominated compounds were simultaneously estimated by deploying Passive Flux Meters (PFMs) in wells in a heterogeneous layered aquifer. PFMs were deployed in control plane (CP) wells immediately down-gradient of the source zone, before (2006) and after (2011) 69-85% of the source mass was removed, mainly by groundwater pumping from the source zone. The high-resolution (26-cm depth interval) measures of qw and Jc along the source CP allowed investigation of the DNAPL source-zone architecture and impacts of source mass removal. Comparable estimates of total mass discharge (MD; M/T) across the source zone CP reduced from 104 g day⁻¹ to 24-31 g day⁻¹ (70-77% reductions). Importantly, this mass discharge reduction was consistent with the estimated proportion of source mass remaining at the site (15-31%). That is, a linear relationship between mass discharge and source mass is suggested. The spatial detail of groundwater and mass flux distributions also provided further evidence of the source zone architecture and DNAPL mass depletion processes. This was especially apparent in different mass-depletion rates from distinct parts of the CP. High mass fluxes and groundwater fluxes located near the base of the aquifer dominated in terms of the dissolved mass flux in the profile, although not in terms of concentrations. Reductions observed in Jc and MD were used to better target future remedial efforts. Integration of the observations from the PFM deployments and the source mass depletion provided a basis for establishing flux-based management criteria for the site.
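
    The integration behind the mass discharge estimate is straightforward and can be sketched as follows (illustrative flux values, not the site data): each PFM measurement interval contributes its mass flux times its tributary area of the control plane, so MD is the sum of J_i * A_i over all intervals.

```python
import numpy as np

# Illustrative PFM readings over five depth intervals of a control plane.
J = np.array([0.05, 0.40, 2.10, 0.80, 0.10])   # mass flux, g/m^2/day
A = np.array([4.0, 4.0, 4.0, 4.0, 4.0])        # tributary area, m^2

MD = float(np.sum(J * A))                      # total mass discharge, g/day
print(f"mass discharge across the control plane: {MD:.1f} g/day")
```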

  20. Mass discharge assessment at a brominated DNAPL site: Effects of known DNAPL source mass removal.

    PubMed

    Johnston, C D; Davis, G B; Bastow, T P; Woodbury, R J; Rao, P S C; Annable, M D; Rhodes, S

    2014-08-01

    Management and closure of contaminated sites is increasingly being proposed on the basis of mass flux of dissolved contaminants in groundwater. Better understanding of the links between source mass removal and contaminant mass fluxes in groundwater would allow greater acceptance of this metric in dealing with contaminated sites. Our objectives here were to show how measurements of the distribution of contaminant mass flux and the overall mass discharge emanating from the source under undisturbed groundwater conditions could be related to the processes and extent of source mass depletion. In addition, these estimates of mass discharge were sought in the application of agreed remediation targets set in terms of pumped groundwater quality from offsite wells. Results are reported from field studies conducted over a 5-year period at a brominated DNAPL (tetrabromoethane, TBA; and tribromoethene, TriBE) site located in suburban Perth, Western Australia. Groundwater fluxes (qw; L³/L²/T) and mass fluxes (Jc; M/L²/T) of dissolved brominated compounds were simultaneously estimated by deploying Passive Flux Meters (PFMs) in wells in a heterogeneous layered aquifer. PFMs were deployed in control plane (CP) wells immediately down-gradient of the source zone, before (2006) and after (2011) 69-85% of the source mass was removed, mainly by groundwater pumping from the source zone. The high-resolution (26-cm depth interval) measures of qw and Jc along the source CP allowed investigation of the DNAPL source-zone architecture and impacts of source mass removal. Comparable estimates of total mass discharge (MD; M/T) across the source zone CP reduced from 104 g day⁻¹ to 24-31 g day⁻¹ (70-77% reductions). Importantly, this mass discharge reduction was consistent with the estimated proportion of source mass remaining at the site (15-31%). That is, a linear relationship between mass discharge and source mass is suggested. The spatial detail of groundwater and mass flux distributions also provided further evidence of the source zone architecture and DNAPL mass depletion processes. This was especially apparent in different mass-depletion rates from distinct parts of the CP. High mass fluxes and groundwater fluxes located near the base of the aquifer dominated in terms of the dissolved mass flux in the profile, although not in terms of concentrations. Reductions observed in Jc and MD were used to better target future remedial efforts. Integration of the observations from the PFM deployments and the source mass depletion provided a basis for establishing flux-based management criteria for the site. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Sources and Nature of Cost Analysis Data Base Reference Manual.

    DTIC Science & Technology

    1983-07-01

    Interim report (update) on the Sources and Nature of Cost Analysis Data Base Reference Manual, USAAVRADCOM TM 83-F-3. The recoverable front matter lists sections covering data for multiple applications, a glossary of cost analysis terms, references, and a bibliography.

  2. Estimating the Benefits of the Air Force Purchasing and Supply Chain Management Initiative

    DTIC Science & Technology

    2008-01-01

    The Air Force Purchasing and Supply Chain Management initiative has four major components, all based on commercial best practices (Gabreski, 2004), among them commodity councils and customer relationship management. It adapts the commercial-sector practice known as strategic sourcing; the Customer Relationship Management (CRM) initiative provides a single customer point of contact, and a commodity council is a cross-functional sourcing group charged with formulating a commodity strategy.

  3. Generation of GHS Scores from TEST and online sources ...

    EPA Pesticide Factsheets

    Alternatives assessment frameworks such as DfE (Design for the Environment) evaluate chemical alternatives in terms of human health effects, ecotoxicity, and fate. T.E.S.T. (Toxicity Estimation Software Tool) can be utilized to evaluate human health in terms of acute oral rat toxicity, developmental toxicity, endocrine activity, and mutagenicity; ecotoxicity in terms of acute fathead minnow toxicity; and fate in terms of bioconcentration factor. It can also be used to estimate a variety of key physicochemical properties such as melting point, boiling point, vapor pressure, and water solubility. A web-based version of T.E.S.T. is currently being developed to allow predictions to be made from other web tools. Online data sources such as NCCT's Chemistry Dashboard, REACH dossiers, or ChemHat.org can also be utilized to obtain GHS (Global Harmonization System) scores for comparing alternatives. The purpose of this talk is to show how GHS data can be obtained from literature sources and from T.E.S.T. These data will be used to compare chemical alternatives in the alternatives assessment dashboard (a 2018 CSS product).

  4. Design and implementation of wireless dose logger network for radiological emergency decision support system.

    PubMed

    Gopalakrishnan, V; Baskaran, R; Venkatraman, B

    2016-08-01

    A decision support system (DSS) is implemented in the Radiological Safety Division, Indira Gandhi Centre for Atomic Research, to provide guidance for emergency decision making in case of an inadvertent nuclear accident. Real-time gamma dose rate measurements around the stack are used to estimate the radioactive release rate (source term) by inverse calculation. A wireless gamma dose logging network was designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate; the details are presented in this paper. The network uses XBee-Pro wireless modules and a PSoC controller for wireless interfacing, and the data are logged at the base station. A LabVIEW-based program was developed to receive the data, display it on Google Maps, plot it over the time scale, and record it in a file shared with the DSS software. The DSS at the base station evaluates the real-time source term to assess radiation impact.
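
    The inverse calculation can be illustrated with a deliberately simplified linear model (assumed dispersion coefficients and dose rates, not the DSS's actual dispersion code): if each logger's net dose rate D_i is proportional to the release rate Q through a coefficient C_i supplied by a dispersion model for the current wind field, the least-squares estimate of Q over all loggers follows in one line.

```python
import numpy as np

# Net gamma dose rates from the logger network (illustrative values).
D = np.array([4.2, 2.9, 1.1, 0.6])             # uGy/h at each logger

# Assumed dose-per-unit-release coefficients from a dispersion model,
# one per logger for the current meteorological conditions.
C = np.array([8.0, 5.5, 2.2, 1.1]) * 1e-13     # (uGy/h) per (Bq/s)

# Least-squares solution of D = C * Q over all loggers.
Q_hat = float(C @ D / (C @ C))                 # release rate, Bq/s
print(f"estimated release rate: {Q_hat:.3e} Bq/s")
```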

  5. Design and implementation of wireless dose logger network for radiological emergency decision support system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopalakrishnan, V.; Baskaran, R.; Venkatraman, B.

    A decision support system (DSS) is implemented in the Radiological Safety Division, Indira Gandhi Centre for Atomic Research, to provide guidance for emergency decision making in case of an inadvertent nuclear accident. Real-time gamma dose rate measurements around the stack are used to estimate the radioactive release rate (source term) by inverse calculation. A wireless gamma dose logging network was designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate; the details are presented in this paper. The network uses XBee-Pro wireless modules and a PSoC controller for wireless interfacing, and the data are logged at the base station. A LabVIEW-based program was developed to receive the data, display it on Google Maps, plot it over the time scale, and record it in a file shared with the DSS software. The DSS at the base station evaluates the real-time source term to assess radiation impact.

  6. Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 3: Appendixes C-H

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.

    1995-04-01

    This report contains the appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

  7. An Improved Elastic and Nonelastic Neutron Transport Algorithm for Space Radiation

    NASA Technical Reports Server (NTRS)

    Clowdsley, Martha S.; Wilson, John W.; Heinbockel, John H.; Tripathi, R. K.; Singleterry, Robert C., Jr.; Shinn, Judy L.

    2000-01-01

    A neutron transport algorithm including both elastic and nonelastic particle interaction processes for use in space radiation protection for arbitrary shield material is developed. The algorithm is based upon a multiple energy grouping and analysis of the straight-ahead Boltzmann equation by using a mean value theorem for integrals. The algorithm is then coupled to the Langley HZETRN code through a bidirectional neutron evaporation source term. Evaluation of the neutron fluence generated by the solar particle event of February 23, 1956, for an aluminum water shield-target configuration is then compared with MCNPX and LAHET Monte Carlo calculations for the same shield-target configuration. With the Monte Carlo calculation as a benchmark, the algorithm developed in this paper showed a great improvement in results over the unmodified HZETRN solution. In addition, a high-energy bidirectional neutron source based on a formula by Ranft showed even further improvement of the fluence results over previous results near the front of the water target, where diffusion out of the front surface is important. Effects of improved interaction cross sections are modest compared with the addition of the high-energy bidirectional source terms.

  8. Activities and sources of income after a period of long-term sick leave--a population-based prospective cohort study.

    PubMed

    Wikman, Anders; Wiberg, Michael; Marklund, Staffan; Alexanderson, Kristina

    2012-09-06

    There is limited knowledge about what happens to people after long-term sick leave. The aim of this report was to conduct a prospective study of individuals who were on prolonged sick leave during a particular year, considering their activities and sources of income during subsequent years. To enable comparison of different time periods, we used three cohorts of individuals with different starting years. Using data from national registers, three separate cohorts were constructed that included all people living in Sweden who were 20-64 years of age (>5 million) in the years 1995, 2000 and 2005, respectively. The individual members of the cohorts were classified into the following groups based on their main source of income and activity in 1995-2008: on long-term sick leave, employed, old-age pensioner, long-term unemployed, disability pensioner, on parental leave, social assistance recipient, student allowance recipient, deceased, or emigrated. Most individuals on long-term (>6 months) sick leave in 1995 were not employed 13 years later. Only 11% of the women and 13% of the men were primarily in employment after 13 years. Instead, a wide range of alternatives existed; for example, many had been granted disability pension, and about 10% of the women and 17% of the men had died during the follow-up period. A larger proportion of those with long-term sick leave were back in employment when 2005 was the starting year for the follow-up. The low future employment rates for people on long-term sick leave may seem surprising. There are several possible explanations for the finding: the disorders these people had might have entailed longstanding difficulties in the labor market; moreover, long-term absence from work, whatever its causes, might have worsened the chances of further employment. Economic cycles may also have been of importance. The improving labor market during later years seems to have improved the chances for employment among those earlier on long-term sick leave.

  9. GLAST and Ground-Based Gamma-Ray Astronomy

    NASA Technical Reports Server (NTRS)

    McEnery, Julie

    2008-01-01

    The launch of the Gamma-ray Large Area Space Telescope together with the advent of a new generation of ground-based gamma-ray detectors such as VERITAS, HESS, MAGIC and CANGAROO, will usher in a new era of high-energy gamma-ray astrophysics. GLAST and the ground based gamma-ray observatories will provide highly complementary capabilities for spectral, temporal and spatial studies of high energy gamma-ray sources. Joint observations will cover a huge energy range, from 20 MeV to over 20 TeV. The LAT will survey the entire sky every three hours, allowing it both to perform uniform, long-term monitoring of variable sources and to detect flaring sources promptly. Both functions complement the high-sensitivity pointed observations provided by ground-based detectors. Finally, the large field of view of GLAST will allow a study of gamma-ray emission on large angular scales and identify interesting regions of the sky for deeper studies at higher energies. In this poster, we will discuss the science returns that might result from joint GLAST/ground-based gamma-ray observations and illustrate them with detailed source simulations.

  10. Prevalence of microbiological contaminants in groundwater sources and risk factor assessment in Juba, South Sudan.

    PubMed

    Engström, Emma; Balfors, Berit; Mörtberg, Ulla; Thunvik, Roger; Gaily, Tarig; Mangold, Mikael

    2015-05-15

    In low-income regions, drinking water is often derived from groundwater sources, which might spread diarrheal disease if they are microbiologically polluted. This study aimed to investigate the occurrence of fecal contamination in 147 improved groundwater sources in Juba, South Sudan and to assess potential contributing risk factors, based on bivariate statistical analysis. Thermotolerant coliforms (TTCs) were detected in 66% of the investigated sources, including 95 boreholes, breaching the health-based recommendations for drinking water. A significant association (p<0.05) was determined between the presence of TTCs and the depth of cumulative, long-term prior precipitation (both within the previous five days and within the past month). No such link was found to short-term rainfall, the presence of latrines or damages in the borehole apron. However, the risk factor analysis further suggested, to a lesser degree, that the local topography and on-site hygiene were additionally significant. In summary, the analysis indicated that an important contamination mechanism was fecal pollution of the contributing groundwater, which was unlikely due to the presence of latrines; instead, infiltration from contaminated surface water was more probable. The reduction in fecal sources in the environment in Juba is thus recommended, for example, through constructing latrines or designating protection areas near water sources. The study results contribute to the understanding of microbiological contamination of groundwater sources in areas with low incomes and high population densities, tropical climates and weathered basement complex environments, which are common in urban sub-Saharan Africa. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. What are the Starting Points? Evaluating Base-Year Assumptions in the Asian Modeling Exercise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chaturvedi, Vaibhav; Waldhoff, Stephanie; Clarke, Leon E.

    2012-12-01

    A common feature of model inter-comparison efforts is that the base year numbers for important parameters such as population and GDP can differ substantially across models. This paper explores the sources and implications of this variation in Asian countries across the models participating in the Asian Modeling Exercise (AME). Because the models do not all have a common base year, each team was required to provide data for 2005 for comparison purposes. This paper compares the year 2005 information for different models, noting the degree of variation in important parameters, including population, GDP, primary energy, electricity, and CO2 emissions. Itmore » then explores the difference in these key parameters across different sources of base-year information. The analysis confirms that the sources provide different values for many key parameters. This variation across data sources and additional reasons why models might provide different base-year numbers, including differences in regional definitions, differences in model base year, and differences in GDP transformation methodologies, are then discussed in the context of the AME scenarios. Finally, the paper explores the implications of base-year variation on long-term model results.« less

  12. IN SITU SOURCE TREATMENT OF CR(VI) USING A FE(II)-BASED REDUCTANT BLEND: LONG-TERM MONITORING AND EVALUATION

    EPA Science Inventory

    The long-term effectiveness of a FeSO4 + Na2S2O4 reductant solution blend for in situ saturated zone treatment of dissolved and solid phase Cr(VI) in a high pH chromite ore processing solid waste (COPSW) fill material was investigated. Two field pilot injection studies were cond...

  13. Fatty acid-based formulations for wood protection against mold and sapstain

    Treesearch

    Carol A. Clausen; Robert D. Coleman; Vina W. Yang

    2010-01-01

    Safer, highly effective biocides providing long-term protection against mold growth on wood-based materials are of interest to the wood protection industry. Moldicide formulations containing synergistic combinations of ingredients derived from natural sources are commonly recognized as a promising approach for the next generation of wood protectants. Although fatty acid (FA...

  14. PuLSE: Quality control and quantification of peptide sequences explored by phage display libraries.

    PubMed

    Shave, Steven; Mann, Stefan; Koszela, Joanna; Kerr, Alastair; Auer, Manfred

    2018-01-01

    The design of highly diverse phage display libraries is based on the assumption that DNA bases are incorporated at similar rates within the randomized sequence. As library complexity increases and the expected copy numbers of unique sequences decrease, the exploration of library space becomes sparser and the presence of truly random sequences becomes critical. We present the program PuLSE (Phage Library Sequence Evaluation) as a tool for assessing the randomness, and therefore diversity, of phage display libraries. PuLSE runs on a collection of sequence reads in the fastq file format and generates tables profiling the library in terms of unique DNA sequence counts and positions, translated peptide sequences, and normalized 'expected' occurrences from base to residue codon frequencies. The output allows at-a-glance quantitative quality control of a phage library in terms of sequence coverage at both the DNA base and translated protein residue level, which has been missing from existing toolsets and the literature. The open source program PuLSE is available in two formats: a C++ source code package for compilation and integration into existing bioinformatics pipelines, and precompiled binaries for ease of use.
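
    The abstract's notion of normalized 'expected' occurrences can be illustrated with a short calculation. The sketch below is a toy under stated assumptions, not PuLSE's actual code: per-position base frequencies are taken as given, and the expected frequency of each codon under independent incorporation is the product of its base frequencies.

      from itertools import product

      # Hypothetical base frequencies at randomized positions; PuLSE derives
      # these from the fastq reads themselves.
      base_freq = {'A': 0.27, 'C': 0.24, 'G': 0.26, 'T': 0.23}

      # Expected codon frequency = product of base frequencies; deviations of
      # observed counts from these values flag biased library positions.
      expected = {''.join(c): base_freq[c[0]] * base_freq[c[1]] * base_freq[c[2]]
                  for c in product('ACGT', repeat=3)}

      n_reads = 100_000
      print(f"Expected 'ATG' count in {n_reads} reads: {expected['ATG'] * n_reads:.0f}")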

  15. Antimatter Requirements and Energy Costs for Near-Term Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Schmidt, G. R.; Gerrish, H. P.; Martin, J. J.; Smith, G. A.; Meyer, K. J.

    1999-01-01

    The superior energy density of antimatter annihilation has often been pointed to as the ultimate source of energy for propulsion. However, the limited capacity and very low efficiency of present-day antiproton production methods suggest that antimatter may be too costly to consider for near-term propulsion applications. We address this issue by assessing the antimatter requirements for six different types of propulsion concepts, including two in which antiprotons are used to drive energy release from combined fission/fusion. These requirements are compared against the capacity of both the current antimatter production infrastructure and the improved capabilities that could exist within the early part of next century. Results show that although it may be impractical to consider systems that rely on antimatter as the sole source of propulsive energy, the requirements for propulsion based on antimatter-assisted fission/fusion do fall within projected near-term production capabilities. In fact, a new facility designed solely for antiproton production but based on existing technology could feasibly support interstellar precursor missions and omniplanetary spaceflight with antimatter costs ranging up to $6.4 million per mission.

  16. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard to operators and the public, and to the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the estimates available to date are not sufficiently accurate. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from broad information gathering. The large number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.

  17. 10-fs-level synchronization of photocathode laser with RF-oscillator for ultrafast electron and X-ray sources

    PubMed Central

    Yang, Heewon; Han, Byungheon; Shin, Junho; Hou, Dong; Chung, Hayun; Baek, In Hyung; Jeong, Young Uk; Kim, Jungwon

    2017-01-01

    Ultrafast electron-based coherent radiation sources, such as free-electron lasers (FELs), ultrafast electron diffraction (UED) and Thomson-scattering sources, are becoming more important sources in today’s ultrafast science. The photocathode laser is an indispensable common subsystem of these sources that generates ultrafast electron pulses. To fully exploit the potential of these sources, especially for pump-probe experiments, it is important to achieve high-precision synchronization between the photocathode laser and the radio-frequency (RF) sources that manipulate electron pulses. So far, most precision laser-RF synchronization has been achieved by using specially designed low-noise Er-fibre lasers at telecommunication wavelength. Here we show a modular method that achieves long-term (>1 day) stable 10-fs-level synchronization between a commercial 79.33-MHz Ti:sapphire laser oscillator and an S-band (2.856-GHz) RF oscillator. This is an important first step toward a photocathode laser-based femtosecond RF timing and synchronization system that is suitable for various small- to mid-scale ultrafast X-ray and electron sources. PMID:28067288

  18. 10-fs-level synchronization of photocathode laser with RF-oscillator for ultrafast electron and X-ray sources

    NASA Astrophysics Data System (ADS)

    Yang, Heewon; Han, Byungheon; Shin, Junho; Hou, Dong; Chung, Hayun; Baek, In Hyung; Jeong, Young Uk; Kim, Jungwon

    2017-01-01

    Ultrafast electron-based coherent radiation sources, such as free-electron lasers (FELs), ultrafast electron diffraction (UED) and Thomson-scattering sources, are becoming more important sources in today’s ultrafast science. The photocathode laser is an indispensable common subsystem of these sources that generates ultrafast electron pulses. To fully exploit the potential of these sources, especially for pump-probe experiments, it is important to achieve high-precision synchronization between the photocathode laser and the radio-frequency (RF) sources that manipulate electron pulses. So far, most precision laser-RF synchronization has been achieved by using specially designed low-noise Er-fibre lasers at telecommunication wavelength. Here we show a modular method that achieves long-term (>1 day) stable 10-fs-level synchronization between a commercial 79.33-MHz Ti:sapphire laser oscillator and an S-band (2.856-GHz) RF oscillator. This is an important first step toward a photocathode laser-based femtosecond RF timing and synchronization system that is suitable for various small- to mid-scale ultrafast X-ray and electron sources.

  19. 10-fs-level synchronization of photocathode laser with RF-oscillator for ultrafast electron and X-ray sources.

    PubMed

    Yang, Heewon; Han, Byungheon; Shin, Junho; Hou, Dong; Chung, Hayun; Baek, In Hyung; Jeong, Young Uk; Kim, Jungwon

    2017-01-09

    Ultrafast electron-based coherent radiation sources, such as free-electron lasers (FELs), ultrafast electron diffraction (UED) and Thomson-scattering sources, are becoming more important sources in today's ultrafast science. The photocathode laser is an indispensable common subsystem of these sources that generates ultrafast electron pulses. To fully exploit the potential of these sources, especially for pump-probe experiments, it is important to achieve high-precision synchronization between the photocathode laser and the radio-frequency (RF) sources that manipulate electron pulses. So far, most precision laser-RF synchronization has been achieved by using specially designed low-noise Er-fibre lasers at telecommunication wavelength. Here we show a modular method that achieves long-term (>1 day) stable 10-fs-level synchronization between a commercial 79.33-MHz Ti:sapphire laser oscillator and an S-band (2.856-GHz) RF oscillator. This is an important first step toward a photocathode laser-based femtosecond RF timing and synchronization system that is suitable for various small- to mid-scale ultrafast X-ray and electron sources.

  20. An Empirical Temperature Variance Source Model in Heated Jets

    NASA Technical Reports Server (NTRS)

    Khavaran, Abbas; Bridges, James

    2012-01-01

    An acoustic analogy approach is implemented that models the sources of jet noise in heated jets. The equivalent sources of turbulent mixing noise are recognized as the differences between the fluctuating and Favre-averaged Reynolds stresses and enthalpy fluxes. While in a conventional acoustic analogy only Reynolds stress components are scrutinized for their noise generation properties, it is now accepted that a comprehensive source model should include the additional entropy source term. Following Goldstein's generalized acoustic analogy, the set of Euler equations is divided into two sets of equations that govern a non-radiating base flow plus its residual components. When the base flow is considered as a locally parallel mean flow, the residual equations may be rearranged to form an inhomogeneous third-order wave equation. A general solution is written subsequently using a Green's function method, while all non-linear terms are treated as the equivalent sources of aerodynamic sound and are modeled accordingly. In a previous study, a specialized Reynolds-averaged Navier-Stokes (RANS) solver was implemented to compute the variance of thermal fluctuations that determine the enthalpy flux source strength. The main objective here is to present an empirical model capable of providing a reasonable estimate of the stagnation temperature variance in a jet. Such a model is parameterized as a function of the mean stagnation temperature gradient in the jet, and is evaluated using commonly available RANS solvers. The ensuing thermal source distribution is compared with measurements as well as computational results from a dedicated RANS solver that employs an enthalpy variance and dissipation rate model. Turbulent mixing noise predictions are presented for a wide range of jet temperature ratios from 1.0 to 3.20.

  1. Risk assessment of water pollution sources based on an integrated k-means clustering and set pair analysis method in the region of Shiyan, China.

    PubMed

    Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan

    2016-07-01

    Source water areas face many potential water pollution risks, and risk assessment is an effective method to evaluate them. In this paper an integrated model based on k-means clustering analysis and set pair analysis was established for evaluating the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, in which Danjiangkou Reservoir, the water source of the middle route of the South-to-North Water Diversion Project, is located. The results identified eleven sources with relatively high risk values. At the regional scale, Shiyan City and Danjiangkou City had high risk values in terms of industrial discharge, while Danjiangkou City and Yunxian County had high risk values in terms of agricultural pollution. Overall, the risk values of the northern regions close to the main stream and reservoir of the region of Shiyan were higher than those in the south. The risk levels indicated that five sources were at lower risk (level II), two at moderate risk (level III), one at higher risk (level IV) and three at the highest risk (level V). Risks from industrial discharge were also higher than those from the agricultural sector. It is thus essential to manage the pillar industry of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
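
    The entropy weight method mentioned above is standard enough to sketch. The toy indicator matrix below is hypothetical; only the weighting scheme itself comes from the abstract:

      import numpy as np

      def entropy_weights(X):
          """Entropy weight method: rows = assessed sources, cols = indicators."""
          P = X / X.sum(axis=0)                # normalize each indicator column
          k = 1.0 / np.log(X.shape[0])
          with np.errstate(divide='ignore', invalid='ignore'):
              plogp = np.where(P > 0, P * np.log(P), 0.0)
          e = -k * plogp.sum(axis=0)           # entropy of each indicator
          d = 1.0 - e                          # degree of diversification
          return d / d.sum()                   # weights sum to one

      # Hypothetical matrix: 5 pollution sources x 3 risk indicators.
      X = np.array([[3.2, 0.8, 12.0],
                    [1.1, 0.5,  7.5],
                    [4.6, 1.9, 20.0],
                    [0.9, 0.3,  5.0],
                    [2.4, 1.1,  9.8]])
      print(entropy_weights(X))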

  2. Isotropic source terms of San Jacinto fault zone earthquakes based on waveform inversions with a generalized CAP method

    NASA Astrophysics Data System (ADS)

    Ross, Z. E.; Ben-Zion, Y.; Zhu, L.

    2015-02-01

    We analyse source tensor properties of seven Mw > 4.2 earthquakes in the complex trifurcation area of the San Jacinto Fault Zone, CA, with a focus on isotropic radiation that may be produced by rock damage in the source volumes. The earthquake mechanisms are derived with generalized `Cut and Paste' (gCAP) inversions of three-component waveforms typically recorded by >70 stations at regional distances. The gCAP method includes parameters ζ and χ representing, respectively, the relative strength of the isotropic and CLVD source terms. The possible errors in the isotropic and CLVD components due to station variability are quantified with bootstrap resampling for each event. The results indicate statistically significant explosive isotropic components for at least six of the events, corresponding to ~0.4-8 per cent of the total potency/moment of the sources. In contrast, the CLVD components for most events are not found to be statistically significant. Trade-off and correlation between the isotropic and CLVD components are studied using synthetic tests with realistic station configurations. The associated uncertainties are found to be generally smaller than the observed isotropic components. Two different tests with velocity model perturbation are conducted to quantify the uncertainty due to inaccuracies in the Green's functions. Applications of the Mann-Whitney U test indicate statistically significant explosive isotropic terms for most events, consistent with brittle damage production at the source.
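
    The bootstrap over stations is the one step concrete enough to sketch here; the per-station values below are synthetic stand-ins, not the study's data:

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic per-station estimates of the isotropic fraction (per cent)
      # for one event, mimicking ~70 regional stations.
      iso = rng.normal(loc=3.0, scale=1.5, size=70)

      # Resample stations with replacement and recompute the mean each time.
      boot = np.array([rng.choice(iso, size=iso.size, replace=True).mean()
                       for _ in range(2000)])
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"mean = {iso.mean():.2f}%, 95% CI = [{lo:.2f}, {hi:.2f}]")
      # A confidence interval excluding zero indicates a statistically
      # significant explosive isotropic term.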

  3. Fish-Eye Observing with Phased Array Radio Telescopes

    NASA Astrophysics Data System (ADS)

    Wijnholds, S. J.

    The radio astronomical community is currently developing and building several new radio telescopes based on phased array technology. These telescopes provide a large field of view that may in principle span a full hemisphere. This makes calibration and imaging very challenging tasks due to the complex source structures and direction-dependent radio wave propagation effects. In this thesis, calibration and imaging methods are developed based on least squares estimation of instrument and source parameters. Monte Carlo simulations and actual observations with several prototypes show that this model-based approach provides statistically and computationally efficient solutions. The error analysis provides a rigorous mathematical framework to assess the imaging performance of current and future radio telescopes in terms of the effective noise, which is the combined effect of propagated calibration errors, noise in the data and source confusion.

  4. Economic dispatch optimization for system integrating renewable energy sources

    NASA Astrophysics Data System (ADS)

    Jihane, Kartite; Mohamed, Cherkaoui

    2018-05-01

    Nowadays, the use of energy is growing, especially in the transportation and electricity industries. However, this energy is largely based on conventional sources, which pollute the environment. Multi-source systems are seen as the best route to sustainable development. This paper proposes the Economic Dispatch (ED) of a hybrid renewable power system. The hybrid system is composed of ten thermal generators, a photovoltaic (PV) generator and a wind turbine generator. To show the importance of renewable energy sources (RES) in the energy mix, we ran the simulation for the system integrating PV only and PV plus wind. The results show that the system with RES outperforms the system without RES in terms of fuel cost.

  5. A hybrid approach for nonlinear computational aeroacoustics predictions

    NASA Astrophysics Data System (ADS)

    Sassanis, Vasileios; Sescu, Adrian; Collins, Eric M.; Harris, Robert E.; Luke, Edward A.

    2017-01-01

    In many aeroacoustics applications involving nonlinear waves and obstructions in the far-field, approaches based on the classical acoustic analogy theory or the linearised Euler equations are unable to fully characterise the acoustic field. Therefore, computational aeroacoustics hybrid methods that incorporate nonlinear wave propagation have to be constructed. In this study, a hybrid approach coupling Navier-Stokes equations in the acoustic source region with nonlinear Euler equations in the acoustic propagation region is introduced and tested. The full Navier-Stokes equations are solved in the source region to identify the acoustic sources. The flow variables of interest are then transferred from the source region to the acoustic propagation region, where the full nonlinear Euler equations with source terms are solved. The transition between the two regions is made through a buffer zone where the flow variables are penalised via a source term added to the Euler equations. Tests were conducted on simple acoustic and vorticity disturbances, two-dimensional jets (Mach 0.9 and 2), and a three-dimensional jet (Mach 1.5), impinging on a wall. The method is proven to be effective and accurate in predicting sound pressure levels associated with the propagation of linear and nonlinear waves in the near- and far-field regions.

  6. Trends in marine debris in the U.S. Caribbean and the Gulf of Mexico, 1996-2003

    USGS Publications Warehouse

    Ribic, Christine; Sheavly, Seba B.; Rugg, David J.

    2011-01-01

    Marine debris is a widespread and globally recognized problem. Sound information is necessary to understand the extent of the problem and to inform resource managers and policy makers about potential mitigation strategies. Although there are many short-term studies on marine debris, a longer-term perspective and the ability to compare among regions has heretofore been missing in the U.S. Caribbean and the Gulf of Mexico. We used data from a national beach monitoring program to evaluate and compare amounts, composition, and trends of indicator marine debris in the U.S. Caribbean (Puerto Rico and the U.S. Virgin Islands) and the Gulf of Mexico from 1996 to 2003. Indicator items provided a standardized set that all surveys collected; each was assigned a probable source: ocean-based, land-based, or general-source. Probable ocean-based debris was related to activities such as recreational boating/fishing, commercial fishing and activities on oil/gas platforms. Probable land-based debris was related to land-based recreation and sewer systems. General-source debris represented plastic items that can come from either ocean- or land-based sources; these items were plastic bags, strapping bands, and plastic bottles (excluding motor oil containers). Debris loads were similar between the U.S. Caribbean and the western Gulf of Mexico; however, debris composition on U.S. Caribbean beaches was dominated by land-based indicators while the western Gulf of Mexico was dominated by ocean-based indicators. Beaches along the eastern Gulf of Mexico had the lowest counts of debris; composition was dominated by land-based indicators, similar to that found for the U.S. Caribbean. Debris loads on beaches in the Gulf of Mexico are likely affected by Gulf circulation patterns, reducing loads in the eastern Gulf and increasing loads in the western Gulf. Over the seven years of monitoring, we found a large linear decrease in total indicator debris, as well as all source categories, for the U.S. Caribbean. Lower magnitude decreases were seen in indicator debris along the eastern Gulf of Mexico. In contrast, only land-based indicators declined in the western Gulf of Mexico; total, ocean-based and general-source indicators remained unchanged. Decreases in land-based indicators were not related to human population in the coastal regions; human population increased in all regions over the time of the study. Significant monthly patterns for indicator debris were found only in the Gulf of Mexico; counts were highest during May through September, with peaks occurring in July. Inclement weather conditions before the time of the survey also accounted for some of the variation in the western Gulf of Mexico; fewer items were found when there were heavy seas or cold fronts in the weeks prior to the survey, while tropical storms (including hurricanes) increased the amount of debris. With the development around the globe of long-term monitoring programs using standardized methodology, there is the potential to help management at individual sites, as well as generate larger-scale perspectives (from regional to global) to inform decision makers. Incorporating mechanisms producing debris into marine debris programs would be a fruitful area for future research.

  7. Supersonic propulsion simulation by incorporating component models in the large perturbation inlet (LAPIN) computer code

    NASA Technical Reports Server (NTRS)

    Cole, Gary L.; Richard, Jacques C.

    1991-01-01

    An approach to simulating the internal flows of supersonic propulsion systems is presented. The approach is based on a fairly simple modification of the Large Perturbation Inlet (LAPIN) computer code. LAPIN uses a quasi-one dimensional, inviscid, unsteady formulation of the continuity, momentum, and energy equations. The equations are solved using a shock capturing, finite difference algorithm. The original code, developed for simulating supersonic inlets, includes engineering models of unstart/restart, bleed, bypass, and variable duct geometry, by means of source terms in the equations. The source terms also provide a mechanism for incorporating, with the inlet, propulsion system components such as compressor stages, combustors, and turbine stages. This requires each component to be distributed axially over a number of grid points. Because of the distributed nature of such components, this representation should be more accurate than a lumped parameter model. Components can be modeled by performance map(s), which in turn are used to compute the source terms. The general approach is described. Then, simulation of a compressor/fan stage is discussed to show the approach in detail.
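
    The abstract does not reproduce LAPIN's equations. For orientation, a generic quasi-one-dimensional conservation form with distributed source terms (our notation, not necessarily LAPIN's exact formulation) reads

      \frac{\partial(\rho A)}{\partial t} + \frac{\partial(\rho u A)}{\partial x} = S_{mass},
      \frac{\partial(\rho u A)}{\partial t} + \frac{\partial[(\rho u^2 + p)A]}{\partial x} = p\,\frac{\partial A}{\partial x} + S_{mom},
      \frac{\partial(\rho E A)}{\partial t} + \frac{\partial[(\rho E + p)u A]}{\partial x} = S_{energy},

    where the S terms carry bleed, bypass, and the work and heat addition of components such as compressor stages distributed over grid points.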

  8. Fermi Large Area Telescope Second Source Catalog

    NASA Technical Reports Server (NTRS)

    Nolan, P. L.; Abdo, A. A.; Ackermann, M.; Ajello, M.; Allafort, A.; Antolini, E.; Bonnell, J.; Cannon, A.; Celik, O.; Corbet, R.; et al.

    2012-01-01

    We present the second catalog of high-energy gamma-ray sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi), derived from data taken during the first 24 months of the science phase of the mission, which began on 2008 August 4. Source detection is based on the average flux over the 24-month period. The Second Fermi-LAT catalog (2FGL) includes source location regions, defined in terms of elliptical fits to the 95% confidence regions, and spectral fits in terms of power-law, exponentially cutoff power-law, or log-normal forms. Also included are flux measurements in 5 energy bands and light curves on monthly intervals for each source. Twelve sources in the catalog are modeled as spatially extended. We provide a detailed comparison of the results from this catalog with those from the first Fermi-LAT catalog (1FGL). Although the diffuse Galactic and isotropic models used in the 2FGL analysis are improved compared to the 1FGL catalog, we attach caution flags to 162 of the sources to indicate possible confusion with residual imperfections in the diffuse model. The 2FGL catalog contains 1873 sources detected and characterized in the 100 MeV to 100 GeV range, of which we consider 127 as being firmly identified and 1171 as being reliably associated with counterparts of known or likely gamma-ray-producing source classes.

  9. The role of long-term familiarity and attentional maintenance in short-term memory for timbre.

    PubMed

    Siedenburg, Kai; McAdams, Stephen

    2017-04-01

    We study short-term recognition of timbre using familiar recorded tones from acoustic instruments and unfamiliar transformed tones that do not readily evoke sound-source categories. Participants indicated whether the timbre of a probe sound matched one of three previously presented sounds (item recognition). In Exp. 1, musicians recognised familiar acoustic sounds better than unfamiliar synthetic sounds, and this advantage was particularly large in the medial serial position. There was a strong correlation between the correct rejection rate and the mean perceptual dissimilarity of the probe to the tones from the sequence. Exp. 2 compared musicians' and non-musicians' performance under concurrent articulatory suppression, visual interference, and a silent control condition. Both suppression tasks disrupted performance by a similar margin, regardless of the musical training of participants or the type of sounds. Our results suggest that familiarity with sound-source categories and attention play important roles in short-term memory for timbre, which rules out accounts based solely on sensory persistence.

  10. Algorithms and analytical solutions for rapidly approximating long-term dispersion from line and area sources

    NASA Astrophysics Data System (ADS)

    Barrett, Steven R. H.; Britter, Rex E.

    Predicting long-term mean pollutant concentrations in the vicinity of airports, roads and other industrial sources is frequently of concern in regulatory and public health contexts. Many emissions are represented geometrically as ground-level line or area sources. Well-developed modelling tools such as AERMOD and ADMS are able to model dispersion from finite (i.e. non-point) sources with considerable accuracy, drawing upon an up-to-date understanding of boundary layer behaviour. Due to mathematical difficulties associated with line and area sources, computationally expensive numerical integration schemes have been developed. For example, some models decompose area sources into a large number of line sources orthogonal to the mean wind direction, for which an analytical (Gaussian) solution exists. Models also employ a time-series approach, which involves computing mean pollutant concentrations for every hour over one or more years of meteorological data. This can give rise to computer runtimes of several days for assessment of a site. While this may be acceptable for assessment of a single industrial complex, airport, etc., this level of computational cost precludes national or international policy assessments at the level of detail available with dispersion modelling. In this paper, we extend previous work [S.R.H. Barrett, R.E. Britter, 2008. Development of algorithms and approximations for rapid operational air quality modelling. Atmospheric Environment 42 (2008) 8105-8111] to line and area sources. We introduce approximations which allow for the development of new analytical solutions for long-term mean dispersion from line and area sources, based on hypergeometric functions. We describe how these solutions can be parameterized from a single point-source run of an existing advanced dispersion model, thereby accounting for all processes modelled in the more costly algorithms. The parameterization method combined with the analytical solutions for long-term mean dispersion is shown to produce results several orders of magnitude more efficiently, with a loss of accuracy small compared to the absolute accuracy of advanced dispersion models near sources. The method can be readily incorporated into existing dispersion models, and may allow additional computation time to be spent on modelling dispersion processes more accurately in future, rather than on accounting for source geometry.
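
    For contrast with the analytical approach developed in the paper, the baseline it accelerates can be sketched directly: decompose a crosswind line source into point sources and sum Gaussian plume contributions. The dispersion-coefficient curves below are placeholders, not the paper's parameterization:

      import numpy as np

      def sigma_y(x): return 0.08 * x / np.sqrt(1.0 + 1e-4 * x)    # placeholder
      def sigma_z(x): return 0.06 * x / np.sqrt(1.0 + 1.5e-3 * x)  # placeholder

      def point_conc(q, x, y, u=5.0):
          """Ground-level concentration from a ground-level point source."""
          if x <= 0.0:
              return 0.0
          sy, sz = sigma_y(x), sigma_z(x)
          return q / (np.pi * u * sy * sz) * np.exp(-y**2 / (2.0 * sy**2))

      def line_conc(q_per_m, half_len, x, y, n=200):
          """Crosswind line source decomposed into n point sources."""
          ys = np.linspace(-half_len, half_len, n)
          dq = q_per_m * (2.0 * half_len / n)       # emission per point source
          return sum(point_conc(dq, x, y - y0) for y0 in ys)

      print(line_conc(q_per_m=1e-3, half_len=500.0, x=1000.0, y=0.0))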

  11. Direct computation of turbulence and noise

    NASA Technical Reports Server (NTRS)

    Berman, C.; Gordon, G.; Karniadakis, G.; Batcho, P.; Jackson, E.; Orszag, S.

    1991-01-01

    Jet exhaust turbulence noise is computed using a time dependent solution of the three dimensional Navier-Stokes equations to supply the source terms for an acoustic computation based on the Phillips convected wave equation. An extrapolation procedure is then used to determine the far field noise spectrum in terms of the near field sound. This will lay the groundwork for studies of more complex flows typical of noise suppression nozzles.

  12. Blind source separation based on time-frequency morphological characteristics for rigid acoustic scattering by underwater objects

    NASA Astrophysics Data System (ADS)

    Yang, Yang; Li, Xiukun

    2016-06-01

    Separation of the components of rigid acoustic scattering by underwater objects is essential in obtaining the structural characteristics of such objects. To overcome the problem of rigid structures appearing to have the same spectral structure in the time domain, time-frequency Blind Source Separation (BSS) can be used in combination with image morphology to separate the rigid scattering components of different objects. Based on a highlight model, the separation of the rigid scattering structure of objects with time-frequency distribution is deduced. Using a morphological filter, the different characteristics observed in a Wigner-Ville Distribution (WVD) for auto terms and cross terms can be exploited to remove cross-term interference. By selecting the time and frequency points of the auto-term signal, the accuracy of BSS can be improved. An experimental simulation has been used, with changes in the pulse width of the transmitted signal, the relative amplitude and the time delay parameter, in order to analyze the feasibility of this new method. Simulation results show that the new method is not only able to separate rigid scattering components, but can also separate the components when elastic scattering and rigid scattering exist at the same time. Experimental results confirm that the new method can be used in separating the rigid scattering structure of underwater objects.

  13. The Scaling of Broadband Shock-Associated Noise with Increasing Temperature

    NASA Technical Reports Server (NTRS)

    Miller, Steven A.

    2012-01-01

    A physical explanation for the saturation of broadband shock-associated noise (BBSAN) intensity with increasing jet stagnation temperature has eluded investigators. An explanation is proposed for this phenomenon with the use of an acoustic analogy. For this purpose the acoustic analogy of Morris and Miller is examined. To isolate the relevant physics, the scaling of BBSAN at the peak intensity level at the sideline (psi = 90 degrees) observer location is examined. Scaling terms are isolated from the acoustic analogy and the result is compared using a convergent nozzle with the experiments of Bridges and Brown and using a convergent-divergent nozzle with the experiments of Kuo, McLaughlin, and Morris at four nozzle pressure ratios in increments of total temperature ratios from one to four. The equivalent source within the framework of the acoustic analogy for BBSAN is based on local field quantities at shock wave shear layer interactions. The equivalent source combined with accurate calculations of the propagation of sound through the jet shear layer, using an adjoint vector Green's function solver of the linearized Euler equations, allows for predictions that retain the scaling with respect to stagnation pressure and allows for the accurate saturation of BBSAN with increasing stagnation temperature. This is a minor change to the source model relative to the previously developed models. The full development of the scaling term is shown. The sources and vector Green's function solver are informed by steady Reynolds-averaged Navier-Stokes (RANS) solutions. These solutions are examined as a function of stagnation temperature at the first shock wave shear layer interaction. It is discovered that saturation of BBSAN with increasing jet stagnation temperature occurs due to a balance between the amplification of the sound propagation through the shear layer and the source term scaling.

  14. Transparent mediation-based access to multiple yeast data sources using an ontology driven interface.

    PubMed

    Briache, Abdelaali; Marrakchi, Kamar; Kerzazi, Amine; Navas-Delgado, Ismael; Rossi Hassani, Badr D; Lairini, Khalid; Aldana-Montes, José F

    2012-01-25

    Saccharomyces cerevisiae is recognized as a model system representing a simple eukaryote whose genome can be easily manipulated. Information solicited by scientists on its biological entities (Proteins, Genes, RNAs...) is scattered within several data sources like SGD, Yeastract, CYGD-MIPS, BioGrid, PhosphoGrid, etc. Because of the heterogeneity of these sources, querying them separately and then manually combining the returned results is a complex and time-consuming task for biologists, most of whom are not bioinformatics experts. It also reduces and limits the use that can be made of the available data. To provide transparent and simultaneous access to yeast sources, we have developed YeastMed: an XML and mediator-based system. In this paper, we present our approach in developing this system, which takes advantage of SB-KOM to perform the query transformation needed and a set of Data Services to reach the integrated data sources. The system is composed of a set of modules that depend heavily on XML and Semantic Web technologies. User queries are expressed in terms of a domain ontology through a simple form-based web interface. YeastMed is the first mediation-based system specific for integrating yeast data sources. It was conceived mainly to help biologists find relevant data simultaneously from multiple data sources. It has a biologist-friendly interface that is easy to use. The system is available at http://www.khaos.uma.es/yeastmed/.

  15. Seasonally-Dynamic SPARROW Modeling of Nitrogen Flux Using Earth Observation Data

    NASA Astrophysics Data System (ADS)

    Smith, R. A.; Schwarz, G. E.; Brakebill, J. W.; Hoos, A. B.; Moore, R. B.; Shih, J.; Nolin, A. W.; Macauley, M.; Alexander, R. B.

    2013-12-01

    SPARROW models are widely used to identify and quantify the sources of contaminants in watersheds and to predict their flux and concentration at specified locations downstream. Conventional SPARROW models describe the average relationship between sources and stream conditions based on long-term water quality monitoring data and spatially referenced explanatory information. But many watershed management issues stem from intra- and inter-annual changes in contaminant sources, hydrologic forcing, or other environmental conditions, which cause a temporary imbalance between inputs and stream water quality. Dynamic behavior of the system relating to changes in watershed storage and processing then becomes important. In this study, we describe dynamically calibrated SPARROW models of total nitrogen flux in three sub-regional watersheds: the Potomac River Basin, Long Island Sound drainage, and coastal South Carolina drainage. The models are based on seasonal water quality and watershed input data for a total of 170 monitoring stations for the period 2001 to 2008. Frequently reported, spatially detailed input data on the phenology of agricultural production, terrestrial vegetation growth, and snow melt are often challenging requirements of seasonal modeling of reactive nitrogen. In this NASA-funded research, we use Enhanced Vegetation Index (EVI), gross primary production and snow/ice cover data from MODIS to parameterize seasonal uptake and release of nitrogen from vegetation and snowpack. The spatial reference frames of the models are 1:100,000-scale stream networks, and the computational time steps are 0.25-year seasons. Precipitation and temperature data are from PRISM. The model formulation accounts for storage of nitrogen from nonpoint sources including fertilized cropland, pasture, urban land, and atmospheric deposition. Model calibration is by non-linear regression. Once calibrated, model source terms based on previous-season export allow for recursive dynamic simulation of stream flux: gradual increases or decreases in export occur as source supply rates and hydrologic forcing change. Based on the assumption that removal of nitrogen from watershed storage to stream channels and to 'permanent' sinks (e.g. the atmosphere and deep groundwater) occurs as parallel first-order processes, the models can be used to estimate the approximate residence times of nonpoint source nitrogen in the watersheds.
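
    The parallel first-order storage assumption lends itself to a compact recursion. The rates below are invented for illustration; only the bookkeeping mirrors the description:

      # Per season t:
      #   S[t+1] = S[t] + input[t] - (k_stream + k_sink) * S[t]
      #   flux_to_stream[t] = k_stream * S[t]
      k_stream, k_sink = 0.15, 0.05          # per-season removal rates (assumed)
      inputs = [40.0, 55.0, 30.0, 25.0] * 2  # seasonal N inputs, two years
      S, flux = 100.0, []                    # initial stored N and output list
      for q in inputs:
          flux.append(k_stream * S)
          S = S + q - (k_stream + k_sink) * S
      print(flux)
      print(f"mean residence time ~ {1.0 / (k_stream + k_sink):.1f} seasons")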

  16. 77 FR 19740 - Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant Accident

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2010-0249] Water Sources for Long-Term Recirculation Cooling... Regulatory Guide (RG) 1.82, ``Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant... regarding the sumps and suppression pools that provide water sources for emergency core cooling, containment...

  17. Source term evaluation model for high-level radioactive waste repository with decay chain build-up.

    PubMed

    Chopra, Manish; Sunny, Faby; Oza, R B

    2016-09-18

    A source term model based on a two-component leach flux concept is developed for a high-level radioactive waste repository. The long-lived radionuclides associated with high-level waste may give rise to a build-up of activity because of radioactive decay chains. The ingrowth of progeny is incorporated in the model using Bateman decay chain build-up equations. The model is applied to different radionuclides present in the high-level radioactive waste which form part of decay chains (4n to 4n + 3 series), and the activity of the parent and daughter radionuclides leaching out of the waste matrix is estimated. Two cases are considered: one in which only the parent is initially present in the waste, and another in which daughters are also initially present in the waste matrix. The incorporation of in situ production of daughter radionuclides in the source is important for realistic estimates. It is shown that the inclusion of decay chain build-up is essential to avoid underestimation in the radiological impact assessment of the repository. The model can be a useful tool for evaluating the source term of the radionuclide transport models used for the radiological impact assessment of high-level radioactive waste repositories.
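
    For the simplest two-member chain, the Bateman build-up referred to above has a closed form, sketched here with hypothetical half-lives (the paper treats full 4n to 4n + 3 chains):

      import numpy as np

      def bateman_pair(N1_0, N2_0, lam1, lam2, t):
          """Parent -> daughter build-up; returns activities A = lambda * N."""
          N1 = N1_0 * np.exp(-lam1 * t)
          N2 = (N1_0 * lam1 / (lam2 - lam1)
                * (np.exp(-lam1 * t) - np.exp(-lam2 * t))
                + N2_0 * np.exp(-lam2 * t))
          return lam1 * N1, lam2 * N2

      # Case 1 of the paper: daughter initially absent. Half-lives assumed
      # 1000 y (parent) and 50 y (daughter) purely for illustration.
      lam1, lam2 = np.log(2) / 1000.0, np.log(2) / 50.0
      t = np.array([0.0, 10.0, 100.0, 1000.0])
      A1, A2 = bateman_pair(1e6, 0.0, lam1, lam2, t)
      print(A2 / A1)   # daughter activity grows toward the equilibrium ratio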

  18. Indigenous Manufacturing realization of TWIN Source

    NASA Astrophysics Data System (ADS)

    Pandey, R.; Bandyopadhyay, M.; Parmar, D.; Yadav, R.; Tyagi, H.; Soni, J.; Shishangiya, H.; Sudhir Kumar, D.; Shah, S.; Bansal, G.; Pandya, K.; Parmar, K.; Vuppugalla, M.; Gahlaut, A.; Chakraborty, A.

    2017-04-01

    TWIN source is a two-RF-driver-based negative ion source planned to bridge the gap between the single-driver-based ROBIN source (currently operational) and the eight-driver-based DNB source (to be operated under the IN-TF test facility). TWIN source experiments are planned at IPR, in line with the long-term domestic fusion programme, to gain operational experience with vacuum-immersed multi-driver RF-based negative ion sources. The high-vacuum-compatible components of the TWIN source are designed at IPR with an emphasis on indigenous manufacture. These components are mainly stainless steel and OFC-Cu. Because they receive high heat flux, one of the major functional requirements is continuous heat removal with water as the cooling medium. For this purpose, the stainless steel parts are provided with externally milled cooling lines covered with a layer of OFC-Cu on the side receiving the high heat flux. Manufacturing the TWIN source components therefore requires joining these dissimilar materials via processes such as electrodeposition, electron beam welding and vacuum brazing. Each of these processes must give a vacuum-tight joint with adequate strength at operating temperature and pressure. As part of the indigenous development effort, vacuum brazing (in a non-nuclear environment) was chosen for joining the dissimilar materials of the TWIN source, being one of the most reliable joining techniques and commercially feasible among suppliers in the country. The manufacturing design of the components was adapted to suit the vacuum brazing process and to ease machining without compromising the functional and operational requirements. This paper details the indigenous development effort, the design adaptations for manufacturability, and the vacuum brazing fundamentals and procedures for the TWIN source components.

  19. Generalized fractional supertrace identity for Hamiltonian structure of NLS-MKdV hierarchy with self-consistent sources

    NASA Astrophysics Data System (ADS)

    Dong, Huan He; Guo, Bao Yong; Yin, Bao Shu

    2016-06-01

    In this paper, based on the modified Riemann-Liouville fractional derivative and the Tu scheme, the fractional super NLS-MKdV hierarchy is derived, and in particular the self-consistent source terms are considered. Meanwhile, the generalized fractional supertrace identity is proposed, which is a beneficial supplement to the existing literature on integrable systems. As an application, the super Hamiltonian structure of the fractional super NLS-MKdV hierarchy is obtained.

  20. Prediction of discretization error using the error transport equation

    NASA Astrophysics Data System (ADS)

    Celik, Ismail B.; Parsons, Don Roscoe

    2017-06-01

    This study focuses on an approach to quantifying the discretization error associated with numerical solutions of partial differential equations by solving an error transport equation (ETE). The goal is to develop a method that can adequately predict the discretization error using the numerical solution on only one grid/mesh. The primary problem associated with solving the ETE is the formulation of the error source term, which is required for accurately predicting the transport of the error. In this study, a novel approach is considered which involves fitting the numerical solution with a series of locally smooth curves and then blending them together with a weighted spline approach. The result is a continuously differentiable analytic expression that can be used to determine the error source term. Once the source term has been developed, the ETE can easily be solved using the same solver that is used to obtain the original numerical solution. The new methodology is applied to the two-dimensional Navier-Stokes equations in the laminar flow regime. A simple unsteady flow case is also considered. The discretization error predictions based on the methodology presented in this study are in good agreement with the 'true error'. While in most cases the error predictions are not quite as accurate as those from Richardson extrapolation, the results are reasonable and require only one numerical grid. The current results indicate that there is much promise going forward with the newly developed error source term evaluation technique and the ETE.
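
    The curve-fit-then-differentiate idea can be sketched for a 1D steady advection-diffusion model problem. A single cubic spline stands in for the paper's blended local fits, and the manufactured "numerical" solution below is an assumption of this sketch:

      import numpy as np
      from scipy.interpolate import CubicSpline

      # Model problem: u*phi' - nu*phi'' = f(x), with exact phi = sin(pi x).
      u, nu = 1.0, 0.05
      f = lambda x: np.pi * u * np.cos(np.pi * x) + nu * np.pi**2 * np.sin(np.pi * x)

      x = np.linspace(0.0, 1.0, 21)   # a single coarse grid only
      phi_num = np.sin(np.pi * x) + 0.002 * np.sin(8 * np.pi * x)  # stand-in solution

      # Fit a smooth curve to the discrete solution, then evaluate the PDE
      # residual of the fit: this residual acts as the ETE source term.
      s = CubicSpline(x, phi_num)
      residual = u * s(x, 1) - nu * s(x, 2) - f(x)
      print(np.max(np.abs(residual)))   # source driving the error transport eq.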

  1. A Generalized Evolution Criterion in Nonequilibrium Convective Systems

    NASA Astrophysics Data System (ADS)

    Ichiyanagi, Masakazu; Nisizima, Kunisuke

    1989-04-01

    A general evolution criterion, applicable to transport processes such as the conduction of heat and mass diffusion, is obtained as a direct version of the Le Chatelier-Braun principle for stationary states. The present theory is not based on any radical departure from the conventional one. The generalized theory is made determinate by proposing balance equations for extensive thermodynamic variables that reflect the character of convective systems under the assumption of local equilibrium. As a consequence of the introduction of source terms in the balance equations, additional terms appear in the expression for the local entropy production, which are bilinear in the intensive variables and the sources. In the present paper, we show that a dissipation function can be constructed for such general cases, incorporating the premises of the Glansdorff-Prigogine theory. The new dissipation function permits us to formulate a generalized evolution criterion for convective systems.

  2. Three-Dimensional Model Synthesis of the Global Methane Cycle

    NASA Technical Reports Server (NTRS)

    Fung, I.; Prather, M.; John, J.; Lerner, J.; Matthews, E.

    1991-01-01

    A synthesis of the global methane cycle is presented in an attempt to generate an accurate global methane budget. Methane-flux measurements, energy data, and agricultural statistics are merged with databases of land-surface characteristics and anthropogenic activities. The sources and sinks of methane are estimated based on atmospheric methane composition and variations, and a global 3D transport model simulates the corresponding atmospheric responses. The geographic and seasonal variations of candidate budgets are compared with observational data, and the available observations are used to constrain the plausible methane budgets. The preferred budget includes annual destruction rates and annual emissions for various sources. The lack of direct flux measurements in the regions of many of these fluxes makes the unique determination of each term impossible. OH oxidation is found to be the largest single term, although more measurements of this and other terms are recommended.

  3. Multi-Scale Analysis of Trends in Northeastern Temperate Forest Springtime Phenology

    NASA Astrophysics Data System (ADS)

    Moon, M.; Melaas, E. K.; Sulla-menashe, D. J.; Friedl, M. A.

    2017-12-01

    The timing of spring leaf emergence is highly variable in many ecosystems, exerts first-order control on growing season length, and significantly modulates seasonally-integrated photosynthesis. Numerous studies have reported trends toward earlier spring phenology in temperate forests, with some papers indicating that this trend is also leading to increased carbon uptake. At broad spatial scales, however, most of these studies have used data from coarse spatial resolution instruments such as MODIS, which do not resolve ecologically important landscape-scale patterns in phenology. In this work, we examine how long-term trends in spring phenology differ across three data sources acquired at different scales of measurement at the Harvard Forest in central Massachusetts. Specifically, we compared trends in the timing of phenology based on long-term in-situ measurements of phenology, on estimates from eddy-covariance measurements of net carbon uptake transition dates, and on two sources of satellite-based remote sensing (MODIS and Landsat) land surface phenology (LSP) data. Our analysis focused on the flux footprint surrounding the Harvard Forest Environmental Measurements (EMS) tower. Our results reveal clearly defined trends toward earlier springtime phenology in Landsat LSP and in the timing of tower-based net carbon uptake. However, we find no statistically significant trend in springtime phenology measured from MODIS LSP data products, possibly because the time series of MODIS observations is relatively short (13 years). The trend in tower-based transition dates exhibited a larger negative value than the trend derived from Landsat LSP data (-0.42 and -0.28 days per year for 21 and 28 years, respectively). More importantly, these results have two key implications regarding how changes in spring phenology are impacting carbon uptake at the landscape scale. First, long-term trends in spring phenology can be quite different depending on what data source is used to estimate the trend. Second, the response of carbon uptake to climate change may be more sensitive than the response of land surface phenology itself.
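
    The trends quoted above (e.g. -0.42 and -0.28 days per year) are ordinary least-squares slopes of transition date against year. Below is a minimal sketch of that computation; the series is synthetic, not Harvard Forest data.

    ```python
    import numpy as np

    # Hypothetical spring-onset series: day of year (DOY) of leaf emergence.
    years = np.arange(1990, 2018)
    doy = 130 - 0.4 * (years - 1990) + np.random.default_rng(0).normal(0, 3, years.size)

    # Least-squares slope in days per year; negative values mean earlier springs,
    # comparable in form to the trends quoted in the abstract.
    slope, intercept = np.polyfit(years, doy, 1)
    print(f"trend: {slope:+.2f} days per year")
    ```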

  4. Flow diagram analysis of electrical fatalities in construction industry.

    PubMed

    Chi, Chia-Fen; Lin, Yuan-Yuan; Ikhwan, Mohamad

    2012-01-01

    The current study reanalyzed 250 electrical fatalities in the construction industry from 1996 to 2002, classifying them into seven patterns based on the source of electricity (power line, energized equipment, improperly installed or damaged equipment) and on direct contact or indirect contact through some source of injury (boom vehicle, metal bar or pipe, and other conductive material). Each fatality was coded in terms of age, company size, experience, task being performed, source of injury, accident cause, and hazard pattern. The Chi-square Automatic Interaction Detector (CHAID) was applied to the coded data on fatal electrocutions to find a subset of predictors that might yield meaningful classifications or accident scenarios. A series of flow diagrams was constructed based on the CHAID results to illustrate the flow of electricity travelling from the electrical source to the human body. Each of the flow diagrams can be directly linked with feasible prevention strategies that cut the flow of electricity.

  5. Radiological analysis of plutonium glass batches with natural/enriched boron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rainisch, R.

    2000-06-22

    The disposition of surplus plutonium inventories by the US Department of Energy (DOE) includes the immobilization of certain plutonium materials in a borosilicate glass matrix, also referred to as vitrification. This paper addresses source terms of plutonium masses immobilized in a borosilicate glass matrix where the glass components include both natural boron and enriched boron. The calculated source terms pertain to neutron and gamma source strength (particles per second) and source spectrum changes. The calculated source terms corresponding to natural boron and enriched boron are compared to determine the benefits (decrease in radiation source terms) of using enriched boron. The analysis of plutonium glass source terms shows that a large component of the neutron source terms is due to (α,n) reactions. The Americium-241 and plutonium present in the glass emit alpha particles (α). These alpha particles interact with low-Z nuclides like B-11, B-10, and O-17 in the glass to produce neutrons. The low-Z nuclides are referred to as target particles. The reference glass contains 9.4 wt percent B2O3. Boron-11 was found to strongly support the (α,n) reactions in the glass matrix. B-11 has a natural abundance of over 80 percent. The (α,n) reaction rates for B-10 are lower than for B-11, and the analysis shows that the plutonium glass neutron source terms can be reduced by artificially enriching natural boron with B-10. The natural abundance of B-10 is 19.9 percent. Boron enriched to 96 wt percent B-10 or above can be obtained commercially. Since lower source terms imply lower dose rates to radiation workers handling the plutonium glass materials, it is important to know the achievable decrease in source terms as a result of boron enrichment. Plutonium materials are normally handled in glove boxes with shielded glass windows, and the work entails both extremity and whole-body exposures. Lowering the source terms of the plutonium batches will make the handling of these materials less difficult and will reduce radiation exposure to operating workers.

  6. Full range line-field parallel swept source imaging utilizing digital refocusing

    NASA Astrophysics Data System (ADS)

    Fechtig, Daniel J.; Kumar, Abhishek; Drexler, Wolfgang; Leitgeb, Rainer A.

    2015-12-01

    We present geometric optics-based refocusing applied to a novel off-axis line-field parallel swept source imaging (LPSI) system. LPSI is an imaging modality based on line-field swept source optical coherence tomography, which permits 3-D imaging at acquisition speeds of up to 1 MHz. The digital refocusing algorithm applies a defocus-correcting phase term to the Fourier representation of complex-valued interferometric image data, which is based on the geometrical optics information of the LPSI system. We introduce the off-axis LPSI system configuration, the digital refocusing algorithm and demonstrate the effectiveness of our method for refocusing volumetric images of technical and biological samples. An increase of effective in-focus depth range from 255 μm to 4.7 mm is achieved. The recovery of the full in-focus depth range might be especially valuable for future high-speed and high-resolution diagnostic applications of LPSI in ophthalmology.
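
    Digital refocusing of this kind is typically a quadratic phase factor applied in the spatial-frequency domain of the complex field. The sketch below implements a generic Fresnel-propagator correction under assumed wavelength, pixel pitch, and sign conventions; it is not the LPSI system's exact kernel.

    ```python
    import numpy as np

    def refocus(field, wavelength, dz, dx):
        """Apply a quadratic defocus-correcting phase term in the Fourier domain.

        Generic Fresnel-propagator sketch: field is a complex 2-D interferometric
        image, dz the defocus distance, dx the pixel pitch. The sign of dz
        depends on the optical layout.
        """
        ny, nx = field.shape
        fx = np.fft.fftfreq(nx, d=dx)
        fy = np.fft.fftfreq(ny, d=dx)
        FX, FY = np.meshgrid(fx, fy)
        phase = np.exp(1j * np.pi * wavelength * dz * (FX**2 + FY**2))
        return np.fft.ifft2(np.fft.fft2(field) * phase)

    # Example: refocus a complex field by 1 mm at 1050 nm with 10 um pixels
    # (all values assumed for illustration).
    rng = np.random.default_rng(1)
    field = rng.standard_normal((256, 256)) + 1j * rng.standard_normal((256, 256))
    refocused = refocus(field, wavelength=1050e-9, dz=1e-3, dx=10e-6)
    ```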

  7. Phase-and-amplitude recovery from a single phase-contrast image using partially spatially coherent x-ray radiation

    NASA Astrophysics Data System (ADS)

    Beltran, Mario A.; Paganin, David M.; Pelliccia, Daniele

    2018-05-01

    A simple method of phase-and-amplitude extraction is derived that corrects for image blurring induced by partially spatially coherent incident illumination using only a single intensity image as input. The method is based on Fresnel diffraction theory for the case of high Fresnel number, merged with the space-frequency description formalism used to quantify partially coherent fields and assumes the object under study is composed of a single-material. A priori knowledge of the object’s complex refractive index and information obtained by characterizing the spatial coherence of the source is required. The algorithm was applied to propagation-based phase-contrast data measured with a laboratory-based micro-focus x-ray source. The blurring due to the finite spatial extent of the source is embedded within the algorithm as a simple correction term to the so-called Paganin algorithm and is also numerically stable in the presence of noise.
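
    For context, the Paganin algorithm that the correction term modifies can be sketched as a single low-pass filter followed by a logarithm. The version below is the baseline single-material filter only; the paper's source-blur correction term, which enters the same denominator, is omitted, and all parameters are assumptions.

    ```python
    import numpy as np

    def paganin_thickness(I, I0, delta, mu, z, dx):
        """Single-image phase retrieval for a single-material object.

        Baseline Paganin filter (no source-size correction). I: measured
        intensity, I0: flat field, delta: refractive index decrement,
        mu: linear attenuation coefficient, z: propagation distance,
        dx: pixel size.
        """
        ny, nx = I.shape
        fx = np.fft.fftfreq(nx, d=dx)
        fy = np.fft.fftfreq(ny, d=dx)
        FX, FY = np.meshgrid(fx, fy)
        k2 = (2 * np.pi) ** 2 * (FX**2 + FY**2)   # squared spatial frequency
        filt = 1.0 + (delta * z / mu) * k2        # low-pass Paganin denominator
        filtered = np.fft.ifft2(np.fft.fft2(I / I0) / filt).real
        return -np.log(np.clip(filtered, 1e-8, None)) / mu  # projected thickness
    ```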

  8. A Monte Carlo simulation study for the gamma-ray/neutron dual-particle imager using rotational modulation collimator (RMC).

    PubMed

    Kim, Hyun Suk; Choi, Hong Yeop; Lee, Gyemin; Ye, Sung-Joon; Smith, Martin B; Kim, Geehyun

    2018-03-01

    The aim of this work is to develop a gamma-ray/neutron dual-particle imager, based on rotational modulation collimators (RMCs) and pulse shape discrimination (PSD)-capable scintillators, for possible applications for radioactivity monitoring as well as nuclear security and safeguards. A Monte Carlo simulation study was performed to design an RMC system for the dual-particle imaging, and modulation patterns were obtained for gamma-ray and neutron sources in various configurations. We applied an image reconstruction algorithm utilizing the maximum-likelihood expectation-maximization method based on the analytical modeling of source-detector configurations, to the Monte Carlo simulation results. Both gamma-ray and neutron source distributions were reconstructed and evaluated in terms of signal-to-noise ratio, showing the viability of developing an RMC-based gamma-ray/neutron dual-particle imager using PSD-capable scintillators.
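
    A maximum-likelihood expectation-maximization (MLEM) update of the kind referred to above can be written in a few lines once a system matrix mapping source pixels to modulated detector counts is available. The sketch below is generic MLEM with a placeholder system matrix, not the authors' analytical RMC model.

    ```python
    import numpy as np

    def mlem(A, y, n_iter=50):
        """MLEM image reconstruction.

        A: (measurements x pixels) system matrix modelling the RMC modulation,
        y: measured counts. The system matrix here is a generic placeholder;
        in the paper it is built analytically from source-detector geometry.
        """
        x = np.ones(A.shape[1])              # flat initial source image
        sens = A.sum(axis=0)                 # sensitivity (back-projected ones)
        for _ in range(n_iter):
            proj = A @ x                     # forward-project current estimate
            ratio = y / np.clip(proj, 1e-12, None)
            x *= (A.T @ ratio) / np.clip(sens, 1e-12, None)
        return x
    ```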

  9. A Numerical Experiment on the Role of Surface Shear Stress in the Generation of Sound

    NASA Technical Reports Server (NTRS)

    Shariff, Karim; Wang, Meng; Merriam, Marshal (Technical Monitor)

    1996-01-01

    The sound generated due to a localized flow over an infinite flat surface is considered. It is known that the unsteady surface pressure, while appearing in a formal solution to the Lighthill equation, does not constitute a source of sound but rather represents the effect of image quadrupoles. The question of whether a similar surface shear stress term constitutes a true source of dipole sound is less settled. Some have boldly assumed it is a true source while others have argued that, like the surface pressure, it depends on the sound field (via an acoustic boundary layer) and is therefore not a true source. A numerical experiment based on the viscous, compressible Navier-Stokes equations was undertaken to investigate the issue. A small region of a wall was oscillated tangentially. The directly computed sound field was found to agree with an acoustic analogy based calculation which regards the surface shear as an acoustically compact dipole source of sound.

  10. Information Communication using Knowledge Engine on Flood Issues

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.

    2012-04-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to and visualization of flood inundation maps, real-time flood conditions, short-term and seasonal flood forecasts, and other flood-related data for communities in Iowa. The system is designed for use by the general public, often people with no domain knowledge and poor general science background. To improve communication with such an audience, we have introduced a new way to get information on flood-related issues in IFIS: instead of navigating among hundreds of features and interfaces of the information system and web-based sources, users obtain answers through dynamic computations based on a collection of built-in data, analyses, and methods. The IFIS Knowledge Engine connects to distributed sources of real-time stream gauges, in-house data sources, and analysis and visualization tools to answer questions grouped into several categories. Users are able to provide input for queries within the categories of rainfall, flood conditions, forecast, inundation maps, flood risk and data sensors. Our goal is the systematization of knowledge on flood-related issues, providing a single source for definitive answers to factual queries. The long-term goal of this knowledge engine is to make all flood-related knowledge easily accessible to everyone and to provide an educational geoinformatics tool. A future implementation of the system will accept free-form input and offer voice recognition capabilities within browser and mobile applications. We intend to deliver increasing capabilities for the system over the coming releases of IFIS. This presentation provides an overview of our Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses future plans for providing knowledge on flood-related issues and resources.

  11. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Metal Fuel Radionuclide Release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-02-01

    The development of an accurate and defensible mechanistic source term will be vital for the future licensing efforts of metal fuel, pool-type sodium fast reactors. To assist in the creation of a comprehensive mechanistic source term, the current effort sought to estimate the release fraction of radionuclides from metal fuel pins to the primary sodium coolant during fuel pin failures at a variety of temperature conditions. These release estimates were based on the findings of an extensive literature search, which reviewed past experimentation and reactor fuel damage accidents. Data sources for each radionuclide of interest were reviewed to establish release fractions, along with possible release dependencies, and the corresponding uncertainty levels. Although the current knowledge base is substantial, and radionuclide release fractions were established for the elements deemed important for the determination of offsite consequences following a reactor accident, gaps were found pertaining to several radionuclides. First, there is uncertainty regarding the transport behavior of several radionuclides (iodine, barium, strontium, tellurium, and europium) during metal fuel irradiation to high burnup levels. The migration of these radionuclides within the fuel matrix and bond sodium region can greatly affect their release during pin failure incidents. Post-irradiation examination of existing high burnup metal fuel can likely resolve this knowledge gap. Second, data regarding the radionuclide release from molten high burnup metal fuel in sodium is sparse, which makes the assessment of radionuclide release from fuel melting accidents at high fuel burnup levels difficult. This gap could be addressed through fuel melting experimentation with samples from the existing high burnup metal fuel inventory.

  12. Analysis of Unmanned Systems in Military Logistics

    DTIC Science & Technology

    2016-12-01

    [Extraction fragments only; the full abstract is not recoverable. Abstract fragment: "...opportunities to employ unmanned systems to support logistic operations." Subject terms: unmanned systems, robotics, UAVs, UGVs, USVs, UUVs, military logistics. Recoverable figure/section titles: Industrial Robots at Warehouses/Distribution Centers; Autonomous Robot Gun Turret (Source: Blain, 2010); Robot Sentries for Base Patrol.]

  13. Semantic Annotations and Querying of Web Data Sources

    NASA Astrophysics Data System (ADS)

    Hornung, Thomas; May, Wolfgang

    A large part of the Web, actually holding a significant portion of the useful information throughout the Web, consists of views on hidden databases, provided by numerous heterogeneous interfaces that are partly human-oriented via Web forms ("Deep Web") and partly based on Web Services (only machine accessible). In this paper we present an approach for annotating these sources in a way that makes them citizens of the Semantic Web. We illustrate how queries can be stated in terms of the ontology, and how the annotations are used to select and access appropriate sources and to answer the queries.

  14. Conglomeration or Chameleon? Teachers' Representations of Language in the Assessment of Learners with English as an Additional Language.

    ERIC Educational Resources Information Center

    Gardner, Sheena; Rea-Dickins, Pauline

    2001-01-01

    Investigates teacher representations of language in relation to assessment contexts. Analyzes not only what is represented in teachers' use of metalanguage, but also how it is presented--in terms of expression, voice, and source. The analysis is based on interviews with teachers, transcripts of lessons, and classroom-based assessments, formal…

  15. A goal-based angular adaptivity method for thermal radiation modelling in non grey media

    NASA Astrophysics Data System (ADS)

    Soucasse, Laurent; Dargaville, Steven; Buchan, Andrew G.; Pain, Christopher C.

    2017-10-01

    This paper investigates for the first time a goal-based angular adaptivity method for thermal radiation transport, suitable for non grey media when the radiation field is coupled with an unsteady flow field through an energy balance. Anisotropic angular adaptivity is achieved by using a Haar wavelet finite element expansion that forms a hierarchical angular basis with compact support and does not require any angular interpolation in space. The novelty of this work lies in (1) the definition of a target functional to compute the goal-based error measure equal to the radiative source term of the energy balance, which is the quantity of interest in the context of coupled flow-radiation calculations; (2) the use of different optimal angular resolutions for each absorption coefficient class, built from a global model of the radiative properties of the medium. The accuracy and efficiency of the goal-based angular adaptivity method is assessed in a coupled flow-radiation problem relevant for air pollution modelling in street canyons. Compared to a uniform Haar wavelet expansion, the adapted resolution uses 5 times fewer angular basis functions and is 6.5 times quicker, given the same accuracy in the radiative source term.

  16. Environmental performance of bio-based and biodegradable plastics: the road ahead.

    PubMed

    Lambert, Scott; Wagner, Martin

    2017-11-13

    Future plastic materials will be very different from those that are used today. The increasing importance of sustainability promotes the development of bio-based and biodegradable polymers, sometimes misleadingly referred to as 'bioplastics'. Because both terms imply "green" sources and "clean" removal, this paper aims at critically discussing the sometimes-conflicting terminology as well as renewable sources, with a special focus on the degradation of these polymers in natural environments. With regard to the former, we review innovations in feedstock development (e.g. microalgae and food wastes). In terms of the latter, we highlight the effects that polymer structure, additives, and environmental variables have on plastic biodegradability. We argue that a 'biodegradable' end-product does not necessarily degrade once emitted to the environment, because the chemical additives used to make it fit for purpose will increase its longevity. In the future, this trend may continue as the plastics industry is also expected to be a major user of nanocomposites. Overall, there is a need to assess the performance of polymer innovations in terms of their biodegradability, especially under realistic waste management and environmental conditions, to avoid the unwanted release of plastic degradation products into receiving environments.

  17. The importance of quadrupole sources in prediction of transonic tip speed propeller noise

    NASA Technical Reports Server (NTRS)

    Hanson, D. B.; Fink, M. R.

    1978-01-01

    A theoretical analysis is presented for the harmonic noise of high speed, open rotors. Far field acoustic radiation equations based on the Ffowcs Williams-Hawkings theory are derived for a static rotor with thin blades and zero lift. Near the plane of rotation, the dominant sources are the volume displacement and the ρu² quadrupole, where u is the disturbance velocity component in the direction of blade motion. These sources are compared in both the time domain and the frequency domain using two-dimensional airfoil theories valid in the subsonic, transonic, and supersonic speed ranges. For nonlifting parabolic arc blades, the two sources are equally important at speeds between the section critical Mach number and a Mach number of one. However, for moderately subsonic or fully supersonic flow over thin blade sections, the quadrupole term is negligible. It is concluded for thin blades that significant quadrupole noise radiation is strictly a transonic phenomenon and that it can be suppressed with blade sweep. Noise calculations are presented for two rotors, one simulating a helicopter main rotor and the other a model propeller. For the latter, agreement with test data was substantially improved by including the quadrupole source term.

  18. Hybrid BEM/empirical approach for scattering of correlated sources in rocket noise prediction

    NASA Astrophysics Data System (ADS)

    Barbarino, Mattia; Adamo, Francesco P.; Bianco, Davide; Bartoccini, Daniele

    2017-09-01

    Empirical models such as the Eldred standard model are commonly used for rocket noise prediction. Such models directly provide a definition of the Sound Pressure Level through the quadratic pressure term of uncorrelated sources. In this paper, an improvement of the Eldred standard model has been formulated. This new formulation contains an explicit expression for the acoustic pressure of each noise source, in terms of amplitude and phase, in order to investigate source correlation effects and to propagate them through a wave equation. In particular, the correlation effects between adjacent and non-adjacent sources have been modeled and analyzed. The noise prediction obtained with the revised Eldred-based model has then been used to formulate an empirical/BEM (Boundary Element Method) hybrid approach that allows an evaluation of the scattering effects. In the framework of the European Space Agency funded programme VECEP (VEga Consolidation and Evolution Programme), these models have been applied to the prediction of the aeroacoustic loads of the VEGA (Vettore Europeo di Generazione Avanzata - Advanced Generation European Carrier Rocket) launch vehicle at lift-off, and the results have been compared with experimental data.
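
    The core change described above, carrying an amplitude and phase for each source so that correlation effects survive the summation, can be illustrated by comparing a coherent sum of point sources with the uncorrelated (RMS) sum that Eldred-type models effectively use. All amplitudes, phases, and geometry below are invented for illustration.

    ```python
    import numpy as np

    # Sketch: sound pressure from N line-distributed noise sources, each with
    # amplitude A_n and phase phi_n, summed coherently at an observer point.
    c, f = 340.0, 200.0                        # speed of sound [m/s], frequency [Hz]
    k = 2 * np.pi * f / c                      # acoustic wavenumber
    src_z = np.linspace(0.0, 10.0, 8)          # source positions along the plume [m]
    A = np.exp(-0.2 * src_z)                   # decaying source amplitudes (assumed)
    phi = 0.5 * np.pi * np.arange(src_z.size)  # assumed inter-source phase lags
    obs = np.array([30.0, 0.0, 5.0])           # observer location [m]

    r = np.sqrt(obs[0]**2 + obs[1]**2 + (obs[2] - src_z)**2)
    p = np.sum(A * np.exp(1j * (phi - k * r)) / r)   # coherent (correlated) sum
    p_unc = np.sqrt(np.sum((A / r) ** 2))            # uncorrelated RMS sum
    print(f"correlated SPL {20*np.log10(abs(p)/20e-6):.1f} dB "
          f"vs uncorrelated {20*np.log10(p_unc/20e-6):.1f} dB")
    ```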

  19. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  20. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  1. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  2. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  3. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  4. Use of Online Sources of Information by Dental Practitioners: Findings from The Dental Practice-Based Research Network

    PubMed Central

    Funkhouser, Ellen; Agee, Bonita S.; Gordan, Valeria V.; Rindal, D. Brad; Fellows, Jeffrey L.; Qvist, Vibeke; McClelland, Jocelyn; Gilbert, Gregg H.

    2013-01-01

    Objectives: Estimate the proportion of dental practitioners who use online sources of information for practice guidance. Methods: From a survey of 657 dental practitioners in The Dental Practice-Based Research Network, four indicators of online use for practice guidance were calculated: read journals online, obtained continuing dental education (CDE) through online sources, rated an online source as most influential, and reported frequently using an online source for guidance. Demographics, journals read, and use of various sources of information for practice guidance in terms of frequency and influence were ascertained for each. Results: Overall, 21% (n=138) were classified into one of the four indicators of online use: 14% (n=89) rated an online source as most influential and 13% (n=87) reported frequently using an online source for guidance; few practitioners (5%, n=34) read journals online, and fewer (3%, n=17) obtained CDE through online sources. Use of online information sources varied considerably by region and practice characteristics. In general, the four indicators represented practitioners with as many differences as similarities to each other and to offline users. Conclusion: A relatively small proportion of dental practitioners use information from online sources for practice guidance. Variation exists regarding practitioners' use of online resources and how they rate the value of offline information sources for practice guidance. PMID:22994848

  5. Inverse modelling-based reconstruction of the Chernobyl source term available for long-range transport

    NASA Astrophysics Data System (ADS)

    Davoine, X.; Bocquet, M.

    2007-03-01

    The reconstruction of the Chernobyl accident source term has previously been carried out using core inventories, but also back-and-forth comparisons between model simulations and activity concentration or deposited activity measurements. The approach presented in this paper is based on inverse modelling techniques. It relies both on the activity concentration measurements and on the adjoint of a chemistry-transport model. The location of the release is assumed to be known, and one is looking for a source term available for long-range transport that depends both on time and altitude. The method relies on the maximum entropy on the mean principle and exploits source positivity. The inversion results are mainly sensitive to two tuning parameters, a mass scale and the scale of the prior errors in the inversion. To overcome this difficulty, we resort to the statistical L-curve method to estimate balanced values for these two parameters. Once this is done, many of the retrieved features of the source are robust within a reasonable range of parameter values. Our results favour the acknowledged three-step scenario, with a strong initial release (26 to 27 April), followed by a weak emission period of four days (28 April-1 May) and again a release, longer but less intense than the initial one (2 May-6 May). The retrieved quantities of iodine-131, caesium-134 and caesium-137 that were released are in good agreement with the latest reported estimations. Yet, a stronger apportionment of the total released activity is ascribed to the first period and less to the third one. Finer chronological details are obtained, such as a sequence of eruptive episodes in the first two days, likely related to the modulation of the boundary layer diurnal cycle. In addition, the first two-day release surges are found to have effectively reached an altitude up to the top of the domain (5000 m).
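
    As a simplified stand-in for the inversion described above (which uses maximum entropy on the mean rather than Tikhonov regularization), the sketch below shows how source positivity can be enforced and how the L-curve trades the residual norm against the solution norm while scanning a regularization weight. The sensitivities and observations are synthetic.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)
    A = rng.random((80, 24))                   # adjoint-model sensitivities (assumed)
    x_true = np.zeros(24)
    x_true[2:6], x_true[14:20] = 5.0, 2.0      # synthetic two-pulse source term
    y = A @ x_true + rng.normal(0, 0.05, 80)   # synthetic observations

    alphas = np.logspace(-3, 2, 30)
    res_norm, sol_norm = [], []
    for a in alphas:
        # Tikhonov term folded into a non-negative least-squares solve.
        A_aug = np.vstack([A, np.sqrt(a) * np.eye(24)])
        y_aug = np.concatenate([y, np.zeros(24)])
        x, _ = nnls(A_aug, y_aug)              # positivity enforced
        res_norm.append(np.linalg.norm(A @ x - y))
        sol_norm.append(np.linalg.norm(x))
    # The L-curve corner (maximum curvature of log res_norm vs log sol_norm)
    # identifies a balanced regularization weight.
    ```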

  6. Maximizing the spatial representativeness of NO2 monitoring data using a combination of local wind-based sectoral division and seasonal and diurnal correction factors.

    PubMed

    Donnelly, Aoife; Naughton, Owen; Misstear, Bruce; Broderick, Brian

    2016-10-14

    This article describes a new methodology for increasing the spatial representativeness of individual monitoring sites. Air pollution levels at a given point are influenced by emission sources in the immediate vicinity. Since emission sources are rarely uniformly distributed around a site, concentration levels will inevitably be most affected by the sources in the prevailing upwind direction. The methodology provides a means of capturing this effect and providing additional information regarding source/pollution relationships. The methodology allows for the division of the air quality data from a given monitoring site into a number of sectors or wedges based on wind direction and estimation of annual mean values for each sector, thus optimising the information that can be obtained from a single monitoring station. The method corrects for short-term data, diurnal and seasonal variations in concentrations (which can produce uneven weighting of data within each sector) and uneven frequency of wind directions. Significant improvements in correlations between the air quality data and the spatial air quality indicators were obtained after application of the correction factors. This suggests the application of these techniques would be of significant benefit in land-use regression modelling studies. Furthermore, the method was found to be very useful for estimating long-term mean values and wind direction sector values using only short-term monitoring data. The methods presented in this article can result in cost savings through minimising the number of monitoring sites required for air quality studies while also capturing a greater degree of variability in spatial characteristics. In this way, more reliable, but also more expensive monitoring techniques can be used in preference to a higher number of low-cost but less reliable techniques. The methods described in this article have applications in local air quality management, source receptor analysis, land-use regression mapping and modelling and population exposure studies.
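
    The sectoral-division idea can be sketched as a groupby over wind-direction wedges after re-weighting each observation by diurnal and seasonal correction factors (overall mean divided by the conditional mean). Column names, the eight-sector split, and the data below are assumptions, not the paper's exact procedure.

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    idx = pd.date_range("2015-01-01", periods=8760, freq="h")
    df = pd.DataFrame({
        "no2": rng.gamma(4, 5, idx.size),        # synthetic NO2 concentrations
        "wdir": rng.uniform(0, 360, idx.size),   # synthetic wind directions
    }, index=idx)

    # Diurnal and seasonal correction factors: overall mean / conditional mean,
    # so each observation is re-weighted before sector averaging.
    hour_f = df["no2"].mean() / df.groupby(df.index.hour)["no2"].transform("mean")
    month_f = df["no2"].mean() / df.groupby(df.index.month)["no2"].transform("mean")
    df["no2_corr"] = df["no2"] * hour_f * month_f

    df["sector"] = (df["wdir"] // 45).astype(int)   # eight 45-degree wedges
    print(df.groupby("sector")["no2_corr"].mean())  # per-sector annual means
    ```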

  7. Prospective Surface Marker-Based Isolation and Expansion of Fetal Endothelial Colony-Forming Cells From Human Term Placenta

    PubMed Central

    Patel, Jatin; Seppanen, Elke; Chong, Mark S.K.; Yeo, Julie S.L.; Teo, Erin Y.L.; Chan, Jerry K.Y.; Fisk, Nicholas M.

    2013-01-01

    The term placenta is a highly vascularized tissue and is usually discarded upon birth. Our objective was to isolate clinically relevant quantities of fetal endothelial colony-forming cells (ECFCs) from human term placenta and to compare them to the well-established donor-matched umbilical cord blood (UCB)-derived ECFCs. A sorting strategy was devised to enrich for CD45−CD34+CD31Lo cells prior to primary plating to obtain pure placental ECFCs (PL-ECFCs) upon culture. UCB-ECFCs were derived using a well-described assay. PL-ECFCs were fetal in origin and expressed the same cell surface markers as UCB-ECFCs. Most importantly, a single term placenta could yield as many ECFCs as 27 UCB donors. PL-ECFCs and UCB-ECFCs had similar in vitro and in vivo vessel forming capacities and restored mouse hind limb ischemia in similar proportions. Gene expression profiles were only minimally divergent between PL-ECFCs and UCB-ECFCs, probably reflecting a vascular source versus a circulating source. Finally, PL-ECFCs and UCB-ECFCs displayed similar hierarchies between high and low proliferative colonies. We report a robust strategy to isolate ECFCs from human term placentas based on their cell surface expression. This yielded much larger quantities of ECFCs than UCB, but the cells were comparable in immunophenotype, gene expression, and in vivo functional ability. We conclude that PL-ECFCs have significant bio-banking and clinical translatability potential. PMID:24106336

  8. Fermi large area telescope second source catalog

    DOE PAGES

    Nolan, P. L.; Abdo, A. A.; Ackermann, M.; ...

    2012-03-28

    Here, we present the second catalog of high-energy γ-ray sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi), derived from data taken during the first 24 months of the science phase of the mission, which began on 2008 August 4. Source detection is based on the average flux over the 24 month period. The second Fermi-LAT catalog (2FGL) includes source location regions, defined in terms of elliptical fits to the 95% confidence regions and spectral fits in terms of power-law, exponentially cutoff power-law, or log-normal forms. Also included are flux measurements in five energy bands and light curves on monthly intervals for each source. Twelve sources in the catalog are modeled as spatially extended. Furthermore, we provide a detailed comparison of the results from this catalog with those from the first Fermi-LAT catalog (1FGL). Although the diffuse Galactic and isotropic models used in the 2FGL analysis are improved compared to the 1FGL catalog, we attach caution flags to 162 of the sources to indicate possible confusion with residual imperfections in the diffuse model. Finally, the 2FGL catalog contains 1873 sources detected and characterized in the 100 MeV to 100 GeV range, of which we consider 127 as being firmly identified and 1171 as being reliably associated with counterparts of known or likely γ-ray-producing source classes.

  9. FERMI LARGE AREA TELESCOPE SECOND SOURCE CATALOG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nolan, P. L.; Ajello, M.; Allafort, A.

    We present the second catalog of high-energy γ-ray sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi), derived from data taken during the first 24 months of the science phase of the mission, which began on 2008 August 4. Source detection is based on the average flux over the 24 month period. The second Fermi-LAT catalog (2FGL) includes source location regions, defined in terms of elliptical fits to the 95% confidence regions and spectral fits in terms of power-law, exponentially cutoff power-law, or log-normal forms. Also included are flux measurements in five energy bands and light curves on monthly intervals for each source. Twelve sources in the catalog are modeled as spatially extended. We provide a detailed comparison of the results from this catalog with those from the first Fermi-LAT catalog (1FGL). Although the diffuse Galactic and isotropic models used in the 2FGL analysis are improved compared to the 1FGL catalog, we attach caution flags to 162 of the sources to indicate possible confusion with residual imperfections in the diffuse model. The 2FGL catalog contains 1873 sources detected and characterized in the 100 MeV to 100 GeV range, of which we consider 127 as being firmly identified and 1171 as being reliably associated with counterparts of known or likely γ-ray-producing source classes.

  10. Biomarkers and isotopic fingerprinting to track sediment origin and connectivity at Baldegg Lake (Switzerland)

    NASA Astrophysics Data System (ADS)

    Lavrieux, Marlène; Meusburger, Katrin; Birkholz, Axel; Alewell, Christine

    2017-04-01

    Slope destabilization and associated sediment transfer are among the major causes of impairment of aquatic ecosystems and surface water quality. Through land use and agricultural practices, human activities modify the soil erosion risk and the catchment connectivity, becoming a key factor in sediment dynamics. Hence, restoration and management plans for water bodies can only be efficient if the sediment sources, and the proportions attributable to different land uses and agricultural practices, are identified. Several sediment fingerprinting methods, based on geochemical (elemental composition), color, magnetic or isotopic (137Cs) sediment properties, are currently in use. However, these tools are not suitable for land-use-based fingerprinting. Two organic geochemical approaches have recently been developed to discriminate source-soil contributions under different land uses: (1) the compound-specific stable isotope (CSSI) technique, based on the variability of biomarker isotopic signatures (here, fatty acid δ13C) among plant species, and (2) the analysis of highly specific (i.e., source-family- or even source-species-specific) biomarker assemblages, whose use has so far been mainly restricted to palaeoenvironmental reconstructions but which also offers promising prospects for tracing present-day sediment origin. The approach was applied to reconstruct the spatio-temporal variability of the main sediment sources of Baldegg Lake (Canton of Lucerne, Switzerland), which suffers from substantial eutrophication despite several restoration attempts during the last 40 years. The sediment-supplying areas and the exported volumes were identified using the CSSI technique and highly specific biomarkers, coupled to a sediment connectivity model. The variability of sediment origin was characterized through the analysis of suspended river sediments sampled at high-flow conditions (short term) and through the analysis of a lake sediment core covering the last 130 years (long term). The results show the utility of biomarkers and CSSI to track organic sources in contrasting land-use settings. Combined with other fingerprinting methods, this approach could in the future become a decision support tool for catchment management.

  11. Logic-based assessment of the compatibility of UMLS ontology sources

    PubMed Central

    2011-01-01

    Background The UMLS Metathesaurus (UMLS-Meta) is currently the most comprehensive effort for integrating independently-developed medical thesauri and ontologies. UMLS-Meta is being used in many applications, including PubMed and ClinicalTrials.gov. The integration of new sources combines automatic techniques, expert assessment, and auditing protocols. The automatic techniques currently in use, however, are mostly based on lexical algorithms and often disregard the semantics of the sources being integrated. Results In this paper, we argue that UMLS-Meta’s current design and auditing methodologies could be significantly enhanced by taking into account the logic-based semantics of the ontology sources. We provide empirical evidence suggesting that UMLS-Meta in its 2009AA version contains a significant number of errors; these errors become immediately apparent if the rich semantics of the ontology sources is taken into account, manifesting themselves as unintended logical consequences that follow from the ontology sources together with the information in UMLS-Meta. We then propose general principles and specific logic-based techniques to effectively detect and repair such errors. Conclusions Our results suggest that the methodologies employed in the design of UMLS-Meta are not only very costly in terms of human effort, but also error-prone. The techniques presented here can be useful for both reducing human effort in the design and maintenance of UMLS-Meta and improving the quality of its contents. PMID:21388571

  12. The use of mud as an alternative source for bioelectricity using microbial fuel cells

    NASA Astrophysics Data System (ADS)

    Darmawan, Raden; Widjaja, Arief; Juliastuti, Sri Rachmania; Hendrianie, Nuniek; Hidaya, Chanifah; Sari, Dessy Rosita; Suwito; Morimura, Shigeru; Tominaga, Masato

    2017-05-01

    Alternative energy sources to substitute for fossil-based energy are needed, as fossil energy reserves are decreasing every day. Mud is considered economical as a source material for generating electricity, as it can be found easily and abundantly in Indonesia. The large quantities of mud containing organic material have great potential as a source of electrical energy using microbial fuel cells (MFCs), a promising technology that degrades organic compounds to yield sustainable energy. Different sampling sites were chosen to determine the electricity production, i.e. mud from soil water, brackish water and sea water, using an immersed anode of 10 cm². The results show that the electricity generation for the three areas is 0.331, 0.327 and 0.398 V (in terms of voltage) and 0.221, 0.050 and 0.325 mA (in terms of electric current), respectively. The mud obtained from sea water exhibits the highest power potential compared to that obtained from brackish and soil water.
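
    As a quick check on the reported figures, electrical power follows directly from P = V × I for each site:

    ```python
    # Worked check of the reported measurements: P = V * I per mud source.
    volts = {"soil": 0.331, "brackish": 0.327, "sea": 0.398}   # measured [V]
    amps = {"soil": 0.221e-3, "brackish": 0.050e-3, "sea": 0.325e-3}  # [A]

    for site in volts:
        power_mw = volts[site] * amps[site] * 1e3   # power in mW
        print(f"{site:8s}: {power_mw:.3f} mW")
    # Sea-water mud: 0.398 V x 0.325 mA ≈ 0.129 mW, the highest, as stated above.
    ```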

  13. Multi-Decadal Variation of Aerosols: Sources, Transport, and Climate Effects

    NASA Technical Reports Server (NTRS)

    Chin, Mian; Diehl, Thomas; Bian, Huisheng; Streets, David

    2008-01-01

    We present a global model study of multi-decadal changes of atmospheric aerosols and their climate effects using a global chemistry transport model along with near-term to long-term data records. We focus on a 27-year time period of the satellite era from 1980 to 2006, during which a suite of aerosol data from satellite observations, ground-based measurements, and intensive field experiments has become available. We will use the Goddard Chemistry Aerosol Radiation and Transport (GOCART) model, which involves a time-varying, comprehensive global emission dataset that we put together in our previous investigations and will improve and extend in this project. This global emission dataset includes emissions of aerosols and their precursors from fuel combustion, biomass burning, volcanic eruptions, and other sources from 1980 to the present. Using the model and satellite data, we will analyze (1) the long-term global and regional aerosol trends and their relationship to the changes of aerosol and precursor emissions from anthropogenic and natural sources, and (2) the intercontinental source-receptor relationships controlled by emission, transport pathway, and climate variability.

  14. A Comparison of Mathematical Models of Fish Mercury Concentration as a Function of Atmospheric Mercury Deposition Rate and Watershed Characteristics

    NASA Astrophysics Data System (ADS)

    Smith, R. A.; Moore, R. B.; Shanley, J. B.; Miller, E. K.; Kamman, N. C.; Nacci, D.

    2009-12-01

    Mercury (Hg) concentrations in fish and aquatic wildlife are complex functions of atmospheric Hg deposition rate, terrestrial and aquatic watershed characteristics that influence Hg methylation and export, and food chain characteristics determining Hg bioaccumulation. Because of the complexity and incomplete understanding of these processes, regional-scale models of fish tissue Hg concentration are necessarily empirical in nature, typically constructed through regression analysis of fish tissue Hg concentration data from many sampling locations on a set of potential explanatory variables. Unless the data sets are unusually long and show clear time trends, the empirical basis for model building must be based solely on spatial correlation. Predictive regional scale models are highly useful for improving understanding of the relevant biogeochemical processes, as well as for practical fish and wildlife management and human health protection. Mechanistically, the logical arrangement of explanatory variables is to multiply each of the individual Hg source terms (e.g. dry, wet, and gaseous deposition rates, and residual watershed Hg) for a given fish sampling location by source-specific terms pertaining to methylation, watershed transport, and biological uptake for that location (e.g. SO4 availability, hill slope, lake size). This mathematical form has the desirable property that predicted tissue concentration will approach zero as all individual source terms approach zero. One complication with this form, however, is that it is inconsistent with the standard linear multiple regression equation in which all terms (including those for sources and physical conditions) are additive. An important practical disadvantage of a model in which the Hg source terms are additive (rather than multiplicative) with their modifying factors is that predicted concentration is not zero when all sources are zero, making it unreliable for predicting the effects of large future reductions in Hg deposition. In this paper we compare the results of using several different linear and non-linear models in an analysis of watershed and fish Hg data for 450 New England lakes. The differences in model results pertain to both their utility in interpreting methylation and export processes as well as in fisheries management.
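
    The contrast drawn above between additive and multiplicative source terms can be made concrete with two least-squares fits to synthetic data; the multiplicative form, fit linearly in log space, is the one whose prediction vanishes with the sources. Variables and coefficients below are invented.

    ```python
    import numpy as np

    # additive:        C = b0 + b1*dep + b2*area
    # multiplicative:  C = b0 * dep^b1 * area^b2   (fit linearly in logs)
    rng = np.random.default_rng(0)
    dep = rng.uniform(5, 25, 450)            # Hg deposition rate (assumed units)
    area = rng.uniform(0.1, 10, 450)         # lake area (assumed units)
    C = 0.02 * dep**1.1 * area**-0.3 * rng.lognormal(0, 0.2, 450)

    X_add = np.column_stack([np.ones_like(dep), dep, area])
    beta_add, *_ = np.linalg.lstsq(X_add, C, rcond=None)

    X_mul = np.column_stack([np.ones_like(dep), np.log(dep), np.log(area)])
    beta_mul, *_ = np.linalg.lstsq(X_mul, np.log(C), rcond=None)

    # At zero deposition (area fixed at 1): the additive fit predicts a nonzero
    # concentration, the multiplicative fit predicts zero, which is why the
    # latter extrapolates more reliably to large deposition reductions.
    print("additive at dep=0:", beta_add[0] + beta_add[2] * 1.0)
    print("multiplicative at dep=0:", np.exp(beta_mul[0]) * 0.0**beta_mul[1])
    ```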

  15. An imaging-based photometric and colorimetric measurement method for characterizing OLED panels for lighting applications

    NASA Astrophysics Data System (ADS)

    Zhu, Yiting; Narendran, Nadarajah; Tan, Jianchuan; Mou, Xi

    2014-09-01

    The organic light-emitting diode (OLED) has demonstrated its novelty in displays and certain lighting applications. Similar to white light-emitting diode (LED) technology, it also holds the promise of saving energy. Even though the luminous efficacy values of OLED products have been steadily growing, their longevity is still not well understood. Furthermore, currently there is no industry standard for photometric and colorimetric testing, short and long term, of OLEDs. Each OLED manufacturer tests its OLED panels under different electrical and thermal conditions using different measurement methods. In this study, an imaging-based photometric and colorimetric measurement method for OLED panels was investigated. Unlike an LED that can be considered as a point source, the OLED is a large form area source. Therefore, for an area source to satisfy lighting application needs, it is important that it maintains uniform light level and color properties across the emitting surface of the panel over a long period. This study intended to develop a measurement procedure that can be used to test long-term photometric and colorimetric properties of OLED panels. The objective was to better understand how test parameters such as drive current or luminance and temperature affect the degradation rate. In addition, this study investigated whether data interpolation could allow for determination of degradation and lifetime, L70, at application conditions based on the degradation rates measured at different operating conditions.
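
    Lifetime extrapolation of the kind discussed above is often sketched as a parametric decay fit: estimate a decay constant from luminance-versus-time measurements, then solve for the time at which luminance reaches 70% of its initial value (L70). The exponential model and the data below are assumptions, not measured OLED results.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def decay(t, L0, k):
        return L0 * np.exp(-k * t)           # assumed exponential decay model

    t_hours = np.array([0, 500, 1000, 2000, 4000, 6000], float)
    lum = np.array([1000, 975, 952, 908, 828, 757], float)   # cd/m^2 (synthetic)

    (L0, k), _ = curve_fit(decay, t_hours, lum, p0=(1000.0, 1e-4))
    L70 = np.log(1 / 0.7) / k                # solve L0*exp(-k*t) = 0.7*L0
    print(f"estimated L70 ≈ {L70:,.0f} hours")
    ```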

  16. Network of anatomical texts (NAnaTex), an open-source project for visualizing the interaction between anatomical terms.

    PubMed

    Momota, Ryusuke; Ohtsuka, Aiji

    2018-01-01

    Anatomy is the science and art of understanding the structure of the body and its components in relation to the functions of the whole-body system. Medicine is based on a deep understanding of anatomy, but quite a few introductory-level learners are overwhelmed by the sheer amount of anatomical terminology that must be understood, so they regard anatomy as a dull and dense subject. To help them learn anatomical terms in a more contextual way, we started a new open-source project, the Network of Anatomical Texts (NAnaTex), which visualizes relationships of body components by integrating text-based anatomical information using Cytoscape, a network visualization software platform. Here, we present a network of bones and muscles produced from literature descriptions. As this network is primarily text-based and does not require any programming knowledge, it is easy to implement new functions or provide extra information by making changes to the original text files. To facilitate collaborations, we deposited the source code files for the network into the GitHub repository ( https://github.com/ryusukemomota/nanatex ) so that anybody can participate in the evolution of the network and use it for their own non-profit purposes. This project should help not only introductory-level learners but also professional medical practitioners, who could use it as a quick reference.
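
    Because the network is text-based, the underlying data can be as simple as one relation per line. The sketch below keeps muscle-bone attachment statements as plain triples and writes them to a SIF file, one of the simple text formats Cytoscape can import; the triples are standard anatomy, but the actual NAnaTex file layout may differ.

    ```python
    # Anatomical statements as (subject, relation, object) triples (illustrative).
    triples = [
        ("biceps brachii", "originates_from", "scapula"),
        ("biceps brachii", "inserts_on", "radius"),
        ("brachialis", "originates_from", "humerus"),
        ("brachialis", "inserts_on", "ulna"),
    ]

    with open("anatomy_network.sif", "w") as f:
        for subj, relation, obj in triples:
            # SIF line format: source <tab> interaction-type <tab> target
            f.write(f"{subj}\t{relation}\t{obj}\n")
    ```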

  17. Spurious Behavior of Shock-Capturing Methods: Problems Containing Stiff Source Terms and Discontinuities

    NASA Technical Reports Server (NTRS)

    Yee, Helen M. C.; Kotov, D. V.; Wang, Wei; Shu, Chi-Wang

    2013-01-01

    The goal of this paper is to relate the numerical dissipation that is inherent in high order shock-capturing schemes to the onset of wrong propagation speeds of discontinuities. For pointwise evaluation of the source term, previous studies indicated that the phenomenon of wrong propagation speed of discontinuities is connected with the smearing of the discontinuity caused by the discretization of the advection term. The smearing introduces a nonequilibrium state into the calculation. Thus, as soon as a nonequilibrium value is introduced in this manner, the source term turns on and immediately restores equilibrium, while at the same time shifting the discontinuity to a cell boundary. The present study shows that the degree of wrong propagation speed of discontinuities is highly dependent on the accuracy of the numerical method. The manner in which the smearing of discontinuities is contained by the numerical method and the overall amount of numerical dissipation being employed play major roles. Moreover, employing finite time steps and grid spacings that are below the standard Courant-Friedrichs-Lewy (CFL) limit on shock-capturing methods for compressible Euler and Navier-Stokes equations containing stiff reacting source terms and discontinuities reveals surprising counter-intuitive results. Unlike non-reacting flows, for stiff reactions with discontinuities, employing a time step and grid spacing that are below the CFL limit (based on the homogeneous or non-reacting part of the governing equations) does not guarantee a correct solution of the chosen governing equations. Instead, depending on the numerical method, time step and grid spacing, the numerical simulation may lead to (a) the correct solution (within the truncation error of the scheme), (b) a divergent solution, (c) a solution with the wrong propagation speed of discontinuities or (d) other spurious solutions that are solutions of the discretized counterparts but are not solutions of the governing equations. The present investigation of three very different stiff-system cases confirms some of the findings of Lafon & Yee (1996) and LeVeque & Yee (1990) for a model scalar PDE. The findings might shed some light on the reported difficulties in numerical combustion and in problems with stiff nonlinear (homogeneous) source terms and discontinuities in general.
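
    The scalar model problem cited above (LeVeque & Yee 1990) reproduces the wrong-speed phenomenon in a few lines: first-order upwind advection of a step, with a stiff pointwise source whose stable equilibria are 0 and 1. With the CFL number assumed below, the smeared transition value stays under 0.5 each step, the source relaxes it back toward zero, and the front stalls instead of moving at unit speed.

    ```python
    import numpy as np

    # LeVeque-Yee model problem:  u_t + u_x = -mu * u * (u - 1) * (u - 0.5)
    # First-order upwind advection plus a pointwise (sub-cycled) source update.
    nx, cfl, mu, t_end = 200, 0.4, 1.0e4, 0.3
    dx = 1.0 / nx
    dt = cfl * dx                                  # advection speed is 1
    x = (np.arange(nx) + 0.5) * dx
    u = np.where(x < 0.3, 1.0, 0.0)                # step initial data

    t = 0.0
    while t < t_end - 1e-12:
        u[1:] -= (dt / dx) * (u[1:] - u[:-1])      # upwind advection (u[0]=1 inflow)
        for _ in range(200):                       # sub-cycled stiff source term
            u -= (dt / 200) * mu * u * (u - 1.0) * (u - 0.5)
        t += dt

    # The exact front is at x = 0.3 + t_end = 0.6; the computed front has barely
    # moved, because the stiff source snaps smeared values back to zero each step.
    print("computed front at x =", x[np.argmax(u < 0.5)])
    ```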

  18. Cost-effective bidirectional digitized radio-over-fiber systems employing sigma delta modulation

    NASA Astrophysics Data System (ADS)

    Lee, Kyung Woon; Jung, HyunDo; Park, Jung Ho

    2016-11-01

    We propose a cost-effective digitized radio-over-fiber (D-RoF) system employing sigma-delta modulation (SDM) and a bidirectional transmission technique using a phase-modulated downlink and an intensity-modulated uplink. SDM is transparent to different radio access technologies and modulation formats, and is well suited to the downlink of a wireless system because a digital-to-analog converter (DAC) can be avoided at the base station (BS). Also, the central station and BS share the same light source by using phase modulation for the downlink and intensity modulation for the uplink transmission. Avoiding DACs and additional light sources has advantages in terms of cost reduction, power consumption, and compatibility with conventional wireless network structure. We have designed a cost-effective bidirectional D-RoF system using a low-pass SDM and measured the downlink and uplink transmission performance in terms of error vector magnitude, signal spectra, and constellations, based on the 10 MHz LTE 64-QAM standard.
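
    For readers unfamiliar with SDM, a first-order sigma-delta loop is just an integrator, a 1-bit quantizer, and feedback; the pulse density of the 1-bit stream tracks the analog input, so the remote end needs only a filter rather than a DAC. The oversampling ratio and test tone below are illustrative, not the paper's LTE settings.

    ```python
    import numpy as np

    fs, f_sig, n = 1_000_000, 1_000, 10_000     # sample rate, tone [Hz], length
    t = np.arange(n) / fs
    x = 0.5 * np.sin(2 * np.pi * f_sig * t)     # analog input in [-1, 1]

    integrator, prev_bit = 0.0, 0.0
    bits = np.empty(n)
    for i in range(n):
        integrator += x[i] - prev_bit                  # integrate the error signal
        prev_bit = 1.0 if integrator >= 0 else -1.0    # 1-bit quantizer + feedback
        bits[i] = prev_bit

    # Reconstruction: a simple moving-average (low-pass) filter recovers the tone.
    kernel = np.ones(64) / 64
    y = np.convolve(bits, kernel, mode="same")
    ```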

  19. Reducing mortality risk by targeting specific air pollution sources: Suva, Fiji.

    PubMed

    Isley, C F; Nelson, P F; Taylor, M P; Stelcer, E; Atanacio, A J; Cohen, D D; Mani, F S; Maata, M

    2018-01-15

    Health implications of air pollution vary depending upon pollutant sources. This work determines the value, in terms of reduced mortality, of reducing ambient particulate matter (PM2.5: effective aerodynamic diameter 2.5 μm or less) concentrations due to different emission sources. Suva, a Pacific Island city with substantial input from combustion sources, is used as a case study. Elemental concentrations were determined, by ion beam analysis, for PM2.5 samples from Suva spanning one year. Sources of PM2.5 have been quantified by positive matrix factorisation. A review of recent literature has been carried out to delineate the mortality risk associated with these sources. Risk factors have then been applied for Suva, to calculate the possible mortality reduction that may be achieved through reduction in pollutant levels. Higher risk ratios for black carbon and sulphur resulted in mortality predictions for PM2.5 from fossil fuel combustion, road vehicle emissions and waste burning that surpass predictions for these sources based on the health risk of PM2.5 mass alone. Predicted mortality for Suva from fossil fuel smoke exceeds the national toll from road accidents in Fiji. The greatest benefit for Suva, in terms of reduced mortality, is likely to be accomplished by reducing emissions from fossil fuel combustion (diesel), vehicles and waste burning.

  20. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  1. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  2. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  3. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  4. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  5. KaBOB: ontology-based semantic integration of biomedical databases.

    PubMed

    Livingston, Kevin M; Bada, Michael; Baumgartner, William A; Hunter, Lawrence E

    2015-04-23

    The ability to query many independent biological databases using a common ontology-based semantic model would facilitate deeper integration and more effective utilization of these diverse and rapidly growing resources. Despite ongoing work moving toward shared data formats and linked identifiers, significant problems persist in semantic data integration in order to establish shared identity and shared meaning across heterogeneous biomedical data sources. We present five processes for semantic data integration that, when applied collectively, solve seven key problems. These processes include making explicit the differences between biomedical concepts and database records, aggregating sets of identifiers denoting the same biomedical concepts across data sources, and using declaratively represented forward-chaining rules to take information that is variably represented in source databases and integrating it into a consistent biomedical representation. We demonstrate these processes and solutions by presenting KaBOB (the Knowledge Base Of Biomedicine), a knowledge base of semantically integrated data from 18 prominent biomedical databases using common representations grounded in Open Biomedical Ontologies. An instance of KaBOB with data about humans and seven major model organisms can be built using on the order of 500 million RDF triples. All source code for building KaBOB is available under an open-source license. KaBOB is an integrated knowledge base of biomedical data representationally based in prominent, actively maintained Open Biomedical Ontologies, thus enabling queries of the underlying data in terms of biomedical concepts (e.g., genes and gene products, interactions and processes) rather than features of source-specific data schemas or file formats. KaBOB resolves many of the issues that routinely plague biomedical researchers intending to work with data from multiple data sources and provides a platform for ongoing data integration and development and for formal reasoning over a wealth of integrated biomedical data.
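
    The forward-chaining step can be pictured as a small fixed-point loop over triples: source-specific predicates are rewritten into a shared representation, and relations are propagated across identifier equivalences. The rules and identifiers below are invented for illustration and are not KaBOB's actual rule set.

    ```python
    # Tiny forward-chaining illustration over (subject, predicate, object) triples.
    triples = {
        ("uniprot:P04637", "uniprot:encodedBy", "ncbigene:7157"),
        ("ncbigene:7157", "skos:exactMatch", "hgnc:11998"),
    }

    def forward_chain(triples):
        """Apply rules until no new triples are produced (fixed point)."""
        changed = True
        while changed:
            changed = False
            for s, p, o in list(triples):
                # Rule 1: a source-specific predicate implies the shared relation.
                if p == "uniprot:encodedBy":
                    new = (s, "kabob:isGeneProductOf", o)
                    if new not in triples:
                        triples.add(new); changed = True
                # Rule 2: propagate relations across identifier equivalences.
                if p == "skos:exactMatch":
                    for s2, p2, o2 in list(triples):
                        if o2 == s and p2 == "kabob:isGeneProductOf":
                            new = (s2, p2, o)
                            if new not in triples:
                                triples.add(new); changed = True
        return triples

    print(len(forward_chain(triples)), "triples after closure")
    ```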

  6. Citizen Sensors for SHM: Towards a Crowdsourcing Platform

    PubMed Central

    Ozer, Ekin; Feng, Maria Q.; Feng, Dongming

    2015-01-01

    This paper presents an innovative structural health monitoring (SHM) platform in terms of how it integrates smartphone sensors, the web, and crowdsourcing. The ubiquity of smartphones has provided an opportunity to create low-cost sensor networks for SHM. Crowdsourcing has given rise to citizen initiatives becoming a vast source of inexpensive, valuable but heterogeneous data. Previously, the authors have investigated the reliability of smartphone accelerometers for vibration-based SHM. This paper takes a step further to integrate mobile sensing and web-based computing for a prospective crowdsourcing-based SHM platform. An iOS application was developed to enable citizens to measure structural vibration and upload the data to a server with smartphones. A web-based platform was developed to collect and process the data automatically and store the processed data, such as modal properties of the structure, for long-term SHM purposes. Finally, the integrated mobile and web-based platforms were tested to collect the low-amplitude ambient vibration data of a bridge structure. Possible sources of uncertainty related to citizens were investigated, including the phone location, coupling conditions, and sampling duration. The field test results showed that the vibration data acquired by smartphones operated by citizens without expertise are useful for identifying structural modal properties with high accuracy. This platform can be further developed into an automated, smart, sustainable, cost-free system for long-term monitoring of structural integrity of spatially distributed urban infrastructure. Citizen Sensors for SHM will be a novel participatory sensing platform in that it offers hybrid solutions to transitional crowdsourcing parameters. PMID:26102490

  7. TE/TM decomposition of electromagnetic sources

    NASA Technical Reports Server (NTRS)

    Lindell, Ismo V.

    1988-01-01

    Three methods are given by which bounded EM sources can be decomposed into two parts radiating transverse electric (TE) and transverse magnetic (TM) fields with respect to a given constant direction in space. The theory applies source equivalence and nonradiating source concepts, which lead to decomposition methods based on a recursive formula or two differential equations for the determination of the TE and TM components of the original source. Decompositions for a dipole in terms of point, line, and plane sources are studied in detail. The planar decomposition is seen to match an earlier result given by Clemmow (1963). As an application of the point decomposition method, it is demonstrated that the general exact image expression for the Sommerfeld half-space problem, previously derived through heuristic reasoning, can be more straightforwardly obtained through the present decomposition method.

  8. The resolution of point sources of light as analyzed by quantum detection theory

    NASA Technical Reports Server (NTRS)

    Helstrom, C. W.

    1972-01-01

    The resolvability of point sources of incoherent light is analyzed by quantum detection theory in terms of two hypothesis-testing problems. In the first, the observer must decide whether there are two sources of equal radiant power at given locations, or whether there is only one source of twice the power located midway between them. In the second problem, either one, but not both, of two point sources is radiating, and the observer must decide which it is. The decisions are based on optimum processing of the electromagnetic field at the aperture of an optical instrument. In both problems the density operators of the field under the two hypotheses do not commute. The error probabilities, determined as functions of the separation of the points and the mean number of received photons, characterize the ultimate resolvability of the sources.

  9. Investigation of Magnetotelluric Source Effect Based on Twenty Years of Telluric and Geomagnetic Observation

    NASA Astrophysics Data System (ADS)

    Kis, A.; Lemperger, I.; Wesztergom, V.; Menvielle, M.; Szalai, S.; Novák, A.; Hada, T.; Matsukiyo, S.; Lethy, A. M.

    2016-12-01

    The magnetotelluric method is widely applied for the investigation of subsurface structures by imaging the spatial distribution of electric conductivity. The method is based on the experimental determination of the surface electromagnetic impedance tensor (Z) from surface geomagnetic and telluric recordings in two perpendicular orientations. In practical exploration, the accurate estimation of Z necessitates the application of robust statistical methods for two reasons: (1) the geomagnetic and telluric time series are contaminated by man-made noise components, and (2) the non-homogeneous behavior of ionospheric current systems in the period range of interest (ELF-ULF and longer periods) results in systematic deviation of the impedance of individual time windows. Robust statistics handle both effects on Z for the purpose of subsurface investigations. However, accurate analysis of the long-term temporal variation of the first and second statistical moments of Z may provide valuable information about the characteristics of the ionospheric source current systems. Temporal variation of the extent, spatial variability, and orientation of the ionospheric source currents has specific effects on the surface impedance tensor. The twenty-year-long geomagnetic and telluric recordings of the Nagycenk Geophysical Observatory provide a unique opportunity to reconstruct the so-called magnetotelluric source effect and obtain information about the spatial and temporal behavior of ionospheric source currents at mid-latitudes. A detailed investigation of the time series of the surface electromagnetic impedance tensor has been carried out in different frequency classes of the ULF range. The presentation aims to provide a brief review of our results related to long-term periodic modulations, up to the solar-cycle scale, and to eventual deviations of the electromagnetic impedance and thus of the reconstructed equivalent ionospheric source effects.
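
    For readers unfamiliar with the quantity being analyzed: the impedance tensor Z referred to above is, in its standard textbook form (not quoted from the abstract), the frequency-domain linear map between the horizontal telluric and geomagnetic field components:

        % Frequency-domain impedance relation between horizontal telluric (E)
        % and geomagnetic (H) components:
        \begin{equation}
        \begin{pmatrix} E_x \\ E_y \end{pmatrix} =
        \begin{pmatrix} Z_{xx} & Z_{xy} \\ Z_{yx} & Z_{yy} \end{pmatrix}
        \begin{pmatrix} H_x \\ H_y \end{pmatrix}
        \end{equation}

    Under the usual plane-wave source assumption Z depends only on the subsurface conductivity; source effects of the kind studied here appear as systematic deviations from this assumption.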

  10. Use of corn steep liquor as an economical nitrogen source for biosuccinic acid production by Actinobacillus succinogenes

    NASA Astrophysics Data System (ADS)

    Tan, J. P.; Jahim, J. M.; Wu, T. Y.; Harun, S.; Mumtaz, T.

    2016-06-01

    Expensive raw materials are the driving force behind the shift from petroleum-based succinic acid production to bio-based succinic acid production by microorganisms. The cost of the fermentation medium is among the main factors contributing to the total production cost of bio-succinic acid. After the carbon source, the nitrogen source is the second largest component of the fermentation medium, the cost of which has been overlooked in past years. The current study aimed at replacing yeast extract, a costly nitrogen source, with corn steep liquor for the economical production of bio-succinic acid by Actinobacillus succinogenes 130Z. In this study, a final succinic acid concentration of 20.6 g/L was obtained with corn steep liquor as the nitrogen source, which was comparable to the 21.4 g/L final succinate concentration obtained with yeast extract. In economic terms, corn steep liquor was priced at 200/ton, one fifth the cost of yeast extract at 1000/ton. Therefore, corn steep liquor can be considered a potential nitrogen source for biochemical industries in place of the costly yeast extract.

  11. Photovoltaic highway applications: Assessment of the near-term market

    NASA Technical Reports Server (NTRS)

    Rosenblum, L.; Scudder, L. R.; Bifano, W. J.; Poley, W. A.

    1977-01-01

    A preliminary assessment of the near-term market for photovoltaic highway applications is presented. Among the potential users, two market sectors are considered: government and commercial. Within these sectors, two possible application areas, signs and motorist aids, are discussed. Based on judgemental information, obtained by a brief survey of representatives of the two user sectors, the government sector appears more amenable to the introduction of photovoltaic power sources for highway applications in the near-term. However, considerable interest and potential opportunities were also found to exist in the commercial sector. Further studies to quantify the market for highway applications appear warranted.

  12. #nowplaying Madonna: a large-scale evaluation on estimating similarities between music artists and between movies from microblogs.

    PubMed

    Schedl, Markus

    2012-01-01

    Different term weighting techniques such as TF-IDF or BM25 have been used intensely for manifold text-based information retrieval tasks. Their use for modeling term profiles for named entities and the subsequent calculation of similarities between these named entities have been studied to a much smaller extent. The recent trend of microblogging has made available massive amounts of information about almost every topic around the world. Therefore, microblogs represent a valuable source for text-based named entity modeling. In this paper, we present a systematic and comprehensive evaluation of different term weighting measures, normalization techniques, query schemes, index term sets, and similarity functions for the task of inferring similarities between named entities, based on data extracted from microblog posts. We analyze several thousand combinations of choices for the above-mentioned dimensions, which influence the similarity calculation process, and we investigate in which way they impact the quality of the similarity estimates. Evaluation is performed using three real-world data sets: two collections of microblogs related to music artists and one related to movies. For the music collections, we present results of genre classification experiments using as benchmark genre information from allmusic.com. For the movie collection, we present results of multi-class classification experiments using as benchmark categories from IMDb. We show that microblogs can indeed be exploited to model named entity similarity with remarkable accuracy, provided the correct settings for the analyzed aspects are used. We further compare the results to those obtained when using Web pages as data source.
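
    To make the modeling pipeline concrete, here is a minimal sketch of one common variant of the approach the abstract describes: building TF-IDF term profiles for named entities from the microblog posts that mention them, then scoring entity pairs with cosine similarity. The particular weighting (log TF, log IDF) and the function names are illustrative assumptions, not the paper's exact configuration.

        import math
        from collections import Counter

        def tfidf_profiles(docs_by_entity):
            # docs_by_entity: {entity: list of tokens from all posts
            # mentioning that entity}
            n = len(docs_by_entity)
            df = Counter()
            for tokens in docs_by_entity.values():
                df.update(set(tokens))
            profiles = {}
            for entity, tokens in docs_by_entity.items():
                tf = Counter(tokens)
                profiles[entity] = {
                    t: (1 + math.log(c)) * math.log(n / df[t])
                    for t, c in tf.items()
                }
            return profiles

        def cosine(p, q):
            # cosine similarity between two sparse term-weight dicts
            num = sum(w * q[t] for t, w in p.items() if t in q)
            den = (math.sqrt(sum(w * w for w in p.values()))
                   * math.sqrt(sum(w * w for w in q.values())))
            return num / den if den else 0.0

    With profiles in hand, genre classification of the kind evaluated in the paper reduces to nearest-neighbor lookup over the pairwise cosine scores.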

  13. Above and beyond short-term mating, long-term mating is uniquely tied to human personality.

    PubMed

    Holtzman, Nicholas S; Strube, Michael J

    2013-12-16

    To what extent are personality traits and sexual strategies linked? The literature does not provide a clear answer, as it is based on the Sociosexuality model, a one-dimensional model that fails to measure long-term mating (LTM). An improved two-dimensional model separately assesses long-term and short-term mating (STM; Jackson and Kirkpatrick, 2007). In this paper, we link this two-dimensional model to an array of personality traits (Big 5, Dark Triad, and Schizoid Personality). We collected data from different sources (targets and peers; Study 1), and from different nations (United States, Study 1; India, Study 2). We demonstrate for the first time that, above and beyond STM, LTM captures variation in personality.

  14. Numerical simulation of hydrothermal circulation in the Cascade Range, north-central Oregon

    USGS Publications Warehouse

    Ingebritsen, S.E.; Paulson, K.M.

    1990-01-01

    Alternate conceptual models to explain near-surface heat-flow observations in the central Oregon Cascade Range involve (1) an extensive mid-crustal magmatic heat source underlying both the Quaternary arc and adjacent older rocks or (2) a narrower deep heat source which is flanked by a relatively shallow conductive heat-flow anomaly caused by regional ground-water flow (the lateral-flow model). Relative to the mid-crustal heat source model, the lateral-flow model suggests a more limited geothermal resource base, but a better-defined exploration target. We simulated ground-water flow and heat transport through two cross sections trending west from the Cascade range crest in order to explore the implications of the two models. The thermal input for the alternate conceptual models was simulated by varying the width and intensity of a basal heat-flow anomaly and, in some cases, by introducing shallower heat sources beneath the Quaternary arc. Near-surface observations in the Breitenbush Hot Springs area are most readily explained in terms of lateral heat transport by regional ground-water flow; however, the deep thermal structure still cannot be uniquely inferred. The sparser thermal data set from the McKenzie River area can be explained either in terms of deep regional ground-water flow or in terms of a conduction-dominated system, with ground-water flow essentially confined to Quaternary rocks and fault zones.

  15. A new aerodynamic integral equation based on an acoustic formula in the time domain

    NASA Technical Reports Server (NTRS)

    Farassat, F.

    1984-01-01

    An aerodynamic integral equation for bodies moving at transonic and supersonic speeds is presented. Based on a time-dependent acoustic formula for calculating the noise emanating from the outer portion of a propeller blade travelling at high speed (the Ffowcs Williams-Hawkings formulation), the loading term and a conventional thickness source term are retained. Two surface and three line integrals are employed to solve an equation for the loading noise. The near-field term is regularized using the collapsing sphere approach to obtain semiconvergence on the blade surface. A singular integral equation is thereby derived for the unknown surface pressure, and is amenable to numerical solution using Galerkin or collocation methods. The technique is useful for studying the nonuniform inflow to the propeller.

  16. Integrating Physics and Math through Microcomputer-Based Laboratories (MBL): Effects on Discourse Type, Quality, and Mathematization

    ERIC Educational Resources Information Center

    BouJaoude, Saouma B.; Jurdak, Murad E.

    2010-01-01

    The purposes of this study were to understand the nature of discourse in terms of knowledge types and cognitive process, source of utterances (student or teacher), and time use in microcomputer-based labs (MBL) and verification type labs (VTL) and to gain an understanding of the role of MBL in promoting mathematization. The study was conducted in…

  17. Numerical models analysis of energy conversion process in air-breathing laser propulsion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong Yanji; Song Junling; Cui Cunyan

    In this paper, the energy source term is treated as the key element in describing the energy conversion process in air-breathing laser propulsion. Ignoring secondary factors, three independent modules (a ray transmission module, an energy source term module, and a fluid dynamics module) were established by coupling the laser radiation transport equation with the fluid mechanics equations. The incident laser beam was simulated with a ray tracing method. The calculated results were in good agreement with those of theoretical analysis and experiments.

  18. Quantum key distribution with passive decoy state selection

    NASA Astrophysics Data System (ADS)

    Mauerer, Wolfgang; Silberhorn, Christine

    2007-05-01

    We propose a quantum key distribution scheme which closely matches the performance of a perfect single photon source. It nearly attains the physical upper bound in terms of key generation rate and maximally achievable distance. Our scheme relies on a practical setup based on a parametric downconversion source and present day, nonideal photon-number detection. Arbitrary experimental imperfections which lead to bit errors are included. We select decoy states by classical postprocessing. This allows one to improve the effective signal statistics and achievable distance.

  19. Fabrication and In Situ Testing of Scalable Nitrate-Selective Electrodes for Distributed Observations

    NASA Astrophysics Data System (ADS)

    Harmon, T. C.; Rat'ko, A.; Dietrich, H.; Park, Y.; Wijsboom, Y. H.; Bendikov, M.

    2008-12-01

    Inorganic nitrogen (nitrate (NO3-) and ammonium (NH4+)) from chemical fertilizer and livestock waste is a major source of pollution in groundwater, surface water and the air. While some sources of these chemicals, such as waste lagoons, are well-defined, their application as fertilizer has the potential to create distributed or non-point source pollution problems. Scalable nitrate sensors (small and inexpensive) would enable us to better assess non-point source pollution processes in agronomic soils, groundwater and rivers subject to non-point source inputs. This work describes the fabrication and testing of inexpensive PVC-membrane-based ion selective electrodes (ISEs) for monitoring nitrate levels in soil water environments. ISE-based sensors have the advantages of being easy to fabricate and use, but suffer several shortcomings, including limited sensitivity, poor precision, and calibration drift. However, modern materials have begun to yield more robust ISE types in laboratory settings. This work emphasizes the in situ behavior of commercial and fabricated sensors in soils subject to irrigation with dairy manure water. Results are presented in the context of deployment techniques (in situ versus soil lysimeters), temperature compensation, and uncertainty analysis. Observed temporal responses of the nitrate sensors exhibited diurnal cycling with elevated nitrate levels at night and depressed levels during the day. Conventional samples collected via lysimeters validated this response. It is concluded that while modern ISEs are not yet ready for long-term, unattended deployment, short-term installations (on the order of 2 to 4 days) are viable and may provide valuable insights into nitrogen dynamics in complex soil systems.
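
    For context on why the temperature compensation mentioned above matters: an ideal ion-selective electrode follows the Nernst equation (a textbook relation, not quoted from the paper), whose slope is directly proportional to absolute temperature. For a monovalent anion such as nitrate the slope is about -59 mV per decade of activity at 25 °C, so uncorrected temperature swings translate directly into concentration errors.

        % Ideal Nernstian response of a nitrate ISE (charge z = -1):
        \begin{equation}
        E = E^{0} + \frac{RT}{zF} \ln a_{\mathrm{NO_3^-}}
        \end{equation}
        % Slope = RT ln(10) / zF \approx -59.2 mV per decade at 25 C;
        % the explicit T dependence motivates temperature compensation.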

  20. Communicating Science to Impact Learning? A Phenomenological Inquiry into 4th and 5th Graders' Perceptions of Science Information Sources

    NASA Astrophysics Data System (ADS)

    Gelmez Burakgazi, Sevinc; Yildirim, Ali; Weeth Feinstein, Noah

    2016-04-01

    Rooted in science education and science communication studies, this study examines 4th and 5th grade students' perceptions of science information sources (SIS) and their use in communicating science to students. It combines situated learning theory with uses and gratifications theory in a qualitative phenomenological analysis. Data were gathered through classroom observations and interviews in four Turkish elementary schools. Focus group interviews with 47 students and individual interviews with 17 teachers and 10 parents were conducted. Participants identified a wide range of SIS, including TV, magazines, newspapers, internet, peers, teachers, families, science centers/museums, science exhibitions, textbooks, science books, and science camps. Students reported using various SIS in school-based and non-school contexts to satisfy their cognitive, affective, personal, and social integrative needs. SIS were used for science courses, homework/project assignments, examination/test preparations, and individual science-related research. Students assessed SIS in terms of the perceived accessibility of the sources, the quality of the content, and the content presentation. In particular, some sources such as teachers, families, TV, science magazines, textbooks, and science centers/museums ("directive sources") predictably led students to other sources such as teachers, families, internet, and science books ("directed sources"). A small number of sources crossed context boundaries, being useful in both school and out. Results shed light on the connection between science education and science communication in terms of promoting science learning.

  1. Evaluating Uncertainty in Integrated Environmental Models: A Review of Concepts and Tools

    EPA Science Inventory

    This paper reviews concepts for evaluating integrated environmental models and discusses a list of relevant software-based tools. A simplified taxonomy for sources of uncertainty and a glossary of key terms with standard definitions are provided in the context of integrated appro...

  2. The Current Status of Behaviorism and Neurofeedback

    ERIC Educational Resources Information Center

    Fultz, Dwight E.

    2009-01-01

    There appears to be no dominant conceptual model for the process and outcomes of neurofeedback among practitioners or manufacturers. Behaviorists are well-positioned to develop a neuroscience-based source code in which neural activity is described in behavioral terms, providing a basis for behavioral conceptualization and education of…

  3. Parametrized energy spectrum of cosmic-ray protons with kinetic energies down to 1 GeV

    NASA Technical Reports Server (NTRS)

    Tan, L. C.

    1985-01-01

    A new estimation of the interstellar proton spectrum is made in which the source term of primary protons is taken from shock acceleration theory and the cosmic ray propagation calculation is based on a proposed nonuniform galactic disk model.

  4. A comprehensive classification method for VOC emission sources to tackle air pollution based on VOC species reactivity and emission amounts.

    PubMed

    Li, Guohao; Wei, Wei; Shao, Xia; Nie, Lei; Wang, Hailin; Yan, Xiao; Zhang, Rui

    2018-05-01

    In China, volatile organic compound (VOC) control directives have been continuously released and implemented for important sources and regions to tackle air pollution. The corresponding control requirements were based on VOC emission amounts (EA), but never considered the significant differentiation of VOC species in terms of atmospheric chemical reactivity. This can adversely influence the effect of VOC reduction on air quality improvement. Therefore, this study attempted to develop a comprehensive classification method for typical VOC sources in the Beijing-Tianjin-Hebei (BTH) region, combining the VOC emission amounts with the chemical reactivities of VOC species. Firstly, we obtained the VOC chemical profiles by measuring 5 key sources in the BTH region and referencing another 10 key sources, and estimated the ozone formation potential (OFP) per ton of VOC emission for these sources by using the maximum incremental reactivity (MIR) index as the characteristic of source reactivity (SR). Then, we applied a data normalization method to convert EA and SR to normalized EA (NEA) and normalized SR (NSR), respectively, for the various sources in the BTH region. Finally, the control index (CI) was calculated, and these sources were further classified into four grades based on the normalized CI (NCI). The study results showed that in the BTH region, furniture coating, automobile coating, and road vehicles are characterized by high NCI and need to be given more attention; however, the petrochemical industry, which was designated as an important control source by air quality managers, has a lower NCI. Copyright © 2017. Published by Elsevier B.V.
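
    The abstract names the quantities (EA, SR, NEA, NSR, CI, NCI) but not the formulas, so the sketch below assumes min-max normalization and an additive combination; both are illustrative guesses rather than the paper's definitions.

        def min_max(values):
            # min-max normalize {source: value} onto [0, 1]
            lo, hi = min(values.values()), max(values.values())
            if hi == lo:
                return {k: 0.0 for k in values}
            return {k: (v - lo) / (hi - lo) for k, v in values.items()}

        def normalized_control_index(ea, ofp_per_ton):
            # ea: emission amount per source (t/yr)
            # ofp_per_ton: OFP per ton of VOC emitted, used as source reactivity
            nea = min_max(ea)                      # normalized emission amount
            nsr = min_max(ofp_per_ton)             # normalized source reactivity
            ci = {s: nea[s] + nsr[s] for s in ea}  # assumed additive combination
            return min_max(ci)                     # NCI in [0, 1]

    Grade boundaries on the resulting NCI (e.g., quartiles) would then yield the four source classes described above.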

  5. Groundwater vulnerability and risk mapping using GIS, modeling and a fuzzy logic tool.

    PubMed

    Nobre, R C M; Rotunno Filho, O C; Mansur, W J; Nobre, M M M; Cosenza, C A N

    2007-12-07

    A groundwater vulnerability and risk mapping assessment, based on a source-pathway-receptor approach, is presented for an urban coastal aquifer in northeastern Brazil. A modified version of the DRASTIC methodology was used to map the intrinsic and specific groundwater vulnerability of a 292 km(2) study area. A fuzzy hierarchy methodology was adopted to evaluate the potential contaminant source index, including diffuse and point sources. Numerical modeling was performed for delineation of well capture zones, using MODFLOW and MODPATH. The integration of these elements provided the mechanism to assess groundwater pollution risks and identify areas that must be prioritized in terms of groundwater monitoring and restriction on use. A groundwater quality index based on nitrate and chloride concentrations was calculated, which had a positive correlation with the specific vulnerability index.
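
    For readers unfamiliar with DRASTIC: the intrinsic vulnerability index is a weighted sum of seven rated hydrogeologic factors. The sketch below uses the classic weights of the original DRASTIC method (Aller et al., 1987); the modified version used in this paper may differ, and the abstract does not specify its weights or ratings.

        # Classic DRASTIC weights; each factor is rated on a 1-10 scale
        # per map cell, and the index is the weighted sum of ratings.
        DRASTIC_WEIGHTS = {
            "depth_to_water": 5, "net_recharge": 4, "aquifer_media": 3,
            "soil_media": 2, "topography": 1, "vadose_zone_impact": 5,
            "hydraulic_conductivity": 3,
        }

        def drastic_index(ratings):
            # ratings: {factor_name: rating on a 1-10 scale} for one cell
            return sum(DRASTIC_WEIGHTS[f] * r for f, r in ratings.items())

    Mapping this index cell by cell over the 292 km2 study area, and overlaying the fuzzy contaminant source index and the modeled capture zones, gives the source-pathway-receptor risk map the abstract describes.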

  6. Asymptotically and exactly energy balanced augmented flux-ADER schemes with application to hyperbolic conservation laws with geometric source terms

    NASA Astrophysics Data System (ADS)

    Navas-Montilla, A.; Murillo, J.

    2016-07-01

    In this work, an arbitrary order HLL-type numerical scheme is constructed using the flux-ADER methodology. The proposed scheme is based on an augmented Derivative Riemann solver that was used for the first time in Navas-Montilla and Murillo (2015) [1]. This solver, hereafter referred to as the Flux-Source (FS) solver, was conceived as a high order extension of the augmented Roe solver and led to the generation of a novel numerical scheme called the AR-ADER scheme. Here, we provide a general definition of the FS solver independently of the Riemann solver used in it. Moreover, a simplified version of the solver, referred to as the Linearized-Flux-Source (LFS) solver, is presented. This novel version of the FS solver allows the solution to be computed without requiring reconstruction of flux derivatives, although some drawbacks become evident. In contrast to other previously defined Derivative Riemann solvers, the proposed FS and LFS solvers take into account the presence of the source term in the resolution of the Derivative Riemann Problem (DRP), which is of particular interest when dealing with geometric source terms. When applied to the shallow water equations, the proposed HLLS-ADER and AR-ADER schemes can be constructed to fulfill the exactly well-balanced property, showing that an arbitrary quadrature of the integral of the source inside the cell does not ensure energy balanced solutions. As a result of this work, energy balanced flux-ADER schemes that provide the exact solution for steady cases and that converge to the exact solution with arbitrary order for transient cases are constructed.
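
    The setting is the standard one for balance laws, written here in generic form for orientation (the specific system in the paper is the shallow water equations with a geometric bed-slope source term): a scheme is exactly well balanced when its discrete flux gradients and discretized source term cancel exactly for steady solutions.

        % Generic 1D balance law: q conserved state, f flux, s source term
        \begin{equation}
        \partial_t q + \partial_x f(q) = s(q, x)
        \end{equation}
        % Exact well-balancing requires the discrete scheme to reproduce
        % the steady-state balance  \partial_x f(q) = s(q, x)  exactly,
        % which is why the quadrature of the source inside the cell
        % cannot be chosen arbitrarily.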

  7. Metrological-grade tunable coherent source in the mid-infrared for molecular precision spectroscopy

    NASA Astrophysics Data System (ADS)

    Insero, G.; Clivati, C.; D'Ambrosio, D.; Cancio Pastor, P.; Verde, M.; Schunemann, P. G.; Zondy, J.-J.; Inguscio, M.; Calonico, D.; Levi, F.; De Natale, P.; Santambrogio, G.; Borri, S.

    2018-02-01

    We report on a metrological-grade mid-IR source with a 10^-14 short-term instability for high-precision spectroscopy. Our source is based on the combination of a quantum cascade laser and coherent radiation obtained by difference-frequency generation in an orientation-patterned gallium phosphide (OP-GaP) crystal. The pump and signal lasers are locked to an optical frequency comb referenced to the primary frequency standard via an optical fiber link. We demonstrate the robustness of the apparatus by measuring a vibrational transition around 6 μm on a metastable state of CO molecules with 11 digits of precision.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabrera-Palmer, Belkis

    Predicting the performance of radiation detection systems at field sites based on measured performance acquired under controlled conditions at test locations, e.g., the Nevada National Security Site (NNSS), remains an unsolved and standing issue within DNDO's testing methodology. Detector performance can be defined in terms of the system's ability to detect and/or identify a given source or set of sources, and depends on the signal generated by the detector for the given measurement configuration (i.e., source strength, distance, time, surrounding materials, etc.) and on the quality of the detection algorithm. Detector performance is usually evaluated in the performance and operational testing phases, where the measurement configurations are selected to represent radiation source and background configurations of interest to security applications.

  9. Discriminating Simulated Vocal Tremor Source Using Amplitude Modulation Spectra

    PubMed Central

    Carbonell, Kathy M.; Lester, Rosemary A.; Story, Brad H.; Lotto, Andrew J.

    2014-01-01

    Objectives/Hypothesis Sources of vocal tremor are difficult to categorize perceptually and acoustically. This paper describes a preliminary attempt to discriminate vocal tremor sources through the use of spectral measures of the amplitude envelope. The hypothesis is that different vocal tremor sources are associated with distinct patterns of acoustic amplitude modulations. Study Design Statistical categorization methods (discriminant function analysis) were used to discriminate signals from simulated vocal tremor with different sources using only acoustic measures derived from the amplitude envelopes. Methods Simulations of vocal tremor were created by modulating parameters of a vocal fold model corresponding to oscillations of respiratory driving pressure (respiratory tremor), degree of vocal fold adduction (adductory tremor) and fundamental frequency of vocal fold vibration (F0 tremor). The acoustic measures were based on spectral analyses of the amplitude envelope computed across the entire signal and within select frequency bands. Results The signals could be categorized (with accuracy well above chance) in terms of the simulated tremor source using only measures of the amplitude envelope spectrum even when multiple sources of tremor were included. Conclusions These results supply initial support for an amplitude-envelope based approach to identify the source of vocal tremor and provide further evidence for the rich information about talker characteristics present in the temporal structure of the amplitude envelope. PMID:25532813

  10. Unequal-strength source zROC slopes reflect criteria placement and not (necessarily) memory processes

    PubMed Central

    Starns, Jeffrey J.; Pazzaglia, Angela M.; Rotello, Caren M.; Hautus, Michael J.; Macmillan, Neil A.

    2014-01-01

    Source memory zROC slopes change from below 1 to above 1 depending on which source gets the strongest learning. This effect has been attributed to memory processes, either in terms of a threshold source recollection process or changes in the variability of continuous source evidence. We propose two decision mechanisms that can produce the slope effect, and we test them in three experiments. The evidence mixing account assumes that people change how they weight item versus source evidence based on which source is stronger, and the converging criteria account assumes that participants become more willing to make high confidence source responses for test probes that have higher item strength. Results failed to support the evidence mixing account, in that the slope effect emerged even when item evidence was not informative for the source judgment (that is, in tests that included strong and weak items from both sources). In contrast, results showed strong support for the converging criteria account. This account not only accommodated the unequal-strength slope effect, but also made a prediction for unstudied (new) items that was empirically confirmed: participants made more high confidence source responses for new items when they were more confident that the item was studied. The converging criteria account has an advantage over accounts based on source recollection or evidence variability, as the latter accounts do not predict the relationship between recognition and source confidence for new items. PMID:23565789

  11. SoS Navigator 2.0: A Context-Based Approach to System-of-Systems Challenges

    DTIC Science & Technology

    2008-06-01

    ...in a Postindustrial Age. MIT Press, 1984. [Kolb 1984] Kolb, David A. Experiential Learning: Experience as the Source of Learning and Development. ...terms of experiential learning, and the work of Rosen [Rosen 1991] in terms of the relational approach to understanding anticipative systems. Our... Supporting Techniques and Tools; The Learning/Transformation Cycle; Summary of SoS Navigator Processes and Techniques; Case Summaries.

  12. High-order scheme for the source-sink term in a one-dimensional water temperature model

    PubMed Central

    Jing, Zheng; Kang, Ling

    2017-01-01

    The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method were assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data. PMID:28264005
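
    As a concrete illustration of the two-step splitting described above, here is a minimal sketch for a vertical 1D heat conduction equation with a source term. The source step is a plain explicit update standing in for the paper's high-order undetermined-coefficient scheme, and the insulated (zero-flux) boundary treatment is an assumption; only the overall split (source step, then Crank-Nicolson diffusion step) mirrors the abstract.

        import numpy as np

        def step(T, S, dt, dz, kappa):
            # One split step for dT/dt = kappa * d2T/dz2 + S(z).
            # Step 1: source-sink update (explicit stand-in for the
            # paper's high-order scheme).
            T = T + dt * S
            # Step 2: Crank-Nicolson for the diffusion term with
            # insulated boundaries (an assumption for this sketch).
            n = len(T)
            r = kappa * dt / (2.0 * dz ** 2)
            A = np.eye(n) * (1.0 + 2.0 * r)
            B = np.eye(n) * (1.0 - 2.0 * r)
            for i in range(n - 1):
                A[i, i + 1] = A[i + 1, i] = -r
                B[i, i + 1] = B[i + 1, i] = r
            A[0, 0] = A[-1, -1] = 1.0 + r
            B[0, 0] = B[-1, -1] = 1.0 - r
            return np.linalg.solve(A, B @ T)

    One call advances the temperature profile T (a NumPy array over depth) by dt; S is the net source-sink heating rate per cell, which in a lake model would be driven mainly by absorbed solar radiation.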

  13. High-order scheme for the source-sink term in a one-dimensional water temperature model.

    PubMed

    Jing, Zheng; Kang, Ling

    2017-01-01

    The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method were assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data.

  14. SISSY: An efficient and automatic algorithm for the analysis of EEG sources based on structured sparsity.

    PubMed

    Becker, H; Albera, L; Comon, P; Nunes, J-C; Gribonval, R; Fleureau, J; Guillotel, P; Merlet, I

    2017-08-15

    Over the past decades, a multitude of different brain source imaging algorithms have been developed to identify the neural generators underlying the surface electroencephalography measurements. While most of these techniques focus on determining the source positions, only a small number of recently developed algorithms provide an indication of the spatial extent of the distributed sources. In a recent comparison of brain source imaging approaches, the VB-SCCD algorithm has been shown to be one of the most promising algorithms among these methods. However, this technique suffers from several problems: it leads to amplitude-biased source estimates, it has difficulties in separating close sources, and it has a high computational complexity due to its implementation using second order cone programming. To overcome these problems, we propose to include an additional regularization term that imposes sparsity in the original source domain and to solve the resulting optimization problem using the alternating direction method of multipliers. Furthermore, we show that the algorithm yields more robust solutions by taking into account the temporal structure of the data. We also propose a new method to automatically threshold the estimated source distribution, which permits delineation of the active brain regions. The new algorithm, called Source Imaging based on Structured Sparsity (SISSY), is analyzed by means of realistic computer simulations and is validated on the clinical data of four patients. Copyright © 2017 Elsevier Inc. All rights reserved.
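
    Schematically, the modification the abstract describes amounts to adding an extra l1 penalty in the source domain to a total-variation-type term. The exact operators and weighting in SISSY are not given in the abstract, so the following is a generic structured-sparsity objective of that family (y: EEG data, A: lead-field matrix, x: source amplitudes, D: spatial difference operator over the source mesh, lambda and alpha: regularization parameters), solved via ADMM:

        % Generic structured-sparsity source-imaging objective (schematic):
        \begin{equation}
        \hat{x} = \arg\min_{x} \; \|y - A x\|_2^2
          + \lambda \bigl( \|D x\|_1 + \alpha \|x\|_1 \bigr)
        \end{equation}

    The ||Dx||_1 term favors piecewise-constant (spatially extended) patches, while the added alpha ||x||_1 term suppresses the spurious background activity responsible for the amplitude bias.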

  15. Chandra Detection of Intracluster X-Ray sources in Virgo

    NASA Astrophysics Data System (ADS)

    Hou, Meicun; Li, Zhiyuan; Peng, Eric W.; Liu, Chengze

    2017-09-01

    We present a survey of X-ray point sources in the nearest and dynamically young galaxy cluster, Virgo, using archival Chandra observations that sample the vicinity of 80 early-type member galaxies. The X-ray source populations at the outskirts of these galaxies are of particular interest. We detect a total of 1046 point sources (excluding galactic nuclei) out to a projected galactocentric radius of ~40 kpc and down to a limiting 0.5-8 keV luminosity of ~2 × 10^38 erg s^-1. Based on the cumulative spatial and flux distributions of these sources, we statistically identify ~120 excess sources that are not associated with the main stellar content of the individual galaxies, nor with the cosmic X-ray background. This excess is significant at a 3.5σ level, when Poisson error and cosmic variance are taken into account. On the other hand, no significant excess sources are found at the outskirts of a control sample of field galaxies, suggesting that at least some fraction of the excess sources around the Virgo galaxies are truly intracluster X-ray sources. Assisted with ground-based and HST optical imaging of Virgo, we discuss the origins of these intracluster X-ray sources, in terms of supernova-kicked low-mass X-ray binaries (LMXBs), globular clusters, LMXBs associated with the diffuse intracluster light, stripped nucleated dwarf galaxies and free-floating massive black holes.

  16. Frequency stabilization of an optically pumped far-infrared laser to the harmonic of a microwave synthesizer.

    PubMed

    Danylov, A A; Light, A R; Waldman, J; Erickson, N

    2015-12-10

    Measurements of the frequency stability of a far-infrared molecular laser have been made by mixing the harmonic of an ultrastable microwave source with a portion of the laser output signal in a terahertz (THz) Schottky diode balanced mixer. A 3 GHz difference-frequency signal was used in a frequency discriminator circuit to lock the laser to the microwave source. Comparisons of the short- and long-term laser frequency stability under free-running and locked conditions show a significant improvement with locking. Short-term frequency jitter was reduced by an order of magnitude, from approximately 40 to 4 kHz, and long-term drift was reduced by more than three orders of magnitude, from approximately 250 kHz to 80 Hz. The results, enabled by the efficient Schottky diode balanced mixer downconverter, demonstrate that ultrastable microwave-based frequency stabilization of THz optically pumped lasers (OPLs) will now be possible at frequencies extending well above 4.0 THz.

  17. [Schizophrenia and psychosis on the internet].

    PubMed

    Schrank, Beate; Seyringer, Michaela-Elena; Berger, Peter; Katschnig, Heinz; Amering, Michaela

    2006-09-01

    The internet is an increasingly important source of information for patients concerning their illness. Its growing influence on communication between patients and clinicians has to be borne in mind. The aim of this study is to assess the quality of German-language information on schizophrenia on the internet. Two searches of the terms schizophrenia and psychosis were conducted, using the Google search engine set to produce only German hits. The quality of the first hundred resulting sites was assessed according to a range of criteria, including diagnosis and therapy, links and interactive offers. Evidence-based medical information was provided by more than half of the sites resulting from the search term schizophrenia and by less than one third of the psychosis hits. Information and discussion on the relationship between drugs and psychosis appeared almost exclusively under the term psychosis. It is suggested that mental health care professionals can use knowledge of what sort of information their patients are confronted with on the internet in order to assist them in profiting from this source of information.

  18. Multi-Decadal Change of Atmospheric Aerosols and Their Effect on Surface Radiation

    NASA Technical Reports Server (NTRS)

    Chin, Mian; Diehl, Thomas; Tan, Qian; Wild, Martin; Qian, Yun; Yu, Hongbin; Bian, Huisheng; Wang, Weiguo

    2012-01-01

    We present an investigation of multi-decadal changes of atmospheric aerosols and their effects on surface radiation using a global chemistry transport model along with near-term to long-term data records. We focus on a 28-year time period of the satellite era from 1980 to 2007, during which a suite of aerosol data from satellite observations and ground-based remote sensing and in-situ measurements has become available. We analyze the long-term global and regional aerosol optical depth and concentration trends and their relationship to the changes of emissions, and assess the role aerosols play in the multi-decadal change of solar radiation reaching the surface (known as "dimming" or "brightening") in different regions of the world, including the major anthropogenic source regions (North America, Europe, Asia) that have been experiencing considerable changes of emissions, dust and biomass burning regions that have large interannual variabilities, downwind regions that are directly affected by the changes in the source area, and remote regions that are considered to represent "background" conditions.

  19. Upper and lower bounds of ground-motion variabilities: implication for source properties

    NASA Astrophysics Data System (ADS)

    Cotton, Fabrice; Reddy-Kotha, Sreeram; Bora, Sanjay; Bindi, Dino

    2017-04-01

    One of the key challenges of seismology is to be able to analyse the physical factors that control earthquake and ground-motion variabilities. Such analysis is particularly important for calibrating physics-based simulations and seismic hazard estimations at high frequencies. Within the framework of ground-motion prediction equation (GMPE) development, ground-motion residuals (differences between recorded ground motions and the values predicted by a GMPE) are computed. The exponential growth of seismological near-source records and modern GMPE analysis techniques allow these residuals to be partitioned into between-event and within-event components. In particular, the between-event term quantifies all those repeatable source effects (e.g. related to stress-drop or kappa-source variability) which have not been accounted for by the magnitude-dependent term of the model. In this presentation, we first discuss the between-event variabilities computed both in the Fourier and response spectra domains, using recent high-quality global accelerometric datasets (e.g. NGA-West2, RESORCE, KiK-net). These analyses lead to the assessment of upper bounds for the ground-motion variability. Then, we compare these upper bounds with lower bounds estimated by analysing seismic sequences which occurred on specific fault systems (e.g., located in central Italy or in Japan). We show that the lower bounds of between-event variabilities are surprisingly large, which indicates a large variability of earthquake dynamic properties even within the same fault system. Finally, these upper and lower bounds of ground-shaking variability are discussed in terms of the variability of earthquake physical properties (e.g., stress-drop and kappa-source).

  20. Real-Time N2O Gas Detection System for Agricultural Production Using a 4.6-μm-Band Laser Source Based on a Periodically Poled LiNbO3 Ridge Waveguide

    PubMed Central

    Tokura, Akio; Asobe, Masaki; Enbutsu, Koji; Yoshihara, Toshihiro; Hashida, Shin-nosuke; Takenouchi, Hirokazu

    2013-01-01

    This article describes a gas monitoring system for detecting nitrous oxide (N2O) gas using a compact mid-infrared laser source based on difference-frequency generation in a quasi-phase-matched LiNbO3 waveguide. We obtained a stable output power of 0.62 mW from a 4.6-μm-band continuous-wave laser source operating at room temperature. This laser source enabled us to detect atmospheric N2O gas at a concentration as low as 35 parts per billion. Using this laser source, we constructed a new real-time in-situ monitoring system for detecting N2O gas emitted from potted plants. A few weeks of monitoring with the developed detection system revealed a strong relationship between nitrogen fertilization and N2O emission. This system is promising for the in-situ long-term monitoring of N2O in agricultural production, and it is also applicable to the detection of other greenhouse gases. PMID:23921829
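
    As a quick sanity check on the wavelengths involved (standard difference-frequency-generation energy conservation, not a detail taken from the paper): the generated mid-infrared wave sits at the difference of the pump and signal frequencies. For instance, a hypothetical ~1.06 μm pump combined with a ~1.37 μm signal would yield an idler near 4.7 μm, in the band this sensor uses.

        % Photon-energy conservation in difference-frequency generation:
        \begin{equation}
        \omega_{\mathrm{idler}} = \omega_{\mathrm{pump}} - \omega_{\mathrm{signal}}
        \qquad\Longleftrightarrow\qquad
        \frac{1}{\lambda_{\mathrm{idler}}} =
        \frac{1}{\lambda_{\mathrm{pump}}} - \frac{1}{\lambda_{\mathrm{signal}}}
        \end{equation}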

  1. Real-time N2O gas detection system for agricultural production using a 4.6-µm-band laser source based on a periodically poled LiNbO3 ridge waveguide.

    PubMed

    Tokura, Akio; Asobe, Masaki; Enbutsu, Koji; Yoshihara, Toshihiro; Hashida, Shin-nosuke; Takenouchi, Hirokazu

    2013-08-05

    This article describes a gas monitoring system for detecting nitrous oxide (N2O) gas using a compact mid-infrared laser source based on difference-frequency generation in a quasi-phase-matched LiNbO3 waveguide. We obtained a stable output power of 0.62 mW from a 4.6-μm-band continuous-wave laser source operating at room temperature. This laser source enabled us to detect atmospheric N2O gas at a concentration as low as 35 parts per billion. Using this laser source, we constructed a new real-time in-situ monitoring system for detecting N2O gas emitted from potted plants. A few weeks of monitoring with the developed detection system revealed a strong relationship between nitrogen fertilization and N2O emission. This system is promising for the in-situ long-term monitoring of N2O in agricultural production, and it is also applicable to the detection of other greenhouse gases.

  2. Why Save Wilderness?--Fruits and Veggies!

    ERIC Educational Resources Information Center

    Kowalewski, David

    2015-01-01

    Why save wilderness? Environmental educators usually offer ecosystemic and aesthetic reasons, yet clearly this abstract approach has failed to resonate with the wider public. In this article I adopt a nutritional strategy based on a broad array of sources. Wild plant food, in terms of economics, ubiquity, and other measures, performs very well…

  3. The telecommunications and data acquisition report

    NASA Technical Reports Server (NTRS)

    Renzetti, N. A. (Editor)

    1982-01-01

    Developments in Earth-based radio technology are reported. The Deep Space Network is discussed in terms of its advanced systems, network and facility engineering and implementation, operations, and energy sources. Problems in pulse communication and radio frequency interference are addressed with emphasis on pulse position modulation and laser beam collimation.

  4. Gestalt concept of closure: a construct without closure.

    PubMed

    Wasserstein, Jeanette

    2002-12-01

    This comment reviews the original Gestalt literature which introduced the concept of 'closure'. It is argued that the meaning of 'closure' was confounded in the source literature and, thus, the term connotes more than it denotes. Research based on different measures of this ambiguous construct inevitably may not always converge.

  5. 25 CFR 41.25 - Reports.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... preceding academic year, the annual cost of the education programs of the College from all sources for such academic year, and a final report of the performance based upon the criteria set forth in the College's... of Education its FTE Indian Student enrollment for each academic term of the academic year within...

  6. A novel method for detecting light source for digital images forensic

    NASA Astrophysics Data System (ADS)

    Roy, A. K.; Mitra, S. K.; Agrawal, R.

    2011-06-01

    Manipulation of images has been practiced for centuries. These manipulated images are intended to alter facts: facts of ethics, morality, politics, sex, celebrity or chaos. Image forensic science is used to detect such manipulations in a digital image. There are several standard ways to analyze an image for manipulation, each with its limitations. Moreover, very few methods try to capitalize on the way the image was taken by the camera. We propose a new method based on light and shade, as these are the fundamental input resources that carry the information of the image. The proposed method measures the direction of the light source and uses this light-based technique to identify any intentional partial manipulation in the digital image. The method is tested on known manipulated images and correctly identifies the light sources. The light source of an image is measured in terms of angle. The experimental results show the robustness of the methodology.

  7. Inverse Source Data-Processing Strategies for Radio-Frequency Localization in Indoor Environments.

    PubMed

    Gennarelli, Gianluca; Al Khatib, Obada; Soldovieri, Francesco

    2017-10-27

    Indoor positioning of mobile devices plays a key role in many aspects of our daily life. These include real-time people tracking and monitoring, activity recognition, emergency detection, navigation, and numerous location-based services. Although many wireless technologies and data-processing algorithms have been developed in recent years, indoor positioning is still a problem subject to intensive research. This paper deals with active radio-frequency (RF) source localization in indoor scenarios. The localization task is carried out at the physical layer thanks to receiving sensor arrays deployed on the border of the surveillance region to record the signal emitted by the source. The localization problem is formulated as an imaging one by taking advantage of the inverse source approach. Different measurement configurations and data-processing/fusion strategies are examined to investigate their effectiveness in terms of localization accuracy under both line-of-sight (LOS) and non-line-of-sight (NLOS) conditions. Numerical results based on full-wave synthetic data are reported to support the analysis.

  8. 2D joint inversion of CSAMT and magnetic data based on cross-gradient theory

    NASA Astrophysics Data System (ADS)

    Wang, Kun-Peng; Tan, Han-Dong; Wang, Tao

    2017-06-01

    A two-dimensional forward and inverse algorithm for the controlled-source audio-frequency magnetotelluric (CSAMT) method is developed to invert data in the entire region (near, transition, and far) and deal with the effects of artificial sources. First, a regularization factor is introduced into the 2D magnetic inversion, and the magnetic susceptibility is updated in logarithmic form so that the inverted magnetic susceptibility is always positive. Second, the joint inversion of the CSAMT and magnetic methods is completed with the introduction of the cross gradient. By searching for the weight of the cross-gradient term in the objective function, the mutual influence between two different physical properties at different locations is avoided. Model tests show that the joint inversion based on cross-gradient theory offers better results than the single-method inversion. The 2D forward and inverse algorithm for CSAMT with source can effectively deal with artificial sources and ensures the reliability of the final joint inversion algorithm.
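
    For reference, the cross-gradient coupling term at the heart of this kind of joint inversion, in the generic form commonly used in the literature (the abstract does not spell out its exact discretization), is the cross product of the gradients of the two property models, here resistivity m1 and magnetic susceptibility m2. It vanishes wherever the two models' structural boundaries align, enforcing structural similarity without forcing a property-property relationship:

        % Cross-gradient structural coupling; t = 0 wherever the gradients
        % of the two models are parallel (aligned structure):
        \begin{equation}
        \mathbf{t}(x, z) = \nabla m_1(x, z) \times \nabla m_2(x, z)
        \end{equation}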

  9. Inverse Source Data-Processing Strategies for Radio-Frequency Localization in Indoor Environments

    PubMed Central

    Gennarelli, Gianluca; Al Khatib, Obada; Soldovieri, Francesco

    2017-01-01

    Indoor positioning of mobile devices plays a key role in many aspects of our daily life. These include real-time people tracking and monitoring, activity recognition, emergency detection, navigation, and numerous location-based services. Although many wireless technologies and data-processing algorithms have been developed in recent years, indoor positioning is still a problem subject to intensive research. This paper deals with active radio-frequency (RF) source localization in indoor scenarios. The localization task is carried out at the physical layer thanks to receiving sensor arrays deployed on the border of the surveillance region to record the signal emitted by the source. The localization problem is formulated as an imaging one by taking advantage of the inverse source approach. Different measurement configurations and data-processing/fusion strategies are examined to investigate their effectiveness in terms of localization accuracy under both line-of-sight (LOS) and non-line-of-sight (NLOS) conditions. Numerical results based on full-wave synthetic data are reported to support the analysis. PMID:29077071

  10. Programmable solid state atom sources for nanofabrication.

    PubMed

    Han, Han; Imboden, Matthias; Stark, Thomas; del Corro, Pablo G; Pardo, Flavio; Bolle, Cristian A; Lally, Richard W; Bishop, David J

    2015-06-28

    In this paper we discuss the development of a MEMS-based solid state atom source that can provide controllable atom deposition ranging over eight orders of magnitude, from ten atoms per square micron up to hundreds of atomic layers, on a target ∼1 mm away. Using a micron-scale silicon plate as a thermal evaporation source, we demonstrate the deposition of indium, silver, gold, copper, iron, aluminum, lead and tin. Because of their small sizes and rapid thermal response times, pulse width modulation techniques are a powerful way to control the atomic flux. Pulsing the source with precise voltages and timing provides control in terms of when and how many atoms get deposited. By arranging many of these devices into an array, one has a multi-material, programmable solid state evaporation source. These micro atom sources are a complementary technology that can enhance the capability of a variety of nano-fabrication techniques.
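
    To illustrate why pulse width modulation gives such wide dynamic range, here is a back-of-the-envelope dose estimate. The flux value and function name are illustrative assumptions, not device parameters from the paper; the real ON-state flux depends on temperature, material, and source-target geometry.

        def deposited_dose(flux_on, duty_cycle, period_s, n_pulses):
            # flux_on: atoms / um^2 / s while the heater pulse is ON
            # (an assumed calibration value). Total dose is simply flux
            # integrated over the accumulated ON time.
            return flux_on * duty_cycle * period_s * n_pulses

        # Example: 5000 pulses with a 1 ms period at 10% duty cycle give
        # 0.5 s of ON time; with flux_on = 2000 atoms/um^2/s this
        # deposits ~1000 atoms/um^2 on the target.
        dose = deposited_dose(flux_on=2000.0, duty_cycle=0.10,
                              period_s=1e-3, n_pulses=5000)

    Sweeping the duty cycle and pulse count over a few decades each is what lets a single device span the eight orders of magnitude quoted above.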

  11. Evaluation of Chemistry-Climate Model Results using Long-Term Satellite and Ground-Based Data

    NASA Technical Reports Server (NTRS)

    Stolarski, Richard S.

    2005-01-01

    Chemistry-climate models attempt to bring together our best knowledge of the key processes that govern the composition of the atmosphere and its response to changes in forcing. We test these models on a process-by-process basis by comparing model results to data from many sources. A more difficult task is testing the model response to changes. One way to do this is to use the natural and anthropogenic experiments that have been done on the atmosphere and are continuing to be done. These include the volcanic eruptions of El Chichon and Pinatubo, the solar cycle, and the injection of chlorine and bromine from CFCs and methyl bromide. The test of the models' response to these experiments is their ability to reproduce the long-term variations in ozone and the trace gases that affect ozone. We now have more than 25 years of satellite ozone data. We have more than 15 years of satellite and ground-based data on HCl, HNO3, and many other gases. I will discuss the testing of models using long-term satellite data sets, long-term measurements from the Network for Detection of Stratospheric Change (NDSC), and long-term ground-based measurements of ozone.

  12. A Well-Balanced Path-Integral f-Wave Method for Hyperbolic Problems with Source Terms

    PubMed Central

    2014-01-01

    Systems of hyperbolic partial differential equations with source terms (balance laws) arise in many applications where it is important to compute accurate time-dependent solutions modeling small perturbations of equilibrium solutions in which the source terms balance the hyperbolic part. The f-wave version of the wave-propagation algorithm is one approach, but requires the use of a particular averaged value of the source terms at each cell interface in order to be “well balanced” and exactly maintain steady states. A general approach to choosing this average is developed using the theory of path conservative methods. A scalar advection equation with a decay or growth term is introduced as a model problem for numerical experiments. PMID:24563581
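
    In outline, using standard f-wave notation (consistent with, but not quoted from, the abstract): the flux difference across each cell interface, corrected by an interface-averaged source term Ψ, is decomposed into propagating f-waves. Choosing the average so that the corrected difference vanishes at equilibrium is exactly what makes the method well balanced.

        % Flux difference at interface i-1/2, corrected by an averaged
        % source term, decomposed into f-waves Z^p:
        \begin{equation}
        f(Q_i) - f(Q_{i-1}) - \Delta x\, \Psi_{i-1/2}
        \;=\; \sum_{p} \mathcal{Z}^p_{i-1/2}
        \end{equation}
        % At a steady state the left-hand side is zero, so all f-waves
        % vanish and the discrete equilibrium is preserved exactly.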

  13. Multimodal Medical Image Fusion by Adaptive Manifold Filter.

    PubMed

    Geng, Peng; Liu, Shuaiqi; Zhuang, Shanna

    2015-01-01

    Medical image fusion plays an important role in the diagnosis and treatment of diseases, for example in image-guided radiotherapy and surgery. A modified local contrast measure is proposed to fuse multimodal medical images. Firstly, the adaptive manifold filter is introduced to filter the source images, yielding the low-frequency part of the modified local contrast. Secondly, the modified spatial frequency of the source images is adopted as the high-frequency part of the modified local contrast. Finally, the pixel with the larger modified local contrast is selected for the fused image. The presented scheme outperforms the guided filter method in the spatial domain, the dual-tree complex wavelet transform-based method, the nonsubsampled contourlet transform-based method, and four classic fusion methods in terms of visual quality. Furthermore, the mutual information values of the presented method are on average 55%, 41%, and 62% higher than those of the three methods, and its edge-based similarity measure values are on average 13%, 33%, and 14% higher than those of the three methods, for the six pairs of source images.
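
    A rough sketch of the selection rule described above follows, with a Gaussian filter standing in for the adaptive manifold filter and a gradient-magnitude energy standing in for the modified spatial frequency; both substitutions are simplifying assumptions, not the paper's operators.

        # Hedged sketch: per-pixel fusion of two registered images by the larger
        # "local contrast" = smoothed gradient energy / smoothed base. Gaussian
        # smoothing stands in for the adaptive manifold filter of the paper.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        def local_contrast(img, sigma=2.0, eps=1e-6):
            base = gaussian_filter(img, sigma)                 # low-frequency part
            gy, gx = np.gradient(img)
            detail = gaussian_filter(np.hypot(gx, gy), sigma)  # high-frequency part
            return detail / (base + eps)

        def fuse(img_a, img_b):
            mask = local_contrast(img_a) >= local_contrast(img_b)
            return np.where(mask, img_a, img_b)

        # Usage: fused = fuse(ct_slice, mri_slice) on co-registered float arrays.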

  14. Evaluation of stormwater micropollutant source control and end-of-pipe control strategies using an uncertainty-calibrated integrated dynamic simulation model.

    PubMed

    Vezzaro, L; Sharma, A K; Ledin, A; Mikkelsen, P S

    2015-03-15

    The estimation of micropollutant (MP) fluxes in stormwater systems is a fundamental prerequisite when preparing strategies to reduce stormwater MP discharges to natural waters. Dynamic integrated models can be important tools in this step, as they can be used to integrate the limited data provided by monitoring campaigns and to evaluate the performance of different strategies based on model simulation results. This study presents an example where six different control strategies, including both source-control and end-of-pipe treatment, were compared. The comparison focused on fluxes of heavy metals (copper, zinc) and organic compounds (fluoranthene). MP fluxes were estimated by using an integrated dynamic model, in combination with stormwater quality measurements. MP sources were identified by using GIS land usage data, runoff quality was simulated by using a conceptual accumulation/washoff model, and a stormwater retention pond was simulated by using a dynamic treatment model based on MP inherent properties. Uncertainty in the results was estimated with a pseudo-Bayesian method. Despite the great uncertainty in the MP fluxes estimated by the runoff quality model, it was possible to compare the six scenarios in terms of discharged MP fluxes, compliance with water quality criteria, and sediment accumulation. Source-control strategies obtained better results in terms of reduction of MP emissions, but all the simulated strategies failed in fulfilling the criteria based on emission limit values. The results presented in this study show how the efficiency of MP pollution control strategies can be quantified by combining advanced modeling tools (integrated stormwater quality model, uncertainty calibration). Copyright © 2014 Elsevier Ltd. All rights reserved.
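
    For readers unfamiliar with the model class, a conceptual accumulation/washoff model of the kind mentioned above can be sketched in a few lines. The exponential buildup and rain-driven washoff forms, and all parameter values, are generic textbook assumptions rather than the study's calibrated model.

        # Hedged sketch of a conceptual accumulation/washoff model for one
        # pollutant. Buildup relaxes toward a maximum between rain events; rain
        # removes mass in proportion to intensity and the mass on the surface.
        # All parameter values are illustrative, not the study's calibration.

        def simulate(rain_mm_h, dt_h=1.0, b_max=10.0, k_build=0.05, k_wash=0.2):
            """Return (surface_mass, washoff_flux) series per unit area."""
            mass, masses, fluxes = 0.0, [], []
            for i in rain_mm_h:
                mass += k_build * (b_max - mass) * dt_h    # exponential buildup
                flux = k_wash * i * mass * dt_h            # rain-driven washoff
                mass -= flux
                masses.append(mass)
                fluxes.append(flux)
            return masses, fluxes

        # Example: five dry days, then a 6-hour storm of 4 mm/h.
        masses, fluxes = simulate([0.0] * 120 + [4.0] * 6)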

  15. Effects of metals within ambient air particulate matter (PM) on human health.

    PubMed

    Chen, Lung Chi; Lippmann, Morton

    2009-01-01

    We review literature providing insights on health-related effects caused by inhalation of ambient air particulate matter (PM) containing metals, emphasizing effects associated with in vivo exposures at or near contemporary atmospheric concentrations. Inhalation of much higher concentrations, and high-level exposures via intratracheal (IT) instillation that inform mechanistic processes, are also reviewed. The most informative studies of effects at realistic exposure levels, in terms of identifying influential individual PM components or source-related mixtures, have been based on (1) human and laboratory animal exposures to concentrated ambient particles (CAPs), and (2) human population studies for which both health-related effects were observed and PM composition data were available for multipollutant regression analyses or source apportionment. Such studies have implicated residual oil fly ash (ROFA) as the most toxic source-related mixture, and Ni and V, which are characteristic tracers of ROFA, as particularly influential components in terms of acute cardiac function changes and excess short-term mortality. There is evidence that other metals within ambient air PM, such as Pb and Zn, also affect human health. Most evidence now available is based on the use of ambient air PM component concentration data, rather than actual exposures, to determine significant associations and/or effect coefficients. Therefore, considerable uncertainties about causality are associated with exposure misclassification and measurement errors. As more PM speciation data and more refined modeling techniques become available, and as more CAPs studies involving PM component analyses are performed, the roles of specific metals and other components within PM will become clearer.

  16. Assessment of infrasound signals recorded on seismic stations and infrasound arrays in the western United States using ground truth sources

    NASA Astrophysics Data System (ADS)

    Park, Junghyun; Hayward, Chris; Stump, Brian W.

    2018-06-01

    Ground truth sources in Utah during 2003-2013 are used to assess the contribution of temporal atmospheric conditions to infrasound detection and the predictive capabilities of atmospheric models. Ground truth sources consist of 28 long duration static rocket motor burn tests and 28 impulsive rocket body demolitions. Automated infrasound detections from a hybrid network of regional seismometers and infrasound arrays use a combination of short-term time average/long-term time average (STA/LTA) ratios and spectral analyses. These detections are grouped into station triads using a Delaunay triangulation network and then associated to estimate phase velocity and azimuth, filtering signals associated with a particular source location. The resulting range and azimuth distribution from sources to detecting stations varies seasonally and is consistent with predictions based on seasonal atmospheric models. Impulsive signals from rocket body detonations are observed at greater distances (>700 km) than the extended duration signals generated by the rocket burn tests (up to 600 km). Infrasound energy attenuation associated with the two source types is quantified as a function of range and azimuth from infrasound amplitude measurements. Ray-tracing results using Ground-to-Space atmospheric specifications are compared to these observations and illustrate the degree to which the time variations in the characteristics of the observations can be predicted over a multi-year period.
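
    The STA/LTA trigger mentioned above is compact enough to sketch. The window lengths and threshold below are illustrative choices, not the study's settings.

        # Hedged sketch of a short-term-average / long-term-average detector.
        # Window lengths and threshold are illustrative, not the study's values.
        import numpy as np

        def sta_lta(trace, fs, sta_s=2.0, lta_s=60.0):
            """STA/LTA ratio of the squared signal (characteristic function)."""
            cf = trace.astype(float) ** 2
            n_sta, n_lta = int(sta_s * fs), int(lta_s * fs)
            sta = np.convolve(cf, np.ones(n_sta) / n_sta, mode="same")
            lta = np.convolve(cf, np.ones(n_lta) / n_lta, mode="same")
            return sta / np.maximum(lta, 1e-12)

        def detect(trace, fs, threshold=4.0):
            """Sample indices where the ratio first crosses the threshold."""
            above = sta_lta(trace, fs) > threshold
            return np.flatnonzero(above[1:] & ~above[:-1]) + 1

        # Usage: onsets = detect(infrasound_trace, fs=40.0)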

  17. A new method of optimal capacitor switching based on minimum spanning tree theory in distribution systems

    NASA Astrophysics Data System (ADS)

    Li, H. W.; Pan, Z. Y.; Ren, Y. B.; Wang, J.; Gan, Y. L.; Zheng, Z. Z.; Wang, W.

    2018-03-01

    Considering the radial operation characteristics of distribution systems, this paper proposes a new method for optimal capacitor switching based on the minimum spanning tree. Firstly, taking minimal active power loss as the objective function and ignoring the capacity constraints of the capacitors and the source, the Prim algorithm, one of the minimum spanning tree algorithms, is used to obtain the power supply ranges of the capacitors and the source. Then, with the capacity constraints of the capacitors considered, the capacitors are ranked by breadth-first search. In order of the capacitor ranking from high to low, each capacitor's compensation capacity is calculated based on its power supply range. Finally, the IEEE 69-bus system is adopted to test the accuracy and practicality of the proposed algorithm.
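
    A minimal sketch of the Prim step follows, run on a toy weighted graph rather than an actual distribution feeder; node names and weights are hypothetical.

        # Hedged sketch: Prim's algorithm, the minimum-spanning-tree step used
        # above to delimit power supply ranges. Graph data are hypothetical.
        import heapq

        def prim(graph, root):
            """graph: {node: {neighbor: weight}}; returns MST edges (u, v, w)."""
            visited, edges, heap = {root}, [], []
            for v, w in graph[root].items():
                heapq.heappush(heap, (w, root, v))
            while heap:
                w, u, v = heapq.heappop(heap)
                if v in visited:
                    continue
                visited.add(v)
                edges.append((u, v, w))
                for nxt, w2 in graph[v].items():
                    if nxt not in visited:
                        heapq.heappush(heap, (w2, v, nxt))
            return edges

        g = {"src": {"a": 1, "b": 4}, "a": {"src": 1, "b": 2}, "b": {"src": 4, "a": 2}}
        print(prim(g, "src"))   # [('src', 'a', 1), ('a', 'b', 2)]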

  18. Toward an Integrated Executable Architecture and M&S Based Analysis for Counter Terrorism and Homeland Security

    DTIC Science & Technology

    2006-09-01

    Lavoie, D. Kurts, SYNTHETIC ENVIRONMENTS AT THE ENTREPRISE LEVEL: OVERVIEW OF A GOVERNMENT OF CANADA (GOC), ACADEMIA and INDUSTRY DISTRIBUTED...vehicle (UAV) focused to locate the radiological source, and by comparing the performance of these assets in terms of various capability based...framework to analyze homeland security capabilities • Illustrate how a rapidly configured distributed simulation involving academia, industry and

  19. Green materials for sustainable development

    NASA Astrophysics Data System (ADS)

    Purwasasmita, B. S.

    2017-03-01

    Sustainable development is a multidisciplinary concept combining ecological, social and economic aspects to construct a liveable human living system. Sustainable development can be supported through the development of green materials. Green materials offer unique characteristics and properties, including abundance in nature, low toxicity, economic affordability and versatility in terms of physical and chemical properties. Green materials can be applied in numerous fields of science and technology, including energy, building, construction and infrastructure, materials science and engineering, and pollution management and technology. For instance, green materials can be developed as a source for energy production: biomass-based sources can be developed for biodiesel and bioethanol production. Biomass-based materials can also be transformed into advanced functionalized materials for bio-applications, such as the transformation of chitin into chitosan, which is further used for biomedicine, biomaterials and tissue engineering. Recently, cellulose-based and lignocellulose-based materials have attracted attention as sources for developing functional materials for biomaterials, reinforcing materials and nanotechnology. Furthermore, the development of pigment materials from green sources has gained interest due to their unique properties. Finally, Indonesia, as a large country with great biodiversity, can advance the development of green materials to strengthen national competitiveness and develop materials technology for the future.

  20. 26 CFR 1.737-1 - Recognition of precontribution gain.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Property A1 and Property A2 is long-term, U.S.-source capital gain or loss. The character of gain on Property A3 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real... long-term, U.S.-source capital gain ($10,000 gain on Property A1 and $8,000 loss on Property A2) and $1...

  1. The availability of health information system for decision-making with evidence-based medicine approach-a case study: Kermanshah, Iran.

    PubMed

    Safari, Ameneh; Safari, Yahya

    2018-08-01

    Evidence-based medicine (EBM) is the proper and judicious use of the best available evidence in clinical decision-making for patient care. This study was conducted to evaluate the health information system for decision-making with an EBM approach in the educational hospitals of Kermanshah city. The statistical population included all specialists, subspecialists and head nurses of the educational hospitals in Kermanshah city. Data were collected with a researcher-made questionnaire. The content validity of the questionnaire was confirmed by experts, and its reliability was evaluated using Cronbach's alpha coefficient. The results showed that access to internet sources is at a desirable level. There was a significant difference in at least one group in the availability of the hospital information system for EBM establishment in terms of access to internet-based data, according to academic major (P = 0.021). The sufficiency of the hospital information system for EBM establishment, in terms of the knowledge necessary for implementing it, also showed a statistically significant difference in at least one group according to academic major (P = 0.001). Kermanshah's hospitals are in a desirable condition in terms of access to internet sources and knowledge of EBM and its implementation, which indicates a desirable platform for decision-making with the EBM approach. However, regular educational courses for doctors and nurses are recommended in order to reach practical implementation of the EBM approach.

  2. Evaluation of a LED-based flatbed document scanner for radiochromic film dosimetry in transmission mode.

    PubMed

    Lárraga-Gutiérrez, José Manuel; García-Garduño, Olivia Amanda; Treviño-Palacios, Carlos; Herrera-González, José Alfredo

    2018-03-01

    Flatbed scanners are the most frequently used readout instrument for radiochromic film dosimetry because of their low cost and high spatial resolution, among other advantages. These scanners use a fluorescent lamp as the light source and a CCD array as the detector. Recently, manufacturers of flatbed scanners replaced the fluorescent lamp with light-emitting diodes (LEDs) as the light source. The goal of this work is to evaluate the performance of a commercial flatbed scanner with an LED-based light source for radiochromic film dosimetry. Film readout consistency, response uniformity, film-scanner sensitivity, long-term stability and total dose uncertainty were evaluated. Overall, the performance of the LED flatbed scanner is comparable to that of a cold-cathode fluorescent lamp (CCFL) scanner. There are important spectral differences between LED and CCFL lamps that result in a higher sensitivity of the LED scanner in the green channel. Total dose uncertainty, film response reproducibility and long-term stability of the LED scanner are slightly better than those of the CCFL. However, the LED-based scanner has a strongly non-uniform response, up to 9%, that must be adequately corrected for radiotherapy dosimetry QA. The differences in light emission spectra between LED and CCFL lamps and their potential impact on film-scanner sensitivity suggest that the design of a dedicated flatbed scanner with LEDs may improve sensitivity and dose uncertainty in radiochromic film dosimetry. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  3. Source apportionment of speciated PM2.5 and non-parametric regressions of PM2.5 and PM(coarse) mass concentrations from Denver and Greeley, Colorado, and construction and evaluation of dichotomous filter samplers

    NASA Astrophysics Data System (ADS)

    Piedrahita, Ricardo A.

    The Denver Aerosol Sources and Health study (DASH) was a long-term study of the relationship between the variability in fine particulate mass and chemical constituents (PM2.5, particulate matter smaller than 2.5 μm) and adverse health effects such as cardio-respiratory illnesses and mortality. Daily filter samples were chemically analyzed for multiple species. We present findings based on 2.8 years of DASH data, from 2003 to 2005. Multilinear Engine 2 (ME-2), a receptor-based source apportionment model, was applied to the data to estimate source contributions to PM2.5 mass concentrations. This study relied on two different ME-2 models: (1) a 2-way model that closely reflects PMF-2; and (2) an enhanced model with meteorological data that used additional temporal and meteorological factors. The Coarse Rural Urban Sources and Health study (CRUSH) is a long-term study of the relationship between the variability in coarse particulate mass (PMcoarse, particulate matter between 2.5 and 10 μm) and adverse health effects such as cardio-respiratory illnesses, pre-term births, and mortality. Hourly mass concentrations of PMcoarse and fine particulate matter (PM2.5) are measured using tapered element oscillating microbalances (TEOMs) with Filter Dynamics Measurement Systems (FDMS) at two rural and two urban sites. We present findings based on nine months of mass concentration data, including temporal trends and non-parametric regression (NPR) results, which were used to characterize the wind speed and wind direction relationships that might point to sources. As part of CRUSH, a 1-year coarse and fine mode particulate matter filter sampling network will allow us to characterize the chemical composition of the collected particulate matter and perform spatial comparisons. This work describes the construction and validation testing of four dichotomous filter samplers for this purpose. The use of dichotomous splitters with an approximate 2.5 μm cut point, coupled with a 10 μm cut diameter inlet head, allows us to collect the separated size fractions that the collocated TEOMs measure continuously. Chemical analysis of the filters will include inorganic ions, organic compounds, EC, OC, and biological analyses. Side-by-side testing showed the cut diameters were in agreement with each other and with a well-characterized virtual impactor lent to the group by the University of Southern California. Error propagation was performed and uncertainty results were similar to the observed standard deviations.
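
    Receptor models such as PMF-2 and ME-2 solve a constrained bilinear factorization of the sample-by-species concentration matrix. As a rough illustration only, the sketch below uses plain non-negative matrix factorization on placeholder data; it omits ME-2's uncertainty weighting and the meteorological covariates used in this work.

        # Hedged sketch: unweighted NMF as a stand-in for PMF/ME-2 apportionment.
        # X (samples x species) ~ G (source contributions) @ F (source profiles).
        # Real receptor models also weight residuals by measurement uncertainty.
        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(0)
        X = rng.gamma(2.0, 1.0, size=(300, 20))   # placeholder concentration data

        model = NMF(n_components=5, init="nndsvda", max_iter=500)
        G = model.fit_transform(X)                # daily contribution per factor
        F = model.components_                     # chemical profile per factor

        # Each row of F would be inspected for tracer species (e.g., EC for
        # traffic) to assign a physical source identity to the factor.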

  4. A UMLS-based spell checker for natural language processing in vaccine safety.

    PubMed

    Tolentino, Herman D; Matters, Michael D; Walop, Wikke; Law, Barbara; Tong, Wesley; Liu, Fang; Fontelo, Paul; Kohl, Katrin; Payne, Daniel C

    2007-02-12

    The Institute of Medicine has identified patient safety as a key goal for health care in the United States. Detecting vaccine adverse events is an important public health activity that contributes to patient safety. Reports about adverse events following immunization (AEFI) from surveillance systems contain free-text components that can be analyzed using natural language processing. To extract Unified Medical Language System (UMLS) concepts from free text and classify AEFI reports based on concepts they contain, we first needed to clean the text by expanding abbreviations and shortcuts and correcting spelling errors. Our objective in this paper was to create a UMLS-based spelling error correction tool as a first step in the natural language processing (NLP) pipeline for AEFI reports. We developed spell checking algorithms using open source tools. We used de-identified AEFI surveillance reports to create free-text data sets for analysis. After expansion of abbreviated clinical terms and shortcuts, we performed spelling correction in four steps: (1) error detection, (2) word list generation, (3) word list disambiguation and (4) error correction. We then measured the performance of the resulting spell checker by comparing it to manual correction. We used 12,056 words to train the spell checker and tested its performance on 8,131 words. During testing, sensitivity, specificity, and positive predictive value (PPV) for the spell checker were 74% (95% CI: 74-75), 100% (95% CI: 100-100), and 47% (95% CI: 46%-48%), respectively. We created a prototype spell checker that can be used to process AEFI reports. We used the UMLS Specialist Lexicon as the primary source of dictionary terms and the WordNet lexicon as a secondary source. We used the UMLS as a domain-specific source of dictionary terms to compare potentially misspelled words in the corpus. The prototype sensitivity was comparable to currently available tools, but the specificity was much superior. The slow processing speed may be improved by trimming it down to the most useful component algorithms. Other investigators may find the methods we developed useful for cleaning text using lexicons specific to their area of interest.

  5. A UMLS-based spell checker for natural language processing in vaccine safety

    PubMed Central

    Tolentino, Herman D; Matters, Michael D; Walop, Wikke; Law, Barbara; Tong, Wesley; Liu, Fang; Fontelo, Paul; Kohl, Katrin; Payne, Daniel C

    2007-01-01

    Background The Institute of Medicine has identified patient safety as a key goal for health care in the United States. Detecting vaccine adverse events is an important public health activity that contributes to patient safety. Reports about adverse events following immunization (AEFI) from surveillance systems contain free-text components that can be analyzed using natural language processing. To extract Unified Medical Language System (UMLS) concepts from free text and classify AEFI reports based on concepts they contain, we first needed to clean the text by expanding abbreviations and shortcuts and correcting spelling errors. Our objective in this paper was to create a UMLS-based spelling error correction tool as a first step in the natural language processing (NLP) pipeline for AEFI reports. Methods We developed spell checking algorithms using open source tools. We used de-identified AEFI surveillance reports to create free-text data sets for analysis. After expansion of abbreviated clinical terms and shortcuts, we performed spelling correction in four steps: (1) error detection, (2) word list generation, (3) word list disambiguation and (4) error correction. We then measured the performance of the resulting spell checker by comparing it to manual correction. Results We used 12,056 words to train the spell checker and tested its performance on 8,131 words. During testing, sensitivity, specificity, and positive predictive value (PPV) for the spell checker were 74% (95% CI: 74–75), 100% (95% CI: 100–100), and 47% (95% CI: 46%–48%), respectively. Conclusion We created a prototype spell checker that can be used to process AEFI reports. We used the UMLS Specialist Lexicon as the primary source of dictionary terms and the WordNet lexicon as a secondary source. We used the UMLS as a domain-specific source of dictionary terms to compare potentially misspelled words in the corpus. The prototype sensitivity was comparable to currently available tools, but the specificity was much superior. The slow processing speed may be improved by trimming it down to the most useful component algorithms. Other investigators may find the methods we developed useful for cleaning text using lexicons specific to their area of interest. PMID:17295907
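
    The four-step correction loop described in both records above can be sketched with standard library tools. The toy word list and the difflib similarity ranking below are stand-ins for the UMLS Specialist Lexicon lookup and the ranking actually used in the paper.

        # Hedged sketch of the four-step loop: detect, generate candidates,
        # disambiguate, correct. A toy word list stands in for the UMLS
        # Specialist Lexicon; difflib ranking stands in for the paper's method.
        import difflib

        LEXICON = {"fever", "rash", "injection", "site", "swelling", "seizure"}

        def correct_report(tokens):
            corrected = []
            for tok in tokens:
                word = tok.lower()
                if word in LEXICON:                    # step 1: error detection
                    corrected.append(tok)
                    continue
                cands = difflib.get_close_matches(     # step 2: candidate list
                    word, LEXICON, n=3, cutoff=0.7)
                if cands:                              # steps 3-4: pick and fix
                    corrected.append(cands[0])
                else:
                    corrected.append(tok)              # leave unknown tokens
            return corrected

        print(correct_report("swellin at injecton site".split()))
        # ['swelling', 'at', 'injection', 'site']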

  6. Bayesian source term determination with unknown covariance of measurements

    NASA Astrophysics Data System (ADS)

    Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav

    2017-04-01

    Determination of the source term of a release of hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimating the source term in the conventional linear inverse problem y = Mx, where the relationship between the vector of observations y and the unknown source term x is described by the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as the optimization problem min_x (y - Mx)^T R^{-1} (y - Mx) + x^T B^{-1} x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization takes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix of the likelihood, R, is also unknown. We consider two potential choices for the structure of the matrix R: first a diagonal matrix, and second a locally correlated structure using information on the topology of the measuring network. Since the inference of the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated by applying the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
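
    A heavily simplified sketch of the alternating idea behind such algorithms follows: with R fixed, solve the regularized least-squares problem for x, then re-estimate the diagonal of B from the current solution, and iterate. This mimics the spirit, not the detail, of the variational Bayes algorithm; all data below are synthetic.

        # Hedged sketch: alternate between solving the Tikhonov-type problem
        # for x and re-estimating the diagonal prior covariance B from the
        # current x. Mimics the spirit, not the detail, of variational Bayes.
        import numpy as np

        def estimate_source(M, y, n_iter=20, r_var=1.0, eps=1e-8):
            n = M.shape[1]
            b = np.ones(n)                               # diagonal of B
            for _ in range(n_iter):
                A = M.T @ M / r_var + np.diag(1.0 / b)   # normal equations
                x = np.linalg.solve(A, M.T @ y / r_var)
                b = x ** 2 + eps                         # crude variance update
            return x

        # Toy problem: 40 observations, 10-element source term, known SRS M.
        rng = np.random.default_rng(1)
        M = rng.normal(size=(40, 10))
        x_true = np.array([0, 0, 3, 0, 0, 0, 1, 0, 0, 0], float)
        y = M @ x_true + 0.1 * rng.normal(size=40)
        print(np.round(estimate_source(M, y), 2))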

  7. Energy Spectra of Abundant Cosmic-ray Nuclei in Sources, According to the ATIC Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panov, A. D.; Sokolskaya, N. V.; Zatsepin, V. I., E-mail: panov@dec1.sinp.msu.ru

    One of the main results of the ATIC (Advanced Thin Ionization Calorimeter) experiment is a collection of energy spectra of abundant cosmic-ray nuclei: protons, He, C, O, Ne, Mg, Si, and Fe measured in terms of energy per particle in the energy range from 50 GeV to tens of teraelectronvolts. In this paper, the ATIC energy spectra of abundant primary nuclei are back-propagated to the spectra in sources in terms of magnetic rigidity using a leaky-box approximation of three different GALPROP-based diffusion models of propagation that fit the latest B/C data of the AMS-02 experiment. It is shown that the results of a comparison of the slopes of the spectra in sources are weakly model dependent; therefore the differences of spectral indices are reliable data. A regular growth of the steepness of spectra in sources in the range of magnetic rigidity of 50–1350 GV is found for a charge range from helium to iron. This conclusion is statistically reliable with significance better than 3.2 standard deviations. The results are discussed and compared to the data of other modern experiments.

  8. Remedial investigation work plan for Bear Creek Valley Operable Unit 4 (shallow groundwater in Bear Creek Valley) at the Oak Ridge Y-12 Plant, Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-07-01

    To effectively evaluate the cumulative impact of releases from multiple sources of contamination, a structured approach has been adopted for Oak Ridge Reservation (ORR) based on studies of the groundwater and surface water separate from studies of the sources. Based on the realization of the complexity of the hydrogeologic regime of the ORR, together with the fact that there are numerous sources contributing to groundwater contamination within a geographical area, it was agreed that more timely investigations, at perhaps less cost, could be achieved by separating the sources of contamination from the groundwater and surface water for investigation and remediation. The result will be more immediate attention [Records of Decision (RODs) for interim measures or removal actions] for the source Operable Units (OUs) while longer-term remediation investigations continue for the hydrogeologic regimes, which are labeled as integrator OUs. This remedial investigation work plan contains summaries of geographical, historical, operational, geological, and hydrological information specific to the unit. Taking advantage of the historical data base and ongoing monitoring activities and applying the observational approach to focus data gathering activities will allow the feasibility study to evaluate all probable or likely alternatives.

  9. Remedial Investigation work plan for Bear Creek Valley Operable Unit 4 (shallow groundwater in Bear Creek Valley) at the Oak Ridge Y-12 Plant, Oak Ridge, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-09-01

    To effectively evaluate the cumulative impact of releases from multiple sources of contamination, a structured approach has been adopted for Oak Ridge Reservation (ORR) based on studies of the groundwater and surface water separate from studies of the sources. Based on the realization of the complexity of the hydrogeologic regime of the ORR, together with the fact that there are numerous sources contributing to groundwater contamination within a geographical area, it was agreed that more timely investigations, at perhaps less cost, could be achieved by separating the sources of contamination from the groundwater and surface water for investigation and remediation. The result will be more immediate attention [Records of Decision (RODs) for interim measures or removal actions] for the source Operable Units (OUs) while longer-term remediation investigations continue for the hydrogeologic regimes, which are labeled as integrator OUs. This Remedial Investigation work plan contains summaries of geographical, historical, operational, geological, and hydrological information specific to the unit. Taking advantage of the historical data base and ongoing monitoring activities and applying the observational approach to focus data gathering activities will allow the Feasibility Study to evaluate all probable or likely alternatives.

  10. Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2005-01-01

    A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet, and is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparison with numerical simulations of a single steady micro jet on a flat plate in two and three dimensions, in which the jet was represented by a steady mass flow boundary condition. The source term model predicted the velocity distribution well compared with the two-dimensional plate case. The model was also compared with two three-dimensional flat plate cases: one with a grid generated to capture the circular shape of the jet, and one without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both three-dimensional cases. Velocity distributions before and after the jet were compared, and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or multiple steady micro jets, enabling preliminary investigations with minimal grid generation and computational time.
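
    The core idea, adding the jet's mass and momentum production to the discretized equations in the cells the jet occupies, can be sketched for a generic finite-volume residual. The sketch below is purely illustrative; it is not the OVERFLOW implementation, and all names and quantities are hypothetical.

        # Hedged sketch: add a steady micro jet's mass and momentum production
        # to the right-hand side of a finite-volume update. Illustrative only;
        # not the OVERFLOW implementation, and all quantities are hypothetical.
        import numpy as np

        def add_jet_source(rhs, jet_cells, cell_volumes, mdot, v_jet):
            """rhs: (ncells, 4) residuals [mass, mom_x, mom_y, mom_z].

            mdot  -- jet mass flow rate, split evenly over the covered cells
            v_jet -- jet exit velocity vector, shape (3,)
            """
            per_cell = mdot / len(jet_cells)
            v = np.asarray(v_jet, float)
            for c in jet_cells:
                rhs[c, 0] += per_cell / cell_volumes[c]        # mass source
                rhs[c, 1:4] += per_cell * v / cell_volumes[c]  # momentum source
            return rhs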

  11. QCD sum rules study of meson-baryon sigma terms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erkol, Gueray; Oka, Makoto; Turan, Guersevil

    2008-11-01

    The pion-baryon sigma terms and the strange-quark condensates of the octet and the decuplet baryons are calculated by employing the method of QCD sum rules. We evaluate the vacuum-to-vacuum transition matrix elements of two baryon interpolating fields in an external isoscalar-scalar field and use a Monte Carlo-based approach to systematically analyze the sum rules and the uncertainties in the results. We extract the ratios of the sigma terms, which have rather high accuracy and minimal dependence on QCD parameters. We discuss the sources of uncertainties and comment on possible strangeness content of the nucleon and the Delta.

  12. Apparatus And Method For Osl-Based, Remote Radiation Monitoring And Spectrometry

    DOEpatents

    Miller, Steven D.; Smith, Leon Eric; Skorpik, James R.

    2006-03-07

    Compact, OSL-based devices for long-term, unattended radiation detection and spectroscopy are provided. In addition, a method for extracting spectroscopic information from these devices is taught. The devices can comprise OSL pixels and at least one radiation filter surrounding at least a portion of the OSL pixels. The filter can modulate an incident radiation flux. The devices can further comprise a light source and a detector, both proximally located to the OSL pixels, as well as a power source and a wireless communication device, each operably connected to the light source and the detector. Power consumption of the device ranges from ultra-low to zero. The OSL pixels can retain data regarding incident radiation events as trapped charges. The data can be extracted wirelessly or manually. The method for extracting spectroscopic data comprises optically stimulating the exposed OSL pixels, detecting a readout luminescence, and reconstructing an incident-energy spectrum from the luminescence.

  13. Apparatus and method for OSL-based, remote radiation monitoring and spectrometry

    DOEpatents

    Smith, Leon Eric [Richland, WA; Miller, Steven D [Richland, WA; Bowyer, Theodore W [Oakton, VA

    2008-05-20

    Compact, OSL-based devices for long-term, unattended radiation detection and spectroscopy are provided. In addition, a method for extracting spectroscopic information from these devices is taught. The devices can comprise OSL pixels and at least one radiation filter surrounding at least a portion of the OSL pixels. The filter can modulate an incident radiation flux. The devices can further comprise a light source and a detector, both proximally located to the OSL pixels, as well as a power source and a wireless communication device, each operably connected to the light source and the detector. Power consumption of the device ranges from ultra-low to zero. The OSL pixels can retain data regarding incident radiation events as trapped charges. The data can be extracted wirelessly or manually. The method for extracting spectroscopic data comprises optically stimulating the exposed OSL pixels, detecting a readout luminescence, and reconstructing an incident-energy spectrum from the luminescence.

  14. Work Life Stress and Career Resilience of Licensed Nursing Facility Administrators.

    PubMed

    Myers, Dennis R; Rogers, Rob; LeCrone, Harold H; Kelley, Katherine; Scott, Joel H

    2018-04-01

    Career resilience provided a frame for understanding how Licensed Nursing Facility Administrators (LNFAs) sustain role performance and even thrive in stressful skilled nursing facility work environments. Quantitative and qualitative analyses of in-depth interviews with 18 LNFAs, averaging 24 years of experience, were conducted by a five-member research team. Analysis was informed by evidence-based frameworks for career resilience in the health professions as well as the National Association of Long-Term Care Administrator Boards' (NAB) five domains of competent administrative practice. Findings included six sources of work stressors and six sources of professional satisfaction. Participants also identified seven strategic principles and 10 administrative practices for addressing major sources of stress. Recommendations are provided for research and for evidence-based application of the career resilience perspective to LNFA practice, aimed at reducing role abandonment and energizing the delivery of the quality of care that each resident deserves.

  15. Annual Rates on Seismogenic Italian Sources with Models of Long-Term Predictability for the Time-Dependent Seismic Hazard Assessment In Italy

    NASA Astrophysics Data System (ADS)

    Murru, Maura; Falcone, Giuseppe; Console, Rodolfo

    2016-04-01

    The present study is carried out in the framework of the Center for Seismic Hazard (CPS) of INGV, under the agreement signed in 2015 with the Department of Civil Protection for developing a new seismic hazard model of the country that can update the current reference (MPS04-S1; zonesismiche.mi.ingv.it and esse1.mi.ingv.it) released between 2004 and 2006. In this initiative, we participate with the Long-Term Stress Transfer (LTST) Model to provide the annual occurrence rate of a seismic event over the entire Italian territory, from a minimum magnitude of Mw 4.5, considering bins of 0.1 magnitude units on geographical cells of 0.1° x 0.1°. Our methodology is based on the fusion of a statistical time-dependent renewal model (Brownian Passage Time, BPT; Matthews et al., 2002) with a physical model that considers the permanent stress change that a seismogenic source undergoes as a result of the earthquakes occurring on surrounding sources. For each catalog considered (historical, instrumental and individual seismogenic sources) we determined a distinct rate value for each 0.1° x 0.1° cell for the next 50 years. If a cell falls within one of the sources in question, we adopted the respective rate value, which refers only to the magnitude of the characteristic event; this rate value is divided by the number of grid cells that fall on the horizontal projection of the source. If instead a cell falls outside any seismogenic source, we considered the average rate value obtained from the historical and instrumental catalogs, using the method of Frankel (1995). The annual occurrence rate was computed for each of the three considered distributions (Poisson, BPT, and BPT with inclusion of stress transfer).
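
    For readers unfamiliar with the renewal ingredient, the conditional rupture probability under a BPT distribution can be sketched numerically. The density follows Matthews et al. (2002); the parameter values are hypothetical, not those of any Italian source.

        # Hedged sketch: conditional probability of an event in the next dt
        # years, given te years elapsed, under a Brownian Passage Time
        # (inverse Gaussian) renewal model. Parameter values are hypothetical.
        import numpy as np

        def bpt_pdf(t, mu, alpha):
            """BPT density with mean recurrence mu and aperiodicity alpha."""
            return np.sqrt(mu / (2.0 * np.pi * alpha**2 * t**3)) * \
                np.exp(-(t - mu) ** 2 / (2.0 * mu * alpha**2 * t))

        def _trap(y, x):
            return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

        def cond_prob(te, dt, mu, alpha, n=20000):
            """P(event in (te, te+dt] | quiet up to te)."""
            t_num = np.linspace(te, te + dt, n)
            t_den = np.linspace(te, te + 30.0 * mu, n)  # horizon ~ infinity
            return _trap(bpt_pdf(t_num, mu, alpha), t_num) / \
                _trap(bpt_pdf(t_den, mu, alpha), t_den)

        # e.g. mean recurrence 700 yr, aperiodicity 0.5, 300 yr elapsed:
        print(cond_prob(te=300.0, dt=50.0, mu=700.0, alpha=0.5))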

  16. Multi-wavelength mid-IR light source for gas sensing

    NASA Astrophysics Data System (ADS)

    Karioja, Pentti; Alajoki, Teemu; Cherchi, Matteo; Ollila, Jyrki; Harjanne, Mikko; Heinilehto, Noora; Suomalainen, Soile; Viheriälä, Jukka; Zia, Nouman; Guina, Mircea; Buczyński, Ryszard; Kasztelanic, Rafał; Kujawa, Ireneusz; Salo, Tomi; Virtanen, Sami; Kluczyński, Paweł; Sagberg, Håkon; Ratajczyk, Marcin; Kalinowski, Przemyslaw

    2017-02-01

    Cost-effective multi-wavelength light sources are key enablers for wide-scale penetration of gas sensors in the Mid-IR wavelength range. Utilizing a novel Mid-IR Si-based photonic integrated circuit (PIC) filter and wide-band Mid-IR superluminescent light-emitting diodes (SLEDs), we show the concept of a light source that covers the 2.5…3.5 μm wavelength range with a resolution of <1 nm. The spectral bands are switchable and tunable, and they can be modulated. The source allows for the fabrication of an affordable multi-band gas sensor with good selectivity and sensitivity. The unit price can be lowered in high volumes by utilizing tailored molded IR lens technology and automated packaging and assembly technologies. The status of the development of the key components of the light source is reported. The PIC is based on micron-scale SOI technology, the SLED is based on AlGaInAsSb materials, and the lenses are tailored heavy metal oxide glasses fabricated by hot-embossing. The packaging concept utilizing automated assembly tools is depicted. In safety and security applications, the Mid-IR wavelength range covered by the novel light source allows for detecting several harmful gas components with a single sensor. At the moment, affordable sources are not available. The market impact is expected to be disruptive, since the devices currently on the market are either complicated, expensive and heavy instruments, or the applied measurement principles are inadequate in terms of stability and selectivity.

  17. Short-term microbial release during rain events from on-site sewers and cattle in a surface water source.

    PubMed

    Aström, Johan; Pettersson, Thomas J R; Reischer, Georg H; Hermansson, Malte

    2013-09-01

    The protection of drinking water from pathogens such as Cryptosporidium and Giardia requires an understanding of the short-term microbial release from faecal contamination sources in the catchment. Flow-weighted samples were collected during two rainfall events in a stream draining an area with on-site sewers and during two rainfall events in surface runoff from a bovine cattle pasture. Samples were analysed for human (BacH) and ruminant (BacR) Bacteroidales genetic markers through quantitative polymerase chain reaction (qPCR) and for sorbitol-fermenting bifidobacteria through culturing, as a complement to traditional faecal indicator bacteria, somatic coliphages and the parasitic protozoa Cryptosporidium spp. and Giardia spp., which were analysed by standard methods. Significant positive correlations were observed between BacH, Escherichia coli, intestinal enterococci, sulphite-reducing Clostridia, turbidity, conductivity and UV254 in the stream contaminated by on-site sewers. For the cattle pasture, no correlation was found between any of the genetic markers and the other parameters. Although parasitic protozoa were not detected, the analysis for genetic markers provided baseline data on the short-term faecal contamination due to these potential sources of parasites. Background levels of BacH and BacR markers in soil emphasise the need to include soil reference samples in qPCR-based analyses for Bacteroidales genetic markers.

  18. Genes2WordCloud: a quick way to identify biological themes from gene lists and free text.

    PubMed

    Baroukh, Caroline; Jenkins, Sherry L; Dannenfelser, Ruth; Ma'ayan, Avi

    2011-10-13

    Word-clouds recently emerged on the web as a solution for quickly summarizing text by maximizing the display of most relevant terms about a specific topic in the minimum amount of space. As biologists are faced with the daunting amount of new research data commonly presented in textual formats, word-clouds can be used to summarize and represent biological and/or biomedical content for various applications. Genes2WordCloud is a web application that enables users to quickly identify biological themes from gene lists and research relevant text by constructing and displaying word-clouds. It provides users with several different options and ideas for the sources that can be used to generate a word-cloud. Different options for rendering and coloring the word-clouds give users the flexibility to quickly generate customized word-clouds of their choice. Genes2WordCloud is a word-cloud generator and a word-cloud viewer that is based on WordCram implemented using Java, Processing, AJAX, mySQL, and PHP. Text is fetched from several sources and then processed to extract the most relevant terms with their computed weights based on word frequencies. Genes2WordCloud is freely available for use online; it is open source software and is available for installation on any web-site along with supporting documentation at http://www.maayanlab.net/G2W. Genes2WordCloud provides a useful way to summarize and visualize large amounts of textual biological data or to find biological themes from several different sources. The open source availability of the software enables users to implement customized word-clouds on their own web-sites and desktop applications.

  19. Genes2WordCloud: a quick way to identify biological themes from gene lists and free text

    PubMed Central

    2011-01-01

    Background Word-clouds recently emerged on the web as a solution for quickly summarizing text by maximizing the display of most relevant terms about a specific topic in the minimum amount of space. As biologists are faced with the daunting amount of new research data commonly presented in textual formats, word-clouds can be used to summarize and represent biological and/or biomedical content for various applications. Results Genes2WordCloud is a web application that enables users to quickly identify biological themes from gene lists and research relevant text by constructing and displaying word-clouds. It provides users with several different options and ideas for the sources that can be used to generate a word-cloud. Different options for rendering and coloring the word-clouds give users the flexibility to quickly generate customized word-clouds of their choice. Methods Genes2WordCloud is a word-cloud generator and a word-cloud viewer that is based on WordCram implemented using Java, Processing, AJAX, mySQL, and PHP. Text is fetched from several sources and then processed to extract the most relevant terms with their computed weights based on word frequencies. Genes2WordCloud is freely available for use online; it is open source software and is available for installation on any web-site along with supporting documentation at http://www.maayanlab.net/G2W. Conclusions Genes2WordCloud provides a useful way to summarize and visualize large amounts of textual biological data or to find biological themes from several different sources. The open source availability of the software enables users to implement customized word-clouds on their own web-sites and desktop applications. PMID:21995939
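
    The frequency-to-weight step at the core of any word-cloud generator is compact enough to sketch. The stop-word list and size scaling below are illustrative choices, not WordCram's.

        # Hedged sketch: word-cloud weights (font sizes) from raw text by word
        # frequency. Stop words and the size scaling are illustrative choices.
        import re
        from collections import Counter

        STOP = {"the", "of", "and", "a", "in", "to", "is", "for", "with", "that"}

        def word_weights(text, min_size=10, max_size=60, top=50):
            words = [w for w in re.findall(r"[a-z']+", text.lower())
                     if w not in STOP]
            counts = Counter(words).most_common(top)
            if not counts:
                return {}
            hi, lo = counts[0][1], counts[-1][1]
            span = max(hi - lo, 1)
            return {w: min_size + (c - lo) * (max_size - min_size) // span
                    for w, c in counts}

        print(word_weights("apoptosis pathway gene gene expression pathway pathway"))
        # {'pathway': 60, 'gene': 35, 'apoptosis': 10, 'expression': 10}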

  20. Coarse Grid CFD for underresolved simulation

    NASA Astrophysics Data System (ADS)

    Class, Andreas G.; Viellieber, Mathias O.; Himmel, Steffen R.

    2010-11-01

    CFD simulation of the complete reactor core of a nuclear power plant requires exceedingly large computational resources, so this brute-force approach has not yet been pursued. The traditional approach is 1D subchannel analysis employing calibrated transport models. Coarse grid CFD is an attractive alternative technique based on strongly under-resolved CFD and the inviscid Euler equations. Obviously, using inviscid equations and coarse grids does not resolve all the physics, requiring additional volumetric source terms modelling viscosity and other sub-grid effects. The source terms are implemented via correlations derived from fully resolved representative simulations, which can be tabulated or computed on the fly. The technique is demonstrated for a Carnot diffusor and a wire-wrap fuel assembly [1].

    [1] Himmel, S. R., Ph.D. thesis, Stuttgart University, Germany, 2009, http://bibliothek.fzk.de/zb/berichte/FZKA7468.pdf

  1. Identification of Blood Meal Sources in Aedes vexans and Culex quinquefasciatus in Bernalillo County, New Mexico

    PubMed Central

    Greenberg, Jacob A.; Lujan, Daniel A.; DiMenna, Mark A.; Wearing, Helen J.; Hofkin, Bruce V.

    2013-01-01

    Culex quinquefasciatus Say (Diptera: Culicidae) and Aedes vexans Meigen are two of the most abundant mosquitoes in Bernalillo County, New Mexico, USA. In this study, a polymerase chain reaction-based methodology was used to identify the sources of blood meals taken by these two species. Ae. vexans was found to take a large proportion of its meals from mammals. Although less specific in terms of its blood meal preferences, Cx. quinquefasciatus was found to feed more commonly on birds. The results for Ae. vexans are similar to those reported for this species in other parts of its geographic range. Cx. quinquefasciatus appears to be more variable in its host feeding under different environmental or seasonal circumstances. The implications of these results for arbovirus transmission are discussed. PMID:24224615

  2. From a Content Delivery Portal to a Knowledge Management System for Standardized Cancer Documentation.

    PubMed

    Schlue, Danijela; Mate, Sebastian; Haier, Jörg; Kadioglu, Dennis; Prokosch, Hans-Ulrich; Breil, Bernhard

    2017-01-01

    Heterogeneous tumor documentation and the challenges of interpreting medical terms lead to problems in the analysis of data from clinical and epidemiological cancer registries. The objective of this project was to design, implement and improve a national content delivery portal for oncological terms. Data elements of existing handbooks and documentation sources were analyzed, combined and summarized by medical experts from different comprehensive cancer centers. Informatics experts created a generic data model based on an existing metadata repository. In order to establish a national knowledge management system for standardized cancer documentation, a prototypical tumor wiki was designed and implemented. Requirements engineering techniques were applied to optimize this platform. It is targeted at user groups such as documentation officers, physicians and patients. Linkage to other information sources such as PubMed and MeSH was realized.

  3. High current liquid metal ion source using porous tungsten multiemitters.

    PubMed

    Tajmar, M; Vasiljevich, I; Grienauer, W

    2010-12-01

    We recently developed an indium Liquid-Metal-Ion-Source that can emit currents from sub-μA up to several mA. It is based on a porous tungsten crown structure with 28 individual emitters, which is manufactured using Micro-Powder Injection Molding (μPIM) and electrochemical etching. The emitter combines the advantages of internal capillary feeding with excellent emission properties due to micron-size tips. Significant progress was made on the homogeneity of the emission over its current-voltage characteristic as well as on investigating its long-term stability. This LMIS seems very suitable for space propulsion as well as for micro/nano manufacturing applications with greatly increased milling/drilling speeds. This paper summarizes the latest developments on our porous multiemitters with respect to manufacturing, emission properties and long-term testing. Copyright © 2010 Elsevier B.V. All rights reserved.

  4. Promotional literature: how do we critically appraise?

    PubMed

    Shetty, V V; Karve, A V

    2008-01-01

    There has been a tremendous increase in the number of new and generic drugs coming into the market. The busy practitioner obtains the information from various sources, of which promotional literature forms an important source. The promotional literature provided by the pharmaceutical companies cannot be entirely relied upon; moreover, very few physicians are equipped with the skills of critically appraising it. The new drug should be relevant to the clinician's practice in terms of population studied, the disease and the need for new treatment. The methodology of the study should be carefully judged to determine the authenticity of the evidence. The new drug should be preferred over the existing one if it offers clear advantages in terms of safety, tolerability, efficacy and price. Critical appraisal of promotional literature can provide valuable information to the busy physician to practice evidence-based medicine.

  5. The influence of cross-order terms in interface mobilities for structure-borne sound source characterization

    NASA Astrophysics Data System (ADS)

    Bonhoff, H. A.; Petersson, B. A. T.

    2010-08-01

    For the characterization of structure-borne sound sources with multi-point or continuous interfaces, substantial simplifications and physical insight can be obtained by incorporating the concept of interface mobilities. The applicability of interface mobilities, however, relies upon the admissibility of neglecting the so-called cross-order terms. Hence, the objective of the present paper is to clarify the importance and significance of cross-order terms for the characterization of vibrational sources. From previous studies, four conditions have been identified under which the cross-order terms can become more influential: non-circular interface geometries, structures with distinctly differing transfer paths, suppression of the zero-order motion, and cases where the contact forces are either in phase or out of phase. In a theoretical study, these four conditions are investigated with regard to the frequency range and magnitude of a possible strengthening of the cross-order terms. For an experimental analysis, two source-receiver installations are selected, suitably designed to obtain strong cross-order terms. The transmitted power and the source descriptors are predicted by the approximations of the interface mobility approach and compared with the complete calculations. Neglecting the cross-order terms can result in large misinterpretations at certain frequencies. On average, however, the cross-order terms are found to be insignificant and can be neglected with good approximation. The general applicability of interface mobilities for structure-borne sound source characterization and the description of the transmission process is thereby confirmed.

  6. Evaluation of actuator energy storage and power sources for spacecraft applications

    NASA Technical Reports Server (NTRS)

    Simon, William E.; Young, Fred M.

    1993-01-01

    The objective of this evaluation is to determine an optimum energy storage/power source combination for electrical actuation systems for existing (Solid Rocket Booster (SRB), Shuttle) and future (Advanced Launch System (ALS), Shuttle Derivative) vehicles. Characteristic of these applications is the requirement for high power pulses (50-200 kW) for short times (milliseconds to seconds), coupled with longer-term base or 'housekeeping' requirements (5-16 kW). Specific study parameters (e.g., weight, volume, etc.) as stated in the proposal and specified in the Statement of Work (SOW) are included.

  7. A new DOD and DOA estimation method for MIMO radar

    NASA Astrophysics Data System (ADS)

    Gong, Jian; Lou, Shuntian; Guo, Yiduo

    2018-04-01

    The battlefield electromagnetic environment is becoming more and more complex, and MIMO radar will inevitably be affected by coherent and non-stationary noise. To solve this problem, an angle estimation method based on the oblique projection operator and Toeplitz matrix reconstruction is proposed. Through Toeplitz matrix reconstruction, non-stationary noise is transformed into Gaussian white noise, and then the oblique projection operator is used to separate independent and correlated sources. Finally, simulations are carried out to verify the performance of the proposed algorithm in terms of angle estimation performance and source overload.
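
    The Toeplitz reconstruction step can be sketched directly: each diagonal of the sample covariance matrix is replaced by its average, restoring the Toeplitz structure that a uniform linear array's ideal covariance would have. A minimal, hedged version (assuming a uniform linear array) follows.

        # Hedged sketch: rebuild a Toeplitz covariance estimate by averaging
        # the diagonals of the sample covariance matrix (uniform linear array
        # assumed; not necessarily the exact reconstruction used in the paper).
        import numpy as np

        def toeplitz_average(R):
            n = R.shape[0]
            out = np.empty_like(R)
            for k in range(n):
                d = np.mean(np.diagonal(R, offset=k))  # k-th superdiagonal mean
                idx = np.arange(n - k)
                out[idx, idx + k] = d
                out[idx + k, idx] = np.conj(d)         # Hermitian symmetry
            return out

        # Usage: R_hat = (X @ X.conj().T) / X.shape[1]; R_t = toeplitz_average(R_hat)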

  8. OntoFox: web-based support for ontology reuse

    PubMed Central

    2010-01-01

    Background Ontology development is a rapidly growing area of research, especially in the life sciences domain. To promote collaboration and interoperability between different projects, the OBO Foundry principles require that these ontologies be open and non-redundant, avoiding duplication of terms through the re-use of existing resources. As current options to do so present various difficulties, a new approach, MIREOT, allows specifying import of single terms. Initial implementations allow for controlled import of selected annotations and certain classes of related terms. Findings OntoFox http://ontofox.hegroup.org/ is a web-based system that allows users to input terms, fetch selected properties, annotations, and certain classes of related terms from the source ontologies and save the results using the RDF/XML serialization of the Web Ontology Language (OWL). Compared to an initial implementation of MIREOT, OntoFox allows additional and more easily configurable options for selecting and rewriting annotation properties, and for inclusion of all or a computed subset of terms between low and top level terms. Additional methods for including related classes include a SPARQL-based ontology term retrieval algorithm that extracts terms related to a given set of signature terms and an option to extract the hierarchy rooted at a specified ontology term. OntoFox's output can be directly imported into a developer's ontology. OntoFox currently supports term retrieval from a selection of 15 ontologies accessible via SPARQL endpoints and allows users to extend this by specifying additional endpoints. An OntoFox application in the development of the Vaccine Ontology (VO) is demonstrated. Conclusions OntoFox provides a timely publicly available service, providing different options for users to collect terms from external ontologies, making them available for reuse by import into client OWL ontologies. PMID:20569493
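
    The "hierarchy rooted at a specified ontology term" option corresponds to a transitive subclass query. A minimal sketch with rdflib over a local OWL file follows; the file name and term IRI are placeholders, and this is not OntoFox's internal code.

        # Hedged sketch: extract all subclasses beneath a chosen root term from
        # a local OWL/RDF file, analogous to OntoFox's "hierarchy rooted at a
        # term" option. File name and IRI are placeholders, not OntoFox code.
        from rdflib import Graph, URIRef
        from rdflib.namespace import RDFS

        g = Graph()
        g.parse("ontology.owl", format="xml")   # placeholder local file

        root = URIRef("http://purl.obolibrary.org/obo/VO_0000001")  # placeholder
        subtree = set(g.transitive_subjects(RDFS.subClassOf, root))

        for term in sorted(subtree):
            print(term)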

  9. A first near real-time seismology-based landquake monitoring system.

    PubMed

    Chao, Wei-An; Wu, Yih-Min; Zhao, Li; Chen, Hongey; Chen, Yue-Gau; Chang, Jui-Ming; Lin, Che-Min

    2017-03-02

    Hazards from gravity-driven instabilities on hillslope (termed 'landquake' in this study) are an important problem facing us today. Rapid detection of landquake events is crucial for hazard mitigation and emergency response. Based on the real-time broadband data in Taiwan, we have developed a near real-time landquake monitoring system, which is a fully automatic process based on waveform inversion that yields source information (e.g., location and mechanism) and identifies the landquake source by examining waveform fitness for different types of source mechanisms. This system has been successfully tested offline using seismic records during the passage of the 2009 Typhoon Morakot in Taiwan and has been in online operation during the typhoon season in 2015. In practice, certain levels of station coverage (station gap < 180°), signal-to-noise ratio (SNR ≥ 5.0), and a threshold of event size (volume > 10⁶ m³ and area > 0.20 km²) are required to ensure good performance (fitness > 0.6 for successful source identification) of the system, which can be readily implemented in other places in the world with real-time seismic networks and high landquake activities.

  10. A first near real-time seismology-based landquake monitoring system

    PubMed Central

    Chao, Wei-An; Wu, Yih-Min; Zhao, Li; Chen, Hongey; Chen, Yue-Gau; Chang, Jui-Ming; Lin, Che-Min

    2017-01-01

    Hazards from gravity-driven instabilities on hillslope (termed ‘landquake’ in this study) are an important problem facing us today. Rapid detection of landquake events is crucial for hazard mitigation and emergency response. Based on the real-time broadband data in Taiwan, we have developed a near real-time landquake monitoring system, which is a fully automatic process based on waveform inversion that yields source information (e.g., location and mechanism) and identifies the landquake source by examining waveform fitness for different types of source mechanisms. This system has been successfully tested offline using seismic records during the passage of the 2009 Typhoon Morakot in Taiwan and has been in online operation during the typhoon season in 2015. In practice, certain levels of station coverage (station gap < 180°), signal-to-noise ratio (SNR ≥ 5.0), and a threshold of event size (volume > 10⁶ m³ and area > 0.20 km²) are required to ensure good performance (fitness > 0.6 for successful source identification) of the system, which can be readily implemented in other places in the world with real-time seismic networks and high landquake activities. PMID:28252039

  11. An IR-Based Approach Utilizing Query Expansion for Plagiarism Detection in MEDLINE.

    PubMed

    Nawab, Rao Muhammad Adeel; Stevenson, Mark; Clough, Paul

    2017-01-01

    The identification of duplicated and plagiarized passages of text has become an increasingly active area of research. In this paper, we investigate methods for plagiarism detection that aim to identify potential sources of plagiarism from MEDLINE, particularly when the original text has been modified through the replacement of words or phrases. A scalable approach based on Information Retrieval is used to perform candidate document selection (the identification of a subset of potential source documents given a suspicious text) from MEDLINE. Query expansion is performed using the UMLS Metathesaurus to deal with situations in which original documents are obfuscated. Various approaches to Word Sense Disambiguation are investigated to deal with cases where there are multiple Concept Unique Identifiers (CUIs) for a given term. Results using the proposed IR-based approach outperform a state-of-the-art baseline based on Kullback-Leibler Distance.
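
    As a schematic of the query expansion step, the sketch below expands query terms with synonyms from a toy dictionary. The paper draws synonyms and CUIs from the UMLS Metathesaurus, which is licensed and not reproduced here, so a hand-made table stands in for it.

    ```python
    # Toy synonym table standing in for UMLS Metathesaurus lookups.
    SYNONYMS = {
        "myocardial infarction": ["heart attack", "mi"],
        "neoplasm": ["tumor", "tumour", "cancer"],
    }

    def expand_query(terms):
        """Return the original terms plus any known synonyms."""
        expanded = []
        for term in terms:
            expanded.append(term)
            expanded.extend(SYNONYMS.get(term.lower(), []))
        return expanded

    print(expand_query(["Neoplasm", "screening"]))
    # ['Neoplasm', 'tumor', 'tumour', 'cancer', 'screening']
    ```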

  12. Feasibility of future epidemiological studies on possible health effects of mobile phone base stations.

    PubMed

    Neubauer, Georg; Feychting, Maria; Hamnerius, Yngve; Kheifets, Leeka; Kuster, Niels; Ruiz, Ignacio; Schüz, Joachim; Uberbacher, Richard; Wiart, Joe; Röösli, Martin

    2007-04-01

    The increasing deployment of mobile communication base stations has led to an increasing demand for epidemiological studies on possible health effects of radio frequency emissions. The methodological challenges of such studies have been critically evaluated by a panel of scientists in the fields of radiofrequency engineering/dosimetry and epidemiology. Strengths and weaknesses of previous studies have been identified. Dosimetric concepts and crucial aspects in exposure assessment were evaluated in terms of epidemiological studies on different types of outcomes. We conclude that in principle base station epidemiological studies are feasible. However, the exposure contributions from all relevant radio frequency sources have to be taken into account. The applied exposure assessment method should be piloted and validated. Short- to medium-term effects on physiology or health-related quality of life are best investigated by cohort studies. For long-term effects, groups with a potential for high exposure need to first be identified; for immediate effects, human laboratory studies are the preferred approach.

  13. The Weak Fe Fluorescence Line and Long-Term X-Ray Evolution of the Compton-Thick Active Galactic Nucleus in NGC7674

    NASA Technical Reports Server (NTRS)

    Gandhi, P.; Annuar, A.; Lansbury, G. B.; Stern, D.; Alexander, D. M.; Bauer, F. E.; Bianchi, S.; Boggs, S. E.; Boorman, P. G.; Brandt, W. N.; et al.

    2017-01-01

    We present NuSTAR X-ray observations of the active galactic nucleus (AGN) in NGC 7674. The source shows a flat X-ray spectrum, suggesting that it is obscured by Compton-thick gas columns. Based upon long-term flux dimming, previous work suggested the alternate possibility that the source is a recently switched-off AGN with the observed X-rays being the lagged echo from the torus. Our high-quality data show the source to be reflection-dominated in hard X-rays, but with a relatively weak neutral Fe Kα emission line (equivalent width [EW] of approximately 0.4 keV) and a strong Fe XXVI ionized line (EW approximately 0.2 keV). We construct an updated long-term X-ray light curve of NGC 7674 and find that the observed 2-10 keV flux has remained constant for the past approximately 20 yr, following a high-flux state probed by Ginga. Light travel time arguments constrain the minimum radius of the reflector to be approximately 3.2 pc under the switched-off AGN scenario, approximately 30 times larger than the expected dust sublimation radius, rendering this possibility unlikely. A patchy Compton-thick AGN (CTAGN) solution is plausible, requiring a minimum line-of-sight column density (N_H) of 3 × 10²⁴ cm⁻² at present, and yields an intrinsic 2-10 keV luminosity of (3-5) × 10⁴³ erg s⁻¹. Realistic uncertainties span the range of approximately (1-13) × 10⁴³ erg s⁻¹. The source has one of the weakest fluorescence lines amongst bona fide CTAGN, and is potentially a local analogue of bolometrically luminous systems showing complex neutral and ionized Fe emission. It exemplifies the difficulty of identification and proper characterization of distant CTAGN based on the strength of the neutral Fe Kα line.

  14. The weak Fe fluorescence line and long-term X-ray evolution of the Compton-thick active galactic nucleus in NGC 7674

    NASA Astrophysics Data System (ADS)

    Gandhi, P.; Annuar, A.; Lansbury, G. B.; Stern, D.; Alexander, D. M.; Bauer, F. E.; Bianchi, S.; Boggs, S. E.; Boorman, P. G.; Brandt, W. N.; Brightman, M.; Christensen, F. E.; Comastri, A.; Craig, W. W.; Del Moro, A.; Elvis, M.; Guainazzi, M.; Hailey, C. J.; Harrison, F. A.; Koss, M.; Lamperti, I.; Malaguti, G.; Masini, A.; Matt, G.; Puccetti, S.; Ricci, C.; Rivers, E.; Walton, D. J.; Zhang, W. W.

    2017-06-01

    We present NuSTAR X-ray observations of the active galactic nucleus (AGN) in NGC 7674. The source shows a flat X-ray spectrum, suggesting that it is obscured by Compton-thick gas columns. Based upon long-term flux dimming, previous work suggested the alternate possibility that the source is a recently switched-off AGN with the observed X-rays being the lagged echo from the torus. Our high-quality data show the source to be reflection-dominated in hard X-rays, but with a relatively weak neutral Fe Kα emission line (equivalent width [EW] of ≈ 0.4 keV) and a strong Fe XXVI ionized line (EW ≈ 0.2 keV). We construct an updated long-term X-ray light curve of NGC 7674 and find that the observed 2-10 keV flux has remained constant for the past ≈ 20 yr, following a high-flux state probed by Ginga. Light travel time arguments constrain the minimum radius of the reflector to be ∼ 3.2 pc under the switched-off AGN scenario, ≈ 30 times larger than the expected dust sublimation radius, rendering this possibility unlikely. A patchy Compton-thick AGN (CTAGN) solution is plausible, requiring a minimum line-of-sight column density (N_H) of 3 × 10²⁴ cm⁻² at present, and yields an intrinsic 2-10 keV luminosity of (3-5) × 10⁴³ erg s⁻¹. Realistic uncertainties span the range of ≈ (1-13) × 10⁴³ erg s⁻¹. The source has one of the weakest fluorescence lines amongst bona fide CTAGN, and is potentially a local analogue of bolometrically luminous systems showing complex neutral and ionized Fe emission. It exemplifies the difficulty of identification and proper characterization of distant CTAGN based on the strength of the neutral Fe Kα line.

  15. Chlorine bleaches - A significant long term source of mercury pollution

    NASA Technical Reports Server (NTRS)

    Siegel, S. M.; Eshleman, A.

    1975-01-01

    Products of the industrial electrolysis of brine (NaOCl-based bleaches and NaOH) yielded 17 to 1290 ppb of Hg upon flameless atomic absorption analysis. Compared with the current U.S. rejection value of 5 ppb for potable waters, these levels seem sufficiently high to be a matter of environmental concern.

  16. Evaluation of a Brief Homework Assignment Designed to Reduce Citation Problems

    ERIC Educational Resources Information Center

    Schuetze, Pamela

    2004-01-01

    I evaluated a brief homework assignment designed to reduce citation problems in research-based term papers. Students in 2 developmental psychology classes received a brief presentation and handout defining plagiarism with tips on how to cite sources to avoid plagiarizing. In addition, students in 1 class completed 2 brief homework assignments in…

  17. Educational Reform in the Sunshine State: High Need, Low Funding, and a Disaffected Electorate.

    ERIC Educational Resources Information Center

    Herrington, Carolyn D.; Trimble, Susan

    1997-01-01

    Claims voters and elected officials allow only short-term solutions to financial problems and Florida fails to address need or adequacy. Explains sources of problems by describing state funding formula and discussing structurally inadequate tax base, new political and constitutional barriers to tax expansion, and sectoral rivalry for state…

  18. Openness, Web 2.0 Technology, and Open Science

    ERIC Educational Resources Information Center

    Peters, Michael A.

    2010-01-01

    Open science is a term that is being used in the literature to designate a form of science based on open source models or that utilizes principles of open access, open archiving and open publishing to promote scientific communication. Open science increasingly also refers to open governance and more democratized engagement and control of science…

  19. 13 CFR 120.101 - Credit not available elsewhere.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... available on reasonable terms from non-Federal sources. SBA requires the Lender or CDC to certify or... periods of time. Submission of an application to SBA by a Lender or CDC constitutes certification by the Lender or CDC that it has examined the availability of credit to the applicant, has based its...

  20. 13 CFR 120.101 - Credit not available elsewhere.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... available on reasonable terms from non-Federal sources. SBA requires the Lender or CDC to certify or... periods of time. Submission of an application to SBA by a Lender or CDC constitutes certification by the Lender or CDC that it has examined the availability of credit to the applicant, has based its...

  1. 13 CFR 120.101 - Credit not available elsewhere.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... available on reasonable terms from non-Federal sources. SBA requires the Lender or CDC to certify or... periods of time. Submission of an application to SBA by a Lender or CDC constitutes certification by the Lender or CDC that it has examined the availability of credit to the applicant, has based its...

  2. 13 CFR 120.101 - Credit not available elsewhere.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... available on reasonable terms from non-Federal sources. SBA requires the Lender or CDC to certify or... periods of time. Submission of an application to SBA by a Lender or CDC constitutes certification by the Lender or CDC that it has examined the availability of credit to the applicant, has based its...

  3. 13 CFR 120.101 - Credit not available elsewhere.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... available on reasonable terms from non-Federal sources. SBA requires the Lender or CDC to certify or... periods of time. Submission of an application to SBA by a Lender or CDC constitutes certification by the Lender or CDC that it has examined the availability of credit to the applicant, has based its...

  4. Stable source reconstruction from a finite number of measurements in the multi-frequency inverse source problem

    NASA Astrophysics Data System (ADS)

    Karamehmedović, Mirza; Kirkeby, Adrian; Knudsen, Kim

    2018-06-01

    We consider the multi-frequency inverse source problem for the scalar Helmholtz equation in the plane. The goal is to reconstruct the source term in the equation from measurements of the solution on a surface outside the support of the source. We study the problem in a certain finite dimensional setting: from measurements made at a finite set of frequencies we uniquely determine and reconstruct sources in a subspace spanned by finitely many Fourier–Bessel functions. Further, we obtain a constructive criterion for identifying a minimal set of measurement frequencies sufficient for reconstruction, and under an additional, mild assumption, the reconstruction method is shown to be stable. Our analysis is based on a singular value decomposition of the source-to-measurement forward operators and the distribution of positive zeros of the Bessel functions of the first kind. The reconstruction method is implemented numerically and our theoretical findings are supported by numerical experiments.
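
    For orientation, the model underlying such studies can be written in the following standard form (the paper's precise normalization and geometry may differ):

        \[
            \Delta u(x) + k^2 u(x) = f(x), \qquad x \in \mathbb{R}^2,
        \]

    where the source $f$ is compactly supported, $u$ satisfies the Sommerfeld radiation condition, and the data are the values of $u$ on a measurement curve $\Gamma$ outside $\operatorname{supp} f$, collected for finitely many wavenumbers $k_1, \dots, k_N$.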

  5. Differences in carbon source utilization of Salmonella Oranienburg and Saintpaul isolated from river water.

    PubMed

    Medrano-Félix, Andrés; Estrada-Acosta, Mitzi; Peraza-Garay, Felipe; Castro-Del Campo, Nohelia; Martínez-Urtaza, Jaime; Chaidez, Cristóbal

    2017-08-01

    Long-term exposure to river water by non-indigenous micro-organisms such as Salmonella may affect metabolic adaptation to carbon sources. This study was conducted to determine differences in carbon source utilization of Salmonella Oranienburg and Salmonella Saintpaul (isolated from tropical river water) as well as the control strain Salmonella Typhimurium exposed to laboratory, river water, and host cell (Hep-2 cell line) growth conditions. Results showed that S. Oranienburg and S. Saintpaul had a better ability to utilize carbon sources under the three growth conditions evaluated; S. Oranienburg showed the fastest and highest utilization of different carbon sources, including D-Glucosaminic acid, N-acetyl-D-Glucosamine, Glucose-1-phosphate, and D-Galactonic acid, whereas S. Saintpaul and S. Typhimurium utilized a more limited range of carbon sources. In conclusion, this study suggests that environmental Salmonella strains have better survival and preconditioning abilities in external environments than the control strain, based on their plasticity in the use of diverse carbon sources.

  6. Verification and Improvement of Flamelet Approach for Non-Premixed Flames

    NASA Technical Reports Server (NTRS)

    Zaitsev, S.; Buriko, Yu.; Guskov, O.; Kopchenov, V.; Lubimov, D.; Tshepin, S.; Volkov, D.

    1997-01-01

    Studies in the mathematical modeling of high-speed turbulent combustion have received renewed attention in recent years. A review of the fundamentals and approaches, with an extensive bibliography, was presented by Bray, Libby and Williams. In order to obtain accurate predictions for turbulent combustible flows, the effects of turbulent fluctuations on the chemical source terms should be taken into account. The averaging of chemical source terms requires a probability density function (PDF) model. Two main approaches are currently dominant in high-speed combustion modeling. In the first approach, the form of the PDF is assumed based on the intuition of the modellers (see, for example, Spiegler et al.; Girimaji; Baurle et al.). The second way is much more elaborate and is based on the solution of an evolution equation for the PDF. This approach was proposed by S. Pope for incompressible flames. Recently, it was modified for the modeling of compressible flames in the studies of Farschi; Hsu; Hsu, Raju, Norris; and Eifer, Kollman. But its realization in CFD is extremely expensive computationally due to the high dimensionality of the PDF evolution equation (Baurle, Hsu, Hassan).
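
    Schematically, the averaged chemical source term referred to above takes the form (a standard relation, stated here for orientation):

        \[
            \overline{\dot{\omega}} \;=\; \int \dot{\omega}(\boldsymbol{\psi})\, P(\boldsymbol{\psi})\, \mathrm{d}\boldsymbol{\psi},
        \]

    where $\boldsymbol{\psi}$ is the vector of thermochemical state variables, $\dot{\omega}$ the instantaneous source term, and $P$ the PDF, either assumed in shape (first approach) or obtained from a modeled evolution equation (second approach).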

  7. Remediation and its effect represented on long term monitoring data at a chlorinated ethenes contaminated site, Wonju, Korea

    NASA Astrophysics Data System (ADS)

    Lee, Seong-Sun; Lee, Seung Hyun; Lee, Kang-Kun

    2016-04-01

    Research on the contamination by chlorinated ethenes such as trichloroethylene (TCE) at an industrial complex in Wonju, Korea, was carried out based on 17 rounds of groundwater quality data collected from 2009 to 2015. Remediation technologies such as soil vapor extraction, soil flushing, biostimulation, and pump-and-treat have been applied to eliminate the TCE contaminant sources and to prevent the migration of the TCE plume from remediation target zones to a groundwater discharge area such as a stream. The remediation efficiency of the remedial actions was evaluated by tracing the time-series evolution of the plume and the temporal mass discharge at three transects (Source, Transect-1, Transect-2) assigned along the groundwater flow path. Also, based on long-term monitoring data, the dissolved TCE concentration and the mass of residual TCE in the initial stage of disposal were estimated to evaluate the efficiency of in situ remediation. The results of temporal and spatial monitoring before remedial actions showed that a TCE plume originating from the main and local source zones continued to be discharged to a stream. However, from the end of the intensive remedial actions of 2012-2013, the aqueous concentrations of the TCE plume at and around the main source areas decreased significantly. In particular, during the intensive remediation period, the average mass discharge at the source transect decreased from an early value of 26.58 g/day to an average of 4.99 g/day. The estimated initial dissolved concentration and residual mass of TCE decreased rapidly after the intensive remedial action in 2013 and are expected to continue decreasing from the end of remedial actions until 2020. This study demonstrates that long-term monitoring data are useful in assessing the effectiveness of remedial actions at chlorinated-ethene-contaminated sites. Acknowledgements: This project is supported by the Korea Ministry of Environment under "The GAIA Project (173-092-009)" and "R&D Project on Environmental Management of Geologic CO2 storage" from the KEITI (Project number: 2014001810003).

  8. Weak unique continuation property and a related inverse source problem for time-fractional diffusion-advection equations

    NASA Astrophysics Data System (ADS)

    Jiang, Daijun; Li, Zhiyuan; Liu, Yikan; Yamamoto, Masahiro

    2017-05-01

    In this paper, we first establish a weak unique continuation property for time-fractional diffusion-advection equations. The proof is mainly based on the Laplace transform and the unique continuation properties for elliptic and parabolic equations. The result is weaker than its parabolic counterpart in the sense that we additionally impose the homogeneous boundary condition. As a direct application, we prove the uniqueness for an inverse problem on determining the spatial component in the source term by interior measurements. Numerically, we reformulate our inverse source problem as an optimization problem, and propose an iterative thresholding algorithm. Finally, several numerical experiments are presented to show the accuracy and efficiency of the algorithm.
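
    A common formulation of the forward problem in this setting, stated here for orientation under the usual assumption of a separated source (the paper's exact hypotheses on the boundary conditions and on $g$ may differ), is

        \[
            \partial_t^{\alpha} u - \Delta u + \mathbf{b}\cdot\nabla u = f(x)\, g(t), \qquad 0 < \alpha < 1,
        \]

    where $\partial_t^{\alpha}$ is the Caputo fractional derivative; the inverse source problem is then to recover the spatial component $f$ from interior measurements of $u$, with $g$ known.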

  9. Patent Retrieval in Chemistry based on Semantically Tagged Named Entities

    DTIC Science & Technology

    2009-11-01

    Excerpted snippets from the report describe query construction from informative terms and their corresponding synonyms. An example query for TS-15 is: ("Betaine" OR "Glycine betaine" OR "Glycocol betaine" OR "Glycylbetaine" OR ...) AND ... The synonyms are gathered in an automatic way based on noun-phrase detection incorporating the OpenNLP chunker. A table of informative terms, their synonyms, and the synonym sources includes: Betaine: Glycine betaine, Glycocol betaine, Glycylbetaine, etc. (source: ATC); Peripheral Artery Disease: Peripheral Artery Disorder, Peripheral Arterial Disease, etc. (source: MeSH); Diels-Alder reaction: ...

  10. Nonlinear dispersion-based incoherent photonic processing for microwave pulse generation with full reconfigurability.

    PubMed

    Bolea, Mario; Mora, José; Ortega, Beatriz; Capmany, José

    2012-03-12

    A novel all-optical technique based on the incoherent processing of optical signals using high-order dispersive elements is analyzed for arbitrary microwave pulse generation. We show an approach that allows full reconfigurability of a pulse in terms of chirp, envelope and central frequency through proper control of the second-order dispersion and the power distribution of the incoherent optical source, achieving large values of the time-bandwidth product.

  11. MEqTrees Telescope and Radio-sky Simulations and CPU Benchmarking

    NASA Astrophysics Data System (ADS)

    Shanmugha Sundaram, G. A.

    2009-09-01

    MEqTrees is a Python-based implementation of the classical Measurement Equation, wherein the various 2×2 Jones matrices are parametrized representations, in the spatial and sky domains, for any generic radio telescope. Customized simulations of radio-source sky models and corrupting Jones terms are demonstrated based on a policy framework, with performance estimates derived for array configurations, "dirty"-map residuals and the processing power required for such computations on conventional platforms.
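
    In schematic form, the measurement equation that MeqTrees parametrizes can be written as (a common single-baseline statement; MeqTrees' own parametrization is more general):

        \[
            \mathbf{V}_{pq} \;=\; \sum_{s} \mathbf{J}_{sp}\, \mathbf{B}_{s}\, \mathbf{J}_{sq}^{H},
        \]

    where $\mathbf{V}_{pq}$ is the 2×2 visibility matrix on the baseline formed by antennas $p$ and $q$, $\mathbf{B}_{s}$ the brightness matrix of source $s$, and $\mathbf{J}_{sp}$ the accumulated 2×2 Jones chain for antenna $p$ toward source $s$.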

  12. Effect of animal-source food supplement prior to and during pregnancy on birthweight and prematurity in rural Vietnam: a brief study description.

    PubMed

    Tu, Ngu; King, Janet C; Dirren, Henri; Thu, Hoang Nga; Ngoc, Quyen Phi; Diep, Anh Nguyen Thi

    2014-12-01

    Maternal nutritional status is an important predictor of infant birthweight. Most previous attempts to improve birthweight through multiple micronutrient supplementation have been initiated after women are pregnant. Interventions to improve maternal nutritional status prior to conception may be more effective in preventing low birthweight and improving other infant health outcomes. To compare the effects of maternal supplementation with animal-source food from preconception to term or from mid-gestation to term with routine prenatal care on birthweight, the prevalence of preterm births, intrauterine growth restriction, and infant growth during the first 12 months of life and on maternal nutrient status and the incidence of maternal and infant infections. Young women from 29 rural communes in northwestern Vietnam were recruited when they registered to marry and were randomized to one of three interventions: animal-source food supplement 5 days per week from marriage to term (approximately 13 months), animal-source food supplement 5 days per week from 16 weeks of gestation to term (approximately 5 months), or routine prenatal care without supplemental feeding. Data on infant birthweight and gestational age, maternal and infant anthropometry, micronutrient status, and infections in the infant and mother were collected at various time points. In a preliminary study of women of reproductive age in this area of Vietnam, 40% of the women were underweight (body mass index < 18.5) and anemic. About 50% had infections. Rice was the dietary staple, and nutrient-rich, animal-source foods were rarely consumed by women. Iron, zinc, vitamin A, folate, and vitamin B12 intakes were inadequate in about 40% of the women. The study is still ongoing, and further data are not yet available. The results of this study will provide important data regarding whether improved intake of micronutrient-rich animal-source foods that are locally available and affordable before and during pregnancy improves maternal and infant health and development. This food-based approach may have global implications regarding how and when to initiate sustainable nutritional interventions to improve maternal and infant health.

  13. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru

    PubMed Central

    Rydberg, Henny; Marrone, Gaetano; Strömdahl, Susanne; von Schreeb, Johan

    2015-01-01

    Background: Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and geographic distribution of seismic activity.

    Methods: We used childhood stunting as a proxy for public health effects. Data on stunting were attained from Demographic and Health Surveys. Earthquake data were obtained from U.S. Geological Survey’s ShakeMaps, geographic information system-based maps that divide earthquake affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case: an 8.4 magnitude earthquake that hit southern Peru in 2001.

    Results and conclusions: Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters. PMID:26090999

  14. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru.

    PubMed

    Rydberg, Henny; Marrone, Gaetano; Strömdahl, Susanne; von Schreeb, Johan

    2015-01-01

    Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and geographic distribution of seismic activity. We used childhood stunting as a proxy for public health effects. Data on stunting were attained from Demographic and Health Surveys. Earthquake data were obtained from U.S. Geological Survey's ShakeMaps, geographic information system-based maps that divide earthquake affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case, an 8.4 magnitude earthquake that hit southern Peru in 2001. Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters.

  15. Multisource Estimation of Long-term Global Terrestrial Surface Radiation

    NASA Astrophysics Data System (ADS)

    Peng, L.; Sheffield, J.

    2017-12-01

    Land surface net radiation is the essential energy source at the Earth's surface. It determines the surface energy budget and its partitioning, drives the hydrological cycle by providing available energy, and offers heat, light, and energy for biological processes. Individual components of net radiation have changed historically due to natural and anthropogenic climate change and land use change. Decadal variations in radiation such as global dimming or brightening have important implications for the hydrological and carbon cycles. In order to assess the trends and variability of net radiation and evapotranspiration, there is a need for accurate estimates of long-term terrestrial surface radiation. While large progress has been made in measuring the top-of-atmosphere energy budget, huge discrepancies exist among ground observations, satellite retrievals, and reanalysis fields of surface radiation, due to the lack of observational networks, the difficulty in measuring from space, and the uncertainty in algorithm parameters. To overcome the weaknesses of single-source datasets, we propose a multi-source merging approach to fully utilize and combine multiple datasets of radiation components separately, as they are complementary in space and time. First, we conduct diagnostic analysis of multiple satellite and reanalysis datasets based on in-situ measurements such as the Global Energy Balance Archive (GEBA), existing validation studies, and other information such as network density and consistency with other meteorological variables. Then, we calculate the optimal weighted average of multiple datasets by minimizing the variance of error between in-situ measurements and other observations. Finally, we quantify the uncertainties in the estimates of surface net radiation and employ physical constraints based on the surface energy balance to reduce these uncertainties. The final dataset is evaluated in terms of the long-term variability and its attribution to changes in individual components. The goal of this study is to provide a merged observational benchmark for large-scale diagnostic analyses, remote sensing and land surface modeling.
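
    One standard way to realize the minimum-variance merging described above, assuming unbiased and mutually uncorrelated dataset errors (the authors' scheme may add further constraints), is inverse-variance weighting:

        \[
            \hat{R} \;=\; \sum_i w_i R_i, \qquad w_i = \frac{1/\sigma_i^2}{\sum_j 1/\sigma_j^2},
        \]

    where $R_i$ is the radiation estimate from dataset $i$ and $\sigma_i^2$ its error variance estimated against in-situ measurements.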

  16. Estimation of errors in the inverse modeling of accidental release of atmospheric pollutant: Application to the reconstruction of the cesium-137 and iodine-131 source terms from the Fukushima Daiichi power plant

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Bocquet, Marc; Saunier, Olivier; Mathieu, Anne

    2012-03-01

    A major difficulty when inverting the source term of an atmospheric tracer dispersion problem is the estimation of the prior errors: those of the atmospheric transport model, those ascribed to the representativity of the measurements, those that are instrumental, and those attached to the prior knowledge on the variables one seeks to retrieve. In the case of an accidental release of pollutant, the reconstructed source is sensitive to these assumptions. This sensitivity makes the quality of the retrieval dependent on the methods used to model and estimate the prior errors of the inverse modeling scheme. We propose to use an estimation method for the errors' amplitude based on the maximum likelihood principle. Under semi-Gaussian assumptions, it takes into account, without approximation, the positivity assumption on the source. We apply the method to the estimation of the Fukushima Daiichi source term using activity concentrations in the air. The results are compared to an L-curve estimation technique and to Desroziers's scheme. The total reconstructed activities significantly depend on the chosen method. Because of the poor observability of the Fukushima Daiichi emissions, these methods provide lower bounds for cesium-137 and iodine-131 reconstructed activities. These lower bound estimates, 1.2 × 10¹⁶ Bq for cesium-137, with an estimated standard deviation range of 15%-20%, and 1.9-3.8 × 10¹⁷ Bq for iodine-131, with an estimated standard deviation range of 5%-10%, are of the same order of magnitude as those provided by the Japanese Nuclear and Industrial Safety Agency and about 5 to 10 times less than the Chernobyl atmospheric releases.

  17. Estimation of the caesium-137 source term from the Fukushima Daiichi nuclear power plant using a consistent joint assimilation of air concentration and deposition observations

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne

    2014-01-01

    Inverse modelling techniques can be used to estimate the amount of radionuclides and the temporal profile of the source term released in the atmosphere during the accident of the Fukushima Daiichi nuclear power plant in March 2011. In Winiarek et al. (2012b), the lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques, using activity concentration measurements. The importance of an objective assessment of prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such critical context where the meteorological conditions can make the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, the retrieved source term being very sensitive to this estimation. We propose to extend the use of these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to compute an estimate of the caesium-137 source term jointly using all available data about this radionuclide, such as activity concentrations in the air, but also daily fallout measurements and total cumulated fallout measurements. It is crucial to properly and simultaneously estimate the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition to reliably estimate the a posteriori uncertainty of the estimated source term. Using such techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6-19.3 PBq with an estimated standard deviation range of 15-20% depending on the method and the data sets. The “blind” time intervals of the source term have also been strongly mitigated compared to the first estimations with only activity concentration data.

  18. Relevance analysis and short-term prediction of PM2.5 concentrations in Beijing based on multi-source data

    NASA Astrophysics Data System (ADS)

    Ni, X. Y.; Huang, H.; Du, W. P.

    2017-02-01

    The PM2.5 problem is proving to be a major public crisis and is of great public concern, requiring an urgent response. Information about, and prediction of, PM2.5 from the perspective of atmospheric dynamic theory is still limited due to the complexity of the formation and development of PM2.5. In this paper, we attempted relevance analysis and short-term prediction of PM2.5 concentrations in Beijing, China, using multi-source data mining. A correlation analysis model relating PM2.5 to physical data (meteorological data, including regional average rainfall, daily mean temperature, average relative humidity, average wind speed, and maximum wind speed, and other pollutant concentration data, including CO, NO2, SO2, and PM10) and social media data (microblog data) was proposed, based on the Multivariate Statistical Analysis method. The study found that, among these factors, the average wind speed, the concentrations of CO, NO2, and PM10, and the daily number of microblog entries with the key words 'Beijing; Air pollution' show high correlation with PM2.5 concentrations. The correlation analysis was further studied using a machine learning model, the Back Propagation Neural Network (BPNN), which was found to perform better in correlation mining. Finally, an Autoregressive Integrated Moving Average (ARIMA) time series model was applied to explore the short-term prediction of PM2.5. The predicted results were in good agreement with the observed data. This study helps realize real-time monitoring, analysis and pre-warning of PM2.5, and it also broadens the application of big data and multi-source data mining methods.
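
    As a minimal sketch of the ARIMA forecasting step, the snippet below fits an ARIMA model to a synthetic daily PM2.5-like series and produces a one-week forecast; the order (2, 1, 2), the synthetic data, and all preprocessing choices are illustrative and not taken from the paper.

    ```python
    # pip install statsmodels pandas numpy
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    # Synthetic daily PM2.5-like series (for illustration only)
    rng = np.random.default_rng(42)
    days = pd.date_range("2016-01-01", periods=365, freq="D")
    pm25 = pd.Series(80 + 30 * np.sin(np.arange(365) / 14.0)
                     + rng.normal(0, 10, size=365), index=days)

    model = ARIMA(pm25, order=(2, 1, 2))  # (p, d, q) chosen arbitrarily here
    result = model.fit()
    print(result.forecast(steps=7))       # concentrations for the next 7 days
    ```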

  19. Effect of internal and external conditions on ionization processes in the FAPA ambient desorption/ionization source.

    PubMed

    Orejas, Jaime; Pfeuffer, Kevin P; Ray, Steven J; Pisonero, Jorge; Sanz-Medel, Alfredo; Hieftje, Gary M

    2014-11-01

    Ambient desorption/ionization (ADI) sources coupled to mass spectrometry (MS) offer outstanding analytical features: direct analysis of real samples without sample pretreatment, combined with the selectivity and sensitivity of MS. Since ADI sources typically work in the open atmosphere, ambient conditions can affect the desorption and ionization processes. Here, the effects of internal source parameters and ambient humidity on the ionization processes of the flowing atmospheric pressure afterglow (FAPA) source are investigated. The interaction of reagent ions with a range of analytes is studied in terms of sensitivity and based upon the processes that occur in the ionization reactions. The results show that internal parameters which lead to higher gas temperatures afforded higher sensitivities, although fragmentation is also affected. In the case of humidity, only extremely dry conditions led to higher sensitivities, while fragmentation remained unaffected.

  20. Comparison of Physics Frameworks for WebGL-Based Game Engine

    NASA Astrophysics Data System (ADS)

    Yogya, Resa; Kosala, Raymond

    2014-03-01

    Recently, a new technology called WebGL has shown a lot of potential for developing games. However, since this technology is still new, much of its potential in the game development area remains unexplored. This paper tries to uncover the potential of integrating physics frameworks with WebGL technology in a game engine for developing 2D or 3D games. Specifically, we integrated three open source physics frameworks: Bullet, Cannon, and JigLib into a WebGL-based game engine. Through experiments, we assessed these frameworks in terms of their correctness or accuracy, performance, completeness and compatibility. The results show that it is possible to integrate open source physics frameworks into a WebGL-based game engine, and Bullet is the best physics framework to be integrated into the WebGL-based game engine.

  1. pyJac: Analytical Jacobian generator for chemical kinetics

    NASA Astrophysics Data System (ADS)

    Niemeyer, Kyle E.; Curtis, Nicholas J.; Sung, Chih-Jen

    2017-06-01

    Accurate simulations of combustion phenomena require the use of detailed chemical kinetics in order to capture limit phenomena such as ignition and extinction as well as predict pollutant formation. However, the chemical kinetic models for hydrocarbon fuels of practical interest typically have large numbers of species and reactions and exhibit high levels of mathematical stiffness in the governing differential equations, particularly for larger fuel molecules. In order to integrate the stiff equations governing chemical kinetics, reactive-flow simulations generally rely on implicit algorithms that require frequent Jacobian matrix evaluations. Some in situ and a posteriori computational diagnostics methods also require accurate Jacobian matrices, including computational singular perturbation and chemical explosive mode analysis. Typically, finite differences numerically approximate these, but for larger chemical kinetic models this poses significant computational demands since the number of chemical source term evaluations scales with the square of species count. Furthermore, existing analytical Jacobian tools do not optimize evaluations or support emerging SIMD processors such as GPUs. Here we introduce pyJac, a Python-based open-source program that generates analytical Jacobian matrices for use in chemical kinetics modeling and analysis. In addition to producing the necessary customized source code for evaluating reaction rates (including all modern reaction rate formulations), the chemical source terms, and the Jacobian matrix, pyJac uses an optimized evaluation order to minimize computational and memory operations. As a demonstration, we first establish the correctness of the Jacobian matrices for kinetic models of hydrogen, methane, ethylene, and isopentanol oxidation (number of species ranging from 13 to 360) by showing agreement within 0.001% of matrices obtained via automatic differentiation. We then demonstrate the performance achievable on CPUs and GPUs using pyJac via matrix evaluation timing comparisons; the routines produced by pyJac outperformed first-order finite differences by 3-7.5 times and the existing analytical Jacobian software TChem by 1.1-2.2 times on a single-threaded basis. It is noted that TChem is not thread-safe, while pyJac is easily parallelized, and hence can greatly outperform TChem on multicore CPUs. The Jacobian matrix generator we describe here will be useful for reducing the cost of integrating chemical source terms with implicit algorithms in particular and algorithms that require an accurate Jacobian matrix in general. Furthermore, the open-source release of the program and Python-based implementation will enable wide adoption.
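
    To make the quadratic-cost argument concrete, the sketch below shows the generic first-order finite-difference Jacobian that analytical generation avoids; it is a plain numpy illustration with a hypothetical two-species source term, not pyJac's generated code or API.

    ```python
    import numpy as np

    def fd_jacobian(f, y, eps=1e-8):
        """First-order finite-difference approximation of J[i, j] = df_i/dy_j.
        One extra full source-term evaluation per column: combined with the
        cost of evaluating f itself, work grows roughly with the square of
        the species count -- the scaling analytical Jacobians avoid."""
        f0 = f(y)
        J = np.empty((y.size, y.size))
        for j in range(y.size):
            yp = y.copy()
            h = eps * max(abs(y[j]), 1.0)
            yp[j] += h
            J[:, j] = (f(yp) - f0) / h
        return J

    def source(y):
        """Hypothetical two-species source term with made-up rate constants."""
        k1, k2 = 1.0, 0.5
        return np.array([-k1 * y[0] + k2 * y[1] ** 2,
                          k1 * y[0] - k2 * y[1] ** 2])

    print(fd_jacobian(source, np.array([1.0, 0.2])))
    ```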

  2. Assessing the risk of second malignancies after modern radiotherapy

    PubMed Central

    Newhauser, Wayne D.; Durante, Marco

    2014-01-01

    Recent advances in radiotherapy have enabled the use of different types of particles, such as protons and heavy ions, as well as refinements to the treatment of tumours with standard sources (photons). However, the risk of second cancers arising in long-term survivors continues to be a problem. The long-term risks from treatments such as particle therapy have not yet been determined and are unlikely to become apparent for many years. Therefore, there is a need to develop risk assessments based on our current knowledge of radiation-induced carcinogenesis. PMID:21593785

  3. The 'CommTech' Methodology: A Demand-Driven Approach to Efficient, Productive and Measurable Technology Transfer

    NASA Technical Reports Server (NTRS)

    Horsham, Gray A. P.

    1998-01-01

    Market research sources were used to initially gather primary technological problems and needs data from non-aerospace companies in targeted industry sectors. The company-supplied information served as input data to activate, or start up, an internal phased match-making process. This process was based on technical-level relationship exploration followed by business-level agreement negotiations, and culminated with project management and execution. Space Act Agreements represented near-term outputs. Company product or process commercialization derived from Lewis support, and measurable economic effects, represented far-term outputs.

  4. Stochastic Estimation and Non-Linear Wall-Pressure Sources in a Separating/Reattaching Flow

    NASA Technical Reports Server (NTRS)

    Naguib, A.; Hudy, L.; Humphreys, W. M., Jr.

    2002-01-01

    Simultaneous wall-pressure and PIV measurements are used to study the conditional flow field associated with surface-pressure generation in a separating/reattaching flow established over a fence-with-splitter-plate geometry. The conditional flow field is captured using linear and quadratic stochastic estimation based on the occurrence of positive and negative pressure events in the vicinity of the mean reattachment location. The results shed light on the dominant flow structures associated with significant wall-pressure generation. Furthermore, analysis based on the individual terms in the stochastic estimation expansion shows that both the linear and non-linear flow sources of the coherent (conditional) velocity field are equally important contributors to the generation of the conditional surface pressure.
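
    For orientation, the single-point form of the estimator combining the linear and quadratic terms can be written as (the study's estimator, conditioned on pressure events, may differ in detail):

        \[
            \langle u_i \mid p \rangle \;\approx\; A_i\, p + B_i\, p^2,
        \]

    where $u_i$ is a velocity component, $p$ the wall pressure, and the coefficients $A_i$ and $B_i$ are obtained by least squares from measured pressure-velocity correlations.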

  5. Long Term 2 Second Round Source Water Monitoring and Bin Placement Memo

    EPA Pesticide Factsheets

    The Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) applies to all public water systems served by a surface water source or public water systems served by a ground water source under the direct influence of surface water.

  6. Survey of ion plating sources

    NASA Technical Reports Server (NTRS)

    Spalvins, T.

    1979-01-01

    Ion plating is a plasma deposition technique in which ions of the gas and the evaporant have a decisive role in the formation of a coating in terms of adherence, coherence, and morphological growth. The range of materials that can be ion plated is predominantly determined by the selection of the evaporation source. Based on the type of evaporation source, gaseous media and mode of transport, the following will be discussed: resistance, electron beam, sputtering, reactive and ion beam evaporation. Ionization efficiencies and ion energies in the glow discharge determine the percentage of atoms which are ionized under typical ion plating conditions. The plating flux consists of a small number of energetic ions and a large number of energetic neutrals. The energy distribution ranges from thermal energies up to the maximum energy of the discharge. The various reaction mechanisms which contribute to the exceptionally strong adherence (the formation of a graded substrate/coating interface) are not fully understood; however, the controlling factors are evaluated. The influence of process variables on the nucleation and growth characteristics is illustrated in terms of morphological changes which affect the mechanical and tribological properties of the coating.

  7. Representing Thoughts, Words, and Things in the UMLS

    PubMed Central

    Campbell, Keith E.; Oliver, Diane E.; Spackman, Kent A.; Shortliffe, Edward H.

    1998-01-01

    The authors describe a framework, based on the Ogden-Richards semiotic triangle, for understanding the relationship between the Unified Medical Language System (UMLS) and the source terminologies from which the UMLS derives its content. They pay particular attention to UMLS's Concept Unique Identifier (CUI) and the sense of “meaning” it represents as contrasted with the sense of “meaning” represented by the source terminologies. The CUI takes on emergent meaning through linkage to terms in different terminology systems. In some cases, a CUI's emergent meaning can differ significantly from the original sources' intended meanings of terms linked by that CUI. Identification of these different senses of meaning within the UMLS is consistent with historical themes of semantic interpretation of language. Examination of the UMLS within such a historical framework makes it possible to better understand the strengths and limitations of the UMLS approach for integrating disparate terminologic systems and to provide a model, or theoretic foundation, for evaluating the UMLS as a Possible World—that is, as a mathematical formalism that represents propositions about some perspective or interpretation of the physical world. PMID:9760390

  8. Taste and odor occurrence in Lake William C. Bowen and Municipal Reservoir #1, Spartanburg County, South Carolina

    USGS Publications Warehouse

    Journey, Celeste; Arrington, Jane M.

    2009-01-01

    The U.S. Geological Survey and Spartanburg Water are working cooperatively on an ongoing study of Lake Bowen and Reservoir #1 to identify environmental factors that enhance or influence the production of geosmin in the source-water reservoirs. Spartanburg Water is using information from this study to develop management strategies to reduce (short-term solution) and prevent (long-term solution) geosmin occurrence. Spartanburg Water utility treats and distributes drinking water to the Spartanburg area of South Carolina. The drinking water sources for the area are Lake William C. Bowen (Lake Bowen) and Municipal Reservoir #1 (Reservoir #1), located north of Spartanburg. These reservoirs, which were formed by the impoundment of the South Pacolet River, were assessed in 2006 by the South Carolina Department of Health and Environmental Control (SCDHEC) as being fully supportive of all uses based on established criteria. Nonetheless, Spartanburg Water had noted periodic taste and odor problems due to the presence of geosmin, a naturally occurring compound in the source water. Geosmin is not harmful, but its presence in drinking water is aesthetically unpleasant.

  9. Microwave implementation of two-source energy balance approach for estimating evapotranspiration

    NASA Astrophysics Data System (ADS)

    Holmes, Thomas R. H.; Hain, Christopher R.; Crow, Wade T.; Anderson, Martha C.; Kustas, William P.

    2018-02-01

    A newly developed microwave (MW) land surface temperature (LST) product is used to substitute thermal infrared (TIR)-based LST in the Atmosphere-Land Exchange Inverse (ALEXI) modeling framework for estimating evapotranspiration (ET) from space. ALEXI implements a two-source energy balance (TSEB) land surface scheme in a time-differential approach, designed to minimize sensitivity to absolute biases in input records of LST through the analysis of the rate of temperature change in the morning. Thermal infrared retrievals of the diurnal LST curve, traditionally from geostationary platforms, are hindered by cloud cover, reducing model coverage on any given day. This study tests the utility of diurnal temperature information retrieved from a constellation of satellites with microwave radiometers that together provide six to eight observations of Ka-band brightness temperature per location per day. This represents the first ever attempt at a global implementation of ALEXI with MW-based LST and is intended as the first step towards providing all-weather capability to the ALEXI framework. The analysis is based on 9-year-long, global records of ALEXI ET generated using both MW- and TIR-based diurnal LST information as input. In this study, the MW-LST (MW-based LST) sampling is restricted to the same clear-sky days as in the IR-based implementation to be able to analyze the impact of changing the LST dataset separately from the impact of sampling all-sky conditions. The results show that long-term bulk ET estimates from both LST sources agree well, with a spatial correlation of 92 % for total ET in the Europe-Africa domain and agreement in seasonal (3-month) totals of 83-97 % depending on the time of year. Most importantly, the ALEXI-MW (MW-based ALEXI) also matches ALEXI-IR (IR-based ALEXI) very closely in terms of 3-month inter-annual anomalies, demonstrating its ability to capture the development and extent of drought conditions. Weekly ET output from the two parallel ALEXI implementations is further compared to a common ground measured reference provided by the Fluxnet consortium. Overall, the two model implementations generate similar performance metrics (correlation and RMSE) for all but the most challenging sites in terms of spatial heterogeneity and level of aridity. It is concluded that a constellation of MW satellites can effectively be used to provide LST for estimating ET through ALEXI, which is an important step towards all-sky satellite-based retrieval of ET using an energy balance framework.

  10. Replacing effective spectral radiance by temperature in occupational exposure limits to protect against retinal thermal injury from light and near IR radiation.

    PubMed

    Madjidi, Faramarz; Behroozy, Ali

    2014-01-01

    Exposure to visible light and near infrared (NIR) radiation in the wavelength region of 380 to 1400 nm may cause thermal retinal injury. In this analysis, the effective spectral radiance of a hot source is replaced by its temperature in the exposure limit values in the region of 380-1400 nm. This article describes the development and implementation of a computer code to predict those temperatures, corresponding to the exposure limits proposed by the American Conference of Governmental Industrial Hygienists (ACGIH). Viewing duration and apparent diameter of the source were inputs for the computer code. At the first stage, an infinite series was created for calculation of spectral radiance by integration with Planck's law. At the second stage for calculation of effective spectral radiance, the initial terms of this infinite series were selected and integration was performed by multiplying these terms by a weighting factor R(λ) in the wavelength region 380-1400 nm. At the third stage, using a computer code, the source temperature that can emit the same effective spectral radiance was found. As a result, based only on measuring the source temperature and accounting for the exposure time and the apparent diameter of the source, it is possible to decide whether the exposure to visible and NIR in any 8-hr workday is permissible. The substitution of source temperature for effective spectral radiance provides a convenient way to evaluate exposure to visible light and NIR.
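
    A minimal numerical sketch of the described integration is given below: Planck spectral radiance is integrated over 380-1400 nm after multiplication by the weighting factor R(λ). Because the tabulated R(λ) values are not reproduced in the abstract, a flat placeholder weighting is used; the constants and the quadrature are generic, not the authors' code.

    ```python
    import numpy as np

    H = 6.626e-34    # Planck constant, J s
    C = 2.998e8      # speed of light, m/s
    KB = 1.381e-23   # Boltzmann constant, J/K

    def planck_radiance(lam, T):
        """Planck's law: spectral radiance in W / (m^2 sr m)."""
        return (2.0 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * T))

    def effective_radiance(T, R=lambda lam: 1.0, n=2000):
        """Integrate R(lam) * L(lam, T) over 380-1400 nm; R defaults to a
        flat placeholder standing in for the retinal-hazard weighting."""
        lam = np.linspace(380e-9, 1400e-9, n)
        return np.trapz(R(lam) * planck_radiance(lam, T), lam)

    # Example: effective radiance of a 3000 K source, W / (m^2 sr)
    print(f"{effective_radiance(3000.0):.3e}")
    ```

    Inverting this relation numerically (for example by bisection on T) would then yield the source temperature corresponding to a given exposure limit, which is the substitution the article describes.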

  11. Integrating new Storage Technologies into EOS

    NASA Astrophysics Data System (ADS)

    Peters, Andreas J.; van der Ster, Dan C.; Rocha, Joaquim; Lensing, Paul

    2015-12-01

    The EOS[1] storage software was designed to cover CERN disk-only storage use cases in the medium term, trading scalability against latency. To cover and prepare for long-term requirements, the CERN IT data and storage services group (DSS) is actively conducting R&D and making open source contributions to experiment with a next-generation storage software based on CEPH[3] and ethernet-enabled disk drives. CEPH provides a scale-out object storage system, RADOS, and additionally various optional high-level services such as an S3 gateway, RADOS block devices, and a POSIX-compliant file system, CephFS. The acquisition of CEPH by Red Hat underlines the promising role of CEPH as the open source storage platform of the future. CERN IT is running a CEPH service in the context of OpenStack on a moderate scale of 1 PB replicated storage. Building a 100+ PB storage system based on CEPH will require software and hardware tuning. It is of capital importance to demonstrate the feasibility and possibly iron out bottlenecks and blocking issues beforehand. The main idea behind this R&D is to leverage and contribute to existing building blocks in the CEPH storage stack and implement a few CERN-specific requirements in a thin, customisable storage layer. A second research topic is the integration of ethernet-enabled disks. This paper introduces various ongoing open source developments, their status and applicability.

  12. Time-based comparative transcriptomics in engineered xylose-utilizing Saccharomyces cerevisiae identifies temperature-responsive genes during ethanol production.

    PubMed

    Ismail, Ku Syahidah Ku; Sakamoto, Takatoshi; Hasunuma, Tomohisa; Kondo, Akihiko

    2013-09-01

    Agricultural residues comprising lignocellulosic materials are excellent sources of pentose sugar, which can be converted to ethanol as fuel. Ethanol production via consolidated bioprocessing requires a suitable microorganism to withstand the harsh fermentation environment of high temperature, high ethanol concentration, and exposure to inhibitors. We genetically enhanced an industrial Saccharomyces cerevisiae strain, sun049, enabling it to take up xylose as the sole carbon source at high fermentation temperature. This strain was able to produce 13.9 g/l ethanol from 50 g/l xylose at 38 °C. To better understand the xylose consumption ability during long-term, high-temperature conditions, we compared by transcriptomics two fermentation conditions: high temperature (38 °C) and control temperature (30 °C) during the first 12 h of fermentation. This is the first long-term, time-based transcriptomics approach, and it allowed us to discover the role of heat-responsive genes when xylose is the sole carbon source. The results suggest that genes related to amino acid, cell wall, and ribosomal protein synthesis are down-regulated under heat stress. To allow cell stability and continuous xylose uptake in order to produce ethanol, hexose transporter HXT5, heat shock proteins, ubiquitin proteins, and proteolysis were all induced at high temperature. We also speculate that the strong relationship between high temperature and increased xylitol accumulation represents the cell's mechanism to protect itself from heat degradation.

  13. LS-APC v1.0: a tuning-free method for the linear inverse problem and its application to source-term determination

    NASA Astrophysics Data System (ADS)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek; Stohl, Andreas

    2016-11-01

    Estimation of pollutant releases into the atmosphere is an important problem in the environmental sciences. It is typically formalized as an inverse problem using a linear model that explains observable quantities (e.g., concentrations or deposition values) as the product of a source-receptor sensitivity (SRS) matrix, obtained from an atmospheric transport model, and the unknown source-term vector. Since this problem is typically ill-posed, current state-of-the-art methods are based on regularizing the problem and solving the resulting optimization problem. This procedure depends on manual settings of uncertainties that are often very poorly quantified, effectively making them tuning parameters. We formulate a probabilistic model that has the same maximum likelihood solution as the conventional method with pre-specified uncertainties. Replacing the maximum likelihood solution with full Bayesian estimation then allows all tuning parameters to be estimated from the measurements. The estimation procedure is based on the variational Bayes approximation, which is evaluated by an iterative algorithm. The resulting method is thus very similar to the conventional approach, but with the ability to estimate all tuning parameters from the observations. The proposed algorithm is tested and compared with the standard methods on data from the European Tracer Experiment (ETEX), where the advantages of the new method are demonstrated. A MATLAB implementation of the proposed algorithm is available for download.
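
    For context, the conventional regularized solution that LS-APC replaces can be written in a few lines. In this minimal sketch (Python, illustrative variable names), the scalar alpha is exactly the kind of manually chosen tuning parameter that the variational Bayes formulation instead estimates from the data.

        import numpy as np

        def tikhonov_source_term(M, y, alpha):
            # Conventional approach: x = argmin ||M x - y||^2 + alpha * ||x||^2,
            # where M is the source-receptor sensitivity matrix, y the observations,
            # and alpha a pre-specified (tuning) regularization weight.
            n = M.shape[1]
            return np.linalg.solve(M.T @ M + alpha * np.eye(n), M.T @ y)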

  14. Interlaboratory study of the ion source memory effect in 36Cl accelerator mass spectrometry

    NASA Astrophysics Data System (ADS)

    Pavetich, Stefan; Akhmadaliev, Shavkat; Arnold, Maurice; Aumaître, Georges; Bourlès, Didier; Buchriegler, Josef; Golser, Robin; Keddadouche, Karim; Martschini, Martin; Merchel, Silke; Rugel, Georg; Steier, Peter

    2014-06-01

    Understanding and minimizing contamination in the ion source due to cross-contamination and the long-term memory effect is one of the key issues for accurate accelerator mass spectrometry (AMS) measurements of volatile elements. The focus of this work is the investigation of the long-term memory effect for the volatile element chlorine and the minimization of this effect in the ion source of the Dresden accelerator mass spectrometry facility (DREAMS). For this purpose, one of the two original HVE ion sources at the DREAMS facility was modified, allowing the use of larger sample holders having individual target apertures. Additionally, a more open geometry was used to improve the vacuum level. To evaluate this improvement in comparison to other up-to-date ion sources, an interlaboratory comparison was initiated. The long-term memory effect of the four Cs sputter ion sources at DREAMS (two sources: original and modified), ASTER (Accélérateur pour les Sciences de la Terre, Environnement, Risques) and VERA (Vienna Environmental Research Accelerator) was investigated by measuring samples with a natural 35Cl/37Cl ratio and samples highly enriched in 35Cl (35Cl/37Cl ∼ 999). Besides investigating and comparing the individual levels of long-term memory, recovery time constants could be calculated. The tests show that all four sources suffer from long-term memory, but the modified DREAMS ion source showed the lowest level of contamination. The recovery times of the four ion sources were widely spread, between 61 and 1390 s; the modified DREAMS ion source, with values between 156 and 262 s, showed the fastest recovery in 80% of the measurements.

  15. Parameterized source term in the diffusion approximation for enhanced near-field modeling of collimated light

    NASA Astrophysics Data System (ADS)

    Jia, Mengyu; Wang, Shuang; Chen, Xueying; Gao, Feng; Zhao, Huijuan

    2016-03-01

    Most analytical methods for describing light propagation in turbid media exhibit low effectiveness in the near field of a collimated source. Motivated by the Charge Simulation Method in electromagnetic theory as well as established discrete-source-based modeling, we have reported an improved explicit model, referred to as the "Virtual Source" (VS) diffusion approximation (DA), which inherits the mathematical simplicity of the DA while considerably extending its validity for modeling near-field photon migration in low-albedo media. In this model, the collimated light of the standard DA is approximated as multiple isotropic point sources (VS) distributed along the incident direction. To enhance performance, a fitting procedure between the calculated and measured reflectances is adopted in the near field to optimize the VS parameters (intensities and locations). To make the model practically applicable, an explicit 2VS-DA model is established based on closed-form derivations of the VS parameters for typical ranges of the optical parameters. The proposed VS-DA model is validated by comparison with Monte Carlo simulations, and is further applied to image reconstruction in a Laminar Optical Tomography system.

  16. Semi-implicit and fully implicit shock-capturing methods for hyperbolic conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Shinn, J. L.

    1986-01-01

    Some numerical aspects of finite-difference algorithms for nonlinear multidimensional hyperbolic conservation laws with stiff nonhomogeneous (source) terms are discussed. If the stiffness is entirely dominated by the source term, a semi-implicit shock-capturing method is proposed, provided that the Jacobian of the source terms possesses certain properties. The proposed semi-implicit method can be viewed as a variant of the Bussing and Murman point-implicit scheme with a numerical dissipation more appropriate for the computation of strong shock waves. However, if the stiffness is not solely dominated by the source terms, a fully implicit method is a better choice. The situation is more complicated for problems in more than one dimension, and the presence of stiff source terms further complicates the solution procedures for alternating direction implicit (ADI) methods. Several alternatives are discussed. The primary motivation for constructing these schemes was to address thermally and chemically nonequilibrium flows in the hypersonic regime. Due to the unique structure of the eigenvalues and eigenvectors for fluid flows of this type, the computation can be simplified, providing a more efficient solution procedure than one might have anticipated.
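
    A minimal sketch of the point-implicit idea for a scalar conservation law with a stiff source, in the spirit described above (an illustration, not the authors' scheme): the convective residual is evaluated explicitly, while the source is linearized about the current state so that only a local, per-cell solve is required.

        import numpy as np

        def point_implicit_step(u, dt, explicit_residual, source, dsource_du):
            # Advance u_t = R(u) + s(u): R(u) is treated explicitly, s(u) implicitly
            # via linearization, giving (1 - dt * ds/du) * du = dt * (R(u) + s(u)).
            rhs = dt * (explicit_residual(u) + source(u))
            # Elementwise division assumes a diagonal source Jacobian; for systems
            # this becomes a small matrix solve in each cell.
            return u + rhs / (1.0 - dt * dsource_du(u))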

  17. Common Calibration Source for Monitoring Long-term Ozone Trends

    NASA Technical Reports Server (NTRS)

    Kowalewski, Matthew

    2004-01-01

    Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and the limited lifetimes of satellite monitoring instruments demand that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Mapping Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.

  18. A geodata warehouse: Using denormalisation techniques as a tool for delivering spatially enabled integrated geological information to geologists

    NASA Astrophysics Data System (ADS)

    Kingdon, Andrew; Nayembil, Martin L.; Richardson, Anne E.; Smith, A. Graham

    2016-11-01

    New requirements to understand geological properties in three dimensions have led to the development of PropBase, a data structure and delivery tools designed to meet them. At the BGS, relational database management systems (RDBMS) have facilitated effective data management using normalised, subject-based database designs with business rules in a centralised, vocabulary-controlled architecture. These have delivered effective data storage in a secure environment. However, isolated subject-oriented designs prevented efficient cross-domain querying of datasets, and the tools provided often did not enable effective data discovery, as they struggled to resolve the complex underlying normalised structures and delivered poor data access speeds. Users developed bespoke access tools to structures they did not fully understand, sometimes obtaining incorrect results. BGS therefore developed PropBase, a generic denormalised data structure within an RDBMS for storing property data, to facilitate rapid, standardised data discovery and access, incorporating 2D and 3D physical and chemical property data with associated metadata. This includes scripts to populate and synchronise the layer with its data sources through structured input and transcription standards. A core component of the architecture is an optimised query object that delivers geoscience information from a structure equivalent to a data warehouse, enabling optimised query performance and delivery of data in multiple standardised formats through a web discovery tool. Semantic interoperability is enforced through vocabularies combined from all data sources, facilitating searches of related terms. PropBase holds 28.1 million spatially enabled property data points from 10 source databases, incorporating over 50 property data types, with a vocabulary set that includes 557 property terms. By enabling property data searches across multiple databases, PropBase has facilitated new scientific research previously considered impractical. PropBase is easily extended to incorporate 4D (time series) data and provides a baseline for new "big data" monitoring projects.

  19. Spatial patterning in PM2.5 constituents under an inversion-focused sampling design across an urban area of complex terrain

    PubMed Central

    Tunno, Brett J; Dalton, Rebecca; Michanowicz, Drew R; Shmool, Jessie L C; Kinnee, Ellen; Tripathy, Sheila; Cambal, Leah; Clougherty, Jane E

    2016-01-01

    Health effects of fine particulate matter (PM2.5) vary by chemical composition, and composition can help to identify key PM2.5 sources across urban areas. Further, this intra-urban spatial variation in concentrations and composition may vary with meteorological conditions (e.g., mixing height). Accordingly, we hypothesized that spatial sampling during atmospheric inversions would help to better identify localized source effects, and reveal more distinct spatial patterns in key constituents. We designed a 2-year monitoring campaign to capture fine-scale intra-urban variability in PM2.5 composition across Pittsburgh, PA, and compared both spatial patterns and source effects during “frequent inversion” hours vs 24-h weeklong averages. Using spatially distributed programmable monitors, and a geographic information systems (GIS)-based design, we collected PM2.5 samples across 37 sampling locations per year to capture variation in local pollution sources (e.g., proximity to industry, traffic density) and terrain (e.g., elevation). We used inductively coupled plasma mass spectrometry (ICP-MS) to determine elemental composition, and unconstrained factor analysis to identify source suites by sampling scheme and season. We examined spatial patterning in source factors using land use regression (LUR), wherein GIS-based source indicators served to corroborate factor interpretations. Under both summer sampling regimes, and for winter inversion-focused sampling, we identified six source factors, characterized by tracers associated with brake and tire wear, steel-making, soil and road dust, coal, diesel exhaust, and vehicular emissions. For winter 24-h samples, four factors suggested traffic/fuel oil, traffic emissions, coal/industry, and steel-making sources. In LURs, as hypothesized, GIS-based source terms better explained spatial variability in inversion-focused samples, including a greater contribution from roadway, steel, and coal-related sources. Factor analysis produced source-related constituent suites under both sampling designs, though factors were more distinct under inversion-focused sampling. PMID:26507005
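
    The unconstrained factor-analysis step described above can be sketched generically in a few lines; the toy data, the log-transform, and the varimax rotation below are assumptions for illustration, not the authors' exact workflow.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        conc = rng.lognormal(size=(74, 25))        # toy matrix: samples x elements

        X = np.log(conc)                           # assumed transform for skewed concentrations
        X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each element

        fa = FactorAnalysis(n_components=6, rotation="varimax").fit(X)
        loadings = fa.components_                  # factors x elements: high loadings on
                                                   # co-varying tracers suggest source suites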

  20. Deterministic Impulsive Vacuum Foundations for Quantum-Mechanical Wavefunctions

    NASA Astrophysics Data System (ADS)

    Valentine, John S.

    2013-09-01

    By assuming that a fermion de-constitutes immediately at its source, that its constituents, as bosons, propagate uniformly as scalar vacuum terms with phase (radial) symmetry, and that fermions are unique solutions for specific phase conditions, we find a model that self-quantizes matter from continuous waves, unifying boson and fermion ontologies in a single basis through a constitution-invariant process. Vacuum energy has a wavefunction context, as a mass-energy term that enables wave collapse and increases its amplitude, with the gravitational field as the gradient of the flux density. Gravitational and charge-based force effects emerge as statistics without special treatment. Confinement, entanglement, vacuum statistics, forces, and wavefunction terms all emerge from the model's deterministic foundations.

  1. An efficient hole-filling method based on depth map in 3D view generation

    NASA Astrophysics Data System (ADS)

    Liang, Haitao; Su, Xiu; Liu, Yilin; Xu, Huaiyuan; Wang, Yi; Chen, Xiaodong

    2018-01-01

    A new virtual view is synthesized through depth image based rendering (DIBR) using a single color image and its associated depth map in 3D view generation. Holes are unavoidably generated in the 2D-to-3D conversion process. We propose a hole-filling method based on the depth map to address this problem. First, we improve the DIBR process by proposing a one-to-four (OTF) algorithm, using the "z-buffer" algorithm to solve the overlap problem. Then, building on the classical patch-based algorithm of Criminisi et al., we propose a hole-filling algorithm that uses the information in the depth map to handle the image after DIBR. To improve the accuracy of the virtual image, inpainting starts from the background side. In the priority calculation, in addition to the confidence term and the data term, we add a depth term. In the search for the most similar patch in the source region, we define a depth similarity to improve search accuracy. Experimental results show that the proposed method effectively improves the quality of the 3D virtual view both subjectively and objectively.
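
    The modified priority described above (confidence, data, and depth terms) can be sketched as follows; the multiplicative form and the particular depth term are illustrative assumptions, not the paper's exact definitions.

        import numpy as np

        def depth_term(depth_patch, background_depth):
            # Hypothetical depth term: favors patches whose mean depth is close to
            # the background depth, so inpainting proceeds from the background side.
            return 1.0 / (1.0 + abs(np.nanmean(depth_patch) - background_depth))

        def patch_priority(confidence, data, depth_patch, background_depth):
            # Criminisi-style priority (confidence * data) augmented with a depth term.
            return confidence * data * depth_term(depth_patch, background_depth)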

  2. A discontinuous Galerkin approach for conservative modeling of fully nonlinear and weakly dispersive wave transformations

    NASA Astrophysics Data System (ADS)

    Sharifian, Mohammad Kazem; Kesserwani, Georges; Hassanzadeh, Yousef

    2018-05-01

    This work extends a robust second-order Runge-Kutta Discontinuous Galerkin (RKDG2) method to fully nonlinear and weakly dispersive flows, with the scope of simultaneously addressing accuracy, conservativeness, cost-efficiency and practical needs. The mathematical model governing such flows is based on a variant form of the Green-Naghdi (GN) equations, decomposed as a hyperbolic shallow water system with an elliptic source term. Practical features of relevance (i.e. conservative modeling over irregular terrain with wetting and drying, and local slope limiting) have been carried over from an RKDG2 solver for the Nonlinear Shallow Water (NSW) equations, alongside new considerations to integrate elliptic source terms (via a fourth-order local discretization of the topography) and to enable local capturing of breaking waves (via a detector for switching off the dispersive terms). Numerical results are presented, demonstrating the overall capability of the proposed approach in achieving realistic prediction of nearshore wave processes involving both nonlinearity and dispersion effects within a single model.

  3. Frequency tunable electronic sources working at room temperature in the 1 to 3 THz band

    NASA Astrophysics Data System (ADS)

    Maestrini, Alain; Mehdi, Imran; Siles, José V.; Lin, Robert; Lee, Choonsup; Chattopadhyay, Goutam; Pearson, John; Siegel, Peter

    2012-10-01

    Compact, room-temperature terahertz sources are much needed in the 1 to 3 THz band for developing multi-pixel heterodyne receivers for astrophysics and planetary science, and for building short-range, high-spatial-resolution THz imaging systems able to see through low-water-content and non-metallic materials, smoke, or dust, in applications ranging from the inspection of art artifacts to the detection of masked or concealed objects. All-solid-state electronic sources based on a W-band synthesizer followed by a high-power W-band amplifier and a cascade of Schottky-diode-based THz frequency multipliers are now capable of producing more than 1 mW at 0.9 THz, 50 μW at 2 THz and 18 μW at 2.6 THz without the need for any cryogenic system. These sources are frequency agile and have a relative bandwidth of 10 to 15%, limited by the high-power W-band amplifiers. The paper presents the latest developments of this technology and its perspectives in terms of frequency range, bandwidth and power.

  4. Rhythmic entrainment source separation: Optimizing analyses of neural responses to rhythmic sensory stimulation.

    PubMed

    Cohen, Michael X; Gulbinaite, Rasa

    2017-02-15

    Steady-state evoked potentials (SSEPs) are rhythmic brain responses to rhythmic sensory stimulation, and are often used to study perceptual and attentional processes. We present a data analysis method for maximizing the signal-to-noise ratio of the narrow-band steady-state response in the frequency and time-frequency domains. The method, termed rhythmic entrainment source separation (RESS), is based on denoising source separation approaches that take advantage of the simultaneous but differential projection of neural activity to multiple electrodes or sensors. Our approach is a combination and extension of existing multivariate source separation methods. We demonstrate that RESS performs well on both simulated and empirical data, and outperforms conventional SSEP analysis methods based on selecting electrodes with the strongest SSEP response, as well as several other linear spatial filters. We also discuss the potential confound of overfitting, whereby the filter captures noise in the absence of a signal. Matlab scripts are available to replicate and extend our simulations and methods. We conclude with some practical advice for optimizing SSEP data analyses and interpreting the results. Copyright © 2016 Elsevier Inc. All rights reserved.
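
    At its core, a RESS-style filter is a generalized eigendecomposition of a narrow-band ("signal") covariance against a reference covariance. The sketch below (Python) shows that step under the assumption that the two covariance matrices have already been estimated from filtered and unfiltered data; it is a simplification of the published method.

        import numpy as np
        from scipy.linalg import eigh

        def ress_filter(cov_signal, cov_reference):
            # Generalized eigenproblem: cov_signal w = lambda * cov_reference w.
            # The eigenvector with the largest eigenvalue maximizes the ratio of
            # narrow-band power to reference power.
            evals, evecs = eigh(cov_signal, cov_reference)
            return evecs[:, np.argmax(evals)]

        # Applying the filter: component = w @ X for data X (channels x samples).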

  5. The National Geographic Names Data Base: Phase II instructions

    USGS Publications Warehouse

    Orth, Donald J.; Payne, Roger L.

    1987-01-01

    not recorded on topographic maps be added. The systematic collection of names from other sources, including maps, charts, and texts, is termed Phase II. In addition, specific types of features not compiled during Phase I are encoded and added to the data base. Other names of importance to researchers and users, such as historical and variant names, are also included. The rules and procedures for Phase II research, compilation, and encoding are contained in this publication.

  6. Multi-sources data fusion framework for remote triage prioritization in telehealth.

    PubMed

    Salman, O H; Rasid, M F A; Saripan, M I; Subramaniam, S K

    2014-09-01

    The healthcare industry is streamlining processes to offer more timely and effective services to all patients. Computerized software algorithms and smart devices can streamline the relation between users and doctors by providing more services within healthcare telemonitoring systems. This paper proposes a multi-source framework to support advanced healthcare applications. The proposed framework, named Multi Sources Healthcare Architecture (MSHA), considers multiple sources: sensors (ECG, SpO2 and blood pressure) and text-based inputs from wireless and pervasive devices of a Wireless Body Area Network. The framework is used to improve healthcare scalability and efficiency by enhancing the remote triage and remote prioritization processes for patients, and to provide intelligent services over telemonitoring healthcare systems using a data fusion method and a prioritization technique. As the telemonitoring system consists of three tiers (sensors/sources, base station, and server), the simulation of the MSHA algorithm at the base station is demonstrated in this paper. Achieving a high level of accuracy in remotely triaging and prioritizing patients is our main goal, and the role of multi-source data fusion in telemonitoring healthcare services systems is demonstrated. In addition, we discuss how the proposed framework can be applied in a healthcare telemonitoring scenario. Simulation results for different symptoms, related to different emergency levels of chronic heart disease, demonstrate the superiority of our algorithm over conventional algorithms in terms of classifying and prioritizing patients remotely.

  7. Long-Term Deflection Prediction from Computer Vision-Measured Data History for High-Speed Railway Bridges

    PubMed Central

    Lee, Jaebeom; Lee, Young-Joo

    2018-01-01

    Management of the vertical long-term deflection of a high-speed railway bridge is a crucial factor in guaranteeing traffic safety and passenger comfort. Therefore, there have been efforts to predict the vertical deflection of a railway bridge based on physics-based models representing various factors that influence vertical deflection, such as concrete creep and shrinkage. However, this is not an easy task because the vertical deflection of a railway bridge generally involves several sources of uncertainty. This paper proposes a probabilistic method that employs a Gaussian process to construct a model to predict the vertical deflection of a railway bridge based on actual vision-based measurements and temperature. To deal with the sources of uncertainty which may cause prediction errors, the Gaussian process is modeled with multiple kernels and hyperparameters. Once the hyperparameters are identified through Gaussian process regression using training data, the proposed method provides a 95% prediction interval as well as a predictive mean for the vertical deflection of the bridge. The proposed method is applied to an arch bridge under operation for high-speed trains in South Korea. The analysis results obtained from the proposed method show good agreement with the actual measurement data on the vertical deflection of the example bridge, and the prediction results can be utilized for decision-making on railway bridge maintenance. PMID:29747421
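
    A minimal sketch of the modeling pattern described above (multi-kernel Gaussian process, hyperparameters learned from training data, 95% interval); the toy inputs, kernel choice, and units are assumptions, not the paper's configuration.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel

        # Toy training data: [elapsed time (days), temperature (C)] -> deflection (mm).
        X = np.array([[0.0, 12.0], [30.0, 15.0], [60.0, 18.0], [90.0, 22.0]])
        y = np.array([0.0, -0.4, -0.7, -0.9])

        # Combination of kernels; hyperparameters are identified during fit()
        # by maximizing the log marginal likelihood.
        kernel = ConstantKernel() * RBF(length_scale=[30.0, 5.0]) + WhiteKernel()
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

        mean, std = gp.predict(np.array([[120.0, 20.0]]), return_std=True)
        lower, upper = mean - 1.96 * std, mean + 1.96 * std   # 95% prediction interval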

  8. Long-Term Deflection Prediction from Computer Vision-Measured Data History for High-Speed Railway Bridges.

    PubMed

    Lee, Jaebeom; Lee, Kyoung-Chan; Lee, Young-Joo

    2018-05-09

    Management of the vertical long-term deflection of a high-speed railway bridge is a crucial factor in guaranteeing traffic safety and passenger comfort. Therefore, there have been efforts to predict the vertical deflection of a railway bridge based on physics-based models representing various factors that influence vertical deflection, such as concrete creep and shrinkage. However, this is not an easy task because the vertical deflection of a railway bridge generally involves several sources of uncertainty. This paper proposes a probabilistic method that employs a Gaussian process to construct a model to predict the vertical deflection of a railway bridge based on actual vision-based measurements and temperature. To deal with the sources of uncertainty which may cause prediction errors, the Gaussian process is modeled with multiple kernels and hyperparameters. Once the hyperparameters are identified through Gaussian process regression using training data, the proposed method provides a 95% prediction interval as well as a predictive mean for the vertical deflection of the bridge. The proposed method is applied to an arch bridge under operation for high-speed trains in South Korea. The analysis results obtained from the proposed method show good agreement with the actual measurement data on the vertical deflection of the example bridge, and the prediction results can be utilized for decision-making on railway bridge maintenance.

  9. An object-oriented design for automated navigation of semantic networks inside a medical data dictionary.

    PubMed

    Ruan, W; Bürkle, T; Dudeck, J

    2000-01-01

    In this paper we present a data dictionary server for the automated navigation of information sources. The underlying knowledge is represented within a medical data dictionary, and the mapping between medical terms and information sources is based on a semantic network. The key aspect of implementing the dictionary server is how to represent the semantic network in a way that is easy to navigate and operate, i.e. how to abstract the semantic network and represent it in memory for various operations. This paper describes an object-oriented design, based on Java, that represents the semantic network as a group of objects: a node and its relationships to its neighbors are encapsulated in one object. Based on such a representation model, several operations have been implemented. They comprise the extraction of the parts of the semantic network that can be reached from a given node, as well as finding all paths between a start node and a predefined destination node. This solution is independent of any given layout of the semantic structure, so the module, called the Giessen Data Dictionary Server, can act independently of a specific clinical information system. The dictionary server will be used to present clinical information, e.g. treatment guidelines or drug information sources, to the clinician in an appropriate working context. The server is invoked from clinical documentation applications which contain an infobutton. Automated navigation will guide the user to all the information relevant to her/his topic that is currently available inside our closed clinical network.
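
    The two operations named above (extraction of the reachable subnetwork and the all-paths search) are straightforward once each node encapsulates its neighbors. The sketch below is a generic illustration in Python with hypothetical names, not the server's actual classes.

        class Node:
            def __init__(self, term):
                self.term = term
                self.neighbors = {}            # neighbor Node -> relationship label

            def link(self, other, relation):
                self.neighbors[other] = relation

        def reachable(start, seen=None):
            # Extract the part of the semantic network reachable from a given node.
            seen = seen or set()
            seen.add(start)
            for nxt in start.neighbors:
                if nxt not in seen:
                    reachable(nxt, seen)
            return seen

        def all_paths(start, goal, path=()):
            # Find every acyclic path between a start node and a destination node.
            path = path + (start,)
            if start is goal:
                return [path]
            return [p for nxt in start.neighbors if nxt not in path
                    for p in all_paths(nxt, goal, path)]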

  10. Standardization of terminology in field of ionizing radiations and their measurements

    NASA Astrophysics Data System (ADS)

    Yudin, M. F.; Karaveyev, F. M.

    1984-03-01

    A new standard terminology was introduced on 1 January 1982 by the Scientific-Technical Commission on All-Union State Standards to cover ionizing radiations and their measurements. It is based on earlier standards such as GOST 15484-74/81, 18445-70/73, 19849-74, and 22490-77, as well as the latest recommendations of international committees. It contains 186 terms and definitions in 14 paragraphs, covering fundamental concepts, sources and forms of ionizing radiations, characteristics and parameters of ionizing radiations, and methods of measuring these characteristics and parameters. New terms have been added to existing ones, and equivalent English, French, and German terms are also given. The terms "measurement of ionizing radiation" and "transfer of ionizing particles" (equivalent to particle fluence or energy fluence) are still under discussion.

  11. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohd, Shukri; Holford, Karen M.; Pullin, Rhys

    2014-02-12

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique, termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long with a 220 mm outer diameter and a nominal thickness of 5 mm, under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and DeltaT location. The results of the study show that the WTML method produces more accurate location results compared with the TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the DeltaT location method but requires no initial acoustic calibration of the structure.

  12. A time reversal algorithm in acoustic media with Dirac measure approximations

    NASA Astrophysics Data System (ADS)

    Bretin, Élie; Lucas, Carine; Privat, Yannick

    2018-04-01

    This article is devoted to the study of a photoacoustic tomography model in which one considers the solution of the acoustic wave equation with a source term written as a separated-variables function of time and space, whose temporal component is, in some sense, close to the derivative of the Dirac distribution at t = 0. This models a continuous-wave laser illumination performed during a short interval of time. We introduce an algorithm for reconstructing the spatial component of the source term from measurements of the solution recorded by sensors over a time T along the boundary of a connected bounded domain. It is based on the introduction of an auxiliary equivalent Cauchy problem, which allows an explicit reconstruction formula to be derived, followed by the use of a deconvolution procedure. Numerical simulations illustrate our approach. Finally, the algorithm is also extended to elastic wave systems.

  13. A Two-moment Radiation Hydrodynamics Module in ATHENA Using a Godunov Method

    NASA Astrophysics Data System (ADS)

    Skinner, M. A.; Ostriker, E. C.

    2013-04-01

    We describe a module for the Athena code that solves the grey equations of radiation hydrodynamics (RHD) using a local variable Eddington tensor (VET) based on the M1 closure of the two-moment hierarchy of the transfer equation. The variables are updated via a combination of explicit Godunov methods to advance the gas and radiation variables including the non-stiff source terms, and a local implicit method to integrate the stiff source terms. We employ the reduced speed of light approximation (RSLA) with subcycling of the radiation variables in order to reduce computational costs. The streaming and diffusion limits are well-described by the M1 closure model, and our implementation shows excellent behavior for problems containing both regimes simultaneously. Our operator-split method is ideally suited for problems with a slowly-varying radiation field and dynamical gas flows, in which the effect of the RSLA is minimal.
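
    The update cycle described above follows a common operator-split pattern. The sketch below is schematic (illustrative function names, not the module's actual interface): an explicit Godunov advance of gas and radiation including the non-stiff sources, followed by a local implicit integration of the stiff source terms, with the radiation variables subcycled as in the reduced speed of light approximation (RSLA).

        def split_step(gas, rad, dt, godunov_advance, implicit_sources, n_sub):
            # 1) Explicit Godunov update of gas + radiation with non-stiff sources.
            gas, rad = godunov_advance(gas, rad, dt)
            # 2) Local (cell-by-cell) implicit integration of the stiff source terms,
            #    subcycled over the shorter radiation timestep implied by the RSLA.
            for _ in range(n_sub):
                gas, rad = implicit_sources(gas, rad, dt / n_sub)
            return gas, rad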

  14. Volume Averaging Study of the Capacitive Deionization Process in Homogeneous Porous Media

    DOE PAGES

    Gabitto, Jorge; Tsouris, Costas

    2015-05-05

    Ion storage in porous electrodes is important in applications such as energy storage by supercapacitors, water purification by capacitive deionization, extraction of energy from a salinity difference, and heavy ion purification. In this paper, a model is presented to simulate the charge process in homogeneous porous media comprising large pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Transport between the electrolyte solution and the charged wall is described using the Gouy–Chapman–Stern model. The effective transport parameters for isotropic porous media are calculated by solving the corresponding closure problems. Finally, the source terms that appear in the averaged equations are calculated using numerical computations, and an alternative way to deal with the source terms is proposed.

  15. Estimates of long-term mean-annual nutrient loads considered for use in SPARROW models of the Midcontinental region of Canada and the United States, 2002 base year

    USGS Publications Warehouse

    Saad, David A.; Benoy, Glenn A.; Robertson, Dale M.

    2018-05-11

    Streamflow and nutrient concentration data needed to compute nitrogen and phosphorus loads were compiled from Federal, State, Provincial, and local agency databases and also from selected university databases. The nitrogen and phosphorus loads are necessary inputs to Spatially Referenced Regressions on Watershed Attributes (SPARROW) models. SPARROW models are a way to estimate the distribution, sources, and transport of nutrients in streams throughout the Midcontinental region of Canada and the United States. After screening the data, approximately 1,500 sites sampled by 34 agencies were identified as having suitable data for calculating the long-term mean-annual nutrient loads required for SPARROW model calibration. These final sites represent a wide range in watershed sizes, types of nutrient sources, and land-use and watershed characteristics in the Midcontinental region of Canada and the United States.

  16. Reprint of: High current liquid metal ion source using porous tungsten multiemitters.

    PubMed

    Tajmar, M; Vasiljevich, I; Grienauer, W

    2011-05-01

    We recently developed an indium liquid metal ion source (LMIS) that can emit currents from sub-μA up to several mA. It is based on a porous tungsten crown structure with 28 individual emitters, manufactured using micro-powder injection molding (μPIM) and electrochemical etching. The emitter combines the advantages of internal capillary feeding with excellent emission properties due to micron-size tips. Significant progress was made on the homogeneity of the emission over its current-voltage characteristic as well as on investigating its long-term stability. This LMIS appears very suitable for space propulsion as well as for micro/nano manufacturing applications with greatly increased milling/drilling speeds. This paper summarizes the latest developments on our porous multiemitters with respect to manufacturing, emission properties and long-term testing. Copyright © 2010 Elsevier B.V. All rights reserved.

  17. Rebound of a coal tar creosote plume following partial source zone treatment with permanganate.

    PubMed

    Thomson, N R; Fraser, M J; Lamarche, C; Barker, J F; Forsey, S P

    2008-11-14

    The long-term management of dissolved plumes originating from a coal tar creosote source is a technical challenge. For some sites stabilization of the source may be the best practical solution to decrease the contaminant mass loading to the plume and associated off-site migration. At the bench-scale, the deposition of manganese oxides, a permanganate reaction byproduct, has been shown to cause pore plugging and the formation of a manganese oxide layer adjacent to the non-aqueous phase liquid creosote which reduces post-treatment mass transfer and hence mass loading from the source. The objective of this study was to investigate the potential of partial permanganate treatment to reduce the ability of a coal tar creosote source zone to generate a multi-component plume at the pilot-scale over both the short-term (weeks to months) and the long-term (years) at a site where there is >10 years of comprehensive synoptic plume baseline data available. A series of preliminary bench-scale experiments were conducted to support this pilot-scale investigation. The results from the bench-scale experiments indicated that if sufficient mass removal of the reactive compounds is achieved then the effective solubility, aqueous concentration and rate of mass removal of the more abundant non-reactive coal tar creosote compounds such as biphenyl and dibenzofuran can be increased. Manganese oxide formation and deposition caused an order-of-magnitude decrease in hydraulic conductivity. Approximately 125 kg of permanganate were delivered into the pilot-scale source zone over 35 days, and based on mass balance estimates <10% of the initial reactive coal tar creosote mass in the source zone was oxidized. Mass discharge estimated at a down-gradient fence line indicated >35% reduction for all monitored compounds except for biphenyl, dibenzofuran and fluoranthene 150 days after treatment, which is consistent with the bench-scale experimental results. Pre- and post-treatment soil core data indicated a highly variable and random spatial distribution of mass within the source zone and provided no insight into the mass removed of any of the monitored species. The down-gradient plume was monitored approximately 1, 2 and 4 years following treatment. The data collected at 1 and 2 years post-treatment showed a decrease in mass discharge (10 to 60%) and/or total plume mass (0 to 55%); however, by 4 years post-treatment there was a rebound in both mass discharge and total plume mass for all monitored compounds to pre-treatment values or higher. The variability of the data collected was too large to resolve subtle changes in plume morphology, particularly near the source zone, that would provide insight into the impact of the formation and deposition of manganese oxides that occurred during treatment on mass transfer and/or flow by-passing. Overall, the results from this pilot-scale investigation indicate that there was a significant but short-term (months) reduction of mass emanating from the source zone as a result of permanganate treatment but there was no long-term (years) impact on the ability of this coal tar creosote source zone to generate a multi-component plume.

  18. Field measurements and modeling to resolve m2 to km2 CH4 emissions for a complex urban source: An Indiana landfill study

    USDA-ARS?s Scientific Manuscript database

    Large uncertainties for landfill CH4 emissions due to spatial and temporal variabilities remain unresolved by short-term field campaigns and historic GHG inventory models. Using four field methods (aircraft-based mass balance, tracer correlation, vertical radial plume mapping, and static chambers) ...

  19. NASA Cold Land Processes Experiment (CLPX 2002/03): Ground-based and near-surface meteorological observations

    Treesearch

    Kelly Elder; Don Cline; Angus Goodbody; Paul Houser; Glen E. Liston; Larry Mahrt; Nick Rutter

    2009-01-01

    A short-term meteorological database has been developed for the Cold Land Processes Experiment (CLPX). This database includes meteorological observations from stations designed and deployed exclusively for CLPX, as well as observations available from other sources located in the small regional study area (SRSA) in north-central Colorado. The measured weather parameters...

  20. An Open Source Agenda for Research Linking Text and Image Content Features.

    ERIC Educational Resources Information Center

    Goodrum, Abby A.; Rorvig, Mark E.; Jeong, Ki-Tai; Suresh, Chitturi

    2001-01-01

    Proposes methods to utilize image primitives to support term assignment for image classification. Proposes to release code for image analysis in a common tool set for other researchers to use. Of particular focus is the expansion of work by researchers in image indexing to include image content-based feature extraction capabilities in their work.…

  1. Children's Understanding of Health and Health-Related Behavior: The Influence of Age and Information Source

    ERIC Educational Resources Information Center

    Daigle, Kay; Hebert, Edward; Humphries, Charlotte

    2007-01-01

    Background: Health is recognized as an important factor in children's educational achievement, and school-based health education programs are recognized for their potential long-term impact. However, the extent to which children conceptualize health similarly as adults, and the nature of their understandings of health, remain cloudy. Purpose:…

  2. Preventing Bandwidth Abuse at the Router through Sending Rate Estimate-based Active Queue Management

    DTIC Science & Technology

    2007-06-01

    behavior is growing in the Internet. These non-responsive sources can monopolize network bandwidth and starve the "congestion friendly" flows. Without...unnecessarily complex because most of the flows in the Internet are short flows, usually termed "web mice" [7]. Moreover, having a separate queue for each

  3. Renewable Electricity Policy in Germany, 1974 to 2005

    ERIC Educational Resources Information Center

    Lauber, Volkmar; Mez, Lutz

    2006-01-01

    Of the large industrial countries, Germany is clearly leading with regard to new renewable energy sources, occupying first rank in terms of installed capacity for wind energy and second for photovoltaics. This is not because of an exceptional natural resource base but because of public policy in this area, despite the fact that this policy was…

  4. Analysis of Scientific Research Related Anxiety Levels of Undergraduate Students'

    ERIC Educational Resources Information Center

    Yildirim, Sefa; Hasiloglu, Mehmet Akif

    2018-01-01

    In this study, it was aimed to identify the scientific research-related anxiety levels of undergraduate students studying in the faculty of science and letters and the faculty of education, and to analyse these anxiety levels in terms of various variables (students' gender, use of web-based information sources, going to the library,…

  5. Diffraction-based optical correlator

    NASA Technical Reports Server (NTRS)

    Spremo, Stevan M. (Inventor); Fuhr, Peter L. (Inventor); Schipper, John F. (Inventor)

    2005-01-01

    Method and system for wavelength-based processing of a light beam. A light beam, produced at a chemical or physical reaction site and having at least first and second wavelengths, λ1 and λ2, is received and diffracted at a first diffraction grating to provide first and second diffracted beams, which are received and analyzed in terms of wavelength and/or time at two spaced-apart light detectors. In a second embodiment, light from first and second sources is diffracted and compared in terms of wavelength and/or time to determine whether the two beams arise from the same source. In a third embodiment, a light beam is split, diffracted, and passed through first and second environments to study differential effects. In a fourth embodiment, diffracted light beam components, having first and second wavelengths, are received sequentially at a reaction site to determine whether a specified reaction is promoted, based on the order of receipt of the beams. In a fifth embodiment, a cylindrically shaped diffraction grating (uniform or chirped) is rotated and translated to provide a sequence of diffracted beams with different wavelengths. In a sixth embodiment, incident light, representing one or more symbols, is successively diffracted from first and second diffraction gratings and received at different light detectors, depending upon the wavelengths present in the incident light.

  6. Combining molecular fingerprints with multidimensional scaling analyses to identify the source of spilled oil from highly similar suspected oils.

    PubMed

    Zhou, Peiyu; Chen, Changshu; Ye, Jianjun; Shen, Wenjie; Xiong, Xiaofei; Hu, Ping; Fang, Hongda; Huang, Chuguang; Sun, Yongge

    2015-04-15

    Oil fingerprints are a powerful tool widely used to determine the source of spilled oil, and in most cases this tool works well. However, it is usually difficult to identify the source when an oil spill occurs during offshore petroleum exploration, because suspected oils from the same drilling platform have highly similar physicochemical characteristics. In this report, a case study from the waters of the South China Sea is presented, and multidimensional scaling analysis (MDS) is introduced to demonstrate how oil fingerprints can be combined with mathematical methods to identify the source of spilled oil among highly similar suspected sources. The results suggest that an MDS calculation based on oil fingerprints, subsequently integrated with specific biomarkers in the spilled oils, is an effective method with great potential for source determination among highly similar suspected oils. Copyright © 2015 Elsevier Ltd. All rights reserved.
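
    The MDS step itself is simple to reproduce generically. In the sketch below, the biomarker-ratio features are toy values for illustration only; the spilled sample's proximity to one suspected source in the low-dimensional configuration is what would support a match.

        import numpy as np
        from sklearn.manifold import MDS

        # Rows: suspected oils A-C and the spilled oil; columns: toy diagnostic ratios.
        ratios = np.array([
            [0.48, 1.02, 0.31],   # suspected oil A
            [0.47, 1.05, 0.30],   # suspected oil B
            [0.52, 0.98, 0.36],   # suspected oil C
            [0.51, 0.99, 0.35],   # spilled oil
        ])
        coords = MDS(n_components=2, random_state=0).fit_transform(ratios)
        # Inspect which suspected oil lies nearest the spill in the 2-D embedding.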

  7. Electromagnetic fields and the public: EMF standards and estimation of risk

    NASA Astrophysics Data System (ADS)

    Grigoriev, Yury

    2010-04-01

    Mobile communications are a relatively new and additional source of electromagnetic exposure for the population. Standard daily mobile-phone use is known to increase RF-EMF (radiofrequency electromagnetic field) exposure to the brains of users of all ages, while mobile-phone base stations and base station units for cordless phones can regularly increase the exposure of large numbers of the population to RF-EMF radiation in everyday life. Determining appropriate standards that stipulate the maximum acceptable short-term and long-term RF-EMF levels encountered by the public, and setting such levels as general guidelines, is of great importance for preserving the health of the general public and of the next generation.

  8. Multigrid Method for Modeling Multi-Dimensional Combustion with Detailed Chemistry

    NASA Technical Reports Server (NTRS)

    Zheng, Xiaoqing; Liu, Chaoqun; Liao, Changming; Liu, Zhining; McCormick, Steve

    1996-01-01

    A highly accurate and efficient numerical method is developed for modeling 3-D reacting flows with detailed chemistry. A contravariant velocity-based governing system is developed for general curvilinear coordinates to maintain simplicity of the continuity equation and compactness of the discretization stencil. A fully-implicit backward Euler technique and a third-order monotone upwind-biased scheme on a staggered grid are used for the respective temporal and spatial terms. An efficient semi-coarsening multigrid method based on line-distributive relaxation is used as the flow solver. The species equations are solved in a fully coupled way and the chemical reaction source terms are treated implicitly. Example results are shown for a 3-D gas turbine combustor with strong swirling inflows.

  9. Diamond-based single-photon emitters

    NASA Astrophysics Data System (ADS)

    Aharonovich, I.; Castelletto, S.; Simpson, D. A.; Su, C.-H.; Greentree, A. D.; Prawer, S.

    2011-07-01

    The exploitation of emerging quantum technologies requires efficient fabrication of key building blocks. Sources of single photons are extremely important across many applications as they can serve as vectors for quantum information—thereby allowing long-range (perhaps even global-scale) quantum states to be made and manipulated for tasks such as quantum communication or distributed quantum computation. At the single-emitter level, quantum sources also afford new possibilities in terms of nanoscopy and bio-marking. Color centers in diamond are prominent candidates to generate and manipulate quantum states of light, as they are a photostable solid-state source of single photons at room temperature. In this review, we discuss the state of the art of diamond-based single-photon emitters and highlight their fabrication methodologies. We present the experimental techniques used to characterize the quantum emitters and discuss their photophysical properties. We outline a number of applications including quantum key distribution, bio-marking and sub-diffraction imaging, where diamond-based single emitters are playing a crucial role. We conclude with a discussion of the main challenges and perspectives for employing diamond emitters in quantum information processing.

  10. Helicon plasma generator-assisted surface conversion ion source for the production of H- ion beams at the Los Alamos Neutron Science Center

    NASA Astrophysics Data System (ADS)

    Tarvainen, O.; Rouleau, G.; Keller, R.; Geros, E.; Stelzer, J.; Ferris, J.

    2008-02-01

    The converter-type negative ion source currently employed at the Los Alamos Neutron Science Center (LANSCE) is based on cesium-enhanced surface production of H- ion beams in a filament-driven discharge. In this kind of ion source the extracted H- beam current is limited by the achievable plasma density, which depends primarily on the electron emission current from the filaments. The emission current can be increased by raising the filament temperature but, unfortunately, this leads not only to shorter filament lifetime but also to increased metal evaporation from the filament, which deposits on the H- converter surface and degrades its performance. Therefore, we have started an ion source development project focused on replacing the thermionic cathodes (filaments) of the converter source with a helicon plasma generator capable of producing high-density hydrogen plasmas with low electron energy. Our studies have so far shown that the plasma density of the surface conversion source can be increased significantly by exciting a helicon wave in the plasma, and we expect to improve the performance of the surface converter H- ion source in terms of beam brightness and time between services. The design of this new source and preliminary results are presented, along with a discussion of physical processes relevant to H- ion beam production with this novel design. Ultimately, we perceive this approach as an interim step towards our long-term goal of combining a helicon plasma generator with an SNS-type main discharge chamber, which will allow us to individually optimize the plasma properties of the plasma cathode (helicon) and of H- production (main discharge) in order to further improve the brightness of extracted H- ion beams.

  11. Helicon plasma generator-assisted surface conversion ion source for the production of H(-) ion beams at the Los Alamos Neutron Science Center.

    PubMed

    Tarvainen, O; Rouleau, G; Keller, R; Geros, E; Stelzer, J; Ferris, J

    2008-02-01

    The converter-type negative ion source currently employed at the Los Alamos Neutron Science Center (LANSCE) is based on cesium-enhanced surface production of H(-) ion beams in a filament-driven discharge. In this kind of ion source the extracted H(-) beam current is limited by the achievable plasma density, which depends primarily on the electron emission current from the filaments. The emission current can be increased by raising the filament temperature but, unfortunately, this leads not only to shorter filament lifetime but also to increased metal evaporation from the filament, which deposits on the H(-) converter surface and degrades its performance. Therefore, we have started an ion source development project focused on replacing the thermionic cathodes (filaments) of the converter source with a helicon plasma generator capable of producing high-density hydrogen plasmas with low electron energy. Our studies have so far shown that the plasma density of the surface conversion source can be increased significantly by exciting a helicon wave in the plasma, and we expect to improve the performance of the surface converter H(-) ion source in terms of beam brightness and time between services. The design of this new source and preliminary results are presented, along with a discussion of physical processes relevant to H(-) ion beam production with this novel design. Ultimately, we perceive this approach as an interim step towards our long-term goal of combining a helicon plasma generator with an SNS-type main discharge chamber, which will allow us to individually optimize the plasma properties of the plasma cathode (helicon) and of H(-) production (main discharge) in order to further improve the brightness of extracted H(-) ion beams.

  12. Dictionary-Based Tensor Canonical Polyadic Decomposition

    NASA Astrophysics Data System (ADS)

    Cohen, Jeremy Emile; Gillis, Nicolas

    2018-04-01

    To ensure the interpretability of extracted sources in tensor decomposition, we introduce in this paper a dictionary-based tensor canonical polyadic decomposition which enforces one factor to belong exactly to a known dictionary. A new formulation of sparse coding is proposed which enables dictionary-based canonical polyadic decomposition of high-dimensional tensors. The benefits of using a dictionary in tensor decomposition models are explored both in terms of parameter identifiability and estimation accuracy. The performance of the proposed algorithms is evaluated on the decomposition of simulated data and the unmixing of hyperspectral images.
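
    One simple way to enforce such a constraint inside an alternating least-squares CPD loop is to project the estimated factor onto its best-matching dictionary atoms at each iteration. The sketch below shows that projection step only; it is an illustrative simplification, not the paper's sparse-coding formulation.

        import numpy as np

        def project_to_dictionary(A, D):
            # A: estimated factor (I x R); D: known dictionary (I x K).
            # Replace each column of A by the atom with highest absolute correlation.
            Dn = D / np.linalg.norm(D, axis=0)
            An = A / np.linalg.norm(A, axis=0)
            idx = np.argmax(np.abs(Dn.T @ An), axis=0)   # best atom per component
            return D[:, idx], idx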

  13. BWR ASSEMBLY SOURCE TERMS FOR WASTE PACKAGE DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.L. Lotz

    1997-02-15

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) to provide boiling water reactor (BWR) assembly radiation source term data for use during Waste Package (WP) design. The BWR assembly radiation source terms are to be used for evaluation of radiolysis effects at the WP surface and for personnel shielding requirements during assembly or WP handling operations. The objective of this evaluation is to generate BWR assembly radiation source terms that bound selected groupings of BWR assemblies, with regard to assembly average burnup and cooling time, which comprise the anticipated MGDS BWR commercial spent nuclear fuel (SNF) waste stream. The source term data are to be provided in a form which can easily be utilized in subsequent shielding/radiation dose calculations. Since these calculations may also be used for Total System Performance Assessment (TSPA), with appropriate justification provided by TSPA, or for radionuclide release rate analysis, the grams of each element and additional cooling times out to 25 years are also calculated and the data included in the output files.

  14. Repeat immigration: A previously unobserved source of heterogeneity?

    PubMed

    Aradhya, Siddartha; Scott, Kirk; Smith, Christopher D

    2017-07-01

    Register data allow for nuanced analyses of heterogeneities between sub-groups which are not observable in other data sources. One heterogeneity for which register data is particularly useful is in identifying unique migration histories of immigrant populations, a group of interest across disciplines. Years since migration is a commonly used measure of integration in studies seeking to understand the outcomes of immigrants. This study constructs detailed migration histories to test whether misclassified migrations may mask important heterogeneities. In doing so, we identify a previously understudied group of migrants called repeat immigrants, and show that they differ systematically from permanent immigrants. In addition, we quantify the degree to which migration information is misreported in the registers. The analysis is carried out in two steps. First, we estimate income trajectories for repeat immigrants and permanent immigrants to understand the degree to which they differ. Second, we test data validity by cross-referencing migration information with changes in income to determine whether there are inconsistencies indicating misreporting. From the first part of the analysis, the results indicate that repeat immigrants systematically differ from permanent immigrants in terms of income trajectories. Furthermore, income trajectories differ based on the way in which years since migration is calculated. The second part of the analysis suggests that misreported migration events, while present, are negligible. Repeat immigrants differ in terms of income trajectories, and may differ in terms of other outcomes as well. Furthermore, this study underlines that Swedish registers provide a reliable data source to analyze groups which are unidentifiable in other data sources.

  15. Effects of Varying Nitrogen Sources on Amino Acid Synthesis Costs in Arabidopsis thaliana under Different Light and Carbon-Source Conditions

    PubMed Central

    Nikoloski, Zoran

    2015-01-01

    Plants as sessile organisms cannot escape their environment and have to adapt to any changes in the availability of sunlight and nutrients. The quantification of synthesis costs of metabolites, in terms of consumed energy, is a prerequisite to understanding trade-offs arising from energetic limitations. Here, we examine the energy consumption of amino acid synthesis in Arabidopsis thaliana. To quantify these costs in terms of the energy equivalent ATP, we introduce an improved cost measure based on flux balance analysis and apply it to three state-of-the-art metabolic reconstructions to ensure robust results. We present the first systematic in silico analysis of the effect of nitrogen supply (nitrate/ammonium) on individual amino acid synthesis costs as well as of the effect of photoautotrophic and heterotrophic growth conditions, integrating day/night-specific regulation. Our results identify nitrogen supply as a key determinant of amino acid costs, in agreement with experimental evidence. In addition, the association of the determined costs with experimentally observed growth patterns suggests that metabolite synthesis costs are involved in shaping regulation of plant growth. Finally, we find that simultaneous uptake of both nitrogen sources can lead to efficient utilization of the energy source, which may be the result of evolutionary optimization. PMID:25706533
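
    The cost measure described above rests on flux balance analysis, i.e., a linear program over steady-state fluxes. The toy model below (our construction, not one of the three Arabidopsis reconstructions used in the paper) shows the mechanics: minimize the ATP supply flux subject to steady state (S v = 0) and a fixed demand for one amino acid.

      import numpy as np
      from scipy.optimize import linprog

      # Rows: ATP, precursor, amino acid. Columns: v0 ATP supply, v1 precursor
      # supply, v2 synthesis (1 precursor + 2 ATP -> 1 AA), v3 AA demand.
      S = np.array([[1, 0, -2,  0],
                    [0, 1, -1,  0],
                    [0, 0,  1, -1]])
      c = np.array([1, 0, 0, 0])            # objective: ATP supply consumed
      bounds = [(0, None), (0, None), (0, None), (1.0, 1.0)]  # demand fixed
      res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
      print("ATP cost per unit amino acid:", res.fun)   # -> 2.0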

  16. Effects of varying nitrogen sources on amino acid synthesis costs in Arabidopsis thaliana under different light and carbon-source conditions.

    PubMed

    Arnold, Anne; Sajitz-Hermstein, Max; Nikoloski, Zoran

    2015-01-01

    Plants as sessile organisms cannot escape their environment and have to adapt to any changes in the availability of sunlight and nutrients. The quantification of synthesis costs of metabolites, in terms of consumed energy, is a prerequisite to understanding trade-offs arising from energetic limitations. Here, we examine the energy consumption of amino acid synthesis in Arabidopsis thaliana. To quantify these costs in terms of the energy equivalent ATP, we introduce an improved cost measure based on flux balance analysis and apply it to three state-of-the-art metabolic reconstructions to ensure robust results. We present the first systematic in silico analysis of the effect of nitrogen supply (nitrate/ammonium) on individual amino acid synthesis costs as well as of the effect of photoautotrophic and heterotrophic growth conditions, integrating day/night-specific regulation. Our results identify nitrogen supply as a key determinant of amino acid costs, in agreement with experimental evidence. In addition, the association of the determined costs with experimentally observed growth patterns suggests that metabolite synthesis costs are involved in shaping regulation of plant growth. Finally, we find that simultaneous uptake of both nitrogen sources can lead to efficient utilization of the energy source, which may be the result of evolutionary optimization.

  17. Issues and Methods Concerning the Evaluation of Hypersingular and Near-Hypersingular Integrals in BEM Formulations

    NASA Technical Reports Server (NTRS)

    Fink, P. W.; Khayat, M. A.; Wilton, D. R.

    2005-01-01

    It is known that higher order modeling of the sources and the geometry in Boundary Element Modeling (BEM) formulations is essential to highly efficient computational electromagnetics. However, in order to achieve the benefits of higher order basis and geometry modeling, the singular and near-singular terms arising in BEM formulations must be integrated accurately. In particular, the accurate integration of near-singular terms, which occur when observation points are near but not on source regions of the scattering object, has been considered one of the remaining limitations on the computational efficiency of integral equation methods. The method of singularity subtraction has been used extensively for the evaluation of singular and near-singular terms. Piecewise integration of the source terms in this manner, while manageable for bases of constant and linear orders, becomes unwieldy and prone to error for bases of higher order. Furthermore, we find that the singularity subtraction method is not conducive to object-oriented programming practices, particularly in the context of multiple operators. To extend the capabilities, accuracy, and maintainability of general-purpose codes, the subtraction method is being replaced in favor of purely numerical quadrature schemes. These schemes employ singularity cancellation methods in which a change of variables is chosen such that the Jacobian of the transformation cancels the singularity. An example of the singularity cancellation approach is the Duffy method, which has two major drawbacks: 1) in the resulting integrand, it produces an angular variation about the singular point that becomes nearly singular for observation points close to an edge of the parent element, and 2) it appears not to work well when applied to nearly singular integrals. Recently, the authors have introduced the transformation $u(x') = \sinh^{-1}\bigl(x'/\sqrt{y'^{2}+z^{2}}\bigr)$ for integrating functions of the form $I = \int_{D} \Lambda(\mathbf{r}')\,\frac{e^{-jkR}}{4\pi R}\,dD$, where $\Lambda(\mathbf{r}')$ is a vector or scalar basis function and $R = \sqrt{x'^{2}+y'^{2}+z^{2}}$ is the distance between source and observation points. This scheme has all of the advantages of the Duffy method while avoiding the disadvantages listed above. In this presentation we will survey similar approaches for handling singular and near-singular terms for kernels with $1/R^{2}$ type behavior, addressing potential pitfalls and offering techniques to efficiently handle special cases.
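
    To make the cancellation mechanism concrete, here is the standard one-line derivation (our addition, following the substitution quoted above): with $d = \sqrt{y'^{2}+z^{2}}$ held fixed, setting $x' = d\sinh u$ gives $R = d\cosh u$, so the Jacobian exactly cancels the $1/R$ kernel.

      \[
        u(x') = \sinh^{-1}\!\frac{x'}{\sqrt{y'^{2}+z^{2}}}, \qquad
        R = \sqrt{x'^{2}+y'^{2}+z^{2}} = d\cosh u, \qquad
        dx' = d\cosh u \, du,
      \]
      \[
        \int \frac{f(x')}{R}\, dx'
          = \int f(d\sinh u)\,\frac{d\cosh u}{d\cosh u}\, du
          = \int f(d\sinh u)\, du .
      \]

    The transformed integrand is smooth near the source point, which is why the scheme avoids the angular near-singularity that afflicts the Duffy method.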

  18. A large eddy simulation scheme for turbulent reacting flows

    NASA Technical Reports Server (NTRS)

    Gao, Feng

    1993-01-01

    The recent development of the dynamic subgrid-scale (SGS) model has provided a consistent method for generating localized turbulent mixing models and has opened up great possibilities for applying the large eddy simulation (LES) technique to real world problems. Given the fact that direct numerical simulation (DNS) cannot solve engineering flow problems in the foreseeable future (Reynolds 1989), LES is certainly an attractive alternative. It seems only natural to bring this new development in SGS modeling to bear on reacting flows. The major stumbling block for introducing LES to reacting flow problems has been the proper modeling of the reaction source terms. Various models have been proposed, but none of them has a wide range of applicability. For example, some of the models in combustion have been based on the flamelet assumption, which is only valid for relatively fast reactions. Some other models have neglected the effects of chemical reactions on the turbulent mixing time scale, which is certainly not valid for fast and non-isothermal reactions. The probability density function (PDF) method can be usefully employed to deal with the modeling of the reaction source terms. In order to fit into the framework of LES, a new PDF, the large eddy PDF (LEPDF), is introduced. This PDF provides an accurate representation for the filtered chemical source terms and can be readily calculated in the simulations. The details of this scheme are described.

  19. Extended lattice Boltzmann scheme for droplet combustion.

    PubMed

    Ashna, Mostafa; Rahimian, Mohammad Hassan; Fakhari, Abbas

    2017-05-01

    The available lattice Boltzmann (LB) models for combustion or phase change are focused on either single-phase flow combustion or two-phase flow with evaporation assuming a constant density for both liquid and gas phases. To pave the way towards simulation of spray combustion, we propose a two-phase LB method for modeling combustion of liquid fuel droplets. We develop an LB scheme to model phase change and combustion by taking into account the density variation in the gas phase and accounting for the chemical reaction based on the Cahn-Hilliard free-energy approach. Evaporation of liquid fuel is modeled by adding a source term to the continuity equation, which is required because the divergence of the velocity field is nontrivial. The low-Mach-number approximation in the governing Navier-Stokes and energy equations is used to incorporate source terms due to heat release from chemical reactions, density variation, and nonluminous radiative heat loss. Additionally, the conservation equation for chemical species is formulated by including a source term due to chemical reaction. To validate the model, we consider the combustion of n-heptane and n-butanol droplets in stagnant air using overall single-step reactions. The diameter history and flame standoff ratio obtained from the proposed LB method are found to be in good agreement with available numerical and experimental data. The present LB scheme is believed to be a promising approach for modeling spray combustion.

  20. Exact relations for energy transfer in self-gravitating isothermal turbulence

    NASA Astrophysics Data System (ADS)

    Banerjee, Supratik; Kritsuk, Alexei G.

    2017-11-01

    Self-gravitating isothermal supersonic turbulence is analyzed in the asymptotic limit of large Reynolds numbers. Based on the inviscid invariance of total energy, an exact relation is derived for homogeneous (not necessarily isotropic) turbulence. A modified definition for the two-point energy correlation functions is used to comply with the requirement of detailed energy equipartition in the acoustic limit. In contrast to the previous relations (S. Galtier and S. Banerjee, Phys. Rev. Lett. 107, 134501 (2011), 10.1103/PhysRevLett.107.134501; S. Banerjee and S. Galtier, Phys. Rev. E 87, 013019 (2013), 10.1103/PhysRevE.87.013019), the current exact relation shows that the pressure dilatation terms play practically no role in the energy cascade. Both the flux and source terms are written in terms of two-point differences. Sources enter the relation in a form of mixed second-order structure functions. Unlike the kinetic and thermodynamic potential energies, the gravitational contribution is absent from the flux term. An estimate shows that, for the isotropic case, the correlation between density and gravitational acceleration may play an important role in modifying the energy transfer in self-gravitating turbulence. The exact relation is also written in an alternative form in terms of two-point correlation functions, which is then used to describe scale-by-scale energy budget in spectral space.

  1. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 15 2010-04-01 2010-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  2. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 15 2011-04-01 2011-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  3. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 26 Internal Revenue 15 2012-04-01 2012-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  4. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 26 Internal Revenue 15 2014-04-01 2014-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  5. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 26 Internal Revenue 15 2013-04-01 2013-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  6. Radioisotope Power System Pool Concept

    NASA Technical Reports Server (NTRS)

    Rusick, Jeffrey J.; Bolotin, Gary S.

    2015-01-01

    Advanced Radioisotope Power Systems (RPS) for NASA deep space science missions have historically used static thermoelectric-based designs because they are highly reliable, and their radioisotope heat sources can be passively cooled throughout the mission life cycle. Recently, a significant effort to develop a dynamic RPS, the Advanced Stirling Radioisotope Generator (ASRG), was conducted by NASA and the Department of Energy, because Stirling based designs offer energy conversion efficiencies four times higher than heritage thermoelectric designs; and the efficiency would proportionately reduce the amount of radioisotope fuel needed for the same power output. However, the long term reliability of a Stirling based design is a concern compared to thermoelectric designs, because for certain Stirling system architectures the radioisotope heat sources must be actively cooled via the dynamic operation of Stirling converters throughout the mission life cycle. To address this reliability concern, a new dynamic Stirling cycle RPS architecture is proposed called the RPS Pool Concept.

  7. The base pairing RNA Spot 42 participates in a multi-output feedforward loop to help enact catabolite repression in Escherichia coli

    PubMed Central

    Beisel, Chase L.; Storz, Gisela

    2011-01-01

    Bacteria selectively consume some carbon sources over others through a regulatory mechanism termed catabolite repression. Here, we show that the base pairing RNA Spot 42 plays a broad role in catabolite repression in Escherichia coli by directly repressing genes involved in central and secondary metabolism, redox balancing, and the consumption of diverse non-preferred carbon sources. Many of the genes repressed by Spot 42 are transcriptionally activated by the global regulator CRP. Since CRP represses Spot 42, these regulators participate in a specific regulatory circuit called a multi-output feedforward loop. We found that this loop can reduce leaky expression of target genes in the presence of glucose and can maintain repression of target genes under changing nutrient conditions. Our results suggest that base pairing RNAs in feedforward loops can help shape the steady-state levels and dynamics of gene expression. PMID:21292161
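
    A minimal ODE caricature of this multi-output feedforward loop (our sketch; all rate constants are illustrative, not fitted values from the paper) helps show why the circuit suppresses leaky expression: when CRP is inactive (glucose present), the sRNA is derepressed and silences residual target transcription.

      import numpy as np
      from scipy.integrate import odeint

      def loop(y, t, crp):
          srna, target = y
          d_srna = 1.0 / (1.0 + 5.0 * crp) - 0.5 * srna      # CRP represses sRNA
          d_target = (2.0 * crp / (1.0 + crp)                # CRP activates target
                      - 0.3 * target - 1.5 * srna * target)  # sRNA represses target
          return [d_srna, d_target]

      t = np.linspace(0.0, 50.0, 200)
      no_glucose = odeint(loop, [1.0, 0.0], t, args=(1.0,))   # CRP active
      glucose = odeint(loop, [1.0, 0.0], t, args=(0.05,))     # CRP inactive
      print(no_glucose[-1], glucose[-1])  # target high vs. tightly repressed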

  8. Correlation between Ti source/drain contact and performance of InGaZnO-based thin film transistors

    NASA Astrophysics Data System (ADS)

    Choi, Kwang-Hyuk; Kim, Han-Ki

    2013-02-01

    Ti contact properties and their electrical contribution to an amorphous InGaZnO (a-IGZO) semiconductor-based thin film transistor (TFT) were investigated in terms of chemical, structural, and electrical considerations. TFT device parameters were quantitatively studied by a transmission line method. By comparing various a-IGZO TFT parameters with those of different Ag and Ti source/drain electrodes, Ti S/D contact with an a-IGZO channel was found to lead to a negative shift in VT (ΔVT = -0.52 V). This resulted in higher saturation mobility (8.48 cm2/Vs) of a-IGZO TFTs due to an effective interfacial reaction between Ti and the a-IGZO semiconducting layer. Based on transmission electron microscopy, x-ray photoelectron depth-profile analyses, and numerical calculation of TFT parameters, we suggest a possible Ti contact mechanism on semiconducting a-IGZO channel layers for TFTs.

  9. Infrared and visible image fusion method based on saliency detection in sparse domain

    NASA Astrophysics Data System (ADS)

    Liu, C. H.; Qi, Y.; Ding, W. R.

    2017-06-01

    Infrared and visible image fusion is a key problem in the field of multi-sensor image fusion. To better preserve the significant information of the infrared and visible images in the final fused image, the saliency maps of the source images are introduced into the fusion procedure. Firstly, under the framework of the joint sparse representation (JSR) model, the global and local saliency maps of the source images are obtained based on sparse coefficients. Then, a saliency detection model is proposed, which combines the global and local saliency maps to generate an integrated saliency map. Finally, a weighted fusion algorithm based on the integrated saliency map is developed to complete the fusion process. The experimental results show that our method is superior to the state-of-the-art methods in terms of several universal quality evaluation indexes, as well as in visual quality.
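
    The final fusion rule is simple enough to sketch. The snippet below assumes the integrated saliency maps have already been produced by the JSR-based detection model; the variable names and the stand-in saliency used in the demo are illustrative, not the authors' code.

      import numpy as np

      def weighted_fusion(ir, vis, sal_ir, sal_vis, eps=1e-8):
          """Pixel-wise weighted average driven by integrated saliency maps."""
          w_ir = sal_ir / (sal_ir + sal_vis + eps)
          return w_ir * ir + (1.0 - w_ir) * vis

      ir = np.random.rand(64, 64)
      vis = np.random.rand(64, 64)
      # Crude stand-in saliency: deviation from the mean intensity.
      fused = weighted_fusion(ir, vis,
                              np.abs(ir - ir.mean()),
                              np.abs(vis - vis.mean()))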

  10. Rational Autologous Cell Sources For Therapy of Heart Failure - Vehicles and Targets For Gene and RNA Therapies.

    PubMed

    Lampinen, Milla; Vento, Antti; Laurikka, Jari; Nystedt, Johanna; Mervaala, Eero; Harjula, Ari; Kankuri, Esko

    2016-01-01

    This review focuses on the possibilities for intraoperative processing and isolation of autologous cells, particularly atrial appendage-derived cells (AADCs) and cellular micrografts, and their straightforward use in cell transplantation for heart failure therapy. We review the potential of autologous tissues to serve as sources for cell therapy and consider especially those tissues that are used in surgery but from which the excess is currently discarded as surgical waste. We compare the culture-expanded cells to the freshly isolated ones in terms of evidence-based cost-efficacy and their usability as gene- and RNA-therapy vehicles. We also review how financial and authority-based decisions and restrictions sculpt the landscape for patients to participate in academic-based trials. Finally, we provide an insight into AADC isolation and processing for epicardial therapy during coronary artery bypass surgery.

  11. 12 CFR 201.4 - Availability and terms of credit.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... overnight, as a backup source of funding to a depository institution that is in generally sound financial... to a few weeks as a backup source of funding to a depository institution if, in the judgment of the... very short-term basis, usually overnight, as a backup source of funding to a depository institution...

  12. Broadband Phase Retrieval for Image-Based Wavefront Sensing

    NASA Technical Reports Server (NTRS)

    Dean, Bruce H.

    2007-01-01

    A focus-diverse phase-retrieval algorithm has been shown to perform adequately for the purpose of image-based wavefront sensing when (1) broadband light (typically spanning the visible spectrum) is used in forming the images by use of an optical system under test and (2) the assumption of monochromaticity is applied to the broadband image data. Heretofore, it had been assumed that in order to obtain adequate performance, it is necessary to use narrowband or monochromatic light. Some background information, including definitions of terms and a brief description of pertinent aspects of image-based phase retrieval, is prerequisite to a meaningful summary of the present development. Phase retrieval is a general term used in optics to denote estimation of optical imperfections or aberrations of an optical system under test. The term image-based wavefront sensing refers to a general class of algorithms that recover optical phase information, and phase-retrieval algorithms constitute a subset of this class. In phase retrieval, one utilizes the measured response of the optical system under test to produce a phase estimate. The optical response of the system is defined as the image of a point-source object, which could be a star or a laboratory point source. The phase-retrieval problem is characterized as image-based in the sense that a charge-coupled-device camera, preferably of scientific imaging quality, is used to collect image data where the optical system would normally form an image. In a variant of phase retrieval, denoted phase-diverse phase retrieval [which can include focus-diverse phase retrieval (in which various defocus planes are used)], an additional known aberration (or an equivalent diversity function) is superimposed as an aid in estimating unknown aberrations by use of an image-based wavefront-sensing algorithm. Image-based phase retrieval differs from other wavefront-sensing methods, such as interferometry, shearing interferometry, curvature wavefront sensing, and Shack-Hartmann sensing, all of which entail disadvantages in comparison with image-based methods. The main disadvantages of these non-image-based methods are complexity of test equipment and the need for a wavefront reference.
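
    The forward model underlying focus-diverse phase retrieval is compact: the image of a point source is the squared modulus of the Fourier transform of the pupil function, with a known defocus phase added per diversity image. The sketch below is the generic textbook model, not NASA's specific algorithm; an estimator would then search for the unknown phase that reproduces all diversity images simultaneously.

      import numpy as np

      def psf(pupil, phase, defocus):
          """Point-source image for an unknown phase plus known defocus."""
          field = pupil * np.exp(1j * (phase + defocus))
          return np.abs(np.fft.fftshift(np.fft.fft2(field)))**2

      n = 128
      y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
      r2 = x**2 + y**2
      pupil = (r2 <= 1.0).astype(float)
      defocus = 4.0 * r2 * pupil          # known quadratic diversity phase
      true_phase = 0.5 * x * pupil        # unknown aberration (here, tilt)
      data = [psf(pupil, true_phase, d) for d in (0.0 * defocus, defocus)]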

  13. Assessing the Gap Between Top-down and Bottom-up Measured Methane Emissions in Indianapolis, IN.

    NASA Astrophysics Data System (ADS)

    Prasad, K.; Lamb, B. K.; Cambaliza, M. O. L.; Shepson, P. B.; Stirm, B. H.; Salmon, O. E.; Lavoie, T. N.; Lauvaux, T.; Ferrara, T.; Howard, T.; Edburg, S. L.; Whetstone, J. R.

    2014-12-01

    Releases of methane (CH4) from the natural gas supply chain in the United States account for approximately 30% of the total US CH4 emissions. However, there continue to be significant questions regarding the accuracy of current emission inventories for methane emissions from natural gas usage. In this paper, we describe results from top-down and bottom-up measurements of methane emissions from the large isolated city of Indianapolis. The top-down results are based on aircraft mass balance and tower based inverse modeling methods, while the bottom-up results are based on direct component sampling at metering and regulating stations, surface enclosure measurements of surveyed pipeline leaks, and tracer/modeling methods for other urban sources. Mobile mapping of methane urban concentrations was also used to identify significant sources and to show an urban-wide low level enhancement of methane levels. The residual difference between top-down and bottom-up measured emissions is large and cannot be fully explained in terms of the uncertainties in top-down and bottom-up emission measurements and estimates. Thus, the residual appears to be, at least partly, attributed to a significant wide-spread diffusive source. Analyses are included to estimate the size and nature of this diffusive source.

  14. Fermi Large Area Telescope First Source Catalog

    DOE PAGES

    Abdo, A. A.; Ackermann, M.; Ajello, M.; ...

    2010-05-25

    Here, we present a catalog of high-energy gamma-ray sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi), during the first 11 months of the science phase of the mission, which began on 2008 August 4. The First Fermi-LAT catalog (1FGL) contains 1451 sources detected and characterized in the 100 MeV to 100 GeV range. Source detection was based on the average flux over the 11 month period, and the threshold likelihood Test Statistic is 25, corresponding to a significance of just over 4σ. The 1FGL catalog includes source location regions, defined in terms of elliptical fits to the 95% confidence regions and power-law spectral fits as well as flux measurements in five energy bands for each source. In addition, monthly light curves are provided. Using a protocol defined before launch we have tested for several populations of gamma-ray sources among the sources in the catalog. For individual LAT-detected sources we provide firm identifications or plausible associations with sources in other astronomical catalogs. Identifications are based on correlated variability with counterparts at other wavelengths, or on spin or orbital periodicity. For the catalogs and association criteria that we have selected, 630 of the sources are unassociated. In conclusion, care was taken to characterize the sensitivity of the results to the model of interstellar diffuse gamma-ray emission used to model the bright foreground, with the result that 161 sources at low Galactic latitudes and toward bright local interstellar clouds are flagged as having properties that are strongly dependent on the model or as potentially being due to incorrectly modeled structure in the Galactic diffuse emission.

  15. Inverse modeling of the Chernobyl source term using atmospheric concentration and deposition measurements

    NASA Astrophysics Data System (ADS)

    Evangeliou, Nikolaos; Hamburger, Thomas; Cozic, Anne; Balkanski, Yves; Stohl, Andreas

    2017-07-01

    This paper describes the results of an inverse modeling study for the determination of the source term of the radionuclides 134Cs, 137Cs and 131I released after the Chernobyl accident. The accident occurred on 26 April 1986 in the Former Soviet Union and released about 10^19 Bq of radioactive materials that were transported as far away as the USA and Japan. Thereafter, several attempts to assess the magnitude of the emissions were made that were based on the knowledge of the core inventory and the levels of the spent fuel. More recently, when modeling tools were further developed, inverse modeling techniques were applied to the Chernobyl case for source term quantification. However, because radioactivity is a sensitive topic for the public and attracts a lot of attention, high-quality measurements, which are essential for inverse modeling, were not made available except for a few sparse activity concentration measurements far from the source and far from the main direction of the radioactive fallout. For the first time, we apply Bayesian inversion of the Chernobyl source term using not only activity concentrations but also deposition measurements from the most recent public data set. These observations refer to a data rescue attempt that started more than 10 years ago, with a final goal to provide available measurements to anyone interested. With regard to our inverse modeling results, emissions of 134Cs were estimated to be 80 PBq or 30-50 % higher than what was previously published. From the released amount of 134Cs, about 70 PBq were deposited all over Europe. Similar to 134Cs, emissions of 137Cs were estimated as 86 PBq, on the same order as previously reported results. Finally, 131I emissions of 1365 PBq were found, which are about 10 % less than the prior total releases. The inversion pushes the injection heights of the three radionuclides to higher altitudes (up to about 3 km) than previously assumed (≈ 2.2 km) in order to better match both concentration and deposition observations over Europe. The results of the present inversion were confirmed using an independent Eulerian model, for which deposition patterns were also improved when using the estimated posterior releases. Although the independent model tends to underestimate deposition in countries that are not in the main direction of the plume, it reproduces country levels of deposition very efficiently. The results were also tested for robustness against different setups of the inversion through sensitivity runs. The source term data from this study are publicly available.
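
    The core inverse step can be sketched with a synthetic example (ours, not the authors' setup): given a source-receptor matrix H from an atmospheric transport model, the non-negative emission time series is recovered by weighted least squares under a positivity constraint.

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(0)
      n_obs, n_steps = 200, 24
      H = rng.random((n_obs, n_steps))       # transport sensitivities (synthetic)
      x_true = np.zeros(n_steps)
      x_true[6:10] = [5.0, 20.0, 10.0, 2.0]  # a release pulse
      sigma = 0.5                            # assumed observation-error scale
      y = H @ x_true + rng.normal(0.0, sigma, n_obs)

      # Weight both sides by the error scale, then solve with positivity.
      x_hat, _ = nnls(H / sigma, y / sigma)
      print(np.round(x_hat[4:12], 1))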

  16. The NASA Space Radiation Health Program

    NASA Technical Reports Server (NTRS)

    Schimmerling, W.; Sulzman, F. M.

    1994-01-01

    The NASA Space Radiation Health Program is a part of the Life Sciences Division in the Office of Space Science and Applications (OSSA). The goal of the Space Radiation Health Program is development of scientific bases for assuring adequate radiation protection in space. A proposed research program will determine long-term health risks from exposure to cosmic rays and other radiation. Ground-based animal models will be used to predict risk of exposures at varying levels from various sources and the safe levels for manned space flight.

  17. Certification of Public Librarians in the United States; A Detailed Summary of Legal and Voluntary Certification Plans for Public Librarians Based on Information Supplied by the Various Certificating State Agencies or other Appropriate Sources, 2nd Edition.

    ERIC Educational Resources Information Center

    Frame, Ruth R.; Coyne, John R.

    Contained in this report is a detailed summary of legal and voluntary certification plans for public librarians in each of the 50 states. Descriptions of the certification plans for public librarians are based on information supplied by state agencies in September 1971. Each plan is identified by the descriptive terms--mandatory, permissive or…

  18. Attenuation Tomography of Northern California and the Yellow Sea / Korean Peninsula from Coda-source Normalized and Direct Lg Amplitudes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, S R; Dreger, D S; Phillips, W S

    2008-07-16

    Inversions for regional attenuation (1/Q) of Lg are performed in two different regions. The path attenuation component of the Lg spectrum is isolated using the coda-source normalization method, which corrects the Lg spectral amplitude for the source using the stable, coda-derived source spectra. Tomographic images of Northern California agree well with one-dimensional (1-D) Lg Q estimated from five different methods. We note there is some tendency for tomographic smoothing to increase Q relative to targeted 1-D methods. For example, in the San Francisco Bay Area, which contains high attenuation relative to the rest of its region, Q is overestimated by approximately 30. Coda-source normalized attenuation tomography is also carried out for the Yellow Sea/Korean Peninsula (YSKP), where output parameters (site, source, and path terms) are compared with those from the amplitude tomography method of Phillips et al. (2005) as well as a new method that ties the source term to the MDAC formulation (Walter and Taylor, 2001). The source terms show similar scatter between the coda-source corrected and MDAC source perturbation methods, whereas the amplitude method has the greatest correlation with estimated true source magnitude. The coda-source better represents the source spectra compared to the estimated magnitude, which could be the cause of the scatter. The similarity in the source terms between the coda-source and MDAC-linked methods shows that the latter method may approximate the effect of the former, and therefore could be useful in regions without coda-derived sources. The site terms from the MDAC-linked method correlate slightly with global Vs30 measurements. While the coda-source and amplitude ratio methods do not correlate with Vs30 measurements, they do correlate with one another, which provides confidence that the two methods are consistent. The path 1/Q values are very similar between the coda-source and amplitude ratio methods except for small differences in the Daxing'anling Mountains, in the northern YSKP. However, there is one large difference between the MDAC-linked method and the others in the region near stations TJN and INCN, which points to site effect as the cause for the difference.

  19. Infrared and visible image fusion with spectral graph wavelet transform.

    PubMed

    Yan, Xiang; Qin, Hanlin; Li, Jia; Zhou, Huixin; Zong, Jing-guo

    2015-09-01

    Infrared and visible image fusion technique is a popular topic in image analysis because it can integrate complementary information and obtain reliable and accurate description of scenes. Multiscale transform theory as a signal representation method is widely used in image fusion. In this paper, a novel infrared and visible image fusion method is proposed based on spectral graph wavelet transform (SGWT) and bilateral filter. The main novelty of this study is that SGWT is used for image fusion. On the one hand, source images are decomposed by SGWT in its transform domain. The proposed approach not only effectively preserves the details of different source images, but also excellently represents the irregular areas of the source images. On the other hand, a novel weighted average method based on bilateral filter is proposed to fuse low- and high-frequency subbands by taking advantage of spatial consistency of natural images. Experimental results demonstrate that the proposed method outperforms seven recently proposed image fusion methods in terms of both visual effect and objective evaluation metrics.

  20. Reply by the Authors to C. K. W. Tam

    NASA Technical Reports Server (NTRS)

    Morris, Philip J.; Farassat, F.

    2002-01-01

    The prediction of noise generation and radiation by turbulence has been the subject of continuous research for over fifty years. The essential problem is how to model the noise sources when one's knowledge of the detailed space-time properties of the turbulence is limited. We attempted to provide a comparison of models based on acoustic analogies and recent alternative models. Our goal was to demonstrate that the predictive capabilities of any model are based on the choice of the turbulence property that is modeled as a source of noise. Our general definition of an acoustic analogy is a rearrangement of the equations of motion into the form L(u) = Q, where L is a linear operator that reduces to an acoustic propagation operator outside a region upsilon; u is a variable that reduces to acoustic pressure (or a related linear acoustic variable) outside upsilon; and Q is a source term that can be meaningfully estimated without knowing u and tends to zero outside upsilon.
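
    Lighthill's analogy is the classic instance of this rearrangement and makes the definition concrete (a standard result, quoted here for illustration):

      \[
        \underbrace{\left(\frac{\partial^{2}}{\partial t^{2}}
            - c_{0}^{2}\nabla^{2}\right)\rho'}_{L(u)}
        \;=\;
        \underbrace{\frac{\partial^{2} T_{ij}}{\partial x_{i}\,\partial x_{j}}}_{Q},
        \qquad
        T_{ij} = \rho u_{i} u_{j} + \bigl(p' - c_{0}^{2}\rho'\bigr)\delta_{ij}
                 - \tau_{ij}.
      \]

    Everything hinges on how the Lighthill stress tensor T_ij is modeled from limited turbulence statistics, which is precisely the choice of modeled source property at issue in the exchange above.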

  1. 10 CFR 40.41 - Terms and conditions of licenses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Terms and conditions of licenses. 40.41 Section 40.41 Energy NUCLEAR REGULATORY COMMISSION DOMESTIC LICENSING OF SOURCE MATERIAL Licenses § 40.41 Terms and... the regulations in this part shall confine his possession and use of source or byproduct material to...

  2. On epicardial potential reconstruction using regularization schemes with the L1-norm data term.

    PubMed

    Shou, Guofa; Xia, Ling; Liu, Feng; Jiang, Mingfeng; Crozier, Stuart

    2011-01-07

    The electrocardiographic (ECG) inverse problem is ill-posed and usually solved by regularization schemes. These regularization methods, such as the Tikhonov method, are often based on the L2-norm data and constraint terms. However, L2-norm-based methods inherently provide smoothed inverse solutions that are sensitive to measurement errors, and also lack the capability of localizing and distinguishing multiple proximal cardiac electrical sources. This paper presents alternative regularization schemes employing the L1-norm data term for the reconstruction of epicardial potentials (EPs) from measured body surface potentials (BSPs). During numerical implementation, the iteratively reweighted norm algorithm was applied to solve the L1-norm-related schemes, and measurement noises were considered in the BSP data. The proposed L1-norm data term-based regularization schemes (with L1 and L2 penalty terms of the normal derivative constraint (labelled as L1TV and L1L2)) were compared with the L2-norm data terms (Tikhonov with zero-order and normal derivative constraints, labelled as ZOT and FOT, and the total variation method labelled as L2TV). The studies demonstrated that, with averaged measurement noise, the inverse solutions provided by the L1L2 and FOT algorithms have smaller relative errors. However, when larger noise occurred in some electrodes (for example, signal loss during measurement), the L1TV and L1L2 methods can obtain more accurate EPs in a robust manner. Therefore the L1-norm data term-based solutions are generally less perturbed by measurement noise, suggesting that the new regularization scheme is promising for providing practical ECG inverse solutions.
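
    The iteratively reweighted norm idea reduces, in its simplest form, to reweighted least squares. The sketch below is our construction: A, b, and the regularization weight are synthetic, whereas the paper's actual operators are the body-surface-to-epicardial transfer matrix and a derivative constraint. It shows an L1 data term with an L2 penalty, in the spirit of the L1L2 scheme:

      import numpy as np

      def irls_l1_data(A, b, lam, n_iter=30, eps=1e-6):
          """Minimize ||A x - b||_1 + lam * ||x||_2^2 by IRLS."""
          x = np.linalg.lstsq(A, b, rcond=None)[0]          # L2 start
          for _ in range(n_iter):
              w = 1.0 / np.maximum(np.abs(b - A @ x), eps)  # residual weights
              WA = A * w[:, None]
              x = np.linalg.solve(A.T @ WA + lam * np.eye(A.shape[1]),
                                  WA.T @ b)
          return x

      rng = np.random.default_rng(1)
      A = rng.normal(size=(80, 20))
      x0 = rng.normal(size=20)
      b = A @ x0
      b[::9] += 50.0      # gross errors on a few "electrodes"
      print(np.linalg.norm(irls_l1_data(A, b, 1e-3) - x0))  # small despite outliers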

  3. The European seabass (Dicentrarchus labrax) innate immunity and gut health are modulated by dietary plant-protein inclusion and prebiotic supplementation.

    PubMed

    Azeredo, Rita; Machado, Marina; Kreuz, Eva; Wuertz, Sven; Oliva-Teles, Aires; Enes, Paula; Costas, Benjamín

    2017-01-01

    Inclusion of prebiotics in aqua feeds, though a costly strategy, has increased as a means to improve growth. Still, its effects on health improvement are not fully disclosed. Regarding their immunostimulatory properties, research has focused on carbohydrates such as fructooligosaccharides and xylooligosaccharides, demonstrating their modulatory effects on immune defences in higher vertebrates, but few studies have been done on their impact on fish immunity. Replacing fish meal (FM) by plant protein (PP) sources is a current practice in the aquaculture business, but their content in antinutrients is still a drawback in terms of gut function. This work evaluates the short-term effects (7 or 15 days of feeding the experimental diets) on juvenile European seabass (Dicentrarchus labrax) immune status of dietary (i) replacement of FM by PP sources and (ii) prebiotic supplementation. Six isoproteic (46%) and isolipidic (15%) diets were tested, including a FM control diet (FMCTRL), a PP control diet (PPCTRL, 30 FM:70 PP) and four other diets based on either FM or PP to which short-chain fructooligosaccharides (scFOS) or xylooligosaccharides (XOS) were added at 1% (FMFOS, PPFOS, FMXOS, PPXOS). The replacement of FM by PP in the diets induced nitric oxide (NO) and lysozyme production, while immunoglobulins (Ig), monocytes percentage and gut interleukin 10 (IL10) gene expression were inhibited. Dietary scFOS supplementation inhibited total bactericidal activity and neutrophils relative percentage regardless of protein source and increased plasma NO and thrombocytes percentage in fish fed FM-based diets, while monocytes percentage was increased in PPFOS-fed fish. XOS supplementation down-regulated immune gene expression in the gut while it partly enhanced systemic response. Inconsistency among results regarding FM replacement by PP-based ingredients exposes the need for further research considering both local and systemic responses. Distinct outcomes of prebiotic supplementation were highlighted, reflecting site-specific effects with no clear interaction with protein source. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Using the internet to understand angler behavior in the information age

    USGS Publications Warehouse

    Martin, Dustin R.; Pracheil, Brenda M.; DeBoer, Jason A.; Wilde, Gene R.; Pope, Kevin L.

    2012-01-01

    Declining participation in recreational angling is of great concern to fishery managers because fishing license sales are an important revenue source for protection of aquatic resources. This decline is frequently attributed, in part, to increased societal reliance on electronics. Internet use by anglers is increasing and fishery managers may use the Internet as a unique means to increase angler participation. We examined Internet search behavior using Google Insights for Search, a free online tool that summarizes Google searches from 2004 to 2011 to determine (1) trends in Internet search volume for general fishing related terms and (2) the relative usefulness of terms related to angler recruitment programs across the United States. Though search volume declined for general fishing terms (e.g., fishing, fishing guide), search volume increased for social media and recruitment terms (e.g., fishing forum, family fishing) over the 7-year period. We encourage coordinators of recruitment programs to capitalize on anglers’ Internet usage by considering Internet search patterns when creating web-based information. Careful selection of terms used in web-based information to match those currently searched by potential anglers may help to direct traffic to state agency websites that support recruitment efforts.

  5. Multisource inverse-geometry CT. Part I. System concept and development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Man, Bruno, E-mail: deman@ge.com; Harrison, Dan

    Purpose: This paper presents an overview of multisource inverse-geometry computed tomography (IGCT) as well as the development of a gantry-based research prototype system. The development of the distributed x-ray source is covered in a companion paper [V. B. Neculaes et al., “Multisource inverse-geometry CT. Part II. X-ray source design and prototype,” Med. Phys. 43, 4617–4627 (2016)]. While progress updates of this development have been presented at conferences and in journal papers, this paper is the first comprehensive overview of the multisource inverse-geometry CT concept and prototype. The authors also provide a review of all previous IGCT related publications. Methods: The authors designed and implemented a gantry-based 32-source IGCT scanner with 22 cm field-of-view, 16 cm z-coverage, 1 s rotation time, 1.09 × 1.024 mm detector cell size, as low as 0.4 × 0.8 mm focal spot size and 80–140 kVp x-ray source voltage. The system is built using commercially available CT components and a custom made distributed x-ray source. The authors developed dedicated controls, calibrations, and reconstruction algorithms and evaluated the system performance using phantoms and small animals. Results: The authors performed IGCT system experiments and demonstrated tube current up to 125 mA with up to 32 focal spots. The authors measured a spatial resolution of 13 lp/cm at 5% cutoff. The scatter-to-primary ratio is estimated 62% for a 32 cm water phantom at 140 kVp. The authors scanned several phantoms and small animals. The initial images have relatively high noise due to the low x-ray flux levels but minimal artifacts. Conclusions: IGCT has unique benefits in terms of dose-efficiency and cone-beam artifacts, but comes with challenges in terms of scattered radiation and x-ray flux limits. To the authors’ knowledge, their prototype is the first gantry-based IGCT scanner. The authors summarized the design and implementation of the scanner and the authors presented results with phantoms and small animals.

  6. Multisource inverse-geometry CT. Part I. System concept and development

    PubMed Central

    De Man, Bruno; Uribe, Jorge; Baek, Jongduk; Harrison, Dan; Yin, Zhye; Longtin, Randy; Roy, Jaydeep; Waters, Bill; Wilson, Colin; Short, Jonathan; Inzinna, Lou; Reynolds, Joseph; Neculaes, V. Bogdan; Frutschy, Kristopher; Senzig, Bob; Pelc, Norbert

    2016-01-01

    Purpose: This paper presents an overview of multisource inverse-geometry computed tomography (IGCT) as well as the development of a gantry-based research prototype system. The development of the distributed x-ray source is covered in a companion paper [V. B. Neculaes et al., “Multisource inverse-geometry CT. Part II. X-ray source design and prototype,” Med. Phys. 43, 4617–4627 (2016)]. While progress updates of this development have been presented at conferences and in journal papers, this paper is the first comprehensive overview of the multisource inverse-geometry CT concept and prototype. The authors also provide a review of all previous IGCT related publications. Methods: The authors designed and implemented a gantry-based 32-source IGCT scanner with 22 cm field-of-view, 16 cm z-coverage, 1 s rotation time, 1.09 × 1.024 mm detector cell size, as low as 0.4 × 0.8 mm focal spot size and 80–140 kVp x-ray source voltage. The system is built using commercially available CT components and a custom made distributed x-ray source. The authors developed dedicated controls, calibrations, and reconstruction algorithms and evaluated the system performance using phantoms and small animals. Results: The authors performed IGCT system experiments and demonstrated tube current up to 125 mA with up to 32 focal spots. The authors measured a spatial resolution of 13 lp/cm at 5% cutoff. The scatter-to-primary ratio is estimated 62% for a 32 cm water phantom at 140 kVp. The authors scanned several phantoms and small animals. The initial images have relatively high noise due to the low x-ray flux levels but minimal artifacts. Conclusions: IGCT has unique benefits in terms of dose-efficiency and cone-beam artifacts, but comes with challenges in terms of scattered radiation and x-ray flux limits. To the authors’ knowledge, their prototype is the first gantry-based IGCT scanner. The authors summarized the design and implementation of the scanner and the authors presented results with phantoms and small animals. PMID:27487877

  7. Evaluating the feasibility of biological waste processing for long term space missions.

    PubMed

    Garland, J L; Alazraki, M P; Atkinson, C F; Finger, B W

    1998-01-01

    Recycling waste products during orbital (e.g., International Space Station) and planetary missions (e.g., lunar base, Mars transit mission, Martian base) will reduce storage and resupply costs. Waste streams on the space station will include human hygiene water, urine, faeces, and trash. Longer term missions will contain human waste and inedible plant material from plant growth systems used for atmospheric regeneration, food production, and water recycling. The feasibility of biological and physical-chemical waste recycling is being investigated as part of the National Aeronautics and Space Administration's (NASA) Advanced Life Support (ALS) Program. In-vessel composting has lower manpower requirements, lower water and volume requirements, and greater potential for sanitization of human waste compared to alternative bioreactor designs such as continuously stirred tank reactors (CSTR). Residual solids from the process (i.e., compost) could be used as a biological air filter, a plant nutrient source, and a carbon sink. Potential in-vessel composting designs for both near- and long-term space missions are presented and discussed with respect to the unique aspects of space-based systems.

  8. Evaluating the feasibility of biological waste processing for long term space missions

    NASA Technical Reports Server (NTRS)

    Garland, J. L.; Alazraki, M. P.; Atkinson, C. F.; Finger, B. W.; Sager, J. C. (Principal Investigator)

    1998-01-01

    Recycling waste products during orbital (e.g., International Space Station) and planetary missions (e.g., lunar base, Mars transit mission, Martian base) will reduce storage and resupply costs. Waste streams on the space station will include human hygiene water, urine, faeces, and trash. Longer term missions will contain human waste and inedible plant material from plant growth systems used for atmospheric regeneration, food production, and water recycling. The feasibility of biological and physical-chemical waste recycling is being investigated as part of the National Aeronautics and Space Administration's (NASA) Advanced Life Support (ALS) Program. In-vessel composting has lower manpower requirements, lower water and volume requirements, and greater potential for sanitization of human waste compared to alternative bioreactor designs such as continuously stirred tank reactors (CSTR). Residual solids from the process (i.e., compost) could be used as a biological air filter, a plant nutrient source, and a carbon sink. Potential in-vessel composting designs for both near- and long-term space missions are presented and discussed with respect to the unique aspects of space-based systems.

  9. Estimation of the Cesium-137 Source Term from the Fukushima Daiichi Power Plant Using Air Concentration and Deposition Data

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne

    2013-04-01

    A major difficulty when inverting the source term of an atmospheric tracer dispersion problem is the estimation of the prior errors: those of the atmospheric transport model, those ascribed to the representativeness of the measurements, the instrumental errors, and those attached to the prior knowledge on the variables one seeks to retrieve. In the case of an accidental release of pollutant, and especially in a situation of sparse observability, the reconstructed source is sensitive to these assumptions. This sensitivity makes the quality of the retrieval dependent on the methods used to model and estimate the prior errors of the inverse modeling scheme. In Winiarek et al. (2012), we proposed to use an estimation method for the errors' amplitude based on the maximum likelihood principle. Under semi-Gaussian assumptions, it takes into account, without approximation, the positivity assumption on the source. We applied the method to the estimation of the Fukushima Daiichi cesium-137 and iodine-131 source terms using activity concentrations in the air. The results were compared to an L-curve estimation technique, and to Desroziers's scheme. In addition to the estimates of released activities, we provided related uncertainties (12 PBq with a std. of 15 - 20 % for cesium-137 and 190 - 380 PBq with a std. of 5 - 10 % for iodine-131). We also showed that, because of the low number of available observations (few hundreds) and even if orders of magnitude were consistent, the reconstructed activities significantly depended on the method used to estimate the prior errors. In order to use more data, we propose to extend the methods to the use of several data types, such as activity concentrations in the air and fallout measurements. The idea is to simultaneously estimate the prior errors related to each dataset, in order to fully exploit the information content of each one. Using the activity concentration measurements, but also daily fallout data from prefectures and cumulated deposition data over a region lying approximately 150 km around the nuclear power plant, we can use a few thousands of data in our inverse modeling algorithm to reconstruct the cesium-137 source term. To improve the parameterization of removal processes, rainfall fields have also been corrected using outputs from the mesoscale meteorological model WRF and ground station rainfall data. As expected, the different methods yield closer results as the number of data increases. Reference: Winiarek, V., M. Bocquet, O. Saunier, A. Mathieu (2012), Estimation of errors in the inverse modeling of accidental release of atmospheric pollutant: Application to the reconstruction of the cesium-137 and iodine-131 source terms from the Fukushima Daiichi power plant, J. Geophys. Res., 117, D05122, doi:10.1029/2011JD016932.
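
    The multi-dataset idea can be caricatured in a few lines (a crude stand-in for the full maximum-likelihood scheme; all matrices are synthetic): stack the air-concentration and deposition observations into one inverse problem and re-estimate one error scale per dataset from the residuals.

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(2)
      x_true = np.abs(rng.normal(0.0, 5.0, 16))
      H_air = rng.random((150, 16))
      H_dep = rng.random((400, 16))
      y_air = H_air @ x_true + rng.normal(0.0, 0.3, 150)
      y_dep = H_dep @ x_true + rng.normal(0.0, 2.0, 400)

      s_air = s_dep = 1.0                     # initial error scales
      for _ in range(5):
          H = np.vstack([H_air / s_air, H_dep / s_dep])
          y = np.concatenate([y_air / s_air, y_dep / s_dep])
          x, _ = nnls(H, y)
          s_air = np.std(y_air - H_air @ x)   # per-dataset re-estimation
          s_dep = np.std(y_dep - H_dep @ x)
      print(round(s_air, 2), round(s_dep, 2))  # recovers ~0.3 and ~2.0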

  10. [Interventions based on exercise and physical environment for preventing falls in cognitively impaired older people living in long-term care facilities: A systematic review and meta-analysis].

    PubMed

    González-Román, Loreto; Bagur-Calafat, Caritat; Urrútia-Cuchí, Gerard; Garrido-Pedrosa, Jèssica

    2016-01-01

    This systematic review aims to report the effectiveness of interventions based on exercise and/or physical environment for reducing falls in cognitively impaired older adults living in long-term care facilities. In July 2014, a literature search was conducted using main databases and specialised sources. Randomised controlled trials assessing the effectiveness of fall prevention interventions, which used exercise or physical environment among elderly people with cognitive impairment living in long-term care facilities, were selected. Two independent reviewers checked the eligibility of the studies, and evaluated their methodological quality. If it was adequate, data were gathered. Fourteen studies with 3,539 participants using exercise and/or physical environment by a single or combined approach were included. The data gathered from studies that used both interventions showed a significant reduction in fall rate. Further research is needed to demonstrate the effectiveness of those interventions for preventing falls in the elderly with cognitive impairment living in long-term care establishments. Copyright © 2015 SEGG. Published by Elsevier Espana. All rights reserved.

  11. Data collection and storage in long-term ecological and evolutionary studies: The Mongoose 2000 system.

    PubMed

    Marshall, Harry H; Griffiths, David J; Mwanguhya, Francis; Businge, Robert; Griffiths, Amber G F; Kyabulima, Solomon; Mwesige, Kenneth; Sanderson, Jennifer L; Thompson, Faye J; Vitikainen, Emma I K; Cant, Michael A

    2018-01-01

    Studying ecological and evolutionary processes in the natural world often requires research projects to follow multiple individuals in the wild over many years. These projects have provided significant advances but may also be hampered by needing to accurately and efficiently collect and store multiple streams of the data from multiple individuals concurrently. The increase in the availability and sophistication of portable computers (smartphones and tablets) and the applications that run on them has the potential to address many of these data collection and storage issues. In this paper we describe the challenges faced by one such long-term, individual-based research project: the Banded Mongoose Research Project in Uganda. We describe a system we have developed called Mongoose 2000 that utilises the potential of apps and portable computers to meet these challenges. We discuss the benefits and limitations of employing such a system in a long-term research project. The app and source code for the Mongoose 2000 system are freely available and we detail how it might be used to aid data collection and storage in other long-term individual-based projects.

  12. Quantitative and Qualitative Aspects of L1 (Swedish) and L2 (English) Idiom Comprehension

    ERIC Educational Resources Information Center

    Karlsson, Monica

    2013-01-01

    In the present investigation, 15 first term university students were faced with 80 context-based idioms in English (L2) and Swedish (L1) respectively, 30 of which were in the source domain of animals, commonly used in both languages, and asked to explain their meaning. The idioms were of varying frequency and transparency. Three main research…

  13. Setting Priorities for Graduate Medical Education,

    DTIC Science & Technology

    1996-02-01

    CRM 95-209 / February 1996. Setting Priorities for Graduate Medical Education, by Neil B. Carey, Marjorie D. Curia, and Oliver A. Smith. Subject terms: attrition, data bases, education, mathematical models, medical personnel, military medicine, naval…

  14. Copper complexes as a source of redox active MRI contrast agents.

    PubMed

    Dunbar, Lynsey; Sowden, Rebecca J; Trotter, Katherine D; Taylor, Michelle K; Smith, David; Kennedy, Alan R; Reglinski, John; Spickett, Corinne M

    2015-10-01

    The study reports an advance in designing copper-based redox-sensing MRI contrast agents. Although the data demonstrate that copper(II) complexes are not able to compete with lanthanoid species in terms of contrast, the redox-dependent switch between diamagnetic copper(I) and paramagnetic copper(II) yields a novel redox-sensitive contrast moiety with potential for reversibility.

  15. Youth Illicit Drug Use Prevention: DARE Long-Term Evaluations and Federal Efforts To Identify Effective Programs.

    ERIC Educational Resources Information Center

    Kanof, Marjorie E.

    The most widely used school-based substance abuse prevention program in the United States is the Drug Abuse Resistance Education (DARE) program, which is funded by a variety of sources, including private, federal, and other public entities. DARE's primary mission is to provide children with the information and skills they need to live drug- and…

  16. Surveillance system for air pollutants by combination of the decision support system COMPAS and optical remote sensing systems

    NASA Astrophysics Data System (ADS)

    Flassak, Thomas; de Witt, Helmut; Hahnfeld, Peter; Knaup, Andreas; Kramer, Lothar

    1995-09-01

    COMPAS is a decision support system designed to assist in assessing the consequences of accidental releases of toxic and flammable substances. One of its key elements is a feedback algorithm that estimates the source term from concentration measurements. Until now, the feedback technique has been applied to concentration measurements made with test tubes or conventional point sensors. In this paper an extension of the method is presented: the combination of COMPAS with an optical remote sensing system such as the KAYSER-THREDE K300 FTIR system. Active remote sensing methods based on FTIR are, among other applications, ideal for so-called fence-line monitoring of diffuse emissions and accidental releases from industrial facilities, since averaged concentration levels along the measurement path can be obtained from the FTIR spectra. These line-averaged concentrations are ideally suited as on-line input for the COMPAS feedback technique. Uncertainties in the assessment of the source term, associated both with shortcomings of the dispersion model itself and with the problems of a feedback strategy based on point measurements, are thereby reduced.
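
    The feedback idea can be made concrete with a small sketch. The code below is not the COMPAS implementation (whose dispersion model and algorithm are not given in the abstract); it assumes a crude Gaussian-plume kernel with linearly growing widths and exploits the fact that, for a fixed dispersion model, concentration is linear in the emission rate, so a least-squares fit over the line-averaged FTIR measurements recovers the source strength. All names and parameters are illustrative.

    import numpy as np

    def gaussian_plume(q, x, y, z, u, h=0.0, ay=0.08, az=0.06):
        """Crude ground-release Gaussian plume used as a stand-in dispersion
        model: widths grow linearly with downwind distance x (> 0)."""
        sy, sz = ay * x, az * x
        return (q / (2.0 * np.pi * u * sy * sz)
                * np.exp(-0.5 * (y / sy) ** 2)
                * (np.exp(-0.5 * ((z - h) / sz) ** 2)
                   + np.exp(-0.5 * ((z + h) / sz) ** 2)))

    def estimate_source_strength(paths, c_line, u):
        """Least-squares emission rate from line-averaged concentrations.
        paths: one (n_points, 3) array of x/y/z samples per optical path;
        c_line: the corresponding path-averaged measurements."""
        # Concentration is linear in q, so each path contributes one row of
        # a one-column design matrix: c_line[i] = a[i] * q.
        a = np.array([gaussian_plume(1.0, p[:, 0], p[:, 1], p[:, 2], u).mean()
                      for p in paths])
        q_hat, *_ = np.linalg.lstsq(a[:, None], np.asarray(c_line), rcond=None)
        return float(q_hat[0])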

  17. WebGIVI: a web-based gene enrichment analysis and visualization tool.

    PubMed

    Sun, Liang; Zhu, Yongnan; Mahmood, A S M Ashique; Tudor, Catalina O; Ren, Jia; Vijay-Shanker, K; Chen, Jian; Schmidt, Carl J

    2017-05-04

    A major challenge of high-throughput transcriptome studies is presenting the data to researchers in an interpretable format. In many cases, the outputs of such studies are gene lists, which are then examined for enriched biological concepts. One approach to help the researcher interpret large gene datasets is to associate genes with informative terms (iTerms) obtained from the biomedical literature using the eGIFT text-mining system. However, examining large lists of iTerm and gene pairs is a daunting task. We have developed WebGIVI, an interactive web-based visualization tool ( http://raven.anr.udel.edu/webgivi/ ) to explore gene:iTerm pairs. WebGIVI was built with the Cytoscape and Data-Driven Documents (D3) JavaScript libraries and can be used to relate genes to iTerms and then visualize gene and iTerm pairs. WebGIVI accepts a gene list that is used to retrieve the gene symbols and the corresponding iTerm list. This list can be submitted to visualize the gene:iTerm pairs using two distinct methods: a Concept Map or a Cytoscape Network Map. In addition, WebGIVI supports uploading and visualization of any two-column tab-separated data. WebGIVI provides an interactive and integrated network graph of genes and iTerms that allows filtering, sorting, and grouping, which can aid biologists in developing hypotheses based on the input gene lists. WebGIVI can also visualize hundreds of nodes and generate a high-resolution image, which is important for most research publications. The source code can be freely downloaded at https://github.com/sunliang3361/WebGIVI . The WebGIVI tutorial is available at http://raven.anr.udel.edu/webgivi/tutorial.php .
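
    The two-column upload format lends itself to a compact sketch. The snippet below is not WebGIVI's code; it merely illustrates, under assumed function names, reading gene/iTerm pairs from a tab-separated file and ranking iTerms by how many input genes they annotate, the kind of grouping the network view then makes explorable.

    import csv
    from collections import Counter, defaultdict

    def load_gene_iterm_pairs(path):
        """Read a two-column tab-separated file of gene<TAB>iTerm pairs and
        index the set of iTerms associated with each gene."""
        gene_to_iterms = defaultdict(set)
        with open(path, newline="") as fh:
            for row in csv.reader(fh, delimiter="\t"):
                if len(row) == 2:
                    gene_to_iterms[row[0].strip()].add(row[1].strip())
        return gene_to_iterms

    def rank_iterms(gene_to_iterms):
        """Sort iTerms by the number of input genes they annotate."""
        counts = Counter(t for terms in gene_to_iterms.values() for t in terms)
        return counts.most_common()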

  18. Enhancing GADRAS Source Term Inputs for Creation of Synthetic Spectra.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horne, Steven M.; Harding, Lee

    The Gamma Detector Response and Analysis Software (GADRAS) team has enhanced the source term input for the creation of synthetic spectra. These enhancements include the following: allowing users to programmatically provide source information to GADRAS through memory, rather than through a string limited to 256 characters; allowing users to provide their own source decay database information; and updating the default GADRAS decay database to fix errors and include coincident gamma information.

  19. A theoretical prediction of the acoustic pressure generated by turbulence-flame front interactions

    NASA Technical Reports Server (NTRS)

    Huff, R. G.

    1984-01-01

    The equations of momentum and continuity are combined and linearized, yielding the one-dimensional nonhomogeneous acoustic wave equation. Three terms in the nonhomogeneous equation act as acoustic sources and are taken to be forcing functions acting on the homogeneous wave equation. The three source terms are: fluctuating entropy, turbulence gradients, and turbulence-flame interactions. Each source term is discussed. The turbulence-flame interaction source is used as the basis for computing the source acoustic pressure from the Fourier-transformed wave equation. Pressure fluctuations created in turbopump gas generators and turbines may act as a forcing function for turbine and propellant tube vibrations in Earth-to-orbit space propulsion systems and could reduce their life expectancy. A preliminary assessment of the acoustic pressure fluctuations in such systems is presented.
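
    Schematically, and with the detailed source expressions left to the report, the equation described above takes the form of a forced one-dimensional wave equation; the symbols for the three source terms below are placeholders for the full expressions:

    \[
      \frac{\partial^2 p'}{\partial t^2} - c^2 \frac{\partial^2 p'}{\partial x^2}
        = S_{\mathrm{entropy}} + S_{\mathrm{turbulence}} + S_{\mathrm{flame}},
    \]

    where p' is the acoustic pressure fluctuation and c the speed of sound; Fourier transforming in time yields the Helmholtz-type equation from which the source acoustic pressure is computed.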

  1. An evaluation of the social dimensions in public participation in rural domestic waste source-separated collection in Guilin, China.

    PubMed

    Ma, Jing; Hipel, Keith W; Hanson, Mark L

    2017-12-21

    A comprehensive evaluation of public participation in rural domestic waste (RDW) source-separated collection in China was carried out within a social-dimension framework, specifically in terms of public perception, awareness, attitude, and willingness to pay for RDW management. The evaluation was based on a case study conducted in Guilin, Guangxi Zhuang Autonomous Region, China, which is representative of most inland areas of the country, with a GDP around the national average. It was found that, unlike urban residents, rural residents maintained a high rate of recycling, but in a spontaneous manner; they paid more attention to issues closely related to their daily lives, but less attention to those at the general level; and their awareness of RDW source-separated collection was low, with different age groups showing significantly different preferences regarding sources of knowledge acquisition. Among potential information sources, village committees played a very important role in knowledge dissemination; for the respondents' pro-environmental attitudes, the influencing factor "lack of legislation/policy" was significant; mandatory charges for waste collection and disposal had a high rate of acceptance among rural residents; and high monthly incomes were positively correlated with both pro-environmental attitudes and willingness to pay extra charges levied for RDW management. These observations imply that, for decision-makers, implementing mandatory RDW source-separated collection programs with enforced guidelines and economic compensation is more effective in the short term, while promoting pro-environmental education for rural residents is more important in the long run.

  2. A web-based screening tool for near-port air quality assessments

    PubMed Central

    Isakov, Vlad; Barzyk, Timothy M.; Smith, Elizabeth R.; Arunachalam, Saravanan; Naess, Brian; Venkatram, Akula

    2018-01-01

    The Community model for near-PORT applications (C-PORT) is a screening tool intended to calculate differences in annual averaged concentration patterns and the relative contributions of various source categories over a spatial domain within about 10 km of a port. C-PORT can inform decision-makers and concerned citizens about local air quality due to mobile-source emissions related to commercial port activities. It allows users to visualize and evaluate different planning scenarios, helping them identify the best alternatives for making long-term decisions that protect community health and sustainability. The web-based, easy-to-use interface currently includes data from 21 seaports, primarily in the southeastern U.S., and has a map-based interface built on Google Maps. The tool was developed to visualize and assess changes in air quality due to changes in emissions and/or meteorology in order to analyze development scenarios; it is not intended to support or replace any regulatory models or programs. PMID:29681760

  3. Contributions of wildland fire to terrestrial ecosystem carbon dynamics in North America from 1990 to 2012

    USGS Publications Warehouse

    Chen, Guangsheng; Hayes, Daniel J.; McGuire, A. David

    2017-01-01

    Burned area and the frequency of extreme fire events have been increasing during recent decades in North America, and this trend is expected to continue over the 21st century. While many aspects of the North American carbon budget have been intensively studied, the net contribution of fire disturbance to the overall net carbon flux at the continental scale remains uncertain. Based on national-scale, spatially explicit, long-term fire data, along with improved model parameterization in a process-based ecosystem model, we simulated the impact of fire disturbance on both direct carbon emissions and the net terrestrial ecosystem carbon balance in North America. Fire-caused direct carbon emissions were 106.55 ± 15.98 Tg C/yr during 1990–2012; however, the net ecosystem carbon balance associated with fire was −26.09 ± 5.22 Tg C/yr, indicating that most of the emitted carbon was resequestered by the terrestrial ecosystem. Direct carbon emissions increased in Alaska and Canada during 1990–2012 compared with prior periods due to more extreme fire events, resulting in a large carbon source from these two regions. Among biomes, the largest carbon source was the boreal forest, primarily due to large reductions in soil organic matter during fire events and slow recovery afterward. The interactions between fire and environmental factors reduced the fire-caused ecosystem carbon source. Fire disturbance caused only a weak carbon source compared with the best-estimate terrestrial carbon sink in North America, owing to the long-term legacy effects of historical burned area coupled with fast ecosystem recovery during 1990–2012.

  4. A primer on theory-driven web scraping: Automatic extraction of big data from the Internet for use in psychological research.

    PubMed

    Landers, Richard N; Brusso, Robert C; Cavanaugh, Katelyn J; Collmus, Andrew B

    2016-12-01

    The term big data encompasses a wide range of approaches for collecting and analyzing data in ways that were not possible before the era of modern personal computing. One approach to big data of great potential to psychologists is web scraping, which involves the automated collection of information from webpages. Although web scraping can create massive big datasets with tens of thousands of variables, it can also be used to create modestly sized, more manageable datasets with tens of variables but hundreds of thousands of cases, well within the skillset of most psychologists to analyze, in a matter of hours. In this article, we demystify web scraping methods as currently used to examine research questions of interest to psychologists. First, we introduce an approach called theory-driven web scraping, in which the choice to use web-based big data must follow substantive theory. Second, we introduce data source theories, a term used to describe the assumptions a researcher must make about a prospective big data source in order to meaningfully scrape data from it. Critically, researchers must derive specific hypotheses to be tested based upon their data source theory, and if these hypotheses are not empirically supported, plans to use that data source should be changed or eliminated. Third, we provide a case study and sample code in Python demonstrating how web scraping can be conducted to collect big data, along with links to a web tutorial designed for psychologists. Fourth, we describe a 4-step process to be followed in web scraping projects. Fifth and finally, we discuss legal, practical, and ethical concerns faced when conducting web scraping projects. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
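
    The article's own Python examples are not reproduced here; the sketch below is a generic illustration of the scraping step using the widely used requests and BeautifulSoup libraries. The URL and CSS selector are placeholders: under the theory-driven approach, the choice of data source and of which page elements to extract would have to be justified by a data source theory.

    import requests
    from bs4 import BeautifulSoup

    def scrape_text(url, selector):
        """Fetch one page and return the text of every element matching a
        CSS selector; check the site's terms and robots.txt before use."""
        resp = requests.get(url, headers={"User-Agent": "research-bot/0.1"},
                            timeout=30)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        return [el.get_text(strip=True) for el in soup.select(selector)]

    # Hypothetical usage:
    # titles = scrape_text("https://example.org/forum", "h2.title")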

  5. Fundamental Rotorcraft Acoustic Modeling From Experiments (FRAME)

    NASA Technical Reports Server (NTRS)

    Greenwood, Eric

    2011-01-01

    A new methodology is developed for constructing helicopter source noise models for use in mission planning tools from experimental measurements of helicopter external noise radiation. The models are constructed by applying a parameter identification method to an assumed analytical model of the rotor harmonic noise sources. This new method allows individual rotor harmonic noise sources to be identified and characterized in terms of their individual non-dimensional governing parameters. The method is applied to both wind tunnel measurements and ground noise measurements of two-bladed rotors. The method is shown to match the parametric trends of main rotor harmonic noise, allowing accurate estimates of the dominant rotorcraft noise sources to be made across operating conditions from a small number of measurements. The ability of this method to estimate changes in noise radiation due to changes in ambient conditions is also demonstrated.
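
    As a hedged illustration of the parameter identification step (the actual FRAME source models are more elaborate and are not reproduced here), the sketch below fits the coefficients of an assumed analytical noise model to a handful of measurements taken at different operating conditions; the model form, governing parameters, and all numbers are placeholders.

    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical stand-in source model: harmonic source level as a function
    # of non-dimensional hover tip Mach number and advance ratio.
    def source_model(X, a0, a1, a2):
        m_tip, mu = X
        return a0 + a1 * m_tip ** 2 + a2 * mu

    m_tip = np.array([0.55, 0.60, 0.65, 0.70])    # tip Mach number
    mu    = np.array([0.10, 0.15, 0.20, 0.25])    # advance ratio
    spl   = np.array([92.1, 95.4, 98.8, 102.3])   # illustrative levels, dB

    params, _ = curve_fit(source_model, (m_tip, mu), spl)
    print("identified model coefficients:", params)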

  6. The evolution of methods for noise prediction of high speed rotors and propellers in the time domain

    NASA Technical Reports Server (NTRS)

    Farassat, F.

    1986-01-01

    Linear wave equation models that have been used over the years at NASA Langley for describing noise emissions from high-speed rotating blades are summarized. The noise sources are assumed to lie on a moving surface, and analysis of the situation has been based on the Ffowcs Williams-Hawkings (FW-H) equation. Although the equation accounts for two surface sources and one volume source, the NASA analyses have considered only the surface terms. Several variations on the FW-H model are delineated for various types of applications, noting the computational benefits of removing the frequency dependence of the calculations. Formulations are also provided for compact and noncompact sources, and features of Long's subsonic integral equation and Farassat's high-speed integral equation are discussed. The selection of subsonic or high-speed models depends on the Mach number of the blade surface where the source is located.
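
    For reference, the FW-H equation in its common differential form, showing the two surface sources (thickness and loading) and the one volume (quadrupole) source mentioned above:

    \[
      \Box^2 p' = \frac{\partial}{\partial t}\big[\rho_0 v_n\,\delta(f)\big]
                - \frac{\partial}{\partial x_i}\big[\ell_i\,\delta(f)\big]
                + \frac{\partial^2}{\partial x_i\,\partial x_j}\big[T_{ij}\,H(f)\big],
    \]

    where \Box^2 = (1/c^2)\,\partial^2/\partial t^2 - \nabla^2 is the wave operator, f = 0 defines the moving blade surface, v_n is the local normal velocity of the surface, \ell_i the local force per unit area exerted on the fluid, T_{ij} the Lighthill stress tensor, and \delta and H the Dirac delta and Heaviside functions; the NASA analyses discussed here retain only the two surface terms.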

  7. Lineal energy calibration of mini tissue-equivalent gas-proportional counters (TEPC)

    NASA Astrophysics Data System (ADS)

    Conte, V.; Moro, D.; Grosswendt, B.; Colautti, P.

    2013-07-01

    Mini TEPCs are cylindrical gas proportional counters with a sensitive-volume diameter of 1 mm or less. The lineal energy calibration of these tiny counters can be performed with an external gamma-ray source. To do so, however, a method must first be found to obtain a simple and precise spectral mark, and then the keV/μm value of that mark. A precise method (less than 1% uncertainty) to identify this mark is described here, and the lineal energy value of the mark has been measured for different simulated site sizes using a 137Cs gamma source and a cylindrical TEPC equipped with a precision internal 244Cm alpha-particle source and filled with a propane-based tissue-equivalent gas mixture. Mini TEPCs can thus be calibrated in terms of lineal energy, by exposing them to 137Cs sources, with an overall uncertainty of about 5%.
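
    The calibration arithmetic itself is simple. A minimal sketch, assuming a proportional (zero-offset) detector response so that the single identified mark fixes the keV/μm-per-channel factor; names are illustrative:

    def calibrate_lineal_energy(channels, ch_mark, y_mark_kev_um):
        """Linear lineal-energy calibration from one spectral mark: channel
        ch_mark is assigned the measured y_mark (keV/um) and the axis is
        scaled proportionally, assuming zero offset."""
        k = y_mark_kev_um / ch_mark          # keV/um per channel
        return [k * ch for ch in channels]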

  8. Uncertainty principles for inverse source problems for electromagnetic and elastic waves

    NASA Astrophysics Data System (ADS)

    Griesmaier, Roland; Sylvester, John

    2018-06-01

    In isotropic homogeneous media, far fields of time-harmonic electromagnetic waves radiated by compactly supported volume currents, and elastic waves radiated by compactly supported body force densities can be modelled in very similar fashions. Both are projected restricted Fourier transforms of vector-valued source terms. In this work we generalize two types of uncertainty principles recently developed for far fields of scalar-valued time-harmonic waves in Griesmaier and Sylvester (2017 SIAM J. Appl. Math. 77 154–80) to this vector-valued setting. These uncertainty principles yield stability criteria and algorithms for splitting far fields radiated by collections of well-separated sources into the far fields radiated by individual source components, and for the restoration of missing data segments. We discuss proper regularization strategies for these inverse problems, provide stability estimates based on the new uncertainty principles, and comment on reconstruction schemes. A numerical example illustrates our theoretical findings.

  9. Design and characterization of electron beam focusing for X-ray generation in novel medical imaging architecture

    PubMed Central

    Bogdan Neculaes, V.; Zou, Yun; Zavodszky, Peter; Inzinna, Louis; Zhang, Xi; Conway, Kenneth; Caiafa, Antonio; Frutschy, Kristopher; Waters, William; De Man, Bruno

    2014-01-01

    A novel electron beam focusing scheme for medical X-ray sources is described in this paper. Most vacuum-based medical X-ray sources today employ a tungsten filament operated in a temperature-limited regime, with electrostatic focusing tabs for limited-range beam optics. This paper presents the electron beam optics designed for the first distributed X-ray source in the world for Computed Tomography (CT) applications. The distributed source comprises 32 electron beamlets in a common vacuum chamber, with 32 circular dispenser cathodes operated in a space-charge-limited regime, where each initially circular beam is transformed into an elliptical beam before being collected at the anode. The electron beam optics designed and validated here are at the heart of the first Inverse Geometry CT system, with potential benefits in terms of improved image quality and dramatic X-ray dose reduction for the patient. PMID:24826066

  10. Sensitivity of WRF-chem predictions to dust source function specification in West Asia

    NASA Astrophysics Data System (ADS)

    Nabavi, Seyed Omid; Haimberger, Leopold; Samimi, Cyrus

    2017-02-01

    Dust storms tend to form in sparsely populated areas covered by only a few observations. Dust source maps, known as source functions, are used in dust models to allocate a certain potential of dust release to each place. Recent research showed that the well-known Ginoux source function (GSF), currently used in the Weather Research and Forecasting model coupled with Chemistry (WRF-chem), exhibits large errors over some regions in West Asia, particularly near the Iraqi/Syrian border. This study aims to improve the specification of this critical part of dust forecasts. A new source function based on a multi-year analysis of satellite observations, called the West Asia source function (WASF), is therefore proposed to raise the quality of WRF-chem predictions in the region. WASF has been implemented in three dust schemes of WRF-chem. Remotely sensed and ground-based observations have been used to verify the horizontal and vertical extent and location of the simulated dust clouds. Results indicate that WRF-chem performance is significantly improved in many areas after the implementation of WASF. The modified runs (long-term simulations over the summers 2008-2012, using nudging) yielded an average increase in the Spearman correlation between observed and forecast aerosol optical thickness of 12-16 percentage points compared to control runs with standard source functions. They even outperform MACC and DREAM dust simulations over many dust source regions. However, the quality of the forecasts decreased with distance from the sources, probably due to deficiencies in the transport and deposition characteristics of the forecast model in these areas.

  11. Poster — Thur Eve — 40: Automated Quality Assurance for Remote-Afterloading High Dose Rate Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Anthony; Ravi, Ananth

    2014-08-15

    High dose rate (HDR) remote-afterloading brachytherapy involves sending a small, high-activity radioactive source attached to a cable to different positions within a hollow applicator implanted in the patient. It is critical that the source position within the applicator and the dwell time of the source are accurate. Daily quality assurance (QA) tests of positional and dwell-time accuracy are essential to ensure that the accuracy of the remote afterloader is not compromised prior to patient treatment. Our centre has developed an automated, video-based QA system for HDR brachytherapy that is dramatically superior to existing diode or film QA solutions in terms of cost, objectivity, and positional accuracy, with additional functionality such as the ability to determine the dwell time and transit time of the source. In our system, a video is taken of the brachytherapy source as it is sent out through a position check ruler, with the source visible through a clear window. Using a proprietary image analysis algorithm, the source position is determined with respect to time as it moves to different positions along the check ruler. The total material cost of the video-based system was under $20, consisting of a commercial webcam and an adjustable stand. The accuracy of the position measurement is ±0.2 mm, and the time resolution is 30 msec. Additionally, our system is capable of robustly verifying the source transit time and velocity (a test required by the AAPM and CPQR recommendations), which is currently difficult to perform accurately.
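
    The proprietary image-analysis algorithm is not described in the abstract; the sketch below is one plausible stand-in using OpenCV, reducing each frame to the centroid of the bright source window and converting pixels to millimetres. Dwell positions, dwell times, and transit velocity can then be derived from the resulting position-versus-time track.

    import cv2

    def track_source_positions(video_path, mm_per_px, threshold=200):
        """Return (time_s, position_mm) samples: per frame, threshold the
        grayscale image and take the horizontal centroid of the bright
        region as the apparent source position along the check ruler."""
        cap = cv2.VideoCapture(video_path)
        fps = cap.get(cv2.CAP_PROP_FPS)
        track, i = [], 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
            m = cv2.moments(mask)
            if m["m00"] > 0:
                track.append((i / fps, (m["m10"] / m["m00"]) * mm_per_px))
            i += 1
        cap.release()
        return track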

  12. Auditing the multiply-related concepts within the UMLS

    PubMed Central

    Mougin, Fleur; Grabar, Natalia

    2014-01-01

    Objective This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. Methods We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. Results At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Discussion Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. PMID:24464853
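
    As a simplified illustration of this kind of audit (the paper's actual compatibility criteria are richer), the sketch below groups relation labels by concept pair and flags pairs that carry both a broader (RB) and a narrower (RN) relation, one crude notion of a contradictory combination; the input tuple format is an assumption.

    from collections import defaultdict

    def find_contradictory_pairs(relations):
        """relations: iterable of (cui1, cui2, rel_label, source_vocab)
        tuples; returns the relation sets of pairs linked by both RB and RN."""
        by_pair = defaultdict(set)
        for cui1, cui2, rel, _source in relations:
            by_pair[tuple(sorted((cui1, cui2)))].add(rel)
        return {pair: rels for pair, rels in by_pair.items()
                if {"RB", "RN"} <= rels}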

  13. Hybrid Energy System Design of Micro Hydro-PV-biogas Based Micro-grid

    NASA Astrophysics Data System (ADS)

    Nishrina; Abdullah, A. G.; Risdiyanto, A.; Nandiyanto, ABD

    2017-03-01

    A hybrid renewable energy system is an arrangement of one or more sources of renewable energy, possibly combined with conventional energy. This paper describes simulation results for a hybrid renewable power system based on the potential available at an educational institution in Indonesia. HOMER software was used to simulate and analyse the system in terms of both optimization and economics. The software is built on three main principles: simulation, optimization, and sensitivity analysis. Overall, the presented results show that the software can identify a feasible hybrid power system that could be realized. The entire demand in the case study area can be supplied by the system configuration using three-quarters of the electricity produced, leaving one-quarter of the generated energy as excess electricity.

  14. An empirical model for prediction of geomagnetic storms using initially observed CME parameters at the Sun

    NASA Astrophysics Data System (ADS)

    Kim, R.-S.; Cho, K.-S.; Moon, Y.-J.; Dryer, M.; Lee, J.; Yi, Y.; Kim, K.-H.; Wang, H.; Park, Y.-D.; Kim, Yong Ha

    2010-12-01

    In this study, we discuss the general behavior of geomagnetic storm strength in relation to observed parameters of coronal mass ejections (CMEs), namely the speed (V) and earthward direction (D) of the CME, as well as the longitude (L) and magnetic field orientation (M) of the overlying potential field of the CME source region, and we develop an empirical model to predict geomagnetic storm occurrence and strength (gauged by the Dst index) in terms of these CME parameters. For this we selected 66 halo or partial-halo CMEs associated with M-class and X-class solar flares, with clearly identifiable source regions, from 1997 to 2003. After examining how each of these CME parameters correlates with the geoeffectiveness of the CMEs, we find the following: (1) parameter D correlates best with storm strength Dst; (2) the majority of geoeffective CMEs originated near solar longitude 15°W, and CMEs originating away from this longitude tend to produce weaker storms; (3) correlations between Dst and the CME parameters improve if the CMEs are separated into two groups depending on whether the magnetic fields in their source regions are oriented southward or northward. Based on these observations, we present two empirical expressions for Dst in terms of L, V, and D, one for each group of CMEs. This is a new attempt to predict not only the occurrence of geomagnetic storms, but also the storm strength (Dst), solely from CME parameters.
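
    The paper's two fitted expressions are not reproduced here. Purely as an illustration of the approach, the sketch below fits a linear model for Dst in terms of V, D, and distance from the favored 15°W longitude for one group of CMEs; the data and resulting coefficients are placeholders, not the paper's values.

    import numpy as np

    V   = np.array([900., 1200., 1500., 1800.])   # CME speed, km/s
    D   = np.array([0.60, 0.70, 0.85, 0.95])      # earthward direction measure
    L   = np.array([5., 15., 25., 40.])           # source longitude, deg W
    dst = np.array([-80., -140., -210., -230.])   # observed Dst minima, nT

    # Hypothetical model form: Dst = b0 + b1*V + b2*D + b3*|L - 15|
    A = np.column_stack([np.ones_like(V), V, D, np.abs(L - 15.0)])
    coef, *_ = np.linalg.lstsq(A, dst, rcond=None)
    print("b0..b3 =", coef)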

  15. Seismic envelope-based detection and location of ground-coupled airwaves from volcanoes in Alaska

    USGS Publications Warehouse

    Fee, David; Haney, Matt; Matoza, Robin S.; Szuberla, Curt A.L.; Lyons, John; Waythomas, Christopher F.

    2016-01-01

    Volcanic explosions and other infrasonic sources frequently produce acoustic waves that are recorded by seismometers. Here we explore multiple techniques to detect, locate, and characterize ground‐coupled airwaves (GCA) on volcano seismic networks in Alaska. GCA waveforms are typically incoherent between stations, thus we use envelope‐based techniques in our analyses. For distant sources and planar waves, we use f‐k beamforming to estimate back azimuth and trace velocity parameters. For spherical waves originating within the network, we use two related time difference of arrival (TDOA) methods to detect and localize the source. We investigate a modified envelope function to enhance the signal‐to‐noise ratio and emphasize both high energies and energy contrasts within a spectrogram. We apply these methods to recent eruptions from Cleveland, Veniaminof, and Pavlof Volcanoes, Alaska. Array processing of GCA from Cleveland Volcano on 4 May 2013 produces robust detection and wave characterization. Our modified envelopes substantially improve the short‐term average/long‐term average ratios, enhancing explosion detection. We detect GCA within both the Veniaminof and Pavlof networks from the 2007 and 2013–2014 activity, indicating repeated volcanic explosions. Event clustering and forward modeling suggests that high‐resolution localization is possible for GCA on typical volcano seismic networks. These results indicate that GCA can be used to help detect, locate, characterize, and monitor volcanic eruptions, particularly in difficult‐to‐monitor regions. We have implemented these GCA detection algorithms into our operational volcano‐monitoring algorithms at the Alaska Volcano Observatory.
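
    A minimal sketch of the envelope and detection stage is given below; it uses a plain Hilbert-transform envelope with a standard STA/LTA ratio rather than the authors' modified envelope function, which is not reproduced here.

    import numpy as np
    from scipy.signal import hilbert

    def envelope_sta_lta(trace, fs, sta_s=1.0, lta_s=30.0):
        """Hilbert envelope followed by the ratio of short- to long-term
        moving averages; the windows are aligned so that the short window
        occupies the trailing end of the long window."""
        env = np.abs(hilbert(trace))

        def moving_avg(x, n):
            c = np.cumsum(np.insert(x, 0, 0.0))
            return (c[n:] - c[:-n]) / n

        n_sta, n_lta = int(sta_s * fs), int(lta_s * fs)
        sta = moving_avg(env, n_sta)[n_lta - n_sta:]
        lta = moving_avg(env, n_lta)
        m = min(len(sta), len(lta))
        return sta[:m] / np.maximum(lta[:m], 1e-12)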

  16. The long-term problems of contaminated land: Sources, impacts and countermeasures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baes, C.F. III

    1986-11-01

    This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed "broken arrows").

  17. Detection of spatial fluctuations of non-point source fecal pollution in coral reef surrounding waters in southwestern Puerto Rico using PCR-based assays.

    PubMed

    Bonkosky, M; Hernández-Delgado, E A; Sandoz, B; Robledo, I E; Norat-Ramírez, J; Mattei, H

    2009-01-01

    Human fecal contamination of coral reefs is a major cause of concern. Conventional methods used to monitor microbial water quality cannot be used to discriminate between different fecal pollution sources. Fecal coliforms, enterococci, and human-specific Bacteroides (HF183, HF134), general Bacteroides-Prevotella (GB32), and Clostridium coccoides group (CP) 16S rDNA PCR assays were used to test for the presence of non-point source fecal contamination across the southwestern Puerto Rico shelf. Inshore waters were highly turbid, consistently receiving fecal pollution from variable sources, and showing the highest frequency of positive molecular marker signals. Signals were also detected at offshore waters in compliance with existing microbiological quality regulations. Phylogenetic analysis showed that most isolates were of human fecal origin. The geographic extent of non-point source fecal pollution was large and impacted extensive coral reef systems. This could have deleterious long-term impacts on public health, local fisheries and in tourism potential if not adequately addressed.

  18. Augmented classical least squares multivariate spectral analysis

    DOEpatents

    Haaland, David M.; Melgaard, David K.

    2004-02-03

    A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
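
    A minimal numerical sketch of the augmentation idea follows; it is a simplification of the patented methods, using the leading SVD shapes of the calibration residuals as the added spectral components.

    import numpy as np

    def acls_calibrate(A, C, n_aug=1):
        """A: (samples x wavelengths) calibration spectra, C: (samples x
        constituents) known concentrations. CLS pure-component spectra are
        augmented with SVD shapes of the unmodeled spectral residuals."""
        K = np.linalg.pinv(C) @ A              # CLS pure-component estimates
        E = A - C @ K                          # unmodeled spectral variation
        _, _, vt = np.linalg.svd(E, full_matrices=False)
        return np.vstack([K, vt[:n_aug]])      # augmented spectral model

    def acls_predict(A_new, K_aug, n_known):
        """Least-squares coefficients under the augmented model; the first
        n_known rows of K_aug correspond to the modeled constituents."""
        coeffs = A_new @ np.linalg.pinv(K_aug)
        return coeffs[:, :n_known]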

  19. Augmented Classical Least Squares Multivariate Spectral Analysis

    DOEpatents

    Haaland, David M.; Melgaard, David K.

    2005-07-26

    A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.

  20. Augmented Classical Least Squares Multivariate Spectral Analysis

    DOEpatents

    Haaland, David M.; Melgaard, David K.

    2005-01-11

    A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.

  1. Modeling of reverberant room responses for two-dimensional spatial sound field analysis and synthesis.

    PubMed

    Bai, Mingsian R; Li, Yi; Chiang, Yi-Hao

    2017-10-01

    A unified framework is proposed for the analysis and synthesis of two-dimensional spatial sound fields in reverberant environments. In the sound field analysis (SFA) phase, an unbaffled 24-element circular microphone array is utilized to encode the sound field based on plane-wave decomposition. Depending on the sparsity of the sound sources, the SFA stage can be implemented in two manners. For sparse-source scenarios, a one-stage algorithm based on compressive sensing is utilized. Alternatively, a two-stage algorithm can be used, in which the minimum power distortionless response beamformer localizes the sources and a Tikhonov regularization algorithm extracts the source amplitudes. In the sound field synthesis (SFS) phase, a 32-element rectangular loudspeaker array is employed to decode the target sound field using the pressure-matching technique. To establish the room response model required in the pressure-matching step of the SFS phase, an SFA technique for nonsparse-source scenarios is utilized. The choice of regularization parameters is vital to the reproduced sound field. In the SFS phase, three SFS approaches are compared in terms of localization performance and voice reproduction quality. Experimental results obtained in a reverberant room are presented and reveal that an accurate room response model is vital to immersive rendering of the reproduced sound field.
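
    The pressure-matching step with Tikhonov regularization reduces to a regularized least-squares solve. A minimal sketch, assuming the loudspeaker-to-control-point transfer matrix G (the room response model) is given:

    import numpy as np

    def pressure_matching_weights(G, p_target, lam=1e-2):
        """Loudspeaker driving weights q minimizing
        ||G q - p_target||^2 + lam * ||q||^2 at one frequency."""
        GH = G.conj().T
        return np.linalg.solve(GH @ G + lam * np.eye(G.shape[1]),
                               GH @ p_target)

    As the abstract notes, the choice of the regularization parameter is vital to the quality of the reproduced field.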

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, J.A.; Brasseur, G.P.; Zimmerman, P.R.

    Using the hydroxyl radical field calibrated to the methyl chloroform observations, the globally averaged release of methane and its spatial and temporal distribution were investigated. Two source function models of the spatial and temporal distribution of the flux of methane to the atmosphere were developed. The first model (SF1) was based on the assumption that methane is emitted as a proportion of net primary productivity (NPP). With the average hydroxyl radical concentration fixed, the methane source term was computed as approximately 623 Tg CH4, giving an atmospheric lifetime for methane of approximately 8.3 years. The second model (SF2) identified source regions for methane from rice paddies, wetlands, enteric fermentation, termites, and biomass burning based on high-resolution land use data. This methane source distribution resulted in an estimate of the global total methane source of approximately 611 Tg CH4, giving an atmospheric lifetime of approximately 8.5 years. The most significant difference between the two models was their predictions of methane fluxes over China and South East Asia, the location of most of the world's rice paddies. Using a recent measurement of the reaction rate of the hydroxyl radical with methane leads to estimates of the global total methane source of approximately 524 Tg CH4 for SF1, giving an atmospheric lifetime of approximately 10.0 years, and of approximately 514 Tg CH4 for SF2, yielding a lifetime of approximately 10.2 years.
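
    A minimal sketch of the first source function model: a prescribed global total is distributed over grid cells in proportion to NPP. The function name and normalization are illustrative assumptions.

    import numpy as np

    def npp_proportional_flux(npp, q_total_tg=623.0):
        """Grid-cell methane flux (Tg CH4/yr per cell) proportional to net
        primary productivity, normalized to the prescribed global total."""
        npp = np.asarray(npp, dtype=float)
        return q_total_tg * npp / npp.sum()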

  3. The low-frequency sound power measuring technique for an underwater source in a non-anechoic tank

    NASA Astrophysics Data System (ADS)

    Zhang, Yi-Ming; Tang, Rui; Li, Qi; Shang, Da-Jing

    2018-03-01

    In order to determine the radiated sound power of an underwater source below the Schroeder cut-off frequency in a non-anechoic tank, a low-frequency extension measuring technique is proposed. This technique is based on a unique relationship between the transmission characteristics of the enclosed field and those of the free field, which can be obtained as a correction term from a previous measurement of a known simple source. The radiated sound power of an unknown underwater source in the free field can thereby be obtained accurately from measurements in a non-anechoic tank. To verify the validity of the proposed technique, a mathematical model of the enclosed field is established using normal-mode theory, and the relationship between the transmission characteristics of the enclosed and free fields is obtained. The radiated sound power of an underwater transducer source is tested in a glass tank using the proposed low-frequency extension measuring technique. Compared with the free field, the deviation of the radiated sound power level is found to be less than 3 dB for the narrowband spectrum and less than 1 dB for the 1/3-octave spectrum. The proposed testing technique can be used not only to extend the low-frequency applications of non-anechoic tanks, but also to measure the radiated sound power of complicated sources in non-anechoic tanks.
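
    The correction idea amounts to subtracting, band by band, the enclosed-to-free-field offset measured once with the known simple source. A minimal sketch with sound power levels in dB; the band-by-band formulation is an assumption of this illustration:

    import numpy as np

    def free_field_power_db(lw_enclosed, lw_ref_enclosed, lw_ref_free):
        """Per-band free-field sound power of the unknown source: remove the
        tank's transmission offset calibrated with a known simple source."""
        correction = np.asarray(lw_ref_enclosed) - np.asarray(lw_ref_free)
        return np.asarray(lw_enclosed) - correction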

  4. Monte Carlo simulation of moderator and reflector in coal analyzer based on a D-T neutron generator.

    PubMed

    Shan, Qing; Chu, Shengnan; Jia, Wenbao

    2015-11-01

    Coal is one of the most popular fuels in the world. The use of coal not only produces carbon dioxide, but also contributes to environmental pollution by heavy metals. In a coal analyzer based on prompt gamma-ray neutron activation analysis (PGNAA), the characteristic gamma rays of C and O are mainly induced by fast neutrons, whereas thermal neutrons can be used to induce the characteristic gamma rays of H, Si, and heavy metals. Therefore, an appropriate balance of thermal and fast neutrons improves the measurement accuracy for heavy metals while ensuring that the measurement accuracy for the main elements meets the requirements of the industry. Once the required yield of the deuterium-tritium (d-T) neutron generator is determined, appropriate thermal and fast neutron fluxes can be obtained by optimizing the neutron source term. In this article, the Monte Carlo N-Particle (MCNP) transport code and the Evaluated Nuclear Data File (ENDF) database are used to optimize the neutron source term in the PGNAA-based coal analyzer, including the material and shape of the moderator and neutron reflector. The optimization has two targets: (1) the ratio of thermal to fast neutrons is 1:1, and (2) the total neutron flux from the optimized neutron source in the sample increases by at least 100% compared with the initial design. The simulation results show that the total neutron flux in the sample increases by 102%, 102%, 85%, 72%, and 62% with Pb, Bi, Nb, W, and Be reflectors, respectively. Both targets are best met when the moderator is a 3-cm-thick lead layer coupled with a 3-cm-thick high-density polyethylene (HDPE) layer, and the neutron reflector is a 27-cm-thick hemispherical lead layer. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Nutritional Considerations for the Vegetarian and Vegan Dancer.

    PubMed

    Brown, Derrick D

    2018-03-15

    Vegetarianism provides a catchall term for a variety of diets that exclude the consumption of some or all animal products. Contrary to popular claims, appropriately designed and managed vegetarian diets contain foods nutritionally sufficient for health, well-being, and physical performance. Vegetarian dancers can meet their protein needs from primarily or exclusively (vegan) plant-based sources when a variety of these foods are consumed daily and energy intake is adequate. However, the quality and timing of dietary intake are of key importance in meeting the physical demands typical of high-intensity, intermittent dance styles. Poorly planned, calorically restrictive, and nutrient-poor diets confer a host of deficiencies that diminish health and ultimately performance. The recommended dietary macronutrient composition of carbohydrate, fat, and protein of 55%, 20% to 30%, and 12% to 15%, respectively, offers an acceptable baseline for all dancers across different dance styles. Vegetarians, in particular vegans, should ensure sufficient caloric intake and adequate intake of vitamin B12, vitamin D, ω-3 fatty acids, calcium, and zinc. Many of these micronutrients are derived from animal products but, with sufficient knowledge, can be obtained from plant-based sources. However, the diminished bioavailability of iron from plants and the lack of plant sources of vitamin B12 in vegan-type diets can have detrimental effects on physical performance. Thus, to prevent long-term deficiencies, vegan dancers require more diligence when preparing and managing dietary intake. This article reviews literature on vegetarian diets with regard to dance, gleaning findings from epidemiologic, clinical, and sport nutrition research. It also highlights potential micronutrient deficiencies that may occur in some plant-based diets and presents potential strategies to improve nutrient and caloric intake for dancers who opt for a plant-based diet.

  6. Phase II evaluation of clinical coding schemes: completeness, taxonomy, mapping, definitions, and clarity. CPRI Work Group on Codes and Structures.

    PubMed

    Campbell, J R; Carpenter, P; Sneiderman, C; Cohn, S; Chute, C G; Warren, J

    1997-01-01

    To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for "parent" and "child" codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mappings from the published schemes were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy, judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56; UMLS 3.17; READ 2.14; *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%; *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%; *p < .004), associated with a loss of clarity. No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from less clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record.

  7. Phase II Evaluation of Clinical Coding Schemes

    PubMed Central

    Campbell, James R.; Carpenter, Paul; Sneiderman, Charles; Cohn, Simon; Chute, Christopher G.; Warren, Judith

    1997-01-01

    Abstract Objective: To compare three potential sources of controlled clinical terminology (READ codes version 3.1, SNOMED International, and Unified Medical Language System (UMLS) version 1.6) relative to attributes of completeness, clinical taxonomy, administrative mapping, term definitions and clarity (duplicate coding rate). Methods: The authors assembled 1929 source concept records from a variety of clinical information taken from four medical centers across the United States. The source data included medical as well as ample nursing terminology. The source records were coded in each scheme by an investigator and checked by the coding scheme owner. The codings were then scored by an independent panel of clinicians for acceptability. Codes were checked for definitions provided with the scheme. Codes for a random sample of source records were analyzed by an investigator for “parent” and “child” codes within the scheme. Parent and child pairs were scored by an independent panel of medical informatics specialists for clinical acceptability. Administrative and billing code mapping from the published scheme were reviewed for all coded records and analyzed by independent reviewers for accuracy. The investigator for each scheme exhaustively searched a sample of coded records for duplications. Results: SNOMED was judged to be significantly more complete in coding the source material than the other schemes (SNOMED* 70%; READ 57%; UMLS 50%; *p < .00001). SNOMED also had a richer clinical taxonomy judged by the number of acceptable first-degree relatives per coded concept (SNOMED* 4.56; UMLS 3.17; READ 2.14; *p < .005). Only the UMLS provided any definitions; these were found for 49% of records which had a coding assignment. READ and UMLS had better administrative mappings (composite score: READ* 40.6%; UMLS* 36.1%; SNOMED 20.7%; *p < .00001), and SNOMED had substantially more duplications of coding assignments (duplication rate: READ 0%; UMLS 4.2%; SNOMED* 13.9%; *p < .004) associated with a loss of clarity. Conclusion: No major terminology source can lay claim to being the ideal resource for a computer-based patient record. However, based upon this analysis of releases for April 1995, SNOMED International is considerably more complete, has a compositional nature and a richer taxonomy. It suffers from less clarity, resulting from a lack of syntax and evolutionary changes in its coding scheme. READ has greater clarity and better mapping to administrative schemes (ICD-10 and OPCS-4), is rapidly changing and is less complete. UMLS is a rich lexical resource, with mappings to many source vocabularies. It provides definitions for many of its terms. However, due to the varying granularities and purposes of its source schemes, it has limitations for representation of clinical concepts within a computer-based patient record. PMID:9147343

  8. Biotic Nitrogen Enrichment Regulates Calcium Sources to Forests

    NASA Astrophysics Data System (ADS)

    Pett-Ridge, J. C.; Perakis, S. S.; Hynicka, J. D.

    2015-12-01

    Calcium is an essential nutrient in forest ecosystems that is susceptible to leaching loss and depletion. Calcium depletion can affect plant and animal productivity, soil acid buffering capacity, and fluxes of carbon and water. Excess nitrogen supply and associated soil acidification are often implicated in short-term calcium loss from soils, but the long-term role of nitrogen enrichment in calcium sources and resupply is unknown. Here we use strontium isotopes (87Sr/86Sr) as a proxy for calcium to investigate how soil nitrogen enrichment from biological nitrogen fixation interacts with bedrock calcium to regulate both short-term available supplies and the long-term sources of calcium in montane conifer forests. Our study examines 22 sites in western Oregon, spanning a 20-fold range of bedrock calcium on sedimentary and basaltic lithologies. In contrast to previous studies emphasizing abiotic control of weathering as a determinant of long-term ecosystem calcium dynamics and sources (via bedrock fertility, climate, or topographic/tectonic controls), we find instead that biotic nitrogen enrichment of soil can strongly regulate calcium sources and supplies in forest ecosystems. For forests on calcium-rich basaltic bedrock, increasing nitrogen enrichment causes calcium sources to shift from rock-weathering to atmospheric dominance, with minimal influence from other major soil-forming factors, despite regionally high rates of tectonic uplift and erosion that can rejuvenate the weathering supply of soil minerals. For forests on calcium-poor sedimentary bedrock, we find that atmospheric inputs dominate regardless of the degree of nitrogen enrichment. Short-term measures of soil and ecosystem calcium fertility are decoupled from calcium source sustainability, with fundamental implications for understanding nitrogen impacts, both in natural ecosystems and in the context of global change. Our finding that long-term nitrogen enrichment increases forest reliance on atmospheric calcium helps explain reports of greater ecological calcium limitation in an increasingly nitrogen-rich world.

  9. Arsenic speciation dynamics in paddy rice soil-water environment: sources, physico-chemical, and biological factors - A review.

    PubMed

    Kumarathilaka, Prasanna; Seneweera, Saman; Meharg, Andrew; Bundschuh, Jochen

    2018-04-21

    Rice is the main staple carbohydrate source for billions of people worldwide. Natural geogenic and anthropogenic sources have led to high arsenic (As) concentrations in rice grains, because As is highly bioavailable to rice roots under the conditions in which rice is cultivated. A multifaceted and interdisciplinary understanding, of both short-term and long-term effects, is required to identify spatial and temporal changes in As contamination levels in paddy soil-water systems. During flooding, soil pore waters are elevated in inorganic As compared to dryland cultivation systems, as anaerobism results in poorly mobile As(V) being reduced to highly mobile As(III). The formation of iron (Fe) plaque on roots, the availability of metal (hydr)oxides (Fe and Mn), organic matter, clay mineralogy, and competing ions and compounds (PO4(3-) and Si(OH)4) are all known to influence As(V) and As(III) mobility in paddy soil-water environments. Microorganisms play a key role in As transformation through oxidation/reduction and methylation/volatilization reactions, but the transformation kinetics are poorly understood. Science-based optimization of all biogeochemical parameters may help to significantly reduce the bioavailability of inorganic As. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. A multi-source data assimilation framework for flood forecasting: Accounting for runoff routing lags

    NASA Astrophysics Data System (ADS)

    Meng, S.; Xie, X.

    2015-12-01

    In flood forecasting practice, model performance is usually degraded by various sources of uncertainty, including uncertainties from input data, model parameters, model structures, and output observations. Data assimilation is a useful methodology for reducing uncertainties in flood forecasting. For short-term flood forecasting, an accurate estimate of the initial soil moisture condition improves forecasting performance. Accounting for the time delay of runoff routing is another important factor in forecasting performance. Moreover, observations of hydrological variables (from both ground measurements and satellites) are becoming easily available, so the reliability of short-term flood forecasting can be improved by assimilating multi-source data. The objective of this study is to develop a multi-source data assimilation framework for real-time flood forecasting. In this framework, the first step assimilates upper-layer soil moisture observations to update the model state and generated runoff based on the ensemble Kalman filter (EnKF) method, and the second step assimilates discharge observations to update the model state and runoff within a fixed time window based on the ensemble Kalman smoother (EnKS) method. The smoothing technique is adopted to account for the runoff routing lag. Assimilating soil moisture and discharge observations in this way is expected to improve flood forecasting. To distinguish the effectiveness of this dual-step assimilation framework, we designed a dual-EnKF algorithm in which the observed soil moisture and discharge are assimilated separately, without accounting for the runoff routing lag. The results show that the multi-source data assimilation framework can effectively improve flood forecasting, especially when the runoff routing has a distinct time lag. This new data assimilation framework thus holds great potential for operational flood forecasting by merging observations from ground measurements and remote sensing retrievals.
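
    A minimal sketch of the analysis step used in the first stage, an ensemble Kalman filter update in perturbed-observations form, is given below; the study's hydrological model and the EnKS smoothing over the routing-lag window are not reproduced.

    import numpy as np

    def enkf_update(X, y_obs, H, r_var, rng=None):
        """One EnKF analysis step. X: (n_state, n_ens) ensemble; y_obs:
        (n_obs,) observations; H: (n_obs, n_state) linear observation
        operator; r_var: observation-error variance."""
        if rng is None:
            rng = np.random.default_rng(0)
        y = np.asarray(y_obs, dtype=float)
        n_obs, n_ens = y.size, X.shape[1]
        Y = H @ X                                   # predicted observations
        Xp = X - X.mean(axis=1, keepdims=True)      # state anomalies
        Yp = Y - Y.mean(axis=1, keepdims=True)      # observation anomalies
        Pxy = Xp @ Yp.T / (n_ens - 1)
        Pyy = Yp @ Yp.T / (n_ens - 1) + r_var * np.eye(n_obs)
        K = Pxy @ np.linalg.inv(Pyy)                # Kalman gain
        y_pert = y[:, None] + rng.normal(0.0, np.sqrt(r_var), (n_obs, n_ens))
        return X + K @ (y_pert - Y)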

  11. Investigation of remote sensing techniques as inputs to operational resource management models. [South Dakota

    NASA Technical Reports Server (NTRS)

    Schmer, F. A. (Principal Investigator); Isakson, R. E.; Eidenshink, J. C.

    1977-01-01

    The author has identified the following significant results. Successful operational applications of LANDSAT data were found for level 1 land use mapping, drainage network delineation, and aspen mapping. Visual LANDSAT interpretation using 1:125,000 color composite imagery was the least expensive method of obtaining timely level 1 land use data. With an average agricultural/rangeland interpretation accuracy in excess of 80%, such a data source was considered the most cost-effective of those available to state agencies. Costs do not compare favorably with those incurred using the present method of extracting land use data from historical tabular summaries; however, the cost increase in advancing from the present procedure to a satellite-based data source was justified in terms of expanded data content.

  12. OpenNFT: An open-source Python/Matlab framework for real-time fMRI neurofeedback training based on activity, connectivity and multivariate pattern analysis.

    PubMed

    Koush, Yury; Ashburner, John; Prilepin, Evgeny; Sladky, Ronald; Zeidman, Peter; Bibikov, Sergei; Scharnowski, Frank; Nikonorov, Artem; De Ville, Dimitri Van

    2017-08-01

    Neurofeedback based on real-time functional magnetic resonance imaging (rt-fMRI) is a novel and rapidly developing research field. It allows for training of voluntary control over localized brain activity and connectivity and has demonstrated promising clinical applications. Because of the rapid technical development of MRI techniques and the availability of high-performance computing, new methodological advances in rt-fMRI neurofeedback have become possible. Here we outline the core components of a novel open-source neurofeedback framework, termed Open NeuroFeedback Training (OpenNFT), which efficiently integrates these new developments. The framework is implemented in Python and Matlab source code to allow for diverse functionality, high modularity, and rapid extensibility of the software depending on the user's needs. In addition, it provides an easy interface to the functionality of Statistical Parametric Mapping (SPM), which is also open-source and one of the most widely used fMRI data analysis packages. We demonstrate the functionality of the new framework by describing case studies that include neurofeedback protocols based on brain activity levels, effective connectivity models, and pattern classification approaches. This open-source initiative provides a suitable framework for active engagement in the development of novel neurofeedback approaches, so that local methodological developments can easily be made accessible to a wider range of users. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, D.; Brunett, A.; Passerini, S.

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence-specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  14. Modeling the contribution of point sources and non-point sources to Thachin River water pollution.

    PubMed

    Schaffner, Monika; Bader, Hans-Peter; Scheidegger, Ruth

    2009-08-15

    Major rivers in developing and emerging countries suffer increasingly from severe degradation of water quality. The current study uses a mathematical Material Flow Analysis (MMFA) as a complementary approach to address the degradation of river water quality due to nutrient pollution in the Thachin River Basin in Central Thailand. This paper gives an overview of the origins and flow paths of the various point and non-point pollution sources in the Thachin River Basin (in terms of nitrogen and phosphorus) and quantifies their relative importance within the system. The key parameters influencing the main nutrient flows are determined and possible mitigation measures are discussed. The results show that aquaculture (as a point source) and rice farming (as a non-point source) are the key nutrient sources in the Thachin River Basin. Other point sources such as pig farms, households and industries, which were previously cited as the most relevant pollution sources in terms of organic pollution, play less significant roles in comparison. This order of importance shifts when considering the model results at the provincial level. Crosschecks with secondary data and field studies confirm the plausibility of our simulations. Specific nutrient loads for the pollution sources are derived; these can be used for a first broad quantification of nutrient pollution in comparable river basins. Based on an identification of the sensitive model parameters, possible mitigation scenarios are determined and their potential to reduce the nutrient load is evaluated. A comparison of simulated nutrient loads with measured nutrient concentrations shows that nutrient retention in the river system may be significant. Sedimentation in the slow-flowing surface water network as well as nitrogen emission to the air from the warm, oxygen-deficient waters are certainly partly responsible, but wetlands along the river banks could also play an important role as nutrient sinks.
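
    To make the source-aggregation idea concrete, the sketch below computes nutrient loads with a simple export-coefficient mass balance, the kind of accounting that underlies a material flow analysis. All source categories, activity levels, and unit loads are invented for illustration and are not the study's values.

```python
# Illustrative export-coefficient nutrient mass balance (invented numbers).
# load_i = activity_i * unit_load_i ; basin load = sum over all sources.
sources = {
    # source: (activity level, unit load in kg N per unit per year)
    "aquaculture_ponds_ha": (1000.0, 50.0),
    "rice_paddies_ha": (5000.0, 8.0),
    "pig_farms_head": (20000.0, 1.2),
    "households_person": (100000.0, 0.5),
}

loads = {name: activity * coeff for name, (activity, coeff) in sources.items()}
total = sum(loads.values())
for name, load in sorted(loads.items(), key=lambda kv: -kv[1]):
    print(f"{name:24s} {load:10.0f} kg N/yr ({100 * load / total:4.1f}%)")
```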

  15. Fermi Large Area Telescope Second Source Catalog

    NASA Astrophysics Data System (ADS)

    Nolan, P. L.; Abdo, A. A.; Ackermann, M.; Ajello, M.; Allafort, A.; Antolini, E.; Atwood, W. B.; Axelsson, M.; Baldini, L.; Ballet, J.; Barbiellini, G.; Bastieri, D.; Bechtol, K.; Belfiore, A.; Bellazzini, R.; Berenji, B.; Bignami, G. F.; Blandford, R. D.; Bloom, E. D.; Bonamente, E.; Bonnell, J.; Borgland, A. W.; Bottacini, E.; Bouvier, A.; Brandt, T. J.; Bregeon, J.; Brigida, M.; Bruel, P.; Buehler, R.; Burnett, T. H.; Buson, S.; Caliandro, G. A.; Cameron, R. A.; Campana, R.; Cañadas, B.; Cannon, A.; Caraveo, P. A.; Casandjian, J. M.; Cavazzuti, E.; Ceccanti, M.; Cecchi, C.; Çelik, Ö.; Charles, E.; Chekhtman, A.; Cheung, C. C.; Chiang, J.; Chipaux, R.; Ciprini, S.; Claus, R.; Cohen-Tanugi, J.; Cominsky, L. R.; Conrad, J.; Corbet, R.; Cutini, S.; D'Ammando, F.; Davis, D. S.; de Angelis, A.; DeCesar, M. E.; DeKlotz, M.; De Luca, A.; den Hartog, P. R.; de Palma, F.; Dermer, C. D.; Digel, S. W.; Silva, E. do Couto e.; Drell, P. S.; Drlica-Wagner, A.; Dubois, R.; Dumora, D.; Enoto, T.; Escande, L.; Fabiani, D.; Falletti, L.; Favuzzi, C.; Fegan, S. J.; Ferrara, E. C.; Focke, W. B.; Fortin, P.; Frailis, M.; Fukazawa, Y.; Funk, S.; Fusco, P.; Gargano, F.; Gasparrini, D.; Gehrels, N.; Germani, S.; Giebels, B.; Giglietto, N.; Giommi, P.; Giordano, F.; Giroletti, M.; Glanzman, T.; Godfrey, G.; Grenier, I. A.; Grondin, M.-H.; Grove, J. E.; Guillemot, L.; Guiriec, S.; Gustafsson, M.; Hadasch, D.; Hanabata, Y.; Harding, A. K.; Hayashida, M.; Hays, E.; Hill, A. B.; Horan, D.; Hou, X.; Hughes, R. E.; Iafrate, G.; Itoh, R.; Jóhannesson, G.; Johnson, R. P.; Johnson, T. E.; Johnson, A. S.; Johnson, T. J.; Kamae, T.; Katagiri, H.; Kataoka, J.; Katsuta, J.; Kawai, N.; Kerr, M.; Knödlseder, J.; Kocevski, D.; Kuss, M.; Lande, J.; Landriu, D.; Latronico, L.; Lemoine-Goumard, M.; Lionetto, A. M.; Llena Garde, M.; Longo, F.; Loparco, F.; Lott, B.; Lovellette, M. N.; Lubrano, P.; Madejski, G. M.; Marelli, M.; Massaro, E.; Mazziotta, M. N.; McConville, W.; McEnery, J. E.; Mehault, J.; Michelson, P. F.; Minuti, M.; Mitthumsiri, W.; Mizuno, T.; Moiseev, A. A.; Mongelli, M.; Monte, C.; Monzani, M. E.; Morselli, A.; Moskalenko, I. V.; Murgia, S.; Nakamori, T.; Naumann-Godo, M.; Norris, J. P.; Nuss, E.; Nymark, T.; Ohno, M.; Ohsugi, T.; Okumura, A.; Omodei, N.; Orlando, E.; Ormes, J. F.; Ozaki, M.; Paneque, D.; Panetta, J. H.; Parent, D.; Perkins, J. S.; Pesce-Rollins, M.; Pierbattista, M.; Pinchera, M.; Piron, F.; Pivato, G.; Porter, T. A.; Racusin, J. L.; Rainò, S.; Rando, R.; Razzano, M.; Razzaque, S.; Reimer, A.; Reimer, O.; Reposeur, T.; Ritz, S.; Rochester, L. S.; Romani, R. W.; Roth, M.; Rousseau, R.; Ryde, F.; Sadrozinski, H. F.-W.; Salvetti, D.; Sanchez, D. A.; Saz Parkinson, P. M.; Sbarra, C.; Scargle, J. D.; Schalk, T. L.; Sgrò, C.; Shaw, M. S.; Shrader, C.; Siskind, E. J.; Smith, D. A.; Spandre, G.; Spinelli, P.; Stephens, T. E.; Strickman, M. S.; Suson, D. J.; Tajima, H.; Takahashi, H.; Takahashi, T.; Tanaka, T.; Thayer, J. G.; Thayer, J. B.; Thompson, D. J.; Tibaldo, L.; Tibolla, O.; Tinebra, F.; Tinivella, M.; Torres, D. F.; Tosti, G.; Troja, E.; Uchiyama, Y.; Vandenbroucke, J.; Van Etten, A.; Van Klaveren, B.; Vasileiou, V.; Vianello, G.; Vitale, V.; Waite, A. P.; Wallace, E.; Wang, P.; Werner, M.; Winer, B. L.; Wood, D. L.; Wood, K. S.; Wood, M.; Yang, Z.; Zimmer, S.

    2012-04-01

    We present the second catalog of high-energy γ-ray sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi), derived from data taken during the first 24 months of the science phase of the mission, which began on 2008 August 4. Source detection is based on the average flux over the 24 month period. The second Fermi-LAT catalog (2FGL) includes source location regions, defined in terms of elliptical fits to the 95% confidence regions, and spectral fits in terms of power-law, exponentially cutoff power-law, or log-normal forms. Also included are flux measurements in five energy bands and light curves on monthly intervals for each source. Twelve sources in the catalog are modeled as spatially extended. We provide a detailed comparison of the results from this catalog with those from the first Fermi-LAT catalog (1FGL). Although the diffuse Galactic and isotropic models used in the 2FGL analysis are improved compared to the 1FGL catalog, we attach caution flags to 162 of the sources to indicate possible confusion with residual imperfections in the diffuse model. The 2FGL catalog contains 1873 sources detected and characterized in the 100 MeV to 100 GeV range, of which we consider 127 to be firmly identified and 1171 to be reliably associated with counterparts of known or likely γ-ray-producing source classes. We dedicate this paper to the memory of our colleague Patrick Nolan, who died on 2011 November 6. His career spanned much of the history of high-energy astronomy from space, and his work on the Large Area Telescope (LAT) began nearly 20 years ago when it was just a concept. Pat was a central member in the operation of the LAT collaboration and he is greatly missed.

  16. A comparison of graph- and kernel-based -omics data integration algorithms for classifying complex traits.

    PubMed

    Yan, Kang K; Zhao, Hongyu; Pang, Herbert

    2017-12-06

    High-throughput sequencing data are widely collected and analyzed in the study of complex diseases in the quest to improve human health. Well-studied algorithms mostly deal with a single data source and cannot fully utilize the potential of multi-omics data sources. In order to provide a holistic understanding of human health and disease, it is necessary to integrate multiple data sources. Several algorithms have been proposed so far; however, a comprehensive comparison of data integration algorithms for classification of binary traits is currently lacking. In this paper, we focus on two common classes of integration algorithms: graph-based algorithms, which depict relationships with subjects denoted by nodes and relationships denoted by edges, and kernel-based algorithms, which generate a classifier in feature space. Our paper provides a comprehensive comparison of their performance in terms of various measures of classification accuracy and computation time. Seven integration algorithms, including graph-based semi-supervised learning, graph sharpening integration, composite association network, Bayesian network, semi-definite programming support vector machine (SDP-SVM), relevance vector machine (RVM) and Ada-boost relevance vector machine, are compared and evaluated on hypertension and two cancer data sets in our study. In general, kernel-based algorithms create more complex models and require longer computation time, but they tend to perform better than graph-based algorithms; graph-based algorithms have the advantage of being computationally faster. The empirical results demonstrate that composite association network, relevance vector machine, and Ada-boost RVM are the better performers. We provide recommendations on how to choose an appropriate algorithm for integrating data from multiple sources.
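
    As a concrete illustration of the kernel-based class, the sketch below builds one kernel per omics view, combines them by unweighted summation (the simplest valid choice, since a sum of positive semi-definite kernels is again a kernel), and trains a support vector machine on the combined kernel. The data are synthetic stand-ins; methods such as SDP-SVM instead learn the kernel weights.

```python
# Minimal kernel-based data integration sketch with synthetic data.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 100
expression = rng.normal(size=(n, 200))   # view 1: gene expression
methylation = rng.normal(size=(n, 150))  # view 2: methylation
labels = rng.integers(0, 2, size=n)      # binary trait

# One kernel per data source; summing valid kernels yields a valid kernel.
K = rbf_kernel(expression) + rbf_kernel(methylation)

clf = SVC(kernel="precomputed").fit(K, labels)
print("training accuracy:", clf.score(K, labels))
```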

  17. What's in a ray set: moving towards a unified ray set format

    NASA Astrophysics Data System (ADS)

    Muschaweck, Julius

    2011-10-01

    For the purpose of optical simulation, a plethora of formats exists to describe the properties of a light source. Except for the EULUMDAT and IES formats, which describe sources in terms of aperture area and far-field intensity, all of these formats are vendor specific, and no generally accepted standard exists. Most illumination simulation software vendors use their own format for ray sets, which describe sources in terms of many rays; some keep their format definition proprietary. Thus, software packages typically can read or write only their own specific format, although the actual data content differs little. Typically, these formats describe the origin and direction of each ray as 3D vectors and use one additional number for magnitude, where magnitude may denote radiant flux, luminous flux (equivalently tristimulus Y), or tristimulus X and Z. Sometimes each ray also carries its wavelength, while other formats allow an overall spectrum to be specified for the whole source. In addition, at least one format includes polarization properties for each ray. This situation makes it inefficient and potentially error-prone for light source manufacturers to provide ray data sets for their sources in many different formats. Furthermore, near-field goniometer vendors again use their own proprietary formats to store the source description in terms of luminance data, and offer proprietary software to generate ray sets from this data base. Again, the plethora of formats makes ray set production inefficient and potentially error-prone. In this paper, we propose to describe ray data sets in terms of phase space, as a step towards a standardized ray set format. It is well known that luminance and radiance can be defined as flux density in phase space: luminance is flux divided by etendue. Therefore, single rays can be thought of as center points of phase space cells, where each cell possesses its volume (i.e., etendue), its flux, and therefore its luminance. In addition, each phase space cell possesses its spectrum and its polarization properties. We show how this approach leads to a unification of the EULUMDAT/IES, ray set and near-field goniometer formats, making possible the generation of arbitrarily many additional rays by luminance interpolation. We also show how the EULUMDAT/IES and individual ray set formats can be derived from the proposed general format, making software that uses a possible standard format downward compatible.
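
    A minimal sketch of what such a phase-space ray record could look like follows; the field names, units, and Stokes-vector representation of polarization are illustrative assumptions rather than a proposed standard.

```python
# Illustrative phase-space ray record: each ray is the center of a phase
# space cell carrying its own etendue and flux, so luminance is flux/etendue.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class PhaseSpaceRay:
    origin: Tuple[float, float, float]      # ray start point (mm)
    direction: Tuple[float, float, float]   # unit direction vector
    flux: float                             # luminous flux of the cell (lm)
    etendue: float                          # phase space volume (mm^2 sr)
    spectrum: Optional[dict] = None         # wavelength (nm) -> relative power
    stokes: Tuple[float, float, float, float] = (1.0, 0.0, 0.0, 0.0)  # polarization

    @property
    def luminance(self) -> float:
        # Luminance is flux density in phase space: flux divided by etendue.
        return self.flux / self.etendue

ray = PhaseSpaceRay(origin=(0.0, 0.0, 0.0), direction=(0.0, 0.0, 1.0),
                    flux=1e-3, etendue=2e-4)
print(f"cell luminance: {ray.luminance:.1f} lm/(mm^2 sr)")
```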

  18. Open source software integrated into data services of Japanese planetary explorations

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.

    2015-12-01

    Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data through simple methods such as HTTP directory listing for long-term preservation, while also aiming to provide rich web applications for ease of access using modern web technologies based on open source software. This presentation showcases the availability of open source software through our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS); as the WMS server, the open source software MapServer is adopted. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data, mainly for public outreach, and is developed with the NASA World Wind Java SDK. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations; it uses Highcharts to draw graphs in web browsers. Flow is a tool to simulate the Field-Of-View of an instrument onboard a spacecraft. This tool is itself open source software developed by JAXA/ISAS under the BSD 3-Clause License. The SPICE Toolkit is essential to compile Flow; the SPICE Toolkit is also open source software developed by NASA/JPL, and its website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool for integrating DARTS services.

  19. Gas Turbine Energy Conversion Systems for Nuclear Power Plants Applicable to LiFTR Liquid Fluoride Thorium Reactor Technology

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    2014-01-01

    This panel plans to cover thermal energy and electric power production issues facing our nation and the world over the next decades, with relevant technologies ranging from near term to mid- and far term. Although the main focus will be on ground-based plants providing baseload electric power, energy conversion systems (ECS) for space are also included, with solar or nuclear energy sources for output power levels ranging from tens of watts to kilowatts for unmanned spacecraft, and eventually megawatts for lunar outposts and planetary surface colonies. Implications of these technologies for future terrestrial energy systems, combined with advanced fracking, are touched upon. Thorium-based reactors and nuclear fusion, along with suitable gas turbine energy conversion systems, will also be considered by the panelists. The characteristics of the above-mentioned ECS will be described, both in terms of their overall energy utilization effectiveness and with regard to climatic effects due to exhaust emissions.

  20. The mass-zero spin-two field and gravitational theory.

    NASA Technical Reports Server (NTRS)

    Coulter, C. A.

    1972-01-01

    Demonstration that the conventional theory of the mass-zero spin-two field with sources introduces extraneous non-spin-two field components in source regions and fails to be covariant under the full or restricted conformal group. A modified theory is given, expressed in terms of the physical components of the mass-zero spin-two field rather than in terms of 'potentials,' which has no extraneous components inside or outside sources and which is covariant under the full conformal group. For a proper choice of source term, this modified theory has the correct Newtonian limit and automatically implies that a symmetric second-rank source tensor has zero divergence. It is shown that a generally covariant form of the spin-two theory derived here can possibly be constructed to agree with general relativity in all currently accessible experimental situations.

  1. Petrographic Analyses of Lonestones from ODP Drill Sites Leg 188 Prydz Bay, Antarctica

    NASA Astrophysics Data System (ADS)

    Detterman, K.; Warnke, D. A.; Richter, C.

    2006-12-01

    ODP Leg 188 was drilled in 2000 to sample the first advances of the Antarctic ice sheet and to document further cryospheric development. Continental shelf Site 1166 documented the earliest stages of glaciation during the Eocene-Oligocene, and continental slope Site 1167 documented rapid deposition by debris flows during the Pliocene-Pleistocene and a subtle change in onshore erosion areas. Site 1165, located on the continental rise, documented the long-term transition from wet-based lower Miocene glaciers to dry-based upper Miocene glaciers, including short-term fluctuations starting in the early Miocene. The source area for all drill sites is the Lambert Glacier-Amery Ice Shelf drainage area, encompassing the Northern and Southern Prince Charles Mountains, the Gamburtsev Subglacial Mountains, and the Grove Mountains. Lonestones occur in most of the cores from all sites of Leg 188, prompting research into potential source areas and transportation modes of the lonestones. One hundred seventeen thin sections of lonestones from Sites 1166, 1167, and 1165 were prepared for petrographic analyses. Metamorphic lonestones outnumber igneous and sedimentary lonestones at all three sites. Sedimentary lonestones were not found in the thin sections from Site 1166, and extrusive igneous lonestones were found only at Site 1165, comprising 5.1 percent of the Leg 188 lithology. In terms of anorthite content, the plagioclase in igneous and metamorphic lonestones at all three sites falls in the albite-oligoclase range; albite-oligoclase plagioclase has been documented in the Southern Prince Charles Mountains. The results of this study of a selection of lonestones from Site 1167 support a hypothesis first proposed by the Shipboard Scientific Party in 2001 that, as time elapsed, the source area for Site 1167 lonestones shifted slightly from a largely sandstone source to a largely granitic source within the drainage area. One potential source area for the Site 1167 sandstone lonestones is the Permian to Triassic Amery Group in the Beaver Lake area of the Northern Prince Charles Mountains. We hypothesize that the more easily eroded portions of the sandstone outcrops were planed off first, while ubiquitous gneiss and granite outcrops provided the source material for the younger debris flows at Site 1167 in the Pliocene-Pleistocene. None of the available lonestones suggests sources other than the drainage area of the Lambert Glacier-Amery Ice Shelf complex.

  2. Evaluation of the source area of rooftop scalar measurements in London, UK using wind tunnel and modelling approaches.

    NASA Astrophysics Data System (ADS)

    Brocklehurst, Aidan; Boon, Alex; Barlow, Janet; Hayden, Paul; Robins, Alan

    2014-05-01

    The source area of an instrument is an estimate of the area of ground over which a measurement is generated. Quantification of the source area of a measurement site provides crucial context for analysis and interpretation of the data. A range of computational models exists to calculate the source area of an instrument, but these are usually based on assumptions which do not hold for instruments positioned very close to the surface, particularly those surrounded by heterogeneous terrain, i.e., urban areas. Although positioning instrumentation at higher elevation (i.e., on masts) is ideal in urban areas, it can be costly in terms of installation and maintenance, and it is often logistically difficult to position instruments in the ideal geographical location. Therefore, in many studies, experimentalists turn to rooftops to position instrumentation. Experimental validations of source area models for these situations are very limited. In this study, a controlled tracer gas experiment was conducted in a wind tunnel using a 1:200 scale model of a measurement site used in previous experimental work in central London. The detector was set at the location of the rooftop site while the tracer was released at a range of locations within the surrounding streets and rooftops. Concentration measurements are presented for a range of wind angles, with the spread of concentration measurements indicative of the source area distribution. Clear evidence of wind channeling by streets is seen, with the shape of the source area strongly influenced by buildings upwind of the measurement point. The results of the wind tunnel study are compared to scalar concentration source areas generated by modelling approaches based on meteorological data from the central London experimental site and used in the interpretation of continuous carbon dioxide (CO2) concentration data. Initial conclusions are drawn as to how to apply scalar concentration source area models to rooftop measurement sites, with suggestions for their improvement to incorporate effects such as channeling.

  3. POI Summarization by Aesthetics Evaluation From Crowd Source Social Media.

    PubMed

    Qian, Xueming; Li, Cheng; Lan, Ke; Hou, Xingsong; Li, Zhetao; Han, Junwei

    2018-03-01

    Place-of-Interest (POI) summarization by aesthetics evaluation can recommend a set of POI images to the user and is significant in image retrieval. In this paper, we propose a system that summarizes a collection of POI images with regard to both aesthetics and the diversity of the camera distribution. First, we generate visual albums by a coarse-to-fine POI clustering approach and then generate 3D models for each album from the images collected from social media. Second, based on the 3D-to-2D projection relationship, we select candidate photos in terms of the proposed crowd-sourced saliency model. Third, in order to improve the performance of the aesthetic measurement model, we propose a crowd-sourced saliency detection approach that explores the distribution of salient regions in the 3D model. We then measure the composition aesthetics of each image and explore crowd-sourced salient features to yield a saliency map, based on which we propose an adaptive image adoption approach. Finally, we combine diversity and aesthetics to recommend aesthetic pictures. Experimental results show that the proposed POI summarization approach can return images with diverse camera distributions and high aesthetics.

  4. Viking-Age Sails: Form and Proportion

    NASA Astrophysics Data System (ADS)

    Bischoff, Vibeke

    2017-04-01

    Archaeological ship-finds have shed much light on the design and construction of vessels from the Viking Age. However, the exact proportions of their sails remain unknown due to the lack of fully preserved sails or other definite indicators of their proportions. Key Viking-Age ship-finds from Scandinavia—the Oseberg Ship, the Gokstad Ship and Skuldelev 3—have all revealed traces of rigging. In all three finds, the keelson—with the mast position—is preserved, together with fastenings for the sheets and the tack, indicating the breadth of the sail. The sail area can then be estimated based on practical experience of how large a sail the specific ship can carry, in conjunction with hull form and displacement. This article presents reconstructions of the form and dimensions of rigging and sail based on the archaeological finds, evidence from iconographic and written sources, and ethnographic parallels with traditional Nordic boats. When these sources are analysed, not only do the similarities become apparent, but so too does the relative disparity between the archaeological record and the other sources. A preferential judgement of which source is given the greatest weight is therefore required, as it is not possible to afford them all equal value.

  5. Hepatocyte transplantation and advancements in alternative cell sources for liver-based regenerative medicine.

    PubMed

    Lee, Charlotte A; Sinha, Siddharth; Fitzpatrick, Emer; Dhawan, Anil

    2018-06-01

    Human hepatocyte transplantation has been actively pursued as an alternative to liver replacement for acute liver failure and liver-based metabolic defects. Current challenges in this field include a limited cell source, reduced cell viability following cryopreservation, and poor engraftment of cells into the recipient liver with a consequently limited life span. As a result, alternative stem cell sources such as pluripotent stem cells, fibroblasts, hepatic progenitor cells, amniotic epithelial cells and mesenchymal stem/stromal cells (MSCs) can be used to generate induced hepatocyte-like cells (HLCs), with each technique exhibiting advantages and disadvantages. HLCs may have comparable function to primary human hepatocytes and could offer patient-specific treatment. However, the long-term functionality of transplanted HLCs and the potential oncogenic risks of using stem cells have yet to be established. The immunomodulatory effects of MSCs are promising, and multiple clinical trials are investigating their effect in cirrhosis and acute liver failure. Here, we review the current status of hepatocyte transplantation, alternative cell sources to primary human hepatocytes, and their potential in liver regeneration. We also describe recent clinical trials using hepatocytes derived from stem cells and their role in improving the phenotype of several liver diseases.

  6. A moving medium formulation for prediction of propeller noise at incidence

    NASA Astrophysics Data System (ADS)

    Ghorbaniasl, Ghader; Lacor, Chris

    2012-01-01

    This paper presents a time domain formulation for the sound field radiated by bodies moving in a uniform steady flow with arbitrary orientation. The aim is to provide a formulation for predicting noise from a body so that the effects of crossflow on a propeller can be modeled in the time domain. An established theory of noise generation by a moving source is combined with the moving medium Green's function to derive the formulation. A formula with a Doppler factor is developed because it is more easily interpreted and more helpful in examining the physics of the system. Based on the technique presented, the source of asymmetry of the sound field can be explained in terms of the physics of a moving source. It is shown that the derived formulation can be interpreted as an extension of Formulations 1 and 1A of Farassat, based on the Ffowcs Williams and Hawkings (FW-H) equation, to moving medium problems. Computational results for a stationary monopole and dipole point source in a moving medium, a rotating point force in crossflow, a model helicopter blade at incidence, and a propeller with subsonic tips at incidence verify the formulation.
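
    For orientation, the Doppler factor referred to above has the following generic form in retarded-time (Farassat-type) acoustic formulations; this is the standard textbook expression, not the paper's moving-medium derivation:

```latex
% Retarded-time source integral with Doppler amplification 1/|1 - M_r|
p'(\mathbf{x}, t) \;\propto\; \left[ \frac{Q}{r \, \lvert 1 - M_r \rvert} \right]_{\mathrm{ret}},
\qquad M_r = \mathbf{M} \cdot \hat{\mathbf{r}}
```

    Here Q is the source strength, r the source-observer distance, M the source velocity normalized by the speed of sound, and r-hat the unit vector from source to observer; the factor 1/|1 - M_r| amplifies sound radiated in the direction of motion.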

  7. Marine litter from beach-based sources: Case study of an Eastern Mediterranean coastal town.

    PubMed

    Portman, Michelle E; Brennan, Ruth E

    2017-11-01

    Marine litter has been a serious and growing problem for some decades now. Yet there is still much speculation among researchers, policy makers and planners about how to tackle marine litter from land-based sources. This paper provides insights into approaches for managing marine litter by reporting and analyzing survey results on litter dispersal and makeup from three areas along an Arab-Israeli coastal town, in view of other recent studies conducted around the Mediterranean Sea. Based on our results and analysis, we posit that bathing beach activities should be a high-priority point of intervention for waste managers, and that beach-goers must be encouraged to take a more active role in keeping beaches clean. Further, plastic fragments on the beach should be targeted as a first priority for prevention (and cleanup) of marine litter, with plastic bottle caps a high priority among plastics. More survey research is needed on non-plastic litter composition, for which amounts and geographic dispersal vary greatly from place to place along Mediterranean shores. In general, the findings of this study lead us to recommend exploring persuasive beach trash can design coupled with greater enforcement as a short-term waste management intervention, while considering the local socio-economic and institutional context for long-term efforts. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Development of atmospheric N2O isotopomers model based on a chemistry-coupled atmospheric general circulation model

    NASA Astrophysics Data System (ADS)

    Ishijima, K.; Toyoda, S.; Sudo, K.; Yoshikawa, C.; Nanbu, S.; Aoki, S.; Nakazawa, T.; Yoshida, N.

    2009-12-01

    It is well known that isotopic information is useful for qualitatively understanding the cycles and constraining the sources of some atmospheric species, but so far there has been no study modeling N2O isotopomers throughout the atmosphere, from the troposphere to the stratosphere, with realistic surface N2O isotopomer emissions. We have started to develop a model to simulate spatiotemporal variations of atmospheric N2O isotopomers in both the troposphere and the stratosphere, based on a chemistry-coupled atmospheric general circulation model, in order to obtain a more accurate quantitative understanding of the global N2O cycle. For surface emissions of the isotopomers, a combination of EDGAR-based anthropogenic and soil fluxes and monthly varying GEIA oceanic fluxes is used, with isotopic values of the global total sources estimated from the long-term trend of atmospheric N2O isotopomers derived from firn-air analyses. Isotopic fractionation in chemical reactions is considered for photolysis and photo-oxidation of N2O in the stratosphere. The isotopic fractionation coefficients have been taken from laboratory experiments, but we will also test coefficients determined by theoretical calculations. In terms of the global N2O isotopomer budgets, precise quantification of the sources is quite challenging, because even the spatiotemporal variabilities of N2O sources have never been adequately estimated. Therefore, we have first validated the simulated isotopomer results in the stratosphere against isotopomer profiles obtained by balloon observations. N2O concentration profiles are mostly well reproduced, partly because dynamical processes are realistically reproduced by nudging with reanalysis meteorological data. However, the concentration in the polar vortex tends to be overestimated, probably due to the relatively coarse wavelength resolution of the photolysis calculation. Such model features also appear in the isotopomer results, which are generally underestimated relative to the balloon observations, even though the concentration is well simulated. This tendency has been somewhat improved by incorporating another photolysis scheme with slightly higher wavelength resolution into the model. From another point of view, these facts indicate that N2O isotopomers can be used to validate stratospheric photochemical calculations in models, because the isotopomer ratios are highly sensitive to settings such as the wavelength resolution of the photochemical scheme. N2O isotopomer modeling therefore seems useful not only for validating fractionation coefficients and the isotopic characterization of sources, but may also serve as an index of precision for stratospheric photolysis in models.

  9. Application of Geodetic VLBI Data to Obtaining Long-Term Light Curves for Astrophysics

    NASA Technical Reports Server (NTRS)

    Kijima, Masachika

    2010-01-01

    Long-term light curves are important for research on binary black holes and disk instability in AGNs. Such light curves have been drawn mainly using single-dish data provided by the University of Michigan Radio Observatory and the Metsahovi Radio Observatory; hence, research has thus far been limited to the monitored sources. I attempt to draw light curves using VLBI data for sources that have not been monitored by any observatory with a single dish. I developed software, analyzed all geodetic VLBI data available at the IVS Data Centers, and drew light curves at 8 GHz. In this report, I show tentative results for two AGNs. I compared two light curves of 4C39.25, one drawn from single-dish data and one from VLBI data, and confirmed that the two were consistent. Furthermore, I succeeded in drawing the light curve of 0454-234 with VLBI data, a source which has not been monitored by any single-dish observatory. I suggest that geodetic VLBI archive data are useful for obtaining long-term light curves at radio bands for astrophysics.

  10. Short-Term Memory Stages in Sign vs. Speech: The Source of the Serial Span Discrepancy

    PubMed Central

    Hall, Matthew L.

    2011-01-01

    Speakers generally outperform signers when asked to recall a list of unrelated verbal items. This phenomenon is well established, but its source has remained unclear. In this study, we evaluate the relative contributions of the three main processing stages of short-term memory – perception, encoding, and recall – to this effect. The present study factorially manipulates whether American Sign Language (ASL) or English was used for perception, memory encoding, and recall in hearing ASL-English bilinguals. Results indicate that using ASL during both perception and encoding contributes to the serial span discrepancy. Interestingly, performing recall in ASL slightly increased span, ruling out the view that signing is in general a poor choice for short-term memory. These results suggest that despite the general equivalence of sign and speech in other memory domains, speech-based representations are better suited for the specific task of perceiving and encoding a series of unrelated verbal items in serial order through the phonological loop. This work suggests that interpretation of performance on serial recall tasks in English may not translate straightforwardly to serial tasks in sign language. PMID:21450284

  11. Shifting material source of Chinese Loess since ~2.7 Ma reflected by Sr isotopic composition.

    PubMed

    Zhang, Wenfang; Chen, Jun; Li, Gaojun

    2015-05-21

    Deciphering the sources of eolian dust on the Chinese Loess Plateau (CLP) is fundamental to reconstructing paleo-wind patterns and paleo-environmental changes. Existing datasets show contradictory source evolutions of eolian dust on the CLP on both orbital and tectonic timescales. Here, the silicate Sr and Nd isotopic compositions of a restricted grain-size fraction (28-45 μm) were measured to trace the source evolution of the CLP since ~2.7 Ma. Our results reveal a source that is unchanged on orbital timescales but has shifted gradually from the Qilian Mountains to the Gobi Altay Mountains over the past 2.7 Ma. Both tectonic uplift and climate change may have played important roles in this shift. The later uplift of the Gobi Altay Mountains relative to the Qilian Mountains since 5 ± 3 Ma might be responsible for the increasing contribution of Gobi materials to the source deserts in the Alxa arid lands. An enhanced winter monsoon may also have facilitated transport of Gobi materials from the Alxa arid lands to the CLP. The shifting source of Asian dust is also reflected in North Pacific sediments. The finding of this shifting source calls for caution when interpreting long-term climate changes based on source-sensitive proxies of eolian deposits.

  12. Neuroimaging Evidence for Agenda-Dependent Monitoring of Different Features during Short-Term Source Memory Tests

    ERIC Educational Resources Information Center

    Mitchell, Karen J.; Raye, Carol L.; McGuire, Joseph T.; Frankel, Hillary; Greene, Erich J.; Johnson, Marcia K.

    2008-01-01

    A short-term source monitoring procedure with functional magnetic resonance imaging assessed neural activity when participants made judgments about the format of 1 of 4 studied items (picture, word), the encoding task performed (cost, place), or whether an item was old or new. The results support findings from long-term memory studies showing that…

  13. Unstructured Adaptive Meshes: Bad for Your Memory?

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Feng, Hui-Yu; VanderWijngaart, Rob

    2003-01-01

    This viewgraph presentation explores the need for a NASA Advanced Supercomputing (NAS) parallel benchmark for problems with irregular dynamic memory access. Such a benchmark is important and necessary because: 1) problems with localized error sources benefit from adaptive nonuniform meshes; 2) certain machines perform poorly on such problems; 3) parallel implementation may provide further performance improvement but is difficult. Examples of problem components that involve irregular dynamic memory access include: 1) the heat transfer problem; 2) the heat source term; 3) the spectral element method; 4) basis functions; 5) elemental discrete equations; 6) global discrete equations. Nonconforming meshes and the mortar element method are covered in greater detail in this presentation.
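
    The sketch below illustrates the gather/scatter access pattern that makes such problems hard on cache-based machines: element data are reached through a connectivity array rather than contiguously. The mesh and field are synthetic stand-ins, not the benchmark itself.

```python
# Illustrative irregular memory access on an unstructured mesh.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_elems = 10_000, 30_000
temperature = rng.random(n_nodes)

# Each element references 4 arbitrary nodes (e.g., a tetrahedron); after
# mesh adaptation the ordering is scattered, defeating caches and prefetchers.
connectivity = rng.integers(0, n_nodes, size=(n_elems, 4))

# Gather: elementwise average temperature via indirect addressing.
elem_avg = temperature[connectivity].mean(axis=1)

# Scatter-add: accumulate element contributions back onto nodes.
nodal_sum = np.zeros(n_nodes)
np.add.at(nodal_sum, connectivity.ravel(),
          np.repeat(elem_avg, 4))  # one contribution per element corner
print(elem_avg[:3], nodal_sum[:3])
```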

  14. Romanian Experience in The Conditioning of Radium Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dogaru, Gh.; Dragolici, F.; Rotarescu, Gh.

    2008-07-01

    Ra-226, the first radionuclide separated from pitchblende in 1898 by Pierre and Marie Curie, was successfully used in medicine, industry and other fields, being the only available radionuclide until 1940, when other radionuclides were produced in accelerators. In the long term, the use of Ra-226 sealed sources is no longer safe due to: the high specific activity; the long half-life; decay to radon gas, which increases the internal pressure of the capsule, leading in time to leakage; the solubility of the salts used as raw materials from which the sealed sources are manufactured; and the lack of information and records on manufacture and operation. Based on these considerations, the Romanian regulatory authority no longer authorizes the use of these sealed sources [1]. The paper presents aspects of the Romanian experience related to the collection and conditioning of radium sealed sources. Data on the radium inventory, as well as the arrangements made in order to create a workshop for the conditioning of radium sources, are presented. (authors)

  15. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    PubMed

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

    Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from the ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, allowing the testing of thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge of individual proteins.
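
    The keyword-set intersection at the heart of this kind of analysis is easy to sketch; the protein identifiers and keywords below are invented for illustration and do not come from PANDORA itself.

```python
# Illustrative keyword-based set analysis: group proteins by shared
# annotation keywords and inspect intersections of the resulting sets.
annotations = {
    "P1": {"kinase", "membrane", "signal"},
    "P2": {"kinase", "nucleus"},
    "P3": {"membrane", "signal"},
    "P4": {"kinase", "membrane"},
}

# Invert the mapping: keyword -> set of proteins carrying it.
by_keyword = {}
for protein, keywords in annotations.items():
    for kw in keywords:
        by_keyword.setdefault(kw, set()).add(protein)

# Proteins sharing both "kinase" and "membrane" (an intersection subset).
shared = by_keyword["kinase"] & by_keyword["membrane"]
print(sorted(shared))  # ['P1', 'P4']
```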

  16. Toward a common language for biobanking.

    PubMed

    Fransson, Martin N; Rial-Sebbag, Emmanuelle; Brochhausen, Mathias; Litton, Jan-Eric

    2015-01-01

    To encourage the process of harmonization, the biobank community should support and use a common terminology. Relevant terms may be found in general thesauri for medicine, in legal instruments, or in specific glossaries for biobanking. A comparison of the use of these sources has so far not been conducted and would be a useful instrument to further promote harmonization and data sharing. Thus, the purpose of the present study was to investigate the preference of definitions important for sharing biological samples and data. Definitions for 10 terms - [human] biobank, sample/specimen, sample collection, study, aliquot, coded, identifying information, anonymised, personal data and informed consent - were collected from several sources. A web-based questionnaire was sent to 560 European individuals working with biobanks, asking them to select their preferred definition for each term. A total of 123 people participated in the survey, giving a response rate of 23%. The results were evaluated from four aspects: scope of definitions, potential regional differences, differences in semantics, and definitions in the context of ontologies, guided by comments from responders. The survey indicates a risk of focusing only on the research aspect of biobanking in definitions. Hence, it is recommended that important terms be formulated in such a way that all areas of biobanking are covered, to improve the bridges between research and clinical application. Since several of the terms investigated here can also be found in legal contexts, which may differ between countries, establishing how well a definition adheres to the law is also crucial.

  17. NASA thesaurus. Volume 3: Definitions

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Publication of NASA Thesaurus definitions began with Supplement 1 to the 1985 NASA Thesaurus. The definitions given here represent the complete file of over 3,200 definitions, complemented by nearly 1,000 use references. Definitions of more common or general scientific terms are given a NASA slant if one exists. Certain terms are not defined as a matter of policy: common names, chemical elements, specific models of computers, and nontechnical terms. The NASA Thesaurus predates by a number of years the systematic effort to define terms; therefore, not all Thesaurus terms have been defined. Nevertheless, definitions of older terms are continually being added. The following data are provided for each entry: the term in uppercase/lowercase form, the definition, the source, and the year the term (not the definition) was added to the NASA Thesaurus. The NASA History Office is the authority for capitalization in satellite and spacecraft names. Definitions with no source given were constructed by lexicographers at the NASA Scientific and Technical Information (STI) Facility, who rely on the following sources for their information: experts in the field, literature searches of the NASA STI database, and specialized references.

  18. Response of plant community structure and primary productivity to experimental drought and flooding in an Alaskan fen

    Treesearch

    Amber C. Churchill; Merritt R. Turetsky; A. David McGuire; Teresa N. Hollingsworth

    2015-01-01

    Northern peatlands represent a long-term net sink for atmospheric CO2, but these ecosystems can shift from net carbon (C) sinks to sources based on changing climate and environmental conditions. In particular, changes in water availability associated with climate control peatland vegetation and carbon uptake processes. We examined the influence of changing hydrology on...

  19. Policy Challenges of Accelerating Technological Change: Security Policy and Strategy Implications of Parallel Scientific Revolutions

    DTIC Science & Technology

    2014-09-01

    generation, exotic storage technologies, smart power grid management, and better power sources for directed-energy weapons (DEW). Accessible partner nation... near term will help to mitigate risks and improve outcomes. Forecasting typically extrapolates predictions based... eventually, diminished national power. Within this context, this paper examines policy, legal, ethical, and strategy implications for DoD from the impact

  20. Sources of Prescriptions for Misuse by Adolescents: Differences in Sex, Ethnicity, and Severity of Misuse in a Population-Based Study

    ERIC Educational Resources Information Center

    Schepis, Ty S.; Krishnan-Sarin, Suchitra

    2009-01-01

    A study found that adolescents who recently acquired medication by buying it had the worst risk profile among all medication classes in terms of concurrent substance use and the severity of prescription misuse. It is hoped that the findings could help identify subgroups of adolescent prescription misusers who are most vulnerable to consequences…

  1. The Life of Meaning: A Model of the Positive Contributions to Well-Being from Veterinary Work.

    PubMed

    Cake, Martin A; Bell, Melinda A; Bickley, Naomi; Bartram, David J

    2015-01-01

    We present a veterinary model of work-derived well-being, and argue that educators should not only present a (potentially self-fulfilling) stress management model of future wellness, but also balance this with a positive psychology-based approach depicting a veterinary career as a richly generative source of satisfaction and fulfillment. A review of known sources of satisfaction for veterinarians finds them to be based mostly in meaningful purpose, relationships, and personal growth. This positions veterinary well-being within the tradition of eudaimonia, an ancient concept of achieving one's best possible self, and a term increasingly employed to describe well-being derived from living a life that is engaging, meaningful, and deeply fulfilling. The theory of eudaimonia for workplace well-being should inform development of personal resources that foster resilience in undergraduate and graduate veterinarians.

  2. On the inclusion of mass source terms in a single-relaxation-time lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Aursjø, Olav; Jettestuen, Espen; Vinningland, Jan Ludvig; Hiorth, Aksel

    2018-05-01

    We present a lattice Boltzmann algorithm for incorporating a mass source in a fluid flow system. The proposed mass source/sink term, included in the lattice Boltzmann equation, maintains the Galilean invariance and the accuracy of the overall method while introducing a mass source/sink term in the fluid dynamical equations. The method can, for instance, be used to inject or withdraw fluid at any preferred lattice node in a system, which means that injection and withdrawal of fluid do not have to be introduced through cumbersome, and sometimes less accurate, boundary conditions. The method also implies that, through a chosen equation of state relating mass density to pressure, the proposed mass source term makes it possible to set a preferred pressure at any lattice node in a system. We demonstrate how this model handles injection and withdrawal of a fluid, and we show how it can be used to incorporate pressure boundaries. The accuracy of the algorithm is identified through a Chapman-Enskog expansion of the model and supported by numerical simulations.
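
    For context, the sketch below shows the naive way to deposit mass in a standard D2Q9 BGK lattice Boltzmann scheme: adding w_i * q to each distribution at one node. It only illustrates where a source term enters the collide-stream cycle; the paper's actual source term is constructed more carefully so that Galilean invariance and overall accuracy are preserved.

```python
# Minimal D2Q9 BGK lattice Boltzmann sketch with a naive mass source.
import numpy as np

nx, ny, tau, q_in = 64, 64, 0.8, 1e-4
w = np.array([4/9] + [1/9]*4 + [1/36]*4)               # lattice weights
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],
              [1,1],[-1,1],[-1,-1],[1,-1]])            # lattice velocities

f = np.ones((9, nx, ny)) * w[:, None, None]            # fluid at rest, rho = 1

def equilibrium(rho, ux, uy):
    cu = c[:, 0, None, None]*ux + c[:, 1, None, None]*uy
    usq = ux**2 + uy**2
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

for step in range(100):
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f += (equilibrium(rho, ux, uy) - f) / tau          # BGK collision
    f[:, nx//2, ny//2] += w * q_in                     # mass source at one node
    for i in range(9):                                 # streaming
        f[i] = np.roll(f[i], shift=tuple(c[i]), axis=(0, 1))

print("total mass:", f.sum())  # grows by q_in per step
```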

  3. Efficient techniques for wave-based sound propagation in interactive applications

    NASA Astrophysics Data System (ADS)

    Mehra, Ravish

    Sound propagation techniques model the effect of the environment on sound waves and predict their behavior from the point of emission at the source to the final point of arrival at the listener. Sound is a pressure wave produced by mechanical vibration of a surface that propagates through a medium such as air or water, and the problem of sound propagation can be formulated mathematically as a second-order partial differential equation called the wave equation. Accurate techniques based on solving the wave equation, also called wave-based techniques, are very expensive in computation and memory. These techniques therefore face many challenges in terms of their applicability in interactive applications, including sound propagation in large environments, time-varying source and listener directivity, and high simulation cost at mid-frequencies. In this dissertation, we propose a set of efficient wave-based sound propagation techniques that address these three challenges and enable the use of wave-based sound propagation in interactive applications. Firstly, we propose a novel equivalent source technique for interactive wave-based sound propagation in large scenes spanning hundreds of meters. It is based on the equivalent source theory used for solving radiation and scattering problems in acoustics and electromagnetics. Instead of a volumetric or surface-based approach, this technique takes an object-centric approach to sound propagation. The proposed equivalent source technique generates realistic acoustic effects and requires orders of magnitude less runtime memory than prior wave-based techniques. Secondly, we present an efficient framework for handling time-varying source and listener directivity in interactive wave-based sound propagation. The source directivity is represented as a linear combination of elementary spherical harmonic sources; this spherical harmonic representation can support analytical, data-driven, rotating or time-varying directivity functions at runtime. Unlike previous approaches, the listener directivity approach can be used to compute spatial audio (3D audio) for a moving, rotating listener at interactive rates. Lastly, we propose an efficient GPU-based time-domain solver for the wave equation that enables wave simulation up to the mid-frequency range in tens of minutes on a desktop computer. We demonstrate that by carefully mapping all components of the wave simulator to the parallel processing capabilities of graphics processors, significant performance improvement can be achieved compared to CPU-based simulators while maintaining numerical accuracy. We validate these techniques with offline numerical simulations and measured data recorded in an outdoor scene. We present results of preliminary user evaluations conducted to study the impact of these techniques on users' immersion in virtual environments. We have integrated these techniques with the Half-Life 2 game engine, Oculus Rift head-mounted display, and Xbox game controller to enable users to experience high-quality acoustic effects and spatial audio in the virtual environment.
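
    For readers unfamiliar with wave-based techniques, the sketch below shows the essence of a time-domain wave-equation solver in one dimension: a second-order finite-difference update with a soft source, the kind of kernel such a GPU solver parallelizes over a 3D grid. All parameters are illustrative.

```python
# Minimal 1D finite-difference time-domain solver for u_tt = c^2 u_xx.
import numpy as np

c, dx = 343.0, 0.05            # speed of sound (m/s), grid spacing (m)
dt = 0.5 * dx / c              # time step satisfying the CFL condition
n, steps = 400, 600
lam = (c * dt / dx) ** 2       # squared Courant number

u_prev = np.zeros(n)
u = np.zeros(n)
for t in range(steps):
    u_next = np.zeros(n)
    # Standard second-order leapfrog update in the interior.
    u_next[1:-1] = (2*u[1:-1] - u_prev[1:-1]
                    + lam * (u[2:] - 2*u[1:-1] + u[:-2]))
    # Soft source: inject a Gaussian pulse at the domain center.
    u_next[n//2] += np.exp(-((t - 60) / 15.0) ** 2)
    u_prev, u = u, u_next      # boundaries stay zero (rigid ends)

print("peak pressure after propagation:", np.abs(u).max())
```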

  4. Directional Unfolded Source Term (DUST) for Compton Cameras.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Dean J.; Horne, Steven M.; O'Brien, Sean

    2018-03-01

    A Directional Unfolded Source Term (DUST) algorithm was developed to enable improved spectral analysis capabilities using data collected by Compton cameras. Achieving this objective required modification of the detector response function in the Gamma Detector Response and Analysis Software (GADRAS). Experimental data that were collected in support of this work include measurements of calibration sources at a range of separation distances and cylindrical depleted uranium castings.

  5. Polysaccharide-Based Nanobiomaterials as Controlled Release Systems for Tissue Engineering Applications.

    PubMed

    Rodriguez-Velazquez, Eustolia; Alatorre-Meda, Manuel; Mano, Joao F

    2015-01-01

    Polysaccharides belong to a special class of biopolymers that has been used in different areas of research and technology for some years now. They present distinctive features attractive to the biomedical field. Among others, as materials extracted from natural sources, they are usually biocompatible and possess a significant ability to absorb water. Moreover, they can be conveniently modified by chemical means so as to display improved biological and physicochemical properties. Last but not least, they are abundant in the natural Extracellular Matrix (ECM) and have a tremendous affinity for different endogenous macromolecules. Accordingly, these materials constitute outstanding candidates for a variety of biomimetic approaches entailing the entrapment/stabilization of bioactive molecules (e.g., growth factors, siRNA, and DNA) that could be delivered to act on relevant cellular mechanisms, such as gene expression and cell viability, proliferation, and differentiation. This review explores the current status of nano-scale drug delivery devices based on polysaccharides that could be used in tissue engineering and regenerative medicine (TERM). To contextualize the topics discussed here, especially for non-experts in the field, section 1 (Introduction) presents a brief overview of TERM and the principal polysaccharides employed. To give a broader perspective on both issues, this section also includes a brief description of non-nanometric systems with relevant characteristics for TERM, such as injectable microparticles and macroscopic hydrogels. Section 2 illustrates the contributions of nanotechnology to the development of TERM, in particular to the development of biomimetic systems capable of replicating the natural, endogenous ECMs. Sections 3 to 6 then describe representative systems at the nanometric scale presenting 0D (nanoparticles), 1D (nanorods and nanowires), 2D (thin coatings/films or multilayered systems), and 3D (woven nanofibrillar mats and meshes) configurations, respectively. Special attention is paid to how nanometric constructs with these configurations can be used as model systems in TERM to understand and/or manipulate biological functions at the cellular level. Finally, section 7 provides an outlook on future perspectives in the field. Overall, the review is intended to constitute a critical source of information on the current status of polysaccharide-based biomaterials for TERM, in particular those at the nanometric scale.

  6. Erratum to Surface-wave Green's tensors in the near field

    USGS Publications Warehouse

    Haney, Matthew M.; Nakahara, Hisashi

    2016-01-01

    Haney and Nakahara (2014) derived expressions for surface‐wave Green’s tensors that included near‐field behavior. Building on the result for a force source, Haney and Nakahara (2014) further derived expressions for a general point moment tensor source using the exact Green’s tensors. However, it has come to our attention that, although the Green’s tensors were correct, the resulting expressions for a general point moment tensor source were missing some terms. In this erratum, we provide updated expressions with these missing terms. The inclusion of the missing terms changes the example given in Haney and Nakahara (2014).

  7. Matrix effect and recovery terminology issues in regulated drug bioanalysis.

    PubMed

    Huang, Yong; Shi, Robert; Gee, Winnie; Bonderud, Richard

    2012-02-01

    Understanding the meaning of the terms used in the bioanalytical method validation guidance is essential for practitioners to implement best practice. However, terms that have several meanings or that have different interpretations exist within bioanalysis, and this may give rise to differing practices. In this perspective we discuss an important but often confusing term - 'matrix effect (ME)' - in regulated drug bioanalysis. The ME can be interpreted as either the ionization change or the measurement bias of the method caused by the nonanalyte matrix. The ME definition dilemma makes its evaluation challenging. The matrix factor is currently used as a standard method for evaluation of ionization changes caused by the matrix in MS-based methods. Standard additions to pre-extraction samples have been suggested to evaluate the overall effects of a matrix from different sources on the analytical system, because it covers ionization variation and extraction recovery variation. We also provide our personal views on the term 'recovery'.
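
    For reference, the matrix factor discussed above is conventionally computed as a ratio of peak responses, often normalized by the internal standard; a minimal sketch with hypothetical peak areas:

        def matrix_factor(area_in_matrix: float, area_neat: float) -> float:
            """Matrix factor (MF): analyte peak response in post-extraction
            spiked matrix divided by the response in a neat solution."""
            return area_in_matrix / area_neat

        def is_normalized_mf(mf_analyte: float, mf_internal_std: float) -> float:
            """IS-normalized MF, as commonly reported in MS-based validation."""
            return mf_analyte / mf_internal_std

        # Hypothetical peak areas; MF < 1 indicates ionization suppression.
        print(matrix_factor(8.2e5, 1.0e6))    # -> 0.82
        print(is_normalized_mf(0.82, 0.85))   # -> ~0.96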

  8. Predictable Unpredictability: the Problem with Basing Medicare Policy on Long-Term Financial Forecasting.

    PubMed

    Glied, Sherry; Zaylor, Abigail

    2015-07-01

    The authors assess how Medicare financing and projections of future costs have changed since 2000. They also assess the impact of legislative reforms on the sources and levels of financing and compare cost forecasts made at different times. Although the aging U.S. population and rising health care costs are expected to increase the share of gross domestic product devoted to Medicare, changes made in the program over the past decade have helped stabilize Medicare's financial outlook--even as benefits have been expanded. Long-term forecasting uncertainty should make policymakers and beneficiaries wary of dramatic changes to the program in the near term that are intended to alter its long-term forecast: the range of error associated with cost forecasts rises as the forecast window lengthens. Instead, policymakers should focus on the immediate policy window, taking steps to reduce the current burden of Medicare costs by containing spending today.

  9. Time-Limited Psychotherapy With Adolescents

    PubMed Central

    Shefler, Gaby

    2000-01-01

    Short-term dynamic therapies, characterized by abbreviated lengths (10–40 sessions) and, in many cases, preset termination dates, have become more widespread in the past three decades. Short-term therapies are based on rapid psychodynamic diagnosis, a therapeutic focus, a rapidly formed therapeutic alliance, awareness of termination and separation processes, and the directive stance of the therapist. The emotional storm of adolescence, stemming from both developmental and psychopathological sources, leaves many adolescents in need of psychotherapy. Many adolescents in need of therapy resist long-term attachment and involvement in an ambiguous relationship, which they experience as a threat to their emerging sense of independence and separateness. Short-term dynamic therapy can be the treatment of choice for many adolescents because it minimizes these threats and is more responsive to their developmental needs. The article presents treatment and follow-up of a 17-year-old youth, using James Mann's time-limited psychotherapy method. PMID:10793128

  10. Overview of the gaps in the health care legislation in Georgia: short-, medium-, and long-term priorities.

    PubMed

    Kiknadze, Nino; Beletsky, Leo

    2013-12-12

    After gaining independence following the dissolution of the Soviet Union, Georgia has aspired to become the region's leader in progressive legal reform. Particularly in the realm of health care regulation, Georgia has proceeded with extensive legislative reforms intended to modernize its health care system, and bring it in line with international standards. As part of a larger project to improve human rights in patient care, we conducted a study designed to identify gaps in the current Georgian health care legislation. Using a cross-site research framework based on the European Charter of Patients’ Rights, an interdisciplinary working group oversaw a comprehensive review of human rights legislation pertinent to health care settings using various sources, such as black letter law, expert opinions, court cases, research papers, reports, and complaints. The study identified a number of serious inconsistencies, gaps, and conflicts in the definition and coverage of terms used in the national legislative canon pertinent to human rights in patient care. These include inconsistent definitions of key terms "informed consent" and "medical malpractice" across the legislative landscape. Imprecise and overly broad drafting of legislation has left concepts like patient confidentiality and implied consent wide open to abuse. The field of health care provider rights was entirely missing from existing Georgian legislation. To our knowledge, this is the first study of its kind in Georgia. Gaps and inconsistencies uncovered were categorized based on a short-, medium-, and long-term action framework. Results were presented to key decision makers in Georgian ministerial and legislative institutions. Several of the major recommendations are currently being considered for inclusion into future legal reform.

  11. Quantifying Transmission of Clostridium difficile within and outside Healthcare Settings

    PubMed Central

    Olsen, Margaret A.; Dubberke, Erik R.; Galvani, Alison P.; Townsend, Jeffrey P.

    2016-01-01

    To quantify the effect of hospital and community-based transmission and control measures on Clostridium difficile infection (CDI), we constructed a transmission model within and between hospital, community, and long-term care-facility settings. By parameterizing the model from national databases and calibrating it to C. difficile prevalence and CDI incidence, we found that hospitalized patients with CDI transmit C. difficile at a rate 15 (95% CI 7.2–32) times that of asymptomatic patients. Long-term care facility residents transmit at a rate of 27% (95% CI 13%–51%) that of hospitalized patients, and persons in the community at a rate of 0.1% (95% CI 0.062%–0.2%) that of hospitalized patients. Despite lower transmission rates for asymptomatic carriers and community sources, these transmission routes have a substantial effect on hospital-onset CDI because of the larger reservoir of hospitalized carriers and persons in the community. Asymptomatic carriers and community sources should be accounted for when designing and evaluating control interventions. PMID:26982504
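
    A heavily simplified sketch of this kind of compartmental transmission model (our toy formulation with hypothetical parameters, not the paper's fitted model):

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy two-setting colonization model: susceptible/colonized fractions in
        # hospital (h) and community (c), with hospital cases transmitting at a
        # much higher rate. All parameter values are hypothetical.
        beta_h, beta_c = 0.50, 0.0005  # transmission rates (per day)
        gamma = 0.10                   # clearance rate (per day)

        def rhs(t, y):
            s_h, i_h, s_c, i_c = y
            new_h = beta_h * s_h * i_h + beta_c * s_h * i_c  # hospital acquisitions
            new_c = beta_c * s_c * (i_h + i_c)               # community acquisitions
            return [-new_h + gamma * i_h, new_h - gamma * i_h,
                    -new_c + gamma * i_c, new_c - gamma * i_c]

        sol = solve_ivp(rhs, (0.0, 365.0), [0.95, 0.05, 0.999, 0.001], max_step=1.0)
        print(sol.y[1, -1], sol.y[3, -1])  # colonized fractions after one year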

  12. Satellite Remote Sensing: Aerosol Measurements

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph A.

    2013-01-01

    Aerosols are solid or liquid particles suspended in the air, and those observed by satellite remote sensing are typically between about 0.05 and 10 microns in size. (Note that in traditional aerosol science, the term "aerosol" refers to both the particles and the medium in which they reside, whereas for remote sensing, the term commonly refers to the particles only. In this article, we adopt the remote-sensing definition.) They originate from a great diversity of sources, such as wildfires, volcanoes, soils and desert sands, breaking waves, natural biological activity, agricultural burning, cement production, and fossil fuel combustion. They typically remain in the atmosphere from several days to a week or more, and some travel great distances before returning to Earth's surface via gravitational settling or washout by precipitation. Many aerosol sources exhibit strong seasonal variability, and most experience inter-annual fluctuations. As such, the frequent, global coverage that space-based aerosol remote-sensing instruments can provide is making increasingly important contributions to regional and larger-scale aerosol studies.

  13. Monitoring X-Ray Emission from X-Ray Bursters

    NASA Technical Reports Server (NTRS)

    Halpern, Jules P.; Kaaret, Philip

    1999-01-01

    The scientific goal of this project was to monitor a selected sample of x-ray bursters using data from the All-Sky Monitor (ASM) on the Rossi X-Ray Timing Explorer together with data from the Burst and Transient Source Experiment (BATSE) on the Compton Gamma-Ray Observatory to study the long-term temporal evolution of these sources in the x-ray and hard x-ray bands. The project was closely related to "Long-Term Hard X-Ray Monitoring of X-Ray Bursters", NASA project NAG5-3891, and "Hard x-ray emission of x-ray bursters", NASA project NAG5-4633, and shares publications in common with both of these. The project involved preparation of software for use in monitoring and then the monitoring itself. These efforts have led to results directly from the ASM data and also from Target of Opportunity Observations (TOO) made with the Rossi X-Ray Timing Explorer based on detection of transient hard x-ray outbursts with the ASM and BATSE.

  14. Antimatter Production for Near-Term Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Schmidt, G. R.; Gerrish, H. P.; Martin, J. J.; Smith, G. A.; Meyer, K. J.

    1999-01-01

    The superior energy density of antimatter annihilation has often been pointed to as the ultimate source of energy for propulsion. However, the limited capacity and very low efficiency of present-day antiproton production methods suggest that antimatter may be too costly to consider for near-term propulsion applications. We address this issue by assessing the antimatter requirements for six different types of propulsion concepts, including two in which antiprotons are used to drive energy release from combined fission/fusion. These requirements are compared against the capacity of both the current antimatter production infrastructure and the improved capabilities that could exist within the early part of the next century. Results show that although it may be impractical to consider systems that rely on antimatter as the sole source of propulsive energy, the requirements for propulsion based on antimatter-assisted fission/fusion do fall within projected near-term production capabilities. In fact, such systems could feasibly support interstellar precursor missions and omniplanetary spaceflight with antimatter costs ranging up to $60 million per mission.
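
    For scale, the energy released when a mass m of antimatter annihilates with an equal mass of ordinary matter follows directly from mass-energy equivalence; a milligram-level worked example (our numbers, not the paper's):

        E = 2 m c^2, \qquad
        E_{1\,\mathrm{mg}} = 2 \times (10^{-6}\,\mathrm{kg})
            \times (2.998 \times 10^{8}\,\mathrm{m/s})^2
            \approx 1.8 \times 10^{11}\,\mathrm{J},

    i.e., roughly 43 tons of TNT equivalent per milligram of antimatter, which is why production capacity rather than energy content dominates the economics.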

  15. Possible consequences of severe accidents at the Lubiatowo site, Poland

    NASA Astrophysics Data System (ADS)

    Seibert, Petra; Philipp, Anne; Hofman, Radek; Gufler, Klaus; Sholly, Steven

    2014-05-01

    The construction of a nuclear power plant is under consideration in Poland. One of the sites under discussion is near Lubiatowo, located on the coast of the Baltic Sea northwest of Gdansk. An assessment of possible environmental consequences is carried out for 88 real meteorological cases with the Lagrangian particle dispersion model FLEXPART. Based on a literature review, three reactor designs (ABWR, EPR, AP1000) were identified as being under discussion in Poland. For each of the designs, a set of accident scenarios was evaluated and two source terms per reactor design were selected for analysis. One of the selected source terms was a relatively large release, while the second was a severe accident with an intact containment. The endpoints considered in the calculations are ground contamination with Cs-137 and time-integrated concentrations of I-131 in air, as well as committed doses. They are evaluated on a grid of ca. 3 km mesh size covering eastern Central Europe.

  16. Integral representations of solutions of the wave equation based on relativistic wavelets

    NASA Astrophysics Data System (ADS)

    Perel, Maria; Gorodnitskiy, Evgeny

    2012-09-01

    A representation of solutions of the wave equation with two spatial coordinates in terms of localized elementary solutions is presented. The elementary solutions are constructed from four basic solutions with the help of transformations of the affine Poincaré group, i.e., translations, dilations in space and time, and Lorentz transformations. The representation can be interpreted in terms of the initial-boundary value problem for the wave equation in a half-plane. It gives the solution as an integral over two types of solutions: propagating localized solutions running away from the boundary at different angles, and packet-like surface waves running along the boundary and decreasing exponentially away from it. Properties of the elementary solutions are discussed. A numerical investigation of the decomposition coefficients is carried out. An example of the decomposition of the field created by sources moving along a line with different speeds is considered, and the dependence of the coefficients on the speeds of the sources is discussed.
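
    For readers unfamiliar with the affine Poincaré group, here is a sketch of the group actions in our own notation (not the authors'): if u(x, y, t) solves the 2D wave equation

        \frac{1}{c^2}\,\partial_t^2 u - \partial_x^2 u - \partial_y^2 u = 0,

    then so do its translates, dilates, and Lorentz boosts,

        u(x - a,\; y - b,\; t - t_0), \qquad
        u(x/\sigma,\; y/\sigma,\; t/\sigma) \;\; (\sigma > 0), \qquad
        u\big(\gamma(x - vt),\; y,\; \gamma(t - vx/c^2)\big), \;\;
        \gamma = (1 - v^2/c^2)^{-1/2},

    so a whole family of localized elementary solutions can be generated from a few mother solutions.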

  17. IMPROVEMENTS IN THE THERMAL NEUTRON CALIBRATION UNIT, TNF2, AT LNMRI/IRD.

    PubMed

    Astuto, A; Fernandes, S S; Patrão, K C S; Fonseca, E S; Pereira, W W; Lopes, R T

    2018-02-21

    The standard thermal neutron flux unit, TNF2, at the Brazilian National Ionizing Radiation Metrology Laboratory was rebuilt. Fluence is still achieved by moderating four 241Am-Be sources of 0.6 TBq each. The facility was re-simulated and redesigned with a graphite core surrounded by paraffin blocks with added graphite. Simulations using the MCNPX code on different geometric arrangements of moderator materials and neutron sources were performed, and the resulting neutron fluence quality was evaluated in terms of intensity, spectrum, and cadmium ratio. The system was then assembled based on the results of the simulations, and measurements were performed with instruments available at LNMRI/IRD as well as with simulated instruments. This work focuses on the characterization of a central chamber point and of external points around TNF2 in terms of neutron spectrum, fluence, and ambient dose equivalent, H*(10). The system was validated with spectrum, fluence, and H*(10) measurements to ensure traceability.
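
    The cadmium ratio quoted above as a quality measure is conventionally defined as the detector response measured bare divided by the response measured under a cadmium cover, which absorbs the thermal component:

        R_{\mathrm{Cd}} = \frac{M_{\mathrm{bare}}}{M_{\mathrm{Cd}}},

    so larger values of R_Cd indicate a more thoroughly thermalized field.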

  18. Spitzer Opens New Path to Break Classic Degeneracy for Jupiter-mass Microlensing Planet OGLE-2017-BLG-1140Lb

    NASA Astrophysics Data System (ADS)

    Calchi Novati, S.; Skowron, J.; Jung, Y. K.; Beichman, C.; Bryden, G.; Carey, S.; Gaudi, B. S.; Henderson, C. B.; Shvartzvald, Y.; Yee, J. C.; Zhu, W.; Spitzer Team; Udalski, A.; Szymański, M. K.; Mróz, P.; Poleski, R.; Soszyński, I.; Kozłowski, S.; Pietrukowicz, P.; Ulaczyk, K.; Pawlak, M.; Rybicki, K.; Iwanek, P.; OGLE Collaboration; Albrow, M. D.; Chung, S.-J.; Gould, A.; Han, C.; Hwang, K.-H.; Ryu, Y.-H.; Shin, I.-G.; Zang, W.; Cha, S.-M.; Kim, D.-J.; Kim, H.-W.; Kim, S.-L.; Lee, C.-U.; Lee, D.-J.; Lee, Y.; Park, B.-G.; Pogge, R. W.; KMTNet Collaboration

    2018-06-01

    We analyze the combined Spitzer and ground-based data for OGLE-2017-BLG-1140 and show that the event was generated by a Jupiter-class (m_p ≃ 1.6 M_Jup) planet orbiting a mid-late M dwarf (M ≃ 0.2 M_⊙) that lies D_LS ≃ 1.0 kpc in the foreground of the microlensed Galactic-bar source star. The planet-host projected separation is a_⊥ ≃ 1.0 au, i.e., well beyond the snow line. By measuring the source proper motion μ_s from ongoing long-term OGLE imaging and combining this with the lens-source relative proper motion μ_rel derived from the microlensing solution, we show that the lens proper motion μ_l = μ_rel + μ_s is consistent with the lens lying in the Galactic disk, although a bulge lens is not ruled out. We show that while the Spitzer and ground-based data are comparably well fitted by planetary (i.e., binary-lens (2L1S)) and binary-source (1L2S) models, the combination of Spitzer and ground-based data decisively favors the planetary model. This is a new channel for resolving the 2L1S/1L2S degeneracy, which can be difficult to break in some cases.

  19. Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations

    NASA Astrophysics Data System (ADS)

    Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang

    2017-09-01

    Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI) conducted by the International Atomic Energy Agency (IAEA) on 18-20 November, 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from this workshop as well as several new contributions. A total of 17 papers have been selected on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's functions approaches) to the engineering application of simulated ground motion for the analysis of seismic response of structures. These contributions include applications to real earthquakes and descriptions of current practice for assessing seismic hazard in terms of nuclear safety in low-seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers of this volume highlight the usefulness of physics-based models for evaluating and understanding the physical causes of observed and empirical data, as well as for predicting ground motion beyond the range of recorded data. Particular importance is given to the validation and verification of the models by comparing synthetic results with observed data and empirical models.

  20. Development of Load Duration Curve System in Data Scarce Watersheds Based on a Distributed Hydrological Model

    NASA Astrophysics Data System (ADS)

    WANG, J.

    2017-12-01

    In stream water quality control, the total maximum daily load (TMDL) program is very effective. However, the load duration curves (LDC) required by TMDL are difficult to establish in data-scarce watersheds, where no hydrological stations or long-term consecutive hydrological records are available. Moreover, although point and non-point pollutant sources can be distinguished easily with the aid of an LDC, the LDC itself cannot trace where a pollutant originates in the watershed or where it will be transported. To identify the best management practices (BMPs) for pollutants in a watershed, and to overcome these limitations of the LDC, we propose developing the LDC on the basis of the distributed hydrological model SWAT for water quality management in data-scarce river basins. In this study, the SWAT model was first established with the scarce hydrological data. Long-term daily flows were then generated with the established SWAT model and rainfall data from the adjacent weather station, and a flow duration curve (FDC) was developed from the generated daily flows. Given the goals of water quality management, LDC curves for different pollutants can then be obtained from the FDC. With the monitored water quality data and the LDC curves, the water quality problems caused by point or non-point source pollutants in different seasons can be identified. Finally, the SWAT model was employed again to trace the spatial distribution and origin of the pollutants, i.e., the agricultural practices and/or other human activities from which they arise. A case study was conducted in the Jian-jiang river, a tributary of the Yangtze river, in Duyun city, Guizhou province. Results indicate that this method can support water quality management based on TMDL and identify suitable BMPs for reducing pollutants in a watershed.
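
    As a sketch of the FDC-to-LDC step described above (a simplified formulation, with synthetic flows standing in for the SWAT output and a hypothetical water-quality standard):

        import numpy as np

        # Flow duration curve: sort daily flows in descending order and assign
        # exceedance probabilities (Weibull plotting positions).
        flows = np.sort(np.random.default_rng(1).lognormal(2.0, 1.0, 3650))[::-1]  # m3/s
        exceedance = np.arange(1, flows.size + 1) / (flows.size + 1.0)

        # Load duration curve: allowable load at each flow for a hypothetical
        # standard; kg/day = (m3/s) * (mg/L) * 86.4 (unit conversion).
        c_std = 10.0  # water-quality standard (mg/L)
        ldc = flows * c_std * 86.4

        # Monitored loads plotting above the LDC at low exceedance (wet, high-flow
        # conditions) suggest non-point sources; exceedances at high exceedance
        # (dry, low-flow conditions) point to point sources.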
