Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, D.; Brunett, A.; Passerini, S.
Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne) and is funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence-specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.
Wang, Liqin; Bray, Bruce E.; Shi, Jianlin; Fiol, Guilherme Del; Haug, Peter J.
2017-01-01
Objective: Disease-specific vocabularies are fundamental to many knowledge-based intelligent systems and applications such as text annotation, cohort selection, disease diagnostic modeling, and therapy recommendation. Reference standards are critical in the development and validation of automated methods for disease-specific vocabularies. The goal of the present study is to design and test a generalizable method for the development of vocabulary reference standards from expert-curated, disease-specific biomedical literature resources. Methods: We formed disease-specific corpora from literature resources such as textbooks, evidence-based synthesized online sources, clinical practice guidelines, and journal articles. Medical experts annotated and adjudicated disease-specific terms in four classes (i.e., causes or risk factors, signs or symptoms, diagnostic tests or results, and treatment). Annotations were mapped to UMLS concepts. We assessed source variation, the contribution of each source to building disease-specific vocabularies, the saturation of the vocabularies with respect to the number of sources used, and the generalizability of the method across diseases. Results: The study resulted in 2588 string-unique annotations for heart failure in four classes, and 193 and 425, respectively, for pulmonary embolism and rheumatoid arthritis in the treatment class. Approximately 80% of the annotations were mapped to UMLS concepts. The agreement among heart failure sources ranged between 0.28 and 0.46. The contribution of these sources to the final vocabulary ranged between 18% and 49%. With the sources explored, the heart failure vocabulary reached near saturation in all four classes with the inclusion of a minimum of six sources (or four to seven sources if only counting terms that occurred in two or more sources). Fewer sources were needed to reach near saturation for the other two diseases in the treatment class. Conclusions: We developed a method for the development of disease-specific reference vocabularies. Expert-curated biomedical literature resources are a substantial source of disease-specific medical knowledge. It is feasible to reach near saturation in a disease-specific vocabulary using a relatively small number of literature sources. PMID:26971304
Uncertainty, variability, and earthquake physics in ground‐motion prediction equations
Baltay, Annemarie S.; Hanks, Thomas C.; Abrahamson, Norm A.
2017-01-01
Residuals between ground‐motion data and ground‐motion prediction equations (GMPEs) can be decomposed into terms representing earthquake source, path, and site effects. These terms can be cast in terms of repeatable (epistemic) residuals and the random (aleatory) components. Identifying the repeatable residuals leads to a GMPE with reduced uncertainty for a specific source, site, or path location, which in turn can yield a lower hazard level at small probabilities of exceedance. We illustrate a schematic framework for this residual partitioning with a dataset from the ANZA network, which straddles the central San Jacinto fault in southern California. The dataset consists of more than 3200 1.15≤M≤3 earthquakes and their peak ground accelerations (PGAs), recorded at close distances (R≤20 km). We construct a small‐magnitude GMPE for these PGA data, incorporating VS30 site conditions and geometrical spreading. Identification and removal of the repeatable source, path, and site terms yield an overall reduction in the standard deviation from 0.97 (in ln units) to 0.44, for a nonergodic assumption, that is, for a single‐source location, single site, and single path. We give examples of relationships between independent seismological observables and the repeatable terms. We find a correlation between location‐based source terms and stress drops in the San Jacinto fault zone region; an explanation of the site term as a function of kappa, the near‐site attenuation parameter; and a suggestion that the path component can be related directly to elastic structure. These correlations allow the repeatable source location, site, and path terms to be determined a priori using independent geophysical relationships. Those terms could be incorporated into location‐specific GMPEs for more accurate and precise ground‐motion prediction.
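For readers unfamiliar with this decomposition, it can be written schematically as follows; the mixed-effects notation is a common convention assumed here, not the paper's own symbols:

```latex
% Partition of a single observation's residual into repeatable and random parts
\ln Y_{es} - \ln \mu_{\mathrm{GMPE}}(M_e, R_{es}, V_{S30,s})
  = \underbrace{\delta B_e}_{\text{source}}
  + \underbrace{\delta S2S_s}_{\text{site}}
  + \underbrace{\delta P2P_{es}}_{\text{path}}
  + \varepsilon_{es},
\qquad
\sigma_{\mathrm{ergodic}} = \sqrt{\tau^{2} + \phi_{S2S}^{2} + \phi_{P2P}^{2} + \phi_{0}^{2}}
\;\longrightarrow\;
\sigma_{\mathrm{nonergodic}} = \phi_{0}.
```

Fixing the source location, site, and path removes the repeatable terms and their variances, which corresponds to the reduction from 0.97 to 0.44 (ln units) reported above.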
High-order scheme for the source-sink term in a one-dimensional water temperature model
Jing, Zheng; Kang, Ling
2017-01-01
The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method were assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data. PMID:28264005
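A minimal sketch of such a two-step operator splitting is given below; the paper's specific high-order, undetermined-coefficient scheme is not reproduced, so a classical fourth-order Runge-Kutta update of the source-sink term stands in for it, followed by a Crank-Nicolson diffusion step. The absorption profile and material constants are illustrative assumptions.

```python
import numpy as np

def rk4_source_step(T, S, t, dt):
    """Step 1: integrate dT/dt = S(T, t) with classical RK4 (source-sink term)."""
    k1 = S(T, t)
    k2 = S(T + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = S(T + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = S(T + dt * k3, t + dt)
    return T + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def crank_nicolson_step(T, alpha, dz, dt):
    """Step 2: implicit Crank-Nicolson update of dT/dt = alpha * d2T/dz2 (zero-flux ends)."""
    n = T.size
    r = alpha * dt / (2.0 * dz ** 2)
    A = (1 + 2 * r) * np.eye(n) - r * np.eye(n, k=1) - r * np.eye(n, k=-1)
    B = (1 - 2 * r) * np.eye(n) + r * np.eye(n, k=1) + r * np.eye(n, k=-1)
    A[0, 1] = A[-1, -2] = -2 * r   # mirror (insulated) boundary nodes
    B[0, 1] = B[-1, -2] = 2 * r
    return np.linalg.solve(A, B @ T)

# Toy run: exponentially absorbed solar heating of a 10 m water column.
z = np.linspace(0.0, 10.0, 101)
T = np.full_like(z, 15.0)                        # deg C
solar = lambda T_, t: 1e-4 * np.exp(-0.5 * z)    # K/s, hypothetical absorption profile
for step in range(600):                          # one hour with dt = 6 s
    T = rk4_source_step(T, solar, step * 6.0, 6.0)
    T = crank_nicolson_step(T, alpha=1.4e-7, dz=z[1] - z[0], dt=6.0)
```

Splitting lets the source-sink term be integrated with a higher-order method while keeping the unconditionally stable Crank-Nicolson update for the diffusion term.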
NASA Astrophysics Data System (ADS)
Conti, P.; Testi, D.; Grassi, W.
2017-11-01
This work reviews and compares suitable models for the thermal analysis of forced convection over a heat source in a porous medium. The set of available models refers to an infinite medium in which a fluid moves past three different heat source geometries: the moving infinite line source, the moving finite line source, and the moving infinite cylindrical source. The work provides a plain and handy compendium of these models for forced external convection in porous media; in addition, a dimensionless analysis is proposed to quantify the deviations among the available models and to help select the most suitable one for the specific case of interest. Under specific conditions, the advection term becomes negligible for heat transfer performance, allowing the use of purely conductive models. For that reason, available analytical and numerical solutions for purely conductive media are also reviewed and compared, again by dimensionless criteria, so that one can choose the simplest solution, with significant benefits in terms of computational effort and interpretation of the results. The main outcomes presented in the paper are: the conditions under which the system can be considered subject to a Darcy flow, the minimal distance beyond which the finite dimension of the heat source does not affect the thermal field, and the critical fluid velocity needed for the advection term to contribute significantly to the overall heat transfer process.
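Of the three geometries listed, the moving infinite line source has the simplest closed form. The sketch below evaluates its classical steady-state solution with SciPy; the symbol names and numerical values (source strength q_l, effective conductivity lam, thermal diffusivity a, effective heat-transport velocity u_t) are chosen here for illustration and are not the paper's notation.

```python
import numpy as np
from scipy.special import k0

def mils_steady(x, y, q_l, lam, a, u_t):
    """Steady-state temperature rise (K) at (x, y) from an infinite line source of
    strength q_l (W/m) in a saturated porous medium with effective conductivity
    lam (W/m/K), thermal diffusivity a (m^2/s), and effective heat-transport
    velocity u_t (m/s). Flow is in the +x direction."""
    r = np.hypot(x, y)
    return q_l / (2.0 * np.pi * lam) * np.exp(u_t * x / (2.0 * a)) * k0(u_t * r / (2.0 * a))

# Downstream vs. upstream temperature rise 2 m from the source (illustrative values).
print(mils_steady( 2.0, 0.0, q_l=50.0, lam=2.5, a=1.0e-6, u_t=1.0e-7))
print(mils_steady(-2.0, 0.0, q_l=50.0, lam=2.5, a=1.0e-6, u_t=1.0e-7))
```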
NASA Astrophysics Data System (ADS)
García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.
2007-10-01
A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is calculated here in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although the contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
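The logic-tree step described above amounts to weighting one hazard curve per branch (three source zonings times three ground-motion models) and reading design values off the combined curve; a toy sketch with synthetic curves and equal weights assumed here is:

```python
import numpy as np

pga = np.logspace(-2, 0.5, 60)                    # g
rng = np.random.default_rng(0)
# Synthetic annual exceedance curves, one per logic-tree branch (3 zonings x 3 GMPEs).
branches = [np.exp(-pga / s) * 0.5 for s in rng.uniform(0.05, 0.15, size=9)]
weights = np.full(9, 1.0 / 9.0)                   # equal branch weights assumed

mean_curve = np.average(branches, axis=0, weights=weights)
target = 1.0 / 475.0                              # ~10% exceedance in 50 years
pga_475 = np.interp(np.log(target), np.log(mean_curve[::-1]), pga[::-1])
print(f"475-yr PGA on the weighted-mean curve: {pga_475:.3f} g")
```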
ERIC Educational Resources Information Center
Mylonas, Kostas; Furnham, Adrian; Divale, William; Leblebici, Cigdem; Gondim, Sonia; Moniz, Angela; Grad, Hector; Alvaro, Jose Luis; Cretu, Romeo Zeno; Filus, Ania; Boski, Pawel
2014-01-01
Several sources of bias can plague research data and individual assessment. When cultural groups are considered, across or even within countries, it is essential that the constructs assessed and evaluated are as free as possible from any source of bias and specifically from bias caused due to culturally specific characteristics. Employing the…
Apprentices and Trainees: Terms and Definitions. Support Document
ERIC Educational Resources Information Center
National Centre for Vocational Education Research (NCVER), 2017
2017-01-01
This document covers the data terms used in publications sourced from the National Apprentice and Trainee Collection and their associated data tables. The primary purpose of this document is to assist users of the publications to understand the specific data terms used within them. Terms are listed in alphabetical order with the following…
Students and Courses--Terms and Definitions. Support Document
ERIC Educational Resources Information Center
National Centre for Vocational Education Research (NCVER), 2014
2014-01-01
This document covers the data terms used in publications sourced from the National VET Provider Collection and their associated data tables. The primary purpose of this document is to assist users of the publication to understand the specific data terms used within them. Terms that appear in the publications and data items are listed in…
Murakami, Toshiki; Suzuki, Yoshihiro; Oishi, Hiroyuki; Ito, Kenichi; Nakao, Toshio
2013-05-15
A unique method to trace the source of "difficult-to-settle fine particles," which are a causative factor of long-term turbidity in reservoirs, was developed. This method is characterized by cluster analysis of XRD (X-ray diffraction) data and homology comparison of major component compositions between "difficult-to-settle fine particles" contained in landslide soil samples taken from upstream of a dam and suspended "long-term turbid water particles" in the reservoir, which is subject to long-term turbidity. The experiment carried out to validate the proposed method demonstrated a high likelihood of making an almost identical match between "difficult-to-settle fine particles" taken from landslide soils at specific locations and "long-term turbid water particles" taken from a reservoir. This method has the potential to determine the substances causing long-term turbidity and the locations of the soils from which those substances came. Appropriate countermeasures can then be taken at those specific locations. Copyright © 2013 Elsevier Ltd. All rights reserved.
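The cluster-analysis step can be illustrated with a short sketch in which XRD intensity patterns from candidate landslide soils and from suspended reservoir particles are compared using a correlation distance and grouped hierarchically; the synthetic data, distance metric, and linkage choice below are assumptions for illustration, not the paper's settings.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
two_theta = np.linspace(5, 70, 650)
# Four hypothetical landslide-soil XRD patterns, each with one dominant peak.
base = np.exp(-((two_theta[None, :] - rng.uniform(20, 30, (4, 1))) ** 2) / 0.5)
soils = base + 0.02 * rng.standard_normal((4, 650))
# Suspended "long-term turbid water particles" assumed to come from the third soil.
turbid = base[2] + 0.02 * rng.standard_normal(650)

patterns = np.vstack([soils, turbid])
dist = pdist(patterns, metric="correlation")                 # 1 - Pearson r
groups = fcluster(linkage(dist, method="average"), t=0.2, criterion="distance")
print(groups)   # the turbid-water sample falls in the same cluster as soil index 2
```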
Ahlfors, Seppo P.; Jones, Stephanie R.; Ahveninen, Jyrki; Hämäläinen, Matti S.; Belliveau, John W.; Bar, Moshe
2014-01-01
Identifying inter-area communication in terms of the hierarchical organization of functional brain areas is of considerable interest in human neuroimaging. Previous studies have suggested that the direction of magneto- and electroencephalography (MEG, EEG) source currents depends on the layer-specific input patterns into a cortical area. We examined the direction of MEG source currents in a visual object recognition experiment in which there were specific expectations of activation in the fusiform region being driven by either feedforward or feedback inputs. The source for the early non-specific visual evoked response, presumably corresponding to feedforward-driven activity, pointed outward, i.e., away from the white matter. In contrast, the source for the later, object-recognition-related signals, expected to be driven by feedback inputs, pointed inward, toward the white matter. Associating specific features of the MEG/EEG source waveforms with feedforward and feedback inputs could provide unique information about the activation patterns within hierarchically organized cortical areas. PMID:25445356
Antecedents to Reverse Auction Use
2006-02-01
sourcing the specific products or services. Comprehension of the effect of sourcing strategy on a sourcing professional's choice of sourcing media... of strategic items and services are typically manifested in partnerships, long-term contracts, and strategic alliances. Given the nature of such... technical, service, and social benefits a customer firm receives in exchange for the price it pays for a market offering" (Anderson and Narus 1988).
ERIC Educational Resources Information Center
Best, Fred
This document was prepared to identify long-term needs and opportunities for adult education, suggesting the implications of long-term social changes without proposing specific actions or institutional arrangements. Following an introduction, chapter 2 discusses the following trends: (1) continued population growth, including the sources and…
NASA thesaurus. Volume 3: Definitions
NASA Technical Reports Server (NTRS)
1988-01-01
Publication of NASA Thesaurus definitions began with Supplement 1 to the 1985 NASA Thesaurus. The definitions given here represent the complete file of over 3,200 definitions, complemented by nearly 1,000 use references. Definitions of more common or general scientific terms are given a NASA slant if one exists. Certain terms are not defined as a matter of policy: common names, chemical elements, specific models of computers, and nontechnical terms. The NASA Thesaurus predates by a number of years the systematic effort to define terms; therefore, not all Thesaurus terms have been defined. Nevertheless, definitions of older terms are continually being added. The following data are provided for each entry: term in uppercase/lowercase form, definition, source, and year the term (not the definition) was added to the NASA Thesaurus. The NASA History Office is the authority for capitalization in satellite and spacecraft names. Definitions with no source given were constructed by lexicographers at the NASA Scientific and Technical Information (STI) Facility who rely on the following sources for their information: experts in the field, literature searches from the NASA STI database, and specialized references.
Code of Federal Regulations, 2010 CFR
2010-07-01
... AQUIFERS Review of Projects Affecting the Edwards Underground Reservoir, A Designated Sole Source Aquifer... specifically provided, the term(s): (a) Act means the Public Health Service Act, as amended by the Safe Drinking Water Act, Public Law 93-523. (b) Contaminant means any physical, chemical, biological, or...
A Lagrangian stochastic model is proposed as a tool that can be utilized in forecasting remedial performance and estimating the benefits (in terms of flux and mass reduction) derived from a source zone remedial effort. The stochastic functional relationships that describe the hyd...
Park, Eun Sug; Hopke, Philip K; Oh, Man-Suk; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford H
2014-07-01
There has been increasing interest in assessing health effects associated with multiple air pollutants emitted by specific sources. A major difficulty with achieving this goal is that the pollution source profiles are unknown and source-specific exposures cannot be measured directly; rather, they need to be estimated by decomposing ambient measurements of multiple air pollutants. This estimation process, called multivariate receptor modeling, is challenging because of the unknown number of sources and unknown identifiability conditions (model uncertainty). The uncertainty in source-specific exposures (source contributions) as well as uncertainty in the number of major pollution sources and identifiability conditions have been largely ignored in previous studies. A multipollutant approach that can deal with model uncertainty in multivariate receptor models while simultaneously accounting for parameter uncertainty in estimated source-specific exposures in the assessment of source-specific health effects is presented in this paper. The methods are applied to daily ambient air measurements of the chemical composition of fine particulate matter (PM2.5), weather data, and counts of cardiovascular deaths from 1995 to 1997 for Phoenix, AZ, USA. Our approach for evaluating source-specific health effects yields not only estimates of source contributions along with their uncertainties and associated health effects estimates but also estimates of model uncertainty (posterior model probabilities) that have been ignored in previous studies. The results from our methods agreed in general with those from the previously conducted workshop/studies on the source apportionment of PM health effects in terms of the number of major contributing sources, estimated source profiles, and contributions. However, some of the adverse source-specific health effects identified in the previous studies were not statistically significant in our analysis, probably because we incorporated into the estimation of the health effects parameters the uncertainty in estimated source contributions that previous studies had ignored. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
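The Bayesian treatment of model uncertainty in the paper is beyond a short example, but the underlying receptor-model decomposition, ambient concentrations expressed as source contributions times source profiles, can be sketched with plain non-negative matrix factorization as a stand-in:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
true_profiles = rng.dirichlet(np.ones(10), size=3)        # 3 sources, 10 species
true_contrib = rng.gamma(2.0, 1.0, size=(365, 3))         # daily source contributions
X = true_contrib @ true_profiles + 0.01 * rng.random((365, 10))   # ambient data (days x species)

model = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)      # estimated daily source contributions
F = model.components_           # estimated source profiles
print(G.shape, F.shape)
```

The number of components and the identifiability (rotation) constraints are exactly the quantities the paper treats as uncertain rather than fixed.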
Community Wildfire Events as a Source of Social Conflict
ERIC Educational Resources Information Center
Carroll, Matthew S.; Higgins, Lorie L.; Cohn, Patricia J.; Burchfield, James
2006-01-01
The literature notes that natural disasters, including wildfires, that damage human settlements often have the short-term effect of "bringing people together." Less recognized is the fact that such events can also generate social conflict at the local level. This study examines the specific sources of such social conflict during and after…
Manríquez, Juan J
2008-04-01
Systematic reviews should include as many articles as possible. However, many systematic reviews use only databases with high English language content as sources of trials. Literatura Latino Americana e do Caribe em Ciências da Saúde (LILACS) is an underused source of trials, and there is no validated strategy for searching clinical trials in this database. The objective of this study was to develop a sensitive search strategy for clinical trials in LILACS. An analytical survey was performed. Several single- and multiple-term search strategies were tested for their ability to retrieve clinical trials in LILACS. Sensitivity, specificity, and accuracy of each single- and multiple-term strategy were calculated using the results of a hand-search of 44 Chilean journals as the gold standard. After combining the most sensitive, specific, and accurate single- and multiple-term search strategies, a strategy with a sensitivity of 97.75% (95% confidence interval [CI] = 95.98-99.53) and a specificity of 61.85% (95% CI = 61.19-62.51) was obtained. LILACS is a source of trials that could improve systematic reviews. A new highly sensitive search strategy for clinical trials in LILACS has been developed. It is hoped this search strategy will improve and increase the utilization of LILACS in future systematic reviews.
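The performance figures above come from scoring a candidate search strategy against the hand-search gold standard; a minimal sketch of that calculation, with made-up counts, is:

```python
def diagnostic_stats(retrieved: set, gold_trials: set, all_records: set):
    """Sensitivity, specificity, and accuracy of a search strategy versus a gold standard."""
    tp = len(retrieved & gold_trials)
    fn = len(gold_trials - retrieved)
    fp = len(retrieved - gold_trials)
    tn = len(all_records - retrieved - gold_trials)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / len(all_records)
    return sensitivity, specificity, accuracy

all_records = set(range(10_000))                     # everything indexed in the database
gold_trials = set(range(400))                        # trials found by hand-search
retrieved = set(range(390)) | set(range(400, 3900))  # hits of a candidate strategy
print(diagnostic_stats(retrieved, gold_trials, all_records))
```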
Unexpected source of latex sensitization in a neonatal intensive care unit.
Wynn, R J; Boneberg, A; Lakshminrusimha, S
2007-09-01
We report a term infant with gastroschisis who presented with a systemic allergic reaction at a specific time of each day coinciding with infusion from a new preparation of total parenteral nutrition and intravenous lipid emulsion. The source of latex was traced to the rubber stopper of the lipid emulsion. We present this case to highlight the possibility of allergy from this unexpected source in a neonate.
Schiller, Q.; Tu, W.; Ali, A. F.; ...
2017-03-11
The most significant unknown regarding relativistic electrons in Earth’s outer Van Allen radiation belt is the relative contribution of loss, transport, and acceleration processes within the inner magnetosphere. Detangling each individual process is critical to improve the understanding of radiation belt dynamics, but determining a single component is challenging due to sparse measurements in diverse spatial and temporal regimes. However, there are currently an unprecedented number of spacecraft taking measurements that sample different regions of the inner magnetosphere. With the increasing number of varied observational platforms, system dynamics can begin to be unraveled. In this work, we employ in-situ measurements during the 13-14 January 2013 enhancement event to isolate transport, loss, and source dynamics in a one dimensional radial diffusion model. We then validate the results by comparing them to Van Allen Probes and THEMIS observations, indicating that the three terms have been accurately and individually quantified for the event. Finally, a direct comparison is performed between the model containing event-specific terms and various models containing terms parameterized by geomagnetic index. Models using a simple 3/Kp loss timescale show deviation from the event specific model of nearly two orders of magnitude within 72 hours of the enhancement event. However, models using alternative loss timescales closely resemble the event specific model.
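A minimal sketch of this kind of one-dimensional radial diffusion model is given below, including the simple 3/Kp-day loss timescale the abstract compares against. The diffusion-coefficient parameterization (a Brautigam and Albert (2000)-style Kp scaling), grid, time step, and initial condition are illustrative assumptions, not the event-specific terms of the study.

```python
import numpy as np

L = np.linspace(3.0, 6.5, 36)               # dimensionless L-shell grid
dL = L[1] - L[0]
dt = 30.0                                   # s
f = np.exp(-(L - 4.5) ** 2 / 0.5)           # initial phase-space density (arbitrary units)

def step(f, kp, dt):
    # Kp-scaled diffusion coefficient (Brautigam & Albert 2000-style), converted to 1/s.
    d_ll = 10.0 ** (0.506 * kp - 9.325) * L ** 10 / 86400.0
    tau = (3.0 / max(kp, 0.5)) * 86400.0    # the simple 3/Kp-day loss timescale, in s
    # flux-conservative form of L^2 d/dL (D_LL / L^2 df/dL)
    half = 0.5 * (d_ll[1:] + d_ll[:-1]) / (0.5 * (L[1:] + L[:-1])) ** 2
    flux = half * np.diff(f) / dL
    interior = L[1:-1] ** 2 * np.diff(flux) / dL
    fn = f.copy()
    fn[1:-1] += dt * (interior - f[1:-1] / tau)
    fn[0], fn[-1] = f[0], f[-1]             # fixed boundaries stand in for source terms
    return fn

kp = 4.0
for _ in range(2880):                       # one day of 30 s steps
    f = step(f, kp, dt)
```

In the study, the transport, loss, and source terms are instead constrained event-specifically from the multi-spacecraft observations rather than from a geomagnetic index.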
Krall, Jenna R.; Mulholland, James A.; Russell, Armistead G.; Balachandran, Sivaraman; Winquist, Andrea; Tolbert, Paige E.; Waller, Lance A.; Sarnat, Stefanie Ebelt
2016-01-01
Background: Short-term exposure to ambient fine particulate matter (PM2.5) concentrations has been associated with increased mortality and morbidity. Determining which sources of PM2.5 are most toxic can help guide targeted reduction of PM2.5. However, conducting multicity epidemiologic studies of sources is difficult because source-specific PM2.5 is not directly measured, and source chemical compositions can vary between cities. Objectives: We determined how the chemical composition of primary ambient PM2.5 sources varies across cities. We estimated associations between source-specific PM2.5 and respiratory disease emergency department (ED) visits and examined between-city heterogeneity in estimated associations. Methods: We used source apportionment to estimate daily concentrations of primary source-specific PM2.5 for four U.S. cities. For sources with similar chemical compositions between cities, we applied Poisson time-series regression models to estimate associations between source-specific PM2.5 and respiratory disease ED visits. Results: We found that PM2.5 from biomass burning, diesel vehicle, gasoline vehicle, and dust sources was similar in chemical composition between cities, but PM2.5 from coal combustion and metal sources varied across cities. We found some evidence of positive associations of respiratory disease ED visits with biomass burning PM2.5; associations with diesel and gasoline PM2.5 were frequently imprecise or consistent with the null. We found little evidence of associations with dust PM2.5. Conclusions: We introduced an approach for comparing the chemical compositions of PM2.5 sources across cities and conducted one of the first multicity studies of source-specific PM2.5 and ED visits. Across four U.S. cities, among the primary PM2.5 sources assessed, biomass burning PM2.5 was most strongly associated with respiratory health. Citation: Krall JR, Mulholland JA, Russell AG, Balachandran S, Winquist A, Tolbert PE, Waller LA, Sarnat SE. 2017. Associations between source-specific fine particulate matter and emergency department visits for respiratory disease in four U.S. cities. Environ Health Perspect 125:97–103; http://dx.doi.org/10.1289/EHP271 PMID:27315241
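The health model described above is a Poisson time-series regression of daily ED counts on source-specific PM2.5; a hedged sketch with simulated data is shown below, without the study's actual covariate set, lag structure, or spline terms.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1461                                         # four years of daily observations
df = pd.DataFrame({
    "t": np.arange(n),
    "biomass_pm": rng.gamma(2.0, 1.5, n),        # hypothetical source-specific PM2.5 (ug/m3)
    "temp": 20 + 10 * np.sin(2 * np.pi * np.arange(n) / 365.25),
})
df["cos_season"] = np.cos(2 * np.pi * df.t / 365.25)
df["sin_season"] = np.sin(2 * np.pi * df.t / 365.25)
lam = np.exp(3.0 + 0.01 * df.biomass_pm + 0.002 * df.temp)
df["ed_visits"] = rng.poisson(lam)               # simulated daily respiratory ED counts

model = smf.glm("ed_visits ~ biomass_pm + temp + cos_season + sin_season",
                data=df, family=sm.families.Poisson()).fit()
rr_per_unit = np.exp(model.params["biomass_pm"])
print(f"rate ratio per 1 ug/m3 of biomass-burning PM2.5: {rr_per_unit:.3f}")
```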
Comparing the contributions of ionospheric outflow and high-altitude production to O+ loss at Mars
NASA Astrophysics Data System (ADS)
Liemohn, Michael; Curry, Shannon; Fang, Xiaohua; Johnson, Blake; Fraenz, Markus; Ma, Yingjuan
2013-04-01
The Mars total O+ escape rate is highly dependent on both the ionospheric and high-altitude source terms. Because of their different source locations, they appear in velocity space distributions as distinct populations. The Mars Test Particle model is used (with background parameters from the BATS-R-US magnetohydrodynamic code) to simulate the transport of ions in the near-Mars space environment. Because it is a collisionless model, the MTP's inner boundary is placed at 300 km altitude for this study. The MHD values at this altitude are used to define an ionospheric outflow source of ions for the MTP. The resulting loss distributions (in both real and velocity space) from this ionospheric source term are compared against those from high-altitude ionization mechanisms, in particular photoionization, charge exchange, and electron impact ionization, each of which has its own (albeit overlapping) source region. In subsequent simulations, the MHD values defining the ionospheric outflow are systematically varied to parametrically explore possible ionospheric outflow scenarios. For the nominal MHD ionospheric outflow settings, this source contributes only 10% to the total O+ loss rate, nearly all via the central tail region. There is very little dependence of this percentage on the initial temperature, but a change in the initial density or bulk velocity directly alters this loss through the central tail. However, a density or bulk velocity increase of a factor of 10 makes the ionospheric outflow loss comparable in magnitude to the loss from the combined high-altitude sources. The spatial and velocity space distributions of escaping O+ are examined and compared for the various source terms, identifying features specific to each ion source mechanism. These results are applied to a specific Mars Express orbit and used to interpret high-altitude observations from the ion mass analyzer onboard MEX.
NASA Astrophysics Data System (ADS)
Klimasewski, A.; Sahakian, V. J.; Baltay, A.; Boatwright, J.; Fletcher, J. B.; Baker, L. M.
2017-12-01
A large source of epistemic uncertainty in Ground Motion Prediction Equations (GMPEs) is derived from the path term, currently represented as a simple geometric spreading and intrinsic attenuation term. Including additional physical relationships between the path properties and predicted ground motions would produce more accurate and precise, region-specific GMPEs by reclassifying some of the random, aleatory uncertainty as epistemic. This study focuses on regions of Southern California, using data from the Anza network and the Southern California Seismic Network to create a catalog of events of magnitude 2.5 and larger from 1998 to 2016. The catalog encompasses regions of varying geology and therefore varying path and site attenuation. Within this catalog of events, we investigate several collections of event region-to-station pairs, each of which share similar origin locations and stations so that all events have similar paths. Compared with a simple regional GMPE, these paths consistently have high or low residuals. By working with events that have the same path, we can isolate source and site effects, and focus on the remaining residual as path effects. We decompose the recordings into source and site spectra for each unique event and site in our greater Southern California regional database using the inversion method of Andrews (1986). This model represents each natural log record spectrum as the sum of its natural log event and site spectra, while constraining each record to a reference site or Brune source spectrum. We estimate a regional, path-specific anelastic attenuation (Q) and site attenuation (t*) from the inversion site spectra, and corner frequency from the inversion event spectra. We then compute the residuals between the observed record data and the inversion model prediction (event × site spectra). This residual is representative of path effects, likely anelastic attenuation along the path that varies from the regional median attenuation. We examine the residuals for our different sets independently to see how path terms differ between event-to-station collections. The path-specific information gained from this can inform the development of terms for regional GMPEs, through understanding of these seismological phenomena.
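The Andrews (1986)-style decomposition at a single frequency can be posed as linear least squares in log spectra with a reference-site constraint; the synthetic data and the particular constraint used below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_ev, n_st = 40, 12
log_event = rng.normal(2.0, 0.5, n_ev)
log_site = rng.normal(0.0, 0.3, n_st)
log_site[0] = 0.0                                  # site 0 acts as the reference site

# Synthetic log record spectra at one frequency: log R = log E + log S + noise.
pairs = [(i, j) for i in range(n_ev) for j in range(n_st) if rng.random() < 0.6]
d = np.array([log_event[i] + log_site[j] + 0.1 * rng.standard_normal() for i, j in pairs])

G = np.zeros((len(pairs), n_ev + n_st))
for row, (i, j) in enumerate(pairs):
    G[row, i] = 1.0
    G[row, n_ev + j] = 1.0
# Constraint row: the reference site's log term is pinned to zero.
G = np.vstack([G, np.r_[np.zeros(n_ev), 1.0, np.zeros(n_st - 1)]])
d = np.append(d, 0.0)

m, *_ = np.linalg.lstsq(G, d, rcond=None)
est_event, est_site = m[:n_ev], m[n_ev:]
residuals = d[:-1] - G[:-1] @ m                    # per-record path-like residuals
print(np.std(residuals))
```

The per-record residual left after removing the fitted event and site terms is the path-like quantity the abstract interprets as deviations from the regional median attenuation.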
Fuks, Kateryna B; Weinmayr, Gudrun; Hennig, Frauke; Tzivian, Lilian; Moebus, Susanne; Jakobs, Hermann; Memmesheimer, Michael; Kälsch, Hagen; Andrich, Silke; Nonnemacher, Michael; Erbel, Raimund; Jöckel, Karl-Heinz; Hoffmann, Barbara
2016-08-01
Long-term exposure to fine particulate matter (PM2.5) may lead to increased blood pressure (BP). The role of industry- and traffic-specific PM2.5 remains unclear. We investigated the associations of residential long-term source-specific PM2.5 exposure with arterial BP and incident hypertension in the population-based Heinz Nixdorf Recall cohort study. We defined hypertension as systolic BP ≥ 140 mmHg, or diastolic BP ≥ 90 mmHg, or current use of BP-lowering medication. Long-term concentrations of PM2.5 from all local sources (PM2.5ALL), local industry (PM2.5IND) and traffic (PM2.5TRA) were modeled with a dispersion and chemistry transport model (EURAD-CTM) with a 1 km² resolution. We performed a cross-sectional analysis with BP and prevalent hypertension at baseline, using linear and logistic regression, respectively, and a longitudinal analysis with incident hypertension at 5-year follow-up, using Poisson regression with robust variance estimation. We adjusted for age, sex, body mass index, lifestyle, education, and major road proximity. Change in BP (mmHg), odds ratio (OR) and relative risk (RR) for hypertension were calculated per 1 μg/m³ of exposure concentration. PM2.5ALL was highly correlated with PM2.5IND (Spearman's ρ = 0.92) and moderately with PM2.5TRA (ρ = 0.42). In the adjusted cross-sectional analysis with 4539 participants, we found positive associations of PM2.5ALL with systolic (0.42 [95%-CI: 0.03, 0.80]) and diastolic (0.25 [0.04, 0.46]) BP. Higher, but less precise, estimates were found for PM2.5IND (systolic: 0.55 [-0.05, 1.14]; diastolic: 0.35 [0.03, 0.67]) and PM2.5TRA (systolic: 0.88 [-1.55, 3.31]; diastolic: 0.41 [-0.91, 1.73]). We found a crude positive association of PM2.5TRA with prevalence (OR 1.41 [1.10, 1.80]) and incidence of hypertension (RR 1.38 [1.03, 1.85]), attenuating after adjustment (OR 1.19 [0.90, 1.58] and RR 1.28 [0.94, 1.72]). We found no association of PM2.5ALL and PM2.5IND with hypertension. Long-term exposures to all-source and industry-specific PM2.5 were positively related to BP. We could not separate the effects of industry-specific PM2.5 from all-source PM2.5. Estimates with traffic-specific PM2.5 were generally higher but inconclusive. Copyright © 2016. Published by Elsevier GmbH.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Values of current energy technology costs and prices, available from a variety of sources, can sometimes vary. While some of this variation can be due to differences in the specific materials or configurations assumed, it can also reflect differences in the definition and context of the terms "cost" and "price." This fact sheet illustrates and explains this latter source of variation in a case study of automotive lithium-ion batteries.
The MIT/OSO 7 catalog of X-ray sources - Intensities, spectra, and long-term variability
NASA Technical Reports Server (NTRS)
Markert, T. H.; Laird, F. N.; Clark, G. W.; Hearn, D. R.; Sprott, G. F.; Li, F. K.; Bradt, H. V.; Lewin, W. H. G.; Schnopper, H. W.; Winkler, P. F.
1979-01-01
This paper is a summary of the observations of the cosmic X-ray sky performed by the MIT 1-40-keV X-ray detectors on OSO 7 between October 1971 and May 1973. Specifically, mean intensities or upper limits of all third Uhuru or OSO 7 cataloged sources (185 sources) in the 3-10-keV range are computed. For those sources for which a statistically significant (greater than 20) intensity was found in the 3-10-keV band (138 sources), further intensity determinations were made in the 1-15-keV, 1-6-keV, and 15-40-keV energy bands. Graphs and other simple techniques are provided to aid the user in converting the observed counting rates to convenient units and in determining spectral parameters. Long-term light curves (counting rates in one or more energy bands as a function of time) are plotted for 86 of the brighter sources.
Enhanced Elliptic Grid Generation
NASA Technical Reports Server (NTRS)
Kaul, Upender K.
2007-01-01
An enhanced method of elliptic grid generation has been invented. Whereas prior methods require user input of certain grid parameters, this method provides for these parameters to be determined automatically. "Elliptic grid generation" signifies generation of generalized curvilinear coordinate grids through solution of elliptic partial differential equations (PDEs). Usually, such grids are fitted to bounding bodies and used in numerical solution of other PDEs like those of fluid flow, heat flow, and electromagnetics. Such a grid is smooth and has continuous first and second derivatives (and possibly also continuous higher-order derivatives), grid lines are appropriately stretched or clustered, and grid lines are orthogonal or nearly so over most of the grid domain. The source terms in the grid-generating PDEs (hereafter called "defining" PDEs) make it possible for the grid to satisfy requirements for clustering and orthogonality properties in the vicinity of specific surfaces in three dimensions or in the vicinity of specific lines in two dimensions. The grid parameters in question are decay parameters that appear in the source terms of the inhomogeneous defining PDEs. The decay parameters are characteristic lengths in exponential-decay factors that express how the influences of the boundaries decrease with distance from the boundaries. These terms govern the rates at which the distance between adjacent grid lines changes with distance from nearby boundaries. Heretofore, users have arbitrarily specified decay parameters. However, the characteristic lengths are coupled with the strengths of the source terms, such that arbitrary specification could lead to conflicts among parameter values. Moreover, the manual insertion of decay parameters is cumbersome for static grids and infeasible for dynamically changing grids. In the present method, manual insertion and user specification of decay parameters are neither required nor allowed. Instead, the decay parameters are determined automatically as part of the solution of the defining PDEs. Depending on the shape of the boundary segments and the physical nature of the problem to be solved on the grid, the solution of the defining PDEs may provide for rates of decay to vary along and among the boundary segments and may lend itself to interpretation in terms of one or more physical quantities associated with the problem.
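For reference, the defining Poisson system and the exponentially decaying control functions the text refers to are commonly written in the following textbook (Thompson-type) form, which is not necessarily the exact formulation of the invention:

```latex
\nabla^{2}\xi = P(\xi,\eta), \qquad \nabla^{2}\eta = Q(\xi,\eta),
\qquad
P(\xi,\eta) = -\sum_{i} a_i\,\operatorname{sgn}(\xi-\xi_i)\,e^{-c_i\lvert\xi-\xi_i\rvert}
              -\sum_{j} b_j\,\operatorname{sgn}(\xi-\xi_j)\,
               e^{-d_j\sqrt{(\xi-\xi_j)^{2}+(\eta-\eta_j)^{2}}}
```

and similarly for Q(ξ,η); the c_i and d_j are the decay parameters that the enhanced method determines automatically instead of taking as user input.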
NASA Astrophysics Data System (ADS)
Lavrieux, Marlène; Meusburger, Katrin; Birkholz, Axel; Alewell, Christine
2017-04-01
Slope destabilization and associated sediment transfer are among the major causes of aquatic ecosystem and surface water quality impairment. Through land use and agricultural practices, human activities modify the soil erosion risk and the catchment connectivity, becoming a key factor in sediment dynamics. Hence, restoration and management plans for water bodies can only be efficient if the sediment sources and the proportion attributable to different land uses and agricultural practices are identified. Several sediment fingerprinting methods, based on geochemical (elemental composition), color, magnetic or isotopic (137Cs) sediment properties, are currently in use. However, these tools are not suitable for land-use-based fingerprinting. New organic geochemical approaches are now being developed to discriminate source-soil contributions under different land uses: the compound-specific stable isotope (CSSI) technique, based on the variability of the biomarker isotopic signature (here, fatty acid δ13C) among plant species, and the analysis of highly specific (i.e., source-family- or even source-species-specific) biomarker assemblages, whose use has until now been mainly restricted to palaeoenvironmental reconstructions and which also offers promising prospects for tracing current sediment origin. The approach was applied to reconstruct the spatio-temporal variability of the main sediment sources of Baldegg Lake (Canton of Lucerne, Switzerland), which suffers from substantial eutrophication despite several restoration attempts during the last 40 years. The sediment-supplying areas and the exported volumes were identified using the CSSI technique and highly specific biomarkers, coupled to a sediment connectivity model. The variability of the sediment origin was defined through the analysis of suspended river sediments sampled at high flow conditions (short term), and by the analysis of a lake sediment core covering the last 130 years (long term). The results show the utility of biomarkers and CSSI to track organic sources in contrasted land-use settings. Combined with other fingerprinting methods, this approach could in the future become a decision support tool for catchment management.
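At its core, the CSSI approach attributes sediment to land-use sources through an isotope-mixing calculation; the simplest two-end-member version, with invented δ13C values, is sketched below (real CSSI studies use several biomarkers and Bayesian mixing models).

```python
def source_fraction(delta_mix: float, delta_a: float, delta_b: float) -> float:
    """Fraction of source A in the mixture for a single tracer (two-source case)."""
    return (delta_mix - delta_b) / (delta_a - delta_b)

# Hypothetical C16:0 fatty-acid d13C signatures (per mil), invented for illustration.
delta_cropland, delta_forest = -30.5, -36.0
delta_lake_sediment = -33.8
f_crop = source_fraction(delta_lake_sediment, delta_cropland, delta_forest)
print(f"cropland contribution: {f_crop:.0%}, forest: {1 - f_crop:.0%}")
```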
Associations of short-term exposure to fine particulate matter (PM2.5) with daily mortality may be due to specific PM2.5 chemical components. Objectives: Daily concentrations of PM2.5 chemical species were measured over five consecutive years in Denver, CO to investigate whethe...
English-Russian, Russian-English glossary of coal-cleaning terms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pekar, J.
1987-09-01
The document is an English-Russian, Russian-English glossary of coal-cleaning terms, compiled as a joint U.S./Soviet effort. The need for the glossary resulted from the growing number of language-specific terms used during information exchanges within the framework of the U.S./U.S.S.R. Working Group on Stationary Source Air Pollution Control Technology, under the U.S./U.S.S.R. Agreement of Cooperation in the Field of Environmental Protection.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pawloski, G A; Tompson, A F B; Carle, S F
The objectives of this report are to develop, summarize, and interpret a series of detailed unclassified simulations that forecast the nature and extent of radionuclide release and near-field migration in groundwater away from the CHESHIRE underground nuclear test at Pahute Mesa at the NTS over 1000 yrs. Collectively, these results are called the CHESHIRE Hydrologic Source Term (HST). The CHESHIRE underground nuclear test was one of 76 underground nuclear tests that were fired below or within 100 m of the water table between 1965 and 1992 in Areas 19 and 20 of the NTS. These areas now comprise the Pahute Mesa Corrective Action Unit (CAU) for which a separate subregional scale flow and transport model is being developed by the UGTA Project to forecast the larger-scale migration of radionuclides from underground tests on Pahute Mesa. The current simulations are being developed, on one hand, to more fully understand the complex coupled processes involved in radionuclide migration, with a specific focus on the CHESHIRE test. While remaining unclassified, they are as site specific as possible and involve a level of modeling detail that is commensurate with the most fundamental processes, conservative assumptions, and representative data sets available. However, the simulation results are also being developed so that they may be simplified and interpreted for use as a source term boundary condition at the CHESHIRE location in the Pahute Mesa CAU model. In addition, the processes of simplification and interpretation will provide generalized insight as to how the source term behavior at other tests may be considered or otherwise represented in the Pahute Mesa CAU model.
Zachara, John M.; Chen, Xingyuan; Murray, Chris; ...
2016-03-04
In this study, a well-field within a uranium (U) plume in the groundwater-surface water transition zone was monitored for a 3 year period for water table elevation and dissolved solutes. The plume discharges to the Columbia River, which displays a dramatic spring stage surge resulting from snowmelt. Groundwater exhibits a low hydrologic gradient and chemical differences with river water. River water intrudes the site in spring. Specific aims were to assess the impacts of river intrusion on dissolved uranium (U aq), specific conductance (SpC), and other solutes, and to discriminate between transport, geochemical, and source term heterogeneity effects. Time series trends for U aq and SpC were complex and displayed large temporal and well-to-well variability as a result of water table elevation fluctuations, river water intrusion, and changes in groundwater flow directions. The wells were clustered into subsets exhibiting common behaviors resulting from the intrusion dynamics of river water and the location of source terms. Hot-spots in U aq varied in location with increasing water table elevation through the combined effects of advection and source term location. Heuristic reactive transport modeling with PFLOTRAN demonstrated that mobilized U aq was transported between wells and source terms in complex trajectories, and was diluted as river water entered and exited the groundwater system. While U aq time-series concentration trends varied significantly from year-to-year as a result of climate-caused differences in the spring hydrograph, common and partly predictable response patterns were observed that were driven by water table elevation, and the extent and duration of river water intrusion.
Nomenclature in laboratory robotics and automation (IUPAC Recommendation 1994)
(Skip) Kingston, H. M.; Kingstonz, M. L.
1994-01-01
These recommended terms have been prepared to help provide a uniform approach to terminology and notation in laboratory automation and robotics. Since the terminology used in laboratory automation and robotics has been derived from diverse backgrounds, it is often vague, imprecise, and in some cases, in conflict with classical automation and robotic nomenclature. These definitions have been assembled from standards, monographs, dictionaries, journal articles, and documents of international organizations emphasizing laboratory and industrial automation and robotics. When appropriate, definitions have been taken directly from the original source and identified with that source. However, in some cases no acceptable definition could be found and a new definition was prepared to define the object, term, or action. Attention has been given to defining specific robot types, coordinate systems, parameters, attributes, communication protocols and associated workstations and hardware. Diagrams are included to illustrate specific concepts that can best be understood by visualization. PMID:18924684
Widdowson, M.A.; Chapelle, F.H.; Brauner, J.S.; ,
2003-01-01
A method is developed for optimizing monitored natural attenuation (MNA) and the reduction in the aqueous source zone concentration (ΔC) required to meet a site-specific regulatory target concentration. The mathematical model consists of two one-dimensional equations of mass balance for the aqueous phase contaminant, to coincide with up to two distinct zones of transformation, and appropriate boundary and intermediate conditions. The solution is written in terms of zone-dependent Peclet and Damköhler numbers. The model is illustrated at a chlorinated solvent site where MNA was implemented following source treatment using in-situ chemical oxidation. The results demonstrate that by not taking into account a variable natural attenuation capacity (NAC), a lower target ΔC is predicted, resulting in unnecessary source concentration reduction and cost with little benefit to achieving site-specific remediation goals.
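The zone-dependent Peclet and Damköhler numbers mentioned above arise from nondimensionalizing a steady one-dimensional advection-dispersion-decay balance; a generic form, with symbols assumed here, is:

```latex
\frac{1}{Pe}\,\frac{d^{2}C^{*}}{dx^{*2}} - \frac{dC^{*}}{dx^{*}} - Da\,C^{*} = 0,
\qquad Pe = \frac{vL}{D_x}, \qquad Da = \frac{\lambda L}{v}
```

with C* = C/C0, x* = x/L, seepage velocity v, longitudinal dispersion coefficient D_x, and first-order natural-attenuation rate λ; each of the two transformation zones carries its own Pe and Da, reflecting its natural attenuation capacity.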
Analysis and Synthesis of Tonal Aircraft Noise Sources
NASA Technical Reports Server (NTRS)
Allen, Matthew P.; Rizzi, Stephen A.; Burdisso, Ricardo; Okcu, Selen
2012-01-01
Fixed and rotary wing aircraft operations can have a significant impact on communities in proximity to airports. Simulation of predicted aircraft flyover noise, paired with listening tests, is useful to noise reduction efforts since it allows direct annoyance evaluation of aircraft or operations currently in the design phase. This paper describes efforts to improve the realism of synthesized source noise by including short term fluctuations, specifically for inlet-radiated tones resulting from the fan stage of turbomachinery. It details analysis performed on an existing set of recorded turbofan data to isolate inlet-radiated tonal fan noise, then extract and model short term tonal fluctuations using the analytic signal. Methodologies for synthesizing time-variant tonal and broadband turbofan noise sources using measured fluctuations are also described. Finally, subjective listening test results are discussed which indicate that time-variant synthesized source noise is perceived to be very similar to recordings.
Simulations of cold electroweak baryogenesis: dependence on the source of CP-violation
NASA Astrophysics Data System (ADS)
Mou, Zong-Gang; Saffin, Paul M.; Tranberg, Anders
2018-05-01
We compute the baryon asymmetry created in a tachyonic electroweak symmetry breaking transition, focusing on the dependence on the source of effective CP-violation. Earlier simulations of Cold Electroweak Baryogenesis have almost exclusively considered a very specific CP-violating term explicitly biasing Chern-Simons number. We compare four different dimension six, scalar-gauge CP-violating terms, involving both the Higgs field and another dynamical scalar coupled to SU(2) or U(1) gauge fields. We find that for sensible values of parameters, all implementations can generate a baryon asymmetry consistent with observations, showing that baryogenesis is a generic outcome of a fast tachyonic electroweak transition.
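For orientation, the "very specific CP-violating term" used in earlier simulations is conventionally the dimension-six Higgs-gauge operator that directly biases Chern-Simons number; a schematic form, with coefficient and normalization assumed here, is:

```latex
\mathcal{L}_{CPV} \;=\; \frac{\kappa}{M^{2}}\,\phi^{\dagger}\phi\,
\operatorname{Tr}\,F_{\mu\nu}\tilde{F}^{\mu\nu}
```

The four variants compared in the paper replace φ†φ by another dynamical scalar and/or couple to the U(1) rather than the SU(2) field strength.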
Source Credibility in Tobacco Control Messaging
Schmidt, Allison M.; Ranney, Leah M.; Pepper, Jessica K.; Goldstein, Adam O.
2016-01-01
Objectives Perceived credibility of a message’s source can affect persuasion. This paper reviews how beliefs about the source of tobacco control messages may encourage attitude and behavior change. Methods We conducted a series of searches of the peer-reviewed literature using terms from communication and public health fields. We reviewed research on source credibility, its underlying concepts, and its relation to the persuasiveness of tobacco control messages. Results We recommend an agenda for future research to bridge the gaps between communication literature on source credibility and tobacco control research. Our recommendations are to study the impact of source credibility on persuasion with long-term behavior change outcomes, in different populations and demographic groups, by developing new credibility measures that are topic- and organization-specific, by measuring how credibility operates across media platforms, and by identifying factors that enhance credibility and persuasion. Conclusions This manuscript reviews the state of research on source credibility and identifies gaps that are maximally relevant to tobacco control communication. Knowing first whether a source is perceived as credible, and second, how to enhance perceived credibility, can inform the development of future tobacco control campaigns and regulatory communications. PMID:27525298
Rey-Martinez, Jorge; Pérez-Fernández, Nicolás
2016-12-01
Objective To develop and validate a posturography software package and share its source code in open source terms. Methods Prospective non-randomized validation study in which 20 consecutive adults underwent two balance assessment tests: six-condition posturography was performed using a clinically approved software and force platform, and the same conditions were measured using the newly developed open source software (RombergLab) with a low-cost force platform. The intra-class correlation coefficient of the sway area, obtained from the center of pressure variations on both devices across the six conditions, was the main variable used for validation. Results Excellent concordance between RombergLab and the clinically approved force platform was obtained (intra-class correlation coefficient = 0.94), reaching the proposed validation goal of 0.9. A Bland and Altman graphic concordance plot was also obtained. Conclusions RombergLab is a validated balance assessment software; its reliability depends on the technical specifications of the force platform used. The source code used to develop RombergLab was published in open source terms.
A UMLS-based spell checker for natural language processing in vaccine safety.
Tolentino, Herman D; Matters, Michael D; Walop, Wikke; Law, Barbara; Tong, Wesley; Liu, Fang; Fontelo, Paul; Kohl, Katrin; Payne, Daniel C
2007-02-12
The Institute of Medicine has identified patient safety as a key goal for health care in the United States. Detecting vaccine adverse events is an important public health activity that contributes to patient safety. Reports about adverse events following immunization (AEFI) from surveillance systems contain free-text components that can be analyzed using natural language processing. To extract Unified Medical Language System (UMLS) concepts from free text and classify AEFI reports based on concepts they contain, we first needed to clean the text by expanding abbreviations and shortcuts and correcting spelling errors. Our objective in this paper was to create a UMLS-based spelling error correction tool as a first step in the natural language processing (NLP) pipeline for AEFI reports. We developed spell checking algorithms using open source tools. We used de-identified AEFI surveillance reports to create free-text data sets for analysis. After expansion of abbreviated clinical terms and shortcuts, we performed spelling correction in four steps: (1) error detection, (2) word list generation, (3) word list disambiguation and (4) error correction. We then measured the performance of the resulting spell checker by comparing it to manual correction. We used 12,056 words to train the spell checker and tested its performance on 8,131 words. During testing, sensitivity, specificity, and positive predictive value (PPV) for the spell checker were 74% (95% CI: 74-75), 100% (95% CI: 100-100), and 47% (95% CI: 46%-48%), respectively. We created a prototype spell checker that can be used to process AEFI reports. We used the UMLS Specialist Lexicon as the primary source of dictionary terms and the WordNet lexicon as a secondary source. We used the UMLS as a domain-specific source of dictionary terms to compare potentially misspelled words in the corpus. The prototype sensitivity was comparable to currently available tools, but the specificity was much superior. The slow processing speed may be improved by trimming it down to the most useful component algorithms. Other investigators may find the methods we developed useful for cleaning text using lexicons specific to their area of interest.
A UMLS-based spell checker for natural language processing in vaccine safety
Tolentino, Herman D; Matters, Michael D; Walop, Wikke; Law, Barbara; Tong, Wesley; Liu, Fang; Fontelo, Paul; Kohl, Katrin; Payne, Daniel C
2007-01-01
Background The Institute of Medicine has identified patient safety as a key goal for health care in the United States. Detecting vaccine adverse events is an important public health activity that contributes to patient safety. Reports about adverse events following immunization (AEFI) from surveillance systems contain free-text components that can be analyzed using natural language processing. To extract Unified Medical Language System (UMLS) concepts from free text and classify AEFI reports based on concepts they contain, we first needed to clean the text by expanding abbreviations and shortcuts and correcting spelling errors. Our objective in this paper was to create a UMLS-based spelling error correction tool as a first step in the natural language processing (NLP) pipeline for AEFI reports. Methods We developed spell checking algorithms using open source tools. We used de-identified AEFI surveillance reports to create free-text data sets for analysis. After expansion of abbreviated clinical terms and shortcuts, we performed spelling correction in four steps: (1) error detection, (2) word list generation, (3) word list disambiguation and (4) error correction. We then measured the performance of the resulting spell checker by comparing it to manual correction. Results We used 12,056 words to train the spell checker and tested its performance on 8,131 words. During testing, sensitivity, specificity, and positive predictive value (PPV) for the spell checker were 74% (95% CI: 74–75), 100% (95% CI: 100–100), and 47% (95% CI: 46%–48%), respectively. Conclusion We created a prototype spell checker that can be used to process AEFI reports. We used the UMLS Specialist Lexicon as the primary source of dictionary terms and the WordNet lexicon as a secondary source. We used the UMLS as a domain-specific source of dictionary terms to compare potentially misspelled words in the corpus. The prototype sensitivity was comparable to currently available tools, but the specificity was much superior. The slow processing speed may be improved by trimming it down to the most useful component algorithms. Other investigators may find the methods we developed useful for cleaning text using lexicons specific to their area of interest. PMID:17295907
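As a toy illustration of the four-step scheme (error detection, word-list generation, disambiguation, correction), the sketch below uses a tiny hard-coded dictionary in place of the UMLS SPECIALIST Lexicon and WordNet; it is not the authors' tool, and the disambiguation step here is simple string similarity rather than the logic described in the paper.

```python
from difflib import SequenceMatcher

# Toy stand-in for the UMLS SPECIALIST Lexicon / WordNet dictionaries.
DICTIONARY = {"fever", "rash", "injection", "site", "swelling", "vaccine"}

def edit_distance(a: str, b: str) -> int:
    # Classic single-row dynamic-programming Levenshtein distance.
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (ca != cb))
    return dp[-1]

def correct(token: str) -> str:
    # Step 1: error detection -- anything not in the dictionary is a candidate error.
    if token.lower() in DICTIONARY:
        return token
    # Step 2: word-list generation -- dictionary entries within edit distance 2.
    candidates = [w for w in DICTIONARY if edit_distance(token.lower(), w) <= 2]
    if not candidates:
        return token  # leave unknown tokens untouched
    # Steps 3-4: disambiguation and correction -- pick the most similar candidate.
    return max(candidates, key=lambda w: SequenceMatcher(None, token.lower(), w).ratio())

print([correct(t) for t in "feaver at the injecton sit".split()])
# -> ['fever', 'at', 'the', 'injection', 'site']
```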
Source apportionment of airborne particulates through receptor modeling: Indian scenario
NASA Astrophysics Data System (ADS)
Banerjee, Tirthankar; Murari, Vishnu; Kumar, Manish; Raju, M. P.
2015-10-01
Airborne particulate chemistry is mostly governed by the associated sources, and apportionment of specific sources is essential to delineate explicit control strategies. The present submission initially deals with publications (1980s-2010s) of Indian origin which report regional heterogeneities of particulate concentrations with reference to associated species. Such meta-analyses clearly indicate the presence of reservoirs of both primary and secondary aerosols in different geographical regions. Further, the identification of specific signatory molecules for individual source categories was also evaluated in terms of scientific merit and repeatability. Source signatures mostly resemble international profiles, while in selected cases they lack appropriateness. In India, source apportionment (SA) of airborne particulates was initiated as far back as 1985 through factor analysis; however, principal component analysis (PCA) accounts for the major proportion of applications (34%), followed by enrichment factor (EF, 27%), chemical mass balance (CMB, 15%) and positive matrix factorization (PMF, 9%). Mainstream SA analyses identify earth crust and road dust resuspension (traced by Al, Ca, Fe, Na and Mg) as a principal source (6-73%), followed by vehicular emissions (traced by Fe, Cu, Pb, Cr, Ni, Mn, Ba and Zn; 5-65%), industrial emissions (traced by Co, Cr, Zn, V, Ni, Mn, Cd; 0-60%), fuel combustion (traced by K, NH4+, SO42-, As, Te, S, Mn; 4-42%), marine aerosols (traced by Na, Mg, K; 0-15%) and biomass/refuse burning (traced by Cd, V, K, Cr, As, TC, Na, K, NH4+, NO3-, OC; 1-42%). In most cases, temporal variations of individual source contributions for a specific geographic region exhibit radical heterogeneity, possibly due to inconsistent assignment of individual tracers to specific sources, exaggerated further by methodological weaknesses, inappropriate sample sizes, the influence of secondary aerosols and inadequate emission inventories. In conclusion, a number of challenging issues and specific recommendations are included which need to be considered for a scientific apportionment of particulate sources in different geographical regions of India.
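As a reference for the enrichment factor (EF) approach counted above, the conventional crustal-normalization form (not specific to this review) is

```latex
% Conventional crustal enrichment factor; the reference element (here Al) and
% the crustal composition table are analyst choices, not taken from this review.
\mathrm{EF}_{X} =
  \frac{\left(C_{X}/C_{\mathrm{Al}}\right)_{\mathrm{aerosol}}}
       {\left(C_{X}/C_{\mathrm{Al}}\right)_{\mathrm{crust}}}
```

Values near unity indicate a predominantly crustal origin, while values well above about 10 are usually read as evidence of an anthropogenic contribution.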
A Variable Frequency, Mis-Match Tolerant, Inductive Plasma Source
NASA Astrophysics Data System (ADS)
Rogers, Anthony; Kirchner, Don; Skiff, Fred
2014-10-01
Presented here is a survey and analysis of an inductively coupled, magnetically confined, singly ionized Argon plasma generated by a square-wave, variable frequency plasma source. The helicon-style antenna is driven directly by the class "D" amplifier without a matching network for increased efficiency while maintaining independent control of frequency and applied power at the feed point. The survey is compared to similar data taken using a traditional source consisting of an exciter, power amplifier, and matching network. Specifically, the flexibility of this plasma source in terms of the independent control of electron plasma temperature and density is discussed in comparison to traditional source arrangements. Supported by US DOE Grant DE-FG02-99ER54543.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genn Saji
2006-07-01
The term 'ultimate risk' is used here to describe the probabilities and radiological consequences that should be incorporated in siting, containment design and accident management of nuclear power plants for hypothetical accidents. It is closely related to the source terms specified in siting criteria, which assure an adequate separation of the radioactive inventories of the plants from the public in the event of a hypothetical and severe accident situation. The author would like to point out that current source terms, which are based on information from the Windscale accident (1957) through TID-14844, are very outdated and do not incorporate lessons learned from either the Three Mile Island (TMI, 1979) or Chernobyl (1986) accidents, two of the most severe accidents ever experienced. As a result of the observations of benign radionuclide releases at TMI, the technical community in the US felt that a more realistic evaluation of severe reactor accident source terms was necessary. Against this background, the 'source term research project' was organized in 1984 to respond to these challenges. Unfortunately, soon after the final report from this project was released, the Chernobyl accident occurred. Due to the enormous consequences induced by that accident, the once-optimistic perspectives on establishing a more realistic source term were completely shattered. The Chernobyl accident, with its human death toll and the dispersion of a large part of the fission fragment inventories into the environment, created a significant degradation in the public's acceptance of nuclear energy throughout the world. In spite of this, nuclear communities have been prudent in responding to the public's anxiety towards the ultimate safety of nuclear plants, since there still remained many unknown points revolving around the mechanism of the Chernobyl accident. In order to resolve some of these mysteries, the author has performed a scoping study of the dispersion and deposition mechanisms of fuel particles and fission fragments during the initial phase of the Chernobyl accident. Through this study, it is now possible to generally reconstruct the radiological consequences by using a dispersion calculation technique, combined with the meteorological data at the time of the accident and the land contamination densities of 137Cs measured and reported around the Chernobyl area. Although it is challenging to incorporate lessons learned from the Chernobyl accident into the source term issues, the author has already developed an example of safety goals by incorporating the radiological consequences of the accident. The example provides safety goals by specifying source term releases in a graded approach in combination with probabilities, i.e. risks. The author believes that the future source term specification should be directly linked with safety goals. (author)
Seismic hazard assessment over time: Modelling earthquakes in Taiwan
NASA Astrophysics Data System (ADS)
Chan, Chung-Han; Wang, Yu; Wang, Yu-Ju; Lee, Ya-Ting
2017-04-01
To assess the seismic hazard and its temporal change in Taiwan, we develop a new approach combining the Brownian Passage Time (BPT) model and the Coulomb stress change, and implement the seismogenic source parameters of the Taiwan Earthquake Model (TEM). The BPT model was adopted to describe the rupture recurrence intervals of the specific fault sources, together with the time elapsed since the last fault rupture, to derive their long-term rupture probability. We also evaluate the short-term seismicity rate change based on the static Coulomb stress interaction between seismogenic sources. By considering these time-dependent factors, our new combined model suggests an increased long-term seismic hazard in the vicinity of active faults along the western Coastal Plain and the Longitudinal Valley, where active faults have short recurrence intervals and long elapsed times since their last ruptures, and/or short-term elevated hazard levels right after the occurrence of large earthquakes due to the stress triggering effect. The stress enhanced by the February 6th, 2016, Meinong ML 6.6 earthquake also significantly increased the rupture probabilities of several neighbouring seismogenic sources in southwestern Taiwan and raised the hazard level in the near future. Our approach draws on the advantage of incorporating long- and short-term models to provide time-dependent earthquake probability constraints. Our time-dependent model considers more detailed information than any other published model and thus offers decision-makers and public officials an adequate basis for rapid evaluation of and response to future emergency scenarios such as victim relocation and sheltering.
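For reference, the BPT renewal model used above is conventionally the inverse-Gaussian density over inter-event time t, with mean recurrence interval mu and aperiodicity alpha (standard form, not quoted from the TEM documentation):

```latex
% Brownian Passage Time (inverse Gaussian) recurrence density, standard form.
f(t \mid \mu, \alpha) = \sqrt{\frac{\mu}{2\pi\,\alpha^{2}t^{3}}}\,
  \exp\!\left[-\frac{(t-\mu)^{2}}{2\,\mu\,\alpha^{2}\,t}\right], \qquad t > 0
```

The conditional rupture probability over a forecast window (Te, Te + dT), given an elapsed time Te since the last rupture, then follows as [F(Te + dT) - F(Te)] / [1 - F(Te)], where F is the corresponding cumulative distribution.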
McDonnell, S; Troiano, R P; Barker, N; Noji, E; Hlady, W G; Hopkins, R
1995-12-01
Two three-stage cluster surveys were conducted in South Dade County, Florida, 14 months apart, to assess recovery following Hurricane Andrew. Response rates were 75 per cent and 84 per cent. Sources of assistance used in recovery from Hurricane Andrew differed according to race, per capita income, ethnicity, and education. Reports of improved living situation post-hurricane were not associated with receiving relief assistance, but reports of a worse situation were associated with loss of income, being exploited, or job loss. The number of households reporting problems with crime and community violence doubled between the two surveys. Disaster relief efforts had less impact on subjective long-term recovery than did job or income loss or housing repair difficulties. Existing sources of assistance were used more often than specific post-hurricane relief resources. The demographic make-up of a community may determine which are the most effective means to inform them after a disaster and what sources of assistance may be useful.
Lanoue, Jason; Leonardos, Evangelos D.; Ma, Xiao; Grodzinski, Bernard
2017-01-01
Advancements in light-emitting diode (LED) technology have made them a viable alternative to current lighting systems for both sole and supplemental lighting requirements. Understanding how wavelength specific LED lighting can affect plants is thus an area of great interest. Much research is available on the wavelength specific responses of leaves from multiple crops when exposed to long-term wavelength specific lighting. However, leaf measurements do not always extrapolate linearly to the complexities which are found within a whole plant canopy, namely mutual shading and leaves of different ages. Taken together, both tomato (Solanum lycopersicum) leaves under short-term illumination and lisianthus (Eustoma grandiflorum) and tomato whole plant diurnal patterns of plants acclimated to specific lighting indicate wavelength specific responses of both H2O and CO2 gas exchanges involved in the major growth parameters of a plant. Tomato leaves grown under a white light source indicated an increase in transpiration rate and internal CO2 concentration and a subsequent decrease in water-use-efficiency (WUE) when exposed to a blue LED light source compared to a green LED light source. Interestingly, the maximum photosynthetic rate was observed to be similar. Using plants grown under wavelength specific supplemental lighting in a greenhouse, a decrease in whole plant WUE was seen in both crops under both red-blue (RB) and red-white (RW) LEDs when compared to a high pressure sodium (HPS) light. Whole plant WUE was decreased by 31% under the RB LED treatment for both crops compared to the HPS treatment. Tomato whole plant WUE was decreased by 25% and lisianthus whole plant WUE was decreased by 15% when compared to the HPS treatment when grown under RW LED. The understanding of the effects of wavelength specific lighting on both leaf and whole plant gas exchange has significant implications on basic academic research as well as commercial greenhouse production. PMID:28676816
Lanoue, Jason; Leonardos, Evangelos D; Ma, Xiao; Grodzinski, Bernard
2017-01-01
Advancements in light-emitting diode (LED) technology have made them a viable alternative to current lighting systems for both sole and supplemental lighting requirements. Understanding how wavelength specific LED lighting can affect plants is thus an area of great interest. Much research is available on the wavelength specific responses of leaves from multiple crops when exposed to long-term wavelength specific lighting. However, leaf measurements do not always extrapolate linearly to the complexities which are found within a whole plant canopy, namely mutual shading and leaves of different ages. Taken together, both tomato (Solanum lycopersicum) leaves under short-term illumination and lisianthus (Eustoma grandiflorum) and tomato whole plant diurnal patterns of plants acclimated to specific lighting indicate wavelength specific responses of both H2O and CO2 gas exchanges involved in the major growth parameters of a plant. Tomato leaves grown under a white light source indicated an increase in transpiration rate and internal CO2 concentration and a subsequent decrease in water-use-efficiency (WUE) when exposed to a blue LED light source compared to a green LED light source. Interestingly, the maximum photosynthetic rate was observed to be similar. Using plants grown under wavelength specific supplemental lighting in a greenhouse, a decrease in whole plant WUE was seen in both crops under both red-blue (RB) and red-white (RW) LEDs when compared to a high pressure sodium (HPS) light. Whole plant WUE was decreased by 31% under the RB LED treatment for both crops compared to the HPS treatment. Tomato whole plant WUE was decreased by 25% and lisianthus whole plant WUE was decreased by 15% when compared to the HPS treatment when grown under RW LED. The understanding of the effects of wavelength specific lighting on both leaf and whole plant gas exchange has significant implications on basic academic research as well as commercial greenhouse production.
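For clarity, the water-use-efficiency reported above is, in gas-exchange studies of this kind, conventionally the ratio of carbon gain to water loss; the exact expression used by the authors is not given in the abstract, but the typical leaf-level form is

```latex
% Conventional gas-exchange definition (assumed, not quoted from the paper):
\mathrm{WUE} = \frac{A}{E}
  \qquad \text{(net CO$_2$ assimilation rate over transpiration rate)}
```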
Temporal X-ray astronomy with a pinhole camera. [cygnus and scorpius constellation
NASA Technical Reports Server (NTRS)
Holt, S. S.
1975-01-01
Preliminary results from the Ariel-5 all-sky X-ray monitor are presented, along with sufficient experiment details to define the experiment sensitivity. Periodic modulation of the X-ray emission was investigated from three sources with which specific periods were associated, with the results that the 4.8 hour variation from Cyg X-3 was confirmed, a long-term average 5.6 day variation from Cyg X-1 was discovered, and no detectable 0.787 day modulation of Sco X-1 was observed. Consistency of the long-term Sco X-1 emission with a shot-noise model is discussed, wherein the source behavior is shown to be interpretable as approximately 100 flares per day, each with a duration of several hours. A sudden increase in the Cyg X-1 intensity by almost a factor of three on 22 April 1975 is reported, after 5 months of relative source constancy. The light curve of a bright nova-like transient source in Triangulum is presented, and compared with previously observed transient sources. Preliminary evidence for the existence of X-ray bursts with duration less than 1 hour is offered.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.
1995-04-01
This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by the U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as a part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.
Groundwater vulnerability and risk mapping using GIS, modeling and a fuzzy logic tool.
Nobre, R C M; Rotunno Filho, O C; Mansur, W J; Nobre, M M M; Cosenza, C A N
2007-12-07
A groundwater vulnerability and risk mapping assessment, based on a source-pathway-receptor approach, is presented for an urban coastal aquifer in northeastern Brazil. A modified version of the DRASTIC methodology was used to map the intrinsic and specific groundwater vulnerability of a 292 km² study area. A fuzzy hierarchy methodology was adopted to evaluate the potential contaminant source index, including diffuse and point sources. Numerical modeling was performed for delineation of well capture zones, using MODFLOW and MODPATH. The integration of these elements provided the mechanism to assess groundwater pollution risks and identify areas that must be prioritized in terms of groundwater monitoring and restriction on use. A groundwater quality index based on nitrate and chloride concentrations was calculated, which had a positive correlation with the specific vulnerability index.
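For context, the intrinsic DRASTIC index underlying the modified methodology is a weighted sum of ratings for seven hydrogeological factors; the generic form is shown below (the study's modifications and its fuzzy source index are not reproduced here).

```latex
% Generic DRASTIC intrinsic vulnerability index (standard formulation); r = rating,
% w = weight for each factor: Depth to water, net Recharge, Aquifer media, Soil
% media, Topography, Impact of the vadose zone, and hydraulic Conductivity.
\mathrm{DRASTIC\ index} = D_{r}D_{w} + R_{r}R_{w} + A_{r}A_{w} + S_{r}S_{w}
                        + T_{r}T_{w} + I_{r}I_{w} + C_{r}C_{w}
```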
Towards A Topological Framework for Integrating Semantic Information Sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joslyn, Cliff A.; Hogan, Emilie A.; Robinson, Michael
2014-09-07
In this position paper we argue for the role that topological modeling principles can play in providing a framework for sensor integration. While these principles have been used successfully with standard (quantitative) sensors, we are developing the methodology in new directions to make it appropriate specifically for semantic information sources, including key terms, ontology terms, and other general Boolean, categorical, ordinal, and partially ordered data types. We illustrate the basics of the methodology in an extended use case/example, and discuss the path forward.
Source and identification of heavy ions in the equatorial F layer.
NASA Technical Reports Server (NTRS)
Hanson, W. B.; Sterling, D. L.; Woodman, R. F.
1972-01-01
Further evidence is presented to show that the interpretation of some Ogo 6 retarding potential analyzer (RPA) results in terms of ambient Fe+ ions is correct. The Fe+ ions are observed only within dip latitudes of plus or minus 30 deg, and the reason for this latitudinal specificity is discussed in terms of a low-altitude source region and F region diffusion and electrodynamic drift. It is shown that the polarization field associated with the equatorial electrojet will raise ions to 160 km out of a chemical source region below 100 km but it will do so only in a narrow region centered on the dip equator. Subsequent vertical ExB drift, coupled with motions along the magnetic fields, can move the ions to greater heights and greater latitudes. There should be a resultant fountain of metallic ions rising near the equator that subsequently descends back to the E and D layers at tropical latitudes.
Structuring the Multimedia Deal: Legal Issues--Part 1: Licensing in the Multimedia Arena.
ERIC Educational Resources Information Center
Gersh, David L.; Jeffrey, Sheri
1993-01-01
Provides an overview of legal issues related to licensing entertainment rights for multimedia source materials, including the grant of rights clause, copyright ownership, territory and languages, term provision, specifications, approvals/controls, royalties, guilds, bankruptcies, termination of the license, and confidentiality. Common mistakes…
Long Term Temporal and Spectral Evolution of Point Sources in Nearby Elliptical Galaxies
NASA Astrophysics Data System (ADS)
Durmus, D.; Guver, T.; Hudaverdi, M.; Sert, H.; Balman, Solen
2016-06-01
We present the results of an archival study of all the point sources detected in the lines of sight of the elliptical galaxies NGC 4472, NGC 4552, NGC 4649, M32, Maffei 1, NGC 3379, IC 1101, M87, NGC 4477, NGC 4621, and NGC 5128, with both the Chandra and XMM-Newton observatories. Specifically, we studied the temporal and spectral evolution of these point sources over the course of the observations of the galaxies, mostly covering the 2000 - 2015 period. In this poster we present the first results of this study, which allows us to further constrain the X-ray source population in nearby elliptical galaxies and also better understand the nature of individual point sources.
On Seizing the Source: Toward a Phenomenology of Religious Violence
Staudigl, Michael
2016-01-01
Abstract In this paper I argue that we need to analyze ‘religious violence’ in the ‘post-secular context’ in a twofold way: rather than simply viewing it in terms of mere irrationality, senselessness, atavism, or monstrosity – terms which, as we witness today on an immense scale, are strongly endorsed by the contemporary theater of cruelty committed in the name of religion – we also need to understand it in terms of an ‘originary supplement’ of ‘disengaged reason’. In order to confront its specificity beyond traditional explanations of violence, I propose an integrated phenomenological account of religion that traces the phenomenality of religion in terms of a correlation between the originary givenness of transcendence and capable man’s creative capacities to respond to it. Following Ricœur, I discuss ‘religious violence’ in terms of a monopolizing appropriation of the originary source of givenness that conflates man’s freedom to poetically respond to the appeal of the foundational with the surreptitiously claimed sovereignty to make it happen in a practical transfiguration of the everyday. PMID:28690372
A study of comprehension and use of weather information by various agricultural groups in Wisconsin
NASA Technical Reports Server (NTRS)
Smith, J. L.
1972-01-01
An attempt was made to determine whether current techniques are adequate for communicating improved weather forecasts to users. Primary concern was for agricultural users. Efforts were made to learn the preferred source of weather forecasts and the frequency of use. Attempts were also made to measure knowledge of specific terms having to do with weather and comprehension of terms less often used but critical to varying intensities of weather.
Tang, Xiao-Bin; Meng, Jia; Wang, Peng; Cao, Ye; Huang, Xi; Wen, Liang-Sheng; Chen, Da
2016-04-01
A small-sized UAV (NH-UAV) airborne system with two gamma spectrometers (LaBr3 detector and HPGe detector) was developed to monitor activity concentration in serious nuclear accidents, such as the Fukushima nuclear accident. The efficiency calibration and determination of minimum detectable activity concentration (MDAC) of the specific system were studied by MC simulations at different flight altitudes, different horizontal distances from the detection position to the source term center and different source term sizes. Both air and ground radiation were considered in the models. The results obtained may provide instructive suggestions for in-situ radioactivity measurements of NH-UAV. Copyright © 2016 Elsevier Ltd. All rights reserved.
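The MDAC expression used is not given in the abstract; a common convention (assumed here, not necessarily the authors') is a Currie-type minimum detectable activity converted to a concentration with the simulated detection efficiency:

```latex
% Currie-type minimum detectable activity (a common convention, assumed here):
% B = background counts in the peak region, \varepsilon = full-energy-peak
% efficiency for the flight geometry (from the MC model), p_{\gamma} = gamma
% emission probability, t = counting time.
\mathrm{MDA} = \frac{2.71 + 4.65\sqrt{B}}{\varepsilon\, p_{\gamma}\, t}
```

Dividing by the effective source area or volume seen from a given flight altitude then yields a minimum detectable activity concentration.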
NASA Astrophysics Data System (ADS)
Beck, Jeffrey; Bos, Jeremy P.
2017-05-01
We compare several modifications to the open-source wave optics package, WavePy, intended to improve execution time. Specifically, we compare the relative performance of the Intel MKL, a CPU-based OpenCV distribution, and a GPU-based version. Performance is compared between distributions both on the same compute platform and between a fully-featured computing workstation and the NVIDIA Jetson TX1 platform. Comparisons are drawn in terms of both execution time and power consumption. We have found that substituting OpenCV's Fast Fourier Transform operation provides a marked improvement on all platforms. In addition, we show that embedded platforms offer considerable potential for improved efficiency compared to a fully featured workstation.
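As a minimal illustration of the kind of substitution evaluated (not the WavePy patch itself), the snippet below times NumPy's FFT against OpenCV's cv2.dft on a complex field of the size typical of wave-optics propagation; MKL-backed and GPU builds swap in along the same lines.

```python
import time
import numpy as np
import cv2  # CPU OpenCV build; a CUDA build or MKL-backed NumPy substitutes similarly

N = 2048  # grid size typical of wave-optics propagation
field = (np.random.standard_normal((N, N)) +
         1j * np.random.standard_normal((N, N))).astype(np.complex64)

# Baseline: NumPy FFT of the complex field.
t0 = time.perf_counter()
F_np = np.fft.fft2(field)
t_np = time.perf_counter() - t0

# OpenCV expects a 2-channel real array (real, imag) rather than a complex dtype.
planes = np.dstack([field.real, field.imag]).astype(np.float32)
t0 = time.perf_counter()
F_cv = cv2.dft(planes, flags=cv2.DFT_COMPLEX_OUTPUT)
t_cv = time.perf_counter() - t0

F_cv_complex = F_cv[..., 0] + 1j * F_cv[..., 1]
print(f"numpy: {t_np * 1e3:.1f} ms, opencv: {t_cv * 1e3:.1f} ms, "
      f"max |diff| = {np.abs(F_np - F_cv_complex).max():.3e}")
```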
Potential sources of precipitation in Lake Baikal basin
NASA Astrophysics Data System (ADS)
Shukurov, K. A.; Mokhov, I. I.
2017-11-01
Based on the data of long-term measurements at 23 meteorological stations in the Russian part of the Lake Baikal basin the probabilities of daily precipitation with different intensity and their contribution to the total precipitation are estimated. Using the trajectory model HYSPLIT_4 for each meteorological station for the period 1948-2016 the 10-day backward trajectories of air parcels, the height of these trajectories and distribution of specific humidity along the trajectories are calculated. The average field of power of potential sources of daily precipitation (less than 10 mm) for all meteorological stations in the Russian part of the Lake Baikal basin was obtained using the CWT (concentration weighted trajectory) method. The areas have been identified from which within 10 days water vapor can be transported to the Lake Baikal basin, as well as regions of the most and least powerful potential sources. The fields of the mean height of air parcels trajectories and the mean specific humidity along the trajectories are compared with the field of mean power of potential sources.
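For reference, the CWT field mentioned above is conventionally computed on a latitude-longitude grid as a residence-time-weighted mean of the quantity observed at the receptor (here daily precipitation), in the standard form

```latex
% Standard CWT estimator for grid cell (i,j): c_l is the value measured at the
% receptor for trajectory l, and \tau_{ijl} is the time trajectory l spends in
% cell (i,j); M is the number of trajectories.
\overline{C}_{ij} =
  \frac{\sum_{l=1}^{M} c_{l}\,\tau_{ijl}}{\sum_{l=1}^{M} \tau_{ijl}}
```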
MacLean, Alice; Sweeting, Helen; Hunt, Kate
2012-01-01
Objective To compare the effectiveness of systematic review literature searches that use either generic or specific terms for health outcomes. Design Prospective comparative study of two electronic literature search strategies. The ‘generic’ search included general terms for health such as ‘adolescent health’, ‘health status’, ‘morbidity’, etc. The ‘specific’ search focused on terms for a range of specific illnesses, such as ‘headache’, ‘epilepsy’, ‘diabetes mellitus’, etc. Data sources The authors searched Medline, Embase, the Cumulative Index to Nursing and Allied Health Literature, PsycINFO and the Education Resources Information Center for studies published in English between 1992 and April 2010. Main outcome measures Number and proportion of studies included in the systematic review that were identified from each search. Results The two searches tended to identify different studies. Of 41 studies included in the final review, only three (7%) were identified by both search strategies, 21 (51%) were identified by the generic search only and 17 (41%) were identified by the specific search only. 5 of the 41 studies were also identified through manual searching methods. Studies identified by the two ELS differed in terms of reported health outcomes, while each ELS uniquely identified some of the review's higher quality studies. Conclusions Electronic literature searches (ELS) are a vital stage in conducting systematic reviews and therefore have an important role in attempts to inform and improve policy and practice with the best available evidence. While the use of both generic and specific health terms is conventional for many reviewers and information scientists, there are also reviews that rely solely on either generic or specific terms. Based on the findings, reliance on only the generic or specific approach could increase the risk of systematic reviews missing important evidence and, consequently, misinforming decision makers. However, future research should test the generalisability of these findings. PMID:22734117
The Dilemma of the Modern University in Balancing Competitive Agendas: The USQ Experience
ERIC Educational Resources Information Center
Lovegrove, Bill; Clarke, John
2008-01-01
The Australian government uses numerous strategies to promote specific agendas--including continued efforts to deregulate the higher education sector. These strategies comprise the reduction of government funding to universities in real terms to oblige institutions to seek alternative sources of income; the targeted deployment of government…
Educating the Ablest: Twenty Years Later
ERIC Educational Resources Information Center
Culross, Rita R.
2015-01-01
This study examines the current lives of thirty-five individuals who participated in high school gifted programs twenty years ago. The research specifically looked at educational attainment and career goals in terms of expressed aspirations in high school, using social media and other Internet sources. Results indicated continued support for the…
Operant Conditioning and Learning: Examples, Sources, Technology.
ERIC Educational Resources Information Center
Pedrini, Bonnie C.; Pedrini, D. T.
The purpose of this paper is to relate psychology to teaching generally, and to relate behavior shaping to curriculum, specifically. Focusing on operant conditioning and learning, many studies are cited which illustrate some of the work being done toward effectively shaping or modifying student behavior whether in terms of subject matter or…
Pupils' Understanding of Air Pollution
ERIC Educational Resources Information Center
Dimitriou, Anastasia; Christidou, Vasilia
2007-01-01
This paper reports on a study of pupils' knowledge and understanding of atmospheric pollution. Specifically, the study is aimed at identifying: 1) the extent to which pupils conceptualise the term "air pollution" in a scientifically appropriate way; 2) pupils' knowledge of air pollution sources and air pollutants; and 3) pupils'…
YouTube as a patient-information source for root canal treatment.
Nason, K; Donnelly, A; Duncan, H F
2016-12-01
To assess the content and completeness of YouTube™ as an information source for patients undergoing root canal treatment procedures. YouTube™ (https://www.youtube.com/) was searched for information using three relevant treatment search terms ('endodontics', 'root canal' and 'root canal treatment'). After exclusions (language, no audio, >15 min, duplicates), 20 videos per search term were selected. General video assessment included duration, ownership, views, age, likes/dislikes, target audience and video/audio quality, whilst content was analysed under six categories ('aetiology', 'anatomy', 'symptoms', 'procedure', 'postoperative course' and 'prognosis'). Content was scored for completeness level and statistically analysed using ANOVA and post hoc Tukey's test (P < 0.05). To obtain 60 acceptable videos, 124 were assessed. Depending on the search term employed, the video content and ownership differed markedly. There was wide variation in both the number of video views and 'likes/dislikes'. The average video age was 788 days. In total, 46% of videos were 'posted' by a dentist/specialist source; however, this was search term specific, rising to 70% of uploads for the search 'endodontic', whilst laypersons contributed 18% of uploads for the search 'root canal treatment'. Every video lacked content in the designated six categories, although 'procedure' details were covered more frequently and in better detail than other categories. Videos posted by dental professional (P = 0.046) and commercial sources (P = 0.009) were significantly more complete than videos posted by laypeople. YouTube™ videos for endodontic search terms varied significantly by source and content and were generally incomplete. The danger of patient reliance on YouTube™ is highlighted, as is the need for endodontic professionals to play an active role in directing patients towards alternative high-quality information sources. © 2015 International Endodontic Journal. Published by John Wiley & Sons Ltd.
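The statistical comparison described can be sketched as follows (illustrative only, with made-up completeness scores rather than the study's data): completeness grouped by upload source is tested with a one-way ANOVA followed by Tukey's HSD.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical completeness scores by video source (not the study's data).
scores = {
    "professional": [8, 7, 9, 6, 8, 7],
    "commercial":   [7, 6, 8, 7, 6, 7],
    "layperson":    [3, 4, 2, 5, 3, 4],
}

f_stat, p_value = f_oneway(*scores.values())
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

values = np.concatenate([np.asarray(v, dtype=float) for v in scores.values()])
groups = np.repeat(list(scores.keys()), [len(v) for v in scores.values()])
print(pairwise_tukeyhsd(endog=values, groups=groups, alpha=0.05))
```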
Clark, Gregory M.; Mebane, Christopher A.
2014-01-01
Results from this study indicate that remedial activities conducted since the 1990s have been successful in reducing the concentrations and loads of trace metals in streams and rivers in the Coeur d’Alene and Spokane River Basins. Soils, sediment, surface water, and groundwater in areas of the Coeur d’Alene and Spokane River Basins are contaminated, and the hydrological relations between these media are complex and difficult to characterize. Trace metals have variable source areas, are transported differently depending on hydrologic conditions, and behave differently in response to remedial activities in upstream basins. Based on these findings, no single remedial action would be completely effective in reducing all trace metals to nontoxic concentrations throughout the Coeur d’Alene and Spokane River Basins. Instead, unique cleanup activities targeted at specific media and specific source areas may be necessary to achieve long-term water-quality goals.
Modeling and observations of an elevated, moving infrasonic source: Eigenray methods.
Blom, Philip; Waxler, Roger
2017-04-01
The acoustic ray tracing relations are extended by the inclusion of auxiliary parameters describing variations in the spatial ray coordinates and eikonal vector due to changes in the initial conditions. Computation of these parameters allows one to define the geometric spreading factor along individual ray paths and assists in identification of caustic surfaces so that phase shifts can be easily identified. A method is developed leveraging the auxiliary parameters to identify propagation paths connecting specific source-receiver geometries, termed eigenrays. The newly introduced method is found to be highly efficient in cases where propagation is non-planar due to horizontal variations in the propagation medium or the presence of cross winds. The eigenray method is utilized in analysis of infrasonic signals produced by a multi-stage sounding rocket launch with promising results for applications of tracking aeroacoustic sources in the atmosphere and specifically to analysis of motor performance during dynamic tests.
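The eigenray machinery itself is not reproduced in the abstract; as a simplified illustration of the underlying idea (shoot rays, then solve for the launch angle that lands on the receiver), the sketch below integrates the standard 2-D kinematic ray equations in a hypothetical upward-refracting atmosphere c(z) = c0 + g*z and applies a scalar root solver to the launch elevation. The method described above replaces this finite-difference-style search with the auxiliary (variational) parameters carried along each ray.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

c0, g = 340.0, 0.004   # hypothetical sound-speed profile c(z) = c0 + g*z  [m/s, 1/s]

def rays_rhs(t, y):
    # Standard kinematic ray equations parameterized by travel time:
    # dx/dt = c^2 px, dz/dt = c^2 pz, dpx/dt = 0 (stratified medium), dpz/dt = -(dc/dz)/c.
    x, z, px, pz = y
    c = c0 + g * z
    return [c**2 * px, c**2 * pz, 0.0, -g / c]

def ground_range(theta_deg, t_max=200.0):
    """Horizontal range at which a ray launched at elevation theta returns to z = 0."""
    th = np.radians(theta_deg)
    y0 = [0.0, 0.0, np.cos(th) / c0, np.sin(th) / c0]   # slowness vector, |p| = 1/c
    hit = lambda t, y: y[1]                             # event: altitude back to zero
    hit.terminal, hit.direction = True, -1.0
    sol = solve_ivp(rays_rhs, (0.0, t_max), y0, events=hit, max_step=0.5, rtol=1e-8)
    return sol.y_events[0][0][0] if sol.y_events[0].size else np.nan

# Eigenray: find the launch elevation whose ground range matches a receiver at 8 km.
receiver_range = 8000.0
theta_eig = brentq(lambda th: ground_range(th) - receiver_range, 2.0, 15.0)
print(f"eigenray launch elevation ~ {theta_eig:.2f} deg")
```

In a 3-D, windy atmosphere the same idea runs over two launch angles (azimuth and elevation), which is where the auxiliary parameters pay off by supplying the sensitivity of receiver position to launch angles directly.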
Social influences on adaptive criterion learning.
Cassidy, Brittany S; Dubé, Chad; Gutchess, Angela H
2015-07-01
People adaptively shift decision criteria when given biased feedback encouraging specific types of errors. Given that work on this topic has been conducted in nonsocial contexts, we extended the literature by examining adaptive criterion learning in both social and nonsocial contexts. Specifically, we compared potential differences in criterion shifting given performance feedback from social sources varying in reliability and from a nonsocial source. Participants became lax when given false positive feedback for false alarms, and became conservative when given false positive feedback for misses, replicating prior work. In terms of a social influence on adaptive criterion learning, people became more lax in response style over time if feedback was provided by a nonsocial source or by a social source meant to be perceived as unreliable and low-achieving. In contrast, people adopted a more conservative response style over time if performance feedback came from a high-achieving and reliable source. Awareness that a reliable and high-achieving person had not provided their feedback reduced the tendency to become more conservative, relative to those unaware of the source manipulation. Because teaching and learning often occur in a social context, these findings may have important implications for many scenarios in which people fine-tune their behaviors, given cues from others.
Towards a semantic lexicon for biological language processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Verspoor, K.
It is well understood that natural language processing (NLP) applications require sophisticated lexical resources to support their processing goals. In the biomedical domain, we are privileged to have access to extensive terminological resources in the form of controlled vocabularies and ontologies, which have been integrated into the framework of the National Library of Medicine's Unified Medical Language System's (UMLS) Metathesaurus. However, the existence of such terminological resources does not guarantee their utility for NLP. In particular, we have two core requirements for lexical resources for NLP in addition to the basic enumeration of important domain terms: representation of morphosyntactic informationmore » about those terms, specifically part of speech information and inflectional patterns to support parsing and lemma assignment, and representation of semantic information indicating general categorical information about terms, and significant relations between terms to support text understanding and inference (Hahn et at, 1999). Biomedical vocabularies by and large commonly leave out morphosyntactic information, and where they address semantic considerations, they often do so in an unprincipled manner, for instance by indicating a relation between two concepts without indicating the type of that relation. But all is not lost. The UMLS knowledge sources include two additional resources which are relevant - the SPECIALIST lexicon, a lexicon addressing our morphosyntactic requirements, and the Semantic Network, a representation of core conceptual categories in the biomedical domain. The coverage of these two knowledge sources with respect to the full coverage of the Metathesaurus is, however, not entirely clear. Furthermore, when our goals are specifically to process biological text - and often more specifically, text in the molecular biology domain - it is difficult to say whether the coverage of these resources is meaningful. The utility of the UMLS knowledge sources for medical language processing (MLP) has been explored (Johnson, 1999; Friedman et al 2001); the time has now come to repeat these experiments with respect to biological language processing (BLP). To that end, this paper presents an analysis of ihe UMLS resources, specifically with an eye towards constructing lexical resources suitable for BLP. We follow the paradigm presented in Johnson (1999) for medical language, exploring overlap between the UMLS Metathesaurus and SPECIALIST lexicon to construct a morphosyntactic and semantically-specified lexicon, and then further explore the overlap with a relevant domain corpus for molecular biology.« less
What's in a ray set: moving towards a unified ray set format
NASA Astrophysics Data System (ADS)
Muschaweck, Julius
2011-10-01
For the purpose of optical simulation, a plethora of formats exist to describe the properties of a light source. Except for the EULUMDAT and IES formats which describe sources in terms of aperture area and far field intensity, all these formats are vendor specific, and no generally accepted standard exists. Most illumination simulation software vendors use their own format for ray sets, which describe sources in terms of many rays. Some of them keep their format definition proprietary. Thus, software packages typically can read or write only their own specific format, although the actual data content is not so different. Typically, they describe origin and direction of each ray in 3D vectors, and use one more single number for magnitude, where magnitude may denote radiant flux, luminous flux (equivalently tristimulus Y), or tristimulus X and Z. Sometimes each ray also carries its wavelength, while other formats allow to specify an overall spectrum for the whole source. In addition, in at least one format, polarization properties are also included for each ray. This situation makes it inefficient and potentially error prone for light source manufacturers to provide ray data sets for their sources in many different formats. Furthermore, near field goniometer vendors again use their proprietary formats to store the source description in terms of luminance data, and offer their proprietary software to generate ray sets from this data base. Again, the plethora of ray sets make the ray set production inefficient and potentially error prone. In this paper, we propose to describe ray data sets in terms of phase space, as a step towards a standardized ray set format. It is well known that luminance and radiance can be defined as flux density in phase space: luminance is flux divided by etendue. Therefore, single rays can be thought of as center points of phase space cells, where each cell possesses its volume (i.e. etendue), its flux, and therefore its luminance. In addition, each phase space cell possesses its spectrum, and its polarization properties. We show how this approach leads to a unification of the EULUMDAT/IES, ray set and near field goniometer formats, making possible the generation of arbitrarily many additional rays by luminance interpolation. We also show how the EULUMDAT/IES and individual ray set formats can be derived from the proposed general format, making software using a possible standard format downward compatible.
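To make the proposal concrete: a unified phase-space record would carry, per ray, the cell's position, direction, etendue (cell volume), flux, spectrum, and polarization, with luminance recovered as flux divided by etendue. The layout below is only an illustrative sketch, not a published or proposed file format.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class PhaseSpaceRay:
    """One ray as the center point of a phase-space cell (illustrative layout only)."""
    position: Tuple[float, float, float]    # ray origin, mm
    direction: Tuple[float, float, float]   # unit direction vector
    etendue: float                          # phase-space cell volume, mm^2 sr
    flux: float                             # lm (or W for radiometric data)
    spectrum: Dict[float, float] = field(default_factory=dict)   # wavelength nm -> weight
    stokes: Optional[Tuple[float, float, float, float]] = None   # polarization, if known

    @property
    def luminance(self) -> float:
        # Luminance (radiance) is flux density in phase space: flux / etendue.
        return self.flux / self.etendue

ray = PhaseSpaceRay(position=(0.0, 0.0, 0.0), direction=(0.0, 0.0, 1.0),
                    etendue=1e-4, flux=2e-3)
print(f"cell luminance = {ray.luminance:.1f} lm/(mm^2 sr)")
```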
Mitchell, Daniel J; Cusack, Rhodri
2011-01-01
An electroencephalographic (EEG) marker of the limited contents of human visual short-term memory (VSTM) has previously been described. Termed contralateral delay activity, this consists of a sustained, posterior, negative potential that correlates with memory load and is greatest contralateral to the remembered hemifield. The current investigation replicates this finding and uses magnetoencephalography (MEG) to characterize its magnetic counterparts and their neural generators as they evolve throughout the memory delay. A parametric manipulation of memory load, within and beyond capacity limits, allows separation of signals that asymptote with behavioral VSTM performance from additional responses that contribute to a linear increase with set-size. Both EEG and MEG yielded bilateral signals that track the number of objects held in memory, and contralateral signals that are independent of memory load. In MEG, unlike EEG, the contralateral interaction between hemisphere and item load is much weaker, suggesting that bilateral and contralateral markers of memory load reflect distinct sources to which EEG and MEG are differentially sensitive. Nonetheless, source estimation allowed both the bilateral and the weaker contralateral capacity-limited responses to be localized, along with a load-independent contralateral signal. Sources of global and hemisphere-specific signals all localized to the posterior intraparietal sulcus during the early delay. However the bilateral load response peaked earlier and its generators shifted later in the delay. Therefore the hemifield-specific response may be more closely tied to memory maintenance while the global load response may be involved in initial processing of a limited number of attended objects, such as their individuation or consolidation into memory.
How to Keep Your Health Information Private and Secure
... permanently.
When Using Mobile Devices
· Research mobile apps (software programs that perform one or more specific functions) before you download and install any of them. Be sure to use known app websites or trusted sources.
· Read the terms of service and the privacy notice of the mobile app ...
Arboleya, Silvia; Ruas-Madiedo, Patricia; Margolles, Abelardo; Solís, Gonzalo; Salminen, Seppo; de Los Reyes-Gavilán, Clara G; Gueimonde, Miguel
2011-09-01
Most of the current commercial probiotic strains have not been selected for specific applications, but rather on the basis of their technological potential for use in diverse applications. Therefore, by selecting them from appropriate sources, depending on the target population, it is likely that better performing strains may be identified. Few strains have been specifically selected for human neonates, where the applications of probiotics may have a great positive impact. Breast-milk constitutes an interesting source of potentially probiotic bifidobacteria for inclusion in infant formulas and foods targeted to both pre-term and full-term infants. In this study six Bifidobacterium strains isolated from breast-milk were phenotypically and genotypically characterised according to international guidelines for probiotics. In addition, different in vitro tests were used to assess the safety and probiotic potential of the strains. Although clinical data would be needed before drawing any conclusion on the probiotic properties of the strains, our results indicate that some of them may have probiotic potential for their inclusion in products targeting infants. Copyright © 2010 Elsevier B.V. All rights reserved.
Thorn, A S; Gathercole, S E
2001-06-01
Language differences in verbal short-term memory were investigated in two experiments. In Experiment 1, bilinguals with high competence in English and French and monolingual English adults with extremely limited knowledge of French were assessed on their serial recall of words and nonwords in both languages. In all cases recall accuracy was superior in the language with which individuals were most familiar, a first-language advantage that remained when variation due to differential rates of articulation in the two languages was taken into account. In Experiment 2, bilinguals recalled lists of English and French words with and without concurrent articulatory suppression. First-language superiority persisted under suppression, suggesting that the language differences in recall accuracy were not attributable to slower rates of subvocal rehearsal in the less familiar language. The findings indicate that language-specific differences in verbal short-term memory do not exclusively originate in the subvocal rehearsal process. It is suggested that one source of language-specific variation might relate to the use of long-term knowledge to support short-term memory performance.
Block 4 solar cell module design and test specification for residential applications
NASA Technical Reports Server (NTRS)
1978-01-01
Near-term design, qualification and acceptance requirements are provided for terrestrial solar cell modules suitable for incorporation in photovoltaic power sources (2 kW to 10 kW) applied to single family residential installations. Requirement levels and recommended design limits for selected performance criteria are specified for modules intended principally for rooftop installations. Modules satisfying the requirements of this specification fall into one of two categories, residential panel or residential shingle, both meeting general performance requirements plus additional category peculiar constraints.
NASA Technical Reports Server (NTRS)
1979-01-01
Results of a study leading to the preliminary design of a five passenger hybrid vehicle utilizing two energy sources (electricity and gasoline/diesel fuel) to minimize petroleum usage on a fleet basis are presented. The study methodology is described. Vehicle characterizations, the mission description, characterization, and impact on potential sales, and the rationale for the selection of the reference internal combustion engine vehicle are presented. Conclusions and recommendations of the mission analysis and performance specification report are included.
NASA Astrophysics Data System (ADS)
Chino, Masamichi; Terada, Hiroaki; Nagai, Haruyasu; Katata, Genki; Mikami, Satoshi; Torii, Tatsuo; Saito, Kimiaki; Nishizawa, Yukiyasu
2016-08-01
The Fukushima Daiichi nuclear power reactor units that generated large amounts of airborne discharges during the period of March 12-21, 2011 were identified individually by analyzing the combination of measured 134Cs/137Cs depositions on ground surfaces and atmospheric transport and deposition simulations. Because the values of 134Cs/137Cs are different in reactor units owing to fuel burnup differences, the 134Cs/137Cs ratio measured in the environment was used to determine which reactor unit ultimately contaminated a specific area. Atmospheric dispersion model simulations were used for predicting specific areas contaminated by each dominant release. Finally, by comparing the results from both sources, the specific reactor units that yielded the most dominant atmospheric release quantities could be determined. The major source reactor units were Unit 1 in the afternoon of March 12, 2011, Unit 2 during the period from the late night of March 14 to the morning of March 15, 2011. These results corresponded to those assumed in our previous source term estimation studies. Furthermore, new findings suggested that the major source reactors from the evening of March 15, 2011 were Units 2 and 3 and that the dominant source reactor on March 20, 2011 temporally changed from Unit 3 to Unit 2.
Chino, Masamichi; Terada, Hiroaki; Nagai, Haruyasu; Katata, Genki; Mikami, Satoshi; Torii, Tatsuo; Saito, Kimiaki; Nishizawa, Yukiyasu
2016-08-22
The Fukushima Daiichi nuclear power reactor units that generated large amounts of airborne discharges during the period of March 12-21, 2011 were identified individually by analyzing the combination of measured 134Cs/137Cs depositions on ground surfaces and atmospheric transport and deposition simulations. Because the values of 134Cs/137Cs are different in reactor units owing to fuel burnup differences, the 134Cs/137Cs ratio measured in the environment was used to determine which reactor unit ultimately contaminated a specific area. Atmospheric dispersion model simulations were used for predicting specific areas contaminated by each dominant release. Finally, by comparing the results from both sources, the specific reactor units that yielded the most dominant atmospheric release quantities could be determined. The major source reactor units were Unit 1 in the afternoon of March 12, 2011, Unit 2 during the period from the late night of March 14 to the morning of March 15, 2011. These results corresponded to those assumed in our previous source term estimation studies. Furthermore, new findings suggested that the major source reactors from the evening of March 15, 2011 were Units 2 and 3 and that the dominant source reactor on March 20, 2011 temporally changed from Unit 3 to Unit 2.
Chino, Masamichi; Terada, Hiroaki; Nagai, Haruyasu; Katata, Genki; Mikami, Satoshi; Torii, Tatsuo; Saito, Kimiaki; Nishizawa, Yukiyasu
2016-01-01
The Fukushima Daiichi nuclear power reactor units that generated large amounts of airborne discharges during the period of March 12–21, 2011 were identified individually by analyzing the combination of measured 134Cs/137Cs depositions on ground surfaces and atmospheric transport and deposition simulations. Because the values of 134Cs/137Cs are different in reactor units owing to fuel burnup differences, the 134Cs/137Cs ratio measured in the environment was used to determine which reactor unit ultimately contaminated a specific area. Atmospheric dispersion model simulations were used for predicting specific areas contaminated by each dominant release. Finally, by comparing the results from both sources, the specific reactor units that yielded the most dominant atmospheric release quantities could be determined. The major source reactor units were Unit 1 in the afternoon of March 12, 2011, Unit 2 during the period from the late night of March 14 to the morning of March 15, 2011. These results corresponded to those assumed in our previous source term estimation studies. Furthermore, new findings suggested that the major source reactors from the evening of March 15, 2011 were Units 2 and 3 and that the dominant source reactor on March 20, 2011 temporally changed from Unit 3 to Unit 2. PMID:27546490
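The attribution logic these reports describe can be illustrated with a small, hypothetical calculation: given burnup-dependent 134Cs/137Cs ratios assumed for each unit (the numbers below are placeholders, not the study's values), a measured deposition ratio is assigned to the closest-matching unit, with the dispersion-model comparison described above used as the cross-check.

```python
# Hypothetical unit-specific 134Cs/137Cs activity ratios (placeholders only; the
# burnup-derived values actually used are given in the papers, not reproduced here).
UNIT_RATIOS = {"Unit 1": 0.94, "Unit 2": 1.08, "Unit 3": 1.04}

def attribute(measured_ratio: float, tolerance: float = 0.02) -> str:
    """Assign a measured 134Cs/137Cs deposition ratio to the closest-matching unit."""
    unit, ref = min(UNIT_RATIOS.items(), key=lambda kv: abs(kv[1] - measured_ratio))
    if abs(ref - measured_ratio) > tolerance:
        return "ambiguous (outside tolerance; rely on the dispersion simulation)"
    return unit

for r in (0.95, 1.07, 1.00):
    print(f"measured ratio {r:.2f} -> {attribute(r)}")
```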
Evaluation of actuator energy storage and power sources for spacecraft applications
NASA Technical Reports Server (NTRS)
Simon, William E.; Young, Fred M.
1993-01-01
The objective of this evaluation is to determine an optimum energy storage/power source combination for electrical actuation systems for existing (Solid Rocket Booster (SRB), Shuttle) and future (Advanced Launch System (ALS), Shuttle Derivative) vehicles. Characteristic of these applications is the requirement for high power pulses (50-200 kW) for short times (milliseconds to seconds), coupled with longer-term base or 'housekeeping' requirements (5-16 kW). Specific study parameters (e.g., weight, volume, etc.) as stated in the proposal and specified in the Statement of Work (SOW) are included.
Reducing mortality risk by targeting specific air pollution sources: Suva, Fiji.
Isley, C F; Nelson, P F; Taylor, M P; Stelcer, E; Atanacio, A J; Cohen, D D; Mani, F S; Maata, M
2018-01-15
Health implications of air pollution vary depending on pollutant sources. This work determines the value, in terms of reduced mortality, of reducing ambient particulate matter (PM2.5: effective aerodynamic diameter 2.5 µm or less) concentrations attributable to different emission sources. Suva, a Pacific Island city with substantial input from combustion sources, is used as a case study. Elemental concentrations were determined by ion beam analysis for PM2.5 samples from Suva spanning one year. Sources of PM2.5 were quantified by positive matrix factorisation. A review of recent literature was carried out to delineate the mortality risk associated with these sources. These risk factors were then applied to Suva to calculate the mortality reduction that may be achieved through reductions in pollutant levels. Higher risk ratios for black carbon and sulphur resulted in mortality predictions for PM2.5 from fossil fuel combustion, road vehicle emissions and waste burning that surpass predictions for these sources based on the health risk of PM2.5 mass alone. Predicted mortality for Suva from fossil fuel smoke exceeds the national toll from road accidents in Fiji. The greatest benefit for Suva, in terms of reduced mortality, is likely to be achieved by reducing emissions from fossil fuel combustion (diesel), vehicles and waste burning. Copyright © 2017. Published by Elsevier B.V.
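As a hedged illustration of the final step (turning a source-specific PM2.5 reduction into avoided deaths), the sketch below applies a generic log-linear concentration-response function; the baseline deaths and relative-risk value are placeholders, not the source-specific coefficients used in the paper.

```python
import math

def avoided_deaths(baseline_deaths, rr_per_10ug, delta_conc_ug_m3):
    """Deaths avoidable by removing delta_conc of source-specific PM2.5,
    assuming a log-linear concentration-response function."""
    rr = math.exp(math.log(rr_per_10ug) / 10.0 * delta_conc_ug_m3)
    attributable_fraction = (rr - 1.0) / rr
    return baseline_deaths * attributable_fraction

# Placeholder numbers for illustration only.
print(round(avoided_deaths(baseline_deaths=500, rr_per_10ug=1.06, delta_conc_ug_m3=4.0)))
```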
Gene and protein nomenclature in public databases
Fundel, Katrin; Zimmer, Ralf
2006-01-01
Background Frequently, several alternative names are in use for biological objects such as genes and proteins. Applications like manual literature search, automated text-mining, named entity identification, gene/protein annotation, and linking of knowledge from different information sources require the knowledge of all used names referring to a given gene or protein. Various organism-specific or general public databases aim at organizing knowledge about genes and proteins. These databases can be used for deriving gene and protein name dictionaries. So far, little is known about the differences between databases in terms of size, ambiguities and overlap. Results We compiled five gene and protein name dictionaries for each of the five model organisms (yeast, fly, mouse, rat, and human) from different organism-specific and general public databases. We analyzed the degree of ambiguity of gene and protein names within and between dictionaries and against a lexicon of common English words and domain-related non-gene terms, and we compared the data sources in terms of the size of the extracted dictionaries and the overlap of synonyms between them. The study shows that the number of genes/proteins and synonyms covered in individual databases varies significantly for a given organism, and that the degree of ambiguity of synonyms varies significantly between different organisms. Furthermore, it shows that, despite considerable efforts of co-curation, the overlap of synonyms in different data sources is rather moderate and that the degree of ambiguity of gene names with common English words and domain-related non-gene terms varies depending on the considered organism. Conclusion In conclusion, these results indicate that combining data contained in different databases allows the generation of gene and protein name dictionaries that contain significantly more used names than dictionaries obtained from individual data sources. Furthermore, curation of combined dictionaries considerably increases size and decreases ambiguity. The entries of the curated synonym dictionary are available for manual querying, editing, and PubMed- or Google-search via the ProThesaurus-wiki. For automated querying via custom software, we offer a web service and an exemplary client application. PMID:16899134
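For readers who want to reproduce the two core measurements, synonym ambiguity within a dictionary and synonym overlap between data sources, a minimal sketch is given below; the dictionary structure (identifier mapped to a set of synonyms) is an assumption for illustration, not the paper's actual data format.

```python
def ambiguity(dictionary):
    """Fraction of distinct synonyms that map to more than one identifier."""
    owners = {}
    for identifier, synonyms in dictionary.items():
        for name in synonyms:
            owners.setdefault(name.lower(), set()).add(identifier)
    ambiguous = sum(1 for ids in owners.values() if len(ids) > 1)
    return ambiguous / len(owners)

def synonym_overlap(dict_a, dict_b):
    """Jaccard overlap of the synonym sets extracted from two data sources."""
    names_a = {n.lower() for syns in dict_a.values() for n in syns}
    names_b = {n.lower() for syns in dict_b.values() for n in syns}
    return len(names_a & names_b) / len(names_a | names_b)

# Toy dictionaries standing in for two database-derived sources.
source_1 = {"GENE1": {"Abc1", "ABC-1"}, "GENE2": {"Def2", "light chain"}}
source_2 = {"GENE1": {"Abc1"}, "GENE3": {"Ghi3", "light chain"}}
print(ambiguity({**source_1, **source_2}), synonym_overlap(source_1, source_2))
```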
Is Privately Funded Research on the Rise in Ocean Science?
NASA Astrophysics Data System (ADS)
Spring, M.; Cooksey, S. W.; Orcutt, J. A.; Ramberg, S. E.; Jankowski, J. E.; Mengelt, C.
2014-12-01
While federal funding for oceanography is leveling off or declining, private sector funding from industry and philanthropy appears to be on the rise. The Ocean Studies Board of the National Research Council is discussing these changes in the ocean science funding landscape. In 2014 the Board convened experts to better understand the long-term public and private funding trends for the ocean sciences and the implications of such trends for the ocean science enterprise and the nation. Specific topics of discussion included: (1) the current scope of philanthropic and industry funding for the ocean sciences; (2) the long-term trends in the funding balance between federal and other sources of funding; (3) the priorities and goals for private funders; and (4) the characteristics of various modes of engagement for private funders. Although public funding remains the dominant source of research funding, it is unclear how far or fast that balance might shift in the future, or what a shifting balance may mean. There has been no comprehensive assessment of the magnitude and impact of privately funded science, particularly the ocean sciences, as public funding sources decline. Nevertheless, the existing data can shed some light on these questions. We will present available data on long-term trends in federal and other sources of funding for science (focusing on ocean science) and report on preliminary findings from a panel discussion with key private foundations and industry funders.
Bardelli, Silvana
2010-04-01
Stem cells contribute to innate healing and hold promise for regenerative medicine. Stem cell banking through long-term storage of different stem cell platforms represents a fundamental means of preserving the original features of stem cells for patient-specific clinical applications. Stem cell research and clinical translation constitute fundamental and indivisible modules catalyzed through biobanking activity, generating a return on investment.
ERIC Educational Resources Information Center
Jansen, Malte; Schroeders, Ulrich; Lüdtke, Oliver; Marsh, Herbert W.
2015-01-01
Students evaluate their achievement in a specific domain in relation to their achievement in other domains and form their self-concepts accordingly. These comparison processes have been termed "dimensional comparisons" and shown to be an important source of academic self-concepts in addition to social and temporal comparisons. Research…
26 CFR 31.3401(a)-2 - Exclusions from wages.
Code of Federal Regulations, 2010 CFR
2010-04-01
Title 26 (Internal Revenue), Collection of Income Tax at Source, § 31.3401(a)-2 Exclusions from wages: (a) In general. (1) The term "wages" ... specifically excepted from wages under section 3401(a). (2) The exception attaches to the remuneration for ...
Siponen, Taina; Yli-Tuomi, Tarja; Aurela, Minna; Dufva, Hilkka; Hillamo, Risto; Hirvonen, Maija-Riitta; Huttunen, Kati; Pekkanen, Juha; Pennanen, Arto; Salonen, Iiris; Tiittanen, Pekka; Salonen, Raimo O; Lanki, Timo
2015-01-01
Objective To compare short-term effects of fine particles (PM2.5; aerodynamic diameter <2.5 µm) from different sources on the blood levels of markers of systemic inflammation. Methods We followed a panel of 52 ischaemic heart disease patients from 15 November 2005 to 21 April 2006 with clinic visits in every second week in the city of Kotka, Finland, and determined nine inflammatory markers from blood samples. In addition, we monitored outdoor air pollution at a fixed site during the study period and conducted a source apportionment of PM2.5 using the Environmental Protection Agency's model EPA PMF 3.0. We then analysed associations between levels of source-specific PM2.5 and markers of systemic inflammation using linear mixed models. Results We identified five source categories: regional and long-range transport (LRT), traffic, biomass combustion, sea salt, and pulp industry. We found most evidence for the relation of air pollution and inflammation in LRT, traffic and biomass combustion; the most relevant inflammation markers were C-reactive protein, interleukin-12 and myeloperoxidase. Sea salt was not positively associated with any of the inflammatory markers. Conclusions Results suggest that PM2.5 from several sources, such as biomass combustion and traffic, are promoters of systemic inflammation, a risk factor for cardiovascular diseases. PMID:25479755
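The source-apportionment step can be approximated with off-the-shelf matrix factorization; EPA PMF is standalone software, so the sketch below uses scikit-learn's non-negative matrix factorization as a loose stand-in, with the matrix shape and the number of factors (five, matching the source categories reported) as assumptions and random data in place of the Kotka measurements.

```python
import numpy as np
from sklearn.decomposition import NMF

# X: time x chemical-species matrix of non-negative PM2.5 constituent
# concentrations (placeholder random data stands in for the measurements).
rng = np.random.default_rng(0)
X = rng.gamma(shape=2.0, scale=1.0, size=(120, 15))

model = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
contributions = model.fit_transform(X)   # sample-by-source contributions
profiles = model.components_             # source-by-species chemical profiles
```

Each row of `profiles` would then be labelled by its characteristic species (for example, black carbon for traffic or sodium for sea salt), and the daily `contributions` would enter the linear mixed models as exposure variables.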
Greenberg, Jacob A.; Lujan, Daniel A.; DiMenna, Mark A.; Wearing, Helen J.; Hofkin, Bruce V.
2013-01-01
Culex quinquefasciatus Say (Diptera: Culicidae) and Aedes vexans Meigen are two of the most abundant mosquitoes in Bernalillo County, New Mexico, USA. In this study, a polymerase chain reaction based methodology was used to identify the sources of blood meals taken by these two species. Ae. vexans was found to take a large proportion of its meals from mammals. Although less specific in terms of its blood meal preferences, Cx. quinquefasciatus was found to feed more commonly on birds. The results for Ae. vexans are similar to those reported for this species in other parts of their geographic range. Cx. quinquefasciatus appears to be more variable in terms of its host feeding under different environmental or seasonal circumstances. The implications of these results for arbovirus transmission are discussed. PMID:24224615
NASA Astrophysics Data System (ADS)
Smith, R. A.; Moore, R. B.; Shanley, J. B.; Miller, E. K.; Kamman, N. C.; Nacci, D.
2009-12-01
Mercury (Hg) concentrations in fish and aquatic wildlife are complex functions of atmospheric Hg deposition rate, terrestrial and aquatic watershed characteristics that influence Hg methylation and export, and food chain characteristics determining Hg bioaccumulation. Because of the complexity and incomplete understanding of these processes, regional-scale models of fish tissue Hg concentration are necessarily empirical in nature, typically constructed through regression analysis of fish tissue Hg concentration data from many sampling locations on a set of potential explanatory variables. Unless the data sets are unusually long and show clear time trends, the empirical basis for model building must be based solely on spatial correlation. Predictive regional scale models are highly useful for improving understanding of the relevant biogeochemical processes, as well as for practical fish and wildlife management and human health protection. Mechanistically, the logical arrangement of explanatory variables is to multiply each of the individual Hg source terms (e.g. dry, wet, and gaseous deposition rates, and residual watershed Hg) for a given fish sampling location by source-specific terms pertaining to methylation, watershed transport, and biological uptake for that location (e.g. SO4 availability, hill slope, lake size). This mathematical form has the desirable property that predicted tissue concentration will approach zero as all individual source terms approach zero. One complication with this form, however, is that it is inconsistent with the standard linear multiple regression equation in which all terms (including those for sources and physical conditions) are additive. An important practical disadvantage of a model in which the Hg source terms are additive (rather than multiplicative) with their modifying factors is that predicted concentration is not zero when all sources are zero, making it unreliable for predicting the effects of large future reductions in Hg deposition. In this paper we compare the results of using several different linear and non-linear models in an analysis of watershed and fish Hg data for 450 New England lakes. The differences in model results pertain to both their utility in interpreting methylation and export processes as well as in fisheries management.
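To make the multiplicative-versus-additive distinction concrete, here is a minimal nonlinear least-squares sketch of a model in which deposition source terms are scaled by exponential modifier terms, so the prediction goes to zero when all sources do; the variable names, functional form, and synthetic data are illustrative assumptions, not the authors' fitted model.

```python
import numpy as np
from scipy.optimize import curve_fit

def multiplicative_model(X, a_wet, a_dry, b_so4, b_slope):
    """Hg_fish ~ (sum of deposition source terms) * exp(watershed modifiers)."""
    wet, dry, so4, slope = X
    return (a_wet * wet + a_dry * dry) * np.exp(b_so4 * so4 + b_slope * slope)

# Placeholder data standing in for the 450-lake data set.
rng = np.random.default_rng(1)
wet, dry = rng.uniform(5, 25, 450), rng.uniform(1, 10, 450)
so4, slope = rng.uniform(0, 5, 450), rng.uniform(0, 0.3, 450)
hg_true = multiplicative_model((wet, dry, so4, slope), 0.01, 0.02, 0.1, 1.5)
hg_obs = hg_true * rng.lognormal(0.0, 0.2, 450)

params, _ = curve_fit(multiplicative_model, (wet, dry, so4, slope), hg_obs,
                      p0=[0.01, 0.01, 0.0, 0.0])
```

An additive counterpart (source terms and modifiers all summed) can be fit the same way; the point made in the abstract is that only the multiplicative form behaves sensibly as deposition is driven toward zero.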
2017-01-01
Background The Internet is considered to be an effective source of health information and consultation for immigrants. Nutritional interventions for immigrants have become increasingly common over the past few decades. However, each population of immigrants has specific needs. Understanding the factors influencing the success of nutrition programs among immigrants requires an examination of their attitudes and perceptions, as well as their cultural values. Objective The purpose of this study was to examine perceptions of the Internet as a tool for long-term and “real-time” professional, psychological, and nutritional treatment for immigrants from the former Soviet Union who immigrated to Israel (IIFSU) from 1990 to 2012. Methods A sample of nutrition forum users (n=18) was interviewed and comments of 80 users were analyzed qualitatively in accordance with the grounded theory principles. Results The results show that IIFSU perceive the Internet as a platform for long-term and “real-time” dietary treatment and not just as an informative tool. IIFSU report benefits of online psychological support with professional dietary treatment. They attribute importance to cultural customization, which helps reduce barriers to intervention. Conclusions In light of the results, when formulating nutritional programs, it is essential to have a specific understanding of immigrants’ cultural characteristics and their patterns of Internet use concerning dietary care. PMID:28159729
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zachara, John M.; Chen, Xingyuan; Murray, Chris
A tightly spaced well-field within a groundwater uranium (U) plume in the groundwater-surface water transition zone was monitored for a three-year period for groundwater elevation and dissolved solutes. The plume discharges to the Columbia River, which displays a dramatic spring stage surge resulting from mountain snowmelt. Groundwater exhibits a low hydrologic gradient and chemical differences with river water. River water intrudes the site in spring. Specific aims were to assess the impacts of river intrusion on dissolved uranium (Uaq), specific conductance (SpC), and other solutes, and to discriminate between transport, geochemical, and source term heterogeneity effects. Time series trends for Uaq and SpC were complex and displayed large temporal well-to-well variability as a result of water table elevation fluctuations, river water intrusion, and changes in groundwater flow directions. The wells were clustered into subsets exhibiting common temporal behaviors resulting from the intrusion dynamics of river water and the location of source terms. Concentration hot spots were observed in groundwater that varied in location with increasing water table elevation. Heuristic reactive transport modeling with PFLOTRAN demonstrated that mobilized U was transported between wells and source terms in complex trajectories, and was diluted as river water entered and exited the groundwater system. While uranium time-series concentration trends varied significantly from year to year as a result of climate-caused differences in the spring hydrograph, common and partly predictable response patterns were observed that were driven by water table elevation, and the extent and duration of the river water intrusion event.
Radionuclides in the Arctic seas from the former Soviet Union: Potential health and ecological risks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Layton, D W; Edson, R; Varela, M
1999-11-15
The primary goal of the assessment reported here is to evaluate the health and environmental threat to coastal Alaska posed by radioactive-waste dumping in the Arctic and Northwest Pacific Oceans by the FSU. In particular, the FSU discarded 16 nuclear reactors from submarines and an icebreaker in the Kara Sea near the island of Novaya Zemlya, of which 6 contained spent nuclear fuel (SNF); disposed of liquid and solid wastes in the Sea of Japan; lost a 90Sr-powered radioisotope thermoelectric generator at sea in the Sea of Okhotsk; and disposed of liquid wastes at several sites in the Pacific Ocean, east of the Kamchatka Peninsula. In addition to these known sources in the oceans, the RAIG evaluated FSU waste-disposal practices at inland weapons-development sites that have contaminated major rivers flowing into the Arctic Ocean. The RAIG evaluated these sources for the potential for release to the environment, transport, and impact to Alaskan ecosystems and peoples through a variety of scenarios, including a worst-case total instantaneous and simultaneous release of the sources under investigation. The risk-assessment process described in this report is applicable to and can be used by other circumpolar countries, with the addition of information about specific ecosystems and human life-styles. They can use the ANWAP risk-assessment framework and approach used by ONR to establish potential doses for Alaska, but add their own specific data sets about human and ecological factors. The ANWAP risk assessment addresses the following Russian wastes, media, and receptors: dumped nuclear submarines and icebreaker in Kara Sea--marine pathways; solid reactor parts in Sea of Japan and Pacific Ocean--marine pathways; thermoelectric generator in Sea of Okhotsk--marine pathways; current known aqueous wastes in Mayak reservoirs and Asanov Marshes--riverine to marine pathways; and Alaska as receptor. For these waste and source terms addressed, other pathways, such as atmospheric transport, could be considered under future-funded research efforts for impacts to Alaska. The ANWAP risk assessment does not address the following wastes, media, and receptors: radioactive sources in Alaska (except to add perspective for Russian source term); radioactive wastes associated with Russian naval military operations and decommissioning; Russian production reactor and spent-fuel reprocessing facilities nonaqueous source terms; atmospheric, terrestrial and nonaqueous pathways; and dose calculations for any circumpolar locality other than Alaska. These other, potentially serious sources of radioactivity to the Arctic environment, while outside the scope of the current ANWAP mandate, should be considered for future funding research efforts.
Unsupervised Segmentation of Head Tissues from Multi-modal MR Images for EEG Source Localization.
Mahmood, Qaiser; Chodorowski, Artur; Mehnert, Andrew; Gellermann, Johanna; Persson, Mikael
2015-08-01
In this paper, we present and evaluate an automatic unsupervised segmentation method, hierarchical segmentation approach (HSA)-Bayesian-based adaptive mean shift (BAMS), for use in the construction of a patient-specific head conductivity model for electroencephalography (EEG) source localization. It is based on a HSA and BAMS for segmenting the tissues from multi-modal magnetic resonance (MR) head images. The evaluation of the proposed method was done both directly in terms of segmentation accuracy and indirectly in terms of source localization accuracy. The direct evaluation was performed relative to a commonly used reference method brain extraction tool (BET)-FMRIB's automated segmentation tool (FAST) and four variants of the HSA using both synthetic data and real data from ten subjects. The synthetic data includes multiple realizations of four different noise levels and several realizations of typical noise with a 20% bias field level. The Dice index and Hausdorff distance were used to measure the segmentation accuracy. The indirect evaluation was performed relative to the reference method BET-FAST using synthetic two-dimensional (2D) multimodal magnetic resonance (MR) data with 3% noise and synthetic EEG (generated for a prescribed source). The source localization accuracy was determined in terms of localization error and relative error of potential. The experimental results demonstrate the efficacy of HSA-BAMS, its robustness to noise and the bias field, and that it provides better segmentation accuracy than the reference method and variants of the HSA. They also show that it leads to a more accurate localization accuracy than the commonly used reference method and suggest that it has potential as a surrogate for expert manual segmentation for the EEG source localization problem.
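Since the direct evaluation hinges on the Dice index, a minimal implementation of that overlap measure for binary tissue masks is sketched below (the Hausdorff distance is available, for example, via scipy.spatial.distance.directed_hausdorff); the toy masks are placeholders, not segmentation outputs from the study.

```python
import numpy as np

def dice_index(segmentation, reference):
    """Dice overlap between two binary masks of the same tissue class."""
    seg = np.asarray(segmentation, dtype=bool)
    ref = np.asarray(reference, dtype=bool)
    intersection = np.logical_and(seg, ref).sum()
    return 2.0 * intersection / (seg.sum() + ref.sum())

# Toy example: two slightly offset masks of the same structure.
a = np.zeros((64, 64), dtype=bool); a[10:40, 10:40] = True
b = np.zeros((64, 64), dtype=bool); b[12:42, 10:40] = True
print(round(dice_index(a, b), 3))
```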
Porous elastic system with nonlinear damping and source terms
NASA Astrophysics Data System (ADS)
Freitas, Mirelson M.; Santos, M. L.; Langa, José A.
2018-02-01
We study the long-time behavior of porous-elastic system, focusing on the interplay between nonlinear damping and source terms. The sources may represent restoring forces, but may also be focusing thus potentially amplifying the total energy which is the primary scenario of interest. By employing nonlinear semigroups and the theory of monotone operators, we obtain several results on the existence of local and global weak solutions, and uniqueness of weak solutions. Moreover, we prove that such unique solutions depend continuously on the initial data. Under some restrictions on the parameters, we also prove that every weak solution to our system blows up in finite time, provided the initial energy is negative and the sources are more dominant than the damping in the system. Additional results are obtained via careful analysis involving the Nehari Manifold. Specifically, we prove the existence of a unique global weak solution with initial data coming from the "good" part of the potential well. For such a global solution, we prove that the total energy of the system decays exponentially or algebraically, depending on the behavior of the dissipation in the system near the origin. We also prove the existence of a global attractor.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freeze, R.A.
Many emerging remediation technologies are designed to remove contaminant mass from source zones at DNAPL sites in response to regulatory requirements. There is often concern in the regulated community as to whether mass removal actually reduces risk, or whether the small risk reductions achieved warrant the large costs incurred. This paper sets out a framework for quantifying the degree to which risk is reduced as mass is removed from shallow, saturated, low-permeability, dual-porosity, DNAPL source zones. Risk is defined in terms of meeting an alternate concentration level (ACL) at a compliance well in an aquifer underlying the source zone. The ACL is back-calculated from a carcinogenic health-risk characterization at a downstream water-supply well. Source-zone mass-removal efficiencies are heavily dependent on the distribution of mass between media (fractures, matrix) and phases (dissolved, sorbed, free product). Due to the uncertainties in currently available technology performance data, the scope of the paper is limited to developing a framework for generic technologies rather than making risk-reduction calculations for specific technologies. Despite the qualitative nature of the exercise, results imply that very high mass-removal efficiencies are required to achieve significant long-term risk reduction with technology applications of finite duration. 17 refs., 7 figs., 6 tabs.
NASA Astrophysics Data System (ADS)
Park, Junghyun; Hayward, Chris; Stump, Brian W.
2018-06-01
Ground truth sources in Utah during 2003-2013 are used to assess the contribution of temporal atmospheric conditions to infrasound detection and the predictive capabilities of atmospheric models. Ground truth sources consist of 28 long duration static rocket motor burn tests and 28 impulsive rocket body demolitions. Automated infrasound detections from a hybrid of regional seismometers and infrasound arrays use a combination of short-term time average/long-term time average ratios and spectral analyses. These detections are grouped into station triads using a Delaunay triangulation network and then associated to estimate phase velocity and azimuth to filter signals associated with a particular source location. The resulting range and azimuth distribution from sources to detecting stations varies seasonally and is consistent with predictions based on seasonal atmospheric models. Impulsive signals from rocket body detonations are observed at greater distances (>700 km) than the extended duration signals generated by the rocket burn test (up to 600 km). Infrasound energy attenuation associated with the two source types is quantified as a function of range and azimuth from infrasound amplitude measurements. Ray-tracing results using Ground-to-Space atmospheric specifications are compared to these observations and illustrate the degree to which the time variations in characteristics of the observations can be predicted over a multiple year time period.
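The automated detector rests on the classic short-term-average/long-term-average (STA/LTA) ratio; a bare-bones version on a single trace is sketched below, with window lengths and the detection threshold chosen arbitrarily (the study further combines this with spectral analysis and array processing).

```python
import numpy as np

def sta_lta(trace, n_sta, n_lta):
    """STA/LTA ratio computed on the squared (energy) trace."""
    energy = np.asarray(trace, dtype=float) ** 2
    sta = np.convolve(energy, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(energy, np.ones(n_lta) / n_lta, mode="same")
    return sta / np.maximum(lta, 1e-12)

# Synthetic trace: noise with a transient; flag samples where the ratio is high.
rng = np.random.default_rng(0)
x = rng.normal(0, 1, 6000)
x[3000:3050] += 8.0
detections = np.flatnonzero(sta_lta(x, n_sta=50, n_lta=1000) > 4.0)
```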
Liu, Yayong; Xing, Jia; Wang, Shuxiao; Fu, Xiao; Zheng, Haotian
2018-08-01
Heavy metals are of concern for their adverse effects on human health and their long-term burden on biogeochemical cycling in the ecosystem. In this study, a provincial-level emission inventory of 13 heavy metals (V, Cr, Mn, Co, Ni, Cu, Zn, As, Cd, Sn, Sb, Ba and Pb) from 10 anthropogenic sources was developed for China, based on the 2015 national emission inventory of primary particulate matter and source category-specific speciation profiles collected from 50 previous studies measured in China. Uncertainties associated with the speciation profiles were also evaluated. Our results suggest that total emissions of the 13 heavy metals in China amounted to about 58,000 tons for the year 2015. Iron production is the dominant source, contributing 42% of total heavy metal emissions. The emissions of heavy metals vary significantly at the regional scale, with the largest emissions concentrated in northern and eastern China. In particular, high emissions of Cr, Co, Ni, As and Sb (contributing 8%-18% of the national emissions) are found in Shandong, which has a large capacity of industrial production. Uncertainty analysis suggested that the implementation of province-specific source profiles in this study significantly reduced the emission uncertainties from (-89%, 289%) to (-99%, 91%), particularly for coal combustion. However, source profiles for industry sectors such as non-metallic mineral manufacturing are quite limited, resulting in a relatively high uncertainty. High-resolution emission inventories of heavy metals are essential not only for studies of their distribution, deposition and transport, but also for the design of policies to redress critical atmospheric environmental hazards at local and regional scales. Detailed investigation of source-specific profiles in China is still needed to achieve more accurate estimations of heavy metals in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.
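The core bookkeeping, primary-PM emissions multiplied by source-specific metal mass fractions, can be illustrated in a few lines; all numbers below are placeholders, not values from the inventory.

```python
# Primary PM emissions by source (t/yr) and metal mass fractions of that PM;
# placeholder values for illustration only.
pm_emissions = {"iron_production": 1.2e6, "coal_combustion": 3.4e6}
metal_fraction = {
    "iron_production": {"Pb": 1.5e-3, "Zn": 4.0e-3},
    "coal_combustion": {"Pb": 0.8e-3, "Zn": 1.1e-3},
}

heavy_metal_emissions = {
    metal: sum(pm_emissions[src] * metal_fraction[src].get(metal, 0.0)
               for src in pm_emissions)
    for metal in ("Pb", "Zn")
}
print(heavy_metal_emissions)  # t/yr of each metal, summed over sources
```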
Upper and lower bounds of ground-motion variabilities: implication for source properties
NASA Astrophysics Data System (ADS)
Cotton, Fabrice; Reddy-Kotha, Sreeram; Bora, Sanjay; Bindi, Dino
2017-04-01
One of the key challenges of seismology is to be able to analyse the physical factors that control earthquakes and ground-motion variabilities. Such analysis is particularly important to calibrate physics-based simulations and seismic hazard estimations at high frequencies. Within the framework of ground-motion prediction equation (GMPE) development, ground-motion residuals (differences between recorded ground motions and the values predicted by a GMPE) are computed. The exponential growth of seismological near-source records and modern GMPE analysis techniques allow these residuals to be partitioned into between-event and within-event components. In particular, the between-event term quantifies all those repeatable source effects (e.g. related to stress-drop or kappa-source variability) that have not been accounted for by the magnitude-dependent term of the model. In this presentation, we first discuss the between-event variabilities computed both in the Fourier and response-spectra domains, using recent high-quality global accelerometric datasets (e.g. NGA-West2, RESORCE, KiK-net). These analyses lead to the assessment of upper bounds for the ground-motion variability. Then, we compare these upper bounds with lower bounds estimated by analysing seismic sequences which occurred on specific fault systems (e.g., located in Central Italy or in Japan). We show that the lower bounds of between-event variabilities are surprisingly large, which indicates a large variability of earthquake dynamic properties even within the same fault system. Finally, these upper and lower bounds of ground-shaking variability are discussed in terms of variability of earthquake physical properties (e.g., stress-drop and kappa-source).
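The residual partition described here is commonly implemented as a one-way random-effects (variance-components) fit; a minimal sketch with statsmodels is shown below, where the data frame columns ('resid', 'event') are assumed names and the synthetic residuals stand in for a real GMPE residual set.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic residuals: one random between-event term per event plus
# within-event scatter (placeholder data only).
rng = np.random.default_rng(0)
events = np.repeat(np.arange(40), 25)
resid = rng.normal(0, 0.3, 40)[events] + rng.normal(0, 0.5, events.size)
df = pd.DataFrame({"resid": resid, "event": events})

fit = smf.mixedlm("resid ~ 1", df, groups=df["event"]).fit()
tau = float(np.sqrt(fit.cov_re.iloc[0, 0]))   # between-event std. dev.
phi = float(np.sqrt(fit.scale))               # within-event std. dev.
print(round(tau, 2), round(phi, 2))
```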
Klausner, Z; Klement, E; Fattal, E
2018-02-01
Viruses that affect the health of humans and farm animals can spread over long distances via atmospheric mechanisms. The phenomenon of atmospheric long-distance dispersal (LDD) is associated with severe consequences because it may introduce pathogens into new areas. The introduction of new pathogens to Israel was attributed to LDD events numerous times. This provided the motivation for this study, which aimed to identify all the locations in the eastern Mediterranean that may serve as sources for pathogen incursion into Israel via LDD. This aim was achieved by calculating source-receptor relationship probability maps. These maps describe the probability that an infected vector or viral aerosol, once airborne, will have an atmospheric route that can transport it to a distant location. The resultant probability maps demonstrate a seasonal tendency in the probability of specific areas serving as sources for pathogen LDD into Israel. Specifically, Cyprus' season is the summer; southern Turkey and the Greek islands of Crete, Karpathos and Rhodes are associated with spring and summer; lower Egypt and Jordan may serve as sources all year round, except the summer months. The method used in this study can easily be implemented for any other geographic region. The importance of this study is its ability to provide a climatologically valid and accurate risk assessment tool to support long-term decisions regarding preparatory actions for future outbreaks long before a specific outbreak occurs. © 2017 Blackwell Verlag GmbH.
Deterministic Impulsive Vacuum Foundations for Quantum-Mechanical Wavefunctions
NASA Astrophysics Data System (ADS)
Valentine, John S.
2013-09-01
By assuming that a fermion de-constitutes immediately at source, that its constituents, as bosons, propagate uniformly as scalar vacuum terms with phase (radial) symmetry, and that fermions are unique solutions for specific phase conditions, we find a model that self-quantizes matter from continuous waves, unifying bosons and fermion ontologies in a single basis, in a constitution-invariant process. Vacuum energy has a wavefunction context, as a mass-energy term that enables wave collapse and increases its amplitude, with gravitational field as the gradient of the flux density. Gravitational and charge-based force effects emerge as statistics without special treatment. Confinement, entanglement, vacuum statistics, forces, and wavefunction terms emerge from the model's deterministic foundations.
Du, Xu; Hao, Jian
2018-04-01
Specific emotions, especially guilt, are considered to facilitate children's prosocial behavior. The current study differentiated moral stories with a helping theme in terms of the valence and source of emotions and aimed to clarify the effect of these stories on preschoolers' helping intentions and behavior. A total of 322 preschoolers between 4 and 6 years old were randomly assigned to four experimental groups and one control group. A specific type of moral story was presented to each of the experimental groups, whereas a nonmoral story was presented to the control group. The preschoolers were also asked to answer relevant questions to examine their story comprehension. The preschoolers' donating intentions and behavior were then measured. The results showed that all the experimental groups expressed more donating intentions than the control group. However, only the group that read the moral story emphasizing the actor's negative emotions toward his nonhelping behavior displayed more donating behavior than the control group. Therefore, the current study reveals that various moral stories dealing with a helping theme can facilitate helping intentions among preschoolers and that only certain stories can promote their helping behavior. Thus, it indicates the specificity of moral stories that facilitate prosocial behavior in terms of the valence and source of emotions in those stories. Copyright © 2017 Elsevier Inc. All rights reserved.
Lisowska-Myjak, B; Skarżyńska, E; Bakun, M
2018-06-01
Intrauterine environmental factors can be associated with perinatal complications and long-term health outcomes although the underlying mechanisms remain poorly defined. Meconium formed exclusively in utero and passed naturally by a neonate may contain proteins which characterise the intrauterine environment. The aim of the study was proteomic analysis of the composition of meconium proteins and their classification by biological function. Proteomic techniques combining isoelectrofocussing fractionation and LC-MS/MS analysis were used to study the protein composition of a meconium sample obtained by pooling 50 serial meconium portions from 10 healthy full-term neonates. The proteins were classified by function based on the literature search for each protein in the PubMed database. A total of 946 proteins were identified in the meconium, including 430 proteins represented by two or more peptides. When the proteins were classified by their biological function the following were identified: immunoglobulin fragments and enzymatic, neutrophil-derived, structural and fetal intestine-specific proteins. Meconium is a rich source of proteins deposited in the fetal intestine during its development in utero. A better understanding of their specific biological functions in the intrauterine environment may help to identify these proteins which may serve as biomarkers associated with specific clinical conditions/diseases with the possible impact on the fetal development and further health consequences in infants, older children and adults.
Potential for a Near Term Very Low Energy Antiproton Source at Brookhaven National Laboratory.
1989-04-01
[Table of contents excerpt: Table III-1, Cost Summary; Sec. IV, Lattice and Stretcher Properties; Fig. IV-1, Cell lattice functions; Fig. IV-2, Insertion region lattice; Fig. IV-3, Superperiod lattice functions; Table IV-1b, Parameters after lattice matching; Table IV-1c, Components specification; Table IV-2, Random multipoles.]
NASA Technical Reports Server (NTRS)
Webb, P.
1973-01-01
Human energy is discussed in terms of the whole man. The physical work a man does, the heat he produces, and the quantity of oxygen he takes from the air to combine with food, the fuel source of his energy, are described. The daily energy exchange, work and heat dissipation, oxygen costs of specific activities, anaerobic work, and working in space suits are summarized.
The legal system. Part 1: it's not just for lawyers.
Boylan-Kemp, Jo
This article is the first of two providing an introduction to the foundational elements of the English legal system. The 'English legal system' is a rather generic term that is often used to refer to the different sources of law and the court system in which the law is practiced. Students of law will study the English legal system as a specific topic, but it is equally important for those who work within a profession that is regulated by the law (as nursing is) to develop an understanding of the legal boundaries within which such a profession works. Part one, therefore, will consider the matters that form the cornerstone of our legal system, such as the constitution, and it will also explain the specific legal terms and doctrines that influence how our law is made and developed. Part two will then go on to consider the different sources of law that can be found within the English legal system. The aim of these articles is to describe these principles in a way that makes them easily understandable by those who are not involved with practicing law but who instead work within other disciplines, such as nursing.
Hütter, Markus; Brader, Joseph M
2009-06-07
We examine the origins of nonlocality in a nonisothermal hydrodynamic formulation of a one-component fluid of particles that exhibit long-range correlations, e.g., due to a spherically symmetric, long-range interaction potential. In order to furnish the continuum modeling with physical understanding of the microscopic interactions and dynamics, we make use of systematic coarse graining from the microscopic to the continuum level. We thus arrive at a thermodynamically admissible and closed set of evolution equations for the densities of momentum, mass, and internal energy. From the consideration of an illustrative special case, the following main conclusions emerge. There are two different source terms in the momentum balance. The first is a body force, which in special circumstances can be related to the functional derivative of a nonlocal Helmholtz free energy density with respect to the mass density. The second source term is proportional to the temperature gradient, multiplied by the nonlocal entropy density. These two source terms combine into a pressure gradient only in the absence of long-range effects. In the irreversible contributions to the time evolution, the nonlocal contributions arise since the self-correlations of the stress tensor and heat flux, respectively, are nonlocal as a result of the microscopic nonlocal correlations. Finally, we point out specific points that warrant further discussions.
NASA Astrophysics Data System (ADS)
Lin, Wei-Chih; Lin, Yu-Pin; Anthony, Johnathen
2015-04-01
Heavy metal pollution has adverse effects not only on the focal invertebrate species of this study, such as reduction in pupa weight and increased larval mortality, but also on the higher trophic level organisms which feed on them, either directly or indirectly, through the process of biomagnification. Despite this, few studies regarding remediation prioritization take species distribution or biological conservation priorities into consideration. This study develops a novel approach for delineating sites which are both contaminated by any of 5 readily bioaccumulated heavy metal soil contaminants and are of high ecological importance for the highly mobile, low trophic level focal species. The conservation priority of each site was based on the projected distributions of 6 moth species simulated via the presence-only maximum entropy species distribution model, followed by the subsequent application of a systematic conservation tool. In order to increase the number of available samples, we also integrated crowd-sourced data with professionally collected data via a novel optimization procedure based on a simulated annealing algorithm. This integration procedure was important since, while crowd-sourced data can drastically increase the number of data samples available to ecologists, the quality or reliability of crowd-sourced data can be called into question, adding yet another source of uncertainty in projecting species distributions. The optimization method screens crowd-sourced data in terms of the environmental variables which correspond to professionally collected data. The sample distribution data were derived from two different sources: the EnjoyMoths project in Taiwan (crowd-sourced data) and the Global Biodiversity Information Facility (GBIF) field data (professional data). The distributions of heavy metal concentrations were generated via 1000 iterations of a geostatistical co-simulation approach. The uncertainties in the distributions of the heavy metals were then quantified based on the overall consistency between realizations. Finally, Information-Gap Decision Theory (IGDT) was applied to rank the remediation priorities of contaminated sites in terms of both the spatial consensus of multiple heavy metal realizations and the priority of specific conservation areas. Our results show that the crowd-sourced optimization algorithm developed in this study is effective at selecting suitable records from the crowd-sourced data. By using this technique, the available sample data increased to totals of 96, 162, 72, 62, 69 and 62, that is, 2.6, 1.6, 2.5, 1.6, 1.2 and 1.8 times the numbers originally available through the GBIF professionally assembled database. Additionally, for all species considered, the performance of models based on the combination of both data sources, in terms of test-AUC values, exceeded that of models based on a single data source. Furthermore, the additional optimization-selected data lowered the overall variability, and therefore uncertainty, of model outputs. Based on the projected species distributions, our results revealed that around 30% of high species hotspot areas were also identified as contaminated. The decision-making tool, IGDT, successfully yielded remediation plans in terms of specific ecological value requirements, false positive tolerance rates of contaminated areas, and expected decision robustness.
The proposed approach can be applied both to identify high conservation priority sites contaminated by heavy metals, based on the combination of screened crowd-sourced and professionally-collected data, and in making robust remediation decisions.
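The screening step is described only at a high level, so the sketch below is a generic simulated-annealing subset selection that keeps the crowd-sourced points whose environmental covariates best match the professional data; the objective (mean nearest-neighbour distance in covariate space), cooling schedule, and data shapes are all assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def screen_crowdsourced(crowd, professional, keep, n_iter=5000, t0=1.0, seed=0):
    """Select `keep` crowd-sourced rows whose environmental covariates best
    match the professional records (generic simulated-annealing sketch)."""
    rng = np.random.default_rng(seed)

    def cost(idx):
        # Mean distance from each selected crowd point to its nearest
        # professional point in (standardized) covariate space.
        d = np.linalg.norm(crowd[idx][:, None, :] - professional[None, :, :], axis=2)
        return d.min(axis=1).mean()

    current = rng.choice(len(crowd), size=keep, replace=False)
    best, best_cost = current.copy(), cost(current)
    for i in range(n_iter):
        temp = t0 * (1.0 - i / n_iter) + 1e-6
        candidate = current.copy()
        out = rng.integers(keep)                 # swap one selected point
        pool = np.setdiff1d(np.arange(len(crowd)), candidate)
        candidate[out] = rng.choice(pool)
        delta = cost(candidate) - cost(current)
        if delta < 0 or rng.random() < np.exp(-delta / temp):
            current = candidate
            if cost(current) < best_cost:
                best, best_cost = current.copy(), cost(current)
    return best

# Placeholder covariate matrices (rows = occurrence records, cols = variables).
crowd_env = np.random.default_rng(1).normal(size=(200, 6))
prof_env = np.random.default_rng(2).normal(size=(60, 6))
selected = screen_crowdsourced(crowd_env, prof_env, keep=50, n_iter=2000)
```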
Wright, Chris; Heneghan, Nicola; Eveleigh, Gillian; Calvert, Melanie; Freemantle, Nick
2011-01-01
Objective To evaluate effectiveness of physiotherapy management in patients experiencing whiplash associated disorder II, on clinically relevant outcomes in the short and longer term. Design Systematic review and meta-analysis. Two reviewers independently searched information sources, assessed studies for inclusion, evaluated risk of bias and extracted data. A third reviewer mediated disagreement. Assessment of risk of bias was tabulated across included trials. Quantitative synthesis was conducted on comparable outcomes across trials with similar interventions. Meta-analyses compared effect sizes, with random effects as primary analyses. Data sources Predefined terms were employed to search electronic databases. Additional studies were identified from key journals, reference lists, authors and experts. Eligibility criteria for selecting studies Randomised controlled trials (RCTs) published in English before 31 December 2010 evaluating physiotherapy management of patients (>16 years), experiencing whiplash associated disorder II. Any physiotherapy intervention was included, when compared with other types of management, placebo/sham, or no intervention. Measurements reported on ≥1 outcome from the domains within the international classification of function, disability and health, were included. Results 21 RCTs (2126 participants, 9 countries) were included. Interventions were categorised as active physiotherapy or a specific physiotherapy intervention. 20/21 trials were evaluated as high risk of bias and one as unclear. 1395 participants were incorporated in the meta-analyses on 12 trials. In evaluating short term outcome in the acute/sub-acute stage, there was some evidence that active physiotherapy intervention reduces pain and improves range of movement, and that a specific physiotherapy intervention may reduce pain. However, moderate/considerable heterogeneity suggested that treatments may differ in nature or effect in different trial patients. Differences between participants, interventions and trial designs limited potential meta-analyses. Conclusions Inconclusive evidence exists for the effectiveness of physiotherapy management for whiplash associated disorder II. There is potential benefit for improving range of movement and pain short term through active physiotherapy, and for improving pain through a specific physiotherapy intervention. PMID:22102642
Biofuels as an Alternative Energy Source for Aviation-A Survey
NASA Technical Reports Server (NTRS)
McDowellBomani, Bilal M.; Bulzan, Dan L.; Centeno-Gomez, Diana I.; Hendricks, Robert C.
2009-01-01
The use of biofuels has been gaining in popularity over the past few years because of their ability to reduce the dependence on fossil fuels. As a renewable energy source, biofuels can be a viable option for sustaining long-term energy needs if they are managed efficiently. We investigate past, present, and possible future biofuel alternatives currently being researched and applied around the world. More specifically, we investigate the use of ethanol, cellulosic ethanol, biodiesel (palm oil, algae, and halophytes), and synthetic fuel blends that can potentially be used as fuels for aviation and nonaerospace applications. We also investigate the processing of biomass via gasification, hydrolysis, and anaerobic digestion as a way to extract fuel oil from alternative biofuels sources.
Volume Averaging Study of the Capacitive Deionization Process in Homogeneous Porous Media
Gabitto, Jorge; Tsouris, Costas
2015-05-05
Ion storage in porous electrodes is important in applications such as energy storage by supercapacitors, water purification by capacitive deionization, extraction of energy from a salinity difference and heavy ion purification. In this paper, a model is presented to simulate the charge process in homogeneous porous media comprising big pores. It is based on a theory for capacitive charging by ideally polarizable porous electrodes without faradaic reactions or specific adsorption of ions. A volume averaging technique is used to derive the averaged transport equations in the limit of thin electrical double layers. Transport between the electrolyte solution and the charged wall is described using the Gouy–Chapman–Stern model. The effective transport parameters for isotropic porous media are calculated solving the corresponding closure problems. Finally, the source terms that appear in the average equations are calculated using numerical computations. An alternative way to deal with the source terms is proposed.
The Plant Research Unit: Long-Term Plant Growth Support for Space Station
NASA Technical Reports Server (NTRS)
Heathcote, D. G.; Brown, C. S.; Goins, G. D.; Kliss, M.; Levine, H.; Lomax, P. A.; Porter, R. L.; Wheeler, R.
1996-01-01
The specifications of the Plant Research Unit (PRU) plant habitat, designed for space station operations, are presented. A prototype brassboard model of the PRU is described, and the results of the subsystem tests are outlined. The effects of long-term red light-emitting diode (LED) illumination as the sole light source for plant development were compared with those of red LEDs supplemented with blue wavelengths and with white fluorescent sources. It was found that wheat and Arabidopsis were able to complete a life cycle under red LEDs alone, but with differences in physiology and morphology. The differences noted were greatest for Arabidopsis, where the time to flowering was increased under red illumination. The addition of 10 percent blue light was effective in eliminating the observed differences. The results of the comparative testing of three nutrient delivery systems for the PRU are discussed.
Environmental/chemical thesaurus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shriner, C.R.; Dailey, N.S.; Jordan, A.C.
The Environmental/Chemical Thesaurus approaches scientific language control problems from a multidisciplinary view. The Environmental/Biomedical Terminology Index (EBTI) was used as a base for the present thesaurus. The Environmental/Chemical Thesaurus, funded by the Environmental Protection Agency, used as its source of new terms those major terms found in 13 Environmental Protection Agency data bases. The scope of this thesaurus includes not only environmental and biomedical sciences, but also the physical sciences with emphasis placed on chemistry. Specific chemical compounds are not included; only classes of chemicals are given. To adhere to this level of classification, drugs and pesticides are identified by class rather than by specific chemical name. An attempt was also made to expand the areas of sociology and economics. Terminology dealing with law, demography, and geography was expanded. Proper names of languages and races were excluded. Geographic terms were expanded to include proper names for oceans, continents, major lakes, rivers, and islands. Political divisions were added to allow for proper names of countries and states. With such a broad scope, terminology for specific sciences does not provide for indexing to the lowest levels in plant, animal, or chemical classifications.
Long-term consequences of pain in human neonates.
Grunau, Ruth E; Holsti, Liisa; Peters, Jeroen W B
2006-08-01
The low tactile threshold in preterm infants when they are in the neonatal intensive care unit (NICU), while their physiological systems are unstable and immature, potentially renders them more vulnerable to the effects of repeated invasive procedures. There is a small but growing literature on pain and tactile responsivity following procedural pain in the NICU, or early surgery. Long-term effects of repeated pain in the neonatal period on neurodevelopment await further research. However, there are multiple sources of stress in the NICU, which contribute to inducing high overall 'allostatic load', therefore determining specific effects of neonatal pain in human infants is challenging.
Accuracy-preserving source term quadrature for third-order edge-based discretization
NASA Astrophysics Data System (ADS)
Nishikawa, Hiroaki; Liu, Yi
2017-09-01
In this paper, we derive a family of source term quadrature formulas for preserving third-order accuracy of the node-centered edge-based discretization for conservation laws with source terms on arbitrary simplex grids. A three-parameter family of source term quadrature formulas is derived, and as a subset, a one-parameter family of economical formulas is identified that does not require second derivatives of the source term. Among the economical formulas, a unique formula is then derived that does not require gradients of the source term at neighbor nodes, thus leading to a significantly smaller discretization stencil for source terms. All the formulas derived in this paper do not require a boundary closure, and therefore can be directly applied at boundary nodes. Numerical results are presented to demonstrate third-order accuracy at interior and boundary nodes for one-dimensional grids and linear triangular/tetrahedral grids over straight and curved geometries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul L. Wichlacz
2003-09-01
This source-term summary document is intended to describe the current understanding of contaminant source terms and the conceptual model for potential source-term release to the environment at the Idaho National Engineering and Environmental Laboratory (INEEL), as presented in published INEEL reports. The document presents a generalized conceptual model of the sources of contamination and describes the general categories of source terms, primary waste forms, and factors that affect the release of contaminants from the waste form into the vadose zone and Snake River Plain Aquifer. Where the information has previously been published and is readily available, summaries of the inventory of contaminants are also included. Uncertainties that affect the estimation of the source term release are also discussed where they have been identified by the Source Term Technical Advisory Group. Areas in which additional information is needed (i.e., research needs) are also identified.
Revising the Lubben Social Network Scale for use in residential long-term care settings.
Munn, Jean; Radey, Melissa; Brown, Kristin; Kim, Hyejin
2018-04-19
We revised the Lubben Social Network Scale (LSNS) to develop a measure of social support specific to residential long-term care (LTC) settings, the LSNS-LTC with five domains (i.e., family, friends, residents, volunteers, and staff). The authors modified the LSNS-18 to capture sources of social support specific to LTC, specifically relationships with residents, volunteers, and staff. We piloted the resultant 28-item measure with 64 LTC residents. Fifty-four respondents provided adequate information for analyses that included descriptive statistics and reliability coefficients. Twenty of the items performed well (had correlations >0.3, overall α = 0.85) and were retained. Three items required modification. The five items related to volunteers were eliminated due to extensive (>15%) missing data resulting in a proposed 23-item measure. We identified, and to some degree quantified, supportive relationships within the LTC environment, while developing a self-report tool to measure social support in these settings.
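The item-retention decision rests on item-total correlations and an overall reliability coefficient; a minimal Cronbach's alpha calculation is sketched below, with a placeholder item-response matrix standing in for the 64 residents' data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_var / total_var)

# Placeholder Likert-style responses for 64 residents on 20 retained items.
rng = np.random.default_rng(0)
responses = rng.integers(0, 6, size=(64, 20))
print(round(cronbach_alpha(responses), 2))
```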
Analyzing the generality of conflict adaptation effects.
Funes, Maria Jesús; Lupiáñez, Juan; Humphreys, Glyn
2010-02-01
Conflict adaptation effects refer to the reduction of interference when an incongruent stimulus occurs immediately after an incongruent trial, compared with when it occurs after a congruent trial. The present study analyzes the key conditions that lead to adaptation effects that are specific to the type of conflict involved versus those that are conflict-general. In the first 2 experiments, we combined 2 types of conflict for which compatibility arises from clearly different sources in terms of dimensional overlap, while keeping the task context constant across conflict types. We found a clear pattern of specificity of conflict adaptation across conflict types. In subsequent experiments, we tested whether this pattern could be accounted for in terms of feature integration processes contributing differently to repetitions versus alternations of conflict types. The results clearly indicated that feature integration was not key to generating conflict-type specificity in conflict adaptation. The data are consistent with there being separate modes of control for different types of cognitive conflict.
Zhou, Peiyu; Chen, Changshu; Ye, Jianjun; Shen, Wenjie; Xiong, Xiaofei; Hu, Ping; Fang, Hongda; Huang, Chuguang; Sun, Yongge
2015-04-15
Oil fingerprints have been a powerful tool widely used for determining the source of spilled oil. In most cases, this tool works well. However, it is usually difficult to identify the source if an oil spill occurs during offshore petroleum exploration, owing to the highly similar physiochemical characteristics of suspected oils from the same drilling platform. In this report, a case study from the waters of the South China Sea is presented, and multidimensional scaling analysis (MDS) is introduced to demonstrate how oil fingerprints can be combined with mathematical methods to identify the source of spilled oil among highly similar suspected sources. The results suggest that MDS calculation based on oil fingerprints, subsequently integrated with specific biomarkers in the spilled oils, is the most effective method, with great potential for source identification among highly similar suspected oils. Copyright © 2015 Elsevier Ltd. All rights reserved.
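A hedged sketch of the MDS step: embed a precomputed dissimilarity matrix of diagnostic-ratio fingerprints into two dimensions and see which suspected source sits closest to the spill sample. The fingerprint ratios and sample names are invented for illustration, not measured data.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

# Rows: spill sample plus three suspected oils; columns: diagnostic biomarker
# ratios (placeholder values, not the measured fingerprints).
names = ["spill", "well_A", "well_B", "well_C"]
ratios = np.array([
    [0.52, 1.10, 0.33],
    [0.50, 1.12, 0.35],
    [0.61, 0.95, 0.40],
    [0.44, 1.30, 0.28],
])

dissimilarity = squareform(pdist(ratios, metric="euclidean"))
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissimilarity)

closest = min(range(1, len(names)),
              key=lambda i: np.linalg.norm(coords[0] - coords[i]))
print("Closest suspected source:", names[closest])
```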
Endophytic Fungi—Alternative Sources of Cytotoxic Compounds: A Review
Uzma, Fazilath; Mohan, Chakrabhavi D.; Hashem, Abeer; Konappa, Narasimha M.; Rangappa, Shobith; Kamath, Praveen V.; Singh, Bhim P.; Mudili, Venkataramana; Gupta, Vijai K.; Siddaiah, Chandra N.; Chowdappa, Srinivas; Alqarawi, Abdulaziz A.; Abd_Allah, Elsayed F.
2018-01-01
Cancer is a major cause of death worldwide, with an increasing number of cases being reported annually. The elevated rate of mortality necessitates a global challenge to explore newer sources of anticancer drugs. Recent advancements in cancer treatment involve the discovery and development of new and improved chemotherapeutics derived from natural or synthetic sources. Natural sources offer the potential of finding new structural classes with unique bioactivities for cancer therapy. Endophytic fungi represent a rich source of bioactive metabolites that can be manipulated to produce desirable novel analogs for chemotherapy. This review offers a current and integrative account of clinically used anticancer drugs such as taxol, podophyllotoxin, camptothecin, and vinca alkaloids in terms of their mechanism of action, isolation from endophytic fungi and their characterization, yield obtained, and fungal strain improvement strategies. It also covers recent literature on endophytic fungal metabolites from terrestrial, mangrove, and marine sources as potential anticancer agents and emphasizes the findings for cytotoxic bioactive compounds tested against specific cancer cell lines. PMID:29755344
Sensitivity of WRF-chem predictions to dust source function specification in West Asia
NASA Astrophysics Data System (ADS)
Nabavi, Seyed Omid; Haimberger, Leopold; Samimi, Cyrus
2017-02-01
Dust storms tend to form in sparsely populated areas covered by only a few observations. Dust source maps, known as source functions, are used in dust models to allocate a certain potential of dust release to each place. Recent research showed that the well-known Ginoux source function (GSF), currently used in the Weather Research and Forecasting Model coupled with Chemistry (WRF-chem), exhibits large errors over some regions in West Asia, particularly near the Iraq/Syria border. This study aims to improve the specification of this critical part of dust forecasts. A new source function based on multi-year analysis of satellite observations, called the West Asia source function (WASF), is therefore proposed to raise the quality of WRF-chem predictions in the region. WASF has been implemented in three dust schemes of WRF-chem. Remotely sensed and ground-based observations have been used to verify the horizontal and vertical extent and location of simulated dust clouds. Results indicate that WRF-chem performance is significantly improved in many areas after the implementation of WASF. The modified runs (long-term simulations over the summers 2008-2012, using nudging) have yielded an average increase of the Spearman correlation between observed and forecast aerosol optical thickness of 12-16 percentage points compared to control runs with standard source functions. They even outperform MACC and DREAM dust simulations over many dust source regions. However, the quality of the forecasts decreased with distance from sources, probably due to deficiencies in the transport and deposition characteristics of the forecast model in these areas.
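The verification statistic quoted above can be reproduced in outline as follows; the sketch assumes hypothetical observed and forecast aerosol optical thickness series and simply computes the Spearman correlations and their difference in percentage points (it is not the study's evaluation code).

```python
# Sketch of the verification statistic described above: Spearman rank correlation
# between observed and forecast aerosol optical thickness (AOT), and the
# percentage-point gain of a modified run over a control run.
# The AOT arrays are hypothetical placeholders, not model output.
import numpy as np
from scipy.stats import spearmanr

aot_obs     = np.array([0.21, 0.35, 0.80, 0.55, 0.30, 1.10])
aot_control = np.array([0.30, 0.28, 0.60, 0.70, 0.25, 0.80])  # standard source function
aot_wasf    = np.array([0.25, 0.33, 0.75, 0.60, 0.28, 1.00])  # modified source function

rho_control, _ = spearmanr(aot_obs, aot_control)
rho_wasf, _    = spearmanr(aot_obs, aot_wasf)
print(f"gain: {100 * (rho_wasf - rho_control):.1f} percentage points")
```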
Nnane, Daniel Ekane
2011-11-15
Contamination of surface waters is a pervasive threat to human health, hence the need to better understand the sources and spatio-temporal variations of contaminants within river catchments. River catchment managers are required to sustainably monitor and manage the quality of surface waters, and therefore need cost-effective, long-term sustainable water quality monitoring and management designs to proactively protect public health and aquatic ecosystems. Multivariate and phage-lysis techniques were used to investigate spatio-temporal variations of water quality, the main polluting chemophysical and microbial parameters, and faecal micro-organism sources, and to establish 'sentry' sampling sites in the Ouse River catchment, southeast England, UK. A total of 350 river water samples were analysed for fourteen chemophysical and microbial water quality parameters in conjunction with the novel human-specific phages of Bacteroides GB-124. Annual, autumn, spring, summer, and winter principal components (PCs) explained approximately 54%, 75%, 62%, 48%, and 60%, respectively, of the total variance present in the datasets. Significant loadings of Escherichia coli, intestinal enterococci, turbidity, and human-specific Bacteroides GB-124 were observed in all datasets. Cluster analysis successfully grouped sampling sites into five clusters. Importantly, multivariate and phage-lysis techniques were useful in determining the sources and spatial extent of water contamination in the catchment. Though human faecal contamination was significant during dry periods, the main source of contamination was non-human. Bacteroides GB-124 could potentially be used for routine catchment microbial water quality monitoring. For a cost-effective, long-term sustainable water quality monitoring design, E. coli or intestinal enterococci, turbidity, and Bacteroides GB-124 should be monitored all year round in this river catchment. Copyright © 2011 Elsevier B.V. All rights reserved.
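A minimal sketch of the multivariate workflow summarized above (principal components per dataset, then clustering of sites) is given below; the parameter matrix is randomly generated, and the preprocessing choices (log-transform, standardization) are assumptions rather than details taken from the study.

```python
# Minimal sketch of the multivariate workflow described above: standardize the
# water-quality parameters, run PCA to see how much variance the leading
# components explain, and cluster sampling sites. Values are hypothetical.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# rows = water samples, columns = parameters (E. coli, enterococci, turbidity, GB-124, ...)
X = rng.lognormal(mean=1.0, sigma=0.5, size=(350, 14))

Z = StandardScaler().fit_transform(np.log10(X))
pca = PCA(n_components=5).fit(Z)
print("variance explained by first 5 PCs:",
      round(100 * pca.explained_variance_ratio_.sum(), 1), "%")

# Group samples into five clusters, echoing the five site clusters in the study
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(Z)
```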
Gesser-Edelsburg, Anat; Shalayeva, Svetlana
2017-02-03
The Internet is considered to be an effective source of health information and consultation for immigrants. Nutritional interventions for immigrants have become increasingly common over the past few decades. However, each population of immigrants has specific needs. Understanding the factors influencing the success of nutrition programs among immigrants requires an examination of their attitudes and perceptions, as well as their cultural values. The purpose of this study was to examine perceptions of the Internet as a tool for long-term and "real-time" professional, psychological, and nutritional treatment for immigrants from the former Soviet Union who immigrated to Israel (IIFSU) from 1990 to 2012. A sample of nutrition forum users (n=18) was interviewed and comments of 80 users were analyzed qualitatively in accordance with the grounded theory principles. The results show that IIFSU perceive the Internet as a platform for long-term and "real-time" dietary treatment and not just as an informative tool. IIFSU report benefits of online psychological support with professional dietary treatment. They attribute importance to cultural customization, which helps reduce barriers to intervention. In light of the results, when formulating nutritional programs, it is essential to have a specific understanding of immigrants' cultural characteristics and their patterns of Internet use concerning dietary care. ©Anat Gesser-Edelsburg, Svetlana Shalayeva. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 03.02.2017.
Sources of Uncertainty and the Interpretation of Short-Term Fluctuations
NASA Astrophysics Data System (ADS)
Lewandowsky, S.; Risbey, J.; Cowtan, K.; Rahmstorf, S.
2016-12-01
The alleged significant slowdown in global warming during the first decade of the 21st century, and the appearance of a discrepancy between models and observations, has attracted considerable research attention. We trace the history of this research and show how its conclusions were shaped by several sources of uncertainty and ambiguity about models and observations. We show that as those sources of uncertainty were gradually eliminated by further research, insufficient evidence remained to infer any discrepancy between models and observations or a significant slowing of warming. Specifically, we show that early research had to contend with uncertainties about coverage biases in the global temperature record and biases in the sea surface temperature observations which turned out to have exaggerated the extent of slowing. In addition, uncertainties in the observed forcings were found to have exaggerated the mismatch between models and observations. Further sources of uncertainty that were ultimately eliminated involved the use of incommensurate sea surface temperature data between models and observations and a tacit interpretation of model projections as predictions or forecasts. After all those sources of uncertainty were eliminated, the most recent research finds little evidence for an unusual slowdown or a discrepancy between models and observations. We discuss whether these different kinds of uncertainty could have been anticipated or managed differently, and how one can apply those lessons to future short-term fluctuations in warming.
Probabilistic Volcanic Hazard and Risk Assessment
NASA Astrophysics Data System (ADS)
Marzocchi, W.; Neri, A.; Newhall, C. G.; Papale, P.
2007-08-01
Quantifying Long- and Short-Term Volcanic Hazard: Building Up a Common Strategy for Italian Volcanoes, Erice Italy, 8 November 2006 The term "hazard" can lead to some misunderstanding. In English, hazard has the generic meaning "potential source of danger," but for more than 30 years [e.g., Fournier d'Albe, 1979], hazard has also been used in a more quantitative way, that reads, "the probability of a certain hazardous event in a specific time-space window." However, many volcanologists still use "hazard" and "volcanic hazard" in purely descriptive and subjective ways. A recent meeting held in November 2006 at Erice, Italy, entitled "Quantifying Long- and Short-Term Volcanic Hazard: Building up a Common Strategy for Italian Volcanoes" (http://www.bo.ingv.it/erice2006) concluded that a more suitable term for the estimation of quantitative hazard is "probabilistic volcanic hazard assessment" (PVHA).
Coding conventions and principles for a National Land-Change Modeling Framework
Donato, David I.
2017-07-14
This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.
Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P
2016-10-01
An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactors' safety assessments, and the estimates available at present are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms starting from a broad information gathering. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.
NASA Astrophysics Data System (ADS)
Zhao, Yang; Dai, Rui-Na; Xiao, Xiang; Zhang, Zong; Duan, Lian; Li, Zheng; Zhu, Chao-Zhe
2017-02-01
Two-person neuroscience, a perspective in understanding human social cognition and interaction, involves designing immersive social interaction experiments as well as simultaneously recording brain activity of two or more subjects, a process termed "hyperscanning." Using newly developed imaging techniques, the interbrain connectivity or hyperlink of various types of social interaction has been revealed. Functional near-infrared spectroscopy (fNIRS)-hyperscanning provides a more naturalistic environment for experimental paradigms of social interaction and has recently drawn much attention. However, most fNIRS-hyperscanning studies have computed hyperlinks using sensor data directly while ignoring the fact that the sensor-level signals contain confounding noises, which may lead to a loss of sensitivity and specificity in hyperlink analysis. In this study, on the basis of independent component analysis (ICA), a source-level analysis framework is proposed to investigate the hyperlinks in a fNIRS two-person neuroscience study. The performance of five widely used ICA algorithms in extracting sources of interaction was compared in simulative datasets, and increased sensitivity and specificity of hyperlink analysis by our proposed method were demonstrated in both simulative and real two-person experiments.
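The general idea of source-level hyperlink analysis can be sketched as follows; this is a generic ICA-plus-correlation illustration on simulated signals, not the authors' framework or their choice of ICA algorithm.

```python
# Sketch of a source-level hyperlink analysis in the spirit described above:
# decompose each subject's multichannel fNIRS signals with ICA, then compute
# inter-brain coupling between the extracted components. Signals are simulated.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
n_samples, n_channels, n_sources = 2000, 16, 4

shared = rng.standard_normal(n_samples)            # hypothetical interaction-related source
def subject_data():
    sources = rng.standard_normal((n_samples, n_sources))
    sources[:, 0] = shared + 0.3 * rng.standard_normal(n_samples)
    mixing = rng.standard_normal((n_sources, n_channels))
    return sources @ mixing + 0.1 * rng.standard_normal((n_samples, n_channels))

comp_a = FastICA(n_components=n_sources, random_state=0).fit_transform(subject_data())
comp_b = FastICA(n_components=n_sources, random_state=0).fit_transform(subject_data())

# Hyperlink matrix: correlations between subject A and subject B components
hyperlinks = np.corrcoef(comp_a.T, comp_b.T)[:n_sources, n_sources:]
print(np.round(np.abs(hyperlinks).max(), 3))
```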
Nikoloski, Zoran
2015-01-01
Plants as sessile organisms cannot escape their environment and have to adapt to any changes in the availability of sunlight and nutrients. The quantification of synthesis costs of metabolites, in terms of consumed energy, is a prerequisite to understand trade-offs arising from energetic limitations. Here, we examine the energy consumption of amino acid synthesis in Arabidopsis thaliana. To quantify these costs in terms of the energy equivalent ATP, we introduce an improved cost measure based on flux balance analysis and apply it to three state-of-the-art metabolic reconstructions to ensure robust results. We present the first systematic in silico analysis of the effect of nitrogen supply (nitrate/ammonium) on individual amino acid synthesis costs as well as of the effect of photoautotrophic and heterotrophic growth conditions, integrating day/night-specific regulation. Our results identify nitrogen supply as a key determinant of amino acid costs, in agreement with experimental evidence. In addition, the association of the determined costs with experimentally observed growth patterns suggests that metabolite synthesis costs are involved in shaping regulation of plant growth. Finally, we find that simultaneous uptake of both nitrogen sources can lead to efficient utilization of the energy source, which may be the result of evolutionary optimization. PMID:25706533
Arnold, Anne; Sajitz-Hermstein, Max; Nikoloski, Zoran
2015-01-01
Plants as sessile organisms cannot escape their environment and have to adapt to any changes in the availability of sunlight and nutrients. The quantification of synthesis costs of metabolites, in terms of consumed energy, is a prerequisite to understand trade-offs arising from energetic limitations. Here, we examine the energy consumption of amino acid synthesis in Arabidopsis thaliana. To quantify these costs in terms of the energy equivalent ATP, we introduce an improved cost measure based on flux balance analysis and apply it to three state-of-the-art metabolic reconstructions to ensure robust results. We present the first systematic in silico analysis of the effect of nitrogen supply (nitrate/ammonium) on individual amino acid synthesis costs as well as of the effect of photoautotrophic and heterotrophic growth conditions, integrating day/night-specific regulation. Our results identify nitrogen supply as a key determinant of amino acid costs, in agreement with experimental evidence. In addition, the association of the determined costs with experimentally observed growth patterns suggests that metabolite synthesis costs are involved in shaping regulation of plant growth. Finally, we find that simultaneous uptake of both nitrogen sources can lead to efficient utilization of the energy source, which may be the result of evolutionary optimization.
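The flavour of an ATP-based synthesis cost from flux balance analysis can be conveyed with a toy network; the four-reaction model below is an invented example solved by linear programming, not one of the Arabidopsis reconstructions or the authors' improved cost measure.

```python
# Toy flux-balance sketch of an ATP-based synthesis cost: the ATP-regenerating
# flux needed per unit of amino acid exported at steady state (S v = 0).
import numpy as np
from scipy.optimize import linprog

# Metabolites: ATP, ADP, C (precursor), AA (amino acid)
# Reactions:   R1 uptake (-> C), R2 ATP regeneration (ADP -> ATP),
#              R3 synthesis (C + 2 ATP -> AA + 2 ADP), R4 export (AA ->)
S = np.array([
    [0,  1, -2,  0],   # ATP
    [0, -1,  2,  0],   # ADP
    [1,  0, -1,  0],   # C
    [0,  0,  1, -1],   # AA
])

c = [0, 1, 0, 0]                              # minimize the ATP-regeneration flux R2
bounds = [(0, 10), (0, 10), (0, 10), (1, 1)]  # fix export flux R4 = 1
res = linprog(c, A_eq=S, b_eq=np.zeros(4), bounds=bounds, method="highs")
print("ATP cost per unit amino acid:", res.fun)   # -> 2.0 for this toy network
```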
Hynds, Paul D; Misstear, Bruce D; Gill, Laurence W
2013-09-30
While the safety of public drinking water supplies in the Republic of Ireland is governed and monitored at both local and national levels, there are currently no legislative tools in place relating to private supplies. It is therefore paramount that private well owners (and users) be aware of source specifications and potential contamination risks, to ensure adequate water quality. The objective of this study was to investigate the level of awareness among private well owners in the Republic of Ireland relating to source characterisation and groundwater contamination issues. This was undertaken through interviews with 245 private well owners. Statistical analysis indicates that respondents' source type significantly influences owner awareness, particularly regarding well construction and design parameters. Water treatment, source maintenance and regular water quality testing are considered the three primary "protective actions" (or "stewardship activities") against the consumption of contaminated groundwater and were reported as being absent in 64%, 72% and 40% of cases, respectively. Results indicate that the level of awareness exhibited by well users did not significantly affect the likelihood of their source being contaminated (source susceptibility); increased awareness on behalf of well users was associated with increased levels of protective action, particularly among borehole owners. Hence, lower levels of awareness may result in increased contraction of waterborne illnesses where contaminants have entered the well. Accordingly, focused educational strategies to increase awareness among private groundwater users are advocated in the short term; the development and introduction of formal legislation is recommended in the long term, including an integrated programme of well inspections and risk assessments. Copyright © 2013 Elsevier Ltd. All rights reserved.
Final design of thermal diagnostic system in SPIDER ion source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brombin, M., E-mail: matteo.brombin@igi.cnr.it; Dalla Palma, M.; Pasqualotto, R.
The prototype radio frequency source of the ITER heating neutral beams will be first tested in the SPIDER test facility to optimize H- production, cesium dynamics, and overall plasma characteristics. Several diagnostics will make it possible to fully characterise the beam in terms of uniformity and divergence, and the source, besides supporting safe and controlled operation. In particular, thermal measurements will be used for beam monitoring and system protection. SPIDER will be instrumented with mineral insulated cable thermocouples, on the grids, on other components of the beam source, and on the rear side of the beam dump water cooled elements. This paper deals with the final design and the technical specification of the thermal sensor diagnostic for SPIDER. In particular, the layout of the diagnostic, together with the sensor distribution in the different components, the cable routing, and the conditioning and acquisition cubicles are described.
Final design of thermal diagnostic system in SPIDER ion source
NASA Astrophysics Data System (ADS)
Brombin, M.; Dalla Palma, M.; Pasqualotto, R.; Pomaro, N.
2016-11-01
The prototype radio frequency source of the ITER heating neutral beams will be first tested in the SPIDER test facility to optimize H- production, cesium dynamics, and overall plasma characteristics. Several diagnostics will make it possible to fully characterise the beam in terms of uniformity and divergence, and the source, besides supporting safe and controlled operation. In particular, thermal measurements will be used for beam monitoring and system protection. SPIDER will be instrumented with mineral insulated cable thermocouples, on the grids, on other components of the beam source, and on the rear side of the beam dump water cooled elements. This paper deals with the final design and the technical specification of the thermal sensor diagnostic for SPIDER. In particular, the layout of the diagnostic, together with the sensor distribution in the different components, the cable routing, and the conditioning and acquisition cubicles are described.
SNAP 19 Pioneer F and G. Final Report
DOE R&D Accomplishments Database
1973-06-01
The generator developed for the Pioneer mission evolved from the SNAP 19 RTGs launched aboard the NIMBUS III spacecraft. In order to satisfy the power requirements and environment of the earth escape trajectory, significant modifications were made to the thermoelectric converter, heat source, and structural configuration. Specifically, a TAGS 2N thermoelectric couple was designed to provide higher efficiency and improved long-term power performance, and the electrical circuitry was modified to yield a very low magnetic field from current flow in the RTG. A new heat source was employed to satisfy operational requirements, and its integration with the generator required alteration to the method of providing support to the fuel capsule.
Ancient Glass: A Literature Search and its Role in Waste Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strachan, Denis M.; Pierce, Eric M.
2010-07-01
When developing a performance assessment model for the long-term disposal of immobilized low-activity waste (ILAW) glass, it is desirable to determine the durability of glass forms over very long periods of time. However, testing is limited to short time spans, so experiments are performed under conditions that accelerate the key geochemical processes that control weathering. Verification that the models currently being used can reliably calculate the long-term behavior of ILAW glass is a key component of the overall PA strategy. Therefore, Pacific Northwest National Laboratory was contracted by Washington River Protection Solutions, LLC to evaluate alternative strategies that can be used for PA source term model validation. One viable alternative strategy is the use of independent experimental data from archaeological studies of ancient or natural glass contained in the literature. These results represent potential independent experiments dating back approximately 3600 years, or to 1600 before the current era (bce), in the case of ancient glass, and 10^6 years or older in the case of natural glass. The results of this literature review suggest that additional experimental data may be needed before the results from archaeological studies can be used as a tool for model validation of glass weathering and, more specifically, disposal facility performance. This is largely because none of the existing data sets contains all of the information required to conduct PA source term calculations. For example, in many cases the sediments surrounding the glass were not collected and analyzed; therefore, the data required to compare against computer simulations of concentration flux are not available. This type of information is important to understanding the element release profile from the glass to the surrounding environment and provides a metric that can be used to calibrate source term models. Although useful, the available literature sources do not contain the information required to simulate the long-term performance of nuclear waste glasses in near-surface or deep geologic repositories. The information that will be required includes 1) experimental measurements to quantify the model parameters, 2) detailed analyses of altered glass samples, and 3) detailed analyses of the sediment surrounding the ancient glass samples.
stable carbon and hydrogen isotopic compositional ranges of methanes (δ13C and δ2H (D)) enable us to distinguish between microbial and thermogenic origin of natural gases. To identify stray gas origins, identify possible gas sources, create baseline, carry out site-specific monitoring, and monitor long-term changes
Characterization and Reliability of Vertical N-Type Gallium Nitride Schottky Contacts
2016-09-01
...arguably the Schottky barrier diode (SBD). The SBD is a fundamental component in the majority of power electronic devices; specifically, those used in... Ishizuka, and Ueno demonstrated the long-term reliability of vertical metal-GaN Schottky barrier diodes through their analysis of the degradation
CSI-EPT in Presence of RF-Shield for MR-Coils.
Arduino, Alessandro; Zilberti, Luca; Chiampi, Mario; Bottauscio, Oriano
2017-07-01
Contrast source inversion electric properties tomography (CSI-EPT) is a recently developed technique for electric properties tomography that recovers the electric properties distribution starting from measurements performed by magnetic resonance imaging scanners. This method is an optimal control approach based on the contrast source inversion technique, which distinguishes itself from other electric properties tomography techniques by its capability to also recover the local specific absorption rate distribution, essential for online dosimetry. Up to now, CSI-EPT has only been described in terms of integral equations, limiting its applicability to a homogeneous unbounded background. In order to extend the method to the presence of a shield in the domain, as in the recurring case of shielded radio frequency coils, a more general formulation of CSI-EPT, based on a functional viewpoint, is introduced here. Two different implementations of CSI-EPT are proposed for a 2-D transverse magnetic model problem, one dealing with an unbounded domain and one considering the presence of a perfectly conductive shield. The two implementations are applied to the same virtual measurements obtained by numerically simulating a shielded radio frequency coil. The results are compared in terms of both electric properties recovery and local specific absorption rate estimation, in order to investigate the requirement of an accurate modeling of the underlying physical problem.
Hanus, Robert; Vrkoslav, Vladimír; Hrdý, Ivan; Cvačka, Josef; Šobotník, Jan
2010-01-01
In 1959, P. Karlson and M. Lüscher introduced the term ‘pheromone’, broadly used nowadays for various chemicals involved in intraspecific communication. To demonstrate the term, they depicted the situation in termite societies, where king and queen inhibit the reproduction of nest-mates by an unknown chemical substance. Paradoxically, half a century later, neither the source nor the chemical identity of this ‘royal’ pheromone is known. In this study, we report for the first time the secretion of polar compounds of proteinaceous origin by functional reproductives in three termite species, Prorhinotermes simplex, Reticulitermes santonensis and Kalotermes flavicollis. Aqueous washes of functional reproductives contained sex-specific proteinaceous compounds, virtually absent in non-reproducing stages. Moreover, the presence of these compounds was clearly correlated with the age of reproductives and their reproductive status. We discuss the putative function of these substances in termite caste recognition and regulation. PMID:19939837
Domain-specific conflict adaptation without feature repetitions.
Akçay, Çağlar; Hazeltine, Eliot
2011-06-01
An influential account of how cognitive control deals with conflicting sources of information holds that conflict is monitored by a module that automatically recruits attention to resolve the conflict. This leads to reduced effects of conflict on the subsequent trial, a phenomenon termed conflict adaptation. A prominent question is whether control processes are domain specific--that is, recruited only by the particular type of conflict they resolve. Previous studies that have examined this question used two-choice tasks in which feature repetition effects could be responsible for domain-specific adaptation effects. We report two experiments using four-choice (Experiment 1) and five-choice (Experiment 2) tasks that contain two types of irrelevant sources of potentially conflicting information: stimulus location (Simon conflict) and distractors (flanker conflict). In both experiments, we found within-type conflict adaptation for both types of conflict after eliminating trials on which stimulus features were repeated from one trial to the next. Across-type conflict adaptation, however, was not significant. Thus, conflict adaptation was due to domain-specific recruitment of cognitive control. Our results add converging evidence to the idea that multiple independent control processes are involved in reactive cognitive control, although whether control is always local remains to be determined.
Bonkosky, M; Hernández-Delgado, E A; Sandoz, B; Robledo, I E; Norat-Ramírez, J; Mattei, H
2009-01-01
Human fecal contamination of coral reefs is a major cause of concern. Conventional methods used to monitor microbial water quality cannot be used to discriminate between different fecal pollution sources. Fecal coliforms, enterococci, and human-specific Bacteroides (HF183, HF134), general Bacteroides-Prevotella (GB32), and Clostridium coccoides group (CP) 16S rDNA PCR assays were used to test for the presence of non-point source fecal contamination across the southwestern Puerto Rico shelf. Inshore waters were highly turbid, consistently receiving fecal pollution from variable sources, and showing the highest frequency of positive molecular marker signals. Signals were also detected in offshore waters in compliance with existing microbiological quality regulations. Phylogenetic analysis showed that most isolates were of human fecal origin. The geographic extent of non-point source fecal pollution was large and impacted extensive coral reef systems. This could have deleterious long-term impacts on public health, local fisheries, and tourism potential if not adequately addressed.
Werneck, Alexandre Lins; Batigália, Fernando
2009-01-01
Terminology and Lexicography have been especially addressed to the Allied Health Sciences regarding discussion of case reports or concerning publication of scientific articles. The knowledge of Human Anatomy enables the understanding of medical terms, and the refinement of Medical Terminology makes possible a better anatomicomedical communication at a highly technical level. Most of the scientific publications in both Anatomy and Medicine are found only in English, and most dictionaries or search resources available do not have enough specificity to explain anatomicomedical, terminological, or lexicographical occurrences. To design and produce a multilingual terminological dictionary (Latin-English-Portuguese-Spanish) containing a list of English anatomicomedical terms in common usage in cardiology subspecialties addressed to medical students and professionals, to other allied health sciences professionals, and to translators working in this specific field. Terms and semantical and grammatical components were selected to compose an anatomicocardiological corpus. The adequacy to the thematic terminological research requests and the translation reliability level will be settled from the terminology specificity in contrast to the semantics, as well as from a peer survey of the main terms used by national and international experts in specialized journals, Internet sites, and textbooks on Anatomy and Cardiology. The inclusion criteria will be the terms included in the English, Portuguese, and Spanish Terminologia Anatomica - the official terminology of the anatomical sciences - and nonofficial, commonly used technical terms which lead to terminology or translation misunderstanding, often being a source of confusion. A table with a sample of the 508 most used anatomical cardiologic terms in English-language peer-reviewed journals of cardiology and (pediatric and adult) thoracic surgery is shown. The development of a multilingual terminological dictionary reduces the risk of ambiguities, inconsistencies, inutilities, and repetitions concerning the Nomenclature addressed to the Allied Health Sciences by prioritizing the inclusion of official technical terms and a judicious selection of commonly used terms. Efforts to standardize lists of structures in Human Anatomy lead to both opportunities for scientific update and conceptual enlightenment.
Subalpine Forest Carbon Cycling Short- and Long-Term Influence ofClimate and Species
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kueppers, L.; Harte, J.
2005-08-23
Ecosystem carbon cycle feedbacks to climate change comprise one of the largest remaining sources of uncertainty in global model predictions of future climate. Both direct climate effects on carbon cycling and indirect effects via climate-induced shifts in species composition may alter ecosystem carbon balance over the long term. In the short term, climate effects on carbon cycling may be mediated by ecosystem species composition. We used an elevational climate and tree species composition gradient in Rocky Mountain subalpine forest to quantify the sensitivity of all major ecosystem carbon stocks and fluxes to these factors. The climate sensitivities of carbon fluxes were species-specific in the cases of relative aboveground productivity and litter decomposition, whereas the climate sensitivity of dead wood decay did not differ between species, and total annual soil CO2 flux showed no strong climate trend. Lodgepole pine relative productivity increased with warmer temperatures and earlier snowmelt, while Engelmann spruce relative productivity was insensitive to climate variables. Engelmann spruce needle decomposition decreased linearly with increasing temperature (decreasing litter moisture), while lodgepole pine and subalpine fir needle decay showed a hump-shaped temperature response. We also found that total ecosystem carbon declined by 50 percent with a 2.8°C increase in mean annual temperature and a concurrent 63 percent decrease in growing season soil moisture, primarily due to large declines in mineral soil and dead wood carbon. We detected no independent effect of species composition on ecosystem C stocks. Overall, our carbon flux results suggest that, in the short term, any change in subalpine forest net carbon balance will depend on the specific climate scenario and spatial distribution of tree species. Over the long term, our carbon stock results suggest that with regional warming and drying, Rocky Mountain subalpine forest will be a net source of carbon to the atmosphere.
SU-E-T-507: Internal Dosimetry in Nuclear Medicine Using GATE and XCAT Phantom: A Simulation Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fallahpoor, M; Abbasi, M; Sen, A
Purpose: Monte Carlo simulations are routinely used for internal dosimetry studies. These studies are conducted with humanoid phantoms such as the XCAT phantom. In this abstract we present the absorbed doses for various pairs of source and target organs using three common radiotracers in nuclear medicine. Methods: The GATE software package is used for the Monte Carlo simulations. A typical female XCAT phantom is used as the input. Three radiotracers, 153Sm, 131I and 99mTc, are studied. The Specific Absorbed Fraction (SAF) for gamma rays (99mTc, 153Sm and 131I) and the Specific Fraction (SF) for beta particles (153Sm and 131I) are calculated for all 100 pairs of source-target organs, including brain, liver, lung, pancreas, kidney, adrenal, spleen, rib bone, bladder and ovaries. Results: The source organs themselves gain the highest absorbed dose compared to other organs. The dose is found to be inversely proportional to distance from the source organ. In the SAF results for 153Sm with the lung as the source organ, the rib bone gains 0.0730 kg^-1, which is more than the lung itself. Conclusion: The absorbed dose for various organs was studied in terms of SAF and SF. Such studies hold importance for future therapeutic procedures and optimization of the induced radiotracer.
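For reference, the quantities reported above follow the standard MIRD-style definitions, which are assumed here because the abstract does not restate them; they also explain the kg^-1 unit of the quoted rib-bone value.

```latex
% Specific absorbed fraction for a target organ T and a source organ S:
%   \phi(T \leftarrow S) is the fraction of energy emitted in S that is absorbed in T,
%   m_T is the target-organ mass, so the SAF carries units of kg^{-1}.
\mathrm{SAF}(T \leftarrow S) \;=\; \frac{\phi(T \leftarrow S)}{m_T},
\qquad
\phi(T \leftarrow S) \;=\; \frac{E_{\text{absorbed in } T}}{E_{\text{emitted in } S}} .
```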
Modeling of Selenium for the San Diego Creek Watershed and Newport Bay, California
Presser, Theresa S.; Luoma, Samuel N.
2009-01-01
The San Diego Creek watershed and Newport Bay in southern California are contaminated with selenium (Se) as a result of groundwater associated with urban development overlying a historical wetland, the Swamp of the Frogs. The primary Se source is drainage from surrounding seleniferous marine sedimentary formations. An ecosystem-scale model was employed as a tool to assist development of a site-specific Se objective for the region. The model visualizes outcomes of different exposure scenarios in terms of bioaccumulation in predators using partitioning coefficients, trophic transfer factors, and site-specific data for food-web inhabitants and particulate phases. Predicted Se concentrations agreed well with field observations, validating the use of the model as a realistic tool for testing exposure scenarios. Using the fish tissue and bird egg guidelines suggested by regulatory agencies, allowable water concentrations were determined for different conditions and locations in the watershed and the bay. The model thus facilitated development of a site-specific Se objective that was locally relevant and provided a basis for step-by-step implementation of source control.
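The chained calculation that such an ecosystem-scale model formalizes can be sketched as below; the coefficient values are hypothetical placeholders rather than the site-specific Kd and trophic transfer factors used in the study.

```python
# Sketch of the chained bioaccumulation calculation: water -> particulates (Kd)
# -> invertebrates (TTF) -> fish tissue (TTF). All coefficient values are
# hypothetical placeholders, not the site-specific values used in the study.
def predicted_tissue_se(c_water_ug_per_L, kd_L_per_kg, ttf_invert, ttf_fish):
    c_particulate = kd_L_per_kg * c_water_ug_per_L / 1000.0  # ug/g dry weight
    c_invert = ttf_invert * c_particulate
    return ttf_fish * c_invert                                # ug/g in fish tissue

# Inverting the chain gives an allowable water concentration from a tissue guideline
def allowable_water_se(tissue_guideline_ug_per_g, kd_L_per_kg, ttf_invert, ttf_fish):
    return 1000.0 * tissue_guideline_ug_per_g / (kd_L_per_kg * ttf_invert * ttf_fish)

print(allowable_water_se(tissue_guideline_ug_per_g=5.0,
                         kd_L_per_kg=1000.0, ttf_invert=2.5, ttf_fish=1.1))
```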
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lill, R.; Sereno, N.; Yang, B.
The Advanced Photon Source (APS) is currently in the preliminary design phase for the multi-bend achromat (MBA) lattice upgrade. Beam stability is critical for the MBA and will require long-term drift, defined as beam motion over a seven-day timescale, to be no more than 1 micron at the insertion device locations and beam angle change to be no more than 0.25 micro-radian. Mechanical stability of beam position monitor (BPM) pickup electrodes mounted on insertion device vacuum chambers places a fundamental limitation on long-term beam stability for insertion device beamlines. We present the design and implementation of prototype mechanical motion system (MMS) instrumentation for quantifying this type of motion specifically in the APS accelerator tunnel and experiment hall floor under normal operating conditions. The MMS presently provides critical position information on the vacuum chamber and BPM support systems. Initial results of the R&D prototype systems have demonstrated that the chamber movements far exceed the long-term drift tolerance specified for the APS Upgrade MBA storage ring.
77 FR 19740 - Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant Accident
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
... NUCLEAR REGULATORY COMMISSION [NRC-2010-0249] Water Sources for Long-Term Recirculation Cooling... Regulatory Guide (RG) 1.82, ``Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant... regarding the sumps and suppression pools that provide water sources for emergency core cooling, containment...
Distinguishing erosive osteoarthritis and calcium pyrophosphate deposition disease.
Rothschild, Bruce M
2013-04-18
Erosive osteoarthritis is a term utilized to describe a specific inflammatory condition of the interphalangeal and first carpal metacarpal joints of the hands. The term has become a part of medical philosophical semantics and paradigms, but the issue is actually more complicated. Even the term osteoarthritis (non-erosive) has been controversial, with some suggesting osteoarthrosis to be more appropriate in view of the perspective that it is a non-inflammatory process undeserving of the "itis" suffix. The term "erosion" has also been a source of confusion in osteoarthritis, as it has been used to describe cartilage, not bone lesions. Inflammation in individuals with osteoarthritis actually appears to be related to complicating phenomena, such as calcium pyrophosphate and hydroxyapatite crystal deposition producing arthritis. Erosive osteoarthritis is the contentious term. It is used to describe a specific form of joint damage to specific joints. The damage has been termed erosions and the distribution of the damage is to the interphalangeal joints of the hand and first carpal metacarpal joint. Inflammation is recognized by joint redness and warmth, while X-rays reveal alteration of the articular surfaces, producing a smudged appearance. This ill-defined joint damage has a crumbling appearance and is quite distinct from the sharply defined erosions of rheumatoid arthritis and spondyloarthropathy. The appearance is identical to those found with calcium pyrophosphate deposition disease, both in character and their unique responsiveness to hydroxychloroquine treatment. Low doses of the latter often resolve symptoms within weeks, in contrast to higher doses and the months required for response in other forms of inflammatory arthritis. Reconsidering erosive osteoarthritis as a form of calcium pyrophosphate deposition disease guides physicians to more effective therapeutic intervention.
Distinguishing erosive osteoarthritis and calcium pyrophosphate deposition disease
Rothschild, Bruce M
2013-01-01
Erosive osteoarthritis is a term utilized to describe a specific inflammatory condition of the interphalangeal and first carpal metacarpal joints of the hands. The term has become a part of medical philosophical semantics and paradigms, but the issue is actually more complicated. Even the term osteoarthritis (non-erosive) has been controversial, with some suggesting osteoarthrosis to be more appropriate in view of the perspective that it is a non-inflammatory process undeserving of the “itis” suffix. The term “erosion” has also been a source of confusion in osteoarthritis, as it has been used to describe cartilage, not bone lesions. Inflammation in individuals with osteoarthritis actually appears to be related to complicating phenomena, such as calcium pyrophosphate and hydroxyapatite crystal deposition producing arthritis. Erosive osteoarthritis is the contentious term. It is used to describe a specific form of joint damage to specific joints. The damage has been termed erosions and the distribution of the damage is to the interphalangeal joints of the hand and first carpal metacarpal joint. Inflammation is recognized by joint redness and warmth, while X-rays reveal alteration of the articular surfaces, producing a smudged appearance. This ill-defined joint damage has a crumbling appearance and is quite distinct from the sharply defined erosions of rheumatoid arthritis and spondyloarthropathy. The appearance is identical to those found with calcium pyrophosphate deposition disease, both in character and their unique responsiveness to hydroxychloroquine treatment. Low doses of the latter often resolve symptoms within weeks, in contrast to higher doses and the months required for response in other forms of inflammatory arthritis. Reconsidering erosive osteoarthritis as a form of calcium pyrophosphate deposition disease guides physicians to more effective therapeutic intervention. PMID:23610748
Exploiting semantic linkages among multiple sources for semantic information retrieval
NASA Astrophysics Data System (ADS)
Li, JianQiang; Yang, Ji-Jiang; Liu, Chunchen; Zhao, Yu; Liu, Bo; Shi, Yuliang
2014-07-01
The vision of the Semantic Web is to build a global Web of machine-readable data to be consumed by intelligent applications. As the first step to make this vision come true, the initiative of linked open data has fostered many novel applications aimed at improving data accessibility in the public Web. Comparably, the enterprise environment is so different from the public Web that most potentially usable business information originates in an unstructured form (typically in free text), which poses a challenge for the adoption of semantic technologies in the enterprise environment. Considering that the business information in a company is highly specific and centred around a set of commonly used concepts, this paper describes a pilot study to migrate the concept of linked data into the development of a domain-specific application, i.e. a vehicle repair support system. The set of commonly used concepts, including the part names of a car and the phenomenon terms used in car repair, are employed to build the linkage between data and documents distributed among different sources, leading to the fusion of documents and data across source boundaries. Then, we describe approaches of semantic information retrieval that consume these linkages for value creation for companies. The experiments on two real-world data sets show that the proposed approaches outperform the best baseline by 6.3-10.8% and 6.4-11.1% in terms of top-5 and top-10 precision, respectively. We believe that our pilot study can serve as an important reference for the development of similar semantic applications in an enterprise environment.
Bastos, A M; Litvak, V; Moran, R; Bosman, C A; Fries, P; Friston, K J
2015-03-01
This paper reports a dynamic causal modeling study of electrocorticographic (ECoG) data that addresses functional asymmetries between forward and backward connections in the visual cortical hierarchy. Specifically, we ask whether forward connections employ gamma-band frequencies, while backward connections preferentially use lower (beta-band) frequencies. We addressed this question by modeling empirical cross spectra using a neural mass model equipped with superficial and deep pyramidal cell populations, which model the sources of forward and backward connections, respectively. This enabled us to reconstruct the transfer functions and associated spectra of specific subpopulations within cortical sources. We first established that Bayesian model comparison was able to discriminate between forward and backward connections, defined in terms of their cells of origin. We then confirmed that model selection was able to identify extrastriate (V4) sources as being hierarchically higher than early visual (V1) sources. Finally, an examination of the auto spectra and transfer functions associated with superficial and deep pyramidal cells confirmed that forward connections employed predominantly higher (gamma) frequencies, while backward connections were mediated by lower (alpha/beta) frequencies. We discuss these findings in relation to current views about alpha, beta, and gamma oscillations and predictive coding in the brain. Copyright © 2015. Published by Elsevier Inc.
The aesthetic experience of nursing.
Austgard, Kitt
2006-01-01
This article highlights the distinction between the 'art of nursing' and 'fine art'. While something in the nature of nursing can be described as 'the art of nursing', it is not to be misunderstood as 'fine art' or craft. Therefore, the term 'aesthetic' in relation to nursing should not be linked to the aesthetic of modern art, but instead to a broader and more general meaning of the word. The paper's main focus is the aesthetic experience, which is treated in a hermeneutic way and elucidated from classical sources and the philosophy of nursing and from Art. The paper argues that the pioneers used the term 'art of nursing' in a metaphorical way to say something more specific on the nature of nursing. The term illustrates the nurse's ability to practise at the highest possible level of excellence.
Romanian Experience in The Conditioning of Radium Sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dogaru, Gh.; Dragolici, F.; Rotarescu, Gh.
2008-07-01
Ra-226, the first radionuclide separated from pitchblende in 1898 by Pierre and Marie Curie, was successfully used in medicine, industry, and other fields, being the only available radionuclide until 1940, when other radionuclides began to be produced in accelerators. In the long term, the use of Ra-226 sealed sources is no longer safe due to: the high specific activity; the long half-life; decay into Rn-222 gas, which increases the internal pressure of the capsule, leading in time to leakage; the solubility of the salts used as raw materials from which the sealed sources are manufactured; and the lack of information and records on their manufacture and operation. Based on these considerations, the Romanian regulatory authority no longer authorizes the use of these sealed sources [1]. The paper presents some aspects of the Romanian experience related to the collection and conditioning of radium sealed sources. Data relating to the radium inventory, as well as the arrangements made in order to create a workshop for the conditioning of radium sources, are presented. (authors)
Shim, Kyusung; Do, Nhu Tri; An, Beongku
2017-01-01
In this paper, we study the physical layer security (PLS) of opportunistic scheduling for uplink scenarios of multiuser multirelay cooperative networks. To this end, we propose a low-complexity source relay selection scheme with comparable secrecy performance, called the proposed source relay selection (PSRS) scheme. Specifically, the PSRS scheme first selects the least vulnerable source and then selects the relay that maximizes the system secrecy capacity for the given selected source. Additionally, the maximal ratio combining (MRC) technique and the selection combining (SC) technique are considered at the eavesdropper, respectively. Investigating the system performance in terms of secrecy outage probability (SOP), closed-form expressions of the SOP are derived. The developed analysis is corroborated through Monte Carlo simulation. Numerical results show that the PSRS scheme significantly improves the secure ability of the system compared to that of the random source relay selection scheme, but does not outperform the optimal joint source relay selection (OJSRS) scheme. However, the PSRS scheme drastically reduces the required amount of channel state information (CSI) estimations compared to that required by the OJSRS scheme, especially in dense cooperative networks. PMID:28212286
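The two-step selection described above can be sketched as follows; the channel model, the interpretation of "least vulnerable" as the weakest source-eavesdropper link, and the bottleneck treatment of the two hops are simplifying assumptions made for illustration, not the paper's exact system model.

```python
# Sketch of the two-step selection: pick the least vulnerable source first, then
# the relay that maximizes secrecy capacity for that source. Channel gains are
# hypothetical; secrecy capacity uses the standard
# [log2(1 + SNR_main) - log2(1 + SNR_eve)]^+ expression.
import numpy as np

rng = np.random.default_rng(2)
n_sources, n_relays, snr = 4, 3, 10.0

g_sr = rng.exponential(1.0, (n_sources, n_relays))  # source -> relay gains
g_se = rng.exponential(0.3, n_sources)              # source -> eavesdropper gains
g_rd = rng.exponential(1.0, n_relays)               # relay  -> destination gains

def secrecy_capacity(gain_main, gain_eve):
    return max(0.0, np.log2(1 + snr * gain_main) - np.log2(1 + snr * gain_eve))

# Step 1: least vulnerable source = weakest link to the eavesdropper (assumption)
s = int(np.argmin(g_se))
# Step 2: relay maximizing secrecy capacity for that source
# (bottleneck of the two main-link hops, simplified for illustration)
caps = [secrecy_capacity(min(g_sr[s, r], g_rd[r]), g_se[s]) for r in range(n_relays)]
r = int(np.argmax(caps))
print(f"selected source {s}, relay {r}, secrecy capacity {caps[r]:.2f} bit/s/Hz")
```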
Local tsunamis and earthquake source parameters
Geist, Eric L.; Dmowska, Renata; Saltzman, Barry
1999-01-01
This chapter establishes the relationship among earthquake source parameters and the generation, propagation, and run-up of local tsunamis. In general terms, displacement of the seafloor during the earthquake rupture is modeled using elastic dislocation theory, for which the displacement field is dependent on the slip distribution, fault geometry, and the elastic response and properties of the medium. Specifically, nonlinear long-wave theory governs the propagation and run-up of tsunamis. A parametric study is devised to examine the relative importance of individual earthquake source parameters on local tsunamis, because the physics that describes tsunamis from generation through run-up is complex. Analysis of the source parameters of various tsunamigenic earthquakes has indicated that the details of the earthquake source, namely, nonuniform distribution of slip along the fault plane, have a significant effect on the local tsunami run-up. Numerical methods have been developed to address realistic bathymetric and shoreline conditions. The accuracy of determining the run-up on shore is directly dependent on the source parameters of the earthquake, which provide the initial conditions used for the hydrodynamic models.
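For concreteness, the nonlinear long-wave (shallow-water) equations referred to above can be written in one horizontal dimension, omitting friction and dispersion, as:

```latex
% One-dimensional nonlinear long-wave (shallow-water) equations, with sea-surface
% elevation \eta, depth-averaged velocity u, still-water depth h, and
% gravitational acceleration g; friction and dispersion are omitted in this sketch.
\frac{\partial \eta}{\partial t} + \frac{\partial}{\partial x}\bigl[(h+\eta)\,u\bigr] = 0,
\qquad
\frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x} + g\,\frac{\partial \eta}{\partial x} = 0 .
```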
Unmixing Magnetic Hysteresis Loops
NASA Astrophysics Data System (ADS)
Heslop, D.; Roberts, A. P.
2012-04-01
Magnetic hysteresis loops provide important information in rock and environmental magnetic studies. Natural samples often contain an assemblage of magnetic particles composed of components with different origins. Each component potentially carries important environmental information. Hysteresis loops, however, provide information concerning the bulk magnetic assemblage, which makes it difficult to isolate the specific contributions from different sources. For complex mineral assemblages an unmixing strategy with which to separate hysteresis loops into their component parts is therefore essential. Previous methods to unmix hysteresis data have aimed at separating individual loops into their constituent parts using libraries of type-curves thought to correspond to specific mineral types. We demonstrate an alternative approach, which rather than decomposing a single loop into monomineralic contributions, examines a collection of loops to determine their constituent source materials. These source materials may themselves be mineral mixtures, but they provide a genetically meaningful decomposition of a magnetic assemblage in terms of the processes that controlled its formation. We show how an empirically derived hysteresis mixing space can be created, without resorting to type-curves, based on the co-variation within a collection of measured loops. Physically realistic end-members, which respect the expected behaviour and symmetries of hysteresis loops, can then be extracted from the mixing space. These end-members allow the measured loops to be described as a combination of invariant parts that are assumed to represent the different sources in the mixing model. Particular attention is paid to model selection and estimating the complexity of the mixing model, specifically, how many end-members should be included. We demonstrate application of this approach using lake sediments from Butte Valley, northern California. Our method successfully separates the hysteresis loops into sources with a variety of terrigenous and authigenic origins.
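The linear mixing idea behind this kind of end-member decomposition can be sketched with a generic non-negative factorization on synthetic curves; NMF is used below purely as an illustration and is not the authors' algorithm, which additionally respects hysteresis-loop symmetries and works on measured loops.

```python
# Sketch of the linear mixing idea behind end-member unmixing: each measured
# curve is treated as a non-negative combination of a small number of
# end-member curves. Data are synthetic; NMF is illustrative only.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)
n_loops, n_field_steps, n_endmembers = 40, 200, 2

# Two synthetic "end-member" upper-branch curves (e.g. a soft and a hard component)
field = np.linspace(-1, 1, n_field_steps)
em = np.vstack([np.tanh(8 * field) + 1.0, np.tanh(2 * (field - 0.2)) + 1.0])

abundances = rng.dirichlet(np.ones(n_endmembers), size=n_loops)
loops = abundances @ em + 0.01 * rng.standard_normal((n_loops, n_field_steps))

model = NMF(n_components=n_endmembers, init="nndsvda", max_iter=2000, random_state=0)
mixing = model.fit_transform(np.clip(loops, 0, None))   # per-loop contributions
endmembers = model.components_                          # recovered end-member curves
print(np.round(mixing[:3] / mixing[:3].sum(axis=1, keepdims=True), 2))
```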
Optimum rocket propulsion for energy-limited transfer
NASA Technical Reports Server (NTRS)
Zuppero, Anthony; Landis, Geoffrey A.
1991-01-01
In order to effect large-scale return of extraterrestrial resources to Earth orbit, it is desirable to optimize the propulsion system to maximize the mass of payload returned per unit energy expended. This optimization problem is different from the conventional rocket propulsion optimization. A rocket propulsion system consists of an energy source plus reaction mass. In a conventional chemical rocket, the energy source and the reaction mass are the same. For the transportation system required, however, the best system performance is achieved if the reaction mass used is from a locally available source. In general, the energy source and the reaction mass will be separate. One such rocket system is the nuclear thermal rocket, in which the energy source is a reactor and the reaction mass a fluid which is heated by the reactor and exhausted. Another energy-limited rocket system is the hydrogen/oxygen rocket where H2/O2 fuel is produced by electrolysis of water using a solar array or a nuclear reactor. The problem is to choose the optimum specific impulse (or equivalently exhaust velocity) to minimize the amount of energy required to produce a given mission delta-v in the payload. The somewhat surprising result is that the optimum specific impulse is not the maximum possible value, but is proportional to the mission delta-v. In general terms, at the beginning of the mission it is optimum to use a very low specific impulse and expend a lot of reaction mass, since this is the most energy efficient way to transfer momentum. However, as the mission progresses, it becomes important to minimize the amount of reaction mass expelled, since energy is wasted moving the reaction mass. Thus, the optimum specific impulse will increase with the mission delta-v. Optimum Isp is derived for maximum payload return per energy expended for both the case of fixed and variable Isp engines. Sample missions analyzed include return of water payloads from the moons of Mars and of Saturn.
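The stated proportionality between optimum specific impulse and mission delta-v follows from the classic energy-limited argument: with the payload mass fixed, minimizing E = ½ m_p v_e² subject to the rocket equation m_p = m_payload (exp(Δv/v_e) − 1) leads to a transcendental condition whose root gives v_e ≈ 0.63 Δv. The sketch below solves that condition numerically; it is a simplified fixed-Isp illustration under those assumptions, not the paper's full derivation for variable-Isp engines.

```python
# Solve the energy-limited optimum: minimize E = 0.5 * m_p * v_e**2 with
# m_p = m_payload * (exp(delta_v / v_e) - 1).  Setting dE/dv_e = 0 gives the
# transcendental condition u = 2 * (1 - exp(-u)), where u = delta_v / v_e.
import numpy as np
from scipy.optimize import brentq

def optimum_exhaust_velocity(delta_v):
    """Exhaust velocity minimizing energy per unit payload for a fixed delta-v."""
    u_opt = brentq(lambda u: u - 2.0 * (1.0 - np.exp(-u)), 1e-6, 10.0)  # ~1.594
    return delta_v / u_opt          # ~0.6275 * delta_v

# Example: a 5 km/s transfer favours an exhaust velocity of roughly 3.1 km/s.
print(optimum_exhaust_velocity(5.0e3))
```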
NASA Astrophysics Data System (ADS)
Gica, E.
2016-12-01
The Short-term Inundation Forecasting for Tsunamis (SIFT) tool, developed by NOAA Center for Tsunami Research (NCTR) at the Pacific Marine Environmental Laboratory (PMEL), is used in forecast operations at the Tsunami Warning Centers in Alaska and Hawaii. The SIFT tool relies on a pre-computed tsunami propagation database, real-time DART buoy data, and an inversion algorithm to define the tsunami source. The tsunami propagation database is composed of 50×100km unit sources, simulated basin-wide for at least 24 hours. Different combinations of unit sources, DART buoys, and length of real-time DART buoy data can generate a wide range of results within the defined tsunami source. For an inexperienced SIFT user, the primary challenge is to determine which solution, among multiple solutions for a single tsunami event, would provide the best forecast in real time. This study investigates how the use of different tsunami sources affects simulated tsunamis at tide gauge locations. Using the tide gauge at Hilo, Hawaii, a total of 50 possible solutions for the 2011 Tohoku tsunami are considered. Maximum tsunami wave amplitude and root mean square error results are used to compare tide gauge data and the simulated tsunami time series. Results of this study will facilitate SIFT users' efforts to determine if the simulated tide gauge tsunami time series from a specific tsunami source solution would be within the range of possible solutions. This study will serve as the basis for investigating more historical tsunami events and tide gauge locations.
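The comparison metrics mentioned above can be sketched as follows; the observed and simulated series are synthetic placeholders used only to show how the maximum amplitude and root-mean-square error would be computed for one candidate source solution.

```python
# Sketch of the two comparison metrics described above: maximum tsunami wave
# amplitude and root-mean-square error between an observed tide-gauge series
# and a simulated series for one candidate source solution. The time series
# here are synthetic placeholders, not Hilo gauge data.
import numpy as np

t = np.linspace(0, 6 * 3600, 2161)                    # six hours at 10 s sampling
observed  = 0.8 * np.sin(2 * np.pi * t / 1800) * np.exp(-t / 7200)
simulated = 0.7 * np.sin(2 * np.pi * (t - 120) / 1800) * np.exp(-t / 7000)

rmse = float(np.sqrt(np.mean((observed - simulated) ** 2)))
amp_ratio = float(np.max(np.abs(simulated)) / np.max(np.abs(observed)))
print(f"RMSE = {rmse:.3f} m, max-amplitude ratio = {amp_ratio:.2f}")
```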
Ma, Jing; Hipel, Keith W; Hanson, Mark L
2017-12-21
A comprehensive evaluation of public participation in rural domestic waste (RDW) source-separated collection in China was carried out within a social-dimension framework, specifically in terms of public perception, awareness, attitude, and willingness to pay for RDW management. The evaluation was based on a case study conducted in Guilin, Guangxi Zhuang Autonomous Region, China, which is a representative of most inland areas of the country with a GDP around the national average. It was found that unlike urban residents, rural residents maintained a high rate of recycling, but in a spontaneous manner; they paid more attention to issues closely related to their daily lives, but less attention to those at the general level; their awareness of RDW source-separated collection was low and different age groups showed significantly different preferences regarding the sources of knowledge acquirement. Among potential information sources, village committees played a very important role in knowledge dissemination; for the respondents' pro-environmental attitudes, the influencing factor of "lack of legislation/policy" was considered to be significant; mandatory charges for waste collection and disposal had a high rate of acceptance among rural residents; and high monthly incomes had a positive correlation with both public pro-environmental attitudes and public willingness to pay for extra charges levied by RDW management. These observations imply that, for decision-makers in the short term, implementing mandatory RDW source-separated collection programs with enforced guidelines and economic compensation is more effective, while in the long run, promoting pro-environmental education to rural residents is more important.
Radiological analysis of plutonium glass batches with natural/enriched boron
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rainisch, R.
2000-06-22
The disposition of surplus plutonium inventories by the US Department of Energy (DOE) includes the immobilization of certain plutonium materials in a borosilicate glass matrix, also referred to as vitrification. This paper addresses source terms of plutonium masses immobilized in a borosilicate glass matrix where the glass components include both natural boron and enriched boron. The calculated source terms pertain to neutron and gamma source strength (particles per second) and source spectrum changes. The calculated source terms corresponding to natural boron and enriched boron are compared to determine the benefits (decrease in radiation source terms) of using enriched boron. The analysis of plutonium glass source terms shows that a large component of the neutron source terms is due to (α, n) reactions. The americium-241 and plutonium present in the glass emit alpha particles. These alpha particles interact with low-Z nuclides like B-11, B-10, and O-17 in the glass to produce neutrons. The low-Z nuclides are referred to as target particles. The reference glass contains 9.4 wt percent B2O3. Boron-11 was found to strongly support the (α, n) reactions in the glass matrix. B-11 has a natural abundance of over 80 percent. The (α, n) reaction rates for B-10 are lower than for B-11, and the analysis shows that the plutonium glass neutron source terms can be reduced by artificially enriching natural boron with B-10. The natural abundance of B-10 is 19.9 percent. Boron enriched to 96 wt percent B-10 or above can be obtained commercially. Since lower source terms imply lower dose rates to radiation workers handling the plutonium glass materials, it is important to know the achievable decrease in source terms as a result of boron enrichment. Plutonium materials are normally handled in glove boxes with shielded glass windows, and the work entails both extremity and whole-body exposures. Lowering the source terms of the plutonium batches will make the handling of these materials less difficult and will reduce radiation exposure to operating workers.
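To make the enrichment argument concrete, the toy comparison below scales the boron contribution to the (α, n) neutron term by the isotopic atom fractions for natural boron versus boron enriched to 96 percent B-10. The per-isotope yields are hypothetical placeholders chosen only to reflect the reported ordering (B-11 yield higher than B-10); they are not evaluated nuclear data.

```python
# Relative boron (alpha, n) source for a given alpha activity in the glass.
# yield_b10 and yield_b11 are hypothetical thick-target yields per alpha particle.
def relative_boron_alpha_n(frac_b10, yield_b10=1.0, yield_b11=4.0):
    frac_b11 = 1.0 - frac_b10
    return frac_b10 * yield_b10 + frac_b11 * yield_b11

natural = relative_boron_alpha_n(0.199)   # natural B-10 abundance (19.9 at. %)
enriched = relative_boron_alpha_n(0.96)   # commercially available enrichment
print(f"boron (alpha,n) term drops to {enriched / natural:.0%} of the natural-boron value")
```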
Short-Term Memory Stages in Sign vs. Speech: The Source of the Serial Span Discrepancy
Hall, Matthew L.
2011-01-01
Speakers generally outperform signers when asked to recall a list of unrelated verbal items. This phenomenon is well established, but its source has remained unclear. In this study, we evaluate the relative contribution of the three main processing stages of short-term memory – perception, encoding, and recall – in this effect. The present study factorially manipulates whether American Sign Language (ASL) or English was used for perception, memory encoding, and recall in hearing ASL-English bilinguals. Results indicate that using ASL during both perception and encoding contributes to the serial span discrepancy. Interestingly, performing recall in ASL slightly increased span, ruling out the view that signing is in general a poor choice for short-term memory. These results suggest that despite the general equivalence of sign and speech in other memory domains, speech-based representations are better suited for the specific task of perception and memory encoding of a series of unrelated verbal items in serial order through the phonological loop. This work suggests that interpretation of performance on serial recall tasks in English may not translate straightforwardly to serial tasks in sign language. PMID:21450284
Pollution monitoring using networks of honey bees
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bromenshenk, J.J.; Dewart, M.L.; Thomas, J.M.
1983-08-01
Each year thousands of chemicals in large quantities are introduced into the global environment and the need for effective methods of monitoring these substances has steadily increased. Most monitoring programs rely upon instrumentation to measure specific contaminants in air, water, or soil. However, it has become apparent that humans and their environment are exposed to complex mixtures of chemicals rather than single entities. As our ability to detect ever smaller quantities of pollutants has increased, the biological significance of these findings has become more uncertain. Also, it is clear that monitoring efforts should shift from short-term studies of easily identifiable sources in localized areas to long-term studies of multiple sources over widespread regions. Our investigations aim at providing better tools to meet these exigencies. Honey bees are discussed as an effective, long-term, self-sustaining system for monitoring environmental impacts. Our results indicate that the use of regional, and possibly national or international, capability can be realized with the aid of beekeepers in obtaining samples and conducting measurements. This approach has the added advantage of public involvement in environmental problem solving and protection of human health and environmental quality.
NASA Astrophysics Data System (ADS)
Nooshiri, Nima; Saul, Joachim; Heimann, Sebastian; Tilmann, Frederik; Dahm, Torsten
2017-02-01
Global earthquake locations are often associated with very large systematic travel-time residuals even for clear arrivals, especially for regional and near-regional stations in subduction zones because of their strongly heterogeneous velocity structure. Travel-time corrections can drastically reduce travel-time residuals at regional stations and, in consequence, improve the relative location accuracy. We have extended the shrinking-box source-specific station terms technique to regional and teleseismic distances and adapted the algorithm for probabilistic, nonlinear, global-search location. We evaluated the potential of the method to compute precise relative hypocentre locations on a global scale. The method has been applied to two specific test regions using existing P- and pP-phase picks. The first data set consists of 3103 events along the Chilean margin and the second one comprises 1680 earthquakes in the Tonga-Fiji subduction zone. Pick data were obtained from the GEOFON earthquake bulletin, produced using data from all available, global station networks. A set of timing corrections varying as a function of source position was calculated for each seismic station. In this way, we could correct the systematic errors introduced into the locations by the inaccuracies in the assumed velocity structure without explicitly solving for a velocity model. Residual statistics show that the median absolute deviation of the travel-time residuals is reduced by 40-60 per cent at regional distances, where the velocity anomalies are strong. Moreover, the spread of the travel-time residuals decreased by ~20 per cent at teleseismic distances (>28°). Furthermore, strong variations in initial residuals as a function of recording distance are smoothed out in the final residuals. The relocated catalogues exhibit less scattered locations in depth and sharper images of the seismicity associated with the subducting slabs. Comparison with a high-resolution local catalogue reveals that our relocation process significantly improves the hypocentre locations compared to standard locations.
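The essence of the source-specific station terms scheme can be written compactly: each pick receives a correction equal to an average of residuals from nearby events recorded at the same station, and the averaging neighbourhood shrinks from one relocation pass to the next. The function below is a schematic toy version under those assumptions (the median-based update and all names are illustrative), and it omits the relocation step that would follow each pass.

```python
import numpy as np

def shrinking_box_ssst(residuals, event_xyz, station_ids, radii):
    """Schematic shrinking-box source-specific station terms (SSST).
    residuals   : (n_picks,) travel-time residuals from an initial location pass
    event_xyz   : (n_picks, 3) hypocentre coordinates of the event behind each pick
    station_ids : (n_picks,) integer station index of each pick
    radii       : decreasing sequence of neighbourhood radii (the shrinking box)
    Returns per-pick timing corrections that vary with source position."""
    residuals = np.asarray(residuals, dtype=float)
    event_xyz = np.asarray(event_xyz, dtype=float)
    station_ids = np.asarray(station_ids)
    corrections = np.zeros_like(residuals)
    for r in radii:                            # shrink the neighbourhood each pass
        adjusted = residuals - corrections
        for i in range(residuals.size):
            same_station = station_ids == station_ids[i]
            nearby = np.linalg.norm(event_xyz - event_xyz[i], axis=1) <= r
            sel = same_station & nearby
            # station term for this pick: robust average of nearby residuals
            corrections[i] += np.median(adjusted[sel])
        # a full implementation would relocate all events here using the
        # updated corrections before shrinking the radius further
    return corrections
```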
NASA Astrophysics Data System (ADS)
Perez, Pedro B.; Hamawi, John N.
2017-09-01
Nuclear power plant radiation protection design features are based on radionuclide source terms derived from conservative assumptions that envelope expected operating experience. Two parameters that significantly affect the radionuclide concentrations in the source term are the failed fuel fraction and the effective fission product appearance rate coefficients. The failed fuel fraction may be a regulatory-based assumption, as in the U.S. Appearance rate coefficients are not specified in regulatory requirements, but have been referenced to experimental data that is over 50 years old. No doubt the source terms are conservative, as demonstrated by operating experience that has included failed fuel, but they may be too conservative, leading, for example, to over-designed shielding for normal operations. Design basis source term methodologies for normal operations had not advanced until 2015, when EPRI published an updated ANSI/ANS 18.1 source term basis document. Our paper revisits the fission product appearance rate coefficients as applied in the derivation of source terms following the original U.S. NRC NUREG-0017 methodology. New coefficients have been calculated based on recent EPRI results, which demonstrate the conservatism in nuclear power plant shielding design.
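The role of the two parameters can be seen in a simple coolant activity balance, a generic textbook-style sketch rather than the NUREG-0017 or ANSI/ANS 18.1 formulation: the coolant inventory of each nuclide builds up from an appearance term scaled by the failed fuel fraction and is depleted by radioactive decay plus cleanup removal. The function name and all numbers below are illustrative assumptions.

```python
def equilibrium_coolant_activity(appearance_rate, failed_fuel_fraction,
                                 decay_const, cleanup_rate, coolant_mass_kg):
    """Equilibrium specific coolant activity from a simple balance:
        dA/dt = R * f - (lambda + beta) * A   ->   A_eq = R * f / (lambda + beta)
    R: nuclide appearance rate into the coolant at 100% failed fuel (Bq/s)
    f: failed fuel fraction, lambda: decay constant (1/s), beta: cleanup rate (1/s)."""
    a_eq = appearance_rate * failed_fuel_fraction / (decay_const + cleanup_rate)
    return a_eq / coolant_mass_kg      # Bq per kg of coolant

# Illustrative placeholders: halving the appearance rate coefficient halves the
# design-basis coolant activity for the same failed fuel fraction.
print(equilibrium_coolant_activity(1.0e12, 0.0025, 1.0e-5, 2.0e-5, 2.5e5))
print(equilibrium_coolant_activity(0.5e12, 0.0025, 1.0e-5, 2.0e-5, 2.5e5))
```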
Liquid-metal-ion source development for space propulsion at ARC.
Tajmar, M; Scharlemann, C; Genovese, A; Buldrini, N; Steiger, W; Vasiljevich, I
2009-04-01
The Austrian Research Centers have a long history of developing indium Liquid-Metal-Ion Sources (LMIS) for space applications including spacecraft charging compensators, SIMS and propulsion. Specifically, the application as a thruster requires long-term operation as well as high-current operation, which is very challenging. Recently, we demonstrated the operation of a cluster of single LMIS at an average current of 100 μA each for more than 4800 h and developed models for tip erosion and droplet deposition suggesting that such a LMIS can operate up to 20,000 h or more. In order to drastically increase the current, a porous multi-tip source that allows operation up to several mA was developed. Our paper will highlight the problem areas and challenges from our LMIS development focusing on space propulsion applications.
Computer model to simulate ionizing radiation effects correlates with experimental data
NASA Astrophysics Data System (ADS)
Perez-Poch, Antoni
Exposure to radiation from high-energy protons and particles with ionizing properties is a major challenge for long-term space missions. The specific effect of such radiation on hematopoietic cells is still not fully understood. A number of experiments have been conducted on the ground and in space. These experiments, on the one hand, measure the extent of damage on blood markers; on the other hand, they aim to quantify the correlation between the dose and energy of the radiation particles and their ability to impair hematopoietic stem and progenitor function. We present a computer model based on a neural network that is intended to assess the relationship between dose, energy and number of hits on a particular cell, and the damage incurred by the human marrow cells. Calibration of the network is performed with the existing experimental data available in the bibliography. Different sources of ionizing radiation at different doses (0-90 cGy) and along different patterns of long-term exposure scenarios are simulated. Results are shown for a continuous variation of doses and are compared with specific data available in the literature. Some predictions are inferred for long-term scenarios of spaceflight, and the risk of jeopardizing a mission due to a major dysfunction of the bone marrow is calculated. The method has proved successful in reproducing specific experimental data. We also discuss the significance and validity of the predicted ionizing radiation effects in situations such as long-term missions for a continuous range of dose.
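A minimal sketch of this kind of dose-response regression network is shown below, using scikit-learn's MLPRegressor on synthetic placeholder data; the inputs, the damage index and all numbers are hypothetical stand-ins for the published calibration data, not the author's model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for the calibration data: dose (cGy), particle energy,
# and number of hits per cell mapped to a damage index in [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform([0.0, 1.0, 0.0], [90.0, 1000.0, 10.0], size=(200, 3))
y = 1.0 - np.exp(-0.02 * X[:, 0]) * (1.0 - 0.01 * X[:, 2])   # hypothetical response

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
model.fit(X, y)

# Predict a damage index over a continuous dose range at fixed energy and hits,
# mimicking the "continuous variation of doses" described in the abstract.
doses = np.linspace(0.0, 90.0, 10)
grid = np.column_stack([doses, np.full(10, 150.0), np.full(10, 2.0)])
print(np.round(model.predict(grid), 3))
```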
Open Source Paradigm: A Synopsis of The Cathedral and the Bazaar for Health and Social Care.
Benson, Tim
2016-07-04
Open source software (OSS) is becoming more fashionable in health and social care, although the ideas are not new. However, progress has been slower than many had expected. The purpose is to summarise the Free/Libre Open Source Software (FLOSS) paradigm in terms of what it is, how it impacts users and software engineers, and how it can work as a business model in the health and social care sectors. Much of this paper is a synopsis of Eric Raymond's seminal book The Cathedral and the Bazaar, which was the first comprehensive description of the open source ecosystem, set out in three long essays. Direct quotes from the book are used liberally, without reference to specific passages. The first part contrasts open and closed source approaches to software development and support. The second part describes the culture and practices of the open source movement. The third part considers business models. A key benefit of open source is that users can access and collaborate on improving the software if they wish. Closed source code may be regarded as a strategic business risk that may be unacceptable if there is an open source alternative. The sharing culture of the open source movement fits well with that of health and social care.
Engineering Venom’s Toxin-Neutralizing Antibody Fragments and Its Therapeutic Potential
Alvarenga, Larissa M.; Zahid, Muhammad; di Tommaso, Anne; Juste, Matthieu O.; Aubrey, Nicolas; Billiald, Philippe; Muzard, Julien
2014-01-01
Serum therapy remains the only specific treatment against envenoming, but anti-venoms are still prepared by fragmentation of polyclonal antibodies isolated from hyper-immunized horse serum. Most of these anti-venoms are considered to be efficient, but their production is tedious, and their use may be associated with adverse effects. Recombinant antibodies and smaller functional units are now emerging as credible alternatives and constitute a source of still unexploited biomolecules capable of neutralizing venoms. This review will be a walk through the technologies that have recently been applied leading to novel antibody formats with better properties in terms of homogeneity, specific activity and possible safety. PMID:25153256
Outlook for alternative energy sources. [aviation fuels
NASA Technical Reports Server (NTRS)
Card, M. E.
1980-01-01
Predictions are made concerning the development of alternative energy sources in the light of the present national energy situation. Particular emphasis is given to the impact of alternative fuels development on aviation fuels. The future outlook for aircraft fuels is that for the near term, there possibly will be no major fuel changes, but minor specification changes may be possible if supplies decrease. In the midterm, a broad cut fuel may be used if current development efforts are successful. As synfuel production levels increase beyond the 1990's there may be some mixtures of petroleum-based and synfuel products with the possibility of some shale distillate and indirect coal liquefaction products near the year 2000.
Committing to coal and gas: Long-term contracts, regulation, and fuel switching in power generation
NASA Astrophysics Data System (ADS)
Rice, Michael
Fuel switching in the electricity sector has important economic and environmental consequences. In the United States, the increased supply of gas during the last decade has led to substantial switching in the short term. Fuel switching is constrained, however, by the existing infrastructure. The power generation infrastructure, in turn, represents commitments to specific sources of energy over the long term. This dissertation explores fuel contracts as the link between short-term price response and long-term plant investments. Contracting choices enable power plant investments that are relationship-specific, often regulated, and face uncertainty. Many power plants are subject to both hold-up in investment and cost-of-service regulation. I find that capital bias is robust when considering either irreversibility or hold-up due to the uncertain arrival of an outside option. For sunk capital, the rental rate is inappropriate for determining capital bias. Instead, capital bias depends on the regulated rate of return, discount rate, and depreciation schedule. If policies such as emissions regulations increase fuel-switching flexibility, this can lead to capital bias. Cost-of-service regulation can shorten the duration of a long-term contract. From the firm's perspective, the existing literature provides limited guidance when bargaining and writing contracts for fuel procurement. I develop a stochastic programming framework to optimize long-term contracting decisions under both endogenous and exogenous sources of hold-up risk. These typically include policy changes, price shocks, availability of fuel, and volatility in derived demand. For price risks, the optimal contract duration is the moment when the expected benefits of the contract are just outweighed by the expected opportunity costs of remaining in the contract. I prove that imposing early renegotiation costs decreases contract duration. Finally, I provide an empirical approach to show how coal contracts can limit short-term fuel switching in power production. During the era prior to shale gas and electricity market deregulation, I do not find evidence that gas generation substituted for coal in response to fuel price changes. However, I do find evidence that coal plant operations are constrained by fuel contracts. As the min-take commitment to coal increases, changes to annual coal plant output decrease. My conclusions are robust in spite of bias due to the selective reporting of proprietary coal delivery contracts by utilities.
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
Smith, David S; Jones, Benedict C; Allan, Kevin
2013-08-01
The functionalist memory perspective predicts that information of adaptive value may trigger specific processing modes. It was recently demonstrated that women's memory is sensitive to cues of male sexual dimorphism (i.e., masculinity) that convey information of adaptive value for mate choice because they signal health and genetic quality, as well as personality traits important in relationship contexts. Here, we show that individual differences in women's mating strategies predict the effect of facial masculinity cues upon memory, strengthening the case for functional design within memory. Using the revised socio-sexual orientation inventory, Experiment 1 demonstrates that women pursuing a short-term, uncommitted mating strategy have enhanced source memory for men with exaggerated versus reduced masculine facial features, an effect that reverses in women who favor long-term committed relationships. The reversal in the direction of the effect indicates that it does not reflect the sex typicality of male faces per se. The same pattern occurred within women's source memory for women's faces, implying that the memory bias does not reflect the perceived attractiveness of faces per se. In Experiment 2, we reran the experiment using men's faces to establish the reliability of the core finding and replicated Experiment 1's results. Masculinity cues may therefore trigger a specific mode within women's episodic memory. We discuss why this mode may be triggered by female faces and its possible role in mate choice. In so doing, we draw upon the encoding specificity principle and the idea that episodic memory limits the scope of stereotypical inferences about male behavior.
A Cross-Lingual Similarity Measure for Detecting Biomedical Term Translations
Bollegala, Danushka; Kontonatsios, Georgios; Ananiadou, Sophia
2015-01-01
Bilingual dictionaries for technical terms such as biomedical terms are an important resource for machine translation systems as well as for humans who would like to understand a concept described in a foreign language. Often a biomedical term is first proposed in English and later it is manually translated to other languages. Despite the fact that there are large monolingual lexicons of biomedical terms, only a fraction of those term lexicons are translated to other languages. Manually compiling large-scale bilingual dictionaries for technical domains is a challenging task because it is difficult to find a sufficiently large number of bilingual experts. We propose a cross-lingual similarity measure for detecting most similar translation candidates for a biomedical term specified in one language (source) from another language (target). Specifically, a biomedical term in a language is represented using two types of features: (a) intrinsic features that consist of character n-grams extracted from the term under consideration, and (b) extrinsic features that consist of unigrams and bigrams extracted from the contextual windows surrounding the term under consideration. We propose a cross-lingual similarity measure using each of those feature types. First, to reduce the dimensionality of the feature space in each language, we propose prototype vector projection (PVP)—a non-negative lower-dimensional vector projection method. Second, we propose a method to learn a mapping between the feature spaces in the source and target language using partial least squares regression (PLSR). The proposed method requires only a small number of training instances to learn a cross-lingual similarity measure. The proposed PVP method outperforms popular dimensionality reduction methods such as the singular value decomposition (SVD) and non-negative matrix factorization (NMF) in a nearest neighbor prediction task. Moreover, our experimental results covering several language pairs such as English–French, English–Spanish, English–Greek, and English–Japanese show that the proposed method outperforms several other feature projection methods in biomedical term translation prediction tasks. PMID:26030738
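The mapping step of this approach can be sketched with off-the-shelf tools: character n-gram features for each term and a partial least squares regression from the source-language feature space to the target-language feature space. The snippet below is a toy illustration of that idea only; the seed term pairs are hypothetical, and the prototype vector projection (PVP) step is omitted.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical English -> French seed pairs of biomedical terms.
pairs = [("cardiomyopathy", "cardiomyopathie"),
         ("hypertension", "hypertension"),
         ("nephropathy", "nephropathie"),
         ("bronchitis", "bronchite")]
src_terms, tgt_terms = zip(*pairs)

# Intrinsic features: character n-grams of each term.
src_vec = CountVectorizer(analyzer="char", ngram_range=(2, 3))
tgt_vec = CountVectorizer(analyzer="char", ngram_range=(2, 3))
Xs = src_vec.fit_transform(src_terms).toarray()
Xt = tgt_vec.fit_transform(tgt_terms).toarray()

# Learn a mapping from the source feature space to the target feature space.
pls = PLSRegression(n_components=2)
pls.fit(Xs, Xt)

def translation_score(src_term, tgt_candidate):
    """Cosine similarity between the projected source term and a candidate."""
    xs = src_vec.transform([src_term]).toarray()
    xt = tgt_vec.transform([tgt_candidate]).toarray()
    return cosine_similarity(pls.predict(xs), xt)[0, 0]

print(translation_score("bronchitis", "bronchite"))
```

In practice the mapping would be trained on a much larger seed dictionary and combined with the contextual (extrinsic) features described above.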
Toward a common language for biobanking.
Fransson, Martin N; Rial-Sebbag, Emmanuelle; Brochhausen, Mathias; Litton, Jan-Eric
2015-01-01
To encourage the process of harmonization, the biobank community should support and use a common terminology. Relevant terms may be found in general thesauri for medicine, legal instruments or specific glossaries for biobanking. A comparison of the use of these sources has so far not been conducted and would be a useful instrument to further promote harmonization and data sharing. Thus, the purpose of the present study was to investigate the preference of definitions important for sharing biological samples and data. Definitions for 10 terms – [human] biobank, sample/specimen, sample collection, study, aliquot, coded, identifying information, anonymised, personal data and informed consent – were collected from several sources. A web-based questionnaire was sent to 560 European individuals working with biobanks, asking them to select their preferred definition for the terms. A total of 123 people participated in the survey, giving a response rate of 23%. The result was evaluated from four aspects: scope of definitions, potential regional differences, differences in semantics and definitions in the context of ontologies, guided by comments from responders. Indicative from the survey is the risk of focusing only on the research aspect of biobanking in definitions. Hence, it is recommended that important terms should be formulated in such a way that all areas of biobanking are covered, to improve the bridges between research and clinical application. Since several of the terms investigated here can also be found in a legal context, which may differ between countries, establishing how a proper definition adheres to the law is also crucial.
Sound source localization and segregation with internally coupled ears: the treefrog model
Christensen-Dalsgaard, Jakob
2016-01-01
Acoustic signaling plays key roles in mediating many of the reproductive and social behaviors of anurans (frogs and toads). Moreover, acoustic signaling often occurs at night, in structurally complex habitats, such as densely vegetated ponds, and in dense breeding choruses characterized by high levels of background noise and acoustic clutter. Fundamental to anuran behavior is the ability of the auditory system to determine accurately the location from where sounds originate in space (sound source localization) and to assign specific sounds in the complex acoustic milieu of a chorus to their correct sources (sound source segregation). Here, we review anatomical, biophysical, neurophysiological, and behavioral studies aimed at identifying how the internally coupled ears of frogs contribute to sound source localization and segregation. Our review focuses on treefrogs in the genus Hyla, as they are the most thoroughly studied frogs in terms of sound source localization and segregation. They also represent promising model systems for future work aimed at understanding better how internally coupled ears contribute to sound source localization and segregation. We conclude our review by enumerating directions for future research on these animals that will require the collaborative efforts of biologists, physicists, and roboticists. PMID:27730384
Reflections on the role of open source in health information system interoperability.
Sfakianakis, S; Chronaki, C E; Chiarugi, F; Conforti, F; Katehakis, D G
2007-01-01
This paper reflects on the role of open source in health information system interoperability. Open source is a driving force in computer science research and the development of information systems. It facilitates the sharing of information and ideas, enables evolutionary development and open collaborative testing of code, and broadens the adoption of interoperability standards. In health care, information systems have been developed largely ad hoc following proprietary specifications and customized design. However, the wide deployment of integrated services such as Electronic Health Records (EHRs) over regional health information networks (RHINs) relies on interoperability of the underlying information systems and medical devices. This reflection is built on the experiences of the PICNIC project that developed shared software infrastructure components in open source for RHINs and the OpenECG network that offers open source components to lower the implementation cost of interoperability standards such as SCP-ECG, in electrocardiography. Open source components implementing standards and a community providing feedback from real-world use are key enablers of health care information system interoperability. Investing in open source is investing in interoperability and a vital aspect of a long term strategy towards comprehensive health services and clinical research.
Globalization, women's migration, and the long-term-care workforce.
Browne, Colette V; Braun, Kathryn L
2008-02-01
With the aging of the world's population comes the rising need for qualified direct long-term-care (DLTC) workers (i.e., those who provide personal care to frail and disabled older adults). Developed nations are increasingly turning to immigrant women to fill these needs. In this article, we examine the impact of three global trends – population aging, globalization, and women's migration – on the supply and demand for DLTC workers in the United States. Following an overview of these trends, we identify three areas with embedded social justice issues that are shaping the DLTC workforce in the United States, with a specific focus on immigrant workers in these settings. These include world poverty and economic inequalities, the feminization and colorization of labor (especially in long-term care), and empowerment and women's rights. We conclude with a discussion of the contradictory effects that both population aging and globalization have on immigrant women, source countries, and the long-term-care workforce in the United States. We raise a number of policy, practice, and research implications and questions. For policy makers and long-term-care administrators in receiver nations such as the United States, the meeting of DLTC worker needs with immigrants may result in greater access to needed employees but also in the continued devaluation of eldercare as a profession. Source (supply) nations must balance the real and potential economic benefits of remittances from women who migrate for labor with the negative consequences of disrupting family care traditions and draining the long-term-care workforce of those countries.
Prioritizing environmental justice and equality: diesel emissions in southern California.
Marshall, Julian D; Swor, Kathryn R; Nguyen, Nam P
2014-04-01
Existing environmental policies aim to reduce emissions but lack standards for addressing environmental justice. Environmental justice research documents disparities in exposure to air pollution; however, little guidance currently exists on how to make improvements or on how specific emission-reduction scenarios would improve or deteriorate environmental justice conditions. Here, we quantify how emission reductions from specific sources would change various measures of environmental equality and justice. We evaluate potential emission reductions for fine diesel particulate matter (DPM) in Southern California for five sources: on-road mobile, off-road mobile, ships, trains, and stationary. Our approach employs state-of-the-science dispersion and exposure models. We compare four environmental goals: impact, efficiency, equality, and justice. Results indicate potential trade-offs among those goals. For example, reductions in train emissions produce the greatest improvements in terms of efficiency, equality, and justice, whereas off-road mobile source reductions can have the greatest total impact. Reductions in on-road emissions produce improvements in impact, equality, and justice, whereas emission reductions from ships would widen existing population inequalities. Results are similar for complex versus simplified exposure analyses. The approach employed here could usefully be applied elsewhere to evaluate opportunities for improving environmental equality and justice in other locations.
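To illustrate how such equality metrics respond to a reduction scenario, the sketch below computes a population-weighted mean exposure and a weighted Gini coefficient of exposure before and after a hypothetical emission cut; both the metric choice and all numbers are illustrative assumptions, not the study's own models or data.

```python
import numpy as np

def weighted_gini(exposure, population):
    """Population-weighted Gini coefficient of exposure (a generic inequality
    measure standing in for the study's own equality/justice metrics)."""
    x = np.asarray(exposure, float)
    w = np.asarray(population, float)
    mean = np.average(x, weights=w)
    diff = np.abs(x[:, None] - x[None, :])
    return np.sum(w[:, None] * w[None, :] * diff) / (2.0 * w.sum() ** 2 * mean)

# Hypothetical block-group DPM exposures (ug/m3) and populations, before and
# after a candidate emission-reduction scenario that cuts most where exposure is high.
pop = np.array([1000, 2000, 1500, 500])
base = np.array([1.2, 0.8, 2.0, 3.0])
scen = base * np.array([0.9, 0.9, 0.6, 0.6])

for label, x in [("baseline", base), ("scenario", scen)]:
    print(label, "mean exposure:", round(np.average(x, weights=pop), 3),
          "Gini:", round(weighted_gini(x, pop), 3))
```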
Bosire Onyancha, Omwoyo
2008-05-01
As channels of communicating HIV/AIDS research information, serial publications and particularly journals are increasingly used in response to the pandemic. The last few decades have witnessed a proliferation of sources of HIV/AIDS-related information, bringing many challenges to collection-development librarians as well as to researchers. This study uses an informetric approach to examine the growth, productivity and scientific impact of these sources, during the period 1980 to 2005, and especially to measure performance in the publication and dissemination of HIV/AIDS research about or from eastern or southern Africa. Data were collected from MEDLINE, Science Citation Index (SCI), Social Sciences Citation Index (SSCI), and Ulrich's Periodical Directory. The analysis used Sitkis version 1.5, Microsoft Office Access, Microsoft Office Excel, Bibexcel, and Citespace version 2.0.1. The specific objectives were to identify the number of sources of HIV/AIDS-related information that have been published in the region, the coverage of these in key bibliographic databases, the most commonly used publication type for HIV/AIDS research, the countries in which the sources are published, the sources' productivity in terms of numbers of papers and citations, the most influential sources, the subject coverage of the sources, and the core sources of HIV/AIDS-information.
Bayesian estimation of a source term of radiation release with approximately known nuclide ratios
NASA Astrophysics Data System (ADS)
Tichý, Ondřej; Šmídl, Václav; Hofman, Radek
2016-04-01
We are concerned with estimation of a source term in case of an accidental release from a known location, e.g. a power plant. Usually, the source term of an accidental release of radiation comprises a mixture of nuclides. The gamma dose rate measurements do not provide direct information on the source term composition. However, physical properties of the respective nuclides (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g. from the known reactor inventory. The proposed method is based on a linear inverse model where the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned and further regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to uncertainty in the ratios, the diagonal elements of the covariance matrix are considered to be unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following a Bayesian approach, we estimate all parameters of the model from the data, so that y, M, and the known ratios are the only inputs of the method. Since the inference of the model is intractable, we follow the Variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on a simulated 6 hour power plant release where 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with the method assuming unknown nuclide ratios is given to demonstrate the usefulness of the proposed approach. This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
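A heavily simplified, non-Bayesian stand-in for this estimator is a non-negative, Tikhonov-regularised least-squares solve of y = Mx in which the prior spread of each source-term element is scaled by the approximately known nuclide ratios. The sketch below (function name and scaling are assumptions) shows only that regularisation idea, not the Variational Bayes inference described in the abstract.

```python
import numpy as np
from scipy.optimize import lsq_linear

def map_source_term(y, M, ratio_prior, prior_scale, noise_std):
    """Non-negative regularised least-squares estimate of x in y = M x.
    ratio_prior: expected relative magnitudes of the source-term elements,
    used to set the prior standard deviation of each element."""
    n = M.shape[1]
    prior_std = prior_scale * np.asarray(ratio_prior, dtype=float)
    # Stack the data misfit and the zero-mean prior penalty into one LS problem.
    A = np.vstack([M / noise_std, np.diag(1.0 / prior_std)])
    b = np.concatenate([y / noise_std, np.zeros(n)])
    return lsq_linear(A, b, bounds=(0.0, np.inf)).x

# Toy example: 6 observations, 3 unknowns, hypothetical 4:2:1 release ratios.
rng = np.random.default_rng(0)
M = rng.random((6, 3))
x_true = np.array([4.0, 2.0, 1.0])
y = M @ x_true + rng.normal(0.0, 0.05, 6)
print(map_source_term(y, M, ratio_prior=[4, 2, 1], prior_scale=5.0, noise_std=0.05))
```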
Long-term changes after brief dynamic psychotherapy: symptomatic versus dynamic assessments.
Høglend, P; Sørlie, T; Sørbye, O; Heyerdahl, O; Amlo, S
1992-08-01
Dynamic change in psychotherapy, as measured by theory-related or mode-specific instruments, has been criticized for being too intercorrelated with symptomatic change measures. In this study, long-term changes after brief dynamic psychotherapy were studied in 45 moderately disturbed neurotic patients using a reliable outcome battery. The factor structure of all the change variables suggested that they tapped 2 distinct and stable sources of variance: dynamic and symptomatic change. The categories of overall dynamic change were different from categories of change on the Global Assessment Scale. A small systematic difference was also found between the categories of overall dynamic change and the categories of target complaints change, due to false solutions of dynamic conflicts.
Modeling the contribution of point sources and non-point sources to Thachin River water pollution.
Schaffner, Monika; Bader, Hans-Peter; Scheidegger, Ruth
2009-08-15
Major rivers in developing and emerging countries increasingly suffer from severe degradation of water quality. The current study uses a mathematical Material Flow Analysis (MMFA) as a complementary approach to address the degradation of river water quality due to nutrient pollution in the Thachin River Basin in Central Thailand. This paper gives an overview of the origins and flow paths of the various point and non-point pollution sources in the Thachin River Basin (in terms of nitrogen and phosphorus) and quantifies their relative importance within the system. The key parameters influencing the main nutrient flows are determined and possible mitigation measures discussed. The results show that aquaculture (as a point source) and rice farming (as a non-point source) are the key nutrient sources in the Thachin River Basin. Other point sources such as pig farms, households and industries, which were previously cited as the most relevant pollution sources in terms of organic pollution, play less significant roles in comparison. This order of importance shifts when considering the model results at the provincial level. Crosschecks with secondary data and field studies confirm the plausibility of our simulations. Specific nutrient loads for the pollution sources are derived; these can be used for a first broad quantification of nutrient pollution in comparable river basins. Based on an identification of the sensitive model parameters, possible mitigation scenarios are determined and their potential to reduce the nutrient load evaluated. A comparison of simulated nutrient loads with measured nutrient concentrations shows that nutrient retention in the river system may be significant. Sedimentation in the slow-flowing surface water network as well as nitrogen emission to the air from the warm, oxygen-deficient waters are certainly partly responsible, but wetlands along the river banks could also play an important role as nutrient sinks.
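A material-flow style nutrient budget of this kind boils down to summing source loads and applying a retention factor; the sketch below uses hypothetical placeholder values, not results from the Thachin study, to illustrate how relative source importance and retention interact.

```python
# Hypothetical per-source nitrogen loads (t N / yr) and a single retention factor.
loads_tN_per_yr = {
    "aquaculture": 4200,
    "rice farming": 3800,
    "pig farms": 900,
    "households": 700,
    "industry": 400,
}
retention = 0.35   # fraction retained by sedimentation, denitrification, wetlands

gross = sum(loads_tN_per_yr.values())
delivered = gross * (1.0 - retention)
shares = {source: load / gross for source, load in loads_tN_per_yr.items()}

print(f"gross load {gross} t/yr, delivered downstream ~{delivered:.0f} t/yr")
print("dominant source in this illustration:", max(shares, key=shares.get))
```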
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2014 CFR
2014-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2014-01-01 2014-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2012 CFR
2012-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2012-01-01 2012-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2010 CFR
2010-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2010-01-01 2010-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2013 CFR
2013-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2013-01-01 2013-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2011 CFR
2011-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2011-01-01 2011-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
Implication of using different carbon sources for denitrification in wastewater treatments.
Cherchi, Carla; Onnis-Hayden, Annalisa; El-Shawabkeh, Ibrahim; Gu, April Z
2009-08-01
Application of external carbon sources for denitrification becomes necessary for wastewater treatment plants that have to meet very stringent effluent nitrogen limits (e.g., 3 to 5 mg TN/L). In this study, we evaluated and compared three carbon sources – MicroC (Environmental Operating Solutions, Bourne, Massachusetts), methanol, and acetate – in terms of their denitrification rates and kinetics, effect on overall nitrogen removal performance, and microbial community structure of carbon-specific denitrifying enrichments. Denitrification rates and kinetics were determined with both acclimated and non-acclimated biomass, obtained from laboratory-scale sequencing batch reactor systems or full-scale plants. The results demonstrate the feasibility of the use of MicroC for denitrification processes, with a maximum denitrification rate (k_dmax) of 6.4 mgN/gVSS·h and an observed yield of 0.36 mgVSS/mgCOD. Comparable maximum nitrate uptake rates were found with methanol, while acetate showed a maximum denitrification rate nearly twice as high as the others. The maximum growth rates measured at 20 degrees C for MicroC and methanol were 3.7 and 1.2 day^-1, respectively. The implications resulting from the differences in the denitrification rates and kinetics of different carbon sources on the full-scale nitrogen removal performance, under various configurations and operational conditions, were assessed using BioWin (EnviroSim Associates, Ltd., Flamborough, Ontario, Canada) simulations for both pre- and post-denitrification systems. Examination of microbial population structures using Automated Ribosomal Intergenic Spacer Analysis (ARISA) throughout the study period showed dynamic temporal changes and distinct microbial community structures of different carbon-specific denitrifying cultures. The ability of a specific carbon-acclimated denitrifying population to instantly use another carbon source was also investigated, and the chemical-structure-associated behavior patterns observed suggested that the complex biochemical pathways/enzymes involved in the denitrification process depended on the carbon sources used.
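The observed yield can be translated into a rough carbon-dose estimate using the standard textbook relation for external-carbon demand in denitrification (a generic calculation, not a result from this study): COD required per g NO3-N removed is approximately 2.86 / (1 - 1.42*Y), where Y is the observed yield in gVSS/gCOD.

```python
def cod_per_nitrogen(yield_vss_per_cod, cod_per_vss=1.42, n_equiv=2.86):
    """Textbook estimate of external-carbon demand for denitrification:
    g COD per g NO3-N = 2.86 / (1 - 1.42 * Y)."""
    return n_equiv / (1.0 - cod_per_vss * yield_vss_per_cod)

# Using the MicroC yield reported in the study (0.36 mgVSS/mgCOD):
print(f"~{cod_per_nitrogen(0.36):.1f} g COD per g NO3-N removed")
```

A higher-yield carbon source therefore diverts more of the dosed COD into biomass and requires a larger dose per unit of nitrate removed, one reason the rate and yield comparison above matters for full-scale operating cost.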
NASA Technical Reports Server (NTRS)
Orth, Charles D.; Klein, Gail; Sercel, Joel; Hoffman, Nate; Murray, Kathy; Chang-Diaz, Franklin
1987-01-01
Inertial Confinement Fusion (ICF) is an attractive engine power source for interplanetary manned spacecraft, especially for near-term missions requiring minimum flight duration, because ICF has inherent high power-to-mass ratios and high specific impulses. We have developed a new vehicle concept called VISTA that uses ICF and is capable of round-trip manned missions to Mars in 100 days using A.D. 2020 technology. We describe VISTA's engine operation, discuss associated plasma issues, and describe the advantages of DT fuel for near-term applications. Although ICF is potentially superior to non-fusion technologies for near-term interplanetary transport, the performance capabilities of VISTA cannot be meaningfully compared with those of magnetic-fusion systems because of the lack of a comparable study of the magnetic-fusion systems. We urge that such a study be conducted.
NASA Astrophysics Data System (ADS)
Sheffield, J.
1981-08-01
For a specific configuration of magnetic field and plasma to be economically attractive as a commercial source of energy, it must contain a high-pressure plasma in a stable fashion while thermally isolating the plasma from the walls of the containment vessel. The tokamak magnetic configuration is presently the most successful in terms of reaching the considered goals. Tokamaks were developed in the USSR in a program initiated in the mid-1950s. By the early 1970s tokamaks were operating not only in the USSR but also in the U.S., Australia, Europe, and Japan. The advanced state of the tokamak program is indicated by the fact that it is used as a testbed for generic fusion development - for auxiliary heating, diagnostics, materials - as well as for specific tokamak advancement. This has occurred because it is the most economic source of a large, reproducible, hot, dense plasma. The basic tokamak is considered along with tokamak improvements, impurity control, additional heating, particle and power balance in a tokamak, aspects of microscopic transport, and macroscopic stability.
The National Geographic Names Data Base: Phase II instructions
Orth, Donald J.; Payne, Roger L.
1987-01-01
not recorded on topographic maps be added. The systematic collection of names from other sources, including maps, charts, and texts, is termed Phase II. In addition, specific types of features not compiled during Phase I are encoded and added to the data base. Other names of importance to researchers and users, such as historical and variant names, are also included. The rules and procedures for Phase II research, compilation, and encoding are contained in this publication.
1979-12-01
required of the Army aviator. The successful accomplishment of many of these activities depends upon the aviator's ability to extract information from maps. [The remainder of this excerpt is a garbled task listing covering NOE cruise functions: VB1 Determine Position, VB2 Crew Coordination (Topographic), VB3 Radio Communication, termination, and post-flight debriefing, with information requirements, specifics, sources, and comments tabulated for each function.]
Singh, Nandita; Murari, Vishnu; Kumar, Manish; Barman, S C; Banerjee, Tirthankar
2017-04-01
Fine particulates (PM2.5) constitute the dominant proportion of airborne particulates and have often been associated with human health disorders, changes in regional climate, the hydrological cycle and, more recently, food security. Intrinsic properties of particulates are a direct function of their sources. This motivates a comprehensive review of PM2.5 sources over South Asia, which in turn may be valuable for developing emission control strategies. Particulate source apportionment (SA) through receptor models is an existing tool to quantify the contribution of particulate sources. A review of 51 SA studies was performed, of which 48 (94%) appeared within the span 2007-2016. More than half of the SA studies (55%) were concentrated over a few typical urban stations (Delhi, Dhaka, Mumbai, Agra and Lahore). Due to the lack of local particulate source profiles and emission inventories, positive matrix factorization and principal component analysis (62% of studies) were the primary choices, followed by chemical mass balance (CMB, 18%). Metallic species were most regularly used as source tracers, while use of organic molecular markers and gas-to-particle conversion was minimal. Among all the SA sites, vehicular emissions (mean ± sd: 37 ± 20%) emerged as the most dominant PM2.5 source, followed by industrial emissions (23 ± 16%), secondary aerosols (22 ± 12%) and natural sources (20 ± 15%). Vehicular emissions (39 ± 24%) were also identified as the dominant source for highly polluted sites (PM2.5 > 100 μg m-3, n = 15), while site-specific influences of industrial, secondary aerosol and natural sources, alone or in combination, were recognized. Source-specific trends varied considerably by region and season. Both natural and industrial sources were most influential over Pakistan and Afghanistan, while over the Indo-Gangetic Plain vehicular, natural and industrial emissions appeared dominant. Vehicular emission was the single dominant source over the southern part, while over Bangladesh vehicular, biomass burning and industrial sources were all significant. Copyright © 2016 Elsevier Ltd. All rights reserved.
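Positive matrix factorization, the most common choice in these studies, is closely related to non-negative matrix factorization; the sketch below uses scikit-learn's NMF on a random placeholder speciation matrix purely to illustrate the decomposition into source contributions and source profiles. Real PMF additionally weights each observation by its measurement uncertainty, which this toy omits.

```python
import numpy as np
from sklearn.decomposition import NMF

# Placeholder data: X (samples x chemical species), e.g. daily PM2.5 speciation.
rng = np.random.default_rng(0)
X = rng.random((120, 15))

model = NMF(n_components=4, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)      # sample-wise source contributions
F = model.components_           # source profiles over the 15 species

# Average percentage contribution of each factor; in a real study the factors
# would be matched to sources (vehicles, industry, secondary aerosol, dust)
# using tracer species in the profiles F.
contrib = G.sum(axis=0) / G.sum()
print(np.round(100 * contrib, 1))
```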
Medical Subject Headings (MeSH) for indexing and retrieving open-source healthcare data.
Marc, David T; Khairat, Saif S
2014-01-01
The US federal government initiated the Open Government Directive where federal agencies are required to publish high value datasets so that they are available to the public. Data.gov and the community site Healthdata.gov were initiated to disperse such datasets. However, data searches and retrieval for these sites are keyword driven and severely limited in performance. The purpose of this paper is to address the issue of extracting relevant open-source data by proposing a method of adopting the MeSH framework for indexing and data retrieval. A pilot study was conducted to compare the performance of traditional keywords to MeSH terms for retrieving relevant open-source datasets related to "mortality". The MeSH framework resulted in greater sensitivity with comparable specificity to the keyword search. MeSH showed promise as a method for indexing and retrieving data, yet future research should conduct a larger scale evaluation of the performance of the MeSH framework for retrieving relevant open-source healthcare datasets.
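The comparison of MeSH-based and keyword-based retrieval comes down to confusion-matrix arithmetic over a labelled set of datasets; the helper below is a generic illustration with hypothetical numbers, not the study's evaluation code.

```python
def sensitivity_specificity(retrieved, relevant, corpus):
    """Retrieval performance of an indexing scheme (e.g. MeSH vs. keyword):
    sensitivity = relevant datasets retrieved / all relevant datasets,
    specificity = irrelevant datasets correctly excluded / all irrelevant."""
    retrieved, relevant, corpus = set(retrieved), set(relevant), set(corpus)
    tp = len(retrieved & relevant)
    fn = len(relevant - retrieved)
    fp = len(retrieved - relevant)
    tn = len(corpus - retrieved - relevant)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical example: 10-dataset corpus, 4 datasets relevant to "mortality",
# and a query that retrieves 3 of them plus 1 irrelevant dataset.
corpus = range(10)
relevant = {0, 1, 2, 3}
print(sensitivity_specificity({0, 1, 2, 5}, relevant, corpus))   # (0.75, ~0.83)
```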
Recent Approaches to Estimate Associations Between Source-Specific Air Pollution and Health.
Krall, Jenna R; Strickland, Matthew J
2017-03-01
Estimating health effects associated with source-specific exposure is important for better understanding how pollution impacts health and for developing policies to better protect public health. Although epidemiologic studies of sources can be informative, these studies are challenging to conduct because source-specific exposures (e.g., particulate matter from vehicles) often are not directly observed and must be estimated. We reviewed recent studies that estimated associations between pollution sources and health to identify methodological developments designed to address important challenges. Notable advances in epidemiologic studies of sources include approaches for (1) propagating uncertainty in source estimation into health effect estimates, (2) assessing regional and seasonal variability in emissions sources and source-specific health effects, and (3) addressing potential confounding in estimated health effects. Novel methodological approaches to address challenges in studies of pollution sources, particularly evaluation of source-specific health effects, are important for determining how source-specific exposure impacts health.
NASA Technical Reports Server (NTRS)
Longwell, J. P.; Grobman, J.
1978-01-01
In view of the anticipated impossibility of providing, on a long-term basis, liquid fuels derived from petroleum, an investigation has been conducted with the objective of assessing the suitability of jet fuels made from oil shale and coal and of developing a data base which will allow optimization of future fuel characteristics, taking into account the energy efficiency of manufacture and the tradeoffs in aircraft and engine design. The properties of future aviation fuels are examined and proposed solutions to problems of alternative fuels are discussed. Attention is given to the refining of jet fuel to current specifications, the control of fuel thermal stability, and combustor technology for the use of broad-specification fuels. The first solution is to continue to develop the necessary technology at the refinery to produce specification jet fuels regardless of the crude source.
Design requirements for a stand alone EUV interferometer
NASA Astrophysics Data System (ADS)
Michallon, Ph.; Constancias, C.; Lagrange, A.; Dalzotto, B.
2008-03-01
EUV lithography is expected to be inserted for the 32/22 nm nodes, with possible extension below. EUV resist availability remains one of the main issues to be resolved. There is an urgent need to provide suitable tools to accelerate resist development and to achieve resolution, LER and sensitivity specifications simultaneously. An interferometric lithography tool offers advantages over a conventional EUV exposure tool. It allows the evaluation of resists free from the deficiencies of the optics and mask which limit the achievable resolution. Traditionally, a dedicated beamline from a synchrotron, with limited access, is used as the light source in EUV interference lithography. This paper identifies the key technological challenges in developing a stand-alone EUV interferometer using a compact EUV source. It describes the theoretical solutions adopted and especially examines their feasibility according to available technologies. EUV sources available on the market have been evaluated in terms of power level, source size, spatial coherence, dose uniformity, accuracy, stability and reproducibility. According to the EUV source characteristics, several optical designs were studied (simple or double gratings). For each of these solutions, the source and collimation optic specifications have been determined. To reduce the exposure time, a new grating technology will also be presented, allowing a significant increase in the transmission system efficiency. The optical grating designs were studied to allow multiple pitches to be printed in the same exposure without any focus adjustment. Finally, the micromechanical system supporting the gratings was studied, integrating the issues due to the vacuum environment, alignment capability, motion precision, automation and metrology needed to ensure the required placement control between gratings and wafer. A similar study was carried out for the collimation-optics mechanical support, which depends on the source characteristics.
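As a back-of-the-envelope check on what such a tool prints, two-beam interference gives a fringe period of λ/(2 sin θ), and for the common configuration in which the +1 and -1 diffraction orders of identical gratings are recombined, the recorded period is half the grating pitch. The small calculation below illustrates both relations; the numerical inputs are arbitrary examples, not specifications from this design.

```python
import math

def fringe_period(wavelength_nm, half_angle_deg):
    """Two-beam interference period: p = lambda / (2 * sin(theta))."""
    return wavelength_nm / (2.0 * math.sin(math.radians(half_angle_deg)))

def printed_pitch_from_gratings(grating_pitch_nm):
    """When the +1/-1 orders of identical gratings interfere, the recorded
    fringe period is half the grating pitch."""
    return grating_pitch_nm / 2.0

print(f"{fringe_period(13.5, 6.0):.1f} nm period at 13.5 nm wavelength, 6 deg half-angle")
print(f"{printed_pitch_from_gratings(88.0):.1f} nm printed pitch from 88 nm gratings")
```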
Follow-up of high energy neutrinos detected by the ANTARES telescope
NASA Astrophysics Data System (ADS)
Mathieu, Aurore
2016-04-01
The ANTARES telescope is well-suited to detect high energy neutrinos produced in astrophysical transient sources as it can observe a full hemisphere of the sky with a high duty cycle. Potential neutrino sources are gamma-ray bursts, core-collapse supernovae and flaring active galactic nuclei. To enhance the sensitivity of ANTARES to such sources, a detection method based on follow-up observations from the neutrino direction has been developed. This program, denoted as TAToO, includes a network of robotic optical telescopes (TAROT, Zadko and MASTER) and the Swift-XRT telescope, which are triggered when an "interesting" neutrino is detected by ANTARES. A follow-up of special events, such as neutrino doublets in time/space coincidence or a single neutrino having a very high energy or in the specific direction of a local galaxy, significantly improves the perspective for the detection of transient sources. The analysis of early and long term follow-up observations to search for fast and slowly varying transient sources, respectively, has been performed and the results covering optical and X-ray data are presented in this contribution.
NASA Astrophysics Data System (ADS)
Prikryl, Richard
2016-04-01
Prior to the industrial era, the quarrying of natural stone was primarily local (in most cases the stone was used very close to its extraction site), small scale, occasional (the stone was extracted only when needed for a specific construction; permanent operations were much rarer than nowadays) but long-term (quarrying activity at a single site often persisted over centuries). The landscape affected by such quarrying (as we can observe it at present) gained numerous new values (e.g., increased morphological contrast, succession of wildlife habitat, etc.) that are often appreciated more than the presence of the valuable mineral resource - natural stone. If these sites were declared natural monuments or gained another type of environmental protection, any further extraction of natural stone is prohibited. However, if the specific site was used for extraction of natural stone for construction that later became a cultural heritage object, an antagonistic perception of the site may appear - the site might be protected as a geomorphosite but, at the same time, it can be the source of unique natural stone required for the restoration of cultural heritage objects. This paper, along with the above-mentioned basic relationships, provides some real examples of the difficulties of finding an extractable source of natural stone for the restoration of iconic cultural heritage objects - specifically the search for sources of Carboniferous arkoses to be used for replacement of the decayed ashlars of the Gothic Charles Bridge in Prague (Czech Republic).
Bendixen, Alexandra; Scharinger, Mathias; Strauß, Antje; Obleser, Jonas
2014-04-01
Speech signals are often compromised by disruptions originating from external (e.g., masking noise) or internal (e.g., inaccurate articulation) sources. Speech comprehension thus entails detecting and replacing missing information based on predictive and restorative neural mechanisms. The present study targets predictive mechanisms by investigating the influence of a speech segment's predictability on early, modality-specific electrophysiological responses to this segment's omission. Predictability was manipulated in simple physical terms in a single-word framework (Experiment 1) or in more complex semantic terms in a sentence framework (Experiment 2). In both experiments, final consonants of the German words Lachs ([laks], salmon) or Latz ([lats], bib) were occasionally omitted, resulting in the syllable La ([la], no semantic meaning), while brain responses were measured with multi-channel electroencephalography (EEG). In both experiments, the occasional presentation of the fragment La elicited a larger omission response when the final speech segment had been predictable. The omission response occurred ∼125-165 msec after the expected onset of the final segment and showed characteristics of the omission mismatch negativity (MMN), with generators in auditory cortical areas. Suggestive of a general auditory predictive mechanism at work, this main observation was robust against varying source of predictive information or attentional allocation, differing between the two experiments. Source localization further suggested the omission response enhancement by predictability to emerge from left superior temporal gyrus and left angular gyrus in both experiments, with additional experiment-specific contributions. These results are consistent with the existence of predictive coding mechanisms in the central auditory system, and suggestive of the general predictive properties of the auditory system to support spoken word recognition. Copyright © 2014 Elsevier Ltd. All rights reserved.
Design, production, and testing of field effect transistors. [cryogenic MOSFETS
NASA Technical Reports Server (NTRS)
Sclar, N.
1982-01-01
Cryogenic MOSFETs (CRYOFETs), specifically designed for low temperature preamplifier application with infrared extrinsic detectors, were produced and comparatively tested with p-channel MOSFETs under matched conditions. The CRYOFETs exhibit lower voltage thresholds, high source-follower gains at lower bias voltage, and lower dc offset source voltage. The noise of the CRYOFET is found to be 2 to 4 times greater than the MOSFET, with a correspondingly lower figure of merit (which is established for source-follower amplifiers). The device power dissipation at a gain of 0.98 is some two orders of magnitude lower than for the MOSFET. Further, CRYOFETs are free of low temperature I-V characteristic hysteresis and balky conduction turn-on effects and operate effectively in the 2.4 to 20 K range. These devices show promise for use on long-duration sensor missions and for on-focal-plane signal processing at low temperatures.
The legal system. Part 2: it's not just for lawyers.
Boylan-Kemp, Jo
This is the second part of a two-part article intended to provide an introduction to the foundation elements of the English legal system. The 'English legal system' is a rather generic term that is often used to refer to the different sources of law and the court system in which the law is practised. Students of law will study the English legal system as a specific topic, but it is equally important for those who work within a profession that is regulated by the law (as nursing is) to develop an understanding of the legal boundaries within which such a profession works. This particular article follows on from our earlier consideration of the cornerstones of the legal system and looks at the different sources of law that can be found within the English legal system, focusing particularly on the concepts of common law, equity, and the impact of European sources on domestic law.
Exploring the Earth's crust: history and results of controlled-source seismology
Prodehl, Claus; Mooney, Walter D.
2012-01-01
This volume contains a comprehensive, worldwide history of seismological studies of the Earth’s crust using controlled sources from 1850 to 2005. Essentially all major seismic projects on land and the most important oceanic projects are covered. The time period 1850 to 1939 is presented as a general synthesis, and from 1940 onward the history and results are presented in separate chapters for each decade, with the material organized by geographical region. Each chapter highlights the major advances achieved during that decade in terms of data acquisition, processing technology, and interpretation methods. For all major seismic projects, the authors provide specific details on field observations, interpreted crustal cross sections, and key references. They conclude with global and continental-scale maps of all field measurements and interpreted Moho contours. An accompanying DVD contains important out-of-print publications and an extensive collection of controlled-source data, location maps, and crustal cross sections.
Organic aerosols over Indo-Gangetic Plain: Sources, distributions and climatic implications
NASA Astrophysics Data System (ADS)
Singh, Nandita; Mhawish, Alaa; Deboudt, Karine; Singh, R. S.; Banerjee, Tirthankar
2017-05-01
Organic aerosol (OA) constitutes a dominant fraction of airborne particulates over the Indo-Gangetic Plain (IGP), especially during post-monsoon and winter. Its exposure has been associated with adverse health effects, while there is evidence of its interference with Earth's radiation balance and cloud condensation (CC), with possible alteration of the hydrological cycle. The presence and effects of OA therefore link it directly with food security and thereby with sustainability issues. In this context, the atmospheric chemistry governing the formation, volatility and aging of primary OA (POA) and secondary OA (SOA) is reviewed with specific reference to the IGP. Systematic reviews of the science of OA sources, evolution and climate perturbations are presented, drawing on databases collected from 82 publications available throughout the IGP up to 2016. Both gaseous and aqueous phase chemical reactions were studied in terms of their potential to form SOA. Efforts were made to recognize the regional variation of OA, its chemical constituents and sources throughout the IGP, and inferences were made on its possible impacts on regional air quality. Mass fractions of OA in airborne particulates showed spatial variation: Lahore (37 and 44% in fine and coarse fractions, respectively), Patiala (28 and 37%), Delhi (25 and 38%), Kanpur (24 and 30%), Kolkata (11 and 21%) and Dhaka. Source apportionment studies indicate biomass burning, coal combustion and vehicular emissions as the predominant OA sources. However, the sources show considerable seasonal variation, with dominance of gasoline and diesel emissions during summer and of coal- and biomass-based emissions during winter and post-monsoon. Crop residue burning over the upper IGP has also frequently been held responsible for massive OA emission, mostly characterized by its hygroscopic nature and thus having the potential to act as CC nuclei. Finally, the climatic implications of particulate-bound OA are discussed in terms of its interaction with the radiation balance.
NASA Astrophysics Data System (ADS)
Kis, A.; Lemperger, I.; Wesztergom, V.; Menvielle, M.; Szalai, S.; Novák, A.; Hada, T.; Matsukiyo, S.; Lethy, A. M.
2016-12-01
The magnetotelluric method is widely applied for the investigation of subsurface structures by imaging the spatial distribution of electric conductivity. The method is based on the experimental determination of the surface electromagnetic impedance tensor (Z) from surface geomagnetic and telluric recordings in two perpendicular orientations. In practical exploration, accurate estimation of Z necessitates the application of robust statistical methods for two reasons: 1) the geomagnetic and telluric time series are contaminated by man-made noise components, and 2) the non-homogeneous behavior of ionospheric current systems in the period range of interest (ELF-ULF and longer periods) results in systematic deviation of the impedance of individual time windows. Robust statistics mitigate both effects on Z for the purpose of subsurface investigation. However, accurate analysis of the long-term temporal variation of the first and second statistical moments of Z may provide valuable information about the characteristics of the ionospheric source current systems. Temporal variation of the extent, spatial variability and orientation of the ionospheric source currents has specific effects on the surface impedance tensor. Twenty-year-long geomagnetic and telluric recordings of the Nagycenk Geophysical Observatory provide a unique opportunity to reconstruct the so-called magnetotelluric source effect and to obtain information about the spatial and temporal behavior of ionospheric source currents at mid-latitudes. Detailed investigation of the time series of the surface electromagnetic impedance tensor has been carried out in different frequency classes of the ULF range. The presentation aims to provide a brief review of our results on long-term periodic modulations, up to the solar-cycle scale, and on eventual deviations of the electromagnetic impedance and thus of the reconstructed equivalent ionospheric source effects.
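As a rough illustration of how the impedance tensor might be estimated from such recordings, the sketch below forms windowed Fourier coefficients of the telluric (Ex, Ey) and geomagnetic (Hx, Hy) channels and solves E = Z H by ordinary least squares at a single frequency; a robust estimator, as used in practice, would additionally down-weight outlying windows. Function names, window length and taper are illustrative assumptions, not taken from the presentation.

```python
import numpy as np

def estimate_impedance(ex, ey, hx, hy, fs, freq, win=1024):
    """Least-squares estimate of the 2x2 MT impedance tensor Z at one frequency.

    ex, ey: telluric (electric) channels; hx, hy: geomagnetic channels (arrays of
    equal length, sampling rate fs). Plain least squares over all windows is used
    here as a minimal illustration; a robust scheme would weight the windows.
    """
    ex, ey, hx, hy = (np.asarray(c, dtype=float) for c in (ex, ey, hx, hy))
    n_win = len(ex) // win
    k = int(round(freq * win / fs))              # FFT bin closest to target frequency
    taper = np.hanning(win)
    E, H = [], []
    for i in range(n_win):
        sl = slice(i * win, (i + 1) * win)
        E.append([np.fft.rfft(ex[sl] * taper)[k], np.fft.rfft(ey[sl] * taper)[k]])
        H.append([np.fft.rfft(hx[sl] * taper)[k], np.fft.rfft(hy[sl] * taper)[k]])
    E, H = np.array(E), np.array(H)              # shape (n_win, 2), complex
    # Solve H @ X = E in the least-squares sense; then Z = X.T so that E = Z H.
    X, *_ = np.linalg.lstsq(H, E, rcond=None)
    return X.T
```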
Atmospheric Tracer Inverse Modeling Using Markov Chain Monte Carlo (MCMC)
NASA Astrophysics Data System (ADS)
Kasibhatla, P.
2004-12-01
In recent years, there has been an increasing emphasis on the use of Bayesian statistical estimation techniques to characterize the temporal and spatial variability of atmospheric trace gas sources and sinks. The applications have been varied in terms of the particular species of interest, as well as in terms of the spatial and temporal resolution of the estimated fluxes. However, one common characteristic has been the use of relatively simple statistical models for describing the measurement and chemical transport model error statistics and prior source statistics. For example, multivariate normal probability distribution functions (pdfs) are commonly used to model these quantities, and inverse source estimates are derived for fixed values of the pdf parameters. While the advantage of this approach is that closed-form analytical solutions for the a posteriori pdfs of interest are available, it is worth exploring Bayesian analysis approaches which allow for a more general treatment of error and prior source statistics. Here, we present an application of the Markov Chain Monte Carlo (MCMC) methodology to an atmospheric tracer inversion problem to demonstrate how more general statistical models for errors can be incorporated into the analysis in a relatively straightforward manner. The MCMC approach to Bayesian analysis, which has found wide application in a variety of fields, is a statistical simulation approach that involves computing moments of interest of the a posteriori pdf by efficiently sampling this pdf. The specific inverse problem that we focus on is the annual mean CO2 source/sink estimation problem considered by the TransCom3 project. TransCom3 was a collaborative effort involving various modeling groups and followed a common modeling and analysis protocol. As such, this problem provides a convenient case study to demonstrate the applicability of the MCMC methodology to atmospheric tracer source/sink estimation problems.
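A minimal sketch of the kind of MCMC sampler meant here, assuming a toy linear tracer inversion y = Hx + e with independent Gaussian observation errors and a Gaussian prior on the flux vector; the matrix H, the error scales and the function names are illustrative and not the TransCom3 setup.

```python
import numpy as np

def log_post(x, y, H, sigma_obs, x_prior, sigma_prior):
    """Unnormalized log-posterior: Gaussian likelihood plus Gaussian prior."""
    r = y - H @ x
    return (-0.5 * np.sum((r / sigma_obs) ** 2)
            - 0.5 * np.sum(((x - x_prior) / sigma_prior) ** 2))

def metropolis(y, H, sigma_obs, x_prior, sigma_prior, n_iter=50_000, step=0.1):
    """Random-walk Metropolis sampler for the flux vector x."""
    rng = np.random.default_rng(0)
    x = np.asarray(x_prior, dtype=float).copy()
    lp = log_post(x, y, H, sigma_obs, x_prior, sigma_prior)
    samples = []
    for _ in range(n_iter):
        prop = x + step * rng.standard_normal(x.size)
        lp_prop = log_post(prop, y, H, sigma_obs, x_prior, sigma_prior)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
            x, lp = prop, lp_prop
        samples.append(x.copy())
    return np.array(samples)                        # posterior samples of the fluxes
```

Posterior means and spreads then follow directly from the sample array, which is the sense in which MCMC "computes moments of interest" of the a posteriori pdf.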
Piecewise synonyms for enhanced UMLS source terminology integration.
Huang, Kuo-Chuan; Geller, James; Halper, Michael; Cimino, James J
2007-10-11
The UMLS contains more than 100 source vocabularies and is growing via the integration of others. When integrating a new source, the source terms already in the UMLS must first be found. The easiest approach to this is simple string matching. However, string matching usually does not find all concepts that should be found. A new methodology, based on the notion of piecewise synonyms, for enhancing the process of concept discovery in the UMLS is presented. This methodology is supported by first creating a general synonym dictionary based on the UMLS. Each multi-word source term is decomposed into its component words, allowing for the generation of separate synonyms for each word from the general synonym dictionary. The recombination of these synonyms into new terms creates an expanded pool of matching candidates for terms from the source. The methodology is demonstrated with respect to an existing UMLS source. It shows a 34% improvement over simple string matching.
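A minimal sketch of the piecewise-synonym idea, assuming a toy per-word synonym dictionary in place of the UMLS-derived general dictionary; names and entries are illustrative only.

```python
from itertools import product

# Toy general synonym dictionary (word -> synonyms); in the paper this is built from the UMLS.
SYN = {
    "kidney": {"kidney", "renal"},
    "failure": {"failure", "insufficiency"},
    "chronic": {"chronic", "long-standing"},
}

def piecewise_candidates(term: str) -> set:
    """Decompose a multi-word term, substitute per-word synonyms, and recombine."""
    words = term.lower().split()
    options = [sorted(SYN.get(w, {w})) for w in words]     # each word's synonym set
    return {" ".join(combo) for combo in product(*options)}

print(piecewise_candidates("chronic kidney failure"))
# e.g. {'chronic renal insufficiency', 'long-standing kidney failure', ...}
```

The expanded candidate pool is then matched against the new source's terms, which is where the reported improvement over simple string matching comes from.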
Fate of hydrocarbon pollutants in source and non-source control sustainable drainage systems.
Roinas, Georgios; Mant, Cath; Williams, John B
2014-01-01
Sustainable drainage (SuDs) is an established method for managing runoff from developments, and source control is part of accepted design philosophy. However, there are limited studies into the contribution source control makes to pollutant removal, especially for roads. This study examines organic pollutants, total petroleum hydrocarbons (TPH) and polycyclic aromatic hydrocarbons (PAHs), in paired source and non-source control full-scale SuDs systems. Sites were selected to cover local roads, trunk roads and housing developments, with a range of SuDs, including porous asphalt, swales, detention basins and ponds. Soil and water samples were taken bi-monthly over 12 months to assess pollutant loads. Results show first flush patterns in storm events for solids, but not for TPH. The patterns of removal for specific PAHs were also different, reflecting varying physico-chemical properties. The pollution potential of trunk roads was illustrated by peak runoff TPH concentrations of > 17,000 μg/l. Overall there was no significant difference between pollutant loads from source and non-source control systems, but the dynamic nature of runoff means that longer-term data are required. The outcomes of this project will increase understanding of the behaviour of organic pollutants in SuDs and will provide design guidance on the most appropriate systems for treating these pollutants.
Planck 2015 results. XXII. A map of the thermal Sunyaev-Zeldovich effect
NASA Astrophysics Data System (ADS)
Planck Collaboration; Aghanim, N.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Bartolo, N.; Battaner, E.; Battye, R.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Challinor, A.; Chiang, H. C.; Christensen, P. R.; Churazov, E.; Clements, D. L.; Colombo, L. P. L.; Combet, C.; Comis, B.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dolag, K.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Fergusson, J.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Génova-Santos, R. T.; Giard, M.; González-Nuevo, J.; Górski, K. M.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Harrison, D. L.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Holmes, W. A.; Hornstrup, A.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kneissl, R.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lacasa, F.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Macías-Pérez, J. F.; Maffei, B.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Melchiorri, A.; Melin, J.-B.; Migliaccio, M.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Noviello, F.; Novikov, D.; Novikov, I.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Pasian, F.; Patanchon, G.; Perdereau, O.; Perotto, L.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Pratt, G. W.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Sauvé, A.; Savelainen, M.; Savini, G.; Scott, D.; Spencer, L. D.; Stolyarov, V.; Stompor, R.; Sunyaev, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tramonte, D.; Tristram, M.; Tucci, M.; Tuovinen, J.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zonca, A.
2016-09-01
We have constructed all-sky Compton parameters maps, y-maps, of the thermal Sunyaev-Zeldovich (tSZ) effect by applying specifically tailored component separation algorithms to the 30 to 857 GHz frequency channel maps from the Planck satellite. These reconstructed y-maps are delivered as part of the Planck 2015 release. The y-maps are characterized in terms of noise properties and residual foreground contamination, mainly thermal dust emission at large angular scales, and cosmic infrared background and extragalactic point sources at small angular scales. Specific masks are defined to minimize foreground residuals and systematics. Using these masks, we compute the y-map angular power spectrum and higher order statistics. From these we conclude that the y-map is dominated by tSZ signal in the multipole range, 20 <ℓ< 600. We compare the measured tSZ power spectrum and higher order statistics to various physically motivated models and discuss the implications of our results in terms of cluster physics and cosmology.
A Disorder of Executive Function and Its Role in Language Processing
Martin, Randi C.; Allen, Corinne M.
2014-01-01
R. Martin and colleagues have proposed separate stores for the maintenance of phonological and semantic information in short-term memory. Evidence from patients with aphasia has shown that damage to these separable buffers has specific consequences for language comprehension and production, suggesting an interdependence between language and memory systems. This article discusses recent research on aphasic patients with limited-capacity short-term memories (STMs) and reviews evidence suggesting that deficits in retaining semantic information in STM may be caused by a disorder in the executive control process of inhibition, specific to verbal representations. In contrast, a phonological STM deficit may be due to overly rapid decay. In semantic STM deficits, it is hypothesized that the inhibitory deficit produces difficulty inhibiting irrelevant verbal representations, which may lead to excessive interference. In turn, the excessive interference associated with semantic STM deficits has implications for single-word and sentence processing, and it may be the source of the reduced STM capacity shown by these patients. PMID:18720317
Low birth weight and air pollution in California: Which sources and components drive the risk?
Laurent, Olivier; Hu, Jianlin; Li, Lianfa; Kleeman, Michael J; Bartell, Scott M; Cockburn, Myles; Escobedo, Loraine; Wu, Jun
2016-01-01
Intrauterine growth restriction has been associated with exposure to air pollution, but there is a need to clarify which sources and components are most likely responsible. This study investigated the associations between low birth weight (LBW, <2500g) in term born infants (≥37 gestational weeks) and air pollution by source and composition in California, over the period 2001-2008. Complementary exposure models were used: an empirical Bayesian kriging model for the interpolation of ambient pollutant measurements, a source-oriented chemical transport model (using California emission inventories) that estimated fine and ultrafine particulate matter (PM2.5 and PM0.1, respectively) mass concentrations (4km×4km) by source and composition, a line-source roadway dispersion model at fine resolution, and traffic index estimates. Birth weight was obtained from California birth certificate records. A case-cohort design was used. Five controls per term LBW case were randomly selected (without covariate matching or stratification) from among term births. The resulting datasets were analyzed by logistic regression with a random effect by hospital, using generalized additive mixed models adjusted for race/ethnicity, education, maternal age and household income. In total 72,632 singleton term LBW cases were included. Term LBW was positively and significantly associated with interpolated measurements of ozone but not total fine PM or nitrogen dioxide. No significant association was observed between term LBW and primary PM from all sources grouped together. A positive significant association was observed for secondary organic aerosols. Exposure to elemental carbon (EC), nitrates and ammonium were also positively and significantly associated with term LBW, but only for exposure during the third trimester of pregnancy. Significant positive associations were observed between term LBW risk and primary PM emitted by on-road gasoline and diesel or by commercial meat cooking sources. Primary PM from wood burning was inversely associated with term LBW. Significant positive associations were also observed between term LBW and ultrafine particle numbers modeled with the line-source roadway dispersion model, traffic density and proximity to roadways. This large study based on complementary exposure metrics suggests that not only primary pollution sources (traffic and commercial meat cooking) but also EC and secondary pollutants are risk factors for term LBW. Copyright © 2016 Elsevier Ltd. All rights reserved.
Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C
2018-01-01
This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
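For context, one common parametric model for such ADC maps is the mono-exponential model; the sketch below shows a two-b-value ADC computation under that model. It is an illustration of the model class only, not the participating sites' code, and the parameter names and default b-values are assumptions.

```python
import numpy as np

def adc_two_point(s_low, s_high, b_low=0.0, b_high=800.0):
    """Mono-exponential ADC from two diffusion weightings:
    S(b) = S0 * exp(-b * ADC)  =>  ADC = ln(S_low / S_high) / (b_high - b_low).
    Inputs are signal-intensity arrays; output is in mm^2/s when b is in s/mm^2.
    """
    s_low = np.asarray(s_low, dtype=float)
    s_high = np.asarray(s_high, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        adc = np.log(s_low / s_high) / (b_high - b_low)
    return np.where(np.isfinite(adc), adc, 0.0)   # mask divide-by-zero voxels
```

Recording exactly these choices (source images, model, b-values, units and scaling) as coded DICOM attributes is what the amended parametric map standard is intended to support.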
Emissions of microplastic fibers from microfiber fleece during domestic washing.
Pirc, U; Vidmar, M; Mozer, A; Kržan, A
2016-11-01
Microplastics are found in marine and freshwater environments; however, their specific sources are not yet well understood. Understanding sources will be of key importance in efforts to reduce emissions into the environment. We examined the emissions of microfibers from domestic washing of a new microfiber polyester fleece textile. Analyzing released fibers collected with a 200 μm filter during 10 mild, successive washing cycles showed that emission initially decreased and then stabilized at approx. 0.0012 wt%. This value is our estimation for the long-term release of fibers during each washing. Use of detergent and softener did not significantly influence emission. Release of fibers during tumble drying was approx. 3.5 times higher than during washing.
Stafoggia, Massimo; Zauli-Sajani, Stefano; Pey, Jorge; Samoli, Evangelia; Alessandrini, Ester; Basagaña, Xavier; Cernigliaro, Achille; Chiusolo, Monica; Demaria, Moreno; Díaz, Julio; Faustini, Annunziata; Katsouyanni, Klea; Kelessis, Apostolos G; Linares, Cristina; Marchesi, Stefano; Medina, Sylvia; Pandolfi, Paolo; Pérez, Noemí; Querol, Xavier; Randi, Giorgia; Ranzi, Andrea; Tobias, Aurelio; Forastiere, Francesco
2016-04-01
Evidence on the association between short-term exposure to desert dust and health outcomes is controversial. We aimed to estimate the short-term effects of particulate matter ≤ 10 μm (PM10) on mortality and hospital admissions in 13 Southern European cities, distinguishing between PM10 originating from the desert and from other sources. We identified desert dust advection days in multiple Mediterranean areas for 2001-2010 by combining modeling tools, back-trajectories, and satellite data. For each advection day, we estimated PM10 concentrations originating from desert, and computed PM10 from other sources by difference. We fitted city-specific Poisson regression models to estimate the association between PM from different sources (desert and non-desert) and daily mortality and emergency hospitalizations. Finally, we pooled city-specific results in a random-effects meta-analysis. On average, 15% of days were affected by desert dust at ground level (desert PM10 > 0 μg/m3). Most episodes occurred in spring-summer, with increasing gradient of both frequency and intensity north-south and west-east of the Mediterranean basin. We found significant associations of both PM10 concentrations with mortality. Increases of 10 μg/m3 in non-desert and desert PM10 (lag 0-1 days) were associated with increases in natural mortality of 0.55% (95% CI: 0.24, 0.87%) and 0.65% (95% CI: 0.24, 1.06%), respectively. Similar associations were estimated for cardio-respiratory mortality and hospital admissions. PM10 originating from the desert was positively associated with mortality and hospitalizations in Southern Europe. Policy measures should aim at reducing population exposure to anthropogenic airborne particles even in areas with large contribution from desert dust advections. Stafoggia M, Zauli-Sajani S, Pey J, Samoli E, Alessandrini E, Basagaña X, Cernigliaro A, Chiusolo M, Demaria M, Díaz J, Faustini A, Katsouyanni K, Kelessis AG, Linares C, Marchesi S, Medina S, Pandolfi P, Pérez N, Querol X, Randi G, Ranzi A, Tobias A, Forastiere F, MED-PARTICLES Study Group. 2016. Desert dust outbreaks in Southern Europe: contribution to daily PM10 concentrations and short-term associations with mortality and hospital admissions. Environ Health Perspect 124:413-419; http://dx.doi.org/10.1289/ehp.1409164.
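A schematic sketch of the two-stage analysis described (city-specific Poisson regression of daily counts on PM10, followed by random-effects pooling), using placeholder covariates and a simple DerSimonian-Laird estimator; this illustrates the structure only and is not the MED-PARTICLES code.

```python
import numpy as np
import statsmodels.api as sm

def city_effect(daily_counts, pm10, confounders):
    """City-specific Poisson regression; returns the log-rate ratio per 10 ug/m3
    of PM10 and its standard error. `confounders` is an (n_days, k) array of
    placeholder covariates (temperature, time trends, etc.)."""
    X = sm.add_constant(np.column_stack([np.asarray(pm10) / 10.0, confounders]))
    fit = sm.GLM(daily_counts, X, family=sm.families.Poisson()).fit()
    return fit.params[1], fit.bse[1]

def pool_random_effects(betas, ses):
    """DerSimonian-Laird random-effects pooling of the city-specific estimates."""
    betas, ses = np.asarray(betas, float), np.asarray(ses, float)
    w = 1.0 / ses**2
    q = np.sum(w * (betas - np.average(betas, weights=w))**2)
    tau2 = max(0.0, (q - (len(betas) - 1)) / (w.sum() - (w**2).sum() / w.sum()))
    w_star = 1.0 / (ses**2 + tau2)
    beta_pool = np.average(betas, weights=w_star)
    se_pool = np.sqrt(1.0 / w_star.sum())
    return 100 * (np.exp(beta_pool) - 1), beta_pool, se_pool  # % increase per 10 ug/m3
```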
Silverstein, S. M.; Miller, P. L.; Cullen, M. R.
1993-01-01
This paper describes a prototype information sources map (ISM), an on-line information source finder, for Occupational and Environmental Medicine (OEM). The OEM ISM was built as part of the Unified Medical Language System (UMLS) project of the National Library of Medicine. It allows a user to identify sources of on-line information appropriate to a specific OEM question, and connect to the sources. In the OEM ISM we explore a domain-specific method of indexing information source contents, and also a domain-specific user interface. The indexing represents a domain expert's opinion of the specificity of an information source in helping to answer specific types of domain questions. For each information source, an index field represents whether a source might provide useful information in an occupational, industrial, or environmental category. Additional fields represent the degree of specificity of a source in individual question types in each category. The paper discusses the development, design, and implementation of the prototype OEM ISM. PMID:8130548
Sensitivity Analysis Tailored to Constrain 21st Century Terrestrial Carbon-Uptake
NASA Astrophysics Data System (ADS)
Muller, S. J.; Gerber, S.
2013-12-01
The long-term fate of terrestrial carbon (C) in response to climate change remains a dominant source of uncertainty in Earth-system model projections. Increasing atmospheric CO2 could be mitigated by long-term net uptake of C, through processes such as increased plant productivity due to "CO2-fertilization". Conversely, atmospheric conditions could be exacerbated by long-term net release of C, through processes such as increased decomposition due to higher temperatures. This balance is an important area of study, and a major source of uncertainty in long-term (>year 2050) projections of planetary response to climate change. We present results from an innovative application of sensitivity analysis to LM3V, a dynamic global vegetation model (DGVM), intended to identify observed/observable variables that are useful for constraining long-term projections of C-uptake. We analyzed the sensitivity of cumulative C-uptake by 2100, as modeled by LM3V in response to IPCC AR4 scenario climate data (1860-2100), to perturbations in over 50 model parameters. We concurrently analyzed the sensitivity of over 100 observable model variables, during the extant record period (1970-2010), to the same parameter changes. By correlating the sensitivities of observable variables with the sensitivity of long-term C-uptake we identified model calibration variables that would also constrain long-term C-uptake projections. LM3V employs a coupled carbon-nitrogen cycle to account for N-limitation, and we find that N-related variables have an important role to play in constraining long-term C-uptake. This work has implications for prioritizing field campaigns to collect global data that can help reduce uncertainties in the long-term land-atmosphere C-balance. Though the results of this study are specific to LM3V, the processes that characterize this model are not completely divorced from other DGVMs (or reality), and our approach provides valuable insights into how data can be leveraged to better constrain projections for the land carbon sink.
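A minimal sketch of the correlation step described above, assuming precomputed sensitivity arrays from the perturbed model runs; the array and variable names are placeholders, not LM3V outputs.

```python
import numpy as np

def rank_constraining_observables(S_obs, s_uptake, names):
    """Correlate each observable's parameter-sensitivity pattern with that of
    long-term C-uptake; a high |r| flags observables useful for calibration.

    S_obs[p, v]  : sensitivity of observable v (1970-2010 record) to parameter p.
    s_uptake[p]  : sensitivity of cumulative C-uptake by 2100 to parameter p.
    names[v]     : label of observable v.
    """
    S_obs, s_uptake = np.asarray(S_obs, float), np.asarray(s_uptake, float)
    corrs = [np.corrcoef(S_obs[:, v], s_uptake)[0, 1] for v in range(S_obs.shape[1])]
    order = np.argsort(-np.abs(corrs))
    return [(names[i], corrs[i]) for i in order]
```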
Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Waithe, Kenrick A.
2004-01-01
A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
Antineoplastic Efficacy of Novel Polyamine Analogues in Human Breast Cancer
2006-06-01
Davidson, N.E., and Casero, R.A.. Spermine oxidase SMO(PAOh1), not N1-acetylpolyamine oxidase (PAO) is the primary source of cytotoxic H2O2 in...human spermine oxidase SMO(PAOh1). SMO(PAOh1) uses unacetylated spermine as substrate and is inducible by specific polyamine analogs [15,16]. These...technique to find the identical clone termed spermine oxidase (SMO) [16]. The function of SMO(PAOh1) as a spermine oxidase has been confirmed [15,67,68
An update on airborne contact dermatitis.
Huygens, S; Goossens, A
2001-01-01
This review is an update of 2 previously published articles on airborne contact dermatoses. Because reports in the literature often omit the term 'airborne', 18 volumes of Contact Dermatitis (April 1991-June 2000), 8 volumes of the American Journal of Contact Dermatitis (1992-1999) and 4 volumes of La Lettre du Gerda (1996-1999) were screened, and the cases cited were classified as to history, lesion locations, sensitization sources, and other factors. Reports on airborne dermatitis are increasingly being published, sometimes in relation to specific occupational areas.
Effect of Automatic Processing on Specification of Problem Solutions for Computer Programs.
1981-03-01
Number 7 ± 2" item limitaion on human short-term memory capability (Miller, 1956) should be a guiding principle in program design. Yourdon and...input either a single example solution or multiple exam’- le solutions in sequence. If a participant’s P1 has a low value - near 0 - it may be concluded... Principles in Experimental Design, Winer ,1971). 55 Table 12 ANOVA Resultt, For Performance Measure 2 Sb DF MS F Source of Variation Between Subjects
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Select Committee on Aging.
This paper, on the health hazards of cold weather for elderly persons, presents information from various sources on the death rates in winter throughout the United States. After reviewing the scope of the problem, specific health hazards associated with cold weather are discussed, i.e., hypothermia, fires, carbon monoxide poisoning, and influenza…
2010-07-01
the ground source heat pump system. During installation, construction equipment would remove vegetation from the surface and disturb soil to a depth...levels of 50 to 55 dBA or higher on a daily basis. Studies specifically conducted to determine noise effects on various human activities show that about...needs to be evaluated for its potential effects on a project site and adjacent land uses. The foremost factor affecting a proposed action in terms of
NASA Astrophysics Data System (ADS)
Herzel, Hanspeter; Reuter, Robert
1996-06-01
Irregularities in voiced speech are often observed as a consequence of vocal fold lesions, paralyses, and other pathological conditions. Many of these instabilities are related to the intrinsic nonlinearities in the vibrations of the vocal folds. In this paper, a specific nonlinear phenomenon is discussed: The appearance of two independent fundamental frequencies termed biphonation. Several narrow-band spectrograms are presented showing biphonation in signals from voice patients, a newborn cry, a singer, and excised larynx experiments. Finally, possible physiological mechanisms of instabilities of the voice source are discussed.
Computational Investigation of Combustion Dynamics in a Lean-Direct Injection Gas Turbine Combustor
2012-11-01
variable vector which includes turbulence kinetic energy and specific dissipation, k and ω; In the viscous flux, D is the molecular diffusion coefficient...for the liquid particle. This equation assumes a uniform temperature inside the liquid particle. The source term consists of the net sensible ...Spray Characteristics on Diesel Engine Combustion and Emission, SAE 980131, 1998 24 Fu, Y., "Aerodynamics and Combustion of Axial Swirlers," Ph.D. dissertation, University of Cincinnati, 2008.
Agoncillo, A V; Mejino, J L; Rosse, C
1999-01-01
A principled and logical representation of the structure of the human body has led to conflicts with traditional representations of the same knowledge by anatomy textbooks. The examples which illustrate resolution of these conflicts suggest that stricter requirements must be met for semantic consistency, expressivity and specificity by knowledge sources intended to support inference than by textbooks and term lists. These next-generation resources should influence traditional concept representation, rather than be constrained by convention.
NASA Astrophysics Data System (ADS)
Braban, Christine; Tang, Sim; Bealey, Bill; Roberts, Elin; Stephens, Amy; Galloway, Megan; Greenwood, Sarah; Sutton, Mark; Nemitz, Eiko; Leaver, David
2017-04-01
Ambient ammonia measurements have been undertaken both to understand sources and concentrations at background sites and vulnerable ecosystems, and for long-term monitoring of concentrations. As a pollutant whose concentrations are projected to increase in the coming decades, with significant policy challenges to implementing mitigation strategies, it is useful to assess what has been measured, where, and why. In this study a review of the literature has shown that ammonia measurements are frequently not publicly reported and are in general not deposited in open data centres available for research. The specific sectors where measurements have been undertaken are: agricultural point source assessments, agricultural surface exchange measurements, sensitive ecosystem monitoring, landscape/regional studies and governmental long-term monitoring. Less frequently, ammonia is measured as part of an intensive atmospheric chemistry field campaign. Technology is developing, which means a shift from chemical denuder methods to spectroscopic techniques may be possible; however, chemical denuder techniques with off-line laboratory analysis will likely remain an economical approach for some time to come. This paper reviews existing datasets from the different sectors of research and integrates them into a global picture to allow both a long-term understanding and comparison with future measurements.
A Well-Balanced Path-Integral f-Wave Method for Hyperbolic Problems with Source Terms
2014-01-01
Systems of hyperbolic partial differential equations with source terms (balance laws) arise in many applications where it is important to compute accurate time-dependent solutions modeling small perturbations of equilibrium solutions in which the source terms balance the hyperbolic part. The f-wave version of the wave-propagation algorithm is one approach, but requires the use of a particular averaged value of the source terms at each cell interface in order to be “well balanced” and exactly maintain steady states. A general approach to choosing this average is developed using the theory of path conservative methods. A scalar advection equation with a decay or growth term is introduced as a model problem for numerical experiments. PMID:24563581
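As an illustration of the well-balanced idea on the stated model problem (scalar advection with a linear decay term), the sketch below uses a logarithmic-mean source average at each cell interface, which exactly cancels the flux difference on the discrete steady state. This is one averaging choice consistent with the description, not necessarily the particular path-conservative average derived in the paper; names and grid parameters are illustrative.

```python
import numpy as np

def logmean(a, b):
    """Logarithmic mean (b - a) / ln(b / a); reduces to a when a == b."""
    with np.errstate(divide="ignore", invalid="ignore"):
        lm = (b - a) / np.log(b / a)
    return np.where(np.isclose(a, b), a, lm)

def fwave_step(q, u, lam, dx, dt, q_inflow):
    """One first-order f-wave update for q_t + u q_x = -lam * q with u > 0.
    The interface source average -lam * logmean(Q_{i-1}, Q_i) makes the f-wave
    vanish on the discrete steady state Q_i = Q_{i-1} * exp(-lam*dx/u), so the
    scheme is well balanced for this model problem."""
    qext = np.concatenate(([q_inflow], q))           # inflow ghost cell on the left
    psi = -lam * logmean(qext[:-1], qext[1:])        # well-balanced source average
    zwave = u * (qext[1:] - qext[:-1]) - dx * psi    # f-wave at each interface
    return q - dt / dx * zwave                       # all waves move right (u > 0)

# Quick check: the discrete steady state should be preserved to round-off.
u, lam, dx, dt, n = 1.0, 0.5, 0.02, 0.01, 50
x = dx * (np.arange(n) + 0.5)
q = np.exp(-lam * x / u)                             # cell-centered steady profile
q_new = fwave_step(q, u, lam, dx, dt, q_inflow=np.exp(lam * 0.5 * dx / u))
print(np.max(np.abs(q_new - q)))                     # should be at round-off level
```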
Yo, Chia-Hung; Lee, Si-Huei; Chang, Shy-Shin; Lee, Matthew Chien-Hung; Lee, Chien-Chang
2014-01-01
Objectives We performed a systematic review and meta-analysis of studies on high-sensitivity C-reactive protein (hs-CRP) assays to see whether these tests are predictive of atrial fibrillation (AF) recurrence after cardioversion. Design Systematic review and meta-analysis. Data sources PubMed, EMBASE and Cochrane databases as well as a hand search of the reference lists in the retrieved articles from inception to December 2013. Study eligibility criteria This review selected observational studies in which the measurements of serum CRP were used to predict AF recurrence. An hs-CRP assay was defined as any CRP test capable of measuring serum CRP to below 0.6 mg/dL. Primary and secondary outcome measures We summarised test performance characteristics with the use of forest plots, hierarchical summary receiver operating characteristic curves and bivariate random effects models. Meta-regression analysis was performed to explore the source of heterogeneity. Results We included nine qualifying studies comprising a total of 347 patients with AF recurrence and 335 controls. A CRP level higher than the optimal cut-off point was an independent predictor of AF recurrence after cardioversion (summary adjusted OR: 3.33; 95% CI 2.10 to 5.28). The estimated pooled sensitivity and specificity for hs-CRP was 71.0% (95% CI 63% to 78%) and 72.0% (61% to 81%), respectively. Most studies used a CRP cut-off point of 1.9 mg/L to predict long-term AF recurrence (77% sensitivity, 65% specificity), and 3 mg/L to predict short-term AF recurrence (73% sensitivity, 71% specificity). Conclusions hs-CRP assays are moderately accurate in predicting AF recurrence after successful cardioversion. PMID:24556243
26 CFR 1.737-1 - Recognition of precontribution gain.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Property A1 and Property A2 is long-term, U.S.-source capital gain or loss. The character of gain on Property A3 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real... long-term, U.S.-source capital gain ($10,000 gain on Property A1 and $8,000 loss on Property A2) and $1...
Source term model evaluations for the low-level waste facility performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yim, M.S.; Su, S.I.
1995-12-31
The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.
Xu, Rong; Supekar, Kaustubh; Morgan, Alex; Das, Amar; Garber, Alan
2008-11-06
Concept specific lexicons (e.g. diseases, drugs, anatomy) are a critical source of background knowledge for many medical language-processing systems. However, the rapid pace of biomedical research and the lack of constraints on usage ensure that such dictionaries are incomplete. Focusing on disease terminology, we have developed an automated, unsupervised, iterative pattern learning approach for constructing a comprehensive medical dictionary of disease terms from randomized clinical trial (RCT) abstracts, and we compared different ranking methods for automatically extracting contextual patterns and concept terms. When used to identify disease concepts from 100 randomly chosen, manually annotated clinical abstracts, our disease dictionary shows significant performance improvement (F1 increased by 35-88%) over available, manually created disease terminologies.
Xu, Rong; Supekar, Kaustubh; Morgan, Alex; Das, Amar; Garber, Alan
2008-01-01
Concept specific lexicons (e.g. diseases, drugs, anatomy) are a critical source of background knowledge for many medical language-processing systems. However, the rapid pace of biomedical research and the lack of constraints on usage ensure that such dictionaries are incomplete. Focusing on disease terminology, we have developed an automated, unsupervised, iterative pattern learning approach for constructing a comprehensive medical dictionary of disease terms from randomized clinical trial (RCT) abstracts, and we compared different ranking methods for automatically extracting contextual patterns and concept terms. When used to identify disease concepts from 100 randomly chosen, manually annotated clinical abstracts, our disease dictionary shows significant performance improvement (F1 increased by 35–88%) over available, manually created disease terminologies. PMID:18999169
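A rough sketch of an iterative pattern-learning loop in the spirit of the description above: seed disease terms yield contextual patterns, and the best-ranked patterns harvest new candidate terms from abstracts. The pattern shape (a few preceding words) and the simple frequency ranking are simplifying assumptions, not the authors' exact method.

```python
import re

def bootstrap_terms(abstracts, seed_terms, n_iter=3, top_patterns=20):
    """Unsupervised bootstrapping of a disease-term lexicon from text."""
    terms = {t.lower() for t in seed_terms}
    for _ in range(n_iter):
        # 1. Learn left-context patterns: the 1-3 words preceding a known term.
        counts = {}
        for text in abstracts:
            low = text.lower()
            for t in terms:
                for m in re.finditer(r"((?:\w+\s+){1,3})" + re.escape(t), low):
                    pat = m.group(1).strip()
                    counts[pat] = counts.get(pat, 0) + 1
        best = sorted(counts, key=counts.get, reverse=True)[:top_patterns]
        # 2. Apply the patterns to harvest new candidate terms (next 1-3 words).
        for text in abstracts:
            low = text.lower()
            for p in best:
                for m in re.finditer(re.escape(p) + r"\s+((?:\w+\s?){1,3})", low):
                    terms.add(m.group(1).strip())
    return terms
```

In the actual study the candidate ranking is what separates signal from noise; this sketch only shows the iterate-extract-recombine skeleton.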
The Disposal of Spacecraft and Launch Vehicle Stages in Low Earth Orbit
NASA Technical Reports Server (NTRS)
Johnson, Nicholas L.
2007-01-01
Spacecraft and launch vehicle stages abandoned in Earth orbit have historically been a primary source of debris from accidental explosions. In the future, such satellites will become the principal cause of orbital debris via inadvertent collisions. To curtail both the near-term and far-term risks posed by derelict spacecraft and launch vehicle stages to operational space systems, numerous national and international orbital debris mitigation guidelines specifically recommend actions which could prevent or limit such future debris generation. Although considerable progress has been made in implementing these recommendations, some changes to existing vehicle designs can be difficult. Moreover, the nature of some missions also can present technological and budgetary challenges to be compliant with widely accepted orbital debris mitigation measures.
Effects of metals within ambient air particulate matter (PM) on human health.
Chen, Lung Chi; Lippmann, Morton
2009-01-01
We review literature providing insights on health-related effects caused by inhalation of ambient air particulate matter (PM) containing metals, emphasizing effects associated with in vivo exposures at or near contemporary atmospheric concentrations. Inhalation of much higher concentrations, and high-level exposures via intratracheal (IT) instillation that inform mechanistic processes, are also reviewed. The most informative studies of effects at realistic exposure levels, in terms of identifying influential individual PM components or source-related mixtures, have been based on (1) human and laboratory animal exposures to concentrated ambient particles (CAPs), and (2) human population studies for which both health-related effects were observed and PM composition data were available for multipollutant regression analyses or source apportionment. Such studies have implicated residual oil fly ash (ROFA) as the most toxic source-related mixture, and Ni and V, which are characteristic tracers of ROFA, as particularly influential components in terms of acute cardiac function changes and excess short-term mortality. There is evidence that other metals within ambient air PM, such as Pb and Zn, also affect human health. Most evidence now available is based on the use of ambient air PM components concentration data, rather than actual exposures, to determine significant associations and/or effects coefficients. Therefore, considerable uncertainties about causality are associated with exposure misclassification and measurement errors. As more PM speciation data and more refined modeling techniques become available, and as more CAPs studies involving PM component analyses are performed, the roles of specific metals and other components within PM will become clearer.
NSRD-10: Leak Path Factor Guidance Using MELCOR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Humphries, Larry L.
Estimates of the source term from a U.S. Department of Energy (DOE) nuclear facility require that the analysts know how to apply the simulation tools used, such as the MELCOR code, particularly for a complicated facility that may include an air ventilation system and other active systems that can influence the environmental pathway of the materials released. DOE has designated MELCOR 1.8.5, an unsupported version, as a DOE ToolBox code in its Central Registry, which includes a leak-path-factor guidance report written in 2004 that did not include experimental validation data. Continued use of this MELCOR version requires additional verification and validation, which may not be feasible from a project cost standpoint. Instead, the recent MELCOR should be used. Without developer support and without experimental data validation, it is difficult to convince regulators that the calculated source term from the DOE facility is accurate and defensible. This research replaces the obsolete version in the 2004 DOE leak path factor guidance report by using MELCOR 2.1 (the latest version of MELCOR, with continuing modeling development and user support) and by including applicable experimental data from the reactor safety arena and from DOE-HDBK-3010. This research provides best practice values used in MELCOR 2.1 specifically for the leak path determination. With these enhancements, the revised leak-path-guidance report should provide confidence to the DOE safety analyst who would be using MELCOR as a source-term determination tool for mitigated accident evaluations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freeze, R.A.; McWhorter, D.B.
Many emerging remediation technologies are designed to remove contaminant mass from source zones at DNAPL sites in response to regulatory requirements. There is often concern in the regulated community as to whether mass removal actually reduces risk, or whether the small risk reductions achieved warrant the large costs incurred. This paper sets out a proposed framework for quantifying the degree to which risk is reduced as mass is removed from DNAPL source areas in shallow, saturated, low-permeability media. Risk is defined in terms of meeting an alternate concentration limit (ACL) at a compliance well in an aquifer underlying the source zone. The ACL is back-calculated from a carcinogenic health-risk characterization at a downgradient water-supply well. Source-zone mass-removal efficiencies are heavily dependent on the distribution of mass between media (fractures, matrix) and phase (aqueous, sorbed, NAPL). Due to the uncertainties in currently available technology performance data, the scope of the paper is limited to developing a framework for generic technologies rather than making specific risk-reduction calculations for individual technologies. Despite the qualitative nature of the exercise, results imply that very high total mass-removal efficiencies are required to achieve significant long-term risk reduction with technology applications of finite duration. This paper is not an argument for no action at contaminated sites. Rather, it provides support for the conclusions of Cherry et al. (1992) that the primary goal of current remediation should be short-term risk reduction through containment, with the aim to pass on to future generations site conditions that are well-suited to the future applications of emerging technologies with improved mass-removal capabilities.
Observation-based source terms in the third-generation wave model WAVEWATCH
NASA Astrophysics Data System (ADS)
Zieger, Stefan; Babanin, Alexander V.; Erick Rogers, W.; Young, Ian R.
2015-12-01
Measurements collected during the AUSWEX field campaign, at Lake George (Australia), resulted in new insights into the processes of wind wave interaction and whitecapping dissipation, and consequently new parameterizations of the input and dissipation source terms. The new nonlinear wind input term developed accounts for dependence of the growth on wave steepness, airflow separation, and for negative growth rate under adverse winds. The new dissipation terms feature the inherent breaking term, a cumulative dissipation term and a term due to production of turbulence by waves, which is particularly relevant for decaying seas and for swell. The latter is consistent with the observed decay rate of ocean swell. This paper describes these source terms implemented in WAVEWATCH III® and evaluates the performance against existing source terms in academic duration-limited tests, against buoy measurements for windsea-dominated conditions, under conditions of extreme wind forcing (Hurricane Katrina), and against altimeter data in global hindcasts. Results show agreement by means of growth curves as well as integral and spectral parameters in the simulations and hindcast.
Beyond the double banana: improved recognition of temporal lobe seizures in long-term EEG.
Rosenzweig, Ivana; Fogarasi, András; Johnsen, Birger; Alving, Jørgen; Fabricius, Martin Ejler; Scherg, Michael; Neufeld, Miri Y; Pressler, Ronit; Kjaer, Troels W; van Emde Boas, Walter; Beniczky, Sándor
2014-02-01
To investigate whether extending the 10-20 array with 6 electrodes in the inferior temporal chain and constructing computed montages increases the diagnostic value of ictal EEG activity originating in the temporal lobe. In addition, the accuracy of computer-assisted spectral source analysis was investigated. Forty EEG samples were reviewed by 7 EEG experts in various montages (longitudinal and transversal bipolar, common average, source derivation, source montage, current source density, and reference-free montages) using 2 electrode arrays (the 10-20 and the extended one). Spectral source analysis used the source montage to calculate the density spectral array, defining the earliest oscillatory onset. From this, phase maps were calculated for localization. The reference standard was the decision of the multidisciplinary epilepsy surgery team on the seizure onset zone. Clinical performance was compared with the double banana (longitudinal bipolar montage, 10-20 array). Adding the inferior temporal electrode chain, computed montages (reference free, common average, and source derivation), and voltage maps significantly increased the sensitivity. Phase maps had the highest sensitivity and identified ictal activity at an earlier time point than visual inspection. There was no significant difference concerning specificity. The findings advocate for the use of these digital EEG technology-derived analysis methods in clinical practice.
Conceptualization and validation of an open-source closed-loop deep brain stimulation system in rat.
Wu, Hemmings; Ghekiere, Hartwin; Beeckmans, Dorien; Tambuyzer, Tim; van Kuyck, Kris; Aerts, Jean-Marie; Nuttin, Bart
2015-04-21
Conventional deep brain stimulation (DBS) applies constant electrical stimulation to specific brain regions to treat neurological disorders. Closed-loop DBS with real-time feedback is gaining attention in recent years, after proved more effective than conventional DBS in terms of pathological symptom control clinically. Here we demonstrate the conceptualization and validation of a closed-loop DBS system using open-source hardware. We used hippocampal theta oscillations as system input, and electrical stimulation in the mesencephalic reticular formation (mRt) as controller output. It is well documented that hippocampal theta oscillations are highly related to locomotion, while electrical stimulation in the mRt induces freezing. We used an Arduino open-source microcontroller between input and output sources. This allowed us to use hippocampal local field potentials (LFPs) to steer electrical stimulation in the mRt. Our results showed that closed-loop DBS significantly suppressed locomotion compared to no stimulation, and required on average only 56% of the stimulation used in open-loop DBS to reach similar effects. The main advantages of open-source hardware include wide selection and availability, high customizability, and affordability. Our open-source closed-loop DBS system is effective, and warrants further research using open-source hardware for closed-loop neuromodulation.
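A minimal sketch of the closed-loop logic described (theta-band power of the hippocampal LFP gating stimulation of the mRt), written in Python for illustration rather than as the Arduino firmware; the band limits, threshold and stimulator callback are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def theta_power(lfp, fs, band=(4.0, 12.0)):
    """RMS power of the hippocampal LFP in an assumed theta band."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return np.sqrt(np.mean(filtfilt(b, a, np.asarray(lfp, float)) ** 2))

def closed_loop_step(lfp_buffer, fs, threshold, stimulate):
    """One control cycle: stimulate only while theta power (a locomotion proxy)
    exceeds the threshold. `stimulate` is a callback driving the stimulator
    (handled by the microcontroller in the original setup)."""
    on = theta_power(lfp_buffer, fs) > threshold
    stimulate(on)
    return on
```

Because stimulation is delivered only when the input crosses the threshold, total stimulation time falls well below that of continuous (open-loop) DBS, which is the effect reported in the study.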
Conceptualization and validation of an open-source closed-loop deep brain stimulation system in rat
Wu, Hemmings; Ghekiere, Hartwin; Beeckmans, Dorien; Tambuyzer, Tim; van Kuyck, Kris; Aerts, Jean-Marie; Nuttin, Bart
2015-01-01
Conventional deep brain stimulation (DBS) applies constant electrical stimulation to specific brain regions to treat neurological disorders. Closed-loop DBS with real-time feedback is gaining attention in recent years, after proved more effective than conventional DBS in terms of pathological symptom control clinically. Here we demonstrate the conceptualization and validation of a closed-loop DBS system using open-source hardware. We used hippocampal theta oscillations as system input, and electrical stimulation in the mesencephalic reticular formation (mRt) as controller output. It is well documented that hippocampal theta oscillations are highly related to locomotion, while electrical stimulation in the mRt induces freezing. We used an Arduino open-source microcontroller between input and output sources. This allowed us to use hippocampal local field potentials (LFPs) to steer electrical stimulation in the mRt. Our results showed that closed-loop DBS significantly suppressed locomotion compared to no stimulation, and required on average only 56% of the stimulation used in open-loop DBS to reach similar effects. The main advantages of open-source hardware include wide selection and availability, high customizability, and affordability. Our open-source closed-loop DBS system is effective, and warrants further research using open-source hardware for closed-loop neuromodulation. PMID:25897892
Becker, Mark F.; Peter, Kathy D.; Masoner, Jason
2002-01-01
Samples collected and analyzed by the Oklahoma Department of Agriculture, Food, and Forestry from 1999 to 2001 determined that nitrate exceeded the U.S. Environmental Protection Agency maximum contaminant level for public drinking-water supplies of 10 milligrams per liter as nitrogen in 79 monitoring wells at 35 swine licensed-managed feeding operations (LMFO) in Oklahoma. The LMFOs are located in rural agricultural settings where long-term agriculture has potentially affected the ground-water quality in some areas. Land use prior to the construction of the LMFOs was assessed to evaluate the types of agricultural land use within a 500-meter radius of the sampled wells. Chemical and microbiological techniques were used to determine the possible sources of nitrate in water sampled from 10 wastewater lagoons and 79 wells. Samples were analyzed for dissolved major ions, dissolved trace elements, dissolved nutrients, nitrogen isotope ratios of nitrate and ammonia, wastewater organic compounds, and fecal coliform bacteria. Bacteria ribotyping analysis was done on selected samples to identify possible specific animal sources. A decision process was developed to identify the possible sources of nitrate. First, nitrogen isotope ratios were used to define sources as animal, mixed animal and fertilizer, or fertilizer. Second, wastewater organic compound detections, nitrogen-isotope ratios, fecal coliform bacteria detections, and ribotyping were used to refine the identification of possible sources as LFMO waste, fertilizer, or unidentified animal or mixtures of these sources. Additional evidence provided by ribotyping and wastewater organic compound data can, in some cases, specifically indicate the animal source. Detections of three or more wastewater organic compounds that are indicators of animal sources and detections of fecal coliform bacteria provided additional evidence of an animal source. LMFO waste was designated as a possible source of nitrate in water from 10 wells. The source of waste in water from five of those wells was determined through ribotyping, and the source of waste in water from the remaining five wells was determined by detections of three or more animal-waste compounds in the well samples. LMFO waste in the water from wells with unidentified animal source of nitrate does not indicate that LMFO waste was not the source, but indicated that multiple animal sources, including LMFO waste, may be the source of the nitrate.
Bayesian source term determination with unknown covariance of measurements
NASA Astrophysics Data System (ADS)
Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav
2017-04-01
Determination of the source term of a release of a hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimating the source term in the conventional linear inverse problem, y = Mx, where the relationship between the vector of observations y and the unknown source term x is described by the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as minimization of the cost function (y - Mx)^T R^{-1} (y - Mx) + x^T B^{-1} x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization takes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix of the likelihood, R, is also unknown. We consider two potential choices for the structure of the matrix R: the first is a diagonal matrix and the second is a locally correlated structure using information on the topology of the measuring network. Since inference in the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated by applying the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
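The cost function above can be illustrated with a minimal numerical sketch. The code below is not the variational Bayes algorithm of the contribution; it is a simple alternating scheme, under the assumption of diagonal R and B, that solves the penalized least-squares problem for x and then re-estimates the diagonal variances from the residuals and from x itself.

```python
import numpy as np

def iterative_source_term(M, y, n_iter=50, eps=1e-12):
    """Estimate x in y = M x by minimizing
    (y - M x)^T R^{-1} (y - M x) + x^T B^{-1} x,
    re-estimating diagonal R and B from the current solution."""
    m, n = M.shape
    r = np.ones(m)          # diagonal of R (observation error variances)
    b = np.ones(n)          # diagonal of B (prior variances of the source term)
    x = np.zeros(n)
    for _ in range(n_iter):
        # solve the normal equations for the current R and B
        A = M.T @ (M / r[:, None]) + np.diag(1.0 / b)
        x = np.linalg.solve(A, M.T @ (y / r))
        # moment-style updates of the diagonal variances
        # (illustrative only, not the paper's variational Bayes updates)
        r = (y - M @ x) ** 2 + eps
        b = x ** 2 + eps
    return x, r, b

# toy example with a random SRS matrix and a sparse true source term
rng = np.random.default_rng(0)
M = rng.random((40, 10))
x_true = np.zeros(10); x_true[3] = 5.0
y = M @ x_true + 0.05 * rng.standard_normal(40)
x_hat, _, _ = iterative_source_term(M, y)
print(np.round(x_hat, 2))
```

The structure mirrors the abstract's setup: the data misfit weighted by R, the prior penalty weighted by B, and both variance matrices estimated rather than fixed a priori.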
NASA Astrophysics Data System (ADS)
Volpe, M.; Selva, J.; Tonini, R.; Romano, F.; Lorito, S.; Brizuela, B.; Argyroudis, S.; Salzano, E.; Piatanesi, A.
2016-12-01
Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) is a methodology to assess the exceedance probability for different thresholds of tsunami hazard intensity, at a specific site or region in a given time period, due to a seismic source. A large number of high-resolution inundation simulations is typically required to take into account the full variability of potential seismic sources and their slip distributions. Starting from regional SPTHA offshore results, the computational cost can be reduced by considering only a subset of 'important' scenarios for the inundation calculations. Here we use a method based on an event tree for the treatment of the seismic source aleatory variability, a cluster analysis on the offshore results to define the important sources, and epistemic uncertainty treatment through an ensemble modeling approach. We consider two target sites in the Mediterranean (Milazzo, Italy, and Thessaloniki, Greece) where coastal (non-nuclear) critical infrastructures (CIs) are located. After performing a regional SPTHA covering the whole Mediterranean, for each target site a few hundred representative scenarios are filtered out of all the potential seismic sources and the tsunami inundation is explicitly modeled, obtaining a site-specific SPTHA with a complete characterization of the tsunami hazard in terms of flow depth and velocity time histories. Moreover, we also explore the variability of SPTHA at the target site accounting for coseismic deformation (i.e. uplift or subsidence) due to near-field sources located in very shallow water. The results are suitable for, and will be applied to, subsequent multi-hazard risk analysis for the CIs. These applications have been developed in the framework of the Italian Flagship Project RITMARE, the EC FP7 ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389) projects, and the INGV-DPC Agreement.
Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Waithe, Kenrick A.
2005-01-01
A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparing with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared to the two-dimensional flat plate case, which used a steady mass flow boundary condition to simulate a steady micro jet. The model was also compared to two three-dimensional flat plate cases using a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet, and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
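The idea of representing a micro jet by adding its mass and momentum flux to the governing equations can be sketched as follows. This is not the OVERFLOW implementation; it is a schematic of how a jet with mass flow rate mdot and exit velocity v_jet might be distributed over the cells it occupies, and all variable names and the volume weighting are hypothetical.

```python
import numpy as np

def add_micro_jet_source(res_mass, res_mom, cell_volumes, jet_cells,
                         mdot, v_jet):
    """Add a steady micro-jet source term to the mass and momentum
    residuals of the cells covered by the jet orifice.

    res_mass, res_mom : arrays of mass and momentum residuals per cell
    cell_volumes      : cell volumes (m^3)
    jet_cells         : indices of the cells the jet is applied to
    mdot              : jet mass flow rate (kg/s)
    v_jet             : jet exit velocity vector (m/s), length 3
    """
    w = cell_volumes[jet_cells] / cell_volumes[jet_cells].sum()
    for k, cell in enumerate(jet_cells):
        res_mass[cell] += mdot * w[k] / cell_volumes[cell]
        res_mom[cell] += mdot * w[k] * np.asarray(v_jet) / cell_volumes[cell]
    return res_mass, res_mom

# toy usage on a 10-cell column, jet applied to two cells
vols = np.full(10, 1e-6)
rm = np.zeros(10)
rv = np.zeros((10, 3))
add_micro_jet_source(rm, rv, vols, [4, 5], mdot=1e-4, v_jet=(0.0, 50.0, 0.0))
print(rm[4], rv[4])
```

The advantage reflected in the abstract is visible here: the jet enters only through volumetric source terms, so no body-fitted jet grid is needed.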
Labarrière, Nathalie; Gervois, Nadine; Bonnin, Annabelle; Bouquié, Régis; Jotereau, Francine; Lang, François
2008-02-01
Choosing a reliable source of tumor-specific T lymphocytes and an efficient method to isolate these cells still remains a critical issue in adoptive cellular therapy (ACT). In this study, we assessed the capacity of MHC/peptide-based immunomagnetic sorting followed by polyclonal T cell expansion to derive pure, polyclonal and tumor-reactive Melan-A-specific T cell populations from melanoma patients' PBMC and TIL. We first demonstrated that this approach was extremely efficient and reproducible. We then used this procedure to compare PBMC- and TIL-derived cells from three melanoma patients in terms of avidity for the Melan-A A27L analog, Melan-A(26-35) and Melan-A(27-35), tumor reactivity (lysis and cytokine production), and repertoire. Regardless of their origin, i.e., fresh PBMC, peptide-stimulated PBMC or TIL, all sorted populations (from the three patients) were cytotoxic against HLA-A2+ melanoma cell lines expressing Melan-A. Although some variability in peptide avidity, lytic activity and cytokine production was observed between populations of different origins in a given patient, it differed from one patient to another and thus no correlation could be drawn between T cell source and reactivity. Analysis of Vbeta usage within the sorted populations showed the recurrence of the Vbeta3 and Vbeta14 subfamilies in the three patients but differences in the rest of the Melan-A repertoire. In addition, in two patients, we observed major repertoire differences between populations sorted from the three sources. We especially documented that in vitro peptide stimulation of PBMC, used to facilitate the sort by enriching for specific T lymphocytes, could significantly alter their repertoire and reactivity towards tumor cells. We conclude that PBMC, which are easily obtained from all melanoma patients, can be as good a source as TIL for deriving large amounts of tumor-reactive Melan-A-specific T cells with this selection/amplification procedure. However, the conditions of peptide stimulation should be improved to prevent a possible loss of reactive clonotypes.
Seltmann, Katja C.; Pénzes, Zsolt; Yoder, Matthew J.; Bertone, Matthew A.; Deans, Andrew R.
2013-01-01
Hymenoptera, the insect order that includes sawflies, bees, wasps, and ants, exhibits an incredible diversity of phenotypes, with over 145,000 species described in a corpus of textual knowledge since Carolus Linnaeus. In the absence of specialized training, often spanning decades, however, these articles can be challenging to decipher. Much of the vocabulary is domain-specific (e.g., Hymenoptera biology), historically without a comprehensive glossary, and contains much homonymous and synonymous terminology. The Hymenoptera Anatomy Ontology was developed to surmount this challenge and to aid future communication related to hymenopteran anatomy, as well as provide support for domain experts so they may actively benefit from the anatomy ontology development. As part of HAO development, an active learning, dictionary-based, natural language recognition tool was implemented to facilitate Hymenoptera anatomy term discovery in literature. We present this tool, referred to as the ‘Proofer’, as part of an iterative approach to growing phenotype-relevant ontologies, regardless of domain. The process of ontology development results in a critical mass of terms that is applied as a filter to the source collection of articles in order to reveal term occurrence and biases in natural language species descriptions. Our results indicate that taxonomists use domain-specific terminology that follows taxonomic specialization, particularly at superfamily and family level groupings and that the developed Proofer tool is effective for term discovery, facilitating ontology construction. PMID:23441153
Design of an Air Pollution Monitoring Campaign in Beijing for Application to Cohort Health Studies.
Vedal, Sverre; Han, Bin; Xu, Jia; Szpiro, Adam; Bai, Zhipeng
2017-12-15
No cohort studies in China on the health effects of long-term air pollution exposure have employed exposure estimates at the fine spatial scales desirable for cohort studies with individual-level health outcome data. Here we assess an array of modern air pollution exposure estimation approaches for assigning within-city exposure estimates in Beijing for individual pollutants and pollutant sources to individual members of a cohort. Issues considered in selecting specific monitoring data or new monitoring campaigns include: needed spatial resolution, exposure measurement error and its impact on health effect estimates, spatial alignment and compatibility with the cohort, and feasibility and expense. Sources of existing data largely include administrative monitoring data, predictions from air dispersion or chemical transport models and remote sensing (specifically satellite) data. New air monitoring campaigns include additional fixed site monitoring, snapshot monitoring, passive badge or micro-sensor saturation monitoring and mobile monitoring, as well as combinations of these. Each of these has relative advantages and disadvantages. It is concluded that a campaign in Beijing that at least includes a mobile monitoring component, when coupled with currently available spatio-temporal modeling methods, should be strongly considered. Such a campaign is economical and capable of providing the desired fine-scale spatial resolution for pollutants and sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David C.; Hacke, Peter L.; Kempe, Michael D.
2015-06-14
Reduced optical transmittance of encapsulation resulting from ultraviolet (UV) degradation has frequently been identified as a cause of decreased PV module performance through the life of installations in the field. The present module safety and qualification standards, however, apply short UV doses only capable of examining design robustness or 'infant mortality' failures. Essential information that might be used to screen encapsulation through product lifetime remains unknown. For example, the relative efficacy of xenon-arc and UVA-340 fluorescent sources or the typical range of activation energy for degradation is not quantified. We have conducted an interlaboratory experiment to provide the understanding that will be used towards developing a climate- and configuration-specific (UV) weathering test. Five representative, known formulations of EVA were studied in addition to one TPU material. Replicate laminated silica/polymer/silica specimens are being examined at 14 institutions using a variety of indoor chambers (including Xe, UVA-340, and metal-halide light sources) or field aging. The solar-weighted transmittance, yellowness index, and the UV cut-off wavelength, determined from the measured hemispherical transmittance, are examined to provide understanding and guidance for the UV light source (lamp type) and temperature used in accelerated UV aging tests. Index Terms -- reliability, durability, thermal activation.
Source apportionment of particulate organic matter using infrared spectra at multiple IMPROVE sites
NASA Astrophysics Data System (ADS)
Kuzmiakova, A.; Dillner, A. M.; Takahama, S.
2016-12-01
As organic aerosol is a dominant contributor to air pollution and radiative forcing in many regions of the United States, characterizing its composition and apportioning the organic mass to its major sources provides insight into atmospheric processes and guidance for decreasing its abundance. National networks, such as the Interagency Monitoring of Protected Visual Environments (IMPROVE) network, provide multi-site and multi-year particulate matter samples useful for evaluating sources over all four seasons. To this end, our study focuses on apportioning the particulate organic matter (OM) to specific anthropogenic and biological processes from year-long infrared aerosol measurements collected at six IMPROVE sites (five national park sites and one urban site) during 2011. Pooling these organic aerosol samples into one dataset, we apply factor and cluster analyses to extract four chemical factors (two dominated by processed emissions, one dominated by hydroxyl groups, and one by hydrocarbons) and ascribe each factor to a specific source depending on the site and season. We also present a method to characterize measurement uncertainty in infrared instrumental analysis and assess the sensitivity of the generated factors. In Phoenix (the urban site) we find that the majority (80-95%) of the OM originated from anthropogenic activities, such as traffic emissions, fossil fuel combustion (both all year long), and residential wood burning (fall to winter). Mineral dust emissions accounted for the rest of the OM (5-20%). At the national park sites the OM concentration was lower on average and consisted of marine and dust aerosols, summertime biomass burning and biogenic aerosols, processed fossil fuel combustion, and emissions from ships and oil refineries. Our study highlights the potential for further site-specific or multi-year aerosol characterization in the context of a long-term atmospheric sampling program to quantify sources of organic particles impacting air quality, aid in policy-making, and assess which (trans)formation mechanisms proposed in laboratory studies are consistent with observations.
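Factor-analytic apportionment of spectra into a small number of source factors can be sketched with a non-negative matrix factorization. This is not the authors' factor/cluster pipeline; it only illustrates the decomposition of a sample-by-wavenumber matrix into factor profiles and factor contributions, and the four-factor choice and the synthetic data are assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
# synthetic "infrared" data: 200 samples x 300 spectral channels,
# built from 4 hidden source profiles plus noise
true_profiles = rng.random((4, 300))
true_contrib = rng.gamma(shape=2.0, scale=1.0, size=(200, 4))
X = true_contrib @ true_profiles + 0.01 * rng.random((200, 300))

model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
contributions = model.fit_transform(X)   # factor contributions per sample
profiles = model.components_             # factor profiles (spectra)

# fraction of total signal attributed to each factor
mass_per_factor = (contributions * profiles.sum(axis=1)).sum(axis=0)
print(np.round(mass_per_factor / mass_per_factor.sum(), 2))
```

In practice the recovered profiles would then be inspected (as in the abstract) to label each factor by its dominant functional groups and likely source.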
The TEA-Carbon Dioxide Laser as a Means of Generating Ultrasound in Solids
NASA Astrophysics Data System (ADS)
Taylor, Gregory Stuart
1990-01-01
Available from UMI in association with The British Library. Requires signed TDF. The aim of this thesis is to characterise the interaction between pulsed, high-power, 10.6 μm radiation and solids. The work is considered both in the general context of laser generation of ultrasound and specifically to gain a deeper understanding of the interaction between a laser-supported plasma and a solid. The predominant experimental tools used are the homodyne Michelson interferometer and a range of electromagnetic acoustic transducers. To complement the ultrasonic data, various plasma inspection techniques, such as high-speed streak camera photography and reflection photometry, have been used to correlate the plasma properties with those of the ultrasonic transients. The work involving the characterisation of the interaction of a laser-supported plasma with a solid, which is based on previous experimental and theoretical analysis, gives an increased understanding of the plasma's ultrasonic generation mechanism. The ability to record the entire time history of the plasma-sample interaction yields information on the internal dynamics of the plasma growth and shock wave generation. The interaction of the radiation with a solid is characterised in both the plasma breakdown and non-breakdown regimes by a wide ultrasonic source. The variation in source diameter enables the transition from a point to a near-planar ultrasonic source to be studied. The resultant ultrasonic modifications are examined in terms of the wave structure and the directivity pattern. The wave structure is analysed in terms of existing wide-source, bulk wave theories and extended to consider the effects on surface and Lamb waves. The directivity patterns of the longitudinal and shear waves are analysed in terms of top-hat and non-uniform source profiles, giving additional insight into the radiation-solid interaction. The wide, one-dimensional source analysis is extended to a two-dimensional, extended ultrasonic source, generated on non-metals by the optical penetration of radiation within the target. The generation of ultrasound in both metals and non-metals using the CO_2 laser is shown to be an efficient process and may be employed almost totally non-destructively. Such a laser may therefore be used effectively on a far wider range of materials than those tested to date via laser generation, resulting in the increased suitability of the laser technique within the field of Non-Destructive Testing.
That's not how the learning works - the paradox of Reverse Innovation: a qualitative study.
Harris, Matthew; Weisberger, Emily; Silver, Diana; Dadwal, Viva; Macinko, James
2016-07-05
There are significant differences in the meaning and use of the term 'Reverse Innovation' between industry circles, where the term originated, and health policy circles where the term has gained traction. It is often conflated with other popularized terms such as Frugal Innovation, Co-development and Trickle-up Innovation. Compared to its use in the industrial sector, this conceptualization of Reverse Innovation describes a more complex, fragmented process, and one with no particular institution in charge. It follows that the way in which the term 'Reverse Innovation', specifically, is understood and used in the healthcare space is worthy of examination. Between September and December 2014, we conducted eleven in-depth face-to-face or telephone interviews with key informants from innovation, health and social policy circles, experts in international comparative policy research and leaders in the Reverse Innovation space in the United States. Interviews were open-ended with guiding probes into the barriers and enablers to Reverse Innovation in the US context, specifically also informants' experience and understanding of the term Reverse Innovation. Interviews were recorded, transcribed and analyzed thematically using the process of constant comparison. We describe three main themes derived from the interviews. First, 'Reverse Innovation,' the term, has marketing currency to convince policy-makers that may be wary of learning from or adopting innovations from unexpected sources, in this case Low-Income Countries. Second, the term can have the opposite effect - by connoting frugality, or innovation arising from necessity as opposed to good leadership, the proposed innovation may be associated with poor quality, undermining potential translation into other contexts. Finally, the term 'Reverse Innovation' is a paradox - it breaks down preconceptions of the directionality of knowledge and learning, whilst simultaneously reinforcing it. We conclude that this term means different things to different people and should be used strategically, and with some caution, depending on the audience.
NASA Astrophysics Data System (ADS)
Bonhoff, H. A.; Petersson, B. A. T.
2010-08-01
For the characterization of structure-borne sound sources with multi-point or continuous interfaces, substantial simplifications and physical insight can be obtained by incorporating the concept of interface mobilities. The applicability of interface mobilities, however, relies upon the admissibility of neglecting the so-called cross-order terms. Hence, the objective of the present paper is to clarify the importance and significance of cross-order terms for the characterization of vibrational sources. From previous studies, four conditions have been identified under which the cross-order terms can become more influential: non-circular interface geometries, structures with distinctly differing transfer paths, suppression of the zero-order motion, and cases where the contact forces are either in phase or out of phase. In a theoretical study, these four conditions are investigated regarding the frequency range and magnitude of a possible strengthening of the cross-order terms. For an experimental analysis, two source-receiver installations are selected, suitably designed to obtain strong cross-order terms. The transmitted power and the source descriptors are predicted by the approximations of the interface mobility approach and compared with the complete calculations. Neglecting the cross-order terms can result in large misinterpretations at certain frequencies. On average, however, the cross-order terms are found to be insignificant and can be neglected with good approximation. The general applicability of interface mobilities for structure-borne sound source characterization, and of the resulting description of the transmission process, is thereby confirmed.
Iterative weighting of multiblock data in the orthogonal partial least squares framework.
Boccard, Julien; Rutledge, Douglas N
2014-02-27
The integration of multiple data sources has emerged as a pivotal aspect of assessing complex systems comprehensively. This new paradigm requires the ability to separate common and redundant information from specific and complementary information during the joint analysis of several data blocks. However, inherent problems encountered when analysing single tables are amplified with the generation of multiblock datasets. Finding the relationships between data layers of increasing complexity therefore constitutes a challenging task. In the present work, an algorithm is proposed for the supervised analysis of multiblock data structures. It combines the interpretability of the orthogonal partial least squares (OPLS) framework with the ability of common component and specific weights analysis (CCSWA) to weight each data table individually in order to grasp its specificities and efficiently handle the different sources of Y-orthogonal variation. Three applications are proposed for illustration purposes. A first example refers to a quantitative structure-activity relationship study aiming to predict the binding affinity of flavonoids toward the P-glycoprotein based on physicochemical properties. A second application concerns the integration of several groups of sensory attributes for overall quality assessment of a series of red wines. A third case study highlights the ability of the method to combine very large heterogeneous data blocks from omics experiments in systems biology. Results were compared to the reference multiblock partial least squares (MBPLS) method to assess the performance of the proposed algorithm in terms of predictive ability and model interpretability. In all cases, ComDim-OPLS was demonstrated to be a relevant data mining strategy for the simultaneous analysis of multiblock structures, accounting for specific variation sources in each dataset and providing a balance between predictive and descriptive purposes. Copyright © 2014 Elsevier B.V. All rights reserved.
Integrating Information in Biological Ontologies and Molecular Networks to Infer Novel Terms.
Li, Le; Yip, Kevin Y
2016-12-15
Currently most terms and term-term relationships in Gene Ontology (GO) are defined manually, which creates cost, consistency and completeness issues. Recent studies have demonstrated the feasibility of inferring GO automatically from biological networks, which represents an important complementary approach to GO construction. These methods (NeXO and CliXO) are unsupervised, which means that 1) they cannot use the information contained in the existing GO, 2) the way they integrate biological networks may not optimize accuracy, and 3) they are not customized to infer the three different sub-ontologies of GO. Here we present a semi-supervised method called Unicorn that extends these previous methods to tackle the three problems. Unicorn uses a sub-tree of an existing GO sub-ontology as training data to learn the parameters for integrating multiple networks. Cross-validation results show that Unicorn reliably inferred the left-out parts of each specific GO sub-ontology. In addition, by training Unicorn with an old version of GO together with biological networks, it successfully re-discovered some terms and term-term relationships present only in a newer version of GO. Unicorn also successfully inferred some novel terms that were not contained in GO but have biological meanings well supported by the literature. Source code of Unicorn is available at http://yiplab.cse.cuhk.edu.hk/unicorn/.
Distributed and Collaborative Software Analysis
NASA Astrophysics Data System (ADS)
Ghezzi, Giacomo; Gall, Harald C.
Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barry, Kenneth
The Nuclear Energy Institute (NEI) Small Modular Reactor (SMR) Licensing Task Force (TF) has been evaluating licensing issues unique and important to iPWRs, ranking these issues, and developing NEI position papers for submittal to the U.S. Nuclear Regulatory Commission (NRC) during the past three years. Papers have been developed and submitted to the NRC in a range of areas including: Price-Anderson Act, NRC annual fees, security, modularity, and staffing. In December, 2012, NEI completed a draft position paper on SMR source terms and participated in an NRC public meeting presenting a summary of this paper, which was subsequently submitted to the NRC. One important conclusion of the source term paper was the evaluation and selection of high importance areas where additional research would have a significant impact on source terms. The highest ranked research area was iPWR containment aerosol natural deposition. The NRC accepts the use of existing aerosol deposition correlations in Regulatory Guide 1.183, but these were developed for large light water reactor (LWR) containments. Application of these correlations to an iPWR design has resulted in greater than a ten-fold reduction of containment airborne aerosol inventory as compared to large LWRs. Development and experimental justification of containment aerosol natural deposition correlations specifically for the unique iPWR containments is expected to result in a large reduction of design basis and beyond-design-basis accident source terms with concomitantly smaller dose to workers and the public. Therefore, NRC acceptance of iPWR containment aerosol natural deposition correlations will directly support the industry's goal of reducing the Emergency Planning Zone (EPZ) for SMRs. Based on the results in this work, it is clear that thermophoresis is relatively unimportant for iPWRs. Gravitational settling is well understood, and may be the dominant process for a dry environment. Diffusiophoresis and enhanced settling by particle growth are the dominant processes for determining DFs for expected conditions in an iPWR containment. These processes are dependent on the area-to-volume (A/V) ratio, which should benefit iPWR designs because these reactors have higher A/Vs compared to existing LWRs.
NASA Astrophysics Data System (ADS)
Nooshiri, N.; Saul, J.; Heimann, S.; Tilmann, F. J.; Dahm, T.
2015-12-01
The use of a 1D velocity model for seismic event location is often associated with significant travel-time residuals. Particularly for regional stations in subduction zones, where the velocity structure strongly deviates from the assumed 1D model, residuals of up to ±10 seconds are observed even for clear arrivals, which leads to strongly biased locations. In fact, due to mostly regional travel-time anomalies, arrival times at regional stations do not match the location obtained with teleseismic picks, and vice versa. If the earthquake is weak and only recorded regionally, or if fast locations based on regional stations are needed, the location may be far off the corresponding teleseismic location. In this case, implementation of travel-time corrections may lead to a reduction of the travel-time residuals at regional stations and, in consequence, significantly improve the relative location accuracy. Here, we have extended the source-specific station terms (SSST) technique to regional and teleseismic distances and adopted the algorithm for probabilistic, non-linear, global-search earthquake location. The method has been applied to specific test regions using P and pP phases from the GEOFON bulletin data for all available station networks. By using this method, a set of timing corrections has been calculated for each station, varying as a function of source position. In this way, an attempt is made to correct for the systematic errors introduced by limitations and inaccuracies in the assumed velocity structure, without solving for a new earth model itself. In this presentation, we draw on examples of the application of this global SSST technique to relocate earthquakes from the Tonga-Fiji subduction zone and from the Chilean margin. Our results show a considerable decrease of the root-mean-square (RMS) residual in the final earthquake location catalogs, a major reduction of the median absolute deviation (MAD) of the travel-time residuals at regional stations, and sharper images of the seismicity compared to the initial locations.
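A minimal sketch of the source-specific station term idea: for one station, the correction applied to a given event is a distance-weighted average of that station's travel-time residuals from neighbouring events. This is not the authors' implementation (which is coupled with probabilistic, non-linear relocation); the Gaussian weighting and the smoothing radius are illustrative assumptions.

```python
import numpy as np

def ssst_correction(event_xyz, residuals, target_xyz, radius_km=100.0):
    """Source-specific station term for one station and one target event:
    a Gaussian distance-weighted mean of this station's residuals at
    neighbouring events.

    event_xyz : (n_events, 3) Cartesian coordinates of events with picks
    residuals : (n_events,) travel-time residuals (s) at this station
    target_xyz: (3,) coordinates of the event being relocated
    """
    d = np.linalg.norm(event_xyz - target_xyz, axis=1)
    w = np.exp(-0.5 * (d / radius_km) ** 2)
    if w.sum() == 0:
        return 0.0
    return float(np.sum(w * residuals) / np.sum(w))

# toy usage: residuals that grow with source depth, evaluated for a deep event
rng = np.random.default_rng(2)
events = rng.uniform(0, 300, size=(500, 3))
res = 0.02 * events[:, 2] + 0.3 * rng.standard_normal(500)
print(round(ssst_correction(events, res, np.array([150.0, 150.0, 250.0])), 2))
```

The key property shown is that the correction varies smoothly with source position, so it absorbs path anomalies without requiring a new velocity model.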
Shi, Han-Chang; Song, Bao-Dong; Long, Feng; Zhou, Xiao-Hong; He, Miao; Lv, Qing; Yang, Hai-Yang
2013-05-07
The accelerated eutrophication of surface water sources and climate change have led to an annual occurrence of cyanobacterial blooms in many drinking water resources. To minimize the health risks to the public, cyanotoxin detection methods that are rapid, sensitive, real time, and high frequency must be established. In this study, an innovative automated online optical biosensing system (AOBS) was developed for the rapid detection and early warning of microcystin-LR (MC-LR), one of the most toxic cyanotoxins and one of the most frequently detected in environmental water. In this system, the capture molecule MC-LR-ovalbumin (MC-LR-OVA) was covalently immobilized onto a biochip surface. In an indirect competitive detection mode, samples containing different concentrations of MC-LR were premixed with a fixed concentration of fluorescence-labeled anti-MC-LR mAb, which binds MC-LR with high specificity. The sample mixture was then pumped onto the biochip surface; a higher concentration of MC-LR led to less fluorescence-labeled antibody bound to the biochip surface and thus to a lower fluorescence signal. The quantification range of MC-LR is 0.2 to 4 μg/L, with a detection limit of 0.09 μg/L. The high specificity and selectivity of the sensor were evaluated in terms of its response to a number of potentially interfering cyanotoxins. Potential interference of the environmental sample matrix was assessed with spiked samples, and the recovery of MC-LR ranged from 90 to 120% with relative standard deviation values <8%. The immunoassay performance of the AOBS was validated against conventional high-performance liquid chromatography, and the correlation between the methods agreed well (R² = 0.9762). This system has been successfully applied to long-term, continuous determination and early warning of MC-LR in Lake Tai from June 2011 to May 2012. Thus, the AOBS paves the way for a vital routine online analysis that satisfies the high demand for ensuring the safety of drinking water sources. The AOBS can also serve as an early warning system for accidental or intentional water pollution.
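The indirect competitive format described above yields a decreasing sigmoidal calibration curve (more MC-LR, less bound labeled antibody, lower fluorescence). A common way to quantify unknowns from such a curve is a four-parameter logistic fit; the sketch below, with made-up calibration data, is purely illustrative and is not the calibration procedure used by the AOBS.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(c, top, bottom, ic50, slope):
    """Four-parameter logistic: signal decreases with analyte concentration."""
    return bottom + (top - bottom) / (1.0 + (c / ic50) ** slope)

# made-up calibration standards (MC-LR in ug/L) and fluorescence readings
conc = np.array([0.05, 0.1, 0.2, 0.5, 1.0, 2.0, 4.0])
signal = np.array([980, 950, 870, 700, 520, 350, 230], dtype=float)

params, _ = curve_fit(four_pl, conc, signal, p0=[1000, 200, 0.8, 1.0])
top, bottom, ic50, slope = params

def invert(sig):
    """Back-calculate concentration from a measured signal."""
    return ic50 * ((top - bottom) / (sig - bottom) - 1.0) ** (1.0 / slope)

print(round(invert(600.0), 2))  # concentration estimate for an unknown sample
```

The inversion step is the part an online system would run continuously: each new fluorescence reading is mapped back onto the calibration curve to report a concentration.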
Henry, S B; Holzemer, W L; Reilly, C A; Campbell, K E
1994-01-01
OBJECTIVE: To analyze the terms used by nurses in a variety of data sources and to test the feasibility of using SNOMED III to represent nursing terms. DESIGN: Prospective research design with manual matching of terms to the SNOMED III vocabulary. MEASUREMENTS: The terms used by nurses to describe patient problems during 485 episodes of care for 201 patients hospitalized for Pneumocystis carinii pneumonia were identified. Problems from four data sources (nurse interview, intershift report, nursing care plan, and nurse progress note/flowsheet) were classified based on the substantive area of the problem and on the terminology used to describe the problem. A test subset of the 25 most frequently used terms from the two written data sources (nursing care plan and nurse progress note/flowsheet) were manually matched to SNOMED III terms to test the feasibility of using that existing vocabulary to represent nursing terms. RESULTS: Nurses most frequently described patient problems as signs/symptoms in the verbal nurse interview and intershift report. In the written data sources, problems were recorded as North American Nursing Diagnosis Association (NANDA) terms and signs/symptoms with similar frequencies. Of the nursing terms in the test subset, 69% were represented using one or more SNOMED III terms. PMID:7719788
Laser induced heat source distribution in bio-tissues
NASA Astrophysics Data System (ADS)
Li, Xiaoxia; Fan, Shifu; Zhao, Youquan
2006-09-01
During numerical simulation of laser-tissue thermal interaction, the light fluence rate distribution should be formulated and incorporated into the source term of the heat transfer equation. Usually the solution of the light radiative transport equation is given for extreme conditions such as full absorption (Lambert-Beer law), full scattering (Kubelka-Munk theory), or predominant scattering (diffusion approximation). In other conditions, these solutions will introduce different errors. The commonly used Monte Carlo simulation (MCS) is more universal and exact but has difficulty dealing with dynamic parameters and fast simulation. Its area partition pattern also has limits when applying FEM (finite element method) to solve the bio-heat transfer partial differential equation. Laser heat source plots produced by the above methods differ considerably from MCS. In order to solve this problem, by analyzing the effects of different optical processes such as reflection, scattering and absorption on the laser-induced heat generation in bio-tissue, a new approach was developed that combines a modified beam-broadening model with the diffusion approximation model. First, the scattering coefficient was replaced by the reduced scattering coefficient in the beam-broadening model, which is more reasonable when scattering is treated as anisotropic. Second, the attenuation coefficient was replaced by the effective attenuation coefficient in scattering-dominated turbid bio-tissue. The computation results of the modified method were compared with Monte Carlo simulation and showed that the model provides more reasonable predictions of the heat source term distribution than previous methods. Such research is useful for explaining the physical characteristics of the heat source in the heat transfer equation, establishing an effective photo-thermal model, and providing a theoretical reference for related laser medicine experiments.
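A minimal sketch of the kind of heat source term discussed above: the volumetric heat source is taken as the absorption coefficient times the local fluence rate, with a Gaussian surface beam broadened with depth and attenuated by an effective attenuation coefficient built from the absorption and reduced scattering coefficients. The optical parameters and the broadening law below are assumptions for illustration, not the values or exact model of the paper.

```python
import numpy as np

# assumed tissue optical properties (1/mm)
MU_A = 0.03                      # absorption coefficient
MU_S_PRIME = 1.0                 # reduced scattering coefficient
MU_EFF = np.sqrt(3.0 * MU_A * (MU_A + MU_S_PRIME))  # effective attenuation

def heat_source(r, z, power_w=1.0, beam_radius_mm=1.0):
    """Volumetric heat source S(r, z) = mu_a * fluence(r, z) in W/mm^3,
    using a depth-broadened Gaussian beam and exponential attenuation."""
    w_z = beam_radius_mm + z / (3.0 * (MU_A + MU_S_PRIME))  # illustrative broadening
    fluence = (power_w / (np.pi * w_z ** 2)) * np.exp(-(r / w_z) ** 2) \
              * np.exp(-MU_EFF * z)
    return MU_A * fluence

r = np.linspace(0, 3, 4)   # radial positions (mm)
z = np.linspace(0, 5, 6)   # depths (mm)
R, Z = np.meshgrid(r, z)
print(np.round(heat_source(R, Z), 5))
```

An analytic source field like this can be evaluated directly at FEM integration points, which is the practical advantage over a Monte Carlo grid that the abstract points to.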
NASA Astrophysics Data System (ADS)
Saheer, Sahana; Pathak, Amey; Mathew, Roxy; Ghosh, Subimal
2016-04-01
Simulation of the Indian Summer Monsoon (ISM), with its seasonal and subseasonal characteristics, is highly crucial for predictions/projections towards sustainable agricultural planning and water resources management. The Climate Forecast System version 2 (CFSv2), the state-of-the-art coupled climate model developed by the National Centers for Environmental Prediction (NCEP), is evaluated here for its simulation of the ISM. Even though CFSv2 is a fully coupled ocean-atmosphere-land model with advanced physics, increased resolution and refined initialization, its ISM simulations/predictions/projections, in terms of seasonal mean and variability, are not satisfactory. Numerous studies have verified the CFSv2 forecasts in terms of the seasonal mean and its variability, active and break spells, and El Nino Southern Oscillation (ENSO)-monsoon interactions. Underestimation of JJAS precipitation over the Indian land mass is one of the major drawbacks of CFSv2. The ISM gets the moisture required to maintain the precipitation from different oceanic and land sources. In this work, we determine the fraction of moisture supplied by different sources in the CFSv2 simulations and compare the findings with observed fractions. We also investigate the possible variations in the moisture contributions from these different sources. We suspect that the deviation in the relative moisture contribution from different sources to various sinks over the monsoon region has resulted in the observed dry bias. We also find that over the Arabian Sea region, which is the key moisture source of the ISM, there is a premature build-up of specific humidity during the month of May and a decline during the later months of JJAS. This is also one of the reasons for the underestimation of JJAS mean precipitation.
Genes2WordCloud: a quick way to identify biological themes from gene lists and free text.
Baroukh, Caroline; Jenkins, Sherry L; Dannenfelser, Ruth; Ma'ayan, Avi
2011-10-13
Word-clouds recently emerged on the web as a solution for quickly summarizing text by maximizing the display of most relevant terms about a specific topic in the minimum amount of space. As biologists are faced with the daunting amount of new research data commonly presented in textual formats, word-clouds can be used to summarize and represent biological and/or biomedical content for various applications. Genes2WordCloud is a web application that enables users to quickly identify biological themes from gene lists and research relevant text by constructing and displaying word-clouds. It provides users with several different options and ideas for the sources that can be used to generate a word-cloud. Different options for rendering and coloring the word-clouds give users the flexibility to quickly generate customized word-clouds of their choice. Genes2WordCloud is a word-cloud generator and a word-cloud viewer that is based on WordCram implemented using Java, Processing, AJAX, mySQL, and PHP. Text is fetched from several sources and then processed to extract the most relevant terms with their computed weights based on word frequencies. Genes2WordCloud is freely available for use online; it is open source software and is available for installation on any web-site along with supporting documentation at http://www.maayanlab.net/G2W. Genes2WordCloud provides a useful way to summarize and visualize large amounts of textual biological data or to find biological themes from several different sources. The open source availability of the software enables users to implement customized word-clouds on their own web-sites and desktop applications.
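The term-weighting step described above (extracting relevant terms with frequency-based weights) can be sketched in a few lines. This is not the Genes2WordCloud PHP/Java implementation; it is a minimal frequency counter, and the stop-word list and example text are illustrative assumptions.

```python
import re
from collections import Counter

STOP_WORDS = {"the", "and", "of", "in", "to", "a", "is", "for", "with", "that"}

def term_weights(text, top_n=10):
    """Return the top_n most frequent non-stop-word terms and their counts,
    which could be used as word sizes in a word-cloud renderer."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(t for t in tokens if t not in STOP_WORDS and len(t) > 2)
    return counts.most_common(top_n)

abstract = ("Heart failure is a syndrome in which the heart cannot pump enough "
            "blood; common signs include dyspnea, fatigue and edema, and common "
            "causes include hypertension and coronary artery disease.")
print(term_weights(abstract, top_n=5))
```

The weights returned by such a counter are what a word-cloud layout engine (WordCram, in the application described) scales the rendered words by.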
NASA Astrophysics Data System (ADS)
Blaen, Phillip; Khamis, Kieran; Lloyd, Charlotte; Krause, Stefan
2017-04-01
At the river catchment scale, storm events can drive highly variable behaviour in nutrient and water fluxes, yet short-term dynamics are frequently missed by low resolution sampling regimes. In addition, nutrient source contributions can vary significantly within and between storm events. Our inability to identify and characterise time-dynamic source zone contributions severely hampers the adequate design of land use management practices to control nutrient exports from agricultural landscapes. Here, we utilise an 8-month high-frequency (hourly) time series of streamflow, nitrate concentration (NO3) and fluorescent dissolved organic matter concentration (FDOM) derived from optical in-situ sensors located in a headwater agricultural catchment. We characterised variability in flow and nutrient dynamics across 29 storm events. Storm events represented 31% of the time series and contributed disproportionately to nutrient loads (43% of NO3 and 36% of FDOM) relative to their duration. Principal components analysis of potential hydroclimatological controls on nutrient fluxes demonstrated that a small number of components, representing >90% of variance in the dataset, were highly significant model predictors of inter-event variability in catchment nutrient export. Hysteresis analysis of nutrient concentration-discharge relationships suggested spatially discrete source zones existed for NO3 and FDOM, and that activation of these zones varied on an event-specific basis. Our results highlight the benefits of high-frequency in-situ monitoring for characterising complex short-term nutrient dynamics and unravelling connections between hydroclimatological variability and river nutrient export and source zone activation under extreme flow conditions. These new process-based insights are fundamental to underpinning the development of targeted management measures to reduce nutrient loading of surface waters.
Assessment of seismic hazard in the North Caucasus
NASA Astrophysics Data System (ADS)
Ulomov, V. I.; Danilova, T. I.; Medvedeva, N. S.; Polyakova, T. P.; Shumilina, L. S.
2007-07-01
The seismicity of the North Caucasus is the highest in the European part of Russia. The detection of potential seismic sources here and long-term prediction of earthquakes are extremely important for the assessment of seismic hazard and seismic risk in this densely populated and industrially developed region of the country. The seismogenic structures of the Iran-Caucasus-Anatolia and Central Asia regions, adjacent to European Russia, are the subjects of this study. These structures are responsible for the specific features of regional seismicity and for the geodynamic interaction with adjacent areas of the Scythian and Turan platforms. The most probable potential sources of earthquakes with magnitudes M = 7.0 ± 0.2 and 7.5 ± 0.2 in the North Caucasus are located. The possible macroseismic effect of one of them is assessed.
Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil
NASA Astrophysics Data System (ADS)
de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.
2018-05-01
A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.
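The role of the logic tree can be illustrated by how branch results are combined: each branch (a particular source model and ground-motion model) yields a hazard curve, and the mean hazard is the weight-averaged exceedance probability across branches. The branch curves, weights and ground-motion levels below are invented for illustration and are not results from this study.

```python
import numpy as np

# ground-motion levels (e.g. PGA in g) at which hazard is evaluated
levels = np.array([0.01, 0.02, 0.05, 0.1, 0.2, 0.5])

# illustrative annual exceedance probabilities from three logic-tree branches
branch_curves = np.array([
    [1e-2, 5e-3, 1e-3, 3e-4, 6e-5, 5e-6],
    [2e-2, 8e-3, 2e-3, 5e-4, 1e-4, 8e-6],
    [8e-3, 3e-3, 7e-4, 2e-4, 3e-5, 2e-6],
])
branch_weights = np.array([0.5, 0.3, 0.2])   # must sum to 1

mean_hazard = branch_weights @ branch_curves              # weighted mean curve
# unweighted fractiles across branches, as a rough measure of epistemic spread
p16, p84 = np.percentile(branch_curves, [16, 84], axis=0)

for g, h in zip(levels, mean_hazard):
    print(f"PGA {g:>5.2f} g : mean annual exceedance {h:.1e}")
```

Keeping the number of branches small, as the abstract emphasises, keeps this combination step cheap while still producing a spread of curves around the mean.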
A Semi-implicit Treatment of Porous Media in Steady-State CFD.
Domaingo, Andreas; Langmayr, Daniel; Somogyi, Bence; Almbauer, Raimund
There are many situations in computational fluid dynamics which require the definition of source terms in the Navier-Stokes equations. These source terms not only allow the physics of interest to be modelled but also have a strong impact on the reliability, stability, and convergence of the numerics involved. Therefore, sophisticated numerical approaches exist for the description of such source terms. In this paper, we focus on the source terms present in the Navier-Stokes or Euler equations due to porous media, in particular the Darcy-Forchheimer equation. We introduce a method for the numerical treatment of the source term which is independent of the spatial discretization and based on linearization. In this description, the source term is treated in a fully implicit way whereas the other flow variables can be computed in an implicit or explicit manner. This leads to a more robust description in comparison with a fully explicit approach. The method is well suited to be combined with coarse-grid CFD on Cartesian grids, which makes it especially favorable for the accelerated solution of coupled 1D-3D problems. To demonstrate the applicability and robustness of the proposed method, a proof-of-concept example in 1D, as well as more complex examples in 2D and 3D, is presented.
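A minimal sketch of a linearized, implicit treatment of a Darcy-Forchheimer source term, under assumed values for permeability and the Forchheimer coefficient: the momentum source S(u) = -(mu/K) u - (rho*Cf/sqrt(K)) |u| u is evaluated with |u| frozen at the old time level, so the new velocity appears linearly and can be updated implicitly. This is a schematic of the general idea, not the paper's coarse-grid CFD implementation.

```python
import numpy as np

RHO = 1.2        # fluid density (kg/m^3), assumed
MU = 1.8e-5      # dynamic viscosity (Pa s), assumed
K = 1e-7         # permeability (m^2), assumed
CF = 0.1         # Forchheimer coefficient (-), assumed

def implicit_porous_update(u_old, dt, dpdx):
    """One semi-implicit step of rho du/dt = -dp/dx + S(u) with the
    Darcy-Forchheimer source linearized about the old velocity:
    S(u_new) ~= -(mu/K + rho*CF/sqrt(K)*|u_old|) * u_new."""
    a = MU / K + RHO * CF / np.sqrt(K) * np.abs(u_old)   # linearized drag
    # (rho/dt + a) * u_new = rho/dt * u_old - dp/dx
    return (RHO / dt * u_old - dpdx) / (RHO / dt + a)

# toy usage: velocity in a porous channel relaxing under a constant pressure gradient
u = 1.0
for _ in range(20):
    u = implicit_porous_update(u, dt=1e-3, dpdx=-50.0)
print(round(u, 6))
```

Because the drag coefficient a appears in the denominator rather than as an explicit sink, large source terms damp the update instead of destabilizing it, which is the robustness benefit the abstract refers to.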
Kuang, Yuan-wen; Zhou, Guo-yi; Wen, Da-zhi; Li, Jiong; Sun, Fang-fang
2011-09-01
Concentrations of polycyclic aromatic hydrocarbons (PAHs) were examined and potential sources of PAHs were identified from the dated tree-rings of Masson pine (Pinus massoniana L.) near two industrial sites (Danshuikeng, DSK and Xiqiaoshan, XQS) in the Pearl River Delta of south China. Total concentrations of PAHs (∑PAHs) showed similar temporal trends in the tree-rings at both sites, suggesting that tree-rings recorded the historical variation in atmospheric PAHs. The differences in individual PAHs and in ∑PAHs detected in the tree-rings between the two sites reflected the historical differences in airborne PAHs. Regional changes in industrial activities might contribute to the site-specific and period-specific patterns of the tree-ring PAHs. The diagnostic PAH ratios Ant/(Ant + PA), FL/(FL + Pyr), and BaA/(BaA + Chr) revealed that PAHs in the tree-rings at both sites mainly stemmed from combustion processes (pyrogenic sources). Principal component analysis further confirmed that wood burning, coal combustion, and diesel- and gasoline-powered vehicular emissions were the dominant contributors to PAH sources at DSK, while diesel combustion, gasoline and natural gas combustion, and incomplete coal combustion were responsible for the main origins of PAHs at XQS. Tree-ring analysis of PAHs was indicative of PAHs from a mixture of combustion sources, thus minimizing the bias of short-term active air sampling.
Malley, Christopher S; Heal, Mathew R; Braban, Christine F; Kentisbeer, John; Leeson, Sarah R; Malcolm, Heath; Lingard, Justin J N; Ritchie, Stuart; Maggs, Richard; Beccaceci, Sonya; Quincey, Paul; Brown, Richard J C; Twigg, Marsailidh M
2016-10-01
Human health burdens associated with long-term exposure to particulate matter (PM) are substantial. The metrics currently recommended by the World Health Organization for quantification of long-term health-relevant PM are the annual average PM10 and PM2.5 mass concentrations, with no low concentration threshold. However, within an annual average, there is substantial variation in the composition of PM associated with different sources. To inform effective mitigation strategies, therefore, it is necessary to quantify the conditions that contribute to annual average PM10 and PM2.5 (rather than just short-term episodic concentrations). PM10, PM2.5, and speciated water-soluble inorganic, carbonaceous, heavy metal and polycyclic aromatic hydrocarbon components are concurrently measured at the two UK European Monitoring and Evaluation Programme (EMEP) 'supersites' at Harwell (SE England) and Auchencorth Moss (SE Scotland). In this work, statistical analyses of these measurements are integrated with air-mass back trajectory data to characterise the 'chemical climate' associated with the long-term health-relevant PM metrics at these sites. Specifically, the contributions from different PM concentrations, months, components and geographic regions are detailed. The analyses at these sites provide policy-relevant conclusions on mitigation of (i) long-term health-relevant PM in the spatial domain for which these sites are representative, and (ii) the contribution of regional background PM to long-term health-relevant PM. At Harwell the mean (±1 sd) 2010-2013 annual average concentrations were PM10 = 16.4 ± 1.4 μg m⁻³ and PM2.5 = 11.9 ± 1.1 μg m⁻³, and at Auchencorth PM10 = 7.4 ± 0.4 μg m⁻³ and PM2.5 = 4.1 ± 0.2 μg m⁻³. The chemical climate state at each site showed that frequent, moderate hourly PM10 and PM2.5 concentrations (defined as approximately 5-15 μg m⁻³ for PM10 and PM2.5 at Harwell and 5-10 μg m⁻³ for PM10 at Auchencorth) determined the magnitude of annual average PM10 and PM2.5 to a greater extent than the relatively infrequent high, episodic PM10 and PM2.5 concentrations. These moderate PM10 and PM2.5 concentrations were derived across the range of chemical components, seasons and air-mass pathways, in contrast to the highest PM concentrations which tended to associate with specific conditions. For example, the largest contribution to moderate PM10 and PM2.5 concentrations - the secondary inorganic aerosol components, specifically NO3⁻ - were accumulated during the arrival of trajectories traversing the spectrum of marine, UK, and continental Europe areas. Mitigation of the long-term health-relevant PM impact in the regions characterised by these two sites requires multilateral action, across species (and hence source sectors), both nationally and internationally; there is no dominant determinant of the long-term PM metrics to target. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Lessons for pediatric anesthesia from audit and incident reporting.
Bell, Graham
2011-07-01
This review will attempt to put the various systems that allow clinicians to assess errors, omissions, or avoidable incidents into context and where possible, look for areas that deserve more or less attention and resource specifically for those of us who practice pediatric anesthesia. Different approaches will be contrasted with respect to their outputs in terms of positive impact on the practice of anesthesia. These approaches include audits by governmental organizations, national representative bodies, specialist societies, commissioned boards of inquiry, medicolegal sources, and police force investigations. Implementation strategies are considered alongside the reports as the reports cannot be considered end points themselves. Specific areas where pediatric anesthetics has failed to address recurring risk through any currently available tools will be highlighted. © 2011 Blackwell Publishing Ltd.
NASA Astrophysics Data System (ADS)
Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne
2014-01-01
Inverse modelling techniques can be used to estimate the amount of radionuclides and the temporal profile of the source term released into the atmosphere during the accident of the Fukushima Daiichi nuclear power plant in March 2011. In Winiarek et al. (2012b), the lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques, using activity concentration measurements. The importance of an objective assessment of prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such a critical context, where the meteorological conditions can make the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, the retrieved source term being very sensitive to this estimation. We propose to extend the use of these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to compute an estimate of the caesium-137 source term jointly using all available data about this radionuclide, such as activity concentrations in the air, but also daily fallout measurements and total cumulated fallout measurements. It is crucial to properly and simultaneously estimate the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition to reliably estimate the a posteriori uncertainty of the estimated source term. Using such techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6-19.3 PBq with an estimated standard deviation range of 15-20% depending on the method and the data sets. The “blind” time intervals of the source term have also been strongly mitigated compared to the first estimations with only activity concentration data.
Park, Eun Sug; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford
2015-06-01
A major difficulty with assessing source-specific health effects is that source-specific exposures cannot be measured directly; rather, they need to be estimated by a source-apportionment method such as multivariate receptor modeling. The uncertainty in source apportionment (uncertainty in source-specific exposure estimates and model uncertainty due to the unknown number of sources and identifiability conditions) has been largely ignored in previous studies. Also, spatial dependence of multipollutant data collected from multiple monitoring sites has not yet been incorporated into multivariate receptor modeling. The objectives of this project are (1) to develop a multipollutant approach that incorporates both sources of uncertainty in source-apportionment into the assessment of source-specific health effects and (2) to develop enhanced multivariate receptor models that can account for spatial correlations in the multipollutant data collected from multiple sites. We employed a Bayesian hierarchical modeling framework consisting of multivariate receptor models, health-effects models, and a hierarchical model on latent source contributions. For the health model, we focused on the time-series design in this project. Each combination of number of sources and identifiability conditions (additional constraints on model parameters) defines a different model. We built a set of plausible models with extensive exploratory data analyses and with information from previous studies, and then computed posterior model probability to estimate model uncertainty. Parameter estimation and model uncertainty estimation were implemented simultaneously by Markov chain Monte Carlo (MCMC*) methods. We validated the methods using simulated data. We illustrated the methods using PM2.5 (particulate matter ≤ 2.5 μm in aerodynamic diameter) speciation data and mortality data from Phoenix, Arizona, and Houston, Texas. The Phoenix data included counts of cardiovascular deaths and daily PM2.5 speciation data from 1995-1997. The Houston data included respiratory mortality data and 24-hour PM2.5 speciation data sampled every six days from a region near the Houston Ship Channel in years 2002-2005. We also developed a Bayesian spatial multivariate receptor modeling approach that, while simultaneously dealing with the unknown number of sources and identifiability conditions, incorporated spatial correlations in the multipollutant data collected from multiple sites into the estimation of source profiles and contributions based on the discrete process convolution model for multivariate spatial processes. This new modeling approach was applied to 24-hour ambient air concentrations of 17 volatile organic compounds (VOCs) measured at nine monitoring sites in Harris County, Texas, during years 2000 to 2005. Simulation results indicated that our methods were accurate in identifying the true model and estimated parameters were close to the true values. The results from our methods agreed in general with previous studies on the source apportionment of the Phoenix data in terms of estimated source profiles and contributions. However, we had a greater number of statistically insignificant findings, which was likely a natural consequence of incorporating uncertainty in the estimated source contributions into the health-effects parameter estimation. 
For the Houston data, a model with five sources (that seemed to be Sulfate-Rich Secondary Aerosol, Motor Vehicles, Industrial Combustion, Soil/Crustal Matter, and Sea Salt) showed the highest posterior model probability among the candidate models considered when fitted simultaneously to the PM2.5 and mortality data. There was a statistically significant positive association between respiratory mortality and same-day PM2.5 concentrations attributed to one of the sources (probably industrial combustion). The Bayesian spatial multivariate receptor modeling approach applied to the VOC data led to the highest posterior model probability for a model with five sources (that seemed to be refinery, petrochemical production, gasoline evaporation, natural gas, and vehicular exhaust) among several candidate models, with the number of sources varying between three and seven and with different identifiability conditions. Our multipollutant approach assessing source-specific health effects is more advantageous than a single-pollutant approach in that it can estimate total health effects from multiple pollutants and can also identify emission sources that are responsible for adverse health effects. Our Bayesian approach can incorporate not only uncertainty in the estimated source contributions, but also model uncertainty that has not been addressed in previous studies on assessing source-specific health effects. The new Bayesian spatial multivariate receptor modeling approach enables predictions of source contributions at unmonitored sites, minimizing exposure misclassification and providing improved exposure estimates along with their uncertainty estimates, as well as accounting for uncertainty in the number of sources and identifiability conditions.
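One point above, that propagating uncertainty in the estimated source contributions tends to widen health-effect confidence intervals, can be caricatured with a simple Monte Carlo around a Poisson time-series regression. This is not the study's joint Bayesian hierarchical model; the synthetic data, the lognormal apportionment error and the perturbation scheme are all assumptions made for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)

# Synthetic daily series: "true" source contribution, a noisy apportionment
# estimate of it, and mortality counts generated from the true contribution.
n_days = 1000
true_contrib = rng.gamma(3.0, 2.0, n_days)                     # e.g. ug/m3 from one source
beta_true = 0.01                                               # log-rate increase per unit
deaths = rng.poisson(np.exp(np.log(5.0) + beta_true * true_contrib))
est_contrib = true_contrib * rng.lognormal(0.0, 0.2, n_days)   # apportionment error

def fit_beta(exposure):
    X = sm.add_constant(exposure)
    return sm.GLM(deaths, X, family=sm.families.Poisson()).fit().params[1]

# Naive fit that ignores apportionment uncertainty
beta_naive = fit_beta(est_contrib)

# Crude Monte Carlo: refit under repeated perturbations of the estimated contributions
betas = [fit_beta(est_contrib * rng.lognormal(0.0, 0.2, n_days)) for _ in range(200)]
print(f"naive beta: {beta_naive:.4f}")
print(f"beta spread when apportionment error is propagated: "
      f"{np.percentile(betas, 2.5):.4f} to {np.percentile(betas, 97.5):.4f}")
```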
Smith, Joseph P; Oktay, Sarah I; Kada, John; Olsen, Curtis R
2008-08-01
The short-lived, fission-produced radioisotope, 131I (t1/2 = 8.04 days), was detected in wastewater, surficial sediment, and suspended particulate matter (SPM) samples collected from New York Harbor (NYH) between 2001 and 2002. Iodine-131 is used as a radiopharmaceutical for medical imaging, diagnostics, and treatments for conditions of the thyroid. It is introduced into the municipal waste stream by medical facilities and patients and is subsequently released into the estuary via wastewater effluent. Measured 131I activities in surface sediments were correlated with those of 7Be (t1/2 = 53.2 days), a naturally occurring radioisotope that is widely used to quantify particle dynamics, sediment focusing, and short-term sediment deposition and accumulation in aquatic systems. Surficial sediment 131I activities were also compared with measured trace metal (Cu, Pb) and organic carbon (OC(sed)) concentrations which can be linked to wastewater inputs. These preliminary results from NYH introduce 131I as a potentially valuable source-specific, short-lived biogeochemical tracer (timescales < 1 month) for particles, sediments, and wastewater-sourced contaminants in urbanized aquatic systems.
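Because 131I decays quickly relative to sample handling times, measured activities are normally decay-corrected back to the collection date; a minimal example of that correction, with an invented activity value, is:

```python
import math

T_HALF_I131 = 8.04          # days, as quoted above
LAMBDA = math.log(2) / T_HALF_I131

def decay_correct(measured_activity, days_since_collection):
    """Back-correct a measured activity to the collection date: A0 = A * exp(lambda * t)."""
    return measured_activity * math.exp(LAMBDA * days_since_collection)

# Hypothetical sample counted 5 days after collection (illustrative units)
print(decay_correct(12.0, 5.0))   # ~18.5
```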
Long Term 2 Second Round Source Water Monitoring and Bin Placement Memo
The Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) applies to all public water systems served by a surface water source or public water systems served by a ground water source under the direct influence of surface water.
Dietary protein intake and quality in early life: impact on growth and obesity.
Lind, Mads V; Larnkjær, Anni; Mølgaard, Christian; Michaelsen, Kim F
2017-01-01
Obesity is an increasing problem and high-protein intake early in life seems to increase later risk of obesity. This review summarizes recent publications in the area including observational and intervention studies and publications on underlying mechanisms. Recent observational and randomized controlled trials confirmed that high-protein intake in early life seems to increase early weight gain and the risk of later overweight and obesity. Recent studies have looked at the effect of different sources of protein, and especially high-animal protein intake seems to have an effect on obesity. Specific amino acids, such as leucine, have also been implicated in increasing later obesity risk maybe via specific actions on insulin-like growth factor I. Furthermore, additional underlying mechanisms including epigenetics have been linked to long-term obesogenic programming. Finally, infants with catch-up growth or specific genotypes might be particularly vulnerable to high-protein intake. Recent studies confirm the associations between high-protein intake during the first 2 years and later obesity. Furthermore, knowledge of the mechanisms involved and the role of different dietary protein sources and amino acids has increased, but intervention studies are needed to confirm the mechanisms. Avoiding high-protein intake in early life holds promise as a preventive strategy for childhood obesity.
NASA Astrophysics Data System (ADS)
Wong, J. C.; Williams, D.
2009-05-01
Detrital energy in temperate headwater streams is mainly derived from the annual input of leaf litter from the surrounding landscape. Presumably, its decomposition and other sources of autochthonous organic matter will change dissolved organic carbon (DOC) concentrations and dissolved organic matter (DOM) quality. To investigate this, DOM was leached from two allochthonous sources: white birch (Betula papyrifera) and white cedar (Thuja occidentalis); and one autochthonous source, streambed biofilm, for a period of 7 days on 3 separate occasions in fall 2007. As a second treatment, microorganisms from the water column were filtered out. Deciduous leaf litter was responsible for high, short-term increases to DOC concentrations whereas the amounts leached from conifer needles were relatively constant in each month. Using UV spectroscopy, changes to DOM characteristics like aromaticity, spectral slopes, and molecular weight were mainly determined by source and indicated a preferential use of the labile DOM pool by the microorganisms. Excitation-emission matrices (EEMs) collected using fluorescence spectroscopy suggested that cedar litter was an important source of protein-like fluorescence and that the nature of the fluorescing DOM components changed in the presence of microorganisms. This study demonstrates that simultaneous examination of DOC concentrations and DOM quality will allow a better understanding of the carbon dynamics that connect terrestrial with aquatic ecosystems.
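The spectral-slope characterization mentioned above is commonly obtained by fitting an exponential decline of absorption with wavelength over a fixed window (e.g. 275-295 nm); the short sketch below does this by linear regression on log-transformed, synthetic absorbance values and is a generic illustration of the method, not this study's processing.

```python
import numpy as np

# Hypothetical absorption coefficients (m^-1) over 275-295 nm
wavelengths = np.arange(275, 296)                 # nm
a_ref, S_true = 20.0, 0.018                       # reference absorption and slope (nm^-1)
noise = 1 + np.random.default_rng(2).normal(0, 0.01, wavelengths.size)
absorbance = a_ref * np.exp(-S_true * (wavelengths - 275)) * noise

# Fit ln(a) = ln(a_ref) - S * (lambda - lambda0); the spectral slope S is minus the regression slope
coeffs = np.polyfit(wavelengths - 275, np.log(absorbance), 1)
S_fit = -coeffs[0]
print(f"fitted spectral slope: {S_fit:.4f} nm^-1")
```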
Integrating new Storage Technologies into EOS
NASA Astrophysics Data System (ADS)
Peters, Andreas J.; van der Ster, Dan C.; Rocha, Joaquim; Lensing, Paul
2015-12-01
The EOS[1] storage software was designed to cover CERN disk-only storage use cases in the medium term, trading scalability against latency. To cover and prepare for long-term requirements the CERN IT data and storage services group (DSS) is actively conducting R&D and open source contributions to experiment with a next generation storage software based on CEPH[3] and ethernet enabled disk drives. CEPH provides a scale-out object storage system RADOS and additionally various optional high-level services like S3 gateway, RADOS block devices and a POSIX compliant file system CephFS. The acquisition of CEPH by Redhat underlines the promising role of CEPH as the open source storage platform of the future. CERN IT is running a CEPH service in the context of OpenStack on a moderate scale of 1 PB replicated storage. Building a 100+PB storage system based on CEPH will require software and hardware tuning. It is of capital importance to demonstrate the feasibility and possibly iron out bottlenecks and blocking issues beforehand. The main idea behind this R&D is to leverage and contribute to existing building blocks in the CEPH storage stack and implement a few CERN specific requirements in a thin, customisable storage layer. A second research topic is the integration of ethernet enabled disks. This paper introduces various ongoing open source developments, their status and applicability.
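At the lowest level, a thin CERN-specific layer over RADOS of the kind described would issue plain object operations; a minimal sketch using the python-rados bindings is shown below. The configuration file path, pool name, object name and attribute are placeholders, and a reachable Ceph cluster with that pool is assumed.

```python
import rados

# Connect to the cluster (paths and pool are illustrative placeholders)
cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
cluster.connect()
try:
    ioctx = cluster.open_ioctx("test-pool")      # pool must already exist
    try:
        ioctx.write_full("hello-object", b"payload written via librados")
        data = ioctx.read("hello-object")        # read the object back
        print(data)
        ioctx.set_xattr("hello-object", "owner", b"demo")   # per-object metadata
    finally:
        ioctx.close()
finally:
    cluster.shutdown()
```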
Auroral Proper Motion in the Era of AMISR and EMCCD
NASA Astrophysics Data System (ADS)
Semeter, J. L.
2016-12-01
The term "aurora" is a catch-all for luminosity produced by the deposition of magnetospheric energy in the outer atmosphere. The use of this single phenomenological term occludes the rich variety of sources and mechanisms responsible for the excitation. Among these are electron thermal conduction (SAR arcs), electrostatic potential fields ("inverted-V" aurora), wave-particle resonance (Alfvenic aurora, pulsating aurora), pitch-angle scattering (diffuse aurora), and direct injection of plasma sheet particles (PBIs, substorms). Much information about auroral energization has been derived from the energy spectrum of primary particles, which may be measured directly with an in situ detector or indirectly via analysis of the atmospheric response (e.g., auroral spectroscopy, tomography, ionization). Somewhat less emphasized has been the information in the B_perp dimension. Specifically, the scale-dependent motions of auroral forms in the rest frame of the ambient plasma provide a means of partitioning both the source region and the source mechanism. These results, in turn, affect ionospheric state parameters that control the M-I coupling process-most notably, the degree of structure imparted to the conductance field. This paper describes recent results enabled by the advent of two technologies: high frame-rate, high-resolution imaging detectors, and electronically steerable incoherent scatter radar (the AMISR systems). In addition to contributing to our understanding of the aurora, these results may be used in predictive models of multi-scale energy transfer within the disturbed geospace system.
A Unified Flash Flood Database across the United States
Gourley, Jonathan J.; Hong, Yang; Flamig, Zachary L.; Arthur, Ami; Clark, Robert; Calianno, Martin; Ruin, Isabelle; Ortel, Terry W.; Wieczorek, Michael; Kirstetter, Pierre-Emmanuel; Clark, Edward; Krajewski, Witold F.
2013-01-01
Despite flash flooding being one of the most deadly and costly weather-related natural hazards worldwide, individual datasets to characterize them in the United States are hampered by limited documentation and can be difficult to access. This study is the first of its kind to assemble, reprocess, describe, and disseminate a georeferenced U.S. database providing a long-term, detailed characterization of flash flooding in terms of spatiotemporal behavior and specificity of impacts. The database is composed of three primary sources: 1) the entire archive of automated discharge observations from the U.S. Geological Survey that has been reprocessed to describe individual flooding events, 2) flash-flooding reports collected by the National Weather Service from 2006 to the present, and 3) witness reports obtained directly from the public in the Severe Hazards Analysis and Verification Experiment during the summers 2008–10. Each observational data source has limitations; a major asset of the unified flash flood database is its collation of relevant information from a variety of sources that is now readily available to the community in common formats. It is anticipated that this database will be used for many diverse purposes, such as evaluating tools to predict flash flooding, characterizing seasonal and regional trends, and improving understanding of dominant flood-producing processes. We envision the initiation of this community database effort will attract and encompass future datasets.
Interlaboratory study of the ion source memory effect in 36Cl accelerator mass spectrometry
NASA Astrophysics Data System (ADS)
Pavetich, Stefan; Akhmadaliev, Shavkat; Arnold, Maurice; Aumaître, Georges; Bourlès, Didier; Buchriegler, Josef; Golser, Robin; Keddadouche, Karim; Martschini, Martin; Merchel, Silke; Rugel, Georg; Steier, Peter
2014-06-01
Understanding and minimization of contaminations in the ion source due to cross-contamination and long-term memory effect is one of the key issues for accurate accelerator mass spectrometry (AMS) measurements of volatile elements. The focus of this work is on the investigation of the long-term memory effect for the volatile element chlorine, and the minimization of this effect in the ion source of the Dresden accelerator mass spectrometry facility (DREAMS). For this purpose, one of the two original HVE ion sources at the DREAMS facility was modified, allowing the use of larger sample holders having individual target apertures. Additionally, a more open geometry was used to improve the vacuum level. To evaluate this improvement in comparison to other up-to-date ion sources, an interlaboratory comparison had been initiated. The long-term memory effect of the four Cs sputter ion sources at DREAMS (two sources: original and modified), ASTER (Accélérateur pour les Sciences de la Terre, Environnement, Risques) and VERA (Vienna Environmental Research Accelerator) had been investigated by measuring samples of natural 35Cl/37Cl-ratio and samples highly-enriched in 35Cl (35Cl/37Cl ∼ 999). Besides investigating and comparing the individual levels of long-term memory, recovery time constants could be calculated. The tests show that all four sources suffer from long-term memory, but the modified DREAMS ion source showed the lowest level of contamination. The recovery times of the four ion sources were widely spread between 61 and 1390 s, where the modified DREAMS ion source with values between 156 and 262 s showed the fastest recovery in 80% of the measurements.
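Recovery time constants such as those quoted above can be obtained by fitting an exponential relaxation of the measured isotope ratio toward its natural value after a sample change. The sketch below fits synthetic data with scipy and is a generic illustration, not the DREAMS/ASTER/VERA analysis itself.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

# Synthetic 35Cl/37Cl signal relaxing from an enriched sample back toward the
# natural ratio (about 3.13) after a sample change; times in seconds.
t = np.linspace(0, 1200, 60)
tau_true, baseline_true, amplitude_true = 200.0, 3.13, 40.0
ratio = baseline_true + amplitude_true * np.exp(-t / tau_true) + rng.normal(0, 0.3, t.size)

def relaxation(t, baseline, amplitude, tau):
    return baseline + amplitude * np.exp(-t / tau)

popt, pcov = curve_fit(relaxation, t, ratio, p0=(3.0, 30.0, 100.0))
print(f"fitted recovery time constant: {popt[2]:.0f} s (+/- {np.sqrt(pcov[2, 2]):.0f} s)")
```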
Lexicon-enhanced sentiment analysis framework using rule-based classification scheme.
Asghar, Muhammad Zubair; Khan, Aurangzeb; Ahmad, Shakeel; Qasim, Maria; Khan, Imran Ali
2017-01-01
With the rapid increase in social networks and blogs, the social media services are increasingly being used by online communities to share their views and experiences about a particular product, policy and event. Due to the economic importance of these reviews, there is a growing trend of writing user reviews to promote a product. Nowadays, users prefer online blogs and review sites to purchase products. Therefore, user reviews are considered as an important source of information in Sentiment Analysis (SA) applications for decision making. In this work, we exploit the wealth of user reviews, available through the online forums, to analyze the semantic orientation of words by categorizing them into +ive and -ive classes to identify and classify emoticons, modifiers, general-purpose and domain-specific words expressed in the public's feedback about the products. However, the unsupervised learning approach employed in previous studies is becoming less efficient due to data sparseness and low accuracy caused by non-consideration of emoticons, modifiers, and domain-specific words, which may result in inaccurate classification of users' reviews. Lexicon-enhanced sentiment analysis based on a rule-based classification scheme is an alternative approach for improving sentiment classification of users' reviews in online communities. In addition to the sentiment terms used in general-purpose sentiment analysis, we integrate emoticons, modifiers and domain-specific terms to analyze the reviews posted in online communities. To test the effectiveness of the proposed method, we considered users' reviews in three domains. The results obtained from different experiments demonstrate that the proposed method overcomes limitations of previous methods and the performance of the sentiment analysis is improved after considering emoticons, modifiers, negations, and domain-specific terms when compared to baseline methods.
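A lexicon-plus-rules scorer of the general kind described above can be sketched in a few lines; the tiny lexicons, modifier weights and two-token negation window below are toy assumptions rather than the published resources.

```python
# Minimal lexicon-based sentiment scorer with emoticons, modifiers and negation.
LEXICON = {"good": 1, "great": 2, "love": 2, "bad": -1, "poor": -2, "hate": -2,
           "laggy": -1, "crisp": 1}              # general-purpose + domain-specific terms
EMOTICONS = {":)": 1, ":-)": 1, ":(": -1, ":-(": -1}
MODIFIERS = {"very": 1.5, "extremely": 2.0, "slightly": 0.5}
NEGATIONS = {"not", "never", "no"}

def score_review(text):
    tokens = text.lower().split()
    score = 0.0
    for i, tok in enumerate(tokens):
        value = LEXICON.get(tok, EMOTICONS.get(tok, 0))
        if value == 0:
            continue
        weight, negated = 1.0, False
        for back in tokens[max(0, i - 2):i]:     # look back up to two tokens
            if back in MODIFIERS:
                weight *= MODIFIERS[back]
            if back in NEGATIONS:
                negated = True
        score += -weight * value if negated else weight * value
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(score_review("the camera is not good but the screen is extremely crisp :)"))
```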
[Knowledge identification and management in a surgery department].
Rodríguez-Montes, José Antonio
2006-08-01
The hospital is an enterprise in which the surgery department represents a specific healthcare unit. The purpose of the surgery department, like that of any other enterprise, is assumed to be indefinite survival; to that end, it must be able to achieve and maintain a competitive advantage in the long term. Nevertheless, each surgery department, like each enterprise, can precisely define the scope of the above-mentioned terms, the main source of an enterprise's competitive advantage being its knowledge stock. Knowledge is recognized as being the basis of competitive success among institutions. This article presents the concept and classification of knowledge and discusses how it should be identified, inventoried, and managed. Special emphasis is placed on healthcare activity, since this sector presents certain characteristics distinguishing it from other sectors of economic and business activity.
NASA Technical Reports Server (NTRS)
Yee, H. C.; Shinn, J. L.
1986-01-01
Some numerical aspects of finite-difference algorithms for nonlinear multidimensional hyperbolic conservation laws with stiff nonhomogeneous (source) terms are discussed. If the stiffness is entirely dominated by the source term, a semi-implicit shock-capturing method is proposed provided that the Jacobian of the source terms possesses certain properties. The proposed semi-implicit method can be viewed as a variant of the Bussing and Murman point-implicit scheme with a more appropriate numerical dissipation for the computation of strong shock waves. However, if the stiffness is not solely dominated by the source terms, a fully implicit method would be a better choice. The situation is more complicated for problems in more than one dimension, and the presence of stiff source terms further complicates the solution procedures for alternating direction implicit (ADI) methods. Several alternatives are discussed. The primary motivation for constructing these schemes was to address thermally and chemically nonequilibrium flows in the hypersonic regime. Due to the unique structure of the eigenvalues and eigenvectors for fluid flows of this type, the computation can be simplified, thus providing a more efficient solution procedure than one might have anticipated.
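The point-implicit idea, treating only the stiff source term implicitly and linearizing it through its Jacobian while the non-stiff part stays explicit, can be illustrated on a scalar model equation du/dt = f(u) + s(u); the specific functions, stiffness constant and time step below are arbitrary test choices rather than the flow equations discussed in the report.

```python
import numpy as np

# Model problem: du/dt = f(u) + s(u) with a mild "flux" term f and a stiff
# relaxation source s(u) = -K*(u - u_eq), K >> 1.
K, U_EQ = 1.0e4, 1.0

def f(u):
    return np.cos(u)            # non-stiff part, treated explicitly

def s(u):
    return -K * (u - U_EQ)      # stiff source term, treated implicitly

def ds_du(u):
    return -K                   # source-term Jacobian

def point_implicit_step(u, dt):
    # (1 - dt * ds/du) * delta_u = dt * (f(u) + s(u))
    rhs = dt * (f(u) + s(u))
    return u + rhs / (1.0 - dt * ds_du(u))

u, dt = 3.0, 1.0e-2             # dt far larger than the source time scale 1/K
for _ in range(200):
    u = point_implicit_step(u, dt)
print("steady value:", u)       # relaxes toward the root of f(u) + s(u) = 0
```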
Common Calibration Source for Monitoring Long-term Ozone Trends
NASA Technical Reports Server (NTRS)
Kowalewski, Matthew
2004-01-01
Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and limited lifetimes of satellite monitoring instruments demand that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Monitoring Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.
Watershed nitrogen and phosphorus balance: The upper Potomac River basin
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jaworski, N.A.; Groffman, P.M.; Keller, A.A.
1992-01-01
Nitrogen and phosphorus mass balances were estimated for the portion of the Potomac River basin watershed located above Washington, D.C. The total nitrogen (N) balance included seven input source terms, six sinks, and one 'change-in-storage' term, but was simplified to five input terms and three output terms. The phosphorus (P) balance had four input and three output terms. The estimated balances are based on watershed data from seven information sources. Major sources of nitrogen are animal waste and atmospheric deposition. The major sources of phosphorus are animal waste and fertilizer. The major sink for nitrogen is combined denitrification, volatilization, and change-in-storage. The major sink for phosphorus is change-in-storage. River exports of N and P were 17% and 8%, respectively, of the total N and P inputs. Over 60% of the N and P were volatilized or stored. The major input and output terms on the budget are estimated from direct measurements, but the change-in-storage term is calculated by difference. The factors regulating retention and storage processes are discussed and research needs are identified.
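Closing such a budget with a change-in-storage term obtained by difference is simple bookkeeping, as the sketch below shows with illustrative (not Potomac) numbers.

```python
# Nitrogen budget closure by difference (values are illustrative, kt N/yr)
inputs = {"animal_waste": 40.0, "atmospheric_deposition": 35.0, "fertilizer": 20.0,
          "point_sources": 8.0, "biological_fixation": 7.0}
outputs = {"river_export": 19.0, "crop_harvest": 25.0}

total_in = sum(inputs.values())
total_out = sum(outputs.values())
# Combined denitrification/volatilization/change-in-storage closes the budget
residual = total_in - total_out
print(f"inputs {total_in:.0f}, outputs {total_out:.0f}, residual (storage + losses) {residual:.0f}")
print(f"river export as share of inputs: {outputs['river_export'] / total_in:.0%}")
```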
Barbeta, Adrià; Mejía-Chang, Monica; Ogaya, Romà; Voltas, Jordi; Dawson, Todd E; Peñuelas, Josep
2015-03-01
Vegetation in water-limited ecosystems relies strongly on access to deep water reserves to withstand dry periods. Most of these ecosystems have shallow soils over deep groundwater reserves. Understanding the functioning and functional plasticity of species-specific root systems and the patterns of or differences in the use of water sources under more frequent or intense droughts is therefore necessary to properly predict the responses of seasonally dry ecosystems to future climate. We used stable isotopes to investigate the seasonal patterns of water uptake by a sclerophyll forest on sloped terrain with shallow soils. We assessed the effect of a long-term experimental drought (12 years) and the added impact of an extreme natural drought that produced widespread tree mortality and crown defoliation. The dominant species, Quercus ilex, Arbutus unedo and Phillyrea latifolia, all have dimorphic root systems enabling them to access different water sources in space and time. The plants extracted water mainly from the soil in the cold and wet seasons but increased their use of groundwater during the summer drought. Interestingly, the plants subjected to the long-term experimental drought shifted water uptake toward deeper (10-35 cm) soil layers during the wet season and reduced groundwater uptake in summer, indicating plasticity in the functional distribution of fine roots that dampened the effect of our experimental drought over the long term. An extreme drought in 2011, however, further reduced the contribution of deep soil layers and groundwater to transpiration, which resulted in greater crown defoliation in the drought-affected plants. This study suggests that extreme droughts aggravate moderate but persistent drier conditions (simulated by our manipulation) and may lead to the depletion of water from groundwater reservoirs and weathered bedrock, threatening the preservation of these Mediterranean ecosystems in their current structures and compositions. © 2014 John Wiley & Sons Ltd.
Tiemuerbieke, Bahejiayinaer; Min, Xiao-Jun; Zang, Yong-Xin; Xing, Peng; Ma, Jian-Ying; Sun, Wei
2018-09-01
In water-limited ecosystems, spatial and temporal partitioning of water sources is an important mechanism that facilitates plant survival and lessens the competition intensity of co-existing plants. Insights into species-specific root functional plasticity and differences in the water sources of co-existing plants under changing water conditions can aid in accurate prediction of the response of desert ecosystems to future climate change. We used stable isotopes of soil water, groundwater and xylem water to determine the seasonal and inter- and intraspecific variations in the water sources of six C3 and C4 shrubs in the Gurbantonggut desert. We also measured the stem water potentials to determine the water stress levels of each species under varying water conditions. The studied shrubs exhibited similar seasonal water uptake patterns, i.e., all shrubs extracted shallow soil water recharged by snowmelt water during early spring and reverted to deeper water sources during dry summer periods, indicating that all of the studied shrubs have dimorphic root systems that enable them to obtain water sources that differ in space and time. Species in the C4 shrub community exhibited differences in seasonal water absorption and water status due to differences in topography and rooting depth, demonstrating divergent adaptations to water availability and water stress. Haloxylon ammodendron and T. ramosissima in the C3/C4 mixed community were similar in terms of seasonal water extraction but differed with respect to water potential, which indicated that plant water status is controlled by both root functioning and shoot eco-physiological traits. The two Tamarix species in the C3 shrub community were similar in terms of water uptake and water status, which suggests functional convergence of the root system and physiological performance under the same soil water conditions. In different communities, Haloxylon ammodendron differed in terms of summer water extraction, which suggests that this species exhibits plasticity with respect to rooting depth under different soil water conditions. Shrubs in the Gurbantonggut desert displayed varying adaptations across species and communities through divergent root functioning and shoot eco-physiological traits. Copyright © 2018 Elsevier B.V. All rights reserved.
75 FR 48743 - Mandatory Reporting of Greenhouse Gases
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-11
...EPA is proposing to amend specific provisions in the GHG reporting rule to clarify certain provisions, to correct technical and editorial errors, and to address certain questions and issues that have arisen since promulgation. These proposed changes include providing additional information and clarity on existing requirements, allowing greater flexibility or simplified calculation methods for certain sources in a facility, amending data reporting requirements to provide additional clarity on when different types of GHG emissions need to be calculated and reported, clarifying terms and definitions in certain equations, and technical corrections.
75 FR 79091 - Mandatory Reporting of Greenhouse Gases
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-17
...EPA is amending specific provisions in the greenhouse gas reporting rule to clarify certain provisions, to correct technical and editorial errors, and to address certain questions and issues that have arisen since promulgation. These final changes include generally providing additional information and clarity on existing requirements, allowing greater flexibility or simplified calculation methods for certain sources, amending data reporting requirements to provide additional clarity on when different types of greenhouse gas emissions need to be calculated and reported, clarifying terms and definitions in certain equations and other technical corrections and amendments.
NASA Technical Reports Server (NTRS)
Heffley, R. K.; Jewell, W. F.; Whitbeck, R. F.; Schulman, T. M.
1980-01-01
The effects of spurious delays in real time digital computing systems are examined. Various sources of spurious delays are defined and analyzed using an extant simulator system as an example. A specific analysis procedure is set forth and four cases are viewed in terms of their time and frequency domain characteristics. Numerical solutions are obtained for three single rate one- and two-computer examples, and the analysis problem is formulated for a two-rate, two-computer example.
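In the frequency domain, a pure spurious delay T leaves the magnitude of a system's response unchanged but adds a phase lag of ωT; the short sketch below compares a first-order lag with and without such a delay, using arbitrarily chosen values for the time constant and delay.

```python
import numpy as np

tau = 0.5          # first-order system time constant, s (arbitrary)
T_delay = 0.05     # spurious computational delay, s (arbitrary)
w = np.logspace(-1, 2, 5)                    # rad/s

G = 1.0 / (1.0 + 1j * w * tau)               # first-order lag
G_delayed = G * np.exp(-1j * w * T_delay)    # same system with a pure time delay

for wi, g, gd in zip(w, G, G_delayed):
    print(f"w={wi:7.2f} rad/s  |G|={abs(g):.3f}  "
          f"phase={np.degrees(np.angle(g)):7.1f} deg  "
          f"phase with delay={np.degrees(np.angle(gd)):7.1f} deg")
```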
Supple, Megan Ann; Bragg, Jason G; Broadhurst, Linda M; Nicotra, Adrienne B; Byrne, Margaret; Andrew, Rose L; Widdup, Abigail; Aitken, Nicola C; Borevitz, Justin O
2018-04-24
As species face rapid environmental change, we can build resilient populations through restoration projects that incorporate predicted future climates into seed sourcing decisions. Eucalyptus melliodora is a foundation species of a critically endangered community in Australia that is a target for restoration. We examined genomic and phenotypic variation to make empirical based recommendations for seed sourcing. We examined isolation by distance and isolation by environment, determining high levels of gene flow extending for 500 km and correlations with climate and soil variables. Growth experiments revealed extensive phenotypic variation both within and among sampling sites, but no site-specific differentiation in phenotypic plasticity. Model predictions suggest that seed can be sourced broadly across the landscape, providing ample diversity for adaptation to environmental change. Application of our landscape genomic model to E. melliodora restoration projects can identify genomic variation suitable for predicted future climates, thereby increasing the long term probability of successful restoration. © 2018, Supple et al.
Induced lexico-syntactic patterns improve information extraction from online medical forums.
Gupta, Sonal; MacLean, Diana L; Heer, Jeffrey; Manning, Christopher D
2014-01-01
To reliably extract two entity types, symptoms and conditions (SCs), and drugs and treatments (DTs), from patient-authored text (PAT) by learning lexico-syntactic patterns from data annotated with seed dictionaries. Despite the increasing quantity of PAT (eg, online discussion threads), tools for identifying medical entities in PAT are limited. When applied to PAT, existing tools either fail to identify specific entity types or perform poorly. Identification of SC and DT terms in PAT would enable exploration of efficacy and side effects for not only pharmaceutical drugs, but also for home remedies and components of daily care. We use SC and DT term dictionaries compiled from online sources to label several discussion forums from MedHelp (http://www.medhelp.org). We then iteratively induce lexico-syntactic patterns corresponding strongly to each entity type to extract new SC and DT terms. Our system is able to extract symptom descriptions and treatments absent from our original dictionaries, such as 'LADA', 'stabbing pain', and 'cinnamon pills'. Our system extracts DT terms with 58-70% F1 score and SC terms with 66-76% F1 score on two forums from MedHelp. We show improvements over MetaMap, OBA, a conditional random field-based classifier, and a previous pattern learning approach. Our entity extractor based on lexico-syntactic patterns is a successful and preferable technique for identifying specific entity types in PAT. To the best of our knowledge, this is the first paper to extract SC and DT entities from PAT. We exhibit learning of informal terms often used in PAT but missing from typical dictionaries. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
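The dictionary-seeded pattern-induction loop described above can be caricatured as: label seed-term occurrences, collect their surrounding lexical contexts, keep the recurring contexts as patterns, and use those patterns to propose new terms. The seed list, example posts, one-token window and frequency threshold below are toy assumptions, not the published system.

```python
from collections import Counter

SEED_DT = {"metformin", "insulin", "ibuprofen"}          # toy seed drug/treatment terms

posts = [
    "i started metformin last month and it helps",
    "i started insulin last month and my numbers improved",
    "i started turmeric last month and feel better",
    "my doctor suggested ibuprofen for the pain",
    "my doctor suggested insulin for better control",
    "my doctor suggested acupuncture for the pain",
]

def context_pattern(tokens, i):
    """One-token left/right context with the candidate slot blanked out."""
    left = tokens[i - 1] if i > 0 else "<s>"
    right = tokens[i + 1] if i + 1 < len(tokens) else "</s>"
    return (left, "_", right)

# 1) collect patterns around seed-term occurrences
pattern_counts = Counter()
for post in posts:
    toks = post.split()
    for i, tok in enumerate(toks):
        if tok in SEED_DT:
            pattern_counts[context_pattern(toks, i)] += 1

# 2) keep patterns seen at least twice, then extract new candidate terms
good_patterns = {p for p, c in pattern_counts.items() if c >= 2}
candidates = set()
for post in posts:
    toks = post.split()
    for i, tok in enumerate(toks):
        if tok not in SEED_DT and context_pattern(toks, i) in good_patterns:
            candidates.add(tok)

print("induced patterns:", good_patterns)
print("new candidate terms:", candidates)     # e.g. 'turmeric', 'acupuncture'
```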
Developing a comprehensive time series of GDP per capita for 210 countries from 1950 to 2015
2012-01-01
Background Income has been extensively studied and utilized as a determinant of health. There are several sources of income expressed as gross domestic product (GDP) per capita, but there are no time series that are complete for the years between 1950 and 2015 for the 210 countries for which data exist. It is in the interest of population health research to establish a global time series that is complete from 1950 to 2015. Methods We collected GDP per capita estimates expressed in either constant US dollar terms or international dollar terms (corrected for purchasing power parity) from seven sources. We applied several stages of models, including ordinary least-squares regressions and mixed effects models, to complete each of the seven source series from 1950 to 2015. The three US dollar and four international dollar series were each averaged to produce two new GDP per capita series. Results and discussion Nine complete series from 1950 to 2015 for 210 countries are available for use. These series can serve various analytical purposes and can illustrate myriad economic trends and features. The derivation of the two new series allows for researchers to avoid any series-specific biases that may exist. The modeling approach used is flexible and will allow for yearly updating as new estimates are produced by the source series. Conclusion GDP per capita is a necessary tool in population health research, and our development and implementation of a new method has allowed for the most comprehensive known time series to date. PMID:22846561
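A heavily simplified version of the series-completion idea (regressing log GDP per capita on year within each country and using the fit to fill missing years) is sketched below with fabricated values; the published method adds mixed-effects models and combines several source series on top of this.

```python
import numpy as np

# Fabricated partial series: {country: {year: GDP per capita}}
observed = {
    "A": {1950: 1200, 1960: 1700, 1980: 3100, 2000: 5200, 2015: 7400},
    "B": {1970: 800, 1990: 1500, 2010: 2600},
}

years = np.arange(1950, 2016)
completed = {}
for country, series in observed.items():
    yrs = np.array(sorted(series))
    vals = np.array([series[y] for y in yrs])
    # OLS on log GDP vs year gives a constant-growth interpolation/extrapolation
    slope, intercept = np.polyfit(yrs, np.log(vals), 1)
    completed[country] = {int(y): float(np.exp(intercept + slope * y)) for y in years}
    completed[country].update(series)        # keep observed values where they exist

print(completed["B"][1950], completed["B"][2015])
```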
NASA Astrophysics Data System (ADS)
Williamson, Jeffrey F.
2006-09-01
This paper briefly reviews the evolution of brachytherapy dosimetry from 1900 to the present. Dosimetric practices in brachytherapy fall into three distinct eras: During the era of biological dosimetry (1900-1938), radium pioneers could only specify Ra-226 and Rn-222 implants in terms of the mass of radium encapsulated within the implanted sources. Due to the high energy of its emitted gamma rays and the long range of its secondary electrons in air, free-air chambers could not be used to quantify the output of Ra-226 sources in terms of exposure. Biological dosimetry, most prominently the threshold erythema dose, gained currency as a means of intercomparing radium treatments with exposure-calibrated orthovoltage x-ray units. The classical dosimetry era (1940-1980) began with successful exposure standardization of Ra-226 sources by Bragg-Gray cavity chambers. Classical dose-computation algorithms, based upon 1-D buildup factor measurements and point-source superposition computational algorithms, were able to accommodate artificial radionuclides such as Co-60, Ir-192, and Cs-137. The quantitative dosimetry era (1980- ) arose in response to the increasing utilization of low energy K-capture radionuclides such as I-125 and Pd-103 for which classical approaches could not be expected to estimate accurate doses. This led to intensive development of both experimental (largely TLD-100 dosimetry) and Monte Carlo dosimetry techniques along with more accurate air-kerma strength standards. As a result of extensive benchmarking and intercomparison of these different methods, single-seed low-energy radionuclide dose distributions are now known with a total uncertainty of 3%-5%.
YouTube as a source of quitting smoking information.
Backinger, Cathy L; Pilsner, Alison M; Augustson, Erik M; Frydl, Andrea; Phillips, Todd; Rowden, Jessica
2011-03-01
To conduct analyses to determine the extent to which YouTube videos posted specific to smoking cessation were actually about quitting smoking and if so, whether or not they portrayed evidence-based practices (EBPs). In August 2008, researchers identified YouTube videos by search strategies, 'relevance' and 'view count', using the following three search terms: 'stop smoking', 'quit smoking' and 'smoking cessation' (n=296 for full sample and n=191 for unique videos). Overall, almost 60% of videos contained a message about quitting smoking. Differences were found across search terms for videos about quitting smoking, with 'stop smoking' yielding the highest percentage (80.8%) of videos about quitting smoking. Almost half of the videos (48.9%) contained EBPs for cessation strategies; however, a significant portion contained either non-EBPs (28.4%) or both EBPs and non-EBPs (22.7%). The number of views per individual video across the six categories ranged from a low of 8 in the 'relevance' strategy and 'smoking cessation' search term to a high of 1,247,540 in the 'view count' strategy and 'stop smoking' search term. Of the top three most viewed videos by strategy and search term, 66.7% included a specific mention of quitting smoking and, of these, the majority included EBPs. Results highlight the need to develop and upload videos containing EBPs both to increase the overall proportion of EBP videos in all categories, particularly in 'quit smoking' and 'stop smoking.' Research is needed to study whether YouTube videos influence knowledge, attitudes and behaviours regarding quitting smoking.
Choi, Moo Jin; Choi, Byung Tae; Shin, Hwa Kyoung; Shin, Byung Cheul; Han, Yoo Kyoung; Baek, Jin Ung
2015-01-01
The major objectives of this study were to provide a list of candidate antiaging medicinal herbs that have been widely utilized in Korean medicine and to organize preliminary data for the benefit of experimental and clinical researchers to develop new drug therapies by analyzing previous studies. “Dongeuibogam,” a representative source of the Korean medicine literature, was selected to investigate candidate antiaging medicinal herbs and to identify appropriate terms that describe the specific antiaging effects that these herbs are predicted to elicit. In addition, we aimed to review previous studies that referenced the selected candidate antiaging medicinal herbs. From our chosen source, “Dongeuibogam,” we were able to screen 102 terms describing antiaging effects, which were further classified into 11 subtypes. Ninety-seven candidate antiaging medicinal herbs were selected using the criterion that their antiaging effects were described using the same terms as those employed in “Dongeuibogam.” These candidates were classified into 11 subtypes. Of the 97 candidate antiaging medicinal herbs selected, 47 are widely used by Korean medical doctors in Korea and were selected for further analysis of their antiaging effects. Overall, we found an average of 7.7 previous studies per candidate herb that described their antiaging effects. PMID:25861371
NASA Astrophysics Data System (ADS)
Emelko, M.; Silins, U.; Stone, M.
2016-12-01
Wildfire remains the most catastrophic agent of landscape disturbance in many forested source water regions. Notably, while wildfire impacts on water have been well studied, little if any of that work has specifically focused on drinking water treatability impacts, which will have both significant regional differences and similarities. Wildfire effects on water quality, particularly nutrient concentrations and character/forms, can be significant. The longevity and downstream propagation of these effects, as well as the geochemical mechanisms regulating them have been largely undocumented at larger river basin scales. This work demonstrates that fine sediment in gravel-bed rivers is a significant, long-term source of in-stream bioavailable P that contributes to a legacy of wildfire impacts on downstream water quality, aquatic ecology, and drinking water treatability in some ecoregions. The short- and mid-term impacts include increases in primary productivity and dissolved organic carbon, associated changes in carbon character, and increased potential for the formation of disinfection byproducts during drinking water treatment. The longer term impacts also may include increases in potentially toxic algal blooms and the production of taste and odor compounds. These documented impacts, as well as strategies for assessing the risk of wildfire-associated water service disruptions and infrastructure and land management-associated opportunities for adaptation to and mitigation of wildfire risk to drinking water supply will be discussed.
A systematic review of administrative and clinical databases of infants admitted to neonatal units.
Statnikov, Yevgeniy; Ibrahim, Buthaina; Modi, Neena
2017-05-01
High quality information, increasingly captured in clinical databases, is a useful resource for evaluating and improving newborn care. We conducted a systematic review to identify neonatal databases, and define their characteristics. We followed a preregistered protocol using MesH terms to search MEDLINE, EMBASE, CINAHL, Web of Science and OVID Maternity and Infant Care Databases for articles identifying patient level databases covering more than one neonatal unit. Full-text articles were reviewed and information extracted on geographical coverage, criteria for inclusion, data source, and maternal and infant characteristics. We identified 82 databases from 2037 publications. Of the country-specific databases there were 39 regional and 39 national. Sixty databases restricted entries to neonatal unit admissions by birth characteristic or insurance cover; 22 had no restrictions. Data were captured specifically for 53 databases; 21 administrative sources; 8 clinical sources. Two clinical databases hold the largest range of data on patient characteristics, USA's Pediatrix BabySteps Clinical Data Warehouse and UK's National Neonatal Research Database. A number of neonatal databases exist that have potential to contribute to evaluating neonatal care. The majority is created by entering data specifically for the database, duplicating information likely already captured in other administrative and clinical patient records. This repetitive data entry represents an unnecessary burden in an environment where electronic patient records are increasingly used. Standardisation of data items is necessary to facilitate linkage within and between countries. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Snow, Mathew S; Snyder, Darin C; Clark, Sue B; Kelley, Morgan; Delmore, James E
2015-03-03
Radiometric and mass spectrometric analyses of Cs contamination in the environment can reveal the location of Cs emission sources, release mechanisms, modes of transport, prediction of future contamination migration, and attribution of contamination to specific generator(s) and/or process(es). The Subsurface Disposal Area (SDA) at Idaho National Laboratory (INL) represents a complicated case study for demonstrating the current capabilities and limitations to environmental Cs analyses. (137)Cs distribution patterns, (135)Cs/(137)Cs isotope ratios, known Cs chemistry at this site, and historical records enable narrowing the list of possible emission sources and release events to a single source and event, with the SDA identified as the emission source and flood transport of material from within Pit 9 and Trench 48 as the primary release event. These data combined allow refining the possible number of waste generators from dozens to a single generator, with INL on-site research and reactor programs identified as the most likely waste generator. A discussion on the ultimate limitations to the information that (135)Cs/(137)Cs ratios alone can provide is presented and includes (1) uncertainties in the exact date of the fission event and (2) possibility of mixing between different Cs source terms (including nuclear weapons fallout and a source of interest).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snow, Mathew S.; Snyder, Darin C.; Clark, Sue B.
2015-03-03
Radiometric and mass spectrometric analyses of Cs contamination in the environment can reveal the location of Cs emission sources, release mechanisms, modes of transport, prediction of future contamination migration, and attribution of contamination to specific generator(s) and/or process(es). The Subsurface Disposal Area (SDA) at Idaho National Laboratory (INL) represents a complicated case study for demonstrating the current capabilities and limitations to environmental Cs analyses. 137Cs distribution patterns, 135Cs/ 137Cs isotope ratios, known Cs chemistry at this site, and historical records enable narrowing the list of possible emission sources and release events to a single source and event, with the SDA identified as the emission source and flood transport of material from within Pit 9 and Trench 48 as the primary release event. These data combined allow refining the possible number of waste generators from dozens to a single generator, with INL on-site research and reactor programs identified as the most likely waste generator. A discussion on the ultimate limitations to the information that 135Cs/ 137Cs ratios alone can provide is presented and includes (1) uncertainties in the exact date of the fission event and (2) possibility of mixing between different Cs source terms (including nuclear weapons fallout and a source of interest).
Briache, Abdelaali; Marrakchi, Kamar; Kerzazi, Amine; Navas-Delgado, Ismael; Rossi Hassani, Badr D; Lairini, Khalid; Aldana-Montes, José F
2012-01-25
Saccharomyces cerevisiae is recognized as a model system representing a simple eukaryote whose genome can be easily manipulated. Information solicited by scientists on its biological entities (Proteins, Genes, RNAs...) is scattered within several data sources like SGD, Yeastract, CYGD-MIPS, BioGrid, PhosphoGrid, etc. Because of the heterogeneity of these sources, querying them separately and then manually combining the returned results is a complex and time-consuming task for biologists, most of whom are not bioinformatics experts. It also reduces and limits the use that can be made of the available data. To provide transparent and simultaneous access to yeast sources, we have developed YeastMed: an XML and mediator-based system. In this paper, we present our approach in developing this system, which takes advantage of SB-KOM to perform the query transformation needed and a set of Data Services to reach the integrated data sources. The system is composed of a set of modules that depend heavily on XML and Semantic Web technologies. User queries are expressed in terms of a domain ontology through a simple form-based web interface. YeastMed is the first mediation-based system specific for integrating yeast data sources. It was conceived mainly to help biologists find relevant data simultaneously from multiple data sources. It has a biologist-friendly interface that is easy to use. The system is available at http://www.khaos.uma.es/yeastmed/.
Efficient Development of High Fidelity Structured Volume Grids for Hypersonic Flow Simulations
NASA Technical Reports Server (NTRS)
Alter, Stephen J.
2003-01-01
A new technique for the control of grid line spacing and intersection angles of a structured volume grid, using elliptic partial differential equations (PDEs), is presented. Existing structured grid generation algorithms make use of source term hybridization to provide control of grid lines, imposing orthogonality implicitly at the boundary and explicitly on the interior of the domain. A bridging function between the two types of grid line control is typically used to blend the different orthogonality formulations. It is shown that utilizing such a bridging function with source term hybridization can result in the excessive use of computational resources and diminish robustness. A new approach, Anisotropic Lagrange Based Trans-Finite Interpolation (ALBTFI), is offered as a replacement to source term hybridization. The ALBTFI technique captures the essence of the desired grid controls while improving the convergence rate of the elliptic PDEs when compared with source term hybridization. Grid generation on a blunt cone and a Shuttle Orbiter is used to demonstrate and assess the ALBTFI technique, which is shown to be as much as 50% faster and more robust, and to produce higher quality grids than source term hybridization.
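For context, the sketch below runs a bare-bones Winslow-type elliptic smoothing iteration with zero control functions (P = Q = 0) on an algebraically generated grid; the grid-control source terms that the paper is concerned with would enter the bracketed numerator of this update. Domain shape, resolution and iteration count are arbitrary choices, not the paper's test cases.

```python
import numpy as np

# Initial algebraic grid on a quarter-annulus (arbitrary demonstration domain)
ni, nj = 41, 21
xi = np.linspace(0.0, 1.0, ni)
eta = np.linspace(0.0, 1.0, nj)
XI, ETA = np.meshgrid(xi, eta, indexing="ij")
R = 1.0 + ETA                      # radius 1..2
TH = 0.5 * np.pi * XI              # angle 0..90 degrees
x = R * np.cos(TH)
y = R * np.sin(TH)

# Winslow elliptic smoothing with zero control functions (P = Q = 0);
# grid-control source terms would be added inside the bracketed numerator.
for _ in range(200):
    x_xi  = 0.5 * (x[2:, 1:-1] - x[:-2, 1:-1]);  y_xi  = 0.5 * (y[2:, 1:-1] - y[:-2, 1:-1])
    x_eta = 0.5 * (x[1:-1, 2:] - x[1:-1, :-2]);  y_eta = 0.5 * (y[1:-1, 2:] - y[1:-1, :-2])
    a = x_eta**2 + y_eta**2
    b = x_xi * x_eta + y_xi * y_eta
    g = x_xi**2 + y_xi**2
    for f in (x, y):
        f_cross = 0.25 * (f[2:, 2:] - f[2:, :-2] - f[:-2, 2:] + f[:-2, :-2])
        f[1:-1, 1:-1] = (a * (f[2:, 1:-1] + f[:-2, 1:-1])
                         + g * (f[1:-1, 2:] + f[1:-1, :-2])
                         - 2.0 * b * f_cross) / (2.0 * (a + g) + 1e-12)

print("smoothed interior point:", x[ni // 2, nj // 2], y[ni // 2, nj // 2])
```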
BWR ASSEMBLY SOURCE TERMS FOR WASTE PACKAGE DESIGN
DOE Office of Scientific and Technical Information (OSTI.GOV)
T.L. Lotz
1997-02-15
This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) to provide boiling water reactor (BWR) assembly radiation source term data for use during Waste Package (WP) design. The BWR assembly radiation source terms are to be used for evaluation of radiolysis effects at the WP surface, and for personnel shielding requirements during assembly or WP handling operations. The objectives of this evaluation are to generate BWR assembly radiation source terms that bound selected groupings of BWR assemblies, with regard to assembly average burnup and cooling time, which comprise the anticipated MGDS BWR commercial spent nuclear fuel (SNF) waste stream. The source term data is to be provided in a form which can easily be utilized in subsequent shielding/radiation dose calculations. Since these calculations may also be used for Total System Performance Assessment (TSPA), with appropriate justification provided by TSPA, or radionuclide release rate analysis, the grams of each element and additional cooling times out to 25 years will also be calculated and the data included in the output files.
Srivastava, D; Favez, O; Bonnaire, N; Lucarelli, F; Haeffelin, M; Perraudin, E; Gros, V; Villenave, E; Albinet, A
2018-09-01
The present study aimed at performing PM 10 source apportionment, using positive matrix factorization (PMF), based on filter samples collected every 4h at a sub-urban station in the Paris region (France) during a PM pollution event in March 2015 (PM 10 >50μgm -3 for several consecutive days). The PMF model allowed to deconvolve 11 source factors. The use of specific primary and secondary organic molecular markers favoured the determination of common sources such as biomass burning and primary traffic emissions, as well as 2 specific biogenic SOA (marine+isoprene) and 3 anthropogenic SOA (nitro-PAHs+oxy-PAHs+phenolic compounds oxidation) factors. This study is probably the first one to report the use of methylnitrocatechol isomers as well as 1-nitropyrene to apportion secondary OA linked to biomass burning emissions and primary traffic emissions, respectively. Secondary organic carbon (SOC) fractions were found to account for 47% of the total OC. The use of organic molecular markers allowed the identification of 41% of the total SOC composed of anthropogenic SOA (namely, oxy-PAHs, nitro-PAHs and phenolic compounds oxidation, representing 15%, 9%, 11% of the total OC, respectively) and biogenic SOA (marine+isoprene) (6% in total). Results obtained also showed that 35% of the total SOC originated from anthropogenic sources and especially PAH SOA (oxy-PAHs+nitro-PAHs), accounting for 24% of the total SOC, highlighting its significant contribution in urban influenced environments. Anthropogenic SOA related to nitro-PAHs and phenolic compounds exhibited a clear diurnal pattern with high concentrations during the night indicating the prominent role of night-time chemistry but with different chemical processes involved. Copyright © 2018 Elsevier B.V. All rights reserved.
Stafoggia, Massimo; Zauli-Sajani, Stefano; Pey, Jorge; Samoli, Evangelia; Alessandrini, Ester; Basagaña, Xavier; Cernigliaro, Achille; Chiusolo, Monica; Demaria, Moreno; Díaz, Julio; Faustini, Annunziata; Katsouyanni, Klea; Kelessis, Apostolos G.; Linares, Cristina; Marchesi, Stefano; Medina, Sylvia; Pandolfi, Paolo; Pérez, Noemí; Querol, Xavier; Randi, Giorgia; Ranzi, Andrea; Tobias, Aurelio; Forastiere, Francesco
2015-01-01
Background: Evidence on the association between short-term exposure to desert dust and health outcomes is controversial. Objectives: We aimed to estimate the short-term effects of particulate matter ≤ 10 μm (PM10) on mortality and hospital admissions in 13 Southern European cities, distinguishing between PM10 originating from the desert and from other sources. Methods: We identified desert dust advection days in multiple Mediterranean areas for 2001–2010 by combining modeling tools, back-trajectories, and satellite data. For each advection day, we estimated PM10 concentrations originating from desert, and computed PM10 from other sources by difference. We fitted city-specific Poisson regression models to estimate the association between PM from different sources (desert and non-desert) and daily mortality and emergency hospitalizations. Finally, we pooled city-specific results in a random-effects meta-analysis. Results: On average, 15% of days were affected by desert dust at ground level (desert PM10 > 0 μg/m3). Most episodes occurred in spring–summer, with increasing gradient of both frequency and intensity north–south and west–east of the Mediterranean basin. We found significant associations of both PM10 concentrations with mortality. Increases of 10 μg/m3 in non-desert and desert PM10 (lag 0–1 days) were associated with increases in natural mortality of 0.55% (95% CI: 0.24, 0.87%) and 0.65% (95% CI: 0.24, 1.06%), respectively. Similar associations were estimated for cardio-respiratory mortality and hospital admissions. Conclusions: PM10 originating from the desert was positively associated with mortality and hospitalizations in Southern Europe. Policy measures should aim at reducing population exposure to anthropogenic airborne particles even in areas with large contribution from desert dust advections. Citation: Stafoggia M, Zauli-Sajani S, Pey J, Samoli E, Alessandrini E, Basagaña X, Cernigliaro A, Chiusolo M, Demaria M, Díaz J, Faustini A, Katsouyanni K, Kelessis AG, Linares C, Marchesi S, Medina S, Pandolfi P, Pérez N, Querol X, Randi G, Ranzi A, Tobias A, Forastiere F, MED-PARTICLES Study Group. 2016. Desert dust outbreaks in Southern Europe: contribution to daily PM10 concentrations and short-term associations with mortality and hospital admissions. Environ Health Perspect 124:413–419; http://dx.doi.org/10.1289/ehp.1409164 PMID:26219103
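The pooling step described above (city-specific estimates combined in a random-effects meta-analysis) can be reproduced schematically with the DerSimonian-Laird estimator; the city estimates and standard errors below are invented placeholders, not the MED-PARTICLES results.

```python
import numpy as np

# Hypothetical city-specific estimates: % increase in natural mortality per
# 10 ug/m3 desert PM10 (lag 0-1) and their standard errors (illustrative only).
est = np.array([0.9, 0.4, 1.2, 0.3, 0.8, 0.5, 0.7])
se  = np.array([0.5, 0.3, 0.6, 0.4, 0.5, 0.3, 0.4])

# DerSimonian-Laird random-effects pooling
w_fixed = 1.0 / se**2
mu_fixed = np.sum(w_fixed * est) / np.sum(w_fixed)
Q = np.sum(w_fixed * (est - mu_fixed)**2)
k = len(est)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))
w_re = 1.0 / (se**2 + tau2)
mu_re = np.sum(w_re * est) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled increase: {mu_re:.2f}% "
      f"(95% CI {mu_re - 1.96*se_re:.2f} to {mu_re + 1.96*se_re:.2f}), tau^2 = {tau2:.3f}")
```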
Clinical reasoning: concept analysis.
Simmons, Barbara
2010-05-01
This paper is a report of a concept analysis of clinical reasoning in nursing. Clinical reasoning is an ambiguous term that is often used synonymously with decision-making and clinical judgment. Clinical reasoning has not been clearly defined in the literature. Healthcare settings are increasingly filled with uncertainty, risk and complexity due to increased patient acuity, multiple comorbidities, and enhanced use of technology, all of which require clinical reasoning. Data sources. Literature for this concept analysis was retrieved from several databases, including CINAHL, PubMed, PsycINFO, ERIC and OvidMEDLINE, for the years 1980 to 2008. Rodgers's evolutionary method of concept analysis was used because of its applicability to concepts that are still evolving. Multiple terms have been used synonymously to describe the thinking skills that nurses use. Research in the past 20 years has elucidated differences among these terms and identified the cognitive processes that precede judgment and decision-making. Our concept analysis defines one of these terms, 'clinical reasoning,' as a complex process that uses cognition, metacognition, and discipline-specific knowledge to gather and analyse patient information, evaluate its significance, and weigh alternative actions. This concept analysis provides a middle-range descriptive theory of clinical reasoning in nursing that helps clarify meaning and gives direction for future research. Appropriate instruments to operationalize the concept need to be developed. Research is needed to identify additional variables that have an impact on clinical reasoning and what are the consequences of clinical reasoning in specific situations.
The spatiotemporal MEG covariance matrix modeled as a sum of Kronecker products.
Bijma, Fetsje; de Munck, Jan C; Heethaar, Rob M
2005-08-15
The single Kronecker product (KP) model for the spatiotemporal covariance of MEG residuals is extended to a sum of Kronecker products. This sum of KP is estimated such that it approximates the spatiotemporal sample covariance best in matrix norm. Contrary to the single KP, this extension allows for describing multiple, independent phenomena in the ongoing background activity. Whereas the single KP model can be interpreted by assuming that background activity is generated by randomly distributed dipoles with certain spatial and temporal characteristics, the sum model can be physiologically interpreted by assuming a composite of such processes. Taking enough terms into account, the spatiotemporal sample covariance matrix can be described exactly by this extended model. In the estimation of the sum of KP model, it appears that the sum of the first two KPs describes between 67% and 93% of the sample covariance in terms of matrix norm. Moreover, these first two terms describe two physiological processes in the background activity: focal, frequency-specific alpha activity, and more widespread non-frequency-specific activity. Furthermore, temporal nonstationarities due to trial-to-trial variations are not clearly visible in the first two terms, and, hence, play only a minor role in the sample covariance matrix in terms of matrix power. Considering the dipole localization, the single KP model appears to describe around 80% of the noise and seems therefore adequate. The emphasis of further improvement of localization accuracy should be on improving the source model rather than the covariance model.
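As a sketch of the underlying linear-algebra idea (not the authors' estimator), the best sum of r Kronecker products approximating a spatiotemporal covariance in the Frobenius norm can be obtained from a truncated SVD of the Van Loan-Pitsianis rearrangement. The sizes s (sensors) and t (time samples), the block ordering, and the "explained" metric below are assumptions for illustration.

```python
# Sketch: approximate a spatiotemporal covariance C of size (s*t) x (s*t) by a
# sum of r Kronecker products C ~ sum_k A_k (x) B_k, with A_k spatial (s x s)
# and B_k temporal (t x t), via rearrangement + truncated SVD.
import numpy as np

def sum_of_kron(C: np.ndarray, s: int, t: int, r: int):
    # Rearrange: row (i, j) of R holds the (i, j)-th t-by-t block of C, vectorised.
    R = np.empty((s * s, t * t))
    for i in range(s):
        for j in range(s):
            block = C[i * t:(i + 1) * t, j * t:(j + 1) * t]
            R[i * s + j, :] = block.ravel()
    U, sig, Vt = np.linalg.svd(R, full_matrices=False)
    terms = []
    for k in range(r):
        A_k = sig[k] * U[:, k].reshape(s, s)   # spatial factor of term k
        B_k = Vt[k, :].reshape(t, t)           # temporal factor of term k
        terms.append((A_k, B_k))
    approx = sum(np.kron(A, B) for A, B in terms)
    # fraction of the sample covariance captured, measured in matrix norm
    explained = 1 - np.linalg.norm(C - approx) / np.linalg.norm(C)
    return terms, approx, explained
```

The truncated SVD of the rearranged matrix is exactly the best rank-r sum in the Frobenius norm, which mirrors the "best in matrix norm" criterion quoted in the abstract.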
Time-frequency approach to underdetermined blind source separation.
Xie, Shengli; Yang, Liu; Yang, Jun-Mei; Zhou, Guoxu; Xiang, Yong
2012-02-01
This paper presents a new time-frequency (TF) underdetermined blind source separation approach based on the Wigner-Ville distribution (WVD) and the Khatri-Rao product to separate N non-stationary sources from M (M < N) mixtures. First, an improved method is proposed for estimating the mixing matrix, in which the negative values of the auto WVD of the sources are fully taken into account. Then, after extracting all the auto-term TF points, the auto WVD values of the sources at every auto-term TF point can be recovered exactly with the proposed approach, regardless of how many sources are active, as long as N ≤ 2M-1. The extraction of auto-term TF points is discussed further, and numerical simulation results are presented to show the superiority of the proposed algorithm over existing ones.
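In generic notation (not taken verbatim from the paper), the structure being exploited is the instantaneous mixing model combined with the spatial WVD; the Khatri-Rao product appears when the mixture WVD at an auto-term point is vectorised:

```latex
\mathbf{x}(t) = \mathbf{A}\,\mathbf{s}(t), \qquad \mathbf{A}\in\mathbb{C}^{M\times N},\; M<N, \\
\mathbf{D}_{\mathbf{xx}}(t,f) = \mathbf{A}\,\mathbf{D}_{\mathbf{ss}}(t,f)\,\mathbf{A}^{H}, \\
\operatorname{vec}\!\bigl(\mathbf{D}_{\mathbf{xx}}(t,f)\bigr) = \bigl(\mathbf{A}^{*}\odot\mathbf{A}\bigr)\,\mathbf{d}(t,f)
\quad \text{at auto-term points, where } \mathbf{D}_{\mathbf{ss}} \text{ is (approximately) diagonal.}
```

Here ⊙ denotes the Khatri-Rao (column-wise Kronecker) product and d(t,f) collects the source auto-WVD values; the enlarged system of M² equations is what allows recovery with up to N ≤ 2M-1 active sources, matching the condition stated in the abstract.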
GeoSciGraph: An Ontological Framework for EarthCube Semantic Infrastructure
NASA Astrophysics Data System (ADS)
Gupta, A.; Schachne, A.; Condit, C.; Valentine, D.; Richard, S.; Zaslavsky, I.
2015-12-01
The CINERGI (Community Inventory of EarthCube Resources for Geosciences Interoperability) project compiles an inventory of a wide variety of earth science resources including documents, catalogs, vocabularies, data models, data services, process models, information repositories, domain-specific ontologies etc. developed by research groups and data practitioners. We have developed a multidisciplinary semantic framework called GeoSciGraph for the semantic integration of earth science resources. An integrated ontology is constructed with Basic Formal Ontology (BFO) as its upper ontology and currently ingests multiple component ontologies including the SWEET ontology, GeoSciML's lithology ontology, the Tematres controlled vocabulary server, GeoNames, GCMD vocabularies on equipment, platforms and institutions, a software ontology, the CUAHSI hydrology vocabulary, the environmental ontology (ENVO) and several more. These ontologies are connected through bridging axioms; GeoSciGraph identifies lexically close terms and creates equivalence class or subclass relationships between them after human verification. GeoSciGraph allows a community to create community-specific customizations of the integrated ontology. GeoSciGraph uses Neo4j, a graph database that can hold several billion concepts and relationships. GeoSciGraph provides a number of REST services that can be called by other software modules like the CINERGI information augmentation pipeline. 1) Vocabulary services are used to find exact and approximate terms, term categories (community-provided clusters of terms, e.g., measurement-related terms or environmental material-related terms), synonyms, term definitions and annotations. 2) Lexical services are used for text parsing to find entities, which can then be included into the ontology by a domain expert. 3) Graph services provide the ability to perform traversal-centric operations, e.g., finding paths and neighborhoods, which can be used to perform ontological operations like computing transitive closure (e.g., finding all subclasses of rocks). 4) Annotation services are used to adorn an arbitrary block of text (e.g., from a NOAA catalog record) with ontology terms. The system has been used to ontologically integrate diverse sources like ScienceBase, NOAA records, and PETDB.
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 15 2010-04-01 2010-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 26 Internal Revenue 15 2011-04-01 2011-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 26 Internal Revenue 15 2012-04-01 2012-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 26 Internal Revenue 15 2014-04-01 2014-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
26 CFR 31.3401(a)(14)-1 - Group-term life insurance.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 26 Internal Revenue 15 2013-04-01 2013-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...
Cost of care of haemophilia with inhibitors.
Di Minno, M N D; Di Minno, G; Di Capua, M; Cerbone, A M; Coppola, A
2010-01-01
In Western countries, the treatment of patients with inhibitors is presently the most challenging and serious issue in haemophilia management, with direct costs of clotting factor concentrates accounting for >98% of the very high economic burden absorbed for the healthcare of patients in this setting. Being designed to address questions of resource allocation and effectiveness, decision models are the gold standard to reliably assess the overall economic implications of haemophilia with inhibitors in terms of mortality, bleeding-related morbidity, and severity of arthropathy. However, at present most data analyses stem from retrospective short-term evaluations, which only allow for the analysis of direct health costs. In the setting of chronic diseases, cost-utility analysis, which takes into account the beneficial effects of a given treatment/healthcare intervention in terms of health-related quality of life, is likely to be the most appropriate approach. To calculate net benefits, the quality-adjusted life year, which reflects such health gain, has to be compared with specific economic impacts. Differences in data sources, in medical practice and/or in healthcare systems and costs imply that most current pharmacoeconomic analyses are confined to a narrow healthcare-payer perspective. Long-term/lifetime prospective or observational studies, devoted to a careful definition of when to start a treatment, of regimens (dose and type of product) to employ, and of inhibitor populations (children/adults, low-responding/high-responding inhibitors) to study, are thus urgently needed to allow for newer insights, based on reliable data sources, into resource allocation, effectiveness and cost-utility analysis in the treatment of haemophiliacs with inhibitors.
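A deliberately simple, purely hypothetical illustration of the cost-utility quantity discussed above (all numbers invented): the incremental cost-effectiveness ratio compares the extra cost of a strategy with the quality-adjusted life years it gains.

```python
# Hedged, illustrative cost-utility calculation (all numbers hypothetical):
# incremental cost-effectiveness ratio (ICER) = delta cost / delta QALY for a
# new inhibitor-treatment strategy versus standard care.
cost_standard, qaly_standard = 1_200_000.0, 8.5   # lifetime cost (EUR), QALYs
cost_new,      qaly_new      = 1_450_000.0, 9.3

icer = (cost_new - cost_standard) / (qaly_new - qaly_standard)
print(f"ICER = {icer:,.0f} EUR per QALY gained")   # 312,500 EUR/QALY in this toy case
```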
An open-source and low-cost monitoring system for precision enology.
Di Gennaro, Salvatore Filippo; Matese, Alessandro; Mancin, Mirko; Primicerio, Jacopo; Palliotti, Alberto
2014-12-05
Winemaking is a dynamic process, where microbiological and chemical effects may strongly differentiate products from the same vineyard and even between wine vats. This high variability means an increase in work in terms of control and process management. The winemaking process therefore requires a site-specific approach in order to optimize cellar practices and quality management, suggesting a new concept of winemaking, identified as Precision Enology. The Institute of Biometeorology of the Italian National Research Council has developed a wireless monitoring system, consisting of a series of nodes integrated in barrel bungs with sensors for the measurement of wine physical and chemical parameters in the barrel. This paper describes an open-source evolution of the preliminary prototype, using Arduino-based technology. Results have shown good performance in terms of data transmission and accuracy, minimal size and power consumption. The system has been designed to create a low-cost product, which allows a remote and real-time control of wine evolution in each barrel, minimizing costs and time for sampling and laboratory analysis. The possibility of integrating any kind of sensors makes the system a flexible tool that can satisfy various monitoring needs.
Rodrigues, Mayla Santos; Ferreira, Lívia Seno; Converti, Attilio; Sato, Sunao; de Carvalho, João Carlos Monteiro
2011-06-01
Previous work demonstrated that a mixture of NH4Cl and KNO3 as nitrogen source was beneficial to fed-batch Arthrospira (Spirulina) platensis cultivation, in terms of either lower costs or higher cell concentration. On the basis of those results, this study focused on the use of a cheaper nitrogen source mixture, namely (NH4)2SO4 plus NaNO3, varying the ammonium feeding time (T = 7-15 days), either controlling the pH by CO2 addition or not. A. platensis was cultivated in mini-tanks at 30°C, 156 μmol photons m⁻² s⁻¹, and a starting cell concentration of 400 mg L⁻¹, on a modified Schlösser medium. T = 13 days under pH control was selected as the optimum condition, ensuring the best results in terms of biomass production (maximum cell concentration of 2911 mg L⁻¹, cell productivity of 179 mg L⁻¹ d⁻¹ and specific growth rate of 0.77 d⁻¹) and satisfactory protein and lipid contents (around 30% each). Copyright © 2011 Elsevier Ltd. All rights reserved.
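As a quick consistency check of the reported figures (a sketch only; the total cultivation time is not given in the abstract and is assumed to be 14 days here):

```python
# Back-of-the-envelope check of the biomass figures quoted above.
# The 14-day cultivation time is an assumption, not stated in the abstract.
import math

X0, Xmax, days = 400.0, 2911.0, 14.0              # mg/L, mg/L, d (assumed)
productivity = (Xmax - X0) / days                  # ~179 mg/(L*d), matching the abstract
mu_endpoint = math.log(Xmax / X0) / days           # ~0.14 1/d, an *average* rate only
# Note: the reported specific growth rate (0.77 1/d) is a maximum from the
# exponential phase and cannot be recovered from the endpoint values alone.
print(round(productivity, 1), round(mu_endpoint, 2))
```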
Global threat to agriculture from invasive species.
Paini, Dean R; Sheppard, Andy W; Cook, David C; De Barro, Paul J; Worner, Susan P; Thomas, Matthew B
2016-07-05
Invasive species present significant threats to global agriculture, although how the magnitude and distribution of the threats vary between countries and regions remains unclear. Here, we present an analysis of almost 1,300 known invasive insect pests and pathogens, calculating the total potential cost of these species invading each of 124 countries of the world, as well as determining which countries present the greatest threat to the rest of the world given their trading partners and incumbent pool of invasive species. We find that countries vary in terms of potential threat from invasive species and also their role as potential sources, with apparently similar countries sometimes varying markedly depending on specifics of agricultural commodities and trade patterns. Overall, the biggest agricultural producers (China and the United States) could experience the greatest absolute cost from further species invasions. However, developing countries, in particular, Sub-Saharan African countries, appear most vulnerable in relative terms. Furthermore, China and the United States represent the greatest potential sources of invasive species for the rest of the world. The analysis reveals considerable scope for ongoing redistribution of known invasive pests and highlights the need for international cooperation to slow their spread.
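A schematic of how such a country-to-country threat matrix could be assembled (an illustrative toy, not the authors' model; the presence matrix, costs, and trade weights below are randomly generated placeholders):

```python
# Toy sketch: threat[i, j] = trade-weighted potential cost imposed on country j
# by invasive species present in country i but absent from j. All inputs are
# random placeholders, not real data.
import numpy as np

rng = np.random.default_rng(0)
n_countries, n_species = 5, 20
present = rng.random((n_countries, n_species)) < 0.3       # species pool per country
cost = rng.lognormal(mean=2.0, sigma=1.0, size=n_species)   # potential cost per species
trade = rng.random((n_countries, n_countries))              # arrival-likelihood proxy
np.fill_diagonal(trade, 0.0)

threat = np.zeros((n_countries, n_countries))
for i in range(n_countries):
    for j in range(n_countries):
        novel = present[i] & ~present[j]                    # species new to country j
        threat[i, j] = trade[i, j] * cost[novel].sum()

threat_as_source = threat.sum(axis=1)    # how much each country threatens the others
threat_received = threat.sum(axis=0)     # relative vulnerability of each country
```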
Global threat to agriculture from invasive species
Paini, Dean R.; Sheppard, Andy W.; Cook, David C.; De Barro, Paul J.; Worner, Susan P.; Thomas, Matthew B.
2016-01-01
Invasive species present significant threats to global agriculture, although how the magnitude and distribution of the threats vary between countries and regions remains unclear. Here, we present an analysis of almost 1,300 known invasive insect pests and pathogens, calculating the total potential cost of these species invading each of 124 countries of the world, as well as determining which countries present the greatest threat to the rest of the world given their trading partners and incumbent pool of invasive species. We find that countries vary in terms of potential threat from invasive species and also their role as potential sources, with apparently similar countries sometimes varying markedly depending on specifics of agricultural commodities and trade patterns. Overall, the biggest agricultural producers (China and the United States) could experience the greatest absolute cost from further species invasions. However, developing countries, in particular, Sub-Saharan African countries, appear most vulnerable in relative terms. Furthermore, China and the United States represent the greatest potential sources of invasive species for the rest of the world. The analysis reveals considerable scope for ongoing redistribution of known invasive pests and highlights the need for international cooperation to slow their spread. PMID:27325781
12 CFR 201.4 - Availability and terms of credit.
Code of Federal Regulations, 2014 CFR
2014-01-01
... overnight, as a backup source of funding to a depository institution that is in generally sound financial... to a few weeks as a backup source of funding to a depository institution if, in the judgment of the... very short-term basis, usually overnight, as a backup source of funding to a depository institution...
NASA Technical Reports Server (NTRS)
Tiwari, Vidhu S.; Kalluru, Rajamohan R.; Yueh, Fang-Yu; Singh, Jagdish P.; SaintCyr, William
2007-01-01
A spontaneous Raman scattering optical fiber sensor is developed for a specific need of NASA/SSC for long-term detection and monitoring of the quality of liquid oxygen (LOX) in the delivery line during ground testing of rocket engines. The sensor performance was tested in the laboratory and with different excitation light sources. To evaluate the sensor performance with different excitation light sources for the LOX quality application, we have used the various mixtures of liquid oxygen and liquid nitrogen as samples. The study of the sensor performance shows that this sensor offers a great deal of flexibility and provides a cost effective solution for the application. However, an improved system response time is needed for the real-time, quantitative monitoring of the quality of cryogenic fluids in harsh environment.
Isolation of Genetically Diverse Marburg Viruses from Egyptian Fruit Bats
Towner, Jonathan S.; Amman, Brian R.; Sealy, Tara K.; Carroll, Serena A. Reeder; Comer, James A.; Kemp, Alan; Swanepoel, Robert; Paddock, Christopher D.; Balinandi, Stephen; Khristova, Marina L.; Formenty, Pierre B. H.; Albarino, Cesar G.; Miller, David M.; Reed, Zachary D.; Kayiwa, John T.; Mills, James N.; Cannon, Deborah L.; Greer, Patricia W.; Byaruhanga, Emmanuel; Farnon, Eileen C.; Atimnedi, Patrick; Okware, Samuel; Katongole-Mbidde, Edward; Downing, Robert; Tappero, Jordan W.; Zaki, Sherif R.; Ksiazek, Thomas G.; Nichol, Stuart T.; Rollin, Pierre E.
2009-01-01
In July and September 2007, miners working in Kitaka Cave, Uganda, were diagnosed with Marburg hemorrhagic fever. The likely source of infection in the cave was Egyptian fruit bats (Rousettus aegyptiacus) based on detection of Marburg virus RNA in 31/611 (5.1%) bats, virus-specific antibody in bat sera, and isolation of genetically diverse virus from bat tissues. The virus isolates were collected nine months apart, demonstrating long-term virus circulation. The bat colony was estimated to be over 100,000 animals using mark and re-capture methods, predicting the presence of over 5,000 virus-infected bats. The genetically diverse virus genome sequences from bats and miners closely matched. These data indicate common Egyptian fruit bats can represent a major natural reservoir and source of Marburg virus with potential for spillover into humans. PMID:19649327
Isolation of genetically diverse Marburg viruses from Egyptian fruit bats.
Towner, Jonathan S; Amman, Brian R; Sealy, Tara K; Carroll, Serena A Reeder; Comer, James A; Kemp, Alan; Swanepoel, Robert; Paddock, Christopher D; Balinandi, Stephen; Khristova, Marina L; Formenty, Pierre B H; Albarino, Cesar G; Miller, David M; Reed, Zachary D; Kayiwa, John T; Mills, James N; Cannon, Deborah L; Greer, Patricia W; Byaruhanga, Emmanuel; Farnon, Eileen C; Atimnedi, Patrick; Okware, Samuel; Katongole-Mbidde, Edward; Downing, Robert; Tappero, Jordan W; Zaki, Sherif R; Ksiazek, Thomas G; Nichol, Stuart T; Rollin, Pierre E
2009-07-01
In July and September 2007, miners working in Kitaka Cave, Uganda, were diagnosed with Marburg hemorrhagic fever. The likely source of infection in the cave was Egyptian fruit bats (Rousettus aegyptiacus) based on detection of Marburg virus RNA in 31/611 (5.1%) bats, virus-specific antibody in bat sera, and isolation of genetically diverse virus from bat tissues. The virus isolates were collected nine months apart, demonstrating long-term virus circulation. The bat colony was estimated to be over 100,000 animals using mark and re-capture methods, predicting the presence of over 5,000 virus-infected bats. The genetically diverse virus genome sequences from bats and miners closely matched. These data indicate common Egyptian fruit bats can represent a major natural reservoir and source of Marburg virus with potential for spillover into humans.
Material from the Internal Surface of Squid Axon Exhibits Excess Noise
Fishman, Harvey M.
1981-01-01
A fluid material from a squid (Loligo pealei) axon was isolated by mechanical application of two types of microcapillary (1-3-μm Diam) to the internal surface of intact and cut-axon preparations. Current noise in the isolated material exceeded thermal levels and power spectra were 1/f in form in the frequency range 1.25-500 Hz with voltage-dependent intensities that were unrelated to specific ion channels. Whether conduction in this material is a significant source of excess noise during axon conduction remains to be determined. Nevertheless, a source of excess noise external to or within an ion channel may not be properly represented solely as an additive term to the spectrum of ion channel noise; a deconvolution of these spectral components may be required for modeling purposes. PMID:6266542
Computed myography: three-dimensional reconstruction of motor functions from surface EMG data
NASA Astrophysics Data System (ADS)
van den Doel, Kees; Ascher, Uri M.; Pai, Dinesh K.
2008-12-01
We describe a methodology called computed myography to qualitatively and quantitatively determine the activation level of individual muscles by voltage measurements from an array of voltage sensors on the skin surface. A finite element model for electrostatics simulation is constructed from morphometric data. For the inverse problem, we utilize a generalized Tikhonov regularization. This imposes smoothness on the reconstructed sources inside the muscles and suppresses sources outside the muscles using a penalty term. Results from experiments with simulated and human data are presented for activation reconstructions of three muscles in the upper arm (biceps brachii, brachialis and triceps). This approach potentially offers a new clinical tool to sensitively assess muscle function in patients suffering from neurological disorders (e.g., spinal cord injury), and could more accurately guide advances in the evaluation of specific rehabilitation training regimens.
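A minimal sketch of the generalized Tikhonov step described above, assuming a precomputed forward (lead-field) matrix A, a penalty operator L, and a regularization weight lam; none of these placeholders correspond to the paper's actual finite element model.

```python
# Generalized Tikhonov regularization: minimise ||A x - b||^2 + lam * ||L x||^2,
# where L penalises, e.g., non-smooth activations or sources outside the muscles.
import numpy as np

def tikhonov_solve(A: np.ndarray, b: np.ndarray, L: np.ndarray, lam: float):
    # Normal equations: (A^T A + lam L^T L) x = A^T b
    lhs = A.T @ A + lam * (L.T @ L)
    rhs = A.T @ b
    return np.linalg.solve(lhs, rhs)

# Toy usage with synthetic data (not EMG measurements)
rng = np.random.default_rng(1)
A = rng.normal(size=(64, 200))            # sensors x candidate source locations
x_true = np.zeros(200); x_true[50:60] = 1.0
b = A @ x_true + 0.01 * rng.normal(size=64)
x_hat = tikhonov_solve(A, b, np.eye(200), lam=1e-2)
```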
Comparison of Physics Frameworks for WebGL-Based Game Engine
NASA Astrophysics Data System (ADS)
Yogya, Resa; Kosala, Raymond
2014-03-01
Recently, a new technology called WebGL has shown a lot of potential for developing games. However, since this technology is still new, many possibilities in the game development area remain unexplored. This paper tries to uncover the potential of integrating physics frameworks with WebGL technology in a game engine for developing 2D or 3D games. Specifically, we integrated three open-source physics frameworks, Bullet, Cannon, and JigLib, into a WebGL-based game engine. Through experiments, we assessed these frameworks in terms of their correctness or accuracy, performance, completeness and compatibility. The results show that it is possible to integrate open-source physics frameworks into a WebGL-based game engine, and that Bullet is the best of the three physics frameworks to integrate into the WebGL-based game engine.
Integrating Information in Biological Ontologies and Molecular Networks to Infer Novel Terms
Li, Le; Yip, Kevin Y.
2016-01-01
Currently most terms and term-term relationships in Gene Ontology (GO) are defined manually, which creates cost, consistency and completeness issues. Recent studies have demonstrated the feasibility of inferring GO automatically from biological networks, which represents an important complementary approach to GO construction. These methods (NeXO and CliXO) are unsupervised, which means 1) they cannot use the information contained in existing GO, 2) the way they integrate biological networks may not optimize the accuracy, and 3) they are not customized to infer the three different sub-ontologies of GO. Here we present a semi-supervised method called Unicorn that extends these previous methods to tackle the three problems. Unicorn uses a sub-tree of an existing GO sub-ontology as training data to learn the parameters for integrating multiple networks. Cross-validation results show that Unicorn reliably inferred the left-out parts of each specific GO sub-ontology. In addition, by training Unicorn with an old version of GO together with biological networks, it successfully re-discovered some terms and term-term relationships present only in a new version of GO. Unicorn also successfully inferred some novel terms that were not contained in GO but have biological meanings well-supported by the literature. Availability: Source code of Unicorn is available at http://yiplab.cse.cuhk.edu.hk/unicorn/. PMID:27976738
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ford, S R; Dreger, D S; Phillips, W S
2008-07-16
Inversions for regional attenuation (1/Q) of Lg are performed in two different regions. The path attenuation component of the Lg spectrum is isolated using the coda-source normalization method, which corrects the Lg spectral amplitude for the source using the stable, coda-derived source spectra. Tomographic images of Northern California agree well with one-dimensional (1-D) Lg Q estimated from five different methods. We note there is some tendency for tomographic smoothing to increase Q relative to targeted 1-D methods. For example, in the San Francisco Bay Area, which contains high attenuation relative to the rest of its region, Q is over-estimated by ~30. Coda-source normalized attenuation tomography is also carried out for the Yellow Sea/Korean Peninsula (YSKP), where output parameters (site, source, and path terms) are compared with those from the amplitude tomography method of Phillips et al. (2005) as well as a new method that ties the source term to the MDAC formulation (Walter and Taylor, 2001). The source terms show similar scatter between coda-source corrected and MDAC source perturbation methods, whereas the amplitude method has the greatest correlation with estimated true source magnitude. The coda-source better represents the source spectra compared to the estimated magnitude and could be the cause of the scatter. The similarity in the source terms between the coda-source and MDAC-linked methods shows that the latter method may approximate the effect of the former, and therefore could be useful in regions without coda-derived sources. The site terms from the MDAC-linked method correlate slightly with global Vs30 measurements. While the coda-source and amplitude ratio methods do not correlate with Vs30 measurements, they do correlate with one another, which provides confidence that the two methods are consistent. The path Q⁻¹ values are very similar between the coda-source and amplitude ratio methods except for small differences in the Daxing'anling Mountains, in the northern YSKP. However, there is one large difference between the MDAC-linked method and the others in the region near stations TJN and INCN, which points to site effects as the cause for the difference.
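In generic notation (not taken from the report), the quantity being inverted is the path term of a standard Lg spectral model; the coda-derived source spectrum is used to remove the source term before tomography:

```latex
A_{Lg}(f, r) \;=\; S(f)\, P(f)\, G(r)\, \exp\!\left(-\frac{\pi f r}{Q(f)\, v}\right),
```

where S(f) is the source spectrum, P(f) a site term, G(r) geometrical spreading, v the Lg group velocity, and Q(f) the quality factor. Dividing the observed Lg spectrum by the coda-derived S(f) leaves a quantity whose logarithm is linear in the path-integrated Q⁻¹, which is what the tomographic inversion maps.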
NASA Technical Reports Server (NTRS)
Hayes, J. M.; Freeman, K. H.; Popp, B. N.; Hoham, C. H.
1990-01-01
Patterns of isotopic fractionation in biogeochemical processes are reviewed and it is suggested that isotopic fractionations will be small when substrates are large. If so, isotopic compositions of biomarkers will reflect those of their biosynthetic precursors. This prediction is tested by consideration of results of analyses of geoporphyrins and geolipids from the Greenhorn Formation (Cretaceous, Western Interior Seaway of North America) and the Messel Shale (Eocene, lacustrine, southern Germany). It is shown (i) that isotopic compositions of porphyrins that are related to a common source, but which have been altered structurally, cluster tightly and (ii) that isotopic differences between geolipids and porphyrins related to a common source are equal to those observed in modern biosynthetic products. Both of these observations are consistent with preservation of biologically controlled isotopic compositions during diagenesis. Isotopic compositions of individual compounds can thus be interpreted in terms of biogeochemical processes in ancient depositional environments. In the Cretaceous samples, isotopic compositions of n-alkanes are covariant with those of total organic carbon, while delta values for pristane and phytane are covariant with those of porphyrins. In this unit representing an open marine environment, the preserved acyclic polyisoprenoids apparently derive mainly from primary material, while the extractable n-alkanes derive mainly from lower levels of the food chain. In the Messel Shale, isotopic compositions of individual biomarkers range from -20.9 to -73.4‰ vs PDB. Isotopic compositions of specific compounds can be interpreted in terms of origin from methylotrophic, chemoautotrophic, and chemolithotrophic microorganisms as well as from primary producers that lived in the water column and sediments of this ancient lake.
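For reference, the delta values quoted above follow the standard per-mil notation relative to the PDB carbonate standard:

```latex
\delta^{13}\mathrm{C} \;=\; \left( \frac{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\text{sample}}}
{\left(^{13}\mathrm{C}/^{12}\mathrm{C}\right)_{\text{PDB}}} - 1 \right) \times 1000\;\text{‰}
```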
Role of large scale energy systems models in R and D planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lamontagne, J.
1980-11-01
Long-term energy policy deals with the problem of finite supplies of convenient energy sources becoming more costly as they are depleted. The development of alternative technologies to provide new sources of energy and extend the lives of current ones is an attractive option available to government. Thus, one aspect of long-term energy policy involves investment in R and D. The importance of the problems addressed by R and D to the future of society (especially with regard to energy) dictates adoption of a cogent approach to resource allocation and to the designation of priorities for R and D. It is hoped that energy systems models when properly used can provide useful inputs to this process. The influence of model results on energy policy makers who are not knowledgeable about flaws or uncertainties in the models, errors in assumptions in model inputs which can result in faulty forecasts, the overall usefulness of energy system models, and model limitations are discussed. It is suggested that the large scale energy systems models currently used for assessing a broad spectrum of policy issues need to be replaced with reasonably simple models capable of dealing with uncertainty in a straightforward manner, and their methodologies and the meaning of their results should be transparent, especially to those removed from the modeling process. Energy models should be clearly related to specific issues. Methodologies should be clearly related to specific decisions, and should allow adjustments to be easily made for alternative assumptions and for additional knowledge gained during the evolution of the energy system. (LCL)
Economics of wind energy for utilities
NASA Technical Reports Server (NTRS)
Mccabe, T. F.; Goldenblatt, M. K.
1982-01-01
Utility acceptance of this technology will be contingent upon the establishment of both its technical and economic feasibility. This paper presents preliminary results from a study currently underway to establish the economic value of central station wind energy to certain utility systems. The results for the various utilities are compared specifically in terms of three parameters which have a major influence on the economic value: (1) wind resource, (2) mix of conventional generation sources, and (3) specific utility financial parameters including projected fuel costs. The wind energy is derived from modeling either MOD-2 or MOD-0A wind turbines in wind resources determined by a year of data obtained from the DOE supported meteorological towers with a two-minute sampling frequency. In this paper, preliminary results for six of the utilities studied are presented and compared.
Augmented Citizen Science for Environmental Monitoring and Education
NASA Astrophysics Data System (ADS)
Albers, B.; de Lange, N.; Xu, S.
2017-09-01
Environmental monitoring and ecological studies detect and visualize changes of the environment over time. Some agencies are committed to documenting the development of conservation and the status of geotopes and geosites, which is time-consuming and cost-intensive. Citizen science and crowdsourcing are modern approaches to collect data and at the same time raise user awareness of environmental changes. Citizen scientists can take photographs of points of interest (POIs) with smartphones and the PAN App, which is presented in this article. The user is navigated to a specific point and is then guided with an augmented reality approach to take a photo in a specific direction. The collected photographs are processed into time-lapse videos to visualize environmental changes. Users and experts in environmental agencies can use these data for long-term documentation.
Krall, J. R.; Hackstadt, A. J.; Peng, R. D.
2017-01-01
Exposure to particulate matter (PM) air pollution has been associated with a range of adverse health outcomes, including cardiovascular disease (CVD) hospitalizations and other clinical parameters. Determining which sources of PM, such as traffic or industry, are most associated with adverse health outcomes could help guide future recommendations aimed at reducing harmful pollution exposure for susceptible individuals. Information obtained from multisite studies, which is generally more precise than information from a single location, is critical to understanding how PM impacts health and to informing local strategies for reducing individual-level PM exposure. However, few methods exist to perform multisite studies of PM sources, which are not generally directly observed, and adverse health outcomes. We developed SHARE, a hierarchical modeling approach that facilitates reproducible, multisite epidemiologic studies of PM sources. SHARE is a two-stage approach that first summarizes information about PM sources across multiple sites. Then, this information is used to determine how community-level (i.e. county- or city-level) health effects of PM sources should be pooled to estimate regional-level health effects. SHARE is a type of population value decomposition that aims to separate out regional-level features from site-level data. Unlike previous approaches for multisite epidemiologic studies of PM sources, the SHARE approach allows the specific PM sources identified to vary by site. Using data from 2000–2010 for 63 northeastern US counties, we estimated regional-level health effects associated with short-term exposure to major types of PM sources. We found PM from secondary sulfate, traffic, and metals sources was most associated with CVD hospitalizations. PMID:28098412
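The second-stage idea described above (pooling community-level effect estimates into a regional estimate while allowing between-site heterogeneity) can be sketched with a generic DerSimonian-Laird random-effects combination; this is only the standard building block, not SHARE's actual hierarchical estimator or its source-apportionment stage.

```python
# Generic random-effects pooling of site-level effect estimates (e.g. log relative
# rates) and their variances into a regional estimate.
import numpy as np

def dersimonian_laird(beta: np.ndarray, var: np.ndarray):
    w = 1.0 / var
    beta_fe = np.sum(w * beta) / np.sum(w)               # fixed-effects pooled estimate
    q = np.sum(w * (beta - beta_fe) ** 2)                # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(beta) - 1)) / c)           # between-site variance
    w_re = 1.0 / (var + tau2)
    beta_re = np.sum(w_re * beta) / np.sum(w_re)         # random-effects pooled estimate
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return beta_re, se_re, tau2
```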
10 CFR 40.41 - Terms and conditions of licenses.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Terms and conditions of licenses. 40.41 Section 40.41 Energy NUCLEAR REGULATORY COMMISSION DOMESTIC LICENSING OF SOURCE MATERIAL Licenses § 40.41 Terms and... the regulations in this part shall confine his possession and use of source or byproduct material to...
Bell, Raoul; Giang, Trang; Buchner, Axel
2012-01-01
Previous research has shown a source memory advantage for faces presented in negative contexts. As yet it remains unclear whether participants remember the specific type of context in which the faces were presented or whether they can only remember that the face was associated with negative valence. In the present study, participants saw faces together with descriptions of two different types of negative behaviour and neutral behaviour. In Experiment 1, we examined whether the participants were able to discriminate between two types of other-relevant negative context information (cheating and disgusting behaviour) in a source memory test. In Experiment 2, we assessed source memory for other-relevant negative (threatening) context information (other-aggressive behaviour) and self-relevant negative context information (self-aggressive behaviour). A multinomial source memory model was used to separately assess partial source memory for the negative valence of the behaviour and specific source memory for the particular type of negative context the face was associated with. In Experiment 1, source memory was specific for the particular type of negative context presented (i.e., cheating or disgusting behaviour). Experiment 2 showed that source memory for other-relevant negative information was more specific than source memory for self-relevant information. Thus, emotional source memory may vary in specificity depending on the degree to which the negative emotional context is perceived as threatening.
Consedine, Nathan S; Adjei, Brenda A; Ramirez, Paul M; McKiernan, James M
2008-07-01
Fears regarding prostate cancer and the associated screening are widespread. However, the relations between anxiety, cancer worry, and screening fear and screening behavior are complex, because anxieties stemming from different sources have different effects on behavior. In differentiating among anxieties from different sources (trait anxiety, cancer worry, and screening fear), we expected that cancer worry would be associated with more frequent screening, whereas fear of screening would be associated with less frequent screening. Hypotheses were tested in a sample of 533 men (ages 45-70 years) recruited using a stratified cluster-sampling plan. Men provided information on demographic and structural variables (age, education, income, marital status, physician discussion of risk and screening, access, and insurance) and completed a set of anxiety measures (trait anxiety, cancer worry, and screening fear). As expected, two-step multiple regressions controlling for demographics, health insurance status, physician discussion, and health-care system barriers showed that prostate-specific antigen and digital rectal examination frequencies had unique associations with cancer worry and screening fear. Specifically, whereas cancer worry was associated with more frequent screening, fear of screening was associated with less frequent screening at least for digital rectal examination; trait anxiety was inconsistently related to screening. Data are discussed in terms of their implications for male screening and the understanding of how anxiety motivates health behaviors. It is suggested that understanding the source of anxiety and the manner in which health behaviors such as cancer screenings may enhance or reduce felt anxiety is a likely key to understanding the associations between anxiety and behavioral outcomes.
Emerging Disparities in Dietary Sodium Intake from Snacking in the US Population.
Dunford, Elizabeth K; Poti, Jennifer M; Popkin, Barry M
2017-06-17
The US population consumes dietary sodium well in excess of recommended levels. It is unknown how the contribution of snack foods to sodium intake has changed over time, and whether disparities exist within specific subgroups of the US population. Our objective was to examine short- and long-term trends in the contribution of snack food sources to dietary sodium intake for US adults and children over a 37-year period from 1977 to 2014. We used data collected from eight nationally representative surveys of food intake in 50,052 US children aged 2-18 years, and 73,179 adults aged 19+ years between 1977 and 2014. Overall, patterns of snack food consumption, trends in sodium intake from snack food sources and trends in food and beverage sources of sodium from snack foods across race-ethnic, age, gender, body mass index, household education and income groups were examined. In all socio-demographic subgroups there was a significant increase in both per capita sodium intake and the proportion of sodium intake derived from snacks from 1977-1978 to 2011-2014 (p < 0.01). Those with the lowest household education, Non-Hispanic Black race-ethnicity, and the lowest income had the largest increase in sodium intake from snacks. While in 1977-1978 Non-Hispanic Blacks had a lower sodium intake from snacks compared to Non-Hispanic Whites (p < 0.01), in 2011-2014 they had a significantly higher intake. Conclusions: Important disparities are emerging in dietary sodium intake from snack sources in Non-Hispanic Blacks. Our findings have implications for future policy interventions targeting specific US population subgroups.
Enhancing GADRAS Source Term Inputs for Creation of Synthetic Spectra.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horne, Steven M.; Harding, Lee
The Gamma Detector Response and Analysis Software (GADRAS) team has enhanced the source term input for the creation of synthetic spectra. These enhancements include the following: allowing users to programmatically provide source information to GADRAS through memory, rather than through a string limited to 256 characters; allowing users to provide their own source decay database information; and updating the default GADRAS decay database to fix errors and include coincident gamma information.
Localization of sound sources in a room with one microphone
NASA Astrophysics Data System (ADS)
Peić Tukuljac, Helena; Lissek, Hervé; Vandergheynst, Pierre
2017-08-01
Estimation of the location of sound sources is usually done using microphone arrays. Such settings provide an environment where we know the difference between the received signals among different microphones in terms of phase or attenuation, which enables localization of the sound sources. In our solution we exploit the properties of the room transfer function in order to localize a sound source inside a room with only one microphone. The shape of the room and the position of the microphone are assumed to be known. The design guidelines and limitations of the sensing matrix are given. Implementation is based on sparsity in terms of the voxels in the room that are occupied by a source. What is especially interesting about our solution is that we provide localization of the sound sources not only in the horizontal plane, but in terms of full 3D coordinates inside the room.
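A hedged sketch of the sparse-recovery formulation implied above: the single-microphone measurements are modelled as a known sensing matrix (room transfer functions from candidate voxels to the microphone) acting on a voxel-occupancy vector that is sparse. The random matrix, Lasso solver, and threshold below are stand-ins, not the authors' design.

```python
# Toy sparse localization: y = Phi @ x with x non-zero only at occupied voxels.
# Phi is random placeholder data here, not a simulated room transfer function.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n_freqs, n_voxels = 256, 1000
Phi = rng.normal(size=(n_freqs, n_voxels))          # placeholder sensing matrix
x_true = np.zeros(n_voxels); x_true[[123, 777]] = 1.0
y = Phi @ x_true + 0.01 * rng.normal(size=n_freqs)  # single-microphone observations

lasso = Lasso(alpha=0.05, max_iter=10000).fit(Phi, y)
estimated_voxels = np.nonzero(lasso.coef_ > 0.1)[0]  # candidate source voxels
```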
A theoretical prediction of the acoustic pressure generated by turbulence-flame front interactions
NASA Technical Reports Server (NTRS)
Huff, R. G.
1984-01-01
The equations of momentum and continuity are combined and linearized yielding the one-dimensional nonhomogeneous acoustic wave equation. Three terms in the non-homogeneous equation act as acoustic sources and are taken to be forcing functions acting on the homogeneous wave equation. The three source terms are: fluctuating entropy, turbulence gradients, and turbulence-flame interactions. Each source term is discussed. The turbulence-flame interaction source is used as the basis for computing the source acoustic pressure from the Fourier-transformed wave equation. Pressure fluctuations created in turbopump gas generators and turbines may act as a forcing function for turbine and propellant tube vibrations in Earth-to-orbit space propulsion systems and could reduce their life expectancy. A preliminary assessment of the acoustic pressure fluctuations in such systems is presented.
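Schematically, in generic notation rather than the paper's exact symbols, the linearized result has the form of a one-dimensional wave equation driven by the three source terms named above:

```latex
\frac{\partial^{2} p'}{\partial t^{2}} \;-\; c^{2}\,\frac{\partial^{2} p'}{\partial x^{2}}
\;=\; q_{\text{entropy}} \;+\; q_{\text{turb}} \;+\; q_{\text{turb-flame}},
```

where p' is the acoustic pressure fluctuation, c the sound speed, and each q acts as a forcing function; the paper Fourier-transforms this equation and evaluates the turbulence-flame interaction term to predict the radiated pressure.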
NASA Astrophysics Data System (ADS)
Giemsa, Esther; Jacobeit, Jucundus; Ries, Ludwig; Frank, Gabriele; Hachinger, Stephan; Meyer-Arnek, Julian
2017-04-01
Carbon dioxide (CO2) and methane (CH4) represent the most important contributors to increased radiative forcing, together enhancing it by a present-day 2.65 W/m2 on the global average (IPCC 2013). The unbroken increase of atmospheric greenhouse gases (GHG) has been unequivocally attributed to human emissions mainly coming from fossil fuel burning and land-use changes, while the oceans and terrestrial ecosystems slightly attenuate this rise with seasonally varying strength. Short-term fluctuations in the GHG concentrations that superimpose the seasonal cycle and the climate change driven trend reflect the presence of regional sources and sinks. A perfect place for investigating the comprehensive influence of these regional emissions is provided by the Environmental Research Station Schneefernerhaus (47.42°N, 10.98°E, 2,650 m a.s.l.) situated in the eastern Alps at the southern side of Zugspitze mountain. Located just 300 m below the highest peak of the German Alps, the exposed site is one of the currently 30 global core sites of the World Meteorological Organisation (WMO) Global Atmosphere Watch (GAW) programme and thus provides ideal conditions to study source-receptor relationships for greenhouse gases. We propose a stepwise statistical methodology for examining the relationship between synoptic-scale atmospheric transport patterns and greenhouse gas mole fractions to finally obtain a characterization of the sampling site with regard to the key processes driving CO2 and CH4 concentration levels. The first step entails a reliable radon-based filtering approach to subdivide the detected air masses according to their regional or 'background' origin. Simultaneously, a large number of ten-day back-trajectories from Schneefernerhaus every two hours over the entire study period 2011-2015 is calculated with the Lagrangian transport and dispersion model FLEXPART (Stohl et al. 2005) and subjected to cluster analysis. The weather- and emission strength-related (short-term) components of the regional CO2 and CH4 concentration time-series are assigned to the back-trajectory clusters. The significant differences in the greenhouse gases' distributions associated with each cluster are confirmed by the non-parametric Kruskal-Wallis test, thereby delivering the prerequisites for further investigations, in particular by Potential Source Contribution Functions for the detection of probable locations of within-cluster emission sources. The advantages of this comprehensive approach are site-specificity (by considering trajectories arriving at Schneefernerhaus as well as a site-appropriate filter method) and concentration-specificity (each greenhouse gas has its own source regions), combined with respecting the space and time scales related to the synoptic flow patterns in source attribution studies. This research received funding from the Bavarian State Ministry of the Environment and Consumer Protection.
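The statistical chain described above (cluster the back-trajectories, then test whether the short-term GHG components differ between clusters) can be sketched as follows; the arrays are placeholder data, the cluster count is arbitrary, and the real study works on FLEXPART output rather than flattened coordinate vectors.

```python
# Sketch: k-means clustering of back-trajectories followed by a Kruskal-Wallis
# test on detrended, deseasonalised CO2 anomalies grouped by cluster.
import numpy as np
from scipy.stats import kruskal
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n_traj, n_points = 500, 240                 # e.g. 10-day trajectories, 2-hourly lat/lon pairs
traj = rng.normal(size=(n_traj, n_points))  # flattened trajectory coordinates (placeholder)
co2_resid = rng.normal(size=n_traj)         # short-term CO2 components per arrival time

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(traj)
groups = [co2_resid[labels == k] for k in range(5)]
stat, p = kruskal(*groups)                  # non-parametric test across clusters
print(f"Kruskal-Wallis H = {stat:.2f}, p = {p:.3f}")
```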
Energy Harvesting Research: The Road from Single Source to Multisource.
Bai, Yang; Jantunen, Heli; Juuti, Jari
2018-06-07
Energy harvesting technology may be considered an ultimate solution to replace batteries and provide a long-term power supply for wireless sensor networks. Looking back into its research history, individual energy harvesters for the conversion of single energy sources into electricity were developed first, followed by hybrid counterparts designed for use with multiple energy sources. Very recently, the concept of a truly multisource energy harvester built from only a single piece of material as the energy conversion component has been proposed. This review gives a broad overview of energy harvesting research from the perspective of materials and device configurations. It covers single-source devices including solar, thermal, kinetic and other types of energy harvesters, hybrid energy harvesting configurations for both single and multiple energy sources, and single-material multisource energy harvesters. It also includes the energy conversion principles of photovoltaic, electromagnetic, piezoelectric, triboelectric, electrostatic, electrostrictive, thermoelectric, pyroelectric, magnetostrictive, and dielectric devices. This is one of the most comprehensive reviews conducted to date, focusing on the entire energy harvesting research scene and providing a guide to seeking deeper and more specific research references and resources from every corner of the scientific community. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-07-01
To effectively evaluate the cumulative impact of releases from multiple sources of contamination, a structured approach has been adopted for Oak Ridge Reservation (ORR) based on studies of the groundwater and surface water separate from studies of the sources. Based on the realization of the complexity of the hydrogeologic regime of the ORR, together with the fact that there are numerous sources contributing to groundwater contamination within a geographical area, it was agreed that more timely investigations, at perhaps less cost, could be achieved by separating the sources of contamination from the groundwater and surface water for investigation and remediation. The result will be more immediate attention [Records of Decision (RODs) for interim measures or removal actions] for the source Operable Units (OUs) while longer-term remediation investigations continue for the hydrogeologic regimes, which are labeled as integrator OUs. This remedial investigation work plan contains summaries of geographical, historical, operational, geological, and hydrological information specific to the unit. Taking advantage of the historical database and ongoing monitoring activities and applying the observational approach to focus data gathering activities will allow the feasibility study to evaluate all probable or likely alternatives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-09-01
To effectively evaluate the cumulative impact of releases from multiple sources of contamination, a structured approach has been adopted for Oak Ridge Reservation (ORR) based on studies of the groundwater and surface water separate from studies of the sources. Based on the realization of the complexity of the hydrogeologic regime of the ORR, together with the fact that there are numerous sources contributing to groundwater contamination within a geographical area, it was agreed that more timely investigations, at perhaps less cost, could be achieved by separating the sources of contamination from the groundwater and surface water for investigation and remediation. The result will be more immediate attention [Records of Decision (RODs) for interim measures or removal actions] for the source Operable Units (OUs) while longer-term remediation investigations continue for the hydrogeologic regimes, which are labeled as integrator OUs. This Remedial Investigation work plan contains summaries of geographical, historical, operational, geological, and hydrological information specific to the unit. Taking advantage of the historical database and ongoing monitoring activities and applying the observational approach to focus data gathering activities will allow the Feasibility Study to evaluate all probable or likely alternatives.
Naturally occurring 32Si and low-background silicon dark matter detectors
Orrell, John L.; Arnquist, Isaac J.; Bliss, Mary; ...
2018-02-10
Here, the naturally occurring radioisotope 32Si represents a potentially limiting background in future dark matter direct-detection experiments. We investigate sources of 32Si and the vectors by which it comes to reside in silicon crystals used for fabrication of radiation detectors. We infer that the 32Si concentration in commercial single-crystal silicon is likely variable, dependent upon the specific geologic and hydrologic history of the source (or sources) of silicon “ore” and the details of the silicon-refinement process. The silicon production industry is large, highly segmented by refining step, and multifaceted in terms of final product type, from which we conclude that production of 32Si-mitigated crystals requires both targeted silicon material selection and a dedicated refinement-through-crystal-production process. We review options for source material selection, including quartz from an underground source and silicon isotopically reduced in 32Si. To quantitatively evaluate the 32Si content in silicon metal and precursor materials, we propose analytic methods employing chemical processing and radiometric measurements. Ultimately, it appears feasible to produce silicon detectors with low levels of 32Si, though significant assay method development is required to validate this claim and thereby enable a quality assurance program during an actual controlled silicon-detector production cycle.
Naturally occurring 32Si and low-background silicon dark matter detectors
NASA Astrophysics Data System (ADS)
Orrell, John L.; Arnquist, Isaac J.; Bliss, Mary; Bunker, Raymond; Finch, Zachary S.
2018-05-01
The naturally occurring radioisotope 32Si represents a potentially limiting background in future dark matter direct-detection experiments. We investigate sources of 32Si and the vectors by which it comes to reside in silicon crystals used for fabrication of radiation detectors. We infer that the 32Si concentration in commercial single-crystal silicon is likely variable, dependent upon the specific geologic and hydrologic history of the source (or sources) of silicon "ore" and the details of the silicon-refinement process. The silicon production industry is large, highly segmented by refining step, and multifaceted in terms of final product type, from which we conclude that production of 32Si-mitigated crystals requires both targeted silicon material selection and a dedicated refinement-through-crystal-production process. We review options for source material selection, including quartz from an underground source and silicon isotopically reduced in 32Si. To quantitatively evaluate the 32Si content in silicon metal and precursor materials, we propose analytic methods employing chemical processing and radiometric measurements. Ultimately, it appears feasible to produce silicon detectors with low levels of 32Si, though significant assay method development is required to validate this claim and thereby enable a quality assurance program during an actual controlled silicon-detector production cycle.
Naturally occurring 32Si and low-background silicon dark matter detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orrell, John L.; Arnquist, Isaac J.; Bliss, Mary
Here, the naturally occurring radioisotope 32Si represents a potentially limiting background in future dark matter direct-detection experiments. We investigate sources of 32Si and the vectors by which it comes to reside in silicon crystals used for fabrication of radiation detectors. We infer that the 32Si concentration in commercial single-crystal silicon is likely variable, dependent upon the specific geologic and hydrologic history of the source (or sources) of silicon “ore” and the details of the silicon-refinement process. The silicon production industry is large, highly segmented by refining step, and multifaceted in terms of final product type, from which we conclude that production of 32Si-mitigated crystals requires both targeted silicon material selection and a dedicated refinement-through-crystal-production process. We review options for source material selection, including quartz from an underground source and silicon isotopically reduced in 32Si. To quantitatively evaluate the 32Si content in silicon metal and precursor materials, we propose analytic methods employing chemical processing and radiometric measurements. Ultimately, it appears feasible to produce silicon detectors with low levels of 32Si, though significant assay method development is required to validate this claim and thereby enable a quality assurance program during an actual controlled silicon-detector production cycle.
Naturally occurring 32 Si and low-background silicon dark matter detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orrell, John L.; Arnquist, Isaac J.; Bliss, Mary
The naturally occurring radioisotope Si-32 represents a potentially limiting background in future dark matter direct-detection experiments. We investigate sources of Si-32 and the vectors by which it comes to reside in silicon crystals used for fabrication of radiation detectors. We infer that the Si-32 concentration in commercial single-crystal silicon is likely variable, dependent upon the specific geologic and hydrologic history of the source (or sources) of silicon “ore” and the details of the silicon-refinement process. The silicon production industry is large, highly segmented by refining step, and multifaceted in terms of final product type, from which we conclude that production of Si-32-mitigated crystals requires both targeted silicon material selection and a dedicated refinement-through-crystal-production process. We review options for source material selection, including quartz from an underground source and silicon isotopically reduced in Si-32. To quantitatively evaluate the Si-32 content in silicon metal and precursor materials, we propose analytic methods employing chemical processing and radiometric measurements. Ultimately, it appears feasible to produce silicon-based detectors with low levels of Si-32, though significant assay method development is required to validate this claim and thereby enable a quality assurance program during an actual controlled silicon-detector production cycle.
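A hedged order-of-magnitude helper relevant to the assay question above: converting an assumed 32Si number concentration in silicon into a specific decay rate via A = λN with λ = ln(2)/T½. The half-life value (~150 y) and the example concentration are assumptions for illustration, not values taken from the papers above.

```python
# Convert an assumed 32Si atom concentration (atoms per kg of silicon) into a
# specific decay rate. Half-life (~150 y) and the example input are assumed.
import math

T_HALF_YEARS = 150.0                 # assumed 32Si half-life
SECONDS_PER_YEAR = 3.156e7

def decays_per_kg_day(atoms_32si_per_kg: float) -> float:
    lam = math.log(2) / (T_HALF_YEARS * SECONDS_PER_YEAR)   # decay constant, 1/s
    return atoms_32si_per_kg * lam * 86400.0                 # decays / (kg * day)

print(decays_per_kg_day(1.0e7))   # ~1.3e2 decays per kg per day for 1e7 atoms/kg
```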
Non-additive dissipation in open quantum networks out of equilibrium
NASA Astrophysics Data System (ADS)
Mitchison, Mark T.; Plenio, Martin B.
2018-03-01
We theoretically study a simple non-equilibrium quantum network whose dynamics can be expressed and exactly solved in terms of a time-local master equation. Specifically, we consider a pair of coupled fermionic modes, each one locally exchanging energy and particles with an independent, macroscopic thermal reservoir. We show that the generator of the asymptotic master equation is not additive, i.e. it cannot be expressed as a sum of contributions describing the action of each reservoir alone. Instead, we identify an additional interference term that generates coherences in the energy eigenbasis, associated with the current of conserved particles flowing in the steady state. Notably, non-additivity arises even for wide-band reservoirs coupled arbitrarily weakly to the system. Our results shed light on the non-trivial interplay between multiple thermal noise sources in modular open quantum systems.
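For readers unfamiliar with the additivity assumption being tested, a schematic statement of it (notation mine, not the authors', with units where the reduced Planck constant is 1): an additive description of the dynamics with two reservoirs L and R would take the form
\[ \dot{\rho} = -i\,[H,\rho] + \mathcal{D}_{L}(\rho) + \mathcal{D}_{R}(\rho), \]
with one dissipator per bath, whereas the result reported here is that the exact asymptotic generator contains an additional interference contribution, say \(\mathcal{D}_{LR}(\rho)\), that cannot be attributed to either reservoir acting alone.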
NASA Astrophysics Data System (ADS)
Berge-Thierry, C.; Hollender, F.; Guyonnet-Benaize, C.; Baumont, D.; Ameri, G.; Bollinger, L.
2017-09-01
Seismic analysis in the context of nuclear safety in France is currently guided by a purely deterministic approach based on Basic Safety Rule (Règle Fondamentale de Sûreté) RFS 2001-01 for seismic hazard assessment, and on the ASN/2/01 Guide that provides design rules for nuclear civil engineering structures. After the 2011 Tohoku earthquake, nuclear operators worldwide were asked to estimate the ability of their facilities to sustain extreme seismic loads. The French licensees then defined the 'hard core seismic levels', which are higher than those considered for design or re-assessment of the safety of a facility. These were initially established on a deterministic basis, and they were finally justified through state-of-the-art probabilistic seismic hazard assessments. The appreciation and propagation of uncertainties when assessing seismic hazard in France have changed considerably over the past 15 years. This evolution provided the motivation for the present article, the objectives of which are threefold: (1) to provide a description of the current practices in France to assess seismic hazard in terms of nuclear safety; (2) to discuss and highlight the sources of uncertainties and their treatment; and (3) to use a specific case study to illustrate how extended source modeling can help to constrain the key assumptions or parameters that impact upon seismic hazard assessment. This article discusses in particular seismic source characterization, strong ground motion prediction, and maximum magnitude constraints, according to the practice of the French Atomic Energy Commission. Owing to improvements in strong motion databases in terms of the number and quality of records, their metadata, and the characterization of uncertainty, several recently published empirical ground motion prediction models are eligible for seismic hazard assessment in France. We show that propagation of epistemic and aleatory uncertainties is feasible in a deterministic approach as well as in a probabilistic one. Assessment of seismic hazard in France in the framework of the safety of nuclear facilities should consider these recent advances. In this sense, opening discussions with all of the stakeholders in France to update the reference documents (i.e., RFS 2001-01; ASN/2/01 Guide) appears appropriate in the short term.
Foraster, Maria; Eze, Ikenna C; Vienneau, Danielle; Brink, Mark; Cajochen, Christian; Caviezel, Seraina; Héritier, Harris; Schaffner, Emmanuel; Schindler, Christian; Wanner, Miriam; Wunderli, Jean-Marc; Röösli, Martin; Probst-Hensch, Nicole
2016-05-01
Noise annoyance (NA) might lead to behavioral patterns not captured by noise levels, which could reduce physical activity (PA) either directly or through impaired sleep and constitute a noise pathway towards cardiometabolic diseases. We investigated the association of long-term transportation NA and its main sources (aircraft, road, and railway) at home with PA levels. We assessed 3842 participants (aged 37-81) who attended the three examinations (SAP 1, 2, and 3 in years 1991, 2001 and 2011, respectively) of the population-based Swiss cohort on Air Pollution and Lung and Heart Diseases in Adults (SAPALDIA). Participants reported general 24-h transportation NA (in all examinations) and source-specific NA at night (only SAP 3) on an ICBEN-type 11-point scale. We assessed moderate, vigorous, and total PA from a short questionnaire (SAP 3). The main outcome was moderate PA (active/inactive: cut-off ≥150 min/week). We used logistic regression including random effects by area, adjusting for age, sex, socioeconomic status, and lifestyles (main model), and evaluated potential effect modifiers. We analyzed associations with PA at SAP 3 a) cross-sectionally: for source-specific and transportation NA in the last year (SAP 3), and b) longitudinally: for 10-y transportation NA (mean of SAP 1+2), adjusting for prior PA (SAP 2) and changes in NA (SAP 3-2). Reported NA (score ≥5) was 16.4%, 7.5%, 3%, and 1.1% for 1-year transportation, road, aircraft, and railway at SAP 3, respectively. NA was greater in the past, reaching 28.5% for 10-y transportation NA (SAP 1+2). The 10-y transportation NA was associated with a 3.2% (95% CI: 6%-0.2%) decrease in moderate PA per 1-NA rating point and was related to road and aircraft NA at night in cross-sectional analyses. The longitudinal association was stronger for women and for participants reporting daytime sleepiness or chronic diseases, and it was not explained by objectively modeled levels of road traffic noise at SAP 3. In conclusion, long-term NA (related to psychological noise appraisal) reduced PA and could represent another noise pathway towards cardiometabolic diseases. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Auditing the multiply-related concepts within the UMLS
Mougin, Fleur; Grabar, Natalia
2014-01-01
Objective This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. Methods We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. Results At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Discussion Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. PMID:24464853
Amino Acid Isotope Incorporation and Enrichment Factors in Pacific Bluefin Tuna, Thunnus orientalis
Bradley, Christina J.; Madigan, Daniel J.; Block, Barbara A.; Popp, Brian N.
2014-01-01
Compound specific isotopic analysis (CSIA) of amino acids has received increasing attention in ecological studies in recent years due to its ability to evaluate trophic positions and elucidate baseline nutrient sources. However, the incorporation rates of individual amino acids into protein and specific trophic discrimination factors (TDFs) are largely unknown, limiting the application of CSIA to trophic studies. We determined nitrogen turnover rates of individual amino acids from a long-term (up to 1054 days) laboratory experiment using captive Pacific bluefin tuna, Thunnus orientalis (PBFT), a large endothermic pelagic fish fed a controlled diet. Small PBFT (white muscle δ15N∼11.5‰) were collected in San Diego, CA and transported to the Tuna Research and Conservation Center (TRCC) where they were fed a controlled diet with high δ15N values relative to PBFT white muscle (diet δ15N∼13.9‰). Half-lives of trophic and source amino acids ranged from 28.6 to 305.4 days and 67.5 to 136.2 days, respectively. The TDF for the weighted mean values of amino acids was 3.0 ‰, ranging from 2.2 to 15.8 ‰ for individual combinations of 6 trophic and 5 source amino acids. Changes in the δ15N values of amino acids across trophic levels are the underlying drivers of the trophic 15N enrichment. Nearly all amino acid δ15N values in this experiment changed exponentially and could be described by a single compartment model. Significant differences in the rate of 15N incorporation were found for source and trophic amino acids both within and between these groups. Varying half-lives of individual amino acids can be applied to migratory organisms as isotopic clocks, determining the length of time an individual has spent in a new environment. These results greatly enhance the ability to interpret compound specific isotope analyses in trophic studies. PMID:24465724
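A minimal sketch of the single-compartment (exponential) incorporation model referred to, in generic notation (symbols mine, not the authors'):
\[ \delta^{15}\mathrm{N}(t) = \delta^{15}\mathrm{N}_{\mathrm{eq}} + \left(\delta^{15}\mathrm{N}_{0} - \delta^{15}\mathrm{N}_{\mathrm{eq}}\right)e^{-kt}, \qquad t_{1/2} = \frac{\ln 2}{k}, \]
where k is the amino-acid-specific turnover rate, so the reported half-lives correspond to ln 2 / k, and the asymptotic value reflects the diet plus the amino-acid-specific trophic discrimination.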
The XSD-Builder Specification Language—Toward a Semantic View of XML Schema Definition
NASA Astrophysics Data System (ADS)
Fong, Joseph; Cheung, San Kuen
In the present database market, the XML database model is a main structure for forthcoming database systems in the Internet environment. As a conceptual schema of an XML database, the XML model has limitations in presenting its data semantics. System analysts have no toolset for modeling and analyzing an XML system. We apply the XML Tree Model (shown in Figure 2) as a conceptual schema of an XML database to model and analyze the structure of an XML database. It is important not only for visualizing, specifying, and documenting structural models, but also for constructing executable systems. The tree model represents the inter-relationships among elements inside different logical schemas such as XML Schema Definition (XSD), DTD, Schematron, XDR, SOX, and DSD (shown in Figure 1; an explanation of the terms in the figure is given in Table 1). The XSD-Builder consists of the XML Tree Model, a source language, a translator, and XSD. The source language, called XSD-Source, is mainly intended to provide a user-friendly environment for writing an XSD. The source language is consequently translated by XSD-Translator. The output of XSD-Translator is an XSD, which is our target and is called the object language.
Viking-Age Sails: Form and Proportion
NASA Astrophysics Data System (ADS)
Bischoff, Vibeke
2017-04-01
Archaeological ship-finds have shed much light on the design and construction of vessels from the Viking Age. However, the exact proportions of their sails remain unknown due to the lack of fully preserved sails, or other definite indicators of their proportions. Key Viking-Age ship-finds from Scandinavia—the Oseberg Ship, the Gokstad Ship and Skuldelev 3—have all revealed traces of rigging. In all three finds, the keelson—with the mast position—is preserved, together with fastenings for the sheets and the tack, indicating the breadth of the sail. The sail area can then be estimated based on practical experience of how large a sail the specific ship can carry, in conjunction with hull form and displacement. This article presents reconstructions of the form and dimensions of rigging and sail based on the archaeological finds, evidence from iconographic and written sources, and ethnographic parallels with traditional Nordic boats. When these sources are analysed, not only do the similarities become apparent, but so too does the relative disparity between the archaeological record and the other sources. Preferential selection in terms of which source is given the greatest merit is therefore required, as it is not possible to afford them all equal value.
Lee, Charlotte A; Sinha, Siddharth; Fitzpatrick, Emer; Dhawan, Anil
2018-06-01
Human hepatocyte transplantation has been actively pursued as an alternative to liver replacement for acute liver failure and liver-based metabolic defects. Current challenges in this field include a limited cell source, reduced cell viability following cryopreservation and poor engraftment of cells into the recipient liver, with a consequently limited life span. As a result, alternative stem cell sources such as pluripotent stem cells, fibroblasts, hepatic progenitor cells, amniotic epithelial cells and mesenchymal stem/stromal cells (MSCs) can be used to generate induced hepatocyte-like cells (HLCs), with each technique exhibiting advantages and disadvantages. HLCs may have comparable function to primary human hepatocytes and could offer patient-specific treatment. However, the long-term functionality of transplanted HLCs and the potential oncogenic risks of using stem cells have yet to be established. The immunomodulatory effects of MSCs are promising, and multiple clinical trials are investigating their effect in cirrhosis and acute liver failure. Here, we review the current status of hepatocyte transplantation, alternative cell sources to primary human hepatocytes and their potential in liver regeneration. We also describe recent clinical trials using hepatocytes derived from stem cells and their role in improving the phenotype of several liver diseases.
The tools of PDT: light sources and devices. Can they help in getting better therapeutic results?
NASA Astrophysics Data System (ADS)
Boucher, Didier
2011-08-01
PDT is a drug and device therapy using photosensitizing drugs activated by laser light for tissue ablation. PDT light sources must deliver wavelengths matching the absorption of the photosensitizer compound without any side thermal effect. Depending on the application, these sources need to be: coupled to relatively small optical fibres so as to bring the light energy, of a specific wavelength, inside the body (gastroenterology, head & neck, urology, pneumology); coupled to a slit lamp adapter to transmit the light to the eye (AMD); or able to provide direct illumination of tissues when large areas must be treated (dermatology). But they also need to be user-friendly, with limited investment and installation costs. To achieve the required effects, several light sources are available and will be used, but practical and economical reasons have limited the number and types of these sources. For PDT oncology applications, besides dermatology, it has also been necessary to develop specific light delivery systems based on optical fibres. These devices allow the treatment of circular lumens such as oesophagus, bile ducts and lungs; of solid volumes such as prostate and pancreas; of surfaces such as in head and neck; and of empty volumes such as bladder, uterus and cervix. Due to the variety of these treatments, a full family of sources has been developed, from the original sophisticated and costly lasers to more recent easy-to-use diode laser systems. The aim of this presentation is to present the current state of the art of available PDT tools, analyze their qualities and weaknesses, and analyze the consequences of a good and/or bad choice, or good and/or bad utilization, on the quality of the therapeutic results and the resulting side effects. It will also evaluate the short- and medium-term developments of new tools and their effect on the development of the therapy, including economical aspects.
The long-term problems of contaminated land: Sources, impacts and countermeasures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baes, C.F. III
1986-11-01
This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed "broken arrows").
Logic-based assessment of the compatibility of UMLS ontology sources
2011-01-01
Background The UMLS Metathesaurus (UMLS-Meta) is currently the most comprehensive effort for integrating independently-developed medical thesauri and ontologies. UMLS-Meta is being used in many applications, including PubMed and ClinicalTrials.gov. The integration of new sources combines automatic techniques, expert assessment, and auditing protocols. The automatic techniques currently in use, however, are mostly based on lexical algorithms and often disregard the semantics of the sources being integrated. Results In this paper, we argue that UMLS-Meta’s current design and auditing methodologies could be significantly enhanced by taking into account the logic-based semantics of the ontology sources. We provide empirical evidence suggesting that UMLS-Meta in its 2009AA version contains a significant number of errors; these errors become immediately apparent if the rich semantics of the ontology sources is taken into account, manifesting themselves as unintended logical consequences that follow from the ontology sources together with the information in UMLS-Meta. We then propose general principles and specific logic-based techniques to effectively detect and repair such errors. Conclusions Our results suggest that the methodologies employed in the design of UMLS-Meta are not only very costly in terms of human effort, but also error-prone. The techniques presented here can be useful for both reducing human effort in the design and maintenance of UMLS-Meta and improving the quality of its contents. PMID:21388571
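A schematic, hypothetical illustration (not taken from the paper) of how such an unintended logical consequence can arise: suppose one source vocabulary asserts
\[ A \sqsubseteq B, \qquad A \sqsubseteq \lnot C, \]
and Metathesaurus synonymy merges B and C into a single concept, i.e. \(B \equiv C\). Together these entail \(A \sqsubseteq B \sqcap \lnot B \sqsubseteq \bot\), so A becomes unsatisfiable even though neither the source ontology nor the integrated mapping is problematic on its own; errors of this kind only surface when the logic-based semantics of the sources is taken into account.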
Avanzini, Maria Antonietta; Bernardo, Maria Ester; Cometa, Angela Maria; Perotti, Cesare; Zaffaroni, Nadia; Novara, Francesca; Visai, Livia; Moretta, Antonia; Del Fante, Claudia; Villa, Raffaella; Ball, Lynne M.; Fibbe, Willem E.; Maccario, Rita; Locatelli, Franco
2009-01-01
Background Mesenchymal stromal cells are employed in various different clinical settings in order to modulate immune response. However, relatively little is known about the mechanisms responsible for their immunomodulatory effects, which could be influenced by both the cell source and culture conditions. Design and Methods We tested the ability of a 5% platelet lysate-supplemented medium to support isolation and ex vivo expansion of mesenchymal stromal cells from full-term umbilical-cord blood. We also investigated the biological/functional properties of umbilical cord blood mesenchymal stromal cells, in comparison with platelet lysate-expanded bone marrow mesenchymal stromal cells. Results The success rate of isolation of mesenchymal stromal cells from umbilical cord blood was in the order of 20%. These cells exhibited typical morphology, immunophenotype and differentiation capacity. Although they have a low clonogenic efficiency, umbilical cord blood mesenchymal stromal cells may possess high proliferative potential. The genetic stability of these cells from umbilical cord blood was demonstrated by a normal molecular karyotype; in addition, these cells do not express hTERT and telomerase activity, do express p16ink4a protein and do not show anchorage-independent cell growth. Concerning alloantigen-specific immune responses, umbilical cord blood mesenchymal stromal cells were able to: (i) suppress T- and NK-lymphocyte proliferation, (ii) decrease cytotoxic activity and (iii) only slightly increase interleukin-10, while decreasing interferon-γ secretion, in mixed lymphocyte culture supernatants. While an indoleamine 2,3-dioxygenase-specific inhibitor did not reverse mesenchymal stromal cell-induced suppressive effects, a prostaglandin E2-specific inhibitor hampered the suppressive effect of both umbilical cord blood- and bone marrow-mesenchymal stromal cells on alloantigen-induced cytotoxic activity. Mesenchymal stromal cells from both sources expressed HLA-G. Conclusions Umbilical cord blood- and bone marrow-mesenchymal stromal cells may differ in terms of clonogenic efficiency, proliferative capacity and immunomodulatory properties; these differences may be relevant for clinical applications. PMID:19773264
Neisser-Svae, A; Bailey, A; Gregori, L; Heger, A; Jordan, S; Behizad, M; Reichl, H; Römisch, J; Svae, T-E
2009-10-01
A new chromatographic step for the selective binding of abnormal prion protein (PrP(Sc)) was developed, and optimization for PrP(Sc) capture was achieved by binding to an affinity ligand attached to synthetic resin particles. This step was implemented in the manufacturing process of the solvent/detergent (S/D)-treated biopharmaceutical quality plasma Octaplas to further improve the safety margin in terms of risk for variant Creutzfeldt-Jakob disease (vCJD) transmission. Intermediates and Octaplas final container material, spiked with hamster brain-derived PrP(Sc)-containing fractions, were used for experiments to establish the feasibility of introducing this novel chromatography step. The binding capacity per millilitre of ligand gel was determined under the selected manufacturing conditions. In addition, the specificity of the ligand gel to bind PrP(Sc) from human sources was investigated. A validated Western blot test was used for the identification and quantification of PrP(Sc). A reduction factor of ≥3.0 log(10) could be demonstrated by Western blotting, utilizing the relevant Octaplas matrix from manufacturing. In this particular cell-free plasma solution, the PrP(Sc) binding capacity of the selected gel was very high (≥6 log(10) ID(50)/ml, equivalent to roughly 10 log(10) ID(50)/column at manufacturing scale). The gel specifically binds PrP(Sc) from both animal (hamster and mouse) and human (sporadic and variant CJD) sources. This new single-use, disposable PrP(Sc)-harvesting gel ensures a very high capacity in terms of removing the pathogenic agent causing vCJD from the new generation OctaplasLG, in the event that prions can be found in plasma from donors incubating the disease and thereby contaminating the raw material plasma used for manufacturing.
Englot, Dario J.; Nagarajan, Srikantan S.; Imber, Brandon S.; Raygor, Kunal P.; Honma, Susanne M.; Mizuiri, Danielle; Mantle, Mary; Knowlton, Robert C.; Kirsch, Heidi E.; Chang, Edward F.
2015-01-01
Objective The efficacy of epilepsy surgery depends critically upon successful localization of the epileptogenic zone. Magnetoencephalography (MEG) enables non-invasive detection of interictal spike activity in epilepsy, which can then be localized in three dimensions using magnetic source imaging (MSI) techniques. However, the clinical value of MEG in the pre-surgical epilepsy evaluation is not fully understood, as studies to date are limited by either a lack of long-term seizure outcomes or small sample size. Methods We performed a retrospective cohort study of focal epilepsy patients who received MEG for interictal spike mapping followed by surgical resection at our institution. Results We studied 132 surgical patients, with mean post-operative follow-up of 3.6 years (minimum 1 year). Dipole source modelling was successful in 103 (78%) patients, while no interictal spikes were seen in others. Among patients with successful dipole modelling, MEG findings were concordant with and specific to: i) the region of resection in 66% of patients, ii) invasive electrocorticography (ECoG) findings in 67% of individuals, and iii) the MRI abnormality in 74% of cases. MEG showed discordant lateralization in ~5% of cases. After surgery, 70% of all patients achieved seizure-freedom (Engel class I outcome). Whereas 85% of patients with concordant and specific MEG findings became seizure-free, this outcome was achieved by only 37% of individuals with MEG findings that were non-specific or discordant with the region of resection (χ2 = 26.4, p < 0.001). MEG reliability was comparable in patients with or without localized scalp EEG, and overall, localizing MEG findings predicted seizure freedom with an odds ratio of 5.11 (2.23–11.8, 95% CI). Significance MEG is a valuable tool for non-invasive interictal spike mapping in epilepsy surgery, including patients with non-localized findings on long-term EEG monitoring, and localization of the epileptogenic zone using MEG is associated with improved seizure outcomes. PMID:25921215
Borowiak, Malgorzata
2010-01-01
Diabetic patients suffer from the loss of insulin-secreting β-cells, or from an improperly working β-cell mass. Due to the increasing prevalence of diabetes across the world, there is a compelling need for a renewable source of cells that could replace pancreatic β-cells. In recent years, several promising approaches to the generation of new β-cells have been developed. These include directed differentiation of pluripotent cells such as embryonic stem (ES) cells or induced pluripotent stem (iPS) cells, or reprogramming of mature tissue cells. High-yield methods to differentiate cell populations into β-cells, definitive endoderm, and pancreatic progenitors have been established using growth factors and small molecules. However, the final step of directed differentiation to generate functional, mature β-cells in sufficient quantities has yet to be achieved in vitro. Besides the needs of transplantation medicine, a renewable source of β-cells would also be important as a platform to study the pathogenesis of diabetes and to seek alternative treatments. Finally, by generating new β-cells, we could learn more details about pancreatic development and β-cell specification. This review gives an overview of pancreas ontogenesis in the perspective of stem cell differentiation, and highlights the critical aspects of small molecules in the generation of a renewable β-cell source. Also, it discusses longer-term challenges and opportunities in moving towards a therapeutic goal for diabetes.
Capture Versus Capture Zones: Clarifying Terminology Related to Sources of Water to Wells.
Barlow, Paul M; Leake, Stanley A; Fienen, Michael N
2018-03-15
The term capture, related to the source of water derived from wells, has been used in two distinct yet related contexts by the hydrologic community. The first is a water-budget context, in which capture refers to decreases in the rates of groundwater outflow and (or) increases in the rates of recharge along head-dependent boundaries of an aquifer in response to pumping. The second is a transport context, in which capture zone refers to the specific flowpaths that define the three-dimensional, volumetric portion of a groundwater flow field that discharges to a well. A closely related issue that has become associated with the source of water to wells is streamflow depletion, which refers to the reduction in streamflow caused by pumping, and is a type of capture. Rates of capture and streamflow depletion are calculated by use of water-budget analyses, most often with groundwater-flow models. Transport models, particularly particle-tracking methods, are used to determine capture zones to wells. In general, however, transport methods are not useful for quantifying actual or potential streamflow depletion or other types of capture along aquifer boundaries. To clarify the sometimes subtle differences among these terms, we describe the processes and relations among capture, capture zones, and streamflow depletion, and provide proposed terminology to distinguish among them. Published 2018. This article is a U.S. Government work and is in the public domain in the USA. Groundwater published by Wiley Periodicals, Inc. on behalf of National Ground Water Association.
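In the water-budget context described here, capture is usually expressed through a pumping-period budget of the general form (generic notation, not the authors'):
\[ Q_{\mathrm{well}} = \Delta R_{\mathrm{in}} + \Delta D_{\mathrm{out}} + \frac{\Delta S}{\Delta t}, \]
where \(\Delta R_{\mathrm{in}}\) is the pumping-induced increase in recharge across head-dependent boundaries, \(\Delta D_{\mathrm{out}}\) is the decrease in groundwater discharge (streamflow depletion being one component), and \(\Delta S/\Delta t\) is the rate of storage depletion; capture is the sum of the first two terms, while the transport-based capture zone is the set of flowpaths that actually terminate at the well.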
Nutritional treatment in inflammatory bowel disease. An update.
Guagnozzi, Danila; González-Castillo, Sonia; Olveira, Antonio; Lucendo, Alfredo J
2012-09-01
Enteral (EN) and parenteral (TPN) nutrition exert variable therapeutic effects on the induction and maintenance of remission in inflammatory bowel disease (IBD). This review aims to provide an updated discussion on the complex relationship between diet and IBD. Medline, Cochrane and Scopus database searches were conducted. Sources cited in the articles obtained were also searched to identify other potential sources of information. Nutritional status is significantly compromised in IBD patients, especially those with Crohn's disease (CD). Apart from restoring malnourishment, dietary components contribute to modulate intestinal immune responses. Nutritional treatment is divided into support therapy and primary therapy to induce and maintain remission through TPN and EN. EN is considered a first-line therapy in children with active CD, whereas it is usually used in adult CD patients when corticosteroid therapy is not possible. TPN has limited effects on IBD. EN formula composition, in terms of carbohydrates, nitrogen source and bioactive molecule supplementation, differentially influences IBD treatment outcomes. Other dietary components, such as poorly absorbed short-chain carbohydrates, polyols, and exogenous microparticles, also participate in the etiopathogenesis of IBD. Finally, new approaches to understanding the complex relationship between IBD and diet are provided by nutrigenomics. Further long-term, well-powered studies are required to accurately assess the usefulness of nutrition in treating IBD. In future research, the potential role of nutrient-gene interaction in drug trials and specific dietary formula compositions should be investigated in order to incorporate new knowledge about the etiopathology of IBD into nutritional intervention.
Biotic Nitrogen Enrichment Regulates Calcium Sources to Forests
NASA Astrophysics Data System (ADS)
Pett-Ridge, J. C.; Perakis, S. S.; Hynicka, J. D.
2015-12-01
Calcium is an essential nutrient in forest ecosystems that is susceptible to leaching loss and depletion. Calcium depletion can affect plant and animal productivity, soil acid buffering capacity, and fluxes of carbon and water. Excess nitrogen supply and associated soil acidification are often implicated in short-term calcium loss from soils, but the long-term role of nitrogen enrichment on calcium sources and resupply is unknown. Here we use strontium isotopes (87Sr/86Sr) as a proxy for calcium to investigate how soil nitrogen enrichment from biological nitrogen fixation interacts with bedrock calcium to regulate both short-term available supplies and the long-term sources of calcium in montane conifer forests. Our study examines 22 sites in western Oregon, spanning a 20-fold range of bedrock calcium on sedimentary and basaltic lithologies. In contrast to previous studies emphasizing abiotic control of weathering as a determinant of long-term ecosystem calcium dynamics and sources (via bedrock fertility, climate, or topographic/tectonic controls), we find instead that biotic nitrogen enrichment of soil can strongly regulate calcium sources and supplies in forest ecosystems. For forests on calcium-rich basaltic bedrock, increasing nitrogen enrichment causes calcium sources to shift from rock-weathering to atmospheric dominance, with minimal influence from other major soil forming factors, despite regionally high rates of tectonic uplift and erosion that can rejuvenate weathering supply of soil minerals. For forests on calcium-poor sedimentary bedrock, we find that atmospheric inputs dominate regardless of the degree of nitrogen enrichment. Short-term measures of soil and ecosystem calcium fertility are decoupled from calcium source sustainability, with fundamental implications for understanding nitrogen impacts, both in natural ecosystems and in the context of global change. Our finding that long-term nitrogen enrichment increases forest reliance on atmospheric calcium helps explain reports of greater ecological calcium limitation in an increasingly nitrogen-rich world.
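As a rough illustration of how the strontium proxy is typically interpreted (a standard two-endmember mixing approximation, not a formula quoted from the abstract), the fraction of calcium derived from atmospheric sources can be estimated as
\[ f_{\mathrm{atm}} \approx \frac{\left({}^{87}\mathrm{Sr}/{}^{86}\mathrm{Sr}\right)_{\mathrm{sample}} - \left({}^{87}\mathrm{Sr}/{}^{86}\mathrm{Sr}\right)_{\mathrm{rock}}}{\left({}^{87}\mathrm{Sr}/{}^{86}\mathrm{Sr}\right)_{\mathrm{atm}} - \left({}^{87}\mathrm{Sr}/{}^{86}\mathrm{Sr}\right)_{\mathrm{rock}}}, \]
with the approximation assuming the two endmembers supply Sr and Ca in broadly similar proportions.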
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basu, Rupa, E-mail: Rupa.Basu@oehha.ca.gov; Harris, Maria; Sie, Lillian
Relationships between prenatal exposure to fine particles (PM2.5) and birth weight have been observed previously. Few studies have investigated specific constituents of PM2.5, which may identify sources and major contributors of risk. We examined the effects of trimester and full gestational prenatal exposures to PM2.5 mass and 23 PM2.5 constituents on birth weight among 646,296 term births in California between 2000 and 2006. We used linear and logistic regression models to assess associations between exposures and birth weight and risk of low birth weight (LBW; <2500 g), respectively. Models were adjusted for individual demographic characteristics, apparent temperature, month and year of birth, region, and socioeconomic indicators. Higher full gestational exposures to PM2.5 mass and several PM2.5 constituents were significantly associated with reductions in term birth weight. The largest reductions in birth weight were associated with exposure to vanadium, sulfur, sulfate, iron, elemental carbon, titanium, manganese, bromine, ammonium, zinc, and copper. Several of these PM2.5 constituents were associated with increased risk of term LBW. Reductions in birth weight were generally larger among younger mothers and varied by race/ethnicity. Exposure to specific constituents of PM2.5, especially traffic-related particles, sulfur constituents, and metals, was associated with decreased birth weight in California. -- Highlights: • Examine full gestational and trimester fine particle and its constituents on term birth weight. • Fine particles and several of its constituents associated with birth weight reductions. • Largest reductions for traffic-related particles, sulfur constituents, and metals. • Greater birth weight reductions for younger mothers, and varied by race/ethnicity.
Point-particle effective field theory I: classical renormalization and the inverse-square potential
NASA Astrophysics Data System (ADS)
Burgess, C. P.; Hayman, Peter; Williams, M.; Zalavári, László
2017-04-01
Singular potentials (the inverse-square potential, for example) arise in many situations and their quantum treatment leads to well-known ambiguities in choosing boundary conditions for the wave-function at the position of the potential's singularity. These ambiguities are usually resolved by developing a self-adjoint extension of the original problem; a non-unique procedure that leaves undetermined which extension should apply in specific physical systems. We take the guesswork out of this picture by using techniques of effective field theory to derive the required boundary conditions at the origin in terms of the effective point-particle action describing the physics of the source. In this picture ambiguities in boundary conditions boil down to the allowed choices for the source action, but casting them in terms of an action provides a physical criterion for their determination. The resulting extension is self-adjoint if the source action is real (and involves no new degrees of freedom), and not otherwise (as can also happen for reasonable systems). We show how this effective-field picture provides a simple framework for understanding well-known renormalization effects that arise in these systems, including how renormalization-group techniques can resum non-perturbative interactions that often arise, particularly for non-relativistic applications. In particular we argue why the low-energy effective theory tends to produce a universal RG flow of this type and describe how this can lead to the phenomenon of reaction catalysis, in which physical quantities (like scattering cross sections) can sometimes be surprisingly large compared to the underlying scales of the source in question. We comment in passing on the possible relevance of these observations to the phenomenon of the catalysis of baryon-number violation by scattering from magnetic monopoles.
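To make the boundary-condition ambiguity concrete (standard textbook material, not the paper's derivation): for an attractive inverse-square potential \(V(r) = -g/r^{2}\), the radial Schrödinger equation near the origin admits two power-law solutions,
\[ \psi(r) \sim r^{s_{\pm}}, \qquad s_{\pm} = -\tfrac{1}{2} \pm \sqrt{\left(\ell + \tfrac{1}{2}\right)^{2} - \frac{2mg}{\hbar^{2}}}, \]
and for sufficiently strong coupling both are admissible near the singularity (the exponents can even become complex), so some additional physical input is needed to fix the boundary condition; in the approach described here that input comes from the effective point-particle action of the source.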
Paraskevopoulou, D; Liakakou, E; Gerasopoulos, E; Mihalopoulos, N
2015-09-15
To identify the sources of aerosols in the Greater Athens Area (GAA), a total of 1510 daily samples of fine (PM2.5) and coarse (PM10-2.5) aerosols were collected at a suburban site (Penteli) during a five-year period (May 2008-April 2013), corresponding to the period before and during the financial crisis. In addition, aerosol sampling was also conducted in parallel at an urban site (Thissio) during specific, short-term campaigns in all seasons. In all these samples mass and chemical composition measurements were performed, the latter only on the fine fraction. Particulate organic matter (POM) and ionic masses (IM) are the main contributors to aerosol mass, each accounting for about 24% of the fine aerosol mass. Within the IM, nss-SO4(2-) is the prevailing species, followed by NO3(-) and NH4(+), and shows a decreasing trend during the 2008-2013 period similar to that observed for the PM masses. The contribution of water to fine aerosol is equally significant (21 ± 2%), while during dust transport the contribution of dust increases from 7 ± 2% to 31 ± 9%. Source apportionment (PCA and PMF) and mass closure exercises identified six sources of fine aerosols: secondary photochemistry, primary combustion, soil, biomass burning, sea salt and traffic. Finally, from winter 2012 to winter 2013 the contribution of POM to the urban aerosol mass increased by almost 30%, reflecting the impact on air quality in Athens of wood combustion (the dominant fuel for domestic heating), which started massively in winter 2013. Copyright © 2015 Elsevier B.V. All rights reserved.
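As an illustration of receptor-model source apportionment of the kind mentioned (PMF), a minimal sketch in Python uses non-negative matrix factorization of a samples-by-species concentration matrix; unlike true PMF it omits measurement-uncertainty weighting, and the data and species names below are synthetic placeholders, not the study's dataset.

import numpy as np
import pandas as pd
from sklearn.decomposition import NMF

# Synthetic stand-in: rows = daily fine-aerosol samples, columns = chemical species (arbitrary units).
rng = np.random.default_rng(0)
X = pd.DataFrame(
    np.abs(rng.normal(1.0, 0.3, size=(365, 8))),
    columns=["OC", "EC", "SO4", "NO3", "NH4", "Na", "Cl", "Fe"],
)

# Six factors, mirroring the six fine-aerosol source categories reported in the abstract.
model = NMF(n_components=6, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X.values)   # factor contributions per sample
F = model.components_               # factor profiles (species signatures)

print(pd.DataFrame(F, columns=X.columns).round(2))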
Landers, Richard N; Brusso, Robert C; Cavanaugh, Katelyn J; Collmus, Andrew B
2016-12-01
The term big data encompasses a wide range of approaches to collecting and analyzing data in ways that were not possible before the era of modern personal computing. One approach to big data of great potential to psychologists is web scraping, which involves the automated collection of information from webpages. Although web scraping can create massive big datasets with tens of thousands of variables, it can also be used to create modestly sized, more manageable datasets with tens of variables but hundreds of thousands of cases, well within the skillset of most psychologists to analyze, in a matter of hours. In this article, we demystify web scraping methods as currently used to examine research questions of interest to psychologists. First, we introduce an approach called theory-driven web scraping in which the choice to use web-based big data must follow substantive theory. Second, we introduce data source theories, a term used to describe the assumptions a researcher must make about a prospective big data source in order to meaningfully scrape data from it. Critically, researchers must derive specific hypotheses to be tested based upon their data source theory, and if these hypotheses are not empirically supported, plans to use that data source should be changed or eliminated. Third, we provide a case study and sample code in Python demonstrating how web scraping can be conducted to collect big data along with links to a web tutorial designed for psychologists. Fourth, we describe a 4-step process to be followed in web scraping projects. Fifth and finally, we discuss legal, practical and ethical concerns faced when conducting web scraping projects. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
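A minimal sketch of the kind of web scraping described (not the authors' tutorial code; the URL and CSS selector are placeholders to be replaced with whatever elements a data source theory identifies as meaningful):

import requests
from bs4 import BeautifulSoup

# Placeholder target page; in a theory-driven project this would be chosen from the data source theory.
URL = "https://example.com/forum/thread/1"

response = requests.get(URL, timeout=30)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

# Extract one variable per case, e.g. the text of each post-like element on the page.
posts = [p.get_text(strip=True) for p in soup.select("div.post-body")]

for i, text in enumerate(posts, start=1):
    print(i, text[:80])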
de Souza, Andrea; Bittker, Joshua; Lahr, David; Brudz, Steve; Chatwin, Simon; Oprea, Tudor I.; Waller, Anna; Yang, Jeremy; Southall, Noel; Guha, Rajarshi; Schurer, Stephan; Vempati, Uma; Southern, Mark R.; Dawson, Eric S.; Clemons, Paul A.; Chung, Thomas D.Y.
2015-01-01
Recent industry-academic partnerships involve collaboration across disciplines, locations, and organizations using publicly funded “open-access” and proprietary commercial data sources. These require effective integration of chemical and biological information from diverse data sources, presenting key informatics, personnel, and organizational challenges. BARD (BioAssay Research Database) was conceived to address these challenges and to serve as a community-wide resource and intuitive web portal for public-sector chemical biology data. Its initial focus is to enable scientists to more effectively use the NIH Roadmap Molecular Libraries Program (MLP) data generated from 3-year pilot and 6-year production phases of the Molecular Libraries Probe Production Centers Network (MLPCN), currently in its final year. BARD evolves the current data standards through structured assay and result annotations that leverage the BioAssay Ontology (BAO) and other industry-standard ontologies, and a core hierarchy of assay definition terms and data standards defined specifically for small-molecule assay data. We have initially focused on migrating the highest-value MLP data into BARD and bringing it up to this new standard. We review the technical and organizational challenges overcome by the inter-disciplinary BARD team, veterans of public and private sector data-integration projects, collaborating to describe (functional specifications), design (technical specifications), and implement this next-generation software solution. PMID:24441647
Guillarme, Davy; Desfontaine, Vincent; Heinisch, Sabine; Veuthey, Jean-Luc
2018-04-15
Mass spectrometry (MS) is considered today one of the most popular detection methods, due to its high selectivity and sensitivity. In particular, this detector has become the gold standard for the analysis of complex mixtures such as biological samples. The first successful SFC-MS hyphenation was reported in the 1980s, and since then, several ionization sources, mass analyzers and interfacing technologies have been combined. Due to the specific physicochemical properties and compressibility of the SFC mobile phase, directing the column effluent into the ionization source is more challenging than in LC. Therefore, some specific interfaces have to be employed in SFC-MS, to i) avoid (or at least limit) analyte precipitation due to CO2 decompression once the SFC mobile phase is no longer under backpressure control, ii) achieve adequate ionization yield, even with a low proportion of MeOH in the mobile phase, and iii) preserve the chromatographic integrity (i.e. maintaining retention, selectivity, and efficiency). The goal of this review is to describe the various SFC-MS interfaces and highlight the most favorable ones in terms of reliability, flexibility, sensitivity and user-friendliness. Copyright © 2018 Elsevier B.V. All rights reserved.
Lithium-Ion Batteries for Aerospace Applications
NASA Technical Reports Server (NTRS)
Surampudi, S.; Halpert, G.; Marsh, R. A.; James, R.
1999-01-01
This presentation reviews: (1) the goals and objectives, (2) the NASA and Air Force requirements, (3) the potential near-term missions, (4) the management approach, (5) the technical approach and (6) the program road map. The objectives of the program include: (1) develop high specific energy and long life lithium ion cells and smart batteries for aerospace and defense applications, (2) establish domestic production sources, and (3) demonstrate technological readiness for various missions. The management approach is to encourage the teaming of universities, R&D organizations, and battery manufacturing companies, to build on existing commercial and government technology, and to develop two sources for manufacturing cells and batteries. The technological approach includes: (1) develop advanced electrode materials and electrolytes to achieve improved low temperature performance and long cycle life, (2) optimize cell design to improve specific energy, cycle life and safety, (3) establish manufacturing processes to ensure predictable performance, (4) develop aerospace lithium ion cells in various AH sizes and voltages, (5) develop electronics for smart battery management, (6) develop a performance database required for various applications, and (7) demonstrate technology readiness for the various missions. Charts which review the requirements for the Li-ion battery development program are presented.
Defining the relationship between individuals’ aggregate and maximum source-specific exposures
The concepts of aggregate and source-specific exposures play an important role in chemical risk management. Aggregate exposure to a chemical refers to combined exposures fr...
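In generic notation (mine, since the abstract is truncated here), the quantities being related are simply
\[ E_{\mathrm{agg}} = \sum_{s=1}^{S} E_{s}, \qquad E_{\max} = \max_{s} E_{s}, \qquad E_{\max} \le E_{\mathrm{agg}} \le S\,E_{\max}, \]
i.e. an individual's aggregate exposure is bounded below by, and is at most S times, the largest source-specific exposure across the S contributing sources.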
Quantitative Determination of Vinpocetine in Dietary Supplements.
French, John M T; King, Matthew D; McDougal, Owen M
2016-05-01
Current United States regulatory policies allow for the addition of pharmacologically active substances in dietary supplements if derived from a botanical source. The inclusion of certain nootropic drugs, such as vinpocetine, in dietary supplements has recently come under scrutiny due to the lack of defined dosage parameters and yet unproven short- and long-term benefits and risks to human health. This study quantified the concentration of vinpocetine in several commercially available dietary supplements and found that a highly variable range of 0.6-5.1 mg/serving was present across the tested products, with most products providing no specification of vinpocetine concentrations.
NASA Technical Reports Server (NTRS)
Traversi, M.; Barbarek, L. A. C.
1979-01-01
A handy reference for JPL minimum requirements and guidelines is presented, as well as information on the use of the fundamental information source represented by the Nationwide Personal Transportation Survey. Data on U.S. demographic statistics and highway speeds are included, along with methodology for normal parameter evaluation, synthesis of daily distance distributions, and projection of car ownership distributions. The synthesis of tentative mission quantification results, of intermediate mission quantification results, and of mission quantification parameters is considered, and 1985 in-place fleet fuel economy data are included.
Water resources management. World Bank policy paper
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1993-12-31
The management framework presented in this study addresses the demand for water in Asia caused by rapid population growth and economic development. It focuses on three key actions to meet the challenge: evaluate how the region manages water resources; identify guidelines for the Bank's water resource programs; and develop country-specific strategies and promote joint programs. Reforms built into the framework seek to modernize institutions that affect water sources. The authors suggest ways to improve planning and long-term management, streamline economic and financial policy, and upgrade 'real-time' management, operation, and maintenance.
The mass-zero spin-two field and gravitational theory.
NASA Technical Reports Server (NTRS)
Coulter, C. A.
1972-01-01
Demonstration that the conventional theory of the mass-zero spin-two field with sources introduces extraneous non-spin-two field components in source regions and fails to be covariant under the full or restricted conformal group. A modified theory is given, expressed in terms of the physical components of the mass-zero spin-two field rather than in terms of 'potentials,' which has no extraneous components inside or outside sources, and which is covariant under the full conformal group. For a proper choice of source term, this modified theory has the correct Newtonian limit and automatically implies that a symmetric second-rank source tensor has zero divergence. It is shown that a generally covariant form of the spin-two theory derived here can possibly be constructed to agree with general relativity in all currently accessible experimental situations.
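The conservation condition referred to is, in standard flat-space notation (not the paper's specific formalism),
\[ \partial_{\mu} T^{\mu\nu} = 0, \]
i.e. the symmetric second-rank source tensor is divergence-free, which is what the modified spin-two theory is said to imply automatically for a proper choice of source term.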
Phenolic-enriched foods: sources and processing for enhanced health benefits.
McDougall, Gordon J
2017-05-01
Polyphenols are ubiquitous secondary products present in many plant foods. Their intake has been associated with health benefits ranging from reduced incidence of CVD, diabetes and cancers to improved neurodegenerative outcomes. Major dietary sources include beverages such as coffee, teas and foods such as chocolate. Fruits are also major sources and berries in particular are a palatable source of a diverse range of polyphenol components. There are a number of ways that polyphenol uptake could be increased and healthier polyphenol-rich foods could be produced with specific compositions to target specific health effects. Firstly, we could exploit the genetic diversity of plants (with a focus on berries) to select varieties that have enhanced levels of specific polyphenols implicated in disease mitigation (e.g. anthocyanins, tannins or flavonols). Working with variation induced by environmental and agronomic factors, modern molecular breeding techniques could exploit natural variation and beneficially alter polyphenol content and composition, although this could be relatively long term. Alternatively, we could employ a synthetic biology approach and design new plants that overexpress certain genes or re-deploy more metabolic effort into specific polyphenols. However, such 'polyphenol-plus' fruit could prove unpalatable as polyphenols contribute to sensorial properties (e.g. astringency of tannins). However, if the aim was to produce a polyphenol as a pharmaceutical then 'lifting' biosynthetic pathways from plants and expressing them in microbial vectors may be a feasible option. Secondly, we could design processing methods to enhance the polyphenolic composition or content of foods. Fermentation of teas, cocoa beans and grapes, or roasting of cocoa and coffee beans has long been used and can massively influence polyphenol composition and potential bioactivity. Simple methods such as milling, heat treatment, pasteurisation or juicing (v. pureeing) can have notable effects on polyphenol profiles and novel extraction methods bring new opportunities. Encapsulation methods can protect specific polyphenols during digestion and increase their delivery in the gastrointestinal tract to target specific health effects. Lastly, we could examine reformulation of products to alter polyphenol content or composition. Enhancing staple apple or citrus juices with berry juices could double polyphenol levels and provide specific polyphenol components. Reformulation of foods with polyphenol-rich fractions recovered from 'wastes' could increase polyphenol intake, alter product acceptability, improve shelf life and prevent food spoilage. Finally, co-formulation of foods can influence bioavailability and potential bioactivity of certain polyphenols. Within the constraints that certain polyphenols can interfere with drug effectiveness through altered metabolism, this provides another avenue to enhance polyphenol intake and potential effectiveness. In conclusion, these approaches could be developed separately or in combination to produce foods with enhanced levels of phenolic components that are effective against specific disease conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Jerden, James
2016-10-01
The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regards to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.
Soppa, Vanessa J; Schins, Roel P F; Hennig, Frauke; Nieuwenhuijsen, Mark J; Hellack, Bryan; Quass, Ulrich; Kaminski, Heinz; Sasse, Birgitta; Shinnawi, Samir; Kuhlbusch, Thomas A J; Hoffmann, Barbara
2017-10-01
Particulate air pollution is linked to adverse cardiovascular effects. The aim of the study was to investigate the effect of short-term exposure to indoor particles on blood pressure (BP). We analyzed the association of particle emissions from indoor sources (candle burning, toasting bread, frying sausages) with BP changes in 54 healthy volunteers in a randomized cross-over controlled exposure study. Particle mass concentration (PMC), size-specific particle number concentration (PNC) and lung-deposited particle surface area concentration (PSC) were measured during the 2-h exposure. Systolic and diastolic blood pressure were measured before, during, and directly, 2, 4 and 24 h after exposure. We performed multiple mixed linear regression analyses of different particle metrics and BP. BP significantly increased with increasing PMC, PSC and PNC resulting from toasting bread. For example, per 10 µg/m3 increase in PM10 and PM2.5, systolic BP increased at all time points, with the largest changes 1 h after exposure initiation of 1.5 mmHg (95%-CI: 1.1; 1.9) and 2.2 mmHg (95%-CI: 1.3; 3.1), respectively. Our study suggests an association of short-term exposure to fine and ultrafine particles emitted from toasting bread with increases in BP. Particles emitted from frying sausages and candle burning did not consistently affect BP. Copyright © 2017. Published by Elsevier Inc.
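A hedged sketch of the kind of mixed linear model described (random intercepts by participant; the data, effect sizes, and variable names below are synthetic placeholders, not the study's dataset), using statsmodels:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic long-format data standing in for repeated BP readings per participant.
rng = np.random.default_rng(1)
n_subj, n_obs = 54, 6
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_obs),
    "time_point": np.tile(np.arange(n_obs), n_subj),
    "pm25": rng.uniform(5, 80, n_subj * n_obs),
})
df["sbp"] = (115 + 0.15 * df["pm25"]
             + rng.normal(0, 5, len(df))
             + np.repeat(rng.normal(0, 4, n_subj), n_obs))

# Random intercept per participant; fixed effects for the particle metric and time point.
model = smf.mixedlm("sbp ~ pm25 + C(time_point)", data=df, groups=df["subject"])
print(model.fit().summary())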
Developing a Domain Ontology: the Case of Water Cycle and Hydrology
NASA Astrophysics Data System (ADS)
Gupta, H.; Pozzi, W.; Piasecki, M.; Imam, B.; Houser, P.; Raskin, R.; Ramachandran, R.; Martinez Baquero, G.
2008-12-01
A semantic web ontology enables semantic data integration and semantic smart searching. Several organizations have attempted to implement smart registration and integration or searching using ontologies. These are the NOESIS (NSF project: LEAD) and HydroSeek (NSF project: CUAHSI HIS) data discovery engines and the NSF project GEON. All three applications use ontologies to discover data from multiple sources and projects. The NASA WaterNet project was established to identify creative, innovative ways to bridge NASA research results to real-world applications, linking decision support needs to available data, observations, and modeling capability. WaterNet (NASA project) utilized the smart query tool Noesis as a testbed to test whether different ontologies (and different catalog searches) could be combined to match resources with user needs. NOESIS contains the upper-level SWEET ontology that accepts plug-in domain ontologies to refine user search queries, reducing the burden of multiple keyword searches. Another smart search interface is HydroSeek, developed for CUAHSI, which uses a multi-layered concept search ontology, tagging variable names from any number of data sources to specific leaf and higher-level concepts on which the search is executed. This approach has proven to be quite successful in mitigating semantic heterogeneity, as the user does not need to know the semantic specifics of each data source system but just uses a set of common keywords to discover the data for a specific temporal and geospatial domain. This presentation will show that tests with Noesis and HydroSeek lead to the conclusion that the construction of a complex and highly heterogeneous water cycle ontology requires multiple ontology modules. To illustrate the complexity and heterogeneity of a water cycle ontology, HydroSeek successfully utilizes WaterOneFlow to integrate data across multiple different data collections, such as USGS NWIS. However, different methodologies are employed by the Earth Science, Hydrological, and Hydraulic Engineering communities, and each community employs models that require different input data. If a sub-domain ontology is created for each of these, describing water balance calculations, then the resulting structure of the semantic network describing these various terms can be rather complex, heterogeneous, and overlapping, and will require "mapping" between equivalent terms in the ontologies, along with the development of an upper-level conceptual or domain ontology to utilize and link to those already in existence.
Reversal Frequency, Core-Mantle Conditions, and the SCOR-field Hypothesis
NASA Astrophysics Data System (ADS)
Hoffman, K. A.
2009-12-01
One of the most intriguing results from paleomagnetic data spanning the past 10^8 yr comes from the work of McFadden et al. (1991) who found that the variation in the rate of polarity reversal is apparently tied to the temporal variation in the harmonic content of the full-polarity field. Their finding indicates that it is the relative importance of the two dynamo families--i.e. the Primary Family (PF), the field antisymmetric about the equator, and the Secondary Family (SF), the field symmetric about the equator--that largely determines reversal frequency. More specifically, McFadden et al. found that as the relative significance of the SF increases, as is observed during the Cenozoic, so too does reversal rate. Such a finding is reminiscent of the seminal work of Allan Cox who some forty years ago proposed that interactions with the non-dipole field may provide the trigger for reversal of the axial dipole (AD) field. Hence, new questions arise: Do the two dynamo family fields interact in this manner, and, if so, how can such an interaction physically occur in the fluid core? Gaussian coefficient terms comprising the PF and SF have degree and order (n + m) that sum to an odd and even number, respectively. The most significant field term in the PF is by far that of the axial dipole (g_1^0). The entire SF, starting with the equatorial dipole terms (g_1^1 and h_1^1) and the axial quadrupole (g_2^0), are constituents of the non-axial dipole (NAD) field. By way of both paleomagnetic transition and geomagnetic data Hoffman and Singer (2008) recently proposed (1) that field sources exist within the shallow core (SCOR-field) associated with fluid motions affected by long-lived core-mantle boundary conditions; (2) that these SCOR-field sources are largely separated from, i.e. in “poor communication” with, deep field convection roll-generated sources; and (3) that the deep sources are largely responsible for the AD field, leaving the SCOR-field to be the primary source for the NAD-field. This SCOR-field would almost exclusively contain the observed SF field, while the AD-field sources deeper within the core would be most responsible for the observed PF field. If so, the McFadden et al. result may be explained as follows: the observed increasing significance of the SF field during the Cenozoic is the result of intensifying interactions between shallow core SCOR-field sources and deep core AD-field sources. This then suggests a progressive enhancement in the variability of physical conditions along the CMB which may indicate an accelerating influx of descended lithospheric plates and/or increasing number of plume roots during the Cenozoic.
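For reference, the standard Gauss-coefficient expansion and the parity rule behind the two dynamo families, written out in textbook notation (this restatement is not taken from the paper itself):

```latex
% Standard Gauss-coefficient expansion of the geomagnetic potential and the
% dynamo-family split referred to above (textbook notation, not from the paper).
\[
V(r,\theta,\phi) = a \sum_{n=1}^{\infty} \sum_{m=0}^{n}
\left(\frac{a}{r}\right)^{n+1}
\left[ g_n^m \cos m\phi + h_n^m \sin m\phi \right] P_n^m(\cos\theta)
\]
\[
\text{Primary Family (antisymmetric about the equator):}\quad n+m \ \text{odd}
\qquad \text{e.g. } g_1^0
\]
\[
\text{Secondary Family (symmetric about the equator):}\quad n+m \ \text{even}
\qquad \text{e.g. } g_1^1,\ h_1^1,\ g_2^0
\]
```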
ERIC Educational Resources Information Center
Mitchell, Karen J.; Raye, Carol L.; McGuire, Joseph T.; Frankel, Hillary; Greene, Erich J.; Johnson, Marcia K.
2008-01-01
A short-term source monitoring procedure with functional magnetic resonance imaging assessed neural activity when participants made judgments about the format of 1 of 4 studied items (picture, word), the encoding task performed (cost, place), or whether an item was old or new. The results support findings from long-term memory studies showing that…
Evaluation of Long-term Performance of Enhanced Anaerobic Source Zone Bioremediation using mass flux
NASA Astrophysics Data System (ADS)
Haluska, A.; Cho, J.; Hatzinger, P.; Annable, M. D.
2017-12-01
Chlorinated ethene DNAPL source zones in groundwater act as potential long-term sources of contamination as they dissolve, yielding concentrations well above MCLs and posing an ongoing public health risk. Enhanced bioremediation has been applied to treat many source zones with significant promise, but the long-term sustainability of this technology has not been thoroughly assessed. This study evaluated the long-term effectiveness of enhanced anaerobic source zone bioremediation at chloroethene-contaminated sites to determine if the treatment prevented contaminant rebound and removed NAPL from the source zone. Long-term performance was evaluated based on achieving MCL-based contaminant mass fluxes in parent compound concentrations during different monitoring periods. Groundwater concentration versus time data were compiled for six sites, and post-remedial contaminant mass flux data were then measured using passive flux meters at wells both within and down-gradient of the source zone. Post-remedial mass flux data were then combined with pre-remedial water quality data to estimate pre-remedial mass flux. This information was used to characterize a DNAPL dissolution source strength function, such as the Power Law Model and the Equilibrium Streamtube Model. The six sites characterized for this study were (1) Former Charleston Air Force Base, Charleston, SC; (2) Dover Air Force Base, Dover, DE; (3) Treasure Island Naval Station, San Francisco, CA; (4) Former Raritan Arsenal, Edison, NJ; (5) Naval Air Station, Jacksonville, FL; and (6) Former Naval Air Station, Alameda, CA. Contaminant mass fluxes decreased for all the sites by the end of the post-treatment monitoring period and rebound was limited within the source zone. Post-remedial source strength function estimates suggest that decreases in contaminant mass flux will continue to occur at these sites, but a mass flux based on MCL levels may never be exceeded. Thus, site clean-up goals should be evaluated as order-of-magnitude reductions. Additionally, sites may require monitoring for a minimum of 5 years in order to sufficiently evaluate remedial performance. The study shows that enhanced anaerobic source zone bioremediation contributed to a modest reduction of source zone contaminant mass discharge and appears to have mitigated rebound of chlorinated ethenes.
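As a reference point, one commonly used form of the DNAPL source-strength (power-law) relation named above, with a simple mass-balance closure; the exponent and flow rate are site-specific quantities not given in the abstract:

```latex
% A commonly used form of the power-law source-strength model, relating the
% flux-averaged source concentration C to the remaining source mass M.
% Gamma is a fitted, site-specific exponent and Q the water flow through the
% source zone (values not given in the abstract).
\[
\frac{C(t)}{C_0} = \left( \frac{M(t)}{M_0} \right)^{\Gamma},
\qquad
\frac{dM}{dt} = -Q\,C(t)
\]
```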
Beisel, Chase L.; Storz, Gisela
2011-01-01
Bacteria selectively consume some carbon sources over others through a regulatory mechanism termed catabolite repression. Here, we show that the base pairing RNA Spot 42 plays a broad role in catabolite repression in Escherichia coli by directly repressing genes involved in central and secondary metabolism, redox balancing, and the consumption of diverse non-preferred carbon sources. Many of the genes repressed by Spot 42 are transcriptionally activated by the global regulator CRP. Since CRP represses Spot 42, these regulators participate in a specific regulatory circuit called a multi-output feedforward loop. We found that this loop can reduce leaky expression of target genes in the presence of glucose and can maintain repression of target genes under changing nutrient conditions. Our results suggest that base pairing RNAs in feedforward loops can help shape the steady-state levels and dynamics of gene expression. PMID:21292161
Aviation Fueling: A Cleaner, Greener Approach
NASA Technical Reports Server (NTRS)
Hendricks, Robert C.; Bushnell, Dennis M.; Shouse, Dale T.
2010-01-01
Projected growth of aviation depends on fueling, where specific needs must be met. Safety is paramount, and along with political, social, environmental, and legacy transport system requirements, alternate aviation fueling becomes an opportunity of enormous proportions. Biofuels sourced from halophytes, algae, cyanobacteria, and weeds using wastelands, waste water, and seawater have the capacity to be drop-in fuel replacements for petroleum fuels. Biojet fuels from such sources solve the aviation CO2 emissions issue and do not compete with food or freshwater needs. They are not detrimental to the social or environmental fabric and use the existing fuels infrastructure. Cost and sustainable supply remain the major impediments to alternate fuels. Halophytes are the near-term solution to biomass/biofuels capacity at reasonable costs; they simply involve more farming, at usual farming costs. Biofuels represent a win-win approach, proffering as they do (at least the ones we are studying) massive capacity, climate neutral-to-some sequestration, and, ultimately, reasonable costs.
A nonequilibrium model for a moderate pressure hydrogen microwave discharge plasma
NASA Technical Reports Server (NTRS)
Scott, Carl D.
1993-01-01
This document describes a simple nonequilibrium energy exchange and chemical reaction model to be used in a computational fluid dynamics calculation for a hydrogen plasma excited by microwaves. The model takes into account the exchange between the electrons and excited states of molecular and atomic hydrogen. Specifically, electron-translation, electron-vibration, translation-vibration, ionization, and dissociation are included. The model assumes three temperatures, translational/rotational, vibrational, and electron, each describing a Boltzmann distribution for its respective energy mode. The energy from the microwave source is coupled to the energy equation via a source term that depends on an effective electric field which must be calculated outside the present model. This electric field must be found by coupling the results of the fluid dynamics and kinetics solution with a solution to Maxwell's equations that includes the effects of the plasma permittivity. The solution to Maxwell's equations is not within the scope of this present paper.
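A hedged sketch of how such a microwave source term is often written in terms of an effective electric field (a textbook expression for absorbed microwave power; the report's exact form may differ):

```latex
% One standard form for the time-averaged microwave power absorbed per unit
% volume by the electrons, expressed through the rms field or an effective
% field E_eff that absorbs the frequency dependence (textbook expression;
% the report's exact source term may differ).
\[
Q_{\mathrm{abs}}
= \frac{n_e e^{2}}{m_e}\,\frac{\nu_m}{\nu_m^{2}+\omega^{2}}\,E_{\mathrm{rms}}^{2}
= \frac{n_e e^{2}}{m_e \nu_m}\,E_{\mathrm{eff}}^{2},
\qquad
E_{\mathrm{eff}}^{2} \equiv \frac{\nu_m^{2}}{\nu_m^{2}+\omega^{2}}\,E_{\mathrm{rms}}^{2}
\]
% n_e: electron number density; nu_m: electron momentum-transfer collision
% frequency; omega: microwave angular frequency.
```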
Software support for SBGN maps: SBGN-ML and LibSBGN.
van Iersel, Martijn P; Villéger, Alice C; Czauderna, Tobias; Boyd, Sarah E; Bergmann, Frank T; Luna, Augustin; Demir, Emek; Sorokin, Anatoly; Dogrusoz, Ugur; Matsuoka, Yukiko; Funahashi, Akira; Aladjem, Mirit I; Mi, Huaiyu; Moodie, Stuart L; Kitano, Hiroaki; Le Novère, Nicolas; Schreiber, Falk
2012-08-01
LibSBGN is a software library for reading, writing and manipulating Systems Biology Graphical Notation (SBGN) maps stored using the recently developed SBGN-ML file format. The library (available in C++ and Java) makes it easy for developers to add SBGN support to their tools, whereas the file format facilitates the exchange of maps between compatible software applications. The library also supports validation of maps, which simplifies the task of ensuring compliance with the detailed SBGN specifications. With this effort we hope to increase the adoption of SBGN in bioinformatics tools, ultimately enabling more researchers to visualize biological knowledge in a precise and unambiguous manner. Milestone 2 was released in December 2011. Source code, example files and binaries are freely available under the terms of either the LGPL v2.1+ or Apache v2.0 open source licenses from http://libsbgn.sourceforge.net. sbgn-libsbgn@lists.sourceforge.net.
Distribution of tsunami interevent times
NASA Astrophysics Data System (ADS)
Geist, Eric L.; Parsons, Tom
2008-01-01
The distribution of tsunami interevent times is analyzed using global and site-specific (Hilo, Hawaii) tsunami catalogs. An empirical probability density distribution is determined by binning the observed interevent times during a period in which the observation rate is approximately constant. The empirical distributions for both catalogs exhibit non-Poissonian behavior in which there is an abundance of short interevent times compared to an exponential distribution. Two types of statistical distributions are used to model this clustering behavior: (1) long-term clustering described by a universal scaling law, and (2) Omori law decay of aftershocks and triggered sources. The empirical and theoretical distributions all imply an increased hazard rate after a tsunami, followed by a gradual decrease with time approaching a constant hazard rate. Examination of tsunami sources suggests that many of the short interevent times are caused by triggered earthquakes, though the triggered events are not necessarily on the same fault.
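For reference, the exponential (Poisson) interevent-time density that serves as the non-clustered benchmark, together with an Omori-type decaying rate of the kind invoked for triggered sources; the parameter values fitted in the study are not reproduced here:

```latex
% Reference forms: the exponential interevent-time density implied by a
% Poisson process (the benchmark against which clustering is judged), and an
% Omori-type decaying rate used to describe aftershock/triggered clustering.
\[
f_{\mathrm{Poisson}}(\tau) = \lambda\, e^{-\lambda \tau},
\qquad
\lambda_{\mathrm{Omori}}(t) = \frac{K}{(t + c)^{p}}
\]
```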
An airport cargo inspection system based on X-ray and thermal neutron analysis (TNA).
Ipe, Nisy E; Akery, A; Ryge, P; Brown, D; Liu, F; Thieu, J; James, B
2005-01-01
A cargo inspection system incorporating a high-resolution X-ray imaging system with a material-specific detection system based on Ancore Corporation's patented thermal neutron analysis (TNA) technology can detect bulk quantities of explosives and drugs concealed in trucks or cargo containers. The TNA process utilises a 252Cf neutron source surrounded by a moderator. The neutron interactions with the inspected object result in strong and unique gamma-ray signals from nitrogen, which is a key ingredient in modern high explosives, and from chlorinated drugs. The TNA computer analyses the gamma-ray signals and automatically determines the presence of explosives or drugs. The radiation source terms and shielding design of the facility are described. For the X-ray generator, the primary beam, leakage radiation, and scattered primary and leakage radiation were considered. For the TNA, the primary neutrons and tunnel scattered neutrons as well as the neutron-capture gamma rays were considered.
Stochastic memory: getting memory out of noise
NASA Astrophysics Data System (ADS)
Stotland, Alexander; di Ventra, Massimiliano
2011-03-01
Memory circuit elements, namely memristors, memcapacitors and meminductors, can store information without the need of a power source. These systems are generally defined in terms of deterministic equations of motion for the state variables that are responsible for memory. However, in real systems noise sources can never be eliminated completely. One would then expect noise to be detrimental for memory. Here, we show that under specific conditions on the noise intensity memory can actually be enhanced. We illustrate this phenomenon using a physical model of a memristor in which the addition of white noise into the state variable equation improves the memory and helps the operation of the system. We discuss under which conditions this effect can be realized experimentally, discuss its implications for memory systems reported in the literature, and also analyze the effects of colored noise. Work supported in part by NSF.
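A schematic of a current-controlled memristive element with white noise added to its state equation, in the spirit of the description above (generic form only; the paper's specific physical model and noise intensity are not reproduced):

```latex
% Generic current-controlled memristive system with white noise of intensity
% Gamma added to the internal state equation (schematic only).
\[
V(t) = R\!\left(x, I\right) I(t),
\qquad
\frac{dx}{dt} = f\!\left(x, I\right) + \sqrt{2\Gamma}\,\xi(t),
\qquad
\langle \xi(t)\,\xi(t') \rangle = \delta(t - t')
\]
```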
An Exercise in Exploring Big Data for Producing Reliable Statistical Information.
Rey-Del-Castillo, Pilar; Cardeñosa, Jesús
2016-06-01
The availability of copious data about many human, social, and economic phenomena is considered an opportunity for the production of official statistics. National statistical organizations and other institutions are more and more involved in new projects for developing what is sometimes seen as a possible change of paradigm in the way statistical figures are produced. Nevertheless, there are hardly any systems in production using Big Data sources. Arguments of confidentiality, data ownership, representativeness, and others make it a difficult task to get results in the short term. Using Call Detail Records from Ivory Coast as an illustration, this article shows some of the issues that must be dealt with when producing statistical indicators from Big Data sources. A proposal of a graphical method to evaluate one specific aspect of the quality of the computed figures is also presented, demonstrating that the visual insight provided improves the results obtained using other traditional procedures.
On the inclusion of mass source terms in a single-relaxation-time lattice Boltzmann method
NASA Astrophysics Data System (ADS)
Aursjø, Olav; Jettestuen, Espen; Vinningland, Jan Ludvig; Hiorth, Aksel
2018-05-01
We present a lattice Boltzmann algorithm for incorporating a mass source in a fluid flow system. The proposed mass source/sink term, included in the lattice Boltzmann equation, maintains the Galilean invariance and the accuracy of the overall method, while introducing a mass source/sink term in the fluid dynamical equations. The method can, for instance, be used to inject or withdraw fluid from any preferred lattice node in a system. This suggests that injection and withdrawal of fluid does not have to be introduced through cumbersome, and sometimes less accurate, boundary conditions. The method also suggests that, through a chosen equation of state relating mass density to pressure, the proposed mass source term will render it possible to set a preferred pressure at any lattice node in a system. We demonstrate how this model handles injection and withdrawal of a fluid, and we show how it can be used to incorporate pressure boundaries. The accuracy of the algorithm is established through a Chapman-Enskog expansion of the model and supported by the numerical simulations.
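A generic single-relaxation-time (BGK) lattice Boltzmann update with an added mass source term, for orientation; the paper's particular construction of the source term, chosen to preserve Galilean invariance and accuracy, is not reproduced here:

```latex
% Generic single-relaxation-time (BGK) lattice Boltzmann equation with a mass
% source term S_i; q is the local mass source/sink rate per unit volume.
\[
f_i(\mathbf{x} + \mathbf{c}_i \Delta t,\, t + \Delta t) - f_i(\mathbf{x}, t)
= -\frac{1}{\tau}\left[ f_i(\mathbf{x}, t) - f_i^{\mathrm{eq}}(\mathbf{x}, t) \right]
+ \Delta t\, S_i,
\qquad
\sum_i S_i = q
\]
```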
Advanced relativistic VLBI model for geodesy
NASA Astrophysics Data System (ADS)
Soffel, Michael; Kopeikin, Sergei; Han, Wen-Biao
2017-07-01
Our present relativistic part of the geodetic VLBI model for Earthbound antennas is a consensus model which is considered a standard for processing high-precision VLBI observations. It was created as a compromise between a variety of relativistic VLBI models proposed by different authors, as documented in the IERS Conventions 2010. The accuracy of the consensus model is in the picosecond range for the group delay, but this is not sufficient for current geodetic purposes. This paper provides a fully documented derivation of a new relativistic model having an accuracy substantially higher than one picosecond, based upon a well-accepted formalism of relativistic celestial mechanics, astrometry and geodesy. Our new model fully confirms the consensus model at the picosecond level and in several respects goes well beyond it. More specifically, terms related to the acceleration of the geocenter are considered and kept in the model, the gravitational time delay due to a massive body (planet, Sun, etc.) with arbitrary mass and spin-multipole moments is derived taking into account the motion of the body, and a new formalism for the time-delay problem of radio sources located at a finite distance from VLBI stations is presented. Thus, the paper presents a substantially elaborated theoretical justification of the consensus model and its significant extension that allows researchers to make concrete estimates of the magnitude of residual terms of this model for any conceivable configuration of the source of light, massive bodies, and VLBI stations. The largest terms in the relativistic time delay which can affect current VLBI observations are from the quadrupole and the angular momentum of the gravitating bodies that are known from the literature. These terms should be included in the new geodetic VLBI model to improve its consistency.
On estimating attenuation from the amplitude of the spectrally whitened ambient seismic field
NASA Astrophysics Data System (ADS)
Weemstra, Cornelis; Westra, Willem; Snieder, Roel; Boschi, Lapo
2014-06-01
Measuring attenuation on the basis of interferometric, receiver-receiver surface waves is a non-trivial task: the amplitude, more than the phase, of ensemble-averaged cross-correlations is strongly affected by non-uniformities in the ambient wavefield. In addition, ambient noise data are typically pre-processed in ways that affect the amplitude itself. Some authors have recently attempted to measure attenuation in receiver-receiver cross-correlations obtained after the usual pre-processing of seismic ambient-noise records, including, most notably, spectral whitening. Spectral whitening replaces the cross-spectrum with a unit amplitude spectrum. It is generally assumed that cross-terms have cancelled each other prior to spectral whitening. Cross-terms are peaks in the cross-correlation due to simultaneously acting noise sources, that is, spurious traveltime delays due to constructive interference of signal coming from different sources. Cancellation of these cross-terms is a requirement for the successful retrieval of interferometric receiver-receiver signal and results from ensemble averaging. In practice, ensemble averaging is replaced by integrating over sufficiently long time or averaging over several cross-correlation windows. Contrary to the general assumption, we show in this study that cross-terms are not required to cancel each other prior to spectral whitening, but may also cancel each other after the whitening procedure. Specifically, we derive an analytic approximation for the amplitude difference associated with the reversed order of cancellation and normalization. Our approximation shows that an amplitude decrease results from the reversed order. This decrease is predominantly non-linear at small receiver-receiver distances: at distances smaller than approximately two wavelengths, whitening prior to ensemble averaging causes a significantly stronger decay of the cross-spectrum.
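The two orders of operation at issue can be written compactly; the notation below (cross-spectrum C_AB between receivers A and B, angle brackets for the ensemble average) is chosen here for illustration:

```latex
% Whitening each realization before ensemble averaging versus averaging first
% and then normalizing; the abstract's point is that cross-terms may cancel
% in either order, but the resulting amplitudes differ.
\[
\hat{C}^{\,\mathrm{whiten\ first}}_{AB}(\omega)
= \left\langle \frac{C_{AB}(\omega)}{\left| C_{AB}(\omega) \right|} \right\rangle
\qquad \text{vs.} \qquad
\hat{C}^{\,\mathrm{average\ first}}_{AB}(\omega)
= \frac{\left\langle C_{AB}(\omega) \right\rangle}{\left| \left\langle C_{AB}(\omega) \right\rangle \right|}
\]
```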
Directional Unfolded Source Term (DUST) for Compton Cameras.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Dean J.; Horne, Steven M.; O'Brien, Sean
2018-03-01
A Directional Unfolded Source Term (DUST) algorithm was developed to enable improved spectral analysis capabilities using data collected by Compton cameras. Achieving this objective required modification of the detector response function in the Gamma Detector Response and Analysis Software (GADRAS). Experimental data that were collected in support of this work include measurements of calibration sources at a range of separation distances and cylindrical depleted uranium castings.
NASA Astrophysics Data System (ADS)
Poupardin, A.; Heinrich, P.; Hébert, H.; Schindelé, F.; Jamelot, A.; Reymond, D.; Sugioka, H.
2018-05-01
This paper evaluates the importance of frequency dispersion in the propagation of recent trans-Pacific tsunamis. Frequency dispersion induces a time delay for the most energetic waves, which increases for long propagation distances and short source dimensions. To calculate this time delay, propagation of tsunamis is simulated and analyzed from spectrograms of time-series at specific gauges in the Pacific Ocean. One- and two-dimensional simulations are performed by solving either shallow water or Boussinesq equations and by considering realistic seismic sources. One-dimensional sensitivity tests are first performed in a constant-depth channel to study the influence of the source width. Two-dimensional tests are then performed in a simulated Pacific Ocean with a 4000-m constant depth and by considering tectonic sources of 2010 and 2015 Chilean earthquakes. For these sources, both the azimuth and the distance play a major role in the frequency dispersion of tsunamis. Finally, simulations are performed considering the real bathymetry of the Pacific Ocean. Multiple reflections, refractions as well as shoaling of waves result in much more complex time series for which the effects of the frequency dispersion are hardly discernible. The main point of this study is to evaluate frequency dispersion in terms of traveltime delays by calculating spectrograms for a time window of 6 hours after the arrival of the first wave. Results of the spectral analysis show that the wave packets recorded by pressure and tide sensors in the Pacific Ocean seem to be better reproduced by the Boussinesq model than the shallow water model and approximately follow the theoretical dispersion relationship linking wave arrival times and frequencies. Additionally, a traveltime delay is determined above which effects of frequency dispersion are considered to be significant in terms of maximum surface elevations.
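For orientation, the linear dispersion relation that underlies the frequency-dependent traveltime delays discussed above, with its non-dispersive shallow-water limit (h is the water depth, 4000 m in the idealized tests):

```latex
% Linear dispersion relation for surface gravity waves and its shallow-water
% (non-dispersive) limit; longer-period components travel faster, so the most
% energetic short-period waves arrive with a frequency-dependent delay.
\[
\omega^{2} = g k \tanh(kh)
\;\xrightarrow[\;kh \ll 1\;]{}\;
\omega^{2} \simeq g h k^{2}
\quad\Rightarrow\quad
c_{\mathrm{shallow}} = \sqrt{gh}
\]
```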
Bioprospecting microbes for single-cell oil production from starchy wastes.
Chaturvedi, Shivani; Kumari, Arti; Nain, Lata; Khare, Sunil K
2018-03-16
Production of lipid from oleaginous yeast using starch as a carbon source is not a common practice; therefore, the purpose of this investigation was to explore the capability of starch-assimilating microbes to produce oil, which was determined in terms of biomass weight, productivity, and lipid yield. Saccharomyces pastorianus, Rhodotorula mucilaginosa, Rhodotorula glutinis, and fungal isolate Ganoderma wiiroense were screened for the key parameters. The optimization was also performed by a one-factor-at-a-time approach. Considering the specific yield of lipid and cell dry weight yield, R. glutinis and R. mucilaginosa showed superiority over other strains. G. wiiroense, a new isolate, would also be a promising strain for starch waste utilization in terms of extracellular and intracellular specific yield of lipids. Extracellular specific yield of lipid was highest in R. glutinis culture (0.025 g g⁻¹ of biomass) followed by R. mucilaginosa (0.022 g g⁻¹ of biomass) and G. wiiroense (0.020 g g⁻¹ of biomass). Intracellular lipid was again highest in R. glutinis (0.048 g g⁻¹ of biomass). The most prominent fatty acid methyl esters among the lipids as detected by GC-MS were saturated lipids, mainly octadecanoic acid, tetradecanoate, and hexadecanoate. Extracellular lipid produced on starch substrate waste would be a cost-effective alternative to the energy-intensive extraction process in the biodiesel industry.
Ii, Satoshi; Adib, Mohd Azrul Hisham Mohd; Watanabe, Yoshiyuki; Wada, Shigeo
2018-01-01
This paper presents a novel data assimilation method for patient-specific blood flow analysis based on feedback control theory called the physically consistent feedback control-based data assimilation (PFC-DA) method. In the PFC-DA method, the signal, which is the residual error term of the velocity when comparing the numerical and reference measurement data, is cast as a source term in a Poisson equation for the scalar potential field that induces flow in a closed system. The pressure values at the inlet and outlet boundaries are recursively calculated by this scalar potential field. Hence, the flow field is physically consistent because it is driven by the calculated inlet and outlet pressures, without any artificial body forces. As compared with existing variational approaches, although this PFC-DA method does not guarantee the optimal solution, only one additional Poisson equation for the scalar potential field is required, providing a remarkable improvement for such a small additional computational cost at every iteration. Through numerical examples for 2D and 3D exact flow fields, with both noise-free and noisy reference data as well as a blood flow analysis on a cerebral aneurysm using actual patient data, the robustness and accuracy of this approach is shown. Moreover, the feasibility of a patient-specific practical blood flow analysis is demonstrated. Copyright © 2017 John Wiley & Sons, Ltd.
Steady and unsteady blade stresses within the SSME ATD/HPOTP inducer
NASA Technical Reports Server (NTRS)
Gross, R. Steven
1994-01-01
There were two main goals of the ATD HPOTP (alternate turbopump development, high-pressure oxygen turbopump) test program. The first was to determine the steady and unsteady inducer blade surface strains produced by hydrodynamic sources as a function of flow capacity (Q/N), suction specific speed (Nss), and Reynolds number (Re). The second was to identify the hydrodynamic source(s) of the unsteady blade strains. The reason the aforementioned goals are expressed in terms of blade strains as opposed to blade hydrodynamic pressures is the interest in the high-cycle life of the inducer blades. This report focuses on the first goal of the test program, which involves the determination of the steady and unsteady strain (stress) values at various points within the inducer blades. Strain gages were selected as the strain measuring devices. Concurrent with the experimental program, an analytical study was undertaken to produce a complete NASTRAN finite-element model of the inducer. Computational fluid dynamics analyses were utilized to provide the estimated steady-state blade surface pressure loading needed as load input to the NASTRAN inducer model.
A source to deliver mesoscopic particles for laser plasma studies
NASA Astrophysics Data System (ADS)
Gopal, R.; Kumar, R.; Anand, M.; Kulkarni, A.; Singh, D. P.; Krishnan, S. R.; Sharma, V.; Krishnamurthy, M.
2017-02-01
Intense ultrashort-laser-produced plasmas are a source of high-brightness, short bursts of X-rays, electrons, and high-energy ions. Laser energy absorption and its disbursement strongly depend on the laser parameters and also on the initial size and shape of the target. The ability to change the shape, size, and material composition of the matter that absorbs light is of paramount importance not only from a fundamental physics point of view but also for potentially developing laser plasma sources tailored for specific applications. The idea of preparing mesoscopic particles of desired size/shape and suspending them in vacuum for laser plasma acceleration is a sparsely explored domain. In the following report we outline the development of a delivery mechanism for microparticles into an effusive jet in vacuum for laser plasma studies. We characterise the device in terms of particle density, particle size distribution, and duration of operation under conditions suitable for laser plasma studies. We also present the first results of X-ray emission from microcrystals of boric acid that extends to 100 keV even under relatively mild intensities of 10^16 W/cm².
Kim, Hakchan; Kim, Jaai; Shin, Seung Gu; Hwang, Seokhwan; Lee, Changsoo
2016-05-01
This study investigated the simultaneous effects of hydraulic retention time (HRT) and pH on the continuous production of VFAs from food waste leachate using response surface analysis. The response surface approximations (R² = 0.895, p < 0.05) revealed that pH has a dominant effect on the specific VFA production (PTVFA) within the explored space (1-4-day HRT, pH 4.5-6.5). The estimated maximum PTVFA was 0.26 g total VFAs/g CODf at 2.14-day HRT and pH 6.44, and the approximation was experimentally validated by running triplicate reactors under the estimated optimum conditions. The mixture of the filtrates recovered from these reactors was tested as a denitrification carbon source and demonstrated superior performance in terms of reaction rate and lag length relative to other chemicals, including acetate and methanol. The overall results provide helpful information for better design and control of continuous fermentation for producing waste-derived VFAs, an alternative carbon source for denitrification. Copyright © 2016 Elsevier Ltd. All rights reserved.
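A minimal sketch of a second-order response-surface fit of PTVFA against HRT and pH, assuming the usual quadratic model form and illustrative variable names; the study's actual design matrix and software are not stated in the abstract:

```python
# Minimal sketch of a second-order response-surface fit of specific VFA
# production (PTVFA) against HRT and pH. The quadratic model form and the
# variable names are assumptions made for illustration.
import numpy as np

def fit_quadratic_surface(hrt, ph, ptvfa):
    # Design matrix: intercept, linear, quadratic, and interaction terms.
    X = np.column_stack([
        np.ones_like(hrt), hrt, ph, hrt**2, ph**2, hrt * ph,
    ])
    coef, *_ = np.linalg.lstsq(X, ptvfa, rcond=None)
    return coef  # b0, b1, b2, b11, b22, b12

def predict(coef, hrt, ph):
    b0, b1, b2, b11, b22, b12 = coef
    return b0 + b1*hrt + b2*ph + b11*hrt**2 + b22*ph**2 + b12*hrt*ph

# The stationary point of the fitted surface (the candidate optimum, cf. the
# reported 2.14-day HRT and pH 6.44) is found by setting the gradient to zero.
```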
Assessment of indexing trends with specific and general terms for herbal medicine.
Bartol, Tomaz
2012-12-01
Concepts for medicinal plants are represented by a variety of associated general terms with specific indexing patterns in databases, which may not consistently reflect growth of records. The objectives of this study are to assess the development in databases by identifying general terms that describe herbal medicine with optimal retrieval recall and to identify possible special trends in co-occurrence of specific and general concepts. Different search strategies are tested in CAB Abstracts, MEDLINE, and Web of Science. Specific terms (Origanum and Salvia) are employed. Relevant general terms (e.g. 'Plants, Medicinal', Phytotherapy, Herbal drugs) are identified, along with indexing trends and co-occurrences. Growth trends in specific (narrower) terms are similar among databases. General terms, however, exhibit dissimilar trends, sometimes almost opposing one another. Co-occurrence of specific and general terms is changing over time. General terms may not denote definite development of trends as the use of terms differs amongst databases, making it difficult to correctly assess possible numbers of relevant records. Perceived increase can, sometimes, be attributed to an increased occurrence of a more general term alongside the specific one. Thesaurus-controlled databases may yield more hits, because of 'up-posted' (broader) terms. Use of broader terms is helpful as it enhances retrieval of relevant documents. © 2012 The authors. Health Information and Libraries Journal © 2012 Health Libraries Group.
Brown, J; Hosseini, A; Karcher, M; Kauker, F; Dowdall, M; Schnur, R; Strand, P
2016-04-15
The transport of nuclear or radioactive materials and the presence of nuclear powered vessels pose risks to the Northern Seas in terms of potential impacts to man and the environment as well as socio-economic impacts. Management of incidents involving actual or potential releases to the marine environment is potentially difficult due to the complexity of the environment into which the release may occur and difficulties in quantifying risk to both man and the environment. In order to address this, a state-of-the-art oceanographic model was used to characterize the underlying variability for a specific radionuclide release scenario. The resultant probabilistic data were used as inputs to transfer and dose models, providing an indication of potential impacts for man and the environment. This characterization was then employed to facilitate a rapid means of quantifying risk to man and the environment that included and addressed this variability. The radionuclide-specific risk indices derived can be applied by simply multiplying the reported values by the magnitude of the source term and thereafter summing over all radionuclides to provide an indication of total risk. Copyright © 2016. Published by Elsevier Ltd.
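Written out, the aggregation rule described above takes the simple form below, where RI_i denotes the reported radionuclide-specific risk index per unit release and Q_i the source-term magnitude for radionuclide i (symbols chosen here for illustration):

```latex
% Aggregation of radionuclide-specific risk indices into a total risk
% indication, as described in the abstract.
\[
\mathrm{Risk}_{\mathrm{total}} \;=\; \sum_{i} RI_i \, Q_i
\]
```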
Online self-help forums on cannabis: A content assessment.
Greiner, Christian; Chatton, Anne; Khazaal, Yasser
2017-10-01
To investigate online self-help forums used by cannabis users searching for help on the Internet, we analyzed the content of 717 postings by 328 users in three online forums in terms of fields of interest and self-help mechanisms. Only English-language forums that were free of charge and without registration were investigated. The main self-help mechanisms were disclosure and symptoms, with relatively few posts concerning legal issues and social perceptions. The forums differed significantly in all fields of interest and self-help mechanisms except for social network and financial and vocational issues. Highly involved users more commonly posted on topics related to diagnosis, etiology/research, and provision of information and less commonly on those related to gratitude. Correlation analysis showed a moderate negative correlation between emotional support and illness-related aspects and between emotional support and exchange of information. Cannabis forums share similarities with other mental health forums. Posts differ according to user involvement and the specific orientation of the forum. The Internet offers a viable source of self-help and social support for cannabis users, which has potential clinical implications in terms of referring clients to specific forums. Copyright © 2017 Elsevier B.V. All rights reserved.
Planck 2015 results: XXII. A map of the thermal Sunyaev-Zeldovich effect
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aghanim, N.; Arnaud, M.; Ashdown, M.
In this article, we have constructed all-sky Compton parameter maps (y-maps) of the thermal Sunyaev-Zeldovich (tSZ) effect by applying specifically tailored component separation algorithms to the 30 to 857 GHz frequency channel maps from the Planck satellite. These reconstructed y-maps are delivered as part of the Planck 2015 release. The y-maps are characterized in terms of noise properties and residual foreground contamination, mainly thermal dust emission at large angular scales, and cosmic infrared background and extragalactic point sources at small angular scales. Specific masks are defined to minimize foreground residuals and systematics. Using these masks, we compute the y-map angular power spectrum and higher order statistics. From these we conclude that the y-map is dominated by tSZ signal in the multipole range, 20 …
ALARA implementation throughout project life cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haynes, M.J.
1995-03-01
A strength of radiation protection programs generally has been endorsement and application of the ALARA principle. In Ontario Hydro, which currently operates 20 commercial size nuclear units, great strides have been made in the last three decades in reducing occupational radiation exposure per unit of electricity generated. This paper will discuss specific applications of elements of the overall ALARA program which have most contributed to dose reduction as the nuclear program has expanded. This includes such things as management commitment, ALARA application in the design phase and major rehabilitation work, the benefits of the self protection concept, a specific example of elimination (or reduction) of the source term and the importance of dose targets. Finally, it is concluded that the major opportunities for further improvements may lie in the area of information management.
Investigation of aluminium ohmic contacts to n-type GaN grown by molecular beam epitaxy
NASA Astrophysics Data System (ADS)
Kribes, Y.; Harrison, I.; Tuck, B.; Kim, K. S.; Cheng, T. S.; Foxon, C. T.
1997-11-01
Using epi-layers of different doping concentrations, we have investigated aluminium contacts on n-type gallium nitride grown by plasma-source molecular beam epitaxy. To achieve repeatable and reliable results it was found that the semiconductor needed to be etched in aqua regia before the deposition of the contact metallization. Scanning electron micrographs show a deterioration of the semiconductor surface on etching. The specific contact resistivity of the etched samples was, however, superior. Annealing the contacts produced the lowest specific contact resistance (the annealing temperature and resistance value appear only as inline images in the source record). The long-term aging of these contacts was also investigated. The contacts and the sheet resistance were both found to deteriorate over a three-month period.
Redox signaling in the cardiomyocyte: From physiology to failure.
Santos, Celio X C; Raza, Sadaf; Shah, Ajay M
2016-05-01
The specific effect of oxygen and reactive oxygen species (ROS) in mediating post-translational modification of protein targets has emerged as a key mechanism regulating signaling components, a process termed redox signaling. ROS act in the post-translational modification of multiple target proteins including receptors, kinases, phosphatases, ion channels and transcription factors. Both O2 and ROS are major sources of electrons in redox reactions in aerobic organisms. Because the heart has the highest O2 consumption among body organs, it is not surprising that redox signaling is central to heart function and pathophysiology. In this article, we review some of the main cardiac redox signaling pathways and their roles in the cardiomyocyte and in heart failure, with particular focus on the specific molecular targets of ROS in the heart. Copyright © 2016 Elsevier Ltd. All rights reserved.
Erratum to: Surface-wave Green's tensors in the near field
Haney, Matthew M.; Nakahara, Hisashi
2016-01-01
Haney and Nakahara (2014) derived expressions for surface‐wave Green’s tensors that included near‐field behavior. Building on the result for a force source, Haney and Nakahara (2014) further derived expressions for a general point moment tensor source using the exact Green’s tensors. However, it has come to our attention that, although the Green’s tensors were correct, the resulting expressions for a general point moment tensor source were missing some terms. In this erratum, we provide updated expressions with these missing terms. The inclusion of the missing terms changes the example given in Haney and Nakahara (2014).
Flowsheets and source terms for radioactive waste projections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forsberg, C.W.
1985-03-01
Flowsheets and source terms used to generate radioactive waste projections in the Integrated Data Base (IDB) Program are given. Volumes of each waste type generated per unit product throughput have been determined for the following facilities: uranium mining, UF6 conversion, uranium enrichment, fuel fabrication, boiling-water reactors (BWRs), pressurized-water reactors (PWRs), and fuel reprocessing. Source terms for DOE/defense wastes have been developed. Expected wastes from typical decommissioning operations for each facility type have been determined. All wastes are also characterized by isotopic composition at time of generation and by general chemical composition. 70 references, 21 figures, 53 tables.
Reischer, G H; Haider, J M; Sommer, R; Stadler, H; Keiblinger, K M; Hornek, R; Zerobin, W; Mach, R L; Farnleitner, A H
2008-10-01
The impairment of water quality by faecal pollution is a global public health concern. Microbial source tracking methods help to identify faecal sources but the few recent quantitative microbial source tracking applications disregarded catchment hydrology and pollution dynamics. This quantitative microbial source tracking study, conducted in a large karstic spring catchment potentially influenced by humans and ruminant animals, was based on a tiered sampling approach: a 31-month water quality monitoring (Monitoring) covering seasonal hydrological dynamics and an investigation of flood events (Events) as periods of the strongest pollution. The detection of a ruminant-specific and a human-specific faecal Bacteroidetes marker by quantitative real-time PCR was complemented by standard microbiological and on-line hydrological parameters. Both quantitative microbial source tracking markers were detected in spring water during Monitoring and Events, with preponderance of the ruminant-specific marker. Applying multiparametric analysis of all data allowed linking the ruminant-specific marker to general faecal pollution indicators, especially during Events. Up to 80% of the variation of faecal indicator levels during Events could be explained by ruminant-specific marker levels proving the dominance of ruminant faecal sources in the catchment. Furthermore, soil was ruled out as a source of quantitative microbial source tracking markers. This study demonstrates the applicability of quantitative microbial source tracking methods and highlights the prerequisite of considering hydrological catchment dynamics in source tracking study design.
Robinson, C; Kirkham, J; Percival, R; Shore, R C; Bonass, W A; Brookes, S J; Kusa, L; Nakagaki, H; Kato, K; Nattress, B
1997-01-01
The study of plaque biofilms in the oral cavity is difficult as plaque removal inevitably disrupts biofilm integrity precluding kinetic studies involving the penetration of components and metabolism of substrates in situ. A method is described here in which plaque is formed in vivo under normal (or experimental) conditions using a collection device which can be removed from the mouth after a specified time without physical disturbance to the plaque biofilm, permitting site-specific analysis or exposure of the undisturbed plaque to experimental conditions in vitro. Microbiological analysis revealed plaque flora which was similar to that reported from many natural sources. Analytical data can be related to plaque volume rather than weight. Using this device, plaque fluoride concentrations have been shown to vary with plaque depth and in vitro short-term exposure to radiolabelled components may be carried out, permitting important conclusions to be drawn regarding the site-specific composition and dynamics of dental plaque.
Simulation verification techniques study: Simulation self test hardware design and techniques report
NASA Technical Reports Server (NTRS)
1974-01-01
The final results of the hardware verification task are presented. The basic objectives of the various subtasks are reviewed along with the ground rules under which the overall task was conducted and which impacted the approach taken in deriving techniques for hardware self test. The results of the first subtask and the definition of simulation hardware are presented. The hardware definition is based primarily on a brief review of the simulator configurations anticipated for the shuttle training program. The results of the survey of current self test techniques are presented. The data sources that were considered in the search for current techniques are reviewed, and results of the survey are presented in terms of the specific types of tests that are of interest for training simulator applications. Specifically, these types of tests are readiness tests, fault isolation tests and incipient fault detection techniques. The most applicable techniques were structured into software flows that are then referenced in discussions of techniques for specific subsystems.
A Benchmark Study of Large Contract Supplier Monitoring Within DOD and Private Industry
1994-03-01
The report benchmarks commercial supplier-monitoring practices against those of DCMC. Private-industry initiatives identified include supplier monitoring and recognition, a reduced number of suppliers, global sourcing, long-term supplier relationships, and a refocus on customer quality; these initiatives were then compared to DCMC practices for monitoring large contract suppliers.
Engineering description of the ascent/descent BET product
NASA Technical Reports Server (NTRS)
Seacord, A. W., II
1986-01-01
The Ascent/Descent output product is produced in the OPIP routine from three files which constitute its input. One of these, OPIP.IN, contains mission-specific parameters. Meteorological data, such as atmospheric wind velocities, temperatures, and density, are obtained from the second file, the Corrected Meteorological Data File (METDATA). The third file is the TRJATTDATA file, which contains the time-tagged state vectors that combine trajectory information from the Best Estimate of Trajectory (BET) filter, LBRET5, and Best Estimate of Attitude (BEA) derived from IMU telemetry. Each term in the two output data files (BETDATA and the Navigation Block, or NAVBLK) is defined. The description of the BETDATA file includes an outline of the algorithm used to calculate each term. To facilitate describing the algorithms, a nomenclature is defined. The description of the nomenclature includes a definition of the coordinate systems used. The NAVBLK file contains navigation input parameters. Each term in NAVBLK is defined and its source is listed. The production of NAVBLK requires only two computational algorithms. These two algorithms, which compute the terms DELTA and RSUBO, are described. Finally, the distribution of data in the NAVBLK records is listed.
GONUTS: the Gene Ontology Normal Usage Tracking System
Renfro, Daniel P.; McIntosh, Brenley K.; Venkatraman, Anand; Siegele, Deborah A.; Hu, James C.
2012-01-01
The Gene Ontology Normal Usage Tracking System (GONUTS) is a community-based browser and usage guide for Gene Ontology (GO) terms and a community system for general GO annotation of proteins. GONUTS uses wiki technology to allow registered users to share and edit notes on the use of each term in GO, and to contribute annotations for specific genes of interest. By providing a site for generation of third-party documentation at the granularity of individual terms, GONUTS complements the official documentation of the Gene Ontology Consortium. To provide examples for community users, GONUTS displays the complete GO annotations from seven model organisms: Saccharomyces cerevisiae, Dictyostelium discoideum, Caenorhabditis elegans, Drosophila melanogaster, Danio rerio, Mus musculus and Arabidopsis thaliana. To support community annotation, GONUTS allows automated creation of gene pages for gene products in UniProt. GONUTS will improve the consistency of annotation efforts across genome projects, and should be useful in training new annotators and consumers in the production of GO annotations and the use of GO terms. GONUTS can be accessed at http://gowiki.tamu.edu. The source code for generating the content of GONUTS is available upon request. PMID:22110029
Performance Criteria of Nuclear Space Propulsion Systems
NASA Astrophysics Data System (ADS)
Shepherd, L. R.
Future exploration of the solar system on a major scale will require propulsion systems capable of performance far greater than is achievable with the present generation of rocket engines using chemical propellants. Viable missions going deeper into interstellar space will be even more demanding. Propulsion systems based on nuclear energy sources, fission or (eventually) fusion offer the best prospect for meeting the requirements. The most obvious gain coming from the application of nuclear reactions is the possibility, at least in principle, of obtaining specific impulses a thousandfold greater than can be achieved in chemically energised rockets. However, practical considerations preclude the possibility of exploiting the full potential of nuclear energy sources in any engines conceivable in terms of presently known technology. Achievable propulsive power is a particularly limiting factor, since this determines the acceleration that may be obtained. Conventional chemical rocket engines have specific propulsive powers (power per unit engine mass) in the order of gigawatts per tonne. One cannot envisage the possibility of approaching such a level of performance by orders of magnitude in presently conceivable nuclear propulsive systems. The time taken, under power, to reach a given terminal velocity is proportional to the square of the engine's exhaust velocity and the inverse of its specific power. An assessment of various nuclear propulsion concepts suggests that, even with the most optimistic assumptions, it could take many hundreds of years to attain the velocities necessary to reach the nearest stars. Exploration within a range of the order of a thousand AU, however, would appear to offer viable prospects, even with the low levels of specific power of presently conceivable nuclear engines.
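A back-of-the-envelope sketch of the scaling quoted above (time under power growing with the square of the exhaust velocity and inversely with specific power), under the simplifying assumptions of constant jet power, roughly constant vehicle mass, and a velocity increment comparable to the exhaust velocity:

```latex
% Jet power P, mass flow rate mdot, exhaust velocity v_e, vehicle mass m,
% specific power alpha = P/m. The burn time to reach a velocity increment
% Delta v ~ v_e then scales as v_e^2 / alpha, as stated in the abstract.
\[
P = \tfrac{1}{2}\dot{m}\,v_e^{2},
\qquad
a = \frac{\dot{m}\,v_e}{m} = \frac{2\alpha}{v_e}
\quad\text{with}\quad \alpha \equiv \frac{P}{m},
\qquad
t = \frac{\Delta v}{a} = \frac{\Delta v\, v_e}{2\alpha}
\;\xrightarrow[\;\Delta v \sim v_e\;]{}\;
t \sim \frac{v_e^{2}}{2\alpha}
\]
```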
ERIC Educational Resources Information Center
Littlejohn, Emily
2018-01-01
"Adaptation" originally began as a scientific term, but from 1860 to today it most often refers to an altered version of a text, film, or other literary source. When this term was first analyzed, humanities scholars often measured adaptations against their source texts, frequently privileging "original" texts. However, this…
40 CFR 401.11 - General definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Environmental Protection Agency. (d) The term point source means any discernible, confined and discrete conveyance, including but not limited to any pipe, ditch, channel, tunnel, conduit, well, discrete fissure... which pollutants are or may be discharged. (e) The term new source means any building, structure...
Liu, Jia Coco; Wilson, Ander; Mickley, Loretta J; Dominici, Francesca; Ebisu, Keita; Wang, Yun; Sulprizio, Melissa P; Peng, Roger D; Yue, Xu; Son, Ji-Young; Anderson, G Brooke; Bell, Michelle L
2017-01-01
The health impacts of wildfire smoke, including fine particles (PM2.5), are not well understood and may differ from those of PM2.5 from other sources due to differences in concentrations and chemical composition. First, for the entire Western United States (561 counties) for 2004-2009, we estimated daily PM2.5 concentrations directly attributable to wildfires (wildfire-specific PM2.5), using a global chemical transport model. Second, we defined a smoke wave as ≥2 consecutive days with daily wildfire-specific PM2.5 > 20 μg/m³, with sensitivity analyses considering 23, 28, and 37 μg/m³. Third, we estimated the risk of cardiovascular and respiratory hospital admissions associated with smoke waves for Medicare enrollees. We used a generalized linear mixed model to estimate the relative risk of hospital admissions on smoke wave days compared with matched comparison days without wildfire smoke. We estimated that about 46 million people of all ages were exposed to at least one smoke wave during 2004 to 2009 in the Western United States. Of these, 5 million are Medicare enrollees (≥65 years). We found a 7.2% (95% confidence interval: 0.25%, 15%) increase in risk of respiratory admissions during smoke wave days with high wildfire-specific PM2.5 (>37 μg/m³) compared with matched non-smoke-wave days. We did not observe an association between smoke wave days with wildfire-specific PM2.5 ≤ 37 μg/m³ and respiratory or cardiovascular admissions. Respiratory effects of wildfire-specific PM2.5 may be stronger than those of PM2.5 from other sources. Short-term exposure to wildfire-specific PM2.5 was associated with risk of respiratory diseases in the elderly population in the Western United States during severe smoke days. See video abstract at http://links.lww.com/EDE/B137.
Cassette, Philippe
2016-03-01
In Liquid Scintillation Counting (LSC), the scintillating source is part of the measurement system and its detection efficiency varies with the scintillator used, the vial, the volume, and the chemistry of the sample. The detection efficiency is generally determined using a quenching curve, describing, for a specific radionuclide, the relationship between a quenching index given by the counter and the detection efficiency. A quenched set of LS standard sources is prepared by adding a quenching agent, and the quenching index and detection efficiency are determined for each source. Then a simple formula is fitted to the experimental points to define the quenching curve function. The paper describes a software package specifically devoted to the determination of quenching curves with uncertainties. The experimental measurements are described by their quenching index and detection efficiency with uncertainties on both quantities. Random Gaussian fluctuations of these experimental measurements are sampled and a polynomial or logarithmic function is fitted to each fluctuation by χ² minimization. This Monte Carlo procedure is repeated many times and finally the arithmetic mean and the experimental standard deviation of each parameter are calculated, together with the covariances between these parameters. Using these parameters, the detection efficiency corresponding to an arbitrary quenching index within the measured range can be calculated. The associated uncertainty is calculated with the law of propagation of variances, including the covariance terms. Copyright © 2015 Elsevier Ltd. All rights reserved.
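A minimal Python sketch of the Monte Carlo procedure described above (Gaussian resampling of the measured points, a weighted polynomial fit per realization, then parameter means, standard deviations and covariances, and propagation of variances to an arbitrary quenching index); variable names and the polynomial form are illustrative, not the package's actual interface:

```python
# Monte Carlo quenching-curve sketch: resample the measured (quenching index,
# efficiency) points with their Gaussian uncertainties, fit a polynomial to
# each realization by weighted least squares, then summarize the parameters.
import numpy as np

def quenching_curve_mc(q, eff, sig_q, sig_eff, degree=2, n_trials=10000, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    params = np.empty((n_trials, degree + 1))
    for k in range(n_trials):
        q_k = rng.normal(q, sig_q)        # sampled quenching indices
        e_k = rng.normal(eff, sig_eff)    # sampled efficiencies
        # Weighted (chi-square) polynomial fit of this realization
        params[k] = np.polyfit(q_k, e_k, deg=degree, w=1.0 / sig_eff)
    return (params.mean(axis=0),
            params.std(axis=0, ddof=1),
            np.cov(params, rowvar=False))

def efficiency_at(mean_p, cov_p, q0):
    # Interpolated efficiency and its uncertainty by propagation of variances,
    # including the covariance terms between fitted parameters.
    x = np.array([q0**i for i in range(len(mean_p) - 1, -1, -1)])  # polyfit order: high -> low
    eff0 = float(x @ mean_p)
    var0 = float(x @ cov_p @ x)
    return eff0, np.sqrt(var0)
```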
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 2 2010-01-01 2010-01-01 false Specific licenses for the manufacture or initial transfer... manufacture or initial transfer of calibration or reference sources. (a) An application for a specific license to manufacture or initially transfer calibration or reference sources containing plutonium, for...
Similarities and differences in affective and cognitive origins of food likings and dislikes.
Letarte, A; Dubé, L; Troche, V
1997-04-01
In a field study, 355 French-Canadian adults (Caucasians; 119 men, 236 women; average age of 40) freely stated the food item they liked and disliked the most, the reasons for their attitude, and the context of their last consumption of these two food items. Content analysis revealed that the origins of food likes and dislikes are at the same time very similar and very different. They are similar in terms of the overwhelming influence of affective factors, in particular sensory experience, in the formation of both positive and negative attitudes toward food. Similarity between origins of food likes and dislikes in the same subjects is higher when they are from affective sources than when they are from cognitive sources. Food likes and dislikes are also similar in terms of the high salience of the social dimension in subjects' memories of consumption contexts. Results show that food likes and dislikes also differ in many ways. Subjects can more easily elicit attitude bases and consumption contexts for food likes than they do for food dislikes. Besides taste, their common and most frequent base, results show that dislikes originate from more specific (e.g. texture, smell, appearance) and more intense sensory experiences than likes. Further, physiological consequences that contribute to food likes and dislikes are not the same: likes originate from positive nutritional value whereas dislikes follow from negative physiological responses, in particular nausea. Also, specific factors contribute uniquely to likes and dislikes. Functional aspects (e.g. flexibility, preparation) were the second most important reasons for food likes while having almost no influence on dislikes. In contrast, food symbolism was the third most important reason for food dislikes with almost no effect on food likes.
NASA Astrophysics Data System (ADS)
Jaquet, O.; Lantuéjoul, C.; Goto, J.
2017-10-01
Risk assessments in relation to the siting of potential deep geological repositories for radioactive wastes demand the estimation of long-term tectonic hazards such as volcanicity and rock deformation. Owing to their tectonic situation, such evaluations concern many industrial regions around the world. For sites near volcanically active regions, a prevailing source of uncertainty is related to volcanic hazard. For specific situations, in particular in relation to geological repository siting, the requirements for the assessment of volcanic and tectonic hazards have to be expanded to 1 million years. At such time scales, tectonic changes are likely to influence volcanic hazard and therefore a particular stochastic model needs to be developed for the estimation of volcanic hazard. The concepts and theoretical basis of the proposed model are given and a methodological illustration is provided using data from the Tohoku region of Japan.
Investigation of a family of power conditioners integrated into a utility grid: Category 1
NASA Astrophysics Data System (ADS)
Wood, P.; Putkovich, R. P.
1981-07-01
Technical issues regarding ac and dc interface requirements were studied. A baseline design was selected to be a good example of existing technology which would not need significant development effort for its implementation in residential solar photovoltaic systems. Alternative technologies are evaluated to determine which meet the baseline specification, and their costs and losses are evaluated. Areas in which cost improvements can be obtained are studied, and the three best candidate technologies--the current sourced converter, the HF front end converter, and the programmed wave converter--are compared. It is concluded that the designs investigated will meet, or with slight improvement could meet, short term efficiency goals. Long term efficiency goals could be met if an isolation transformer were not required in the power conditioning equipment. None of the technologies studied can meet cost goals unless further improvements are possible.
NASA Technical Reports Server (NTRS)
Gates, Thomas S.; Johnson, Theodore F.; Whitley, Karen S.
2005-01-01
The objective of this report is to contribute to the independent assessment of the Space Shuttle External Tank Foam Material. This report specifically addresses material modeling, characterization testing, data reduction methods, and data pedigree. A brief description of the External Tank foam materials, locations, and standard failure modes is provided to develop suitable background information. A review of mechanics based analysis methods from the open literature is used to provide an assessment of the state-of-the-art in material modeling of closed cell foams. Further, this report assesses the existing material property database and investigates sources of material property variability. The report presents identified deficiencies in testing methods and procedures, recommendations for additional testing as required, identification of near-term improvements that should be pursued, and long-term capabilities or enhancements that should be developed.
Plant Perception and Short-Term Responses to Phytophagous Insects and Mites.
Santamaria, M Estrella; Arnaiz, Ana; Gonzalez-Melendi, Pablo; Martinez, Manuel; Diaz, Isabel
2018-05-03
Plant-pest relationships involve complex processes encompassing a network of molecules, signals, and regulators for overcoming defenses they develop against each other. Phytophagous arthropods identify plants mainly as a source of food. In turn, plants develop a variety of strategies to avoid damage and survive. The success of plant defenses depends on rapid and specific recognition of the phytophagous threat. Subsequently, plants trigger a cascade of short-term responses that eventually result in the production of a wide range of compounds with defense properties. This review deals with the main features involved in the interaction between plants and phytophagous insects and acari, focusing on early responses from the plant side. A general landscape of the diverse strategies employed by plants within the first hours after pest perception to block the capability of phytophagous insects to develop mechanisms of resistance is presented, with the potential of providing alternatives for pest control.
Surface switching statistics of rotating fluid: Disk-rim gap effects
NASA Astrophysics Data System (ADS)
Tasaka, Yuji; Iima, Makoto
2017-04-01
We examined the influence of internal noise on the irregular switching of the shape of the free surface of fluids in an open cylindrical vessel driven by a bottom disk rotating at constant speed [Suzuki, Iima, and Hayase, Phys. Fluids 18, 101701 (2006), 10.1063/1.2359740]. A slight increase in the disk-rim gap (less than 3% of the disk radius) was established experimentally to cause significant changes in this system, specifically, frequent appearance of the surface descending event connecting a nonaxisymmetric shape in strong mixing flow (turbulent flow) and an axisymmetric shape in laminar flow, as well as a shift in the critical Reynolds numbers that define the characteristic states. The physical mechanism underlying the change is analyzed in terms of flow characteristics in the disk-rim gap, which acts as a noise source, and a mathematical model established from measurements of the surface height fluctuations with a noise term.
Auclair, A.N.D. [Science and Policy Associates, Inc., Washington, D.C. (United States)]; Bedford, J.A. [Science and Policy Associates, Inc., Washington, D.C. (United States)]; Revenga, C. [Science and Policy Associates, Inc., Washington, D.C. (United States)]; Brenkert, A.L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)]
1997-01-01
This database lists annual changes in areal extent (Ha) and gross merchantable wood volume (m3) produced by depletion and accrual processes in boreal and temperate forests in Alaska, Canada, Europe, Former Soviet Union, Non-Soviet temperate Asia, and the contiguous United States for the years 1890 through 1990. Forest depletions (source terms for atmospheric CO2) are identified as forest pests, forest dieback, forest fires, forest harvest, and land-use changes (predominantly the conversion of forest, temperate woodland, and shrubland to cropland). Forest accruals (sink terms for atmospheric CO2) are identified as fire exclusion, fire suppression, and afforestation or crop abandonment. The changes in areal extent and gross merchantable wood volume are calculated separately for each of the following biomes: forest tundra, boreal softwoods, mixed hardwoods, temperate softwoods, temperate hardwoods, and temperate wood- and shrublands.
Du, Yingge; Chambers, Scott A.
2014-10-20
Atom flux sensors based on atomic absorption (AA) spectroscopy are of significant interest in thin film growth as they can provide unobtrusive, element-specific, real-time flux sensing and control. The ultimate sensitivity and performance of the sensors are strongly affected by long-term and short-term baseline drift. Here we demonstrate that an etalon effect resulting from temperature changes in optical viewport housings is a major source of signal instability that has not been previously considered or corrected by existing methods. We show that small temperature variations in the fused silica viewports can introduce intensity modulations of up to 1.5%, which in turn significantly deteriorate AA sensor performance. This undesirable effect can be at least partially eliminated by reducing the size of the beam and tilting the incident light beam off the viewport normal.
Study, optimization, and design of a laser heat engine. [for satellite applications
NASA Technical Reports Server (NTRS)
Taussig, R. T.; Cassady, P. E.; Zumdieck, J. F.
1978-01-01
Laser heat engine concepts, proposed for satellite applications, are analyzed to determine which engine concept best meets the requirements of high efficiency (50 percent or better) and continuous operation in space using near-term technology. The analysis of laser heat engines includes the thermodynamic cycles, engine design, laser power sources, collector/concentrator optics, receiving windows, absorbers, working fluids, electricity generation, and heat rejection. Specific engine concepts, optimized according to thermal efficiency, are rated by their technological availability and scaling to higher powers. A near-term experimental demonstration of the laser heat engine concept appears feasible utilizing an Otto cycle powered by CO2 laser radiation coupled into the engine through a diamond window. Higher cycle temperatures, higher efficiencies, and scalability to larger sizes appear to be achievable from a laser heat engine design based on the Brayton cycle and powered by a CO laser.
NASA Astrophysics Data System (ADS)
Weatherill, Graeme; Burton, Paul W.
2010-09-01
The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach offer a degree of objectivity and reproducibility to the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault-type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak ground accelerations for some sites in these regions reach as high as 500-600 cm s⁻² using European/NGA attenuation models, and 400-500 cm s⁻² using Greek attenuation models.
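A minimal Python sketch of the Monte Carlo PSHA idea described above (simulate catalogs, attenuate to a site, and read off the 10%-in-50-years ground motion) is given below. The Gutenberg-Richter parameters and the ground-motion model coefficients are invented placeholders, not the source or attenuation models used in the study.

```python
# Sketch of Monte Carlo PSHA: simulate many years of seismicity from a truncated
# Gutenberg-Richter source, attenuate to a site with a generic GMPE, and take the
# PGA with ~1/475 annual exceedance probability. All coefficients are assumed.
import numpy as np

rng = np.random.default_rng(0)
n_years, rate, b = 100_000, 5.0, 1.0          # events/yr above Mmin, G-R b-value
m_min, m_max = 4.5, 7.5

def sample_magnitudes(n):
    # Inverse-CDF sampling of a truncated Gutenberg-Richter distribution.
    u = rng.random(n)
    beta = b * np.log(10.0)
    return m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta

def gmpe_pga(m, r_km):
    # Generic ground-motion model: ln PGA[g] = c0 + c1*M - c2*ln(R+10) + sigma*eps
    c0, c1, c2, sigma = -4.0, 1.0, 1.3, 0.6   # illustrative coefficients
    eps = rng.normal(0.0, 1.0, size=m.shape)
    return np.exp(c0 + c1 * m - c2 * np.log(r_km + 10.0) + sigma * eps)

annual_max = np.zeros(n_years)
for y in range(n_years):
    n_ev = rng.poisson(rate)
    if n_ev == 0:
        continue
    mags = sample_magnitudes(n_ev)
    dists = rng.uniform(5.0, 150.0, n_ev)     # epicentral distances to the site
    annual_max[y] = gmpe_pga(mags, dists).max()

# PGA exceeded with 10% probability in 50 years ~ 475-year return period.
pga_475 = np.quantile(annual_max, 1.0 - 1.0 / 475.0)
print(f"PGA(10% in 50 yr) ~ {pga_475:.3f} g")
```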
Mechanism study of tumor-specific immune responses induced by laser immunotherapy
NASA Astrophysics Data System (ADS)
Li, Xiaosong; Zhou, Feifan; Le, Henry; Wolf, Roman F.; Howard, Eric; Nordquist, Robert E.; Hode, Tomas; Liu, Hong; Chen, Wei R.
2011-03-01
Laser immunotherapy (LIT) has shown its efficacy against late-stage, metastatic cancers, both in pre-clinical studies and clinical pilot trials. However, the possible mechanism of LIT is still not fully understood. In our previous studies, we have shown that LIT induces tumor-specific antibodies that strongly bind to the target tumors. Tumor resistance in cured animals demonstrated a long-term immunological effect of LIT. Successful transfer of adoptive immunity using spleen cells from LIT-cured animals indicated a long-term immunological memory of the host system. In clinical trials for the treatment of late-stage melanoma patients and breast cancer patients, similar long-term, systemic effects have also been observed. To further study the immunological mechanism of LIT, immuno-histochemical analysis of patient tumor samples has been performed before and after LIT treatment. Our results showed strong evidence that LIT significantly increases the infiltration of immune cells in the target tumors. Specifically, LIT appeared to drive the infiltrating immune cell populations in the direction of CD4, CD8 and CD68 T-cells. It is possible that activation and enhancement of both humoral and cellular arms of the host immune system are achievable by the treatment of LIT. These special features of LIT have contributed to the success of patient treatment. The underlying mechanism of LIT appears to be an in-situ autologous whole-cell cancer vaccination, using all components of tumors as sources of tumor antigens. Our preliminary mechanistic studies and future in-depth studies will contribute to the understanding and development of LIT as an effective modality for the treatment of late-stage cancer patients who are facing severely limited options.
Moon, Jisook; Schwarz, Sigrid C.; Lee, Hyun‐Seob; Kang, Jun Mo; Lee, Young‐Eun; Kim, Bona; Sung, Mi‐Young; Höglinger, Günter; Wegner, Florian; Kim, Jin Su; Chung, Hyung‐Min; Chang, Sung Woon; Cha, Kwang Yul; Kim, Kwang‐Soo
2016-01-01
Abstract We have developed a good manufacturing practice for long‐term cultivation of fetal human midbrain‐derived neural progenitor cells. The generation of human dopaminergic neurons may serve as a tool of either restorative cell therapies or cellular models, particularly as a reference for phenotyping region‐specific human neural stem cell lines such as human embryonic stem cells and human inducible pluripotent stem cells. We cultivated 3 different midbrain neural progenitor lines at 10, 12, and 14 weeks of gestation for more than a year and characterized them in great detail, as well as in comparison with Lund mesencephalic cells. The whole cultivation process of tissue preparation, cultivation, and cryopreservation was developed using strict serum‐free conditions and standardized operating protocols under clean‐room conditions. Long‐term‐cultivated midbrain‐derived neural progenitor cells retained stemness, midbrain fate specificity, and floorplate markers. The potential to differentiate into authentic A9‐specific dopaminergic neurons was markedly elevated after prolonged expansion, resulting in large quantities of functional dopaminergic neurons without genetic modification. In restorative cell therapeutic approaches, midbrain‐derived neural progenitor cells reversed impaired motor function in rodents, survived well, and did not exhibit tumor formation in immunodeficient nude mice in the short or long term (8 and 30 weeks, respectively). We conclude that midbrain‐derived neural progenitor cells are a promising source for human dopaminergic neurons and suitable for long‐term expansion under good manufacturing practice, thus opening the avenue for restorative clinical applications or robust cellular models such as high‐content or high‐throughput screening. Stem Cells Translational Medicine 2017;6:576–588 PMID:28191758
Cancer mortality and oil production in the Amazon Region of Ecuador, 1990-2005.
Kelsh, Michael A; Morimoto, Libby; Lau, Edmund
2009-02-01
To compare cancer mortality rates in Amazon cantons (counties) with and without long-term oil exploration and extraction activities. Mortality (1990 through 2005) and population census (1990 and 2001) data for cantons in the provinces of the northern Amazon Region (Napo, Orellana, Sucumbios, and Pastaza), as well as the province with the capital city of Quito (Pichincha province), were obtained from the National Statistical Office of Ecuador, Instituto Nacional de Estadistica y Censos (INEC). Age- and sex-adjusted mortality rate ratios (RR) and 95% confidence intervals (CI) were estimated to evaluate total and cause-specific mortality in the study regions. Among Amazon cantons with long-term oil extraction activities, there was no evidence of increased rates of death from all causes (RR = 0.98; 95% CI = 0.95-1.01) or from overall cancer (RR = 0.82; 95% CI = 0.73-0.92), and relative risk estimates were also lower for most individual site-specific cancer deaths. Mortality rates in the Amazon provinces overall were significantly lower than those observed in Pichincha for all causes (RR = 0.82; 95% CI = 0.81-0.83), overall cancer (RR = 0.46; 95% CI = 0.43-0.49), and for all site-specific cancers. In regions with incomplete cancer registration, mortality data are one of the few sources of information for epidemiologic assessments. However, epidemiologic assessments in this region of Ecuador are limited by underreporting, exposure and disease misclassification, and study design limitations. Recognizing these limitations, our analyses of national mortality data of the Amazon Region in Ecuador do not provide evidence for an excess cancer risk in regions of the Amazon with long-term oil production. These findings were not consistent with or supportive of earlier studies in this region that suggested increased cancer risks.
Kronholm, Scott C.; Capel, Paul D.
2015-01-01
Quantifying the relative contributions of different sources of water to a stream hydrograph is important for understanding the hydrology and water quality dynamics of a given watershed. To compare the performance of two methods of hydrograph separation, a graphical program [baseflow index (BFI)] and an end-member mixing analysis that used high-resolution specific conductance measurements (SC-EMMA) were used to estimate daily and average long-term slowflow additions of water to four small, primarily agricultural streams with different dominant sources of water (natural groundwater, overland flow, subsurface drain outflow, and groundwater from irrigation). Because the result of hydrograph separation by SC-EMMA is strongly related to the choice of slowflow and fastflow end-member values, a sensitivity analysis was conducted based on the various approaches reported in the literature to inform the selection of end-members. There were substantial discrepancies among the BFI and SC-EMMA, and neither method produced reasonable results for all four streams. Streams that had a small difference in the SC of slowflow compared with fastflow or did not have a monotonic relationship between streamflow and stream SC posed a challenge to the SC-EMMA method. The utility of the graphical BFI program was limited in the stream that had only gradual changes in streamflow. The results of this comparison suggest that the two methods may be quantifying different sources of water. Even though both methods are easy to apply, they should be applied with consideration of the streamflow and/or SC characteristics of a stream, especially where anthropogenic water sources (irrigation and subsurface drainage) are present.
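The SC end-member mixing calculation referred to above reduces to a simple two-component mass balance. The sketch below illustrates it with hypothetical end-member and stream values; it is neither the BFI program nor the authors' code.

```python
# Two end-member mixing (SC-EMMA) sketch: partition streamflow into slowflow and
# fastflow from specific conductance. End members and the record are invented.
import numpy as np

sc_slow, sc_fast = 650.0, 120.0        # uS/cm end-member values (assumed)
q  = np.array([0.8, 1.1, 4.5, 9.2, 3.0, 1.2])         # streamflow, m3/s
sc = np.array([640., 610., 380., 250., 480., 600.])   # stream SC, uS/cm

# Mass balance: Q*SC = Q_slow*SC_slow + Q_fast*SC_fast  and  Q = Q_slow + Q_fast
f_slow = np.clip((sc - sc_fast) / (sc_slow - sc_fast), 0.0, 1.0)
q_slow = f_slow * q

bfi_like = q_slow.sum() / q.sum()      # fraction of total flow from slowflow
print(f_slow.round(2), f"slowflow fraction of record: {bfi_like:.2f}")
```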
NASA Astrophysics Data System (ADS)
Hanke, Ulrich M.; Schmidt, Michael W. I.; McIntyre, Cameron P.; Reddy, Christopher M.; Wacker, Lukas; Eglinton, Timothy I.
2016-04-01
Pyrogenic carbon (PyC) is a collective term for carbon-rich residues comprised of a continuum of products generated during biomass burning and fossil fuel combustion. PyC is a key component of the global carbon cycle due to its slow intrinsic decomposition rate and its ubiquity in the environment. It can originate from natural or anthropogenic vegetation fires, coal mining, energy production, industry and transport. Subsequently, PyC can be transported over long distances by wind and water and can eventually be buried in sediments. Information about the origin of PyC (biomass burning vs. fossil fuel combustion) deposited in estuarine sediments is scarce. We studied the highly anoxic estuarine sediments of the Pettaquamscutt River (Rhode Island, U.S.) at high temporal resolution over 250 years and found that different combustion proxies reflect local and regional sources of PyC (Hanke et al. in review; Lima et al. 2003). The polycyclic aromatic hydrocarbons (PAH) originate from long-range atmospheric transport, whereas bulk PyC, detected as benzene polycarboxylic acids (BPCA), mainly stems from local catchment run-off. However, to unambiguously apportion PyC sources, we need additional information, such as compound-specific radiocarbon (14C) measurements. We report 14C data for individual BPCA, including error analysis, and for combustion-related PAH. First results indicate that biomass burning is the main source of PyC deposits, with additional minor contributions from fossil fuel combustion. References: Hanke, U.M., T.I. Eglinton, A.L.L. Braun, C. Reddy, D.B. Wiedemeier, M.W.I. Schmidt. Decoupled sedimentary records of combustion: causes and implications. In review. Lima, A.L.; Eglinton, T.I.; Reddy, C.M. High-resolution record of pyrogenic polycyclic aromatic hydrocarbon deposition during the 20th century. ES&T, 2003, 37 (1), 53-61.
Severijnen, Chantal; Abrahamse, Evan; van der Beek, Eline M; Buco, Amra; van de Heijning, Bert J M; van Laere, Katrien; Bouritius, Hetty
2007-10-01
Diabetics are recommended to eat a balanced diet containing normal amounts of carbohydrates, preferably those with a low glycemic index. For solid foods, this can be achieved by choosing whole-grain, fiber-rich products. For (sterilized) liquid products, such as meal replacers, the choices for carbohydrate sources are restricted due to technological limitations. Starches usually have a high glycemic index after sterilization in liquids, whereas low glycemic sugars and sugar replacers can only be used in limited amounts. Using an in vitro digestion assay, we identified a resistant starch (RS) source [modified high amylose starch (mHAS)] that might enable the production of a sterilized liquid product with a low glycemic index. Heating mHAS for 4-5 min in liquid increased the slowly digestible starch (SDS) fraction at the expense of the RS portion. The effect was temperature dependent and reached its maximum above 120 degrees C. Heating at 130 degrees C significantly reduced the RS fraction from 49 to 22%. The product remained stable for at least several months when stored at 4 degrees C. To investigate whether a higher SDS fraction would result in a lower postprandial glycemic response, the sterilized mHAS solution was compared with rapidly digestible maltodextrin. Male Wistar rats received an i.g. bolus of 2.0 g available carbohydrate/kg body weight. Ingestion of heat-treated mHAS resulted in a significant attenuation of the postprandial plasma glucose and insulin responses compared with maltodextrin. mHAS appears to be a starch source which, after sterilization in a liquid product, acquires slow-release properties. The long-term stability of mHAS solutions indicates that this may provide a suitable carbohydrate source for low glycemic index liquid products for inclusion in a diabetes-specific diet.
Tile drainage as karst: Conduit flow and diffuse flow in a tile-drained watershed
Schilling, K.E.; Helmers, M.
2008-01-01
The similarity of tile-drained watersheds to karst drainage basins can be used to improve understanding of watershed-scale nutrient losses from subsurface tile drainage networks. In this study, short-term variations in discharge and chemistry were examined from a tile outlet collecting subsurface tile flow from a 963 ha agricultural watershed. Study objectives were to apply analytical techniques from karst springs to tile discharge to evaluate water sources and estimate the loads of agricultural pollutants discharged from the tile with conduit, intermediate and diffuse flow regimes. A two-member mixing model using nitrate, chloride and specific conductance was used to distinguish rainwater versus groundwater inputs. Results indicated that groundwater comprised 75% of the discharge for a three-day storm period and rainwater was primarily concentrated during the hydrograph peak. A contrasting pattern of solute concentrations and export loads was observed in tile flow. During base flow periods, tile flow consisted of diffuse flow from groundwater sources and contained elevated levels of nitrate, chloride and specific conductance. During storm events, suspended solids and pollutants adhered to soil surfaces (phosphorus, ammonium and organic nitrogen) were concentrated and discharged during the rapid, conduit flow portion of the hydrograph. During a three-day period, conduit flow occurred for 5.6% of the time but accounted for 16.5% of the total flow. Nitrate and chloride were delivered primarily with diffuse flow (more than 70%), whereas 80-94% of total suspended sediment, phosphorus and ammonium were exported with conduit and intermediate flow regimes. Understanding the water sources contributing to tile drainage and the manner by which pollutant discharge occurs from these systems (conduit, intermediate or diffuse flow) may be useful for designing, implementing and evaluating non-point source reduction strategies in tile-drained landscapes.
Emerging Disparities in Dietary Sodium Intake from Snacking in the US Population
Dunford, Elizabeth K.; Poti, Jennifer M.; Popkin, Barry M.
2017-01-01
Background: The US population consumes dietary sodium well in excess of recommended levels. It is unknown how the contribution of snack foods to sodium intake has changed over time, and whether disparities exist within specific subgroups of the US population. Objective: To examine short- and long-term trends in the contribution of snack food sources to dietary sodium intake for US adults and children over a 37-year period from 1977 to 2014. Methods: We used data collected from eight nationally representative surveys of food intake in 50,052 US children aged 2–18 years, and 73,179 adults aged 19+ years between 1977 and 2014. Overall patterns of snack food consumption, trends in sodium intake from snack food sources, and trends in food and beverage sources of sodium from snack foods were examined across race-ethnic, age, gender, body mass index, household education and income groups. Results: In all socio-demographic subgroups there was a significant increase in both per capita sodium intake and the proportion of sodium intake derived from snacks from 1977–1978 to 2011–2014 (p < 0.01). Those with the lowest household education, Non-Hispanic Black race-ethnicity, and the lowest income had the largest increase in sodium intake from snacks. While in 1977–1978 Non-Hispanic Blacks had a lower sodium intake from snacks compared to Non-Hispanic Whites (p < 0.01), in 2011–2014 they had a significantly higher intake. Conclusions: Important disparities are emerging in dietary sodium intake from snack sources in Non-Hispanic Blacks. Our findings have implications for future policy interventions targeting specific US population subgroups. PMID:28629146
Coarse Grid Modeling of Turbine Film Cooling Flows Using Volumetric Source Terms
NASA Technical Reports Server (NTRS)
Heidmann, James D.; Hunter, Scott D.
2001-01-01
The recent trend in numerical modeling of turbine film cooling flows has been toward higher fidelity grids and more complex geometries. This trend has been enabled by the rapid increase in computing power available to researchers. However, the turbine design community requires fast turnaround time in its design computations, rendering these comprehensive simulations ineffective in the design cycle. The present study describes a methodology for implementing a volumetric source term distribution in a coarse grid calculation that can model the small-scale and three-dimensional effects present in turbine film cooling flows. This model could be implemented in turbine design codes or in multistage turbomachinery codes such as APNASA, where the computational grid size may be larger than the film hole size. Detailed computations of a single row of 35 deg round holes on a flat plate have been obtained for blowing ratios of 0.5, 0.8, and 1.0, and density ratios of 1.0 and 2.0 using a multiblock grid system to resolve the flows on both sides of the plate as well as inside the hole itself. These detailed flow fields were spatially averaged to generate a field of volumetric source terms for each conservative flow variable. Solutions were also obtained using three coarse grids having streamwise and spanwise grid spacings of 3d, 1d, and d/3. These coarse grid solutions used the integrated hole exit mass, momentum, energy, and turbulence quantities from the detailed solutions as volumetric source terms. It is shown that a uniform source term addition over a distance from the wall on the order of the hole diameter is able to predict adiabatic film effectiveness better than a near-wall source term model, while strictly enforcing correct values of integrated boundary layer quantities.
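The coarse-grid source-term idea can be sketched compactly: the integrated hole-exit fluxes from a detailed solution are deposited as per-unit-volume sources in the coarse cells lying within roughly one hole diameter of the wall. The grid layout, indices, and flux values below are assumptions for illustration, not the APNASA implementation.

```python
# Hedged sketch: deposit integrated coolant mass, momentum, and energy fluxes at a
# film-cooling hole exit into near-wall coarse-grid cells as volumetric sources.
import numpy as np

nx, ny, nz = 60, 20, 30                # coarse grid cells (assumed layout)
dx = dy = dz = 1.0e-3                  # uniform cell size [m] (assumed)
cell_vol = dx * dy * dz

d_hole = 2.0e-3                        # hole diameter [m]
i_hole, k_hole = 20, 15                # (streamwise, spanwise) cell over the hole

# Integrated hole-exit quantities from a detailed multiblock solution (assumed):
mdot  = 1.0e-4                         # coolant mass flow [kg/s]
mom_x = 2.0e-3                         # streamwise momentum flux [N]
mom_y = 3.0e-3                         # wall-normal momentum flux [N]
edot  = 30.0                           # energy flux [W]

# Source region: one cell in x and z, all wall cells within y <= one hole diameter.
n_wall_cells = int(np.ceil(d_hole / dy))
region_vol = n_wall_cells * cell_vol

src_mass   = np.zeros((nx, ny, nz))
src_momx   = np.zeros_like(src_mass)
src_momy   = np.zeros_like(src_mass)
src_energy = np.zeros_like(src_mass)

sl = (i_hole, slice(0, n_wall_cells), k_hole)
src_mass[sl]   = mdot  / region_vol    # [kg/(s m^3)] added to continuity RHS
src_momx[sl]   = mom_x / region_vol    # [N/m^3]      added to x-momentum RHS
src_momy[sl]   = mom_y / region_vol
src_energy[sl] = edot  / region_vol    # [W/m^3]      added to energy RHS
```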
Possible Dual Earthquake-Landslide Source of the 13 November 2016 Kaikoura, New Zealand Tsunami
NASA Astrophysics Data System (ADS)
Heidarzadeh, Mohammad; Satake, Kenji
2017-10-01
An earthquake (Mw 7.8) with a complicated rupture mechanism occurred on the NE coast of South Island, New Zealand, on 13 November 2016 (UTC) in a complex tectonic setting comprising a transitional strike-slip zone between two subduction zones. The earthquake generated a moderate tsunami with a zero-to-crest amplitude of 257 cm at the near-field tide gauge station of Kaikoura. Spectral analysis of the tsunami observations showed dual peaks at 3.6-5.7 and 5.7-56 min, which we attribute to the potential landslide and earthquake sources of the tsunami, respectively. Tsunami simulations showed that a source model with slip on an offshore plate-interface fault reproduces the near-field tsunami observation in terms of amplitude, but fails in terms of tsunami period. On the other hand, a source model without offshore slip fails to reproduce the first peak, but the later phases are reproduced well in terms of both amplitude and period. It can be inferred that an offshore source must be involved, but that it needs to be smaller in size than the plate-interface slip, which most likely points to a confined submarine landslide source, consistent with the dual-peak tsunami spectrum. We estimated the dimension of the potential submarine landslide at 8-10 km.
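A sketch of the kind of spectral analysis used to separate the two period bands is shown below, using a synthetic sea-level record in place of the Kaikoura tide-gauge data.

```python
# Sketch: estimate the power spectrum of a de-tided tide-gauge record and locate
# its dominant periods. Two superposed oscillations plus noise stand in for data.
from scipy import signal
import numpy as np

dt = 60.0                                        # 1-min sampling [s]
t = np.arange(0, 12 * 3600, dt)                  # 12 hours of record
eta = (0.8 * np.sin(2 * np.pi * t / (4.5 * 60)) +    # ~4.5-min component
       1.2 * np.sin(2 * np.pi * t / (20.0 * 60)) +   # ~20-min component
       0.1 * np.random.default_rng(0).normal(size=t.size))

f, pxx = signal.welch(eta, fs=1.0 / dt, nperseg=256)
peaks, _ = signal.find_peaks(pxx, prominence=pxx.max() * 0.1)
print("dominant periods [min]:", (1.0 / f[peaks]) / 60.0)
```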
Leung, K N; Nash, A A; Sia, D Y; Wildy, P
1984-12-01
A herpes simplex virus (HSV)-specific long-term T-cell clone has been established from the draining lymph node cells of BALB/c mice; the cells required repeated in vitro restimulation with UV-irradiated virus. The established T-cell clone expresses the Thy-1 and Lyt-1+2,3- surface antigens. For optimal proliferation of the cloned cells, both the presence of specific antigen and an exogenous source of T-cell growth factor are required. The proliferative response of the cloned T cells was found to be virus-specific but it did not distinguish between HSV-1 and HSV-2. Adoptive cell transfer of the cloned T cells helped primed B cells to produce anti-herpes antibodies: the response was antigen-specific and cell dose-dependent. The clone failed to produce a significant DTH reaction in vivo, but did produce high levels of macrophage-activating factor. Furthermore, the T-cell clone could protect from HSV infection, as measured by a reduction in local virus growth, and by enhanced survival following the challenge of mice with a lethal dose of virus. The mechanism(s) whereby this clone protects in vivo is discussed.
How Big Was It? Getting at Yield
NASA Astrophysics Data System (ADS)
Pasyanos, M.; Walter, W. R.; Ford, S. R.
2013-12-01
One of the most coveted pieces of information in the wake of a nuclear test is the explosive yield. Determining the yield from remote observations, however, is not a trivial task. For instance, recorded observations of seismic amplitudes, used to estimate the yield, are significantly modified by the intervening media, which vary widely and need to be properly accounted for. Even after correcting for propagation effects such as geometrical spreading, attenuation, and station site terms, getting from the resulting source term to a yield depends on the specifics of the explosion source model, including material properties and depth. Some formulas assume the explosion has a standard depth of burial, and observed amplitudes can vary if the actual test is significantly overburied or underburied. We will consider the complications and challenges of making these determinations using a number of standard, more traditional methods and a more recent method that we have developed using regional waveform envelopes. We will make this comparison for recent declared nuclear tests from the DPRK. We will also compare the methods using older explosions at the Nevada Test Site with announced yields, materials, and depths, so that actual performance can be measured. In all cases, we also strive to quantify realistic uncertainties on the yield estimation.
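As an illustration of the traditional formula-based approach mentioned above, the sketch below inverts a magnitude-yield relation of the form mb = a + b·log10(Y). The coefficients and the depth-of-burial bias term are assumed, illustrative values, not those used by the authors.

```python
# Hedged sketch: invert an assumed magnitude-yield relation mb = a + b*log10(Y[kt]).
# Real estimates depend on emplacement material, depth, and path corrections.
import math

def yield_from_mb(mb, a=4.45, b=0.75, delta_burial=0.0):
    """delta_burial is an assumed magnitude bias for non-standard depth of burial."""
    return 10.0 ** ((mb - delta_burial - a) / b)

for mb in (4.5, 5.0, 5.5):
    print(mb, f"{yield_from_mb(mb):.1f} kt")
```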
Bruppacher, R
1989-01-01
Criteria for epidemiological evidence of effects of elevated dosages of vitamins are basically the same as those for the evidence of effects of other exposures. Given unambiguous classification of both exposure and cases, they comprise the strength, significance, specificity, and consistency of the statistical association, a plausible time relationship, a dose-effect relationship, and consistency with other evidence. Today, the term epidemiological evidence usually refers to field experience, often to "observational," i.e., non-experimental, evidence. Extreme examples of this are the so-called "ecological studies," which are frequently criticized because of their potential for exaggerated interpretations, though they can be very helpful in constructing and supporting hypotheses. For very rare and long-term effects, the description and evaluation of individual cases are often combined with attempts at quantification by relating them to the estimated exposure of the source population. This is subject to numerous sources of error. It is difficult to confirm the existence of rare and late effects, and the collection and interpretation of data on the prevention of such effects often present almost insurmountable methodological challenges. However, with correct interpretation and by keeping the quantitative perspective in mind, epidemiological evidence can be extremely helpful in the assessment of the overall importance, i.e., the public health significance, of such effects.
Site environmental report for Calendar Year 1994 on radiological and nonradiological parameters
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-06-30
Battelle Memorial Institute's nuclear research facilities are currently being maintained in a surveillance and maintenance (S&M) mode with continual decontamination and decommissioning (D&D) activities being conducted under Department of Energy (DOE) Contract W-7405-ENG-92. These activities are referred to under the Contract as the Battelle Columbus Laboratories Decommissioning Project (BCLDP). Operations referenced in this report are performed in support of S&M and D&D activities. Battelle's King Avenue facility is not considered in this report to the extent that the West Jefferson facility is. The source term at the King Avenue site is a small fraction of the source term at the West Jefferson site. Offsite levels of radionuclides that could be attributed to the West Jefferson and King Avenue nuclear operations were indistinguishable from background levels at specific locations where air, water, and direct radiation measurements were performed. Environmental monitoring continued to demonstrate compliance by Battelle with federal, state and local regulations. Routine, nonradiological activities performed include monitoring liquid effluents and monitoring the ground water system for the West Jefferson North site. Samples of various environmental media including air, water, grass, fish, field and garden crops, sediment and soil were collected from the region surrounding the two sites and analyzed.
Ambient Air Pollution and Increases in Blood Pressure: Role ...
Particulate matter (PM) is a complex mixture of extremely small particles and liquid droplets made up of a number of components including elemental carbon, organic chemicals, metals, acids (such as nitrates and sulfates), and soil and dust particles. Epidemiological studies consistently show that exposure to PM in urban areas across the globe is associated with increases in short- and long-term cardiovascular mortality and morbidity, most notably for myocardial infarction, heart failure and ischemic stroke [1]. The range in strength of these associations is likely related to variation in PM sources and composition across space and time, and attests to the need to understand the contribution of specific sources to ultimately inform regulatory, public health and clinical strategies to reduce risk. Commentary: In 2014 a systematic review and meta-analysis published in this journal reported a positive association between short-term exposure to PM2.5 and blood pressure [2]. The paper discussed potential mechanisms including PM-induced activation of pulmonary nociceptive receptors, pulmonary inflammatory responses and release of endothelin-1, and suggested that activation of pulmonary receptors and vagal afferents could lead to shifts in autonomic balance and vasoconstriction. Other effects including oxidative stress and decreased NO availability, as well as systemic inflammation and endothelial dysfunction, have also been widely reported in association with PM compo
Castillo, Rodrigo; Nieto, Raquel; Drumond, Anita; Gimeno, Luis
2014-01-01
The Lagrangian FLEXPART model has been used during the last decade to detect moisture sources that affect the climate in different regions of the world. While most of these studies provided a climatological perspective on the atmospheric branch of the hydrological cycle in terms of precipitation, none assessed the minimum temporal domain for which the climatological approach is valid. The methodology identifies the contribution of humidity to the moisture budget in a region by computing the changes in specific humidity along backward (or forward) trajectories of air masses over a period of ten days beforehand (afterwards), thereby allowing the calculation of monthly, seasonal and annual averages. As an example, the current study calculates the climatological seasonal mean and variance of net precipitation for regions in which precipitation exceeds evaporation (E-P<0) for the North Atlantic moisture source region, using different time periods for winter and summer from 1980 to 2000. The results show that net evaporation (E-P>0) can be discounted when the integration of E-P is done on a monthly or longer time scale, without affecting the general net precipitation patterns. PMID:24893002
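The core of the Lagrangian diagnostic is the change in specific humidity along each trajectory. A minimal sketch, with invented parcel data and an assumed parcel mass, is given below; it is not FLEXPART code.

```python
# Sketch of the Lagrangian moisture budget: for each air parcel, e - p ~ m * dq/dt
# along the trajectory; regional E-P is the sum of parcel contributions over the
# grid cells they occupy. Array shapes and values are invented for illustration.
import numpy as np

dt = 6 * 3600.0                              # 6-h trajectory time step [s]
m = 2.5e12                                   # mass represented by one parcel [kg] (assumed)

# q[p, t]: specific humidity [kg/kg] of parcel p at successive 6-h positions
q = np.array([[7.1e-3, 7.4e-3, 6.9e-3, 6.0e-3],
              [5.0e-3, 5.6e-3, 6.1e-3, 6.3e-3]])

e_minus_p = m * np.diff(q, axis=1) / dt      # [kg/s] per parcel per step
# Positive values mean net evaporation into the parcel, negative mean net
# precipitation; summing contributions of all parcels over a target region and
# averaging over a month or season gives the E-P budget discussed above.
print(e_minus_p)
```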
Ramus, Franck; Marshall, Chloe R.; Rosen, Stuart
2013-01-01
An on-going debate surrounds the relationship between specific language impairment and developmental dyslexia, in particular with respect to their phonological abilities. Are these distinct disorders? To what extent do they overlap? Which cognitive and linguistic profiles correspond to specific language impairment, dyslexia and comorbid cases? At least three different models have been proposed: the severity model, the additional deficit model and the component model. We address this issue by comparing children with specific language impairment only, those with dyslexia-only, those with specific language impairment and dyslexia and those with no impairment, using a broad test battery of language skills. We find that specific language impairment and dyslexia do not always co-occur, and that some children with specific language impairment do not have a phonological deficit. Using factor analysis, we find that language abilities across the four groups of children have at least three independent sources of variance: one for non-phonological language skills and two for distinct sets of phonological abilities (which we term phonological skills versus phonological representations). Furthermore, children with specific language impairment and dyslexia show partly distinct profiles of phonological deficit along these two dimensions. We conclude that a multiple-component model of language abilities best explains the relationship between specific language impairment and dyslexia and the different profiles of impairment that are observed. PMID:23413264
McDonald, Brian C; Goldstein, Allen H; Harley, Robert A
2015-04-21
A fuel-based approach is used to assess long-term trends (1970-2010) in mobile source emissions of black carbon (BC) and organic aerosol (OA, including both primary emissions and secondary formation). The main focus of this analysis is the Los Angeles Basin, where a long record of measurements is available to infer trends in ambient concentrations of BC and organic carbon (OC), with OC used here as a proxy for OA. Mobile source emissions and ambient concentrations have decreased similarly, reflecting the importance of on- and off-road engines as sources of BC and OA in urban areas. In 1970, the on-road sector accounted for ∼90% of total mobile source emissions of BC and OA (primary + secondary). Over time, as on-road engine emissions have been controlled, the relative importance of off-road sources has grown. By 2010, off-road engines were estimated to account for 37 ± 20% and 45 ± 16% of total mobile source contributions to BC and OA, respectively, in the Los Angeles area. This study highlights both the success of efforts to control on-road emission sources, and the importance of considering off-road engine and other VOC source contributions when assessing long-term emission and ambient air quality trends.
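The fuel-based approach itself is a simple product of fuel consumed and a fuel-normalized emission factor; the sketch below illustrates the bookkeeping with placeholder fuel totals and emission factors, not the study's values.

```python
# Minimal fuel-based inventory sketch: annual emissions = fuel consumed x
# fuel-normalized emission factor, summed and apportioned by sector (all assumed).
fuel_consumed = {            # kg of fuel per year, by sector (assumed)
    "onroad_gasoline": 6.0e9,
    "onroad_diesel":   1.5e9,
    "offroad_diesel":  0.8e9,
}
bc_ef = {                    # g of black carbon per kg fuel (assumed)
    "onroad_gasoline": 0.03,
    "onroad_diesel":   0.6,
    "offroad_diesel":  1.0,
}

bc_tons = {k: fuel_consumed[k] * bc_ef[k] / 1e6 for k in fuel_consumed}  # metric tons
total = sum(bc_tons.values())
shares = {k: round(100 * v / total, 1) for k, v in bc_tons.items()}      # % by sector
print(bc_tons, shares)
```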
Mitchell, Karen J; Mather, Mara; Johnson, Marcia K; Raye, Carol L; Greene, Erich J
2006-10-02
We investigated the hypothesis that arousal recruits attention to item information, thereby disrupting working memory processes that help bind items to context. Using functional magnetic resonance imaging, we compared brain activity when participants remembered negative or neutral picture-location conjunctions (source memory) versus pictures only. Behaviorally, negative trials showed disruption of short-term source, but not picture, memory; long-term picture recognition memory was better for negative than for neutral pictures. Activity in areas involved in working memory and feature integration (precentral gyrus and its intersect with superior temporal gyrus) was attenuated on negative compared with neutral source trials relative to picture-only trials. Visual processing areas (middle occipital and lingual gyri) showed greater activity for negative than for neutral trials, especially on picture-only trials.
Patient time and out-of-pocket costs for long-term prostate cancer survivors in Ontario, Canada.
de Oliveira, Claire; Bremner, Karen E; Ni, Andy; Alibhai, Shabbir M H; Laporte, Audrey; Krahn, Murray D
2014-03-01
Time and out-of-pocket (OOP) costs can represent a substantial burden for cancer patients but have not been described for long-term cancer survivors. We estimated these costs, their predictors, and their relationship to financial income, among a cohort of long-term prostate cancer (PC) survivors. A population-based, community-dwelling, geographically diverse sample of long-term (2-13 years) PC survivors in Ontario, Canada, was identified from the Ontario Cancer Registry and contacted through their referring physicians. We obtained data on demographics, health care resource use, and OOP costs through mailed questionnaires and conducted chart reviews to obtain clinical data. We compared mean annual time and OOP costs (2006 Canadian dollars) across clinical and sociodemographic characteristics and examined the association between costs and four groups of predictors (patient, disease, system, symptom) using two-part regression models. Patients' (N = 585) mean age was 73 years; 77 % were retired, and 42 % reported total annual incomes less than $40,000. Overall, mean time costs were $838/year and mean OOP costs were $200/year. Although generally low, total costs represented approximately 10 % of income for lower income patients. No demographic variables were associated with costs. Radical prostatectomy, younger age, poor urinary function, current androgen deprivation therapy, and recent diagnosis were significantly associated with increased likelihood of incurring any costs, but only urinary function significantly affected total amount. Time and OOP costs are modest for most long-term PC survivors but can represent a substantial burden for lower income patients. Even several years after diagnosis, PC-specific treatments and treatment-related dysfunction are associated with increased costs. Time and out-of-pocket costs are generally manageable for long-term PC survivors but can be a significant burden mainly for lower income patients. The effects of PC-specific, treatment-related dysfunctions on quality of life can also represent sources of expense for patients.
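Two-part regression models of the kind mentioned above are commonly built from a logistic model for the probability of incurring any cost and a linear model for the (log) amount among those with positive costs. The sketch below illustrates this structure on simulated data; the predictor names and coefficients are hypothetical, not the study's variables.

```python
# Sketch of a two-part cost model: Part 1 models P(cost > 0), Part 2 models the
# amount among those with positive costs; expected cost combines both parts.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 585
df = pd.DataFrame({
    "age": rng.normal(73, 7, n),
    "urinary_score": rng.uniform(0, 100, n),   # higher = better function (assumed)
})
p_any = 1 / (1 + np.exp(-(2.0 - 0.03 * df.urinary_score)))
any_cost = rng.random(n) < p_any
amount = np.where(any_cost, np.exp(5 + 0.01 * (100 - df.urinary_score)
                                   + rng.normal(0, 0.5, n)), 0.0)

X = sm.add_constant(df[["age", "urinary_score"]])
part1 = sm.Logit(any_cost.astype(float), X).fit(disp=0)   # P(cost > 0)
pos = amount > 0
part2 = sm.OLS(np.log(amount[pos]), X[pos]).fit()         # E[log cost | cost > 0]

# Expected cost combines both parts (retransformation/smearing ignored for brevity).
expected = part1.predict(X) * np.exp(part2.predict(X))
print(round(float(np.mean(expected)), 1))
```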
Circular current loops, magnetic dipoles and spherical harmonic analysis.
Alldredge, L.R.
1980-01-01
Spherical harmonic analysis (SHA) is the most used method of describing the Earth's magnetic field, even though spherical harmonic coefficients (SHC) almost completely defy interpretation in terms of real sources. Some moderately successful efforts have been made to represent the field in terms of dipoles placed in the core in an effort to have the model come closer to representing real sources. Dipole sources are only a first approximation to the real sources which are thought to be a very complicated network of electrical currents in the core of the Earth. -Author
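The connection between a circular current loop and a dipole can be made concrete: far from the loop, its field is that of a point dipole with moment m = I·A, which is the degree-1 part that spherical harmonic analysis recovers. The short sketch below uses arbitrary loop parameters for illustration.

```python
# Sketch: field of the equivalent point dipole of a circular current loop,
# B = (mu0/4pi) * [3(m.rhat)rhat - m] / r^3, valid far from the loop.
import numpy as np

MU0 = 4e-7 * np.pi

def dipole_field(m_vec, r_vec):
    """B [T] of a point dipole m_vec [A m^2] at displacement r_vec [m]."""
    r = np.linalg.norm(r_vec)
    r_hat = r_vec / r
    return MU0 / (4 * np.pi) * (3 * np.dot(m_vec, r_hat) * r_hat - m_vec) / r**3

I, a = 1.0e9, 1.0e5                              # loop current [A], radius [m] (arbitrary)
m_vec = np.array([0.0, 0.0, I * np.pi * a**2])   # moment of the loop, along its axis

# At distances much larger than the loop radius the dipole term dominates, which is
# why a deep current loop and a point dipole are nearly indistinguishable at the surface.
print(dipole_field(m_vec, np.array([0.0, 0.0, 3.0e6])))   # on-axis point, 3000 km away
```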
A study of numerical methods for hyperbolic conservation laws with stiff source terms
NASA Technical Reports Server (NTRS)
Leveque, R. J.; Yee, H. C.
1988-01-01
The proper modeling of nonequilibrium gas dynamics is required in certain regimes of hypersonic flow. For inviscid flow this gives a system of conservation laws coupled with source terms representing the chemistry. Often a wide range of time scales is present in the problem, leading to numerical difficulties as in stiff systems of ordinary differential equations. Stability can be achieved by using implicit methods, but other numerical difficulties are observed. The behavior of typical numerical methods on a simple advection equation with a parameter-dependent source term was studied. Two approaches to incorporate the source term were utilized: MacCormack type predictor-corrector methods with flux limiters, and splitting methods in which the fluid dynamics and chemistry are handled in separate steps. Various comparisons over a wide range of parameter values were made. In the stiff case where the solution contains discontinuities, incorrect numerical propagation speeds are observed with all of the methods considered. This phenomenon is studied and explained.
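A minimal fractional-step (splitting) sketch for a model problem of the kind studied above, a linear advection equation with a stiff nonlinear source term, is given below. The parameter values are illustrative; with a stiff source, the discontinuity tends to exhibit the spurious propagation speed discussed in the abstract.

```python
# Splitting sketch: upwind advection step followed by an implicit (backward Euler,
# Newton-iterated) source step for u_t + u_x = -mu*u*(u-1)*(u-0.5).
import numpy as np

def source(u, mu):
    return -mu * u * (u - 1.0) * (u - 0.5)

nx, cfl, mu, t_end = 200, 0.8, 1000.0, 0.3   # large mu => stiff source
dx = 1.0 / nx
dt = cfl * dx                                # advection speed = 1
x = (np.arange(nx) + 0.5) * dx
u = np.where(x < 0.3, 1.0, 0.0)              # discontinuous initial data

t = 0.0
while t < t_end:
    # Step 1: first-order upwind advection (speed +1, inflow value 1 at x = 0).
    u = u - dt / dx * (u - np.roll(u, 1))
    u[0] = 1.0
    # Step 2: source step, backward Euler solved with a few Newton iterations.
    v = u.copy()
    for _ in range(5):
        f = v - u - dt * source(v, mu)
        fp = 1.0 - dt * (-mu) * (3 * v**2 - 3.0 * v + 0.5)
        v = v - f / fp
    u = v
    t += dt

# With stiff mu the front locks onto the grid and moves at the wrong speed,
# which is the pathology analyzed in the paper summarized above.
print(u[:10].round(3))
```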
Auditing the multiply-related concepts within the UMLS.
Mougin, Fleur; Grabar, Natalia
2014-10-01
This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms.
NASA Technical Reports Server (NTRS)
Karchmer, A. M.
1977-01-01
Fluctuating pressure measurements within the combustor and tailpipe of a turbofan engine are made simultaneously with far field acoustic measurements. The pressure measurements within the engine are accomplished with cooled semi-infinite waveguide probes utilizing conventional condenser microphones as the transducers. The measurements are taken over a broad range of engine operating conditions and for 16 far field microphone positions between 10 deg and 160 deg relative to the engine inlet axis. Correlation and coherence techniques are used to determine the relative phase and amplitude relationships between the internal pressures and far field acoustic pressures. The results indicate that the combustor is a low frequency source region for acoustic propagation through the tailpipe and out to the far field. Specifically, it is found that the relation between source pressure and the resulting sound pressure involves a 180 deg phase shift. The latter result is obtained by Fourier transforming the cross correlation function between the source pressure and acoustic pressure after removing the propagation delay time. Further, it is found that the transfer function between the source pressure and acoustic pressure has a magnitude approximately proportional to frequency squared. These results are shown to be consistent with a model using a modified source term in Lighthill's turbulence stress tensor, wherein the fluctuating Reynolds stresses are replaced with the pressure fluctuations due to fluctuating entropy.
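The coherence and transfer-function processing described above can be sketched with standard spectral estimators; in the fragment below, synthetic signals stand in for the engine and far-field data.

```python
# Sketch: coherence and frequency-response (transfer function) estimate between an
# internal "source" pressure and a far-field microphone signal, H(f) = Pxy/Pxx.
from scipy import signal
import numpy as np

fs = 4096.0
rng = np.random.default_rng(0)
t = np.arange(0, 30, 1 / fs)
p_comb = rng.normal(size=t.size)                         # internal combustor pressure
b, a = signal.butter(4, 400 / (fs / 2))                  # low-pass "propagation path"
p_far = signal.lfilter(b, a, p_comb) + 0.5 * rng.normal(size=t.size)

f, cxy = signal.coherence(p_comb, p_far, fs=fs, nperseg=2048)
f, pxx = signal.welch(p_comb, fs=fs, nperseg=2048)
f, pxy = signal.csd(p_comb, p_far, fs=fs, nperseg=2048)
H = pxy / pxx            # transfer function estimate; its phase and the trend of
                         # |H(f)| are the quantities examined in the study above
print(f[:4], np.abs(H[:4]).round(3), cxy[:4].round(2))
```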
Source-term characterisation and solid speciation of plutonium at the Semipalatinsk NTS, Kazakhstan.
Nápoles, H Jiménez; León Vintró, L; Mitchell, P I; Omarova, A; Burkitbayev, M; Priest, N D; Artemyev, O; Lukashenko, S
2004-01-01
New data on the concentrations of key fission/activation products and transuranium nuclides in samples of soil and water from the Semipalatinsk Nuclear Test Site are presented and interpreted. Sampling was carried out at Ground Zero, Lake Balapan, the Tel'kem craters and reference locations within the test site boundary well removed from localised sources. Radionuclide ratios have been used to characterise the source term(s) at each of these sites. The geochemical partitioning of plutonium has also been examined and it is shown that the bulk of the plutonium contamination at most of the sites examined is in a highly refractory, non-labile form.
Tropospheric ozone using an emission tagging technique in the CAM-Chem and WRF-Chem models
NASA Astrophysics Data System (ADS)
Lupascu, A.; Coates, J.; Zhu, S.; Butler, T. M.
2017-12-01
Tropospheric ozone is a short-lived climate-forcing pollutant. High concentrations of ozone can affect human health (cardiorespiratory effects and increased mortality due to long-term exposure) and also damage crops. Attributing ozone concentrations to the contributions from different sources would indicate the effects of locally emitted or transported precursors on ozone levels in specific regions. This information could be used as an important component of the design of emissions reduction strategies by indicating which emission sources could be targeted for effective reductions, thus reducing the burden of ozone pollution. Using a "tagging" approach within the CAM-Chem (global) and WRF-Chem (regional) models, we can quantify the contribution of individual emissions of NOx and VOC precursors to air quality. When precursor emissions of NOx are tagged, we see that the largest contributors to ozone levels are anthropogenic sources, while for precursor emissions of VOCs, biogenic sources and methane account for more than 50% of ozone levels. Further, we have extended the NOx tagging method in order to investigate continental source region contributions to concentrations of ozone over various receptor regions across the globe, with a focus on Europe. In general, summertime maximum ozone in most receptor regions is largely attributable to local emissions of anthropogenic NOx and biogenic VOC. During the rest of the year, especially during springtime, ozone in most receptor regions shows stronger influences from anthropogenic emissions of NOx and VOC in remote source regions.
Soltani, Amanallah; Roslan, Samsilah
2013-03-01
Reading decoding ability is a fundamental skill for acquiring the word-specific orthographic information necessary for skilled reading. Decoding ability and its underlying phonological processing skills have been heavily investigated among typically developing students. However, the issue has rarely been noticed among students with intellectual disability, who commonly suffer from reading decoding problems. This study is aimed at determining the contributions of phonological awareness, phonological short-term memory, and rapid automated naming, as three well-known phonological processing skills, to decoding ability among 60 participants with mild intellectual disability of unspecified origin, aged 15 to 23 years. The results of the correlation analysis revealed that all three aspects of phonological processing are significantly correlated with decoding ability. Furthermore, a series of hierarchical regression analyses indicated that, after controlling for the effect of IQ, phonological awareness and rapid automated naming are two distinct sources of decoding ability, but phonological short-term memory significantly contributes to decoding ability under the realm of phonological awareness.
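The hierarchical regression strategy (IQ entered first, then each phonological processing skill, with the change in R² examined at each step) can be sketched as follows on simulated scores; variable names and effect sizes are hypothetical.

```python
# Sketch of hierarchical regression: enter IQ first, then add each phonological
# processing predictor and report the increment in R^2 at each step.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 60
df = pd.DataFrame({
    "iq": rng.normal(60, 8, n),
    "phon_awareness": rng.normal(0, 1, n),
    "rapid_naming": rng.normal(0, 1, n),
    "phon_stm": rng.normal(0, 1, n),
})
df["decoding"] = (0.3 * df.iq + 4 * df.phon_awareness + 3 * df.rapid_naming
                  + 1 * df.phon_stm + rng.normal(0, 3, n))

steps = [["iq"],
         ["iq", "phon_awareness"],
         ["iq", "phon_awareness", "rapid_naming"],
         ["iq", "phon_awareness", "rapid_naming", "phon_stm"]]

prev_r2 = 0.0
for cols in steps:
    fit = sm.OLS(df["decoding"], sm.add_constant(df[cols])).fit()
    print(cols[-1], f"R2={fit.rsquared:.3f}", f"delta R2={fit.rsquared - prev_r2:.3f}")
    prev_r2 = fit.rsquared
```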
Inferring the nature of anthropogenic threats from long-term abundance records.
Shoemaker, Kevin T; Akçakaya, H Resit
2015-02-01
Diagnosing the processes that threaten species persistence is critical for recovery planning and risk forecasting. Dominant threats are typically inferred by experts on the basis of a patchwork of informal methods. Transparent, quantitative diagnostic tools would contribute much-needed consistency, objectivity, and rigor to the process of diagnosing anthropogenic threats. Long-term census records, available for an increasingly large and diverse set of taxa, may exhibit characteristic signatures of specific threatening processes and thereby provide information for threat diagnosis. We developed a flexible Bayesian framework for diagnosing threats on the basis of long-term census records and diverse ancillary sources of information. We tested this framework with simulated data from artificial populations subjected to varying degrees of exploitation and habitat loss and several real-world abundance time series for which threatening processes are relatively well understood: bluefin tuna (Thunnus maccoyii) and Atlantic cod (Gadus morhua) (exploitation) and Red Grouse (Lagopus lagopus scotica) and Eurasian Skylark (Alauda arvensis) (habitat loss). Our method correctly identified the process driving population decline for over 90% of time series simulated under moderate to severe threat scenarios. Successful identification of threats approached 100% for severe exploitation and habitat loss scenarios. Our method identified threats less successfully when threatening processes were weak and when populations were simultaneously affected by multiple threats. Our method selected the presumed true threat model for all real-world case studies, although results were somewhat ambiguous in the case of the Eurasian Skylark. In the latter case, incorporation of an ancillary source of information (records of land-use change) increased the weight assigned to the presumed true model from 70% to 92%, illustrating the value of the proposed framework in bringing diverse sources of information into a common rigorous framework. Ultimately, our framework may greatly assist conservation organizations in documenting threatening processes and planning species recovery. © 2014 Society for Conservation Biology.
Multi-Scale Analysis of Trends in Northeastern Temperate Forest Springtime Phenology
NASA Astrophysics Data System (ADS)
Moon, M.; Melaas, E. K.; Sulla-menashe, D. J.; Friedl, M. A.
2017-12-01
The timing of spring leaf emergence is highly variable in many ecosystems, exerts first-order control on growing season length, and significantly modulates seasonally integrated photosynthesis. Numerous studies have reported trends toward earlier spring phenology in temperate forests, with some indicating that this trend is also leading to increased carbon uptake. At broad spatial scales, however, most of these studies have used data from coarse spatial resolution instruments such as MODIS, which do not resolve ecologically important landscape-scale patterns in phenology. In this work, we examine how long-term trends in spring phenology differ across three data sources acquired at different scales of measurement at the Harvard Forest in central Massachusetts. Specifically, we compared trends in the timing of phenology based on long-term in-situ measurements of phenology, on estimates based on eddy-covariance measurements of net carbon uptake transition dates, and on two sources of satellite-based remote sensing (MODIS and Landsat) land surface phenology (LSP) data. Our analysis focused on the flux footprint surrounding the Harvard Forest Environmental Measurements (EMS) tower. Our results reveal clearly defined trends toward earlier springtime phenology in Landsat LSP and in the timing of tower-based net carbon uptake. However, we find no statistically significant trend in springtime phenology measured from MODIS LSP data products, possibly because the time series of MODIS observations is relatively short (13 years). The trend in tower-based transition dates was more strongly negative than the trend derived from Landsat LSP data (-0.42 and -0.28 days per year over 21 and 28 years, respectively). More importantly, these results have two key implications for how changes in spring phenology are affecting carbon uptake at the landscape scale. First, long-term trends in spring phenology can differ substantially depending on the data source used to estimate the trend. Second, the response of carbon uptake to climate change may be more sensitive than the response of land surface phenology itself.
Moskal, Aurelie; Pisa, Pedro T; Ferrari, Pietro; Byrnes, Graham; Freisling, Heinz; Boutron-Ruault, Marie-Christine; Cadeau, Claire; Nailler, Laura; Wendt, Andrea; Kühn, Tilman; Boeing, Heiner; Buijsse, Brian; Tjønneland, Anne; Halkjær, Jytte; Dahm, Christina C; Chiuve, Stephanie E; Quirós, Jose R; Buckland, Genevieve; Molina-Montes, Esther; Amiano, Pilar; Huerta Castaño, José M; Gurrea, Aurelio Barricarte; Khaw, Kay-Tee; Lentjes, Marleen A; Key, Timothy J; Romaguera, Dora; Vergnaud, Anne-Claire; Trichopoulou, Antonia; Bamia, Christina; Orfanos, Philippos; Palli, Domenico; Pala, Valeria; Tumino, Rosario; Sacerdote, Carlotta; de Magistris, Maria Santucci; Bueno-de-Mesquita, H Bas; Ocké, Marga C; Beulens, Joline W J; Ericson, Ulrika; Drake, Isabel; Nilsson, Lena M; Winkvist, Anna; Weiderpass, Elisabete; Hjartåker, Anette; Riboli, Elio; Slimani, Nadia
2014-01-01
Compared with food patterns, nutrient patterns have rarely been used, particularly at the international level. In the context of a multi-center study with heterogeneous data, we examined the methodological challenges of pattern analyses. We identified nutrient patterns from food frequency questionnaires (FFQ) in the European Prospective Investigation into Cancer and Nutrition (EPIC) Study and used 24-hour dietary recall (24-HDR) data to validate and describe the nutrient patterns and their related food sources. Associations between lifestyle factors and the nutrient patterns were also examined. Principal component analysis (PCA) was applied to 23 nutrients derived from country-specific FFQs combining data from all EPIC centers (N = 477,312). Harmonized 24-HDRs, available for a representative sample of the EPIC populations (N = 34,436), provided accurate mean group estimates of nutrients and foods by quintiles of pattern scores, presented graphically. An overall PCA combining all data captured a good proportion of the variance explained in each EPIC center. Four nutrient patterns were identified, explaining 67% of the total variance: principal component (PC) 1 was characterized by a high contribution of nutrients from plant food sources and a low contribution of nutrients from animal food sources; PC2 by a high contribution of micronutrients and proteins; PC3 by polyunsaturated fatty acids and vitamin D; and PC4 by calcium, proteins, riboflavin, and phosphorus. The nutrients with high loadings on a particular pattern as derived from country-specific FFQs also showed high deviations in their mean EPIC intakes by quintiles of pattern scores when estimated from 24-HDR. Center and energy intake explained most of the variability in pattern scores. The use of 24-HDR enabled internal validation and facilitated the interpretation of the nutrient patterns derived from FFQs in terms of food sources. These outcomes open research opportunities and perspectives for using nutrient patterns in future studies, particularly at the international level.
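As an editor's illustration of the pattern-derivation step (not the EPIC pipeline), the sketch below runs PCA on a synthetic person-by-nutrient matrix, retains four components, and summarizes one intake variable by quintiles of the first pattern score; all variable names and data are synthetic.

```python
# Minimal sketch: derive nutrient patterns by PCA on a standardized
# person-by-nutrient matrix and summarize an intake variable by quintiles of
# the first pattern score. Data and nutrient names are synthetic.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
nutrients = [f"nutrient_{i}" for i in range(23)]          # 23 nutrients, as in the study
X = pd.DataFrame(rng.lognormal(0.0, 0.4, size=(5000, 23)), columns=nutrients)

pca = PCA(n_components=4)                                 # four patterns were retained
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("cumulative variance explained:", np.round(pca.explained_variance_ratio_.cumsum(), 2))

# Loadings identify which nutrients characterize each principal component (PC).
loadings = pd.DataFrame(pca.components_.T, index=nutrients,
                        columns=[f"PC{i+1}" for i in range(4)])
print(loadings["PC1"].sort_values().tail(5))

# Mean intake of one synthetic nutrient by quintile of PC1 score, mirroring the
# 24-HDR-based description of patterns by quintiles of pattern scores.
quintile = pd.qcut(scores[:, 0], 5, labels=[f"Q{k}" for k in range(1, 6)])
print(X.groupby(quintile)["nutrient_0"].mean())
```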
Ruan, W; Bürkle, T; Dudeck, J
2000-01-01
In this paper we present a data dictionary server for the automated navigation of information sources. The underlying knowledge is represented within a medical data dictionary. The mapping between medical terms and information sources is based on a semantic network. The key design question for the dictionary server is how to represent the semantic network so that it is easy to navigate and operate on, i.e., how to abstract the semantic network and represent it in memory for various operations. This paper describes an object-oriented design, based on Java, that represents the semantic network as a group of objects: a node and its relationships to its neighbors are encapsulated in one object. On top of this representation model, several operations have been implemented. They comprise the extraction of the parts of the semantic network that can be reached from a given node, as well as finding all paths between a start node and a predefined destination node. This solution is independent of any given layout of the semantic structure. Therefore, the module, called the Giessen Data Dictionary Server, can act independently of a specific clinical information system. The dictionary server will be used to present clinical information, e.g., treatment guidelines or drug information sources, to the clinician in an appropriate working context. The server is invoked from clinical documentation applications that contain an infobutton. Automated navigation will guide the user to all the information relevant to her/his topic that is currently available inside our closed clinical network.
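The two navigation operations described (extracting the subnetwork reachable from a node and enumerating all paths between two nodes) can be sketched as follows. The original server is implemented in Java; this Python version, with hypothetical node names, only illustrates the idea of encapsulating a node and its relationships in one object.

```python
# Sketch of the two navigation operations described in the abstract
# (original implementation is Java). Node names are hypothetical.
from collections import deque

class Node:
    """A term node that encapsulates its outgoing relationships."""
    def __init__(self, name):
        self.name = name
        self.neighbors = []                  # list of (relation, Node)

    def link(self, relation, other):
        self.neighbors.append((relation, other))

def reachable(start):
    """Extract the part of the semantic network reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for _, nxt in node.neighbors:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def all_paths(start, goal, path=None):
    """Find every acyclic path between a start node and a destination node."""
    path = (path or []) + [start]
    if start is goal:
        yield path
        return
    for _, nxt in start.neighbors:
        if nxt not in path:
            yield from all_paths(nxt, goal, path)

# Toy network: a diagnosis term linked to a guideline and a drug-information source.
dx, guideline, drug = Node("pneumonia"), Node("treatment guideline"), Node("drug info")
dx.link("has_guideline", guideline)
dx.link("treated_with", drug)
drug.link("documented_in", guideline)

print({n.name for n in reachable(dx)})
print([[n.name for n in p] for p in all_paths(dx, guideline)])
```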
Bu, Wenting; Zheng, Jian; Ketterer, Michael E; Hu, Sheng; Uchida, Shigeo; Wang, Xiaolin
2017-12-01
Measurements of the long-lived radionuclide 236U are an important endeavor, not only in nuclear safeguards work, but also in terms of using this emerging nuclide as a tracer in chemical oceanography, hydrology, and actinide sourcing. Depending on the properties of a sample and its neutron irradiation history, 236U/238U ratios from different sources vary significantly. Therefore, this ratio can be treated as an important fingerprint for radioactive source identification and, in particular, affords a definitive means of discriminating between naturally occurring U and specific types of anthropogenic U. The development of mass spectrometric techniques makes it possible to determine ultra-trace levels of 236U in environmental samples. In this paper, we review the current status of mass spectrometric approaches for determination of 236U in environmental samples. Various sample preparation methods are summarized and compared. The mass spectrometric techniques emphasized herein are thermal ionization mass spectrometry (TIMS), inductively coupled plasma mass spectrometry (ICP-MS), and accelerator mass spectrometry (AMS). The strategies or principles used by each technique for the analysis of 236U are described. The performances of these techniques in terms of abundance sensitivity and detection limit are discussed in detail. To date, AMS exhibits the best capability for ultra-trace determinations of 236U. The levels and behaviors of 236U in various environmental media are summarized and discussed as well. Results suggest that 236U has an important, emerging role as a tracer for geochemical studies. Copyright © 2017 Elsevier B.V. All rights reserved.
Salvinelli, Carlo; Elmore, A Curt; Reidmeyer, Mary R; Drake, K David; Ahmad, Khaldoun I
2016-11-01
Ceramic pot filters (CPFs) represent a common and effective household water treatment technology in developing countries, but the factors affecting water production rate are not well known. Turbidity of the source water may be a principal indicator for characterizing a filter's lifetime in terms of water production capacity. A flow rate study was conducted by creating four controlled scenarios with different turbidities, and influent and effluent water samples were tested for total suspended solids and particle size distribution. A relationship between average flow rate and turbidity was identified, with a negative linear slope of 50 mL h-1 per NTU. A positive linear relationship was also found between the initial flow rate of the filters and the average flow rate calculated over the 23-day life of the experiment. It was therefore possible to establish a method to estimate the average flow rate given the initial flow rate and the turbidity of the influent water source, and to back-calculate the maximum average turbidity that would need to be maintained in order to achieve a specific average flow rate. However, long-term investigations should be conducted to assess how these relationships change over the expected CPF lifetime. CPFs rejected fine suspended particles (below 75 μm), especially particles with diameters between 0.375 μm and 10 μm. The results confirmed that ceramic pot filters effectively reduce turbidity, but pretreatment of influent water should be performed to avoid premature failure. Copyright © 2016 Elsevier Ltd. All rights reserved.
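The estimation method described can be sketched as a simple linear calculation. Only the 50 mL h-1 per NTU slope comes from the abstract; the remaining coefficients and the example flow rates below are hypothetical placeholders, not values from the study.

```python
# Back-of-the-envelope sketch of the two estimates described in the abstract.
# Only the -50 mL/h per NTU slope comes from the text; the other coefficients
# (a, b) and the example values (q0, targets) are hypothetical placeholders.
SLOPE_NTU = -50.0      # change in average flow rate (mL/h) per NTU, from the abstract

def avg_flow_rate(initial_flow_mlh, turbidity_ntu, a=0.8, b=0.0):
    """Average flow rate (mL/h) from initial flow rate and source turbidity,
    assuming a linear dependence on each term (a, b are placeholders)."""
    return a * initial_flow_mlh + b + SLOPE_NTU * turbidity_ntu

def max_turbidity(initial_flow_mlh, target_avg_mlh, a=0.8, b=0.0):
    """Back-calculate the turbidity that still meets a target average flow rate."""
    return (a * initial_flow_mlh + b - target_avg_mlh) / -SLOPE_NTU

q0 = 2000.0                                     # hypothetical initial flow rate, mL/h
print(avg_flow_rate(q0, turbidity_ntu=5.0))     # estimated average flow rate
print(max_turbidity(q0, target_avg_mlh=1200.0)) # max turbidity for a 1200 mL/h target
```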
The effect of distraction on change detection in crowded acoustic scenes.
Petsas, Theofilos; Harrison, Jemma; Kashino, Makio; Furukawa, Shigeto; Chait, Maria
2016-11-01
In this series of behavioural experiments we investigated the effect of distraction on the maintenance of acoustic scene information in short-term memory. Stimuli were artificial acoustic 'scenes' composed of several (up to twelve) concurrent tone-pip streams ('sources'). A gap (1000 ms) was inserted partway through the scene; changes, in the form of the appearance of a new source or the disappearance of an existing source, occurred after the gap in 50% of the trials. Listeners were instructed to monitor the unfolding 'soundscapes' for these events. Distraction was measured by presenting distractor stimuli during the gap. Experiments 1a and 1b used a dual-task design in which listeners were required to perform a task with varying attentional demands ('High Demand' vs. 'Low Demand') on brief auditory (Experiment 1a) or visual (Experiment 1b) signals presented during the gap. Experiments 2 and 3 required participants to ignore distractor sounds and focus on the change detection task. Our results demonstrate that the maintenance of scene information in short-term memory depends on the availability of attentional and/or processing resources during the gap, and that this dependence appears to be modality specific. We also show that these processes are susceptible to bottom-up distraction even when the distractors are not novel but occur on each trial. Change detection performance is systematically linked with the independently determined perceptual salience of the distractor sound. The findings also demonstrate that the present task may be a useful objective means for determining relative perceptual salience. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
Operations of Sandia National Laboratories, Nevada (SNL/NV) at the Tonopah Test Range (TTR) resulted in no planned point radiological releases during 1996. Other releases from SNL/NV included diffuse transuranic sources consisting of the three Clean Slate sites. Air emissions from these sources result from wind resuspension of near-surface transuranic-contaminated soil particulates. The total area of contamination has been estimated to exceed 20 million square meters. Soil contamination was documented in an aerial survey program in 1977 (EG&G 1979). Surface contamination levels were generally found to be below 400 pCi/g of combined plutonium-238, plutonium-239, plutonium-240, and americium-241 (i.e., transuranic) activity. Hot spot areas contain up to 43,000 pCi/g of transuranic activity. Recent measurements confirm the presence of significant levels of transuranic activity in the surface soil. An annual diffuse source term of 0.39 Ci of transuranic material was calculated for the cumulative release from all three Clean Slate sites. A maximally exposed individual dose of 1.1 mrem/yr at the TTR airport area was estimated based on the 1996 diffuse source release amounts and site-specific meteorological data. A population dose of 0.86 person-rem/yr was calculated for the local residents. Both dose values were attributable to inhalation of transuranic-contaminated dust.
New era of electronic brachytherapy
Ramachandran, Prabhakar
2017-01-01
Traditional brachytherapy refers to the placement of radioactive sources on or inside cancerous tissue. Based on the type of source, brachytherapy can be classified as radionuclide or electronic brachytherapy. Electronic brachytherapy uses miniaturized X-ray sources instead of radionuclides to deliver high doses of radiation. The advantages of electronic brachytherapy include low dose to organs at risk, reduced dose to treating staff, no leakage radiation in the off state, less shielding, and no radioactive waste. Most of these systems operate between 50 and 100 kVp and are widely used in the treatment of skin cancer. The Intrabeam, Xoft, and Papillon systems are also used for intra-operative radiotherapy of the breast in addition to other treatment sites. The rapid dose fall-off due to the low energy is a highly desirable property in brachytherapy and results in a reduced dose to the surrounding normal tissues compared to the Ir-192 source. The Xoft Axxent brachytherapy system uses a 2.25 mm miniaturized X-ray tube; the source almost mimics the high-dose-rate Ir-192 source in terms of dose rate, and it is the only electronic brachytherapy system specifically used in the treatment of cervical cancers. One of the limiting factors that impedes the use of electronic brachytherapy for interstitial applications is the source dimension. However, it is highly anticipated that a miniaturized X-ray tube closer to the dimensions of an Ir-192 wire is not far away, and the new era of electronic brachytherapy has just begun. PMID:28529679
Chau, John H; Rahfeldt, Wolfgang A; Olmstead, Richard G
2018-03-01
Targeted sequence capture can be used to efficiently gather sequence data for large numbers of loci, such as single-copy nuclear loci. Most published studies in plants have used taxon-specific locus sets developed individually for a clade using multiple genomic and transcriptomic resources. General locus sets can also be developed from loci that have been identified as single-copy and have orthologs in large clades of plants. We identify and compare a taxon-specific locus set and three general locus sets (conserved ortholog set [COSII], shared single-copy nuclear [APVO SSC] genes, and pentatricopeptide repeat [PPR] genes) for targeted sequence capture in Buddleja (Scrophulariaceae) and outgroups. We evaluate their performance in terms of assembly success, sequence variability, and resolution and support of inferred phylogenetic trees. The taxon-specific locus set had the most target loci. Assembly success was high for all locus sets in Buddleja samples. For outgroups, general locus sets had greater assembly success. Taxon-specific and PPR loci had the highest average variability. The taxon-specific data set produced the best-supported tree, but all data sets showed improved resolution over previous non-sequence capture data sets. General locus sets can be a useful source of sequence capture targets, especially if multiple genomic resources are not available for a taxon.
An extension of the Lighthill theory of jet noise to encompass refraction and shielding
NASA Technical Reports Server (NTRS)
Ribner, Herbert S.
1995-01-01
A formalism for jet noise prediction is derived that includes the refractive 'cone of silence' and other effects; outside the cone it approximates the simple Lighthill format. A key step is deferral of the simplifying assumption of uniform density in the dominant 'source' term. The result is conversion to a convected wave equation retaining the basic Lighthill source term. The main effect is to amend the Lighthill solution to allow for refraction by mean flow gradients, achieved via a frequency-dependent directional factor. A general formula for the power spectral density emitted from unit volume is developed as the Lighthill-based value multiplied by a squared 'normalized' Green's function (the directional factor), referred to a stationary point source. The convective motion of the sources, with its powerful amplifying effect, also directional, is already accounted for in the Lighthill format: wave convection and source convection are decoupled. The normalized Green's function appears to be near unity outside the refraction-dominated 'cone of silence'; this validates our long-term practice of using Lighthill-based approaches outside the cone, with extension inside via the Green's function. The function is obtained either experimentally (injected 'point' source) or numerically (computational aeroacoustics). Approximation by unity seems adequate except near the cone and except when there are shrouding jets: in that case the difference from unity quantifies the shielding effect. Further extension yields dipole and monopole source terms (cf. Morfey, Mani, and others) when the mean flow possesses density gradients (e.g., hot jets).
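In schematic notation (ours, not the paper's), the central relation for the power spectral density emitted per unit source volume reads:

```latex
% Schematic notation only: Lighthill-based estimate scaled by the squared
% normalized Green's function referred to a stationary point source.
\[
  \frac{dP(\mathbf{x},\omega)}{dV(\mathbf{y})}
  \;=\;
  \left|\hat{G}_{\mathrm{n}}(\mathbf{x}\mid\mathbf{y};\omega)\right|^{2}
  \left[\frac{dP(\mathbf{x},\omega)}{dV(\mathbf{y})}\right]_{\mathrm{Lighthill}},
  \qquad
  \hat{G}_{\mathrm{n}} \to 1 \ \text{outside the cone of silence},
\]
```

so that refraction (and, for shrouding jets, shielding) enters entirely through the departure of the normalized Green's function from unity.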
VizieR Online Data Catalog: Stellar encounters with long-period comets (Feng+, 2015)
NASA Astrophysics Data System (ADS)
Feng, F.; Bailer-Jones, C. A. L.
2016-07-01
We have conducted simulations of the perturbation of the Oort cloud in order to estimate the significance of known encounters in generating long-period comets. We collected the data of stellar encounters from three sources: (Bailer-Jones, 2015, Cat. J/A+A/575/A35, hereafter BJ15), Dybczynski & Berski (2015MNRAS.449.2459D), and Mamajek et al. (2015ApJ...800L..17M). Following BJ15, we use the term 'object' to refer to each encountering star in our catalogue. A specific star may appear more than once but with different data, thus leading to a different object. (1 data file).
Quantitative determination of vinpocetine in dietary supplements
French, John M. T.; King, Matthew D.
2017-01-01
Current United States regulatory policies allow for the addition of pharmacologically active substances in dietary supplements if derived from a botanical source. The inclusion of certain nootropic drugs, such as vinpocetine, in dietary supplements has recently come under scrutiny due to the lack of defined dosage parameters and yet unproven short- and long-term benefits and risks to human health. This study quantified the concentration of vinpocetine in several commercially available dietary supplements and found that a highly variable range of 0.6–5.1 mg/serving was present across the tested products, with most products providing no specification of vinpocetine concentrations. PMID:27319129
Comparison of Factorization-Based Filtering for Landing Navigation
NASA Technical Reports Server (NTRS)
McCabe, James S.; Brown, Aaron J.; DeMars, Kyle J.; Carson, John M., III
2017-01-01
This paper develops and analyzes methods for fusing inertial navigation data with external data, such as data obtained from an altimeter and a star camera. The particular filtering techniques are based upon factorized forms of the Kalman filter, specifically the UDU and Cholesky factorizations. The factorized Kalman filters are utilized to ensure numerical stability of the navigation solution. Simulations are carried out to compare the performance of the different approaches along a lunar descent trajectory using inertial and external data sources. It is found that the factorized forms improve upon conventional filtering techniques in terms of ensuring numerical stability for the investigated landing navigation scenario.
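As one illustration of a factorized (square-root) measurement update, the sketch below implements Potter's classical Cholesky-factor update for a scalar measurement. It is a generic textbook form, not the paper's UDU/Cholesky implementation or its altimeter and star-camera models.

```python
# Illustrative Potter square-root measurement update (scalar measurement),
# one classical way to propagate a Cholesky-type covariance factor S (P = S S^T)
# for numerical stability. Generic textbook form; state and measurement models
# below are hypothetical.
import numpy as np

def potter_update(x, S, H, R, z):
    """Scalar-measurement update of state x and covariance square root S."""
    f = S.T @ H                      # n-vector
    alpha = float(f @ f + R)         # innovation variance
    K = (S @ f) / alpha              # Kalman gain
    x_new = x + K * (z - H @ x)
    gamma = 1.0 / (1.0 + np.sqrt(R / alpha))
    S_new = S - (gamma / alpha) * np.outer(S @ f, f)
    return x_new, S_new

# Tiny example: 2-state system, one altitude-like scalar measurement.
x = np.array([100.0, -1.0])                      # state estimate
S = np.linalg.cholesky(np.diag([25.0, 4.0]))     # covariance square root
H = np.array([1.0, 0.0])                         # measure the first state
x, S = potter_update(x, S, H, R=9.0, z=95.0)
print(x)
print(S @ S.T)                                   # updated covariance stays symmetric PSD
```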
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; McGrath, Liam R.; Whitney, Paul D.
2011-11-17
We present a computational approach to radical rhetoric that leverages the co-expression of rhetoric and action features in discourse to identify violent intent. The approach combines text mining and machine learning techniques with insights from Frame Analysis and theories that explain the emergence of violence in terms of moral disengagement, the violation of sacred values, and social isolation in order to build computational models that identify messages from terrorist sources and estimate their proximity to an attack. We discuss a specific application of this approach to a body of documents from and about radical and terrorist groups in the Middle East and present the results achieved.
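As a generic stand-in for the "text features plus supervised model" pattern (the paper's actual features come from Frame Analysis and related theories and are not reproduced here), the following sketch trains a TF-IDF bag-of-words classifier on a few synthetic documents.

```python
# Generic stand-in only: a bag-of-words classifier illustrating the
# text-features-to-supervised-model pattern. The documents and labels are
# synthetic and the feature set is not the authors' frame-analysis features.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [
    "we must act against them now",          # synthetic "action rhetoric"
    "our grievances have been ignored",
    "the community gathered for a meeting",  # synthetic neutral text
    "volunteers organized a charity event",
]
labels = [1, 1, 0, 0]                        # 1 = flagged rhetoric, 0 = other

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(docs, labels)
print(model.predict_proba(["they ignored us and we must act"])[:, 1])
```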
Programmable Bio-surfaces for Biomedical Applications.
Shiba, Kiyotaka
2017-01-01
A peptide can be used as a functional building block to construct artificial systems when it has sufficient transplantability and functional independence in terms of its assigned function. Recent advances in in vitro evolution systems have been increasing the list of peptides that specifically bind to certain targets, such as proteins and cells. By properly displaying these peptides on solid surfaces, we can endow the inorganic materials with various biological functions, which will contribute to the development of diagnosis and therapeutic medical devices. Here, the methods for the peptide-based surface functionalization are reviewed by focusing on sources of peptides as well as methods of immobilization.
The prospects of construction and transport industry
NASA Astrophysics Data System (ADS)
Yaskova, Natalia
2017-10-01
The article focuses on the problem of moving the construction industry into a phase of growth and prosperity. The method of target orienting developed by the author is aimed at overcoming the technological weakness of the construction industry and eliminating disproportions in the structure of its capital funds. The exhaustion of traditional sources of growth in the construction industry and the real property market required research on specific technologies of interphase transformations and their development. This will contribute to realizing the objective laws of the new wave of construction growth, which provides for a development of the structure of immovable property that is reasonable in terms of the strategic priorities of the national economy.
The technology application process as applied to a firefighter's breathing system
NASA Technical Reports Server (NTRS)
Mclaughlan, P. B.
1974-01-01
The firefighter's breathing system (FBS) program indicated that applications of advanced technology can result in an improved FBS that satisfies the requirements defined by municipal fire departments. To accomplish this technology transfer, a substantial commitment of resources over an extended period of time was required. The program also indicated that NASA's capabilities in program management, such as requirements definition, system analysis, and industry coordination, may play as important a role as specific sources of hardware technology. As a result of the FBS program, a sequence of milestones was passed that may serve as generalized milestones and objectives for any technology application program.
de Souza, Andrea; Bittker, Joshua A; Lahr, David L; Brudz, Steve; Chatwin, Simon; Oprea, Tudor I; Waller, Anna; Yang, Jeremy J; Southall, Noel; Guha, Rajarshi; Schürer, Stephan C; Vempati, Uma D; Southern, Mark R; Dawson, Eric S; Clemons, Paul A; Chung, Thomas D Y
2014-06-01
Recent industry-academic partnerships involve collaboration among disciplines, locations, and organizations using publicly funded "open-access" and proprietary commercial data sources. These require the effective integration of chemical and biological information from diverse data sources, which presents key informatics, personnel, and organizational challenges. The BioAssay Research Database (BARD) was conceived to address these challenges and serve as a community-wide resource and intuitive web portal for public-sector chemical-biology data. Its initial focus is to enable scientists to more effectively use the National Institutes of Health Roadmap Molecular Libraries Program (MLP) data generated from the 3-year pilot and 6-year production phases of the Molecular Libraries Probe Production Centers Network (MLPCN), which is currently in its final year. BARD evolves the current data standards through structured assay and result annotations that leverage BioAssay Ontology and other industry-standard ontologies, and a core hierarchy of assay definition terms and data standards defined specifically for small-molecule assay data. We initially focused on migrating the highest-value MLP data into BARD and bringing it up to this new standard. We review the technical and organizational challenges overcome by the interdisciplinary BARD team, veterans of public- and private-sector data-integration projects, who are collaborating to describe (functional specifications), design (technical specifications), and implement this next-generation software solution. © 2014 Society for Laboratory Automation and Screening.
Sirisan, V; Pattarajinda, V; Vichitphan, K; Leesing, R
2013-08-01
Ruminal organic acid production, especially lactic acid, can be modified by feeding cattle highly concentrated diets, which have been shown to adversely affect dairy cattle health. Therefore, the use of lactic acid-utilizing organisms is considered to be a potential method for controlling lactic acid levels. This study was conducted to isolate and identify lactic acid-utilizing yeasts from the ruminal fluid of dairy cattle and to determine the specific growth rate and generation time when using lactic acid as a carbon source instead of glucose. Seventeen yeast isolates were examined in this study. Yeasts isolated from dairy cattle that were fed a high cassava pulp diet (HCP) had higher specific growth rates and shorter generation times than yeasts isolated from dairy cattle that were fed a high-concentrate diet (HC) and a mixed diet (M). The three most effective yeasts in terms of specific growth rate and generation time were Pichia kudriavzevii, Candida rugosa and Kodamaea ohmeri, with 99, 100 and 99% nucleotide identities, respectively. These three isolates could be used as potential probiotics in dairy cattle diets. This study demonstrates that yeasts isolated from the ruminal fluid of dairy cattle can utilize lactic acid as a carbon and energy source for growth. The isolated yeasts can be used as probiotic supplements for dairy cattle that are fed highly concentrated diets to reduce ruminal lactic acid production. © 2013 The Society for Applied Microbiology.
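The two growth measures reported, specific growth rate and generation time, follow from the standard exponential-phase relations sketched below; the optical-density readings in the example are hypothetical and not taken from the study.

```python
# Standard exponential-phase calculations for specific growth rate and
# generation time; the optical-density readings below are hypothetical.
import numpy as np

def specific_growth_rate(x1, x2, t1, t2):
    """mu (1/h) from biomass or OD readings x1 at time t1 and x2 at t2 (hours)."""
    return (np.log(x2) - np.log(x1)) / (t2 - t1)

def generation_time(mu):
    """Doubling (generation) time in hours for a given specific growth rate."""
    return np.log(2.0) / mu

mu = specific_growth_rate(x1=0.12, x2=0.95, t1=2.0, t2=8.0)   # hypothetical ODs
print(f"mu = {mu:.3f} 1/h, generation time = {generation_time(mu):.2f} h")
```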
Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei
2013-01-01
Given a single index, receiver operating characteristic (ROC) curve analysis is routinely used to characterize performance in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, clinical ratings, and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes the available information. A number of algorithmic/analytic approaches that combine multiple indices have long been used to incorporate multiple sources simultaneously. In this study, we propose an alternative that combines multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct a multivariate ROC (multiV-ROC) and to characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison with the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets investigated, the proposed multiV-ROC approach provides a natural and practical alternative with improved classification accuracy compared with univariate ROC and linear discriminant analysis. PMID:23702553
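A toy sketch of the logical-combination step is given below: two synthetic indices are thresholded and combined with "AND", "OR", or "at least n" rules, and sensitivity/specificity are computed for each rule. Data, thresholds, and index names are assumptions for illustration only.

```python
# Toy sketch of combining two diagnostic indices with logical rules ("AND",
# "OR", "at least n") and reporting sensitivity/specificity, in the spirit of
# the multiV-ROC idea. Data, thresholds, and index names are synthetic.
import numpy as np

rng = np.random.default_rng(42)
n = 500
y = rng.integers(0, 2, n)                              # 1 = patient, 0 = control
index1 = y * 1.0 + rng.normal(0, 1, n)                 # two synthetic indices that
index2 = y * 0.8 + rng.normal(0, 1, n)                 # partially separate the groups

def combine(indices, thresholds, rule="at least n", n_required=1):
    """Binary decision from multiple index/threshold pairs via a logical rule."""
    votes = np.column_stack([ind > th for ind, th in zip(indices, thresholds)])
    if rule == "AND":
        return votes.all(axis=1)
    if rule == "OR":
        return votes.any(axis=1)
    return votes.sum(axis=1) >= n_required             # "at least n"

def sens_spec(pred, truth):
    sensitivity = np.mean(pred[truth == 1])
    specificity = np.mean(~pred[truth == 0])
    return sensitivity, specificity

for rule in ("AND", "OR"):
    pred = combine([index1, index2], thresholds=[0.5, 0.4], rule=rule)
    print(rule, "sensitivity/specificity:", np.round(sens_spec(pred, y), 2))
```

Sweeping the thresholds instead of fixing them traces out the multivariate operating characteristic for each rule.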
Krall, Jenna R; Ladva, Chandresh N; Russell, Armistead G; Golan, Rachel; Peng, Xing; Shi, Guoliang; Greenwald, Roby; Raysoni, Amit U; Waller, Lance A; Sarnat, Jeremy A
2018-06-01
Concentrations of traffic-related air pollutants are frequently higher within commuting vehicles than in ambient air. Pollutants found within vehicles may include those generated by tailpipe exhaust, brake wear, and road dust sources, as well as pollutants from in-cabin sources. Source-specific pollution, compared to total pollution, may represent regulation targets that can better protect human health. We estimated source-specific pollution exposures and corresponding pulmonary response in a panel study of commuters. We used constrained positive matrix factorization to estimate source-specific pollution factors and, subsequently, mixed effects models to estimate associations between source-specific pollution and pulmonary response. We identified four pollution factors that we named: crustal, primary tailpipe traffic, non-tailpipe traffic, and secondary. Among asthmatic subjects (N = 48), interquartile range increases in crustal and secondary pollution were associated with changes in lung function of -1.33% (95% confidence interval (CI): -2.45, -0.22) and -2.19% (95% CI: -3.46, -0.92) relative to baseline, respectively. Among non-asthmatic subjects (N = 51), non-tailpipe pollution was associated with pulmonary response only at 2.5 h post-commute. We found no significant associations between pulmonary response and primary tailpipe pollution. Health effects associated with traffic-related pollution may vary by source, and therefore some traffic pollution sources may require targeted interventions to protect health.
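As a hedged stand-in for the analysis pipeline, the sketch below factorizes a synthetic species-concentration matrix with non-negative matrix factorization (sklearn's NMF, used here in place of constrained positive matrix factorization) and relates one factor's scores to a simulated lung-function change with a random-intercept mixed model; all data and the factor label are synthetic.

```python
# Hedged stand-in: NMF in place of constrained positive matrix factorization,
# followed by a random-intercept mixed model relating one factor's scores to a
# simulated lung-function change. All data are synthetic.
import numpy as np
import pandas as pd
from sklearn.decomposition import NMF
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_obs, n_species = 300, 10
X = rng.gamma(2.0, 1.0, size=(n_obs, n_species))       # synthetic species concentrations

W = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0).fit_transform(X)

df = pd.DataFrame({
    "factor1": W[:, 0],                                # e.g. a "crustal-like" factor score
    "subject": np.repeat(np.arange(30), 10),           # 30 subjects, 10 commutes each
})
df["lung_change"] = -0.5 * df["factor1"] + rng.normal(0, 1, n_obs)

fit = smf.mixedlm("lung_change ~ factor1", df, groups=df["subject"]).fit()
print(fit.summary())
```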
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brunett, Acacia J.; Bucknor, Matthew; Grabaskas, David
A vital component of the U.S. reactor licensing process is an integrated safety analysis in which a source term representing the release of radionuclides during normal operation and accident sequences is analyzed. Historically, source term analyses have utilized bounding, deterministic assumptions regarding radionuclide release. However, advancements in technical capabilities and the knowledge state have enabled the development of more realistic and best-estimate retention and release models, such that a mechanistic source term assessment can be expected to be a required component of future licensing of advanced reactors. Recently, as part of a Regulatory Technology Development Plan effort for sodium-cooled fast reactors (SFRs), Argonne National Laboratory has investigated the current state of knowledge of potential source terms in an SFR via an extensive review of previous domestic experiments, accidents, and operation. As part of this work, the significant sources and transport processes of radionuclides in an SFR have been identified and characterized. This effort examines all stages of release and source term evolution, beginning with release from the fuel pin and ending with retention in containment. Radionuclide sources considered in this effort include releases originating both in-vessel (e.g., in-core fuel, primary sodium, cover gas cleanup system, etc.) and ex-vessel (e.g., spent fuel storage, handling, and movement). Releases resulting from a primary sodium fire are also considered as a potential source. For each release group, dominant transport phenomena are identified and qualitatively discussed. The key product of this effort was the development of concise, inclusive diagrams that illustrate the release and retention mechanisms at a high level, where unique schematics have been developed for in-vessel, ex-vessel, and sodium fire releases. This review effort has also found that despite the substantial range of phenomena affecting radionuclide release, the current state of knowledge is extensive, and in most areas may be sufficient. Several knowledge gaps were identified, such as uncertainty in release from molten fuel and availability of thermodynamic data for lanthanides and actinides in liquid sodium. However, the overall findings suggest that high retention rates can be expected within the fuel and primary sodium for all radionuclides other than noble gases.
NASA Astrophysics Data System (ADS)
Peruzza, Laura; Azzaro, Raffaele; Gee, Robin; D'Amico, Salvatore; Langer, Horst; Lombardo, Giuseppe; Pace, Bruno; Pagani, Marco; Panzera, Francesco; Ordaz, Mario; Suarez, Miguel Leonardo; Tusa, Giuseppina
2017-11-01
This paper describes the model implementation and presents results of a probabilistic seismic hazard assessment (PSHA) for the Mt. Etna volcanic region in Sicily, Italy, considering local volcano-tectonic earthquakes. Working in a volcanic region presents new challenges not typically faced in standard PSHA, which are broadly due to the nature of the local volcano-tectonic earthquakes, the cone shape of the volcano and the attenuation properties of seismic waves in the volcanic region. These have been accounted for through the development of a seismic source model that integrates data from different disciplines (historical and instrumental earthquake datasets, tectonic data, etc.; presented in Part 1, by Azzaro et al., 2017) and through the development and software implementation of original tools for the computation, such as a new ground-motion prediction equation and magnitude-scaling relationship specifically derived for this volcanic area, and the capability to account for the surficial topography in the hazard calculation, which influences source-to-site distances. Hazard calculations have been carried out after updating the most recent releases of two widely used PSHA software packages (CRISIS, as in Ordaz et al., 2013; the OpenQuake engine, as in Pagani et al., 2014). Results are computed for short- to mid-term exposure times (10% probability of exceedance in 5 and 30 years, Poisson and time-dependent) and spectral amplitudes of engineering interest. A preliminary exploration of the impact of site-specific response is also presented for Etna's densely inhabited eastern flank, and the change in expected ground motion is finally commented on. These results do not account for M > 6 regional seismogenic sources which control the hazard at long return periods. However, by focusing on the impact of M < 6 local volcano-tectonic earthquakes, which dominate the hazard at the short- to mid-term exposure times considered in this study, we present a different viewpoint that, in our opinion, is relevant for retrofitting the existing buildings and for driving impending interventions of risk reduction.
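The hazard computation follows the standard PSHA exceedance-rate integral that engines such as CRISIS and the OpenQuake engine evaluate; in generic notation (ours, not taken from the paper):

```latex
% Generic form of the PSHA hazard integral; notation is not taken from the paper.
\[
  \lambda\!\left(IM > x\right)
  \;=\;
  \sum_{i=1}^{N_{\mathrm{src}}} \nu_i
  \iint P\!\left(IM > x \mid m, r\right)\,
  f_{M_i}(m)\, f_{R_i \mid M_i}(r \mid m)\; dm\, dr ,
\]
```

where nu_i is the activity rate of source i, f_Mi the magnitude distribution, f_Ri|Mi the source-to-site distance distribution (here shaped by the volcano's topography), and P(IM > x | m, r) is supplied by the ground-motion prediction equation.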
Jouneau, S; Dres, M; Guerder, A; Bele, N; Bellocq, A; Bernady, A; Berne, G; Bourdin, A; Brinchault, G; Burgel, P R; Carlier, N; Chabot, F; Chavaillon, J M; Cittee, J; Claessens, Y E; Delclaux, B; Deslée, G; Ferré, A; Gacouin, A; Girault, C; Ghasarossian, C; Gouilly, P; Gut-Gobert, C; Gonzalez-Bermejo, J; Jebrak, G; Le Guillou, F; Léveiller, G; Lorenzo, A; Mal, H; Molinari, N; Morel, H; Morel, V; Noel, F; Pégliasco, H; Perotin, J M; Piquet, J; Pontier, S; Rabbat, A; Revest, M; Reychler, G; Stelianides, S; Surpas, P; Tattevin, P; Roche, N
2017-04-01
Chronic obstructive pulmonary disease (COPD) is the chronic respiratory disease with the greatest public health burden in terms of morbidity, mortality, and health costs. For patients, COPD is a major source of disability because of dyspnea, restriction of daily activities, exacerbations, the risk of chronic respiratory failure, and extra-respiratory systemic organ disorders. The previous French Language Respiratory Society (SPLF) guidelines on COPD exacerbations were published in 2003. Using the GRADE methodology, the present document reviews current knowledge on COPD exacerbation through four specific outlines: (1) epidemiology, (2) clinical evaluation, (3) therapeutic management, and (4) prevention. Specific aspects of outpatient and inpatient care are discussed, especially regarding assessment of exacerbation severity and the pharmacological approach. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.
Radioisotope Electric Propulsion (REP): A Near-Term Approach to Nuclear Propulsion
NASA Technical Reports Server (NTRS)
Schmidt, George R.; Manzella, David H.; Kamhawi, Hani; Kremic, Tibor; Oleson, Steven R.; Dankanich, John W.; Dudzinski, Leonard A.
2009-01-01
Studies over the last decade have shown radioisotope-based nuclear electric propulsion to be enhancing and, in some cases, enabling for many potential robotic science missions. Also known as radioisotope electric propulsion (REP), the technology offers the performance advantages of traditional reactor-powered electric propulsion (i.e., high specific impulse propulsion at large distances from the Sun), but with much smaller, affordable spacecraft. Future use of REP requires development of radioisotope power sources with system specific powers well above that of current systems. The US Department of Energy and NASA have developed an advanced Stirling radioisotope generator (ASRG) engineering unit, which was subjected to rigorous flight qualification-level tests in 2008, and began extended lifetime testing later that year. This advancement, along with recent work on small ion thrusters and life extension technology for Hall thrusters, could enable missions using REP sometime during the next decade.