Bayesian source term determination with unknown covariance of measurements
NASA Astrophysics Data System (ADS)
Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav
2017-04-01
Determination of the source term of a release of hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimating the source term in the conventional linear inverse problem, y = Mx, where the relationship between the vector of observations y and the unknown source term x is described by the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as minimization of (y - Mx)^T R^{-1} (y - Mx) + x^T B^{-1} x with respect to x. The first term minimizes the error of the measurements with covariance matrix R, and the second term regularizes the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization assumes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix of the likelihood, R, is also unknown. We consider two potential choices for the structure of the matrix R: the first is a diagonal matrix and the second is a locally correlated structure using information on the topology of the measuring network. Since inference in the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated by applying the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
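For readers who want a concrete reference point, the inner step of such a formulation, estimating x for fixed covariances R and B, has the familiar closed form sketched below in Python (NumPy). This is only an illustrative sketch, not the paper's variational Bayes algorithm, which additionally infers R and B; the function name and interface are hypothetical.

```python
import numpy as np

def source_term_estimate(M, y, R, B):
    """Minimizer of (y - Mx)^T R^{-1} (y - Mx) + x^T B^{-1} x for fixed R, B:
        x_hat = (M^T R^{-1} M + B^{-1})^{-1} M^T R^{-1} y
    Tikhonov regularization is the special case B = I / alpha."""
    Rinv = np.linalg.inv(R)
    Binv = np.linalg.inv(B)
    normal_matrix = M.T @ Rinv @ M + Binv
    return np.linalg.solve(normal_matrix, M.T @ Rinv @ y)
```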
Bayesian estimation of a source term of radiation release with approximately known nuclide ratios
NASA Astrophysics Data System (ADS)
Tichý, Ondřej; Šmídl, Václav; Hofman, Radek
2016-04-01
We are concerned with estimation of the source term in the case of an accidental release from a known location, e.g. a power plant. Usually, the source term of an accidental release of radiation comprises a mixture of nuclides. Gamma dose rate measurements do not provide direct information on the source term composition. However, physical properties of the respective nuclides (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g. from a known reactor inventory. The proposed method is based on a linear inverse model where the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned and further regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to uncertainty in the ratios, the diagonal elements of the covariance matrix are considered to be unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following the Bayesian approach, we estimate all parameters of the model from the data so that y, M, and the known ratios are the only inputs of the method. Since inference in the model is intractable, we follow the variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on a simulated 6-hour power plant release where 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with the method with unknown nuclide ratios is given to demonstrate the usefulness of the proposed approach. This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
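A rough non-Bayesian stand-in for the positivity constraint is nonnegative regularized least squares, sketched below with NumPy/SciPy. The prior scales derived from the approximate nuclide ratios are represented here by a hypothetical ratio_prior_std argument; the actual method infers these scales and uses a truncated Gaussian posterior via variational Bayes.

```python
import numpy as np
from scipy.optimize import nnls

def positive_source_estimate(M, y, ratio_prior_std):
    """Solve min_x ||y - M x||^2 + sum_i (x_i / s_i)^2 subject to x >= 0
    by augmenting the system and calling NNLS; s_i are prior scales that
    would be implied by the (uncertain) nuclide ratios."""
    L = np.diag(1.0 / np.asarray(ratio_prior_std, dtype=float))  # B^{-1/2}, B diagonal
    A = np.vstack([M, L])
    b = np.concatenate([y, np.zeros(M.shape[1])])
    x_hat, _ = nnls(A, b)
    return x_hat
```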
Park, Eun Sug; Hopke, Philip K; Oh, Man-Suk; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford H
2014-07-01
There has been increasing interest in assessing health effects associated with multiple air pollutants emitted by specific sources. A major difficulty with achieving this goal is that the pollution source profiles are unknown and source-specific exposures cannot be measured directly; rather, they need to be estimated by decomposing ambient measurements of multiple air pollutants. This estimation process, called multivariate receptor modeling, is challenging because of the unknown number of sources and unknown identifiability conditions (model uncertainty). The uncertainty in source-specific exposures (source contributions), as well as the uncertainty in the number of major pollution sources and identifiability conditions, has been largely ignored in previous studies. A multipollutant approach that can deal with model uncertainty in multivariate receptor models while simultaneously accounting for parameter uncertainty in estimated source-specific exposures in the assessment of source-specific health effects is presented in this paper. The methods are applied to daily ambient air measurements of the chemical composition of fine particulate matter (PM2.5), weather data, and counts of cardiovascular deaths from 1995 to 1997 for Phoenix, AZ, USA. Our approach for evaluating source-specific health effects yields not only estimates of source contributions along with their uncertainties and associated health effects estimates but also estimates of model uncertainty (posterior model probabilities) that have been ignored in previous studies. The results from our methods agreed in general with those from the previously conducted workshop/studies on the source apportionment of PM health effects in terms of the number of major contributing sources, estimated source profiles, and contributions. However, some of the adverse source-specific health effects identified in the previous studies were not statistically significant in our analysis, probably because, unlike the previous studies, we incorporated the parameter uncertainty in estimated source contributions into the estimation of the health effects parameters.
Detecting fission from special nuclear material sources
Rowland, Mark S [Alamo, CA; Snyderman, Neal J [Berkeley, CA
2012-06-05
A neutron detector system for discriminating fissile material from non-fissile material wherein a digital data acquisition unit collects data at a high rate and, in real time, processes large volumes of data directly into information that a first responder can use to discriminate materials. The system counts neutrons from the unknown source and detects excess grouped neutrons to identify fission in the unknown source. The system includes a graphing component that displays a plot of the neutron distribution from the unknown source over a Poisson distribution and a plot of neutrons due to background or environmental sources. The system further includes a known neutron source placed in proximity to the unknown source to actively interrogate the unknown source in order to accentuate differences in neutron emission from the unknown source from Poisson distributions and/or environmental sources.
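The core discrimination idea, an excess of high-multiplicity count gates relative to a Poisson distribution, can be illustrated with a few lines of Python; this is a generic sketch, not the patented processing chain, and the gate width and function name are arbitrary.

```python
import numpy as np
from scipy.stats import poisson

def count_distribution(event_times, gate_width):
    """Histogram of neutron counts per time gate; fission sources show an
    excess of high-multiplicity gates relative to a Poisson distribution
    with the same mean count rate."""
    edges = np.arange(0.0, event_times.max() + gate_width, gate_width)
    counts_per_gate, _ = np.histogram(event_times, bins=edges)
    n_max = counts_per_gate.max()
    observed = np.bincount(counts_per_gate, minlength=n_max + 1) / len(counts_per_gate)
    expected = poisson.pmf(np.arange(n_max + 1), counts_per_gate.mean())
    return observed, expected   # plot both to look for excess grouped neutrons
```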
7. Photocopy of painting (Source unknown, Date unknown) EXTERIOR SOUTH ...
7. Photocopy of painting (Source unknown, Date unknown) EXTERIOR SOUTH FRONT VIEW OF MISSION AND CONVENTO AFTER 1913 - Mission San Francisco Solano de Sonoma, First & Spain Streets, Sonoma, Sonoma County, CA
PHYSICS OF OUR DAYS: Dark energy and universal antigravitation
NASA Astrophysics Data System (ADS)
Chernin, A. D.
2008-03-01
Universal antigravitation, a new physical phenomenon discovered astronomically at distances of 5 to 8 billion light years, manifests itself as cosmic repulsion that acts between distant galaxies and overcomes their gravitational attraction, resulting in the accelerating expansion of the Universe. The source of the antigravitation is not galaxies or any other bodies of nature but a previously unknown form of mass/energy that has been termed dark energy. Dark energy accounts for 70 to 80% of the total mass and energy of the Universe and, in macroscopic terms, is a kind of continuous medium that fills the entire space of the Universe and is characterized by positive density and negative pressure. With its physical nature and microscopic structure unknown, dark energy is among the most critical challenges fundamental science faces in the twenty-first century.
Microseismic source locations with deconvolution migration
NASA Astrophysics Data System (ADS)
Wu, Shaojiang; Wang, Yibo; Zheng, Yikang; Chang, Xu
2018-03-01
Identifying and locating microseismic events are critical problems in hydraulic fracturing monitoring for unconventional resource exploration. In contrast to active seismic data, microseismic data are usually recorded with unknown source excitation time and source location. In this study, we introduce deconvolution migration by combining deconvolution interferometry with interferometric cross-correlation migration (CCM). This method avoids the need for the source excitation time and enhances both the spatial resolution and robustness by eliminating the squared term of the source wavelets from CCM. The proposed algorithm is divided into the following three steps: (1) generate the virtual gathers by deconvolving the master trace with all other traces in the microseismic gather to remove the unknown excitation time; (2) migrate the virtual gather to obtain a single image of the source location and (3) stack all of these images together to obtain the final image estimate of the source location. We test the proposed method on a complex synthetic data set and a field data set from surface hydraulic fracturing monitoring, and compare the results with those obtained by interferometric CCM. The results demonstrate that the proposed method can obtain a 50 per cent higher spatial resolution image of the source location and a more robust estimation with smaller localization errors, especially in the presence of velocity model errors. This method is also beneficial for source mechanism inversion and global seismology applications.
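Step (1) of the workflow can be sketched as a water-level-stabilized frequency-domain deconvolution of the master trace from every trace in the gather. The snippet below is an illustrative NumPy implementation under that assumption; the array layout, the eps water-level parameter, and the function name are choices made here, not taken from the paper.

```python
import numpy as np

def deconvolve_gather(gather, master_index, eps=1e-3):
    """Deconvolve the master trace from every trace in the gather
    (frequency domain, water-level stabilized) to remove the unknown
    excitation time and source wavelet.  gather: (n_traces, n_samples)."""
    G = np.fft.rfft(gather, axis=1)
    M = G[master_index]
    water = eps * np.max(np.abs(M)) ** 2          # water-level stabilization
    virtual = G * np.conj(M) / (np.abs(M) ** 2 + water)
    return np.fft.irfft(virtual, n=gather.shape[1], axis=1)
```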
ERIC Educational Resources Information Center
Noel-Levitz, Inc, 2010
2010-01-01
Today, prospective students in ever greater numbers are secretly exploring colleges online on their own terms, using official and unofficial sources, without completing a college's response form. Many are withholding college entrance exam scores from colleges they might be interested in and remaining unknown to their institutions of choice until…
14. Photocopy of photograph (source unknown) photographer unknown pre-1885 NORTH ...
14. Photocopy of photograph (source unknown) photographer unknown pre-1885 NORTH SIDE AND WEST FRONT (NOTE ABSENCE OF DORMER ON GAMBREL ROOF OF ELL) (Illustration #6 of Data Report included in Field Records) - Narbonne House, 71 Essex Street, Salem, Essex County, MA
Augmented classical least squares multivariate spectral analysis
Haaland, David M.; Melgaard, David K.
2004-02-03
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
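The ACLS idea of augmenting a classical least squares model with shapes estimated from the calibration residuals can be caricatured as follows. This sketch assumes a spectra matrix A (samples x wavelengths) and a concentration matrix C (samples x components) and augments with the leading singular vectors of the CLS residual; it is a simplified illustration, not the patented algorithm, and the function names are hypothetical.

```python
import numpy as np

def acls_calibrate(A, C, n_augment=1):
    """Classical least squares calibration A ~ C K, augmented with the leading
    spectral shapes of the residual (a rough sketch of the ACLS idea)."""
    K = np.linalg.pinv(C) @ A              # pure-component spectral estimates
    E = A - C @ K                          # unmodeled spectral variation
    _, _, Vt = np.linalg.svd(E, full_matrices=False)
    return np.vstack([K, Vt[:n_augment]])  # augment with residual shapes

def acls_predict(A_new, K_aug, n_components):
    """Predict concentrations; the extra augmented 'components' absorb
    unmodeled spectral variation and are discarded."""
    C_all = A_new @ np.linalg.pinv(K_aug)
    return C_all[:, :n_components]
```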
Augmented Classical Least Squares Multivariate Spectral Analysis
Haaland, David M.; Melgaard, David K.
2005-07-26
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
Augmented Classical Least Squares Multivariate Spectral Analysis
Haaland, David M.; Melgaard, David K.
2005-01-11
A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
44. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print ...
44. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print Co., Los Angeles, CA, Photographer, Date unknown FIRST FLOOR PLAN - Richfield Oil Building, 555 South Flower Street, Los Angeles, Los Angeles County, CA
45. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print ...
45. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print Co., Los Angeles, CA, Photographer, Date unknown SECOND FLOOR PLAN - Richfield Oil Building, 555 South Flower Street, Los Angeles, Los Angeles County, CA
51. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print ...
51. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print Co., Los Angeles, CA, Photographer, Date unknown EXTERIOR, ELEVATION DETAILS - Richfield Oil Building, 555 South Flower Street, Los Angeles, Los Angeles County, CA
46. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print ...
46. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print Co., Los Angeles, CA, Photographer, Date unknown NORTH ELEVATION - Richfield Oil Building, 555 South Flower Street, Los Angeles, Los Angeles County, CA
47. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print ...
47. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print Co., Los Angeles, CA, Photographer, Date unknown WEST ELEVATION - Richfield Oil Building, 555 South Flower Street, Los Angeles, Los Angeles County, CA
Electromagnetic spectrum management system
Seastrand, Douglas R.
2017-01-31
A system for transmitting a wireless countermeasure signal to disrupt third-party communications is disclosed that includes an antenna configured to receive wireless signals and transmit wireless countermeasure signals such that the wireless countermeasure signals are responsive to the received wireless signals. A receiver processes the received wireless signals to create processed received signal data while a spectrum control module subtracts known source signal data from the processed received signal data to generate unknown source signal data. The unknown source signal data is based on unknown wireless signals, such as enemy signals. A transmitter is configured to process the unknown source signal data to create countermeasure signals and transmit a wireless countermeasure signal over the first antenna or a second antenna to thereby interfere with the unknown wireless signals.
49. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print ...
49. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print Co., Los Angeles, CA, Photographer, Date unknown SECTION THROUGH BUILDING, LOOKING EAST - Richfield Oil Building, 555 South Flower Street, Los Angeles, Los Angeles County, CA
48. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print ...
48. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print Co., Los Angeles, CA., Photographer, Date unknown SECTION THROUGH BUILDING, LOOKING NORTH - Richfield Oil Building, 555 South Flower Street, Los Angeles, Los Angeles County, CA
19. Photocopy of measured drawing (source unknown) 6 March 1945, ...
19. Photocopy of measured drawing (source unknown) 6 March 1945, delineator unknown PROPOSED ADAPTIVE REUSE AS CLEMSON COLLEGE FACULTY CLUB, SECOND FLOOR PLAN - Woodburn, Woodburn Road, U.S. Route 76 vicinity, Pendleton, Anderson County, SC
18. Photocopy of measured drawing (source unknown) 6 March 1945, ...
18. Photocopy of measured drawing (source unknown) 6 March 1945, delineator unknown PROPOSED ADAPTIVE REUSE AS CLEMSON COLLEGE FACULTY CLUB, FIRST FLOOR PLAN - Woodburn, Woodburn Road, U.S. Route 76 vicinity, Pendleton, Anderson County, SC
17. Photocopy of measured drawing (source unknown) 6 March 1945, ...
17. Photocopy of measured drawing (source unknown) 6 March 1945, delineator unknown PROPOSED ADAPTIVE REUSE AS CLEMSON COLLEGE FACULTY CLUB, BASEMENT PLAN - Woodburn, Woodburn Road, U.S. Route 76 vicinity, Pendleton, Anderson County, SC
20. Photocopy of measured drawing (source unknown) 6 March 1945, ...
20. Photocopy of measured drawing (source unknown) 6 March 1945, delineator unknown PROPOSED ADAPTIVE REUSE AS CLEMSON COLLEGE FACULTY CLUB, ATTIC PLAN - Woodburn, Woodburn Road, U.S. Route 76 vicinity, Pendleton, Anderson County, SC
21. Photocopy of measured drawing (source unknown) 6 March 1945, ...
21. Photocopy of measured drawing (source unknown) 6 March 1945, delineator unknown PROPOSED ADAPTIVE REUSE AS CLEMSON COLLEGE FACULTY CLUB, SITE PLAN - Woodburn, Woodburn Road, U.S. Route 76 vicinity, Pendleton, Anderson County, SC
Biotic Nitrogen Enrichment Regulates Calcium Sources to Forests
NASA Astrophysics Data System (ADS)
Pett-Ridge, J. C.; Perakis, S. S.; Hynicka, J. D.
2015-12-01
Calcium is an essential nutrient in forest ecosystems that is susceptible to leaching loss and depletion. Calcium depletion can affect plant and animal productivity, soil acid buffering capacity, and fluxes of carbon and water. Excess nitrogen supply and associated soil acidification are often implicated in short-term calcium loss from soils, but the long-term role of nitrogen enrichment on calcium sources and resupply is unknown. Here we use strontium isotopes (87Sr/86Sr) as a proxy for calcium to investigate how soil nitrogen enrichment from biological nitrogen fixation interacts with bedrock calcium to regulate both short-term available supplies and the long-term sources of calcium in montane conifer forests. Our study examines 22 sites in western Oregon, spanning a 20-fold range of bedrock calcium on sedimentary and basaltic lithologies. In contrast to previous studies emphasizing abiotic control of weathering as a determinant of long-term ecosystem calcium dynamics and sources (via bedrock fertility, climate, or topographic/tectonic controls), we find instead that biotic nitrogen enrichment of soil can strongly regulate calcium sources and supplies in forest ecosystems. For forests on calcium-rich basaltic bedrock, increasing nitrogen enrichment causes calcium sources to shift from rock-weathering to atmospheric dominance, with minimal influence from other major soil-forming factors, despite regionally high rates of tectonic uplift and erosion that can rejuvenate the weathering supply of soil minerals. For forests on calcium-poor sedimentary bedrock, we find that atmospheric inputs dominate regardless of the degree of nitrogen enrichment. Short-term measures of soil and ecosystem calcium fertility are decoupled from calcium source sustainability, with fundamental implications for understanding nitrogen impacts, both in natural ecosystems and in the context of global change. Our finding that long-term nitrogen enrichment increases forest reliance on atmospheric calcium helps explain reports of greater ecological calcium limitation in an increasingly nitrogen-rich world.
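As a worked example of the Sr-isotope mass balance behind such inferences, a two-endmember mixing calculation gives the atmospheric fraction of ecosystem Sr (used as a proxy for Ca). The endmember ratios in the usage comment are illustrative placeholders, not values from the study, and the calculation assumes the two endmember ratios are known and Sr behaves conservatively.

```python
def atmospheric_fraction(r_sample, r_rock, r_atm):
    """Two-endmember 87Sr/86Sr mixing: fraction of ecosystem Sr (proxy for Ca)
    derived from atmospheric inputs rather than rock weathering."""
    return (r_sample - r_rock) / (r_atm - r_rock)

# illustrative numbers only:
# atmospheric_fraction(0.7105, r_rock=0.7040, r_atm=0.7120) ~ 0.81
```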
52. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print ...
52. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print Co., Los Angeles, CA, Photographer, Date unknown DETAILS OF MAIN FLOOR ELEVATOR LOBBY - Richfield Oil Building, 555 South Flower Street, Los Angeles, Los Angeles County, CA
50. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print ...
50. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print Co., Los Angeles, CA, Photographer, Date unknown ENTRANCE AND TYPICAL BAY ON FLOWER STREET - Richfield Oil Building, 555 South Flower Street, Los Angeles, Los Angeles County, CA
53. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print ...
53. Photocopy of drawing (Source unknown, 1928) Rapid Blue Print Co., Los Angeles, CA, Photographer, Date unknown DETAILS OF CORRIDORS ON SECOND - TWELFTH FLOORS - Richfield Oil Building, 555 South Flower Street, Los Angeles, Los Angeles County, CA
Electromagnetic spectrum management system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seastrand, Douglas R.
A system for transmitting a wireless countermeasure signal to disrupt third-party communications is disclosed that includes an antenna configured to receive wireless signals and transmit wireless countermeasure signals such that the wireless countermeasure signals are responsive to the received wireless signals. A receiver processes the received wireless signals to create processed received signal data while a spectrum control module subtracts known source signal data from the processed received signal data to generate unknown source signal data. The unknown source signal data is based on unknown wireless signals, such as enemy signals. A transmitter is configured to process the unknown source signal data to create countermeasure signals and transmit a wireless countermeasure signal over the first antenna or a second antenna to thereby interfere with the unknown wireless signals.
Recovering an unknown source in a fractional diffusion problem
NASA Astrophysics Data System (ADS)
Rundell, William; Zhang, Zhidong
2018-09-01
A standard inverse problem is to determine a source which is supported in an unknown domain D from external boundary measurements. Here we consider a time-independent situation where the source is equal to unity in an unknown subdomain D of a larger given domain Ω and the boundary of D has a star-like shape.
Quantum key distribution with an unknown and untrusted source
NASA Astrophysics Data System (ADS)
Zhao, Yi; Qi, Bing; Lo, Hoi-Kwong
2008-05-01
The security of a standard bidirectional “plug-and-play” quantum key distribution (QKD) system has been an open question for a long time. This is mainly because its source is equivalently controlled by an eavesdropper, which means the source is unknown and untrusted. Qualitative discussion on this subject has been made previously. In this paper, we solve this question directly by presenting the quantitative security analysis on a general class of QKD protocols whose sources are unknown and untrusted. The securities of standard Bennett-Brassard 1984 protocol, weak+vacuum decoy state protocol, and one-decoy state protocol, with unknown and untrusted sources are rigorously proved. We derive rigorous lower bounds to the secure key generation rates of the above three protocols. Our numerical simulation results show that QKD with an untrusted source gives a key generation rate that is close to that with a trusted source.
Statistical Characterization of MP3 Encoders for Steganalysis: ’CHAMP3’
2004-04-27
compression exceeds those of typical steganographic tools (e.g., LSB image embedding), the availability of commented source codes for MP3 encoders...developed by testing the approach on known and unknown reference data. Subject terms: EOARD, Steganography, Digital Watermarking.
Consistent description of kinetic equation with triangle anomaly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pu Shi; Gao Jianhua; Wang Qun
2011-05-01
We provide a consistent description of the kinetic equation with a triangle anomaly which is compatible with the entropy principle of the second law of thermodynamics and the charge/energy-momentum conservation equations. In general, an anomalous source term is necessary to ensure that the equations for charge and energy-momentum conservation are satisfied and that the correction terms of the distribution functions are compatible with these equations. The constraining equations from the entropy principle are derived for the anomaly-induced leading-order corrections to the particle distribution functions. The correction terms can be determined for the minimum number of unknown coefficients in the one-charge and two-charge cases by solving the constraining equations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Bruce T.; Knight, Jeff R.; Ringer, Mark A.
2012-10-15
Global-scale variations in the climate system over the last half of the twentieth century, including long-term increases in global-mean near-surface temperatures, are consistent with concurrent human-induced emissions of radiatively active gases and aerosols. However, such consistency does not preclude the possible influence of other forcing agents, including internal modes of climate variability or unaccounted-for aerosol effects. To test whether other unknown forcing agents may have contributed to multidecadal increases in global-mean near-surface temperatures from 1950 to 2000, data pertaining to observed changes in global-scale sea surface temperatures and observed changes in radiatively active atmospheric constituents are incorporated into numerical global climate models. Results indicate that the radiative forcing needed to produce the observed long-term trends in sea surface temperatures, and in global-mean near-surface temperatures, is provided predominantly by known changes in greenhouse gases and aerosols. Further, results indicate that less than 10% of the long-term historical increase in global-mean near-surface temperatures over the last half of the twentieth century could have been the result of internal climate variability. In addition, they indicate that less than 25% of the total radiative forcing needed to produce the observed long-term trend in global-mean near-surface temperatures could have been provided by changes in net radiative forcing from unknown sources (either positive or negative). These results, which are derived from simple energy balance requirements, emphasize the important role humans have played in modifying the global climate over the last half of the twentieth century.
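The phrase "simple energy balance requirements" refers to zero-dimensional energy-balance reasoning of the kind sketched below; the parameter values and function name are illustrative assumptions of this sketch, not quantities taken from the study.

```python
import numpy as np

def energy_balance_temperature(forcing, lam=1.2, heat_capacity=8.0, dt_years=1.0):
    """Zero-dimensional energy balance model, C dT/dt = F(t) - lambda*T,
    integrated with forward Euler.  forcing: annual radiative forcing (W m^-2);
    lam: climate feedback parameter (W m^-2 K^-1); heat_capacity: effective
    heat capacity (W yr m^-2 K^-1).  Values here are only illustrative."""
    T = np.zeros(len(forcing))
    for k in range(1, len(forcing)):
        dT = (forcing[k - 1] - lam * T[k - 1]) / heat_capacity * dt_years
        T[k] = T[k - 1] + dT
    return T
```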
SYNTHESIS OF NOVEL ALL-DIELECTRIC GRATING FILTERS USING GENETIC ALGORITHMS
NASA Technical Reports Server (NTRS)
Zuffada, Cinzia; Cwik, Tom; Ditchman, Christopher
1997-01-01
We are concerned with the design of inhomogeneous, all-dielectric (lossless) periodic structures which act as filters. Dielectric filters made as stacks of inhomogeneous gratings and layers of materials are being used in optical technology, but are not common at microwave frequencies. The problem is then finding the periodic cell's geometric configuration and permittivity values which correspond to a specified reflectivity/transmittivity response as a function of frequency/illumination angle. This type of design can be thought of as an inverse-source problem, since it entails finding a distribution of sources which produce fields (or quantities derived from them) of given characteristics. Electromagnetic sources (electric and magnetic current densities) in a volume are related to the outside fields by a well known linear integral equation. Additionally, the sources are related to the fields inside the volume by a constitutive equation involving the material properties. The relationship linking the fields outside the source region to those inside is therefore non-linear in terms of material properties such as permittivity, permeability and conductivity. The solution of the non-linear inverse problem is cast here as a combination of two linear steps, by explicitly introducing the electromagnetic sources in the computational volume as a set of unknowns in addition to the material unknowns. This makes it possible to solve for material parameters and related electric fields in the source volume which are consistent with Maxwell's equations. Solutions are obtained iteratively by decoupling the two steps. First, we invert for the permittivity only in the minimization of a cost function; second, given the materials, we find the corresponding electric fields through direct solution of the integral equation in the source volume. The sources thus computed are used to generate the far fields and the synthesized filter response. The cost function is obtained by calculating the deviation between the synthesized value of reflectivity/transmittivity and the desired one. Solution geometries for the periodic cell are sought as gratings (ensembles of columns of different heights and widths), or combinations of homogeneous layers of different dielectric materials and gratings. Hence the explicit unknowns of the inversion step are the material permittivities and the relative boundaries separating homogeneous parcels of the periodic cell.
Extremum seeking with bounded update rates
Scheinker, Alexander; Krstić, Miroslav
2013-11-16
In this work, we present a form of extremum seeking (ES) in which the unknown function being minimized enters the system's dynamics as the argument of a cosine or sine term, thereby guaranteeing known bounds on update rates and control efforts. We present general n-dimensional optimization and stabilization results as well as 2D vehicle control with bounded velocity and control efforts. For application to autonomous vehicles tracking a source in a GPS-denied environment with unknown orientation, this ES approach allows for smooth heading angle actuation with constant velocity, and in application to a unicycle-type vehicle it results in control ability as if the vehicle were fully actuated. Our stability analysis is made possible by the classic results of Kurzweil, Jarnik, Sussmann, and Liu regarding systems with highly oscillatory terms. In our stability analysis, we combine the averaging results with a semi-global practical stability result under small parametric perturbations developed by Moreau and Aeyels.
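A minimal one-dimensional simulation of this bounded-update-rate ES idea, in which the cost enters only through the phase of a cosine, might look like the following; the gains, dither frequency, step size, and function name are illustrative choices of this sketch, not those of the paper.

```python
import numpy as np

def extremum_seeking_1d(J, x0, omega=50.0, alpha=0.5, k=2.0, dt=1e-3, steps=20000):
    """Bounded-update-rate ES: dx/dt = sqrt(alpha*omega) * cos(omega*t + k*J(x)).
    The unknown cost J enters only inside the cosine, so |dx/dt| is bounded by
    sqrt(alpha*omega) regardless of J; on average the scheme descends J."""
    x, traj = float(x0), []
    for n in range(steps):
        t = n * dt
        x = x + dt * np.sqrt(alpha * omega) * np.cos(omega * t + k * J(x))
        traj.append(x)
    return np.array(traj)

# usage: extremum_seeking_1d(lambda x: (x - 1.0)**2, x0=-2.0)[-1]
# converges to a small oscillation around the minimizer x = 1.0
```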
Quantum key distribution with an unknown and untrusted source
NASA Astrophysics Data System (ADS)
Zhao, Yi; Qi, Bing; Lo, Hoi-Kwong
2009-03-01
The security of a standard bidirectional "plug-and-play" quantum key distribution (QKD) system has been an open question for a long time. This is mainly because its source is equivalently controlled by an eavesdropper, which means the source is unknown and untrusted. Qualitative discussion on this subject has been made previously. In this paper, we present the first quantitative security analysis on a general class of QKD protocols whose sources are unknown and untrusted. The securities of the standard BB84 protocol, the weak+vacuum decoy state protocol, and the one-decoy state protocol, with unknown and untrusted sources, are rigorously proved. We derive rigorous lower bounds to the secure key generation rates of the above three protocols. Our numerical simulation results show that QKD with an untrusted source gives a key generation rate that is close to that with a trusted source. Our work is published in [1]. [1] Y. Zhao, B. Qi, and H.-K. Lo, Phys. Rev. A, 77:052327 (2008).
NASA Astrophysics Data System (ADS)
Singh, Sarvesh Kumar; Turbelin, Gregory; Issartel, Jean-Pierre; Kumar, Pramod; Feiz, Amir Ali
2015-04-01
Fast-growing urbanization, industrialization and military developments increase the risk to the human environment and ecology. This has been realized in several past incidents, for instance the Chernobyl nuclear explosion (Ukraine), the Bhopal gas leak (India), and the Fukushima-Daiichi radionuclide release (Japan). To reduce the threat and exposure to hazardous contaminants, a fast and preliminary identification of unknown releases is required by the responsible authorities for emergency preparedness and air quality analysis. Often, early detection of such contaminants is pursued with a distributed sensor network. However, identifying the origin and strength of unknown releases from the sensor-reported concentrations is a challenging task. This requires an optimal strategy to integrate the measured concentrations with the predictions given by atmospheric dispersion models. This is an inverse problem. The measured concentrations are insufficient, and atmospheric dispersion models suffer from inaccuracy due to the lack of process understanding, turbulence uncertainties, etc. These lead to a loss of information in the reconstruction process and thus affect the resolution, stability and uniqueness of the retrieved source. An additional well-known issue is the numerical artifact arising at the measurement locations due to the strong concentration gradient and the dissipative nature of the concentration. Thus, assimilation techniques are desired which can lead to an optimal retrieval of the unknown releases. In general, this is facilitated within the Bayesian inference and optimization framework with a suitable choice of a priori information, regularization constraints, and measurement and background error statistics. An inversion technique is introduced here for an optimal reconstruction of unknown releases using limited concentration measurements. It is based on an adjoint representation of the source-receptor relationship and the utilization of a weight function which encodes a priori information about the unknown releases apparent to the monitoring network. The properties of the weight function provide an optimal data resolution and model resolution to the retrieved source estimates. The retrieved source estimates are proved theoretically to be stable against random measurement errors, and their reliability can be interpreted in terms of the distribution of the weight functions. Further, the same framework can be extended to the identification of point-type releases by utilizing the maximum of the retrieved source estimates. The inversion technique has been evaluated with several diffusion experiments, such as the Idaho low-wind diffusion experiment (1974), the IIT Delhi tracer experiment (1991), the European Tracer Experiment (1994), and the Fusion Field Trials (2007). In the case of point-release experiments, the source parameters are mostly retrieved close to the true source parameters with minimal error. Primarily, the proposed technique overcomes two major difficulties incurred in source reconstruction: (i) the initialization of the source parameters as required by optimization-based techniques, on which the converged solution depends; and (ii) the statistical knowledge about the measurement and background errors as required by Bayesian inference based techniques, which is hypothetically assumed in the absence of prior knowledge.
Improved tomographic reconstructions using adaptive time-dependent intensity normalization.
Titarenko, Valeriy; Titarenko, Sofya; Withers, Philip J; De Carlo, Francesco; Xiao, Xianghui
2010-09-01
The first processing step in synchrotron-based micro-tomography is the normalization of the projection images against the background, also referred to as a white field. Owing to time-dependent variations in illumination and defects in detection sensitivity, the white field is different from the projection background. In this case standard normalization methods introduce ring and wave artefacts into the resulting three-dimensional reconstruction. In this paper the authors propose a new adaptive technique accounting for these variations and allowing one to obtain cleaner normalized data and to suppress ring and wave artefacts. The background is modelled by the product of two time-dependent terms representing the illumination and detection stages. These terms are written as unknown functions, one scaled and shifted along a fixed direction (describing the illumination term) and one translated by an unknown two-dimensional vector (describing the detection term). The proposed method is applied to two sets (a stem Salix variegata and a zebrafish Danio rerio) acquired at the parallel beam of the micro-tomography station 2-BM at the Advanced Photon Source showing significant reductions in both ring and wave artefacts. In principle the method could be used to correct for time-dependent phenomena that affect other tomographic imaging geometries such as cone beam laboratory X-ray computed tomography.
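For orientation, standard flat-field normalization with a crude per-projection intensity rescaling (estimated from a sample-free "air" region) is sketched below; the paper's adaptive method instead models the background as a product of two time-dependent terms and is more general. The array layout, the air_cols default, and the function name are assumptions of this sketch.

```python
import numpy as np

def normalize_projections(projections, white, dark, air_cols=slice(0, 20)):
    """Flat-field normalization with a per-projection intensity rescaling
    estimated from an 'air' region assumed to contain no sample; a crude
    stand-in for an adaptive time-dependent background correction.
    projections: (n_proj, n_rows, n_cols); white, dark: (n_rows, n_cols)."""
    flat = (projections - dark) / (white - dark)
    scale = flat[:, :, air_cols].mean(axis=(1, 2), keepdims=True)
    return flat / scale
```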
NASA Astrophysics Data System (ADS)
Frazer, Gordon J.; Anderson, Stuart J.
1997-10-01
The radar returns from some classes of time-varying point targets can be represented by the discrete-time signal-plus-noise model: x_t = s_t + [v_t + η_t] = Σ_{i=0}^{P-1} A_i e^{j2π(f_i/f_s)t} + v_t + η_t, t ∈ {0, ..., N-1}, with f_i = k f_I + f_0, where the received signal x_t corresponds to the radar return from the target of interest in one azimuth-range cell. The signal has an unknown number of components P, unknown complex amplitudes A_i and frequencies f_i. The frequency parameters f_0 and f_I are unknown, although constrained such that f_0 < f_I/2, and the parameter k ∈ {-u, ..., -2, -1, 0, 1, 2, ..., v} is constrained such that the component frequencies f_i lie within (-f_s/2, f_s/2). The noise term v_t is typically colored, and represents clutter, interference and various noise sources. It is unknown, except that Σ_t v_t² < ∞; in general, v_t is not well modelled as an auto-regressive process of known order. The additional noise term η_t represents time-invariant point targets in the same azimuth-range cell. An important characteristic of the target is the unknown parameter f_I, representing the frequency interval between harmonic lines. It is desired to determine an estimate of f_I from N samples of x_t. We propose an algorithm to estimate f_I based on Thomson's harmonic line F-test, which is part of the multi-window spectrum estimation method, and demonstrate the proposed estimator applied to target echo time series collected using an experimental HF skywave radar.
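A naive alternative to the multi-window F-test, useful only as a reference point, is a grid search over candidate line spacings that scores each candidate by the periodogram power collected on its harmonic comb. The sketch below makes that idea concrete; it neglects the offset f_0 and is not the estimator proposed in the paper.

```python
import numpy as np

def estimate_line_spacing(x, fs, fI_grid):
    """Grid search for the harmonic line spacing f_I: for each candidate,
    sum periodogram power over the comb k*f_I (k = 1, 2, ...) below fs/2
    and pick the candidate with the largest per-line score.
    (Ignores the offset f_0; the cited work uses Thomson's harmonic F-test.)"""
    X2 = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    scores = []
    for fI in fI_grid:
        comb = np.arange(fI, fs / 2, fI)
        if len(comb) == 0:
            scores.append(0.0)
            continue
        idx = np.clip(np.searchsorted(freqs, comb), 0, len(X2) - 1)
        scores.append(X2[idx].sum() / len(comb))
    return fI_grid[int(np.argmax(scores))]
```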
The use of the virtual source technique in computing scattering from periodic ocean surfaces.
Abawi, Ahmad T
2011-08-01
In this paper the virtual source technique is used to compute scattering of a plane wave from a periodic ocean surface. The virtual source technique is a method of imposing boundary conditions using virtual sources with initially unknown complex amplitudes. These amplitudes are then determined by applying the boundary conditions. The fields due to these virtual sources are given by the environment Green's function. In principle, satisfying boundary conditions on an infinite surface requires an infinite number of sources. In this paper, the periodic nature of the surface is exploited to populate a single period of the surface with virtual sources, and m surface periods are added to obtain scattering from the entire surface. The use of an accelerated sum formula makes it possible to obtain a convergent sum with a relatively small number of terms (∼40). The accuracy of the technique is verified by comparing its results with those obtained using the integral equation technique.
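The linear-algebra core of the virtual source technique, solving for source amplitudes that enforce the boundary condition at collocation points, can be sketched for a single non-periodic pressure-release surface as follows; the 2-D free-space Green's function stands in for the environment Green's function, no accelerated periodic sum is included, and the interface is a construction of this sketch.

```python
import numpy as np
from scipy.special import hankel1

def virtual_source_amplitudes(boundary_pts, source_pts, k, incident):
    """Solve for virtual-source amplitudes so that the total field vanishes on
    a pressure-release boundary.  boundary_pts, source_pts: (N, 2) arrays of
    (x, z) coordinates (sources displaced off the boundary to avoid r = 0);
    incident(points) returns the incident plane-wave field at those points."""
    dx = boundary_pts[:, None, 0] - source_pts[None, :, 0]
    dz = boundary_pts[:, None, 1] - source_pts[None, :, 1]
    r = np.hypot(dx, dz)
    G = 0.25j * hankel1(0, k * r)              # 2-D free-space Green's function
    p_inc = incident(boundary_pts)
    amps, *_ = np.linalg.lstsq(G, -p_inc, rcond=None)
    return amps
```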
Source memory for action in young and older adults: self vs. close or unknown others.
Rosa, Nicole M; Gutchess, Angela H
2011-09-01
The present study examines source memory for actions (e.g., placing items in a suitcase). For both young and older adult participants, source memory for actions performed by the self was better than memory for actions performed by either a known (close) or unknown other. In addition, neither young nor older adults were more likely to confuse self with close others than with unknown others. Results suggest an advantage in source memory for actions performed by the self compared to others, possibly associated with sensorimotor cues that are relatively preserved in aging.
NASA Astrophysics Data System (ADS)
Zhang, Ye; Gong, Rongfang; Cheng, Xiaoliang; Gulliksson, Mårten
2018-06-01
This study considers the inverse source problem for elliptic partial differential equations with both Dirichlet and Neumann boundary data. The unknown source term is to be determined by additional boundary conditions. Unlike the existing methods found in the literature, which usually employ the first-order in time gradient-like system (such as the steepest descent methods) for numerically solving the regularized optimization problem with a fixed regularization parameter, we propose a novel method with a second-order in time dissipative gradient-like system and a dynamical selected regularization parameter. A damped symplectic scheme is proposed for the numerical solution. Theoretical analysis is given for both the continuous model and the numerical algorithm. Several numerical examples are provided to show the robustness of the proposed algorithm.
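The flavor of a second-order-in-time dissipative gradient flow can be conveyed with a few lines of Python; the damped explicit step below is only an illustrative integrator for a generic misfit functional, not the damped symplectic scheme or the dynamic regularization-parameter selection analyzed in the paper.

```python
import numpy as np

def damped_second_order_descent(grad, u0, eta=1.0, dt=0.1, steps=500):
    """Second-order-in-time dissipative gradient flow  u'' + eta*u' = -grad(u),
    advanced with a simple damped explicit step (stability depends on dt and
    the smoothness of the functional; values here are illustrative)."""
    u = np.array(u0, dtype=float)
    v = np.zeros_like(u)
    for _ in range(steps):
        v = (1.0 - eta * dt) * v - dt * grad(u)
        u = u + dt * v
    return u

# usage on a Tikhonov-type misfit J(u) = 0.5*||A u - b||^2 + 0.5*gamma*||u||^2:
# grad = lambda u: A.T @ (A @ u - b) + gamma * u
```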
Nitrogen emissions, deposition, and monitoring in the Western United States
Fenn, M.E.; Haeuber, R.; Tonnesen, G.S.; Baron, Jill S.; Grossman-Clarke, S.; Hope, D.; Jaffe, D.A.; Copeland, S.; Geiser, L.; Rueth, H.M.; Sickman, J.O.
2003-01-01
Nitrogen (N) deposition in the western United States ranges from 1 to 4 kilograms (kg) per hectare (ha) per year over much of the region to as high as 30 to 90 kg per ha per year downwind of major urban and agricultural areas. Primary N emissions sources are transportation, agriculture, and industry. Emissions of N as ammonia are about 50% as great as emissions of N as nitrogen oxides. An unknown amount of N deposition to the West Coast originates from Asia. Nitrogen deposition has increased in the West because of rapid increases in urbanization, population, distance driven, and large concentrated animal feeding operations. Studies of ecological effects suggest that emissions reductions are needed to protect sensitive ecosystem components. Deposition rates are unknown for most areas in the West, although reasonable estimates are available for sites in California, the Colorado Front Range, and central Arizona. National monitoring networks provide long-term wet deposition data and, more recently, estimated dry deposition data at remote sites. However, there is little information for many areas near emissions sources.
Efficient dynamic optimization of logic programs
NASA Technical Reports Server (NTRS)
Laird, Phil
1992-01-01
A summary is given of the dynamic optimization approach to speed up learning for logic programs. The problem is to restructure a recursive program into an equivalent program whose expected performance is optimal for an unknown but fixed population of problem instances. We define the term 'optimal' relative to the source of input instances and sketch an algorithm that can come within a logarithmic factor of optimal with high probability. Finally, we show that finding high-utility unfolding operations (such as EBG) can be reduced to clause reordering.
A new aerodynamic integral equation based on an acoustic formula in the time domain
NASA Technical Reports Server (NTRS)
Farassat, F.
1984-01-01
An aerodynamic integral equation for bodies moving at transonic and supersonic speeds is presented. Based on a time-dependent acoustic formula for calculating the noise emanating from the outer portion of a propeller blade travelling at high speed (the Ffowcs Williams-Hawkings formulation), the loading terms and a conventional thickness source term are retained. Two surface and three line integrals are employed to solve an equation for the loading noise. The near-field term is regularized using the collapsing-sphere approach to obtain semiconvergence on the blade surface. A singular integral equation is thereby derived for the unknown surface pressure, which is amenable to numerical solution using Galerkin or collocation methods. The technique is useful for studying nonuniform inflow to the propeller.
25. Photocopy of photograph (Source unknown, c. 19231925) EXTERIOR, CLOSEUP ...
25. Photocopy of photograph (Source unknown, c. 1923-1925) EXTERIOR, CLOSE-UP OF SOUTH FRONT OF MISSION AFTER RESTORATION, C. 1923-1925 - Mission San Francisco Solano de Sonoma, First & Spain Streets, Sonoma, Sonoma County, CA
44. Reinforcement construction to Pleasant Dam. Photographer unknown, 1935. Source: ...
44. Reinforcement construction to Pleasant Dam. Photographer unknown, 1935. Source: Huber Collection, University of California, Berkeley, Water Resources Library. - Waddell Dam, On Agua Fria River, 35 miles northwest of Phoenix, Phoenix, Maricopa County, AZ
Nonparametric Stochastic Model for Uncertainty Quantification of Short-term Wind Speed Forecasts
NASA Astrophysics Data System (ADS)
AL-Shehhi, A. M.; Chaouch, M.; Ouarda, T.
2014-12-01
Wind energy is increasing in importance as a renewable energy source due to its potential role in reducing carbon emissions. It is a safe, clean, and inexhaustible source of energy. The amount of wind energy generated by wind turbines is closely related to the wind speed. Wind speed forecasting plays a vital role in the wind energy sector in terms of wind turbine optimal operation, wind energy dispatch and scheduling, efficient energy harvesting, etc. It is also considered during planning, design, and assessment of any proposed wind project. Therefore, accurate prediction of wind speed carries particular importance and plays a significant role in the wind industry. Many methods have been proposed in the literature for short-term wind speed forecasting. These methods are usually based on modeling historical fixed time intervals of the wind speed data and using them for future prediction. The methods mainly include statistical models such as ARMA and ARIMA models, physical models such as numerical weather prediction, and artificial intelligence techniques such as support vector machines and neural networks. In this paper, we are interested in estimating hourly wind speed measures in the United Arab Emirates (UAE). More precisely, we predict hourly wind speed using a nonparametric kernel estimation of the regression and volatility functions of a nonlinear autoregressive model with an ARCH component, which involves an unknown nonlinear regression function and an unknown volatility function, both already discussed in the literature. The unknown nonlinear regression function describes the dependence between the value of the wind speed at time t and its historical values at times t-1, t-2, ..., t-d. This function plays a key role in predicting the hourly wind speed process. The volatility function, i.e., the conditional variance given the past, measures the risk associated with this prediction. Since the regression and the volatility functions are supposed to be unknown, they are estimated using nonparametric kernel methods. In addition to the pointwise hourly wind speed forecasts, a confidence interval is provided, which allows the uncertainty around the forecasts to be quantified.
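A bare-bones Nadaraya-Watson one-step-ahead forecast for such a nonlinear autoregression is sketched below; the lag order d, the Gaussian kernel, and the bandwidth are arbitrary illustrative choices of this sketch, and an estimator of the conditional variance (volatility) would be built analogously from the squared residuals.

```python
import numpy as np

def nadaraya_watson_forecast(history, d=3, bandwidth=0.5):
    """One-step-ahead Nadaraya-Watson forecast for a nonlinear autoregression:
    predict x_t from the last d values using a Gaussian kernel over past
    d-lag patterns.  Requires len(history) > d; bandwidth is a tuning knob."""
    x = np.asarray(history, dtype=float)
    X = np.array([x[i:i + d] for i in range(len(x) - d)])   # lag vectors
    y = x[d:]                                                # next values
    query = x[-d:]                                           # most recent lags
    w = np.exp(-0.5 * np.sum((X - query) ** 2, axis=1) / bandwidth ** 2)
    return np.sum(w * y) / (np.sum(w) + 1e-12)
```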
Enlightened Use of Passive Voice in Technical Writing
NASA Technical Reports Server (NTRS)
Trammell, M. K.
1981-01-01
The passive voice as a normal, acceptable, and established syntactic form in technical writing is defended. Passive/active verb ratios, taken from sources including 'antipassivist' textbooks, are considered. The suitability of the passive voice in technical writing that involves unknown or irrelevant agents is explored. Three 'myths', that the passive (1) utilizes an abnormal and artificial word order, (2) is lifeless, and (3) is indirect, are considered. Awkward and abnormal-sounding examples encountered in textbooks are addressed in terms of their original context. Unattractive or incoherent passive sentences are explained in terms of inappropriate conversion from active sentences having (1) short nominal or pronominal subjects or (2) verbs with restrictions on their passive use.
12. Photocopy of lithograph (source unknown) The Armor Lithograph Company, ...
12. Photocopy of lithograph (source unknown) The Armor Lithograph Company, Ltd., Pittsburgh, Pennsylvania, ca. 1888 COURTHOUSE AND JAIL, FROM THE WEST - Allegheny County Courthouse & Jail, 436 Grant Street (Courthouse), 420 Ross Street (Jail), Pittsburgh, Allegheny County, PA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genn Saji
2006-07-01
The term 'ultimate risk' is used here to describe the probabilities and radiological consequences that should be incorporated in siting, containment design and accident management of nuclear power plants for hypothetical accidents. It is closely related to the source terms specified in siting criteria, which assure an adequate separation of the radioactive inventories of the plants from the public in the event of a hypothetical and severe accident situation. The author would like to point out that current source terms, which are based on the information from the Windscale accident (1957) through TID-14844, are very outdated and do not incorporate lessons learned from either the Three Mile Island (TMI, 1979) or the Chernobyl accident (1986), two of the most severe accidents ever experienced. As a result of the observations of benign radionuclides released at TMI, the technical community in the US felt that a more realistic evaluation of severe reactor accident source terms was necessary. Against this background, the 'source term research project' was organized in 1984 to respond to these challenges. Unfortunately, soon after the final report from this project was released, the Chernobyl accident occurred. Due to the enormous consequences induced by the accident, the once-optimistic prospects of establishing a more realistic source term were completely shattered. The Chernobyl accident, with its human death toll and the dispersion of a large part of the fission-fragment inventories into the environment, created a significant degradation in the public's acceptance of nuclear energy throughout the world. In spite of this, nuclear communities have been prudent in responding to the public's anxiety towards the ultimate safety of nuclear plants, since there still remained many unknown points revolving around the mechanism of the Chernobyl accident. In order to resolve some of these mysteries, the author has performed a scoping study of the dispersion and deposition mechanisms of fuel particles and fission fragments during the initial phase of the Chernobyl accident. Through this study, it is now possible to broadly reconstruct the radiological consequences by using a dispersion calculation technique, combined with the meteorological data at the time of the accident and the land contamination densities of 137Cs measured and reported around the Chernobyl area. Although it is challenging to incorporate lessons learned from the Chernobyl accident into the source term issues, the author has already developed an example of safety goals by incorporating the radiological consequences of the accident. The example provides safety goals by specifying source term releases in a graded approach in combination with probabilities, i.e. risks. The author believes that future source term specifications should be directly linked with safety goals. (author)
A Risk-Based Multi-Objective Optimization Concept for Early-Warning Monitoring Networks
NASA Astrophysics Data System (ADS)
Bode, F.; Loschko, M.; Nowak, W.
2014-12-01
Groundwater is a resource for drinking water and hence needs to be protected from contamination. However, many well catchments include an inventory of known and unknown risk sources which cannot be eliminated, especially in urban regions. As a matter of risk control, all of these risk sources should be monitored. A one-to-one monitoring situation for each risk source would lead to a cost explosion and is even impossible for unknown risk sources. However, smart optimization concepts could help to find promising low-cost monitoring network designs. In this work we develop a concept to plan monitoring networks using multi-objective optimization. Our considered objectives are to maximize the probability of detecting all contaminations and the early-warning time, and to minimize the installation and operating costs of the monitoring network. A qualitative risk ranking is used to prioritize the known risk sources for monitoring. The unknown risk sources can neither be located nor ranked. Instead, we represent them by a virtual line of risk sources surrounding the production well. We classify risk sources into four different categories: severe, medium and tolerable for known risk sources, and an extra category for the unknown ones. With that, early-warning time and detection probability become individual objectives for each risk class. Thus, decision makers can identify monitoring networks which are valid for controlling the top risk sources, and evaluate the capabilities (or search for a least-cost upgrade) to also cover moderate, tolerable and unknown risk sources. Monitoring networks which are valid for the remaining risk also cover all other risk sources, but the early-warning time suffers. The data provided for the optimization algorithm are calculated in a preprocessing step by a flow and transport model. Uncertainties due to hydro(geo)logical phenomena are taken into account by Monte Carlo simulations. To avoid numerical dispersion during the transport simulations we use the particle-tracking random-walk method.
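Once candidate monitoring networks have been scored (by the flow-and-transport preprocessing) for cost, detection probability and early-warning time, selecting the non-dominated designs is a standard Pareto-filter step; a brute-force sketch is given below. It is not the optimizer used in the study, and it collapses the per-risk-class objectives into a single objective of each kind.

```python
import numpy as np

def pareto_front(costs, detection_probs, warning_times):
    """Return indices of non-dominated monitoring-network designs for three
    objectives: minimize cost, maximize detection probability, maximize the
    early-warning time.  Brute-force dominance check, O(n^2)."""
    # convert everything to a 'minimize' convention
    F = np.column_stack([np.asarray(costs, float),
                         -np.asarray(detection_probs, float),
                         -np.asarray(warning_times, float)])
    keep = []
    for i in range(len(F)):
        dominated = np.any(np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1))
        if not dominated:
            keep.append(i)
    return keep
```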
17. Photocopy of a photograph, source and date unknown GENERAL ...
17. Photocopy of a photograph, source and date unknown GENERAL VIEW OF FRONT FACADE OF MT. CLARE STATION; PASSENGER CAR SHOP IN REAR - Baltimore & Ohio Railroad, Mount Clare Passenger Car Shop, Southwest corner of Pratt & Poppleton Streets, Baltimore, Independent City, MD
Iliev, Filip L.; Stanev, Valentin G.; Vesselinov, Velimir V.
2018-01-01
Factor analysis is broadly used as a powerful unsupervised machine learning tool for reconstruction of hidden features in recorded mixtures of signals. In the case of a linear approximation, the mixtures can be decomposed by a variety of model-free Blind Source Separation (BSS) algorithms. Most of the available BSS algorithms consider an instantaneous mixing of signals, while the case when the mixtures are linear combinations of signals with delays is less explored. Especially difficult is the case when the number of sources of the signals with delays is unknown and has to be determined from the data as well. To address this problem, in this paper, we present a new method based on Nonnegative Matrix Factorization (NMF) that is capable of identifying: (a) the unknown number of the sources, (b) the delays and speed of propagation of the signals, and (c) the locations of the sources. Our method can be used to decompose records of mixtures of signals with delays emitted by an unknown number of sources in a nondispersive medium, based only on recorded data. This is the case, for example, when electromagnetic signals from multiple antennas are received asynchronously; or mixtures of acoustic or seismic signals recorded by sensors located at different positions; or when a shift in frequency is induced by the Doppler effect. By applying our method to synthetic datasets, we demonstrate its ability to identify the unknown number of sources as well as the waveforms, the delays, and the strengths of the signals. Using Bayesian analysis, we also evaluate estimation uncertainties and identify the region of likelihood where the positions of the sources can be found. PMID:29518126
Iliev, Filip L; Stanev, Valentin G; Vesselinov, Velimir V; Alexandrov, Boian S
2018-01-01
Factor analysis is broadly used as a powerful unsupervised machine learning tool for reconstruction of hidden features in recorded mixtures of signals. In the case of a linear approximation, the mixtures can be decomposed by a variety of model-free Blind Source Separation (BSS) algorithms. Most of the available BSS algorithms consider an instantaneous mixing of signals, while the case when the mixtures are linear combinations of signals with delays is less explored. Especially difficult is the case when the number of sources of the signals with delays is unknown and has to be determined from the data as well. To address this problem, in this paper, we present a new method based on Nonnegative Matrix Factorization (NMF) that is capable of identifying: (a) the unknown number of the sources, (b) the delays and speed of propagation of the signals, and (c) the locations of the sources. Our method can be used to decompose records of mixtures of signals with delays emitted by an unknown number of sources in a nondispersive medium, based only on recorded data. This is the case, for example, when electromagnetic signals from multiple antennas are received asynchronously; or mixtures of acoustic or seismic signals recorded by sensors located at different positions; or when a shift in frequency is induced by the Doppler effect. By applying our method to synthetic datasets, we demonstrate its ability to identify the unknown number of sources as well as the waveforms, the delays, and the strengths of the signals. Using Bayesian analysis, we also evaluate estimation uncertainties and identify the region of likelihood where the positions of the sources can be found.
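The delay-aware factorization itself (ShiftNMF-type updates) is beyond a short example, but the nonnegative matrix factorization core on which the method builds can be illustrated with the standard multiplicative updates. The mixtures, the number of sources and all dimensions below are synthetic, and signal delays are deliberately not modeled.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic nonnegative mixtures: 8 sensors, 200 samples, 3 hidden sources.
true_W = rng.random((8, 3))
true_H = rng.random((3, 200))
V = true_W @ true_H                      # recorded mixtures (no delays here)

def nmf(V, k, iters=500, eps=1e-9):
    """Plain NMF via multiplicative updates minimizing ||V - W H||_F^2.
    The shift/delay handling of ShiftNMF-type methods is not included."""
    n, m = V.shape
    W = np.abs(rng.random((n, k)))
    H = np.abs(rng.random((k, m)))
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

W, H = nmf(V, k=3)
print("relative reconstruction error:",
      np.linalg.norm(V - W @ H) / np.linalg.norm(V))
```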
Israeli and Palestinian families in the peace process: sources of stress and response patterns.
Lavee, Y; Ben-David, A; Azaiza, F
1997-09-01
The Israeli-Palestinian peace process is characterized by its unknown outcomes and consequences for the families involved. The purpose of this study was to identify family processes under conditions of prolonged uncertainty. Data were collected from both Israeli and Palestinian families in the West Bank by means of semi-structured interviews. Qualitative and quantitative analyses showed cross-cultural differences in the perception of the situation; different kinds of concerns and sources of stress; different coping responses; and differences in dyadic interaction patterns and intrafamily processes. The findings are discussed in social-contextual terms, particularly the ways in which political and cultural contexts shape the perception of the situation and family processes under prolonged stressful conditions.
Scarton, Lou Ann; Del Fiol, Guilherme; Oakley-Girvan, Ingrid; Gibson, Bryan; Logan, Robert; Workman, T Elizabeth
2018-01-01
The research examined complementary and alternative medicine (CAM) information-seeking behaviors and preferences from short- to long-term cancer survival, including goals, motivations, and information sources. A mixed-methods approach was used with cancer survivors from the "Assessment of Patients' Experience with Cancer Care" 2004 cohort. Data collection included a mail survey and phone interviews using the critical incident technique (CIT). Seventy survivors from the 2004 study responded to the survey, and eight participated in the CIT interviews. Quantitative results showed that CAM usage did not change significantly between 2004 and 2015. The following themes emerged from the CIT: families' and friends' provision of the initial introduction to a CAM, use of CAM to manage the emotional and psychological impact of cancer, utilization of trained CAM practitioners, and online resources as a prominent source for CAM information. The majority of participants expressed an interest in an online information-sharing portal for CAM. Patients continue to use CAM well into long-term cancer survivorship. Finding trustworthy sources for information on CAM presents many challenges such as reliability of source, conflicting information on efficacy, and unknown interactions with conventional medications. Study participants expressed interest in an online portal to meet these needs through patient testimonials and linkage of claims to the scientific literature. Such a portal could also aid medical librarians and clinicians in locating and evaluating CAM information on behalf of patients.
Scarton, Lou Ann; Del Fiol, Guilherme; Oakley-Girvan, Ingrid; Gibson, Bryan; Logan, Robert; Workman, T. Elizabeth
2018-01-01
Objective The research examined complementary and alternative medicine (CAM) information-seeking behaviors and preferences from short- to long-term cancer survival, including goals, motivations, and information sources. Methods A mixed-methods approach was used with cancer survivors from the “Assessment of Patients’ Experience with Cancer Care” 2004 cohort. Data collection included a mail survey and phone interviews using the critical incident technique (CIT). Results Seventy survivors from the 2004 study responded to the survey, and eight participated in the CIT interviews. Quantitative results showed that CAM usage did not change significantly between 2004 and 2015. The following themes emerged from the CIT: families’ and friends’ provision of the initial introduction to a CAM, use of CAM to manage the emotional and psychological impact of cancer, utilization of trained CAM practitioners, and online resources as a prominent source for CAM information. The majority of participants expressed an interest in an online information-sharing portal for CAM. Conclusion Patients continue to use CAM well into long-term cancer survivorship. Finding trustworthy sources for information on CAM presents many challenges such as reliability of source, conflicting information on efficacy, and unknown interactions with conventional medications. Study participants expressed interest in an online portal to meet these needs through patient testimonials and linkage of claims to the scientific literature. Such a portal could also aid medical librarians and clinicians in locating and evaluating CAM information on behalf of patients. PMID:29339938
EPA Unmix 6.0 Fundamentals & User Guide
Unmix seeks to solve the general mixture problem where the data are assumed to be a linear combination of an unknown number of sources of unknown composition, which contribute an unknown amount to each sample.
[Malaria and memory in the Veneto region of Italy].
Pegoraro, Manuela; Crotti, Daniele
2009-09-01
Malaria and emigration are two terms deeply embedded in Veneto history, related to images far back in the past, unknown to younger generations. Losing one's own collective historical memory is a source of personal and cultural impoverishment and inevitably compromises one's awareness of the present, possibly leading to superficial judgements and hastily formed opinions. Such a situation is all the more serious in a geographical area, north-eastern Italy, where immigration is so abundant. In this paper the authors seek to retrieve, at least in part, this memory, especially in terms of history (to what extent malaria afflicted residents in Veneto and migrants from the region) and biology (how much imprinting from malaria has remained in the native population's genetic make-up).
6. Photographic copy of photograph. No date. Photographer unknown. (Source: ...
6. Photographic copy of photograph. No date. Photographer unknown. (Source: SCIP office, Coolidge, AZ) CHINA WASH FLUME UNDER CONSTRUCTION - San Carlos Irrigation Project, China Wash Flume, Main (Florence-Casa Grande) Canal at Station 137+00, T4S, R10E, S14, Coolidge, Pinal County, AZ
Milan Army Ammunition Plant remedial investigation report: Volume 1. Final report 89-91
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okusu, N.; Hall, H.; Orndorff, A.
1991-12-09
A Remedial Investigation at the Milan Army Ammunition Plant, TN, was conducted for the US Army Toxic and Hazardous Materials Agency, under the terms of an Interagency Agreement with the State of Tennessee and the US Environmental Protection Agency. The study focused on the CERCLA site and selected RCRA regulated units identified by previous studies as potential sources of contamination. A broad range of chemicals including metals, explosives, and other organic compounds were found in source areas and in groundwater. The results of a risk assessment indicate that unacceptable levels of human health risks potentially exist. Conceptual models of site and unit characteristics were formulated to explain major findings, and areas not contributing to the problem were identified. For many source areas, major unknowns exist regarding hydrology, extent of contamination, and current and future impacts to groundwater quality.
Tsuji, Shintarou; Nishimoto, Naoki; Ogasawara, Katsuhiko
2008-07-20
Although large volumes of medical text are stored in electronic format, they are seldom reused because of the difficulty of processing narrative text by computer. Morphological analysis is a key technology for extracting medical terms correctly and automatically. This process parses a sentence into its smallest units, morphemes. Phrases consisting of two or more technical terms, however, cause morphological analysis software to fail in parsing the sentence and to output unprocessed terms as "unknown words." The purpose of this study was to reduce the number of unknown words in medical narrative text processing. The results of parsing the text with additional dictionaries were compared by analyzing the number of unknown words in the national examination for radiologists. The ratio of unknown words was reduced from 1.0% to 0.36% by adding terminologies of radiological technology, MeSH, and ICD-10 labels. The terminology of radiological technology was the most effective resource, accounting for a 0.62% reduction. This result clearly shows the necessity of careful dictionary selection and of tracking trends in unknown words. The potential of this investigation is to make available a large body of clinical information that would otherwise be inaccessible for applications other than manual health care review by personnel.
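The study's own pipeline uses morphological analysis software with added terminology dictionaries; as a rough stand-in for that idea (not the authors' tools), the sketch below shows how an unknown-word ratio drops when a hypothetical domain dictionary is added to a greedy longest-match segmenter. The dictionaries and the unsegmented sentence are invented.

```python
def tokenize(text, dictionary):
    """Greedy longest-match segmentation; anything not in the dictionary
    is emitted character by character and counted as 'unknown'."""
    tokens, unknown, i = [], 0, 0
    while i < len(text):
        for j in range(len(text), i, -1):
            if text[i:j] in dictionary:
                tokens.append(text[i:j])
                i = j
                break
        else:
            tokens.append(text[i])
            unknown += 1
            i += 1
    return tokens, unknown

base_dict = {"chest", "shows", "the", "mass"}            # general-language terms
radiology_terms = {"radiograph", "pneumothorax"}          # hypothetical add-on dictionary

sentence = "chestradiographshowspneumothorax"             # unsegmented input text
for name, d in [("base", base_dict), ("base+radiology", base_dict | radiology_terms)]:
    toks, unk = tokenize(sentence, d)
    print(name, "unknown-word ratio:", round(unk / len(toks), 2))
```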
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexandrov, Boian S.; Vesselinov, Velimir V.; Stanev, Valentin
The ShiftNMFk1.2 code, or as we call it, GreenNMFk, represents a hybrid algorithm combining unsupervised adaptive machine learning and the Green's function inverse method. GreenNMFk allows efficient and high-performance de-mixing and feature extraction of a multitude of nonnegative signals that change their shape while propagating through the medium. The signals are mixed and recorded by a network of uncorrelated sensors. The code couples Non-negative Matrix Factorization (NMF) and the inverse-analysis Green's functions method. GreenNMF synergistically performs decomposition of the recorded mixtures, finds the number of the unknown sources, and uses the Green's function of the governing partial differential equation to identify the unknown sources and their characteristics. GreenNMF can be applied directly to any problem controlled by a known parabolic partial differential equation where mixtures of an unknown number of sources are measured at multiple locations. The full GreenNMFk method is the subject of LANL U.S. Patent application S133364.000, August 2017. The ShiftNMFk 1.2 version here is a toy version of this method that can work with a limited number of unknown sources (4 or fewer).
A Computer Program for the Computation of Running Gear Temperatures Using Green's Function
NASA Technical Reports Server (NTRS)
Koshigoe, S.; Murdock, J. W.; Akin, L. S.; Townsend, D. P.
1996-01-01
A new technique has been developed to study two-dimensional heat transfer problems in gears. This technique consists of transforming the heat equation into a line integral equation with the use of Green's theorem. The equation is then expressed in terms of eigenfunctions that satisfy the Helmholtz equation, and their corresponding eigenvalues, for an arbitrarily shaped region of interest. The eigenfunctions are obtained by solving an integral equation. Once the eigenfunctions are found, the temperature is expanded in terms of the eigenfunctions with unknown time-dependent coefficients that can be solved for by using Runge-Kutta methods. The time integration is extremely efficient. Therefore, any changes in the time-dependent coefficients or source terms in the boundary conditions do not impose a great computational burden on the user. The method is demonstrated by applying it to a sample gear tooth. Temperature histories at representative surface locations are given.
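A minimal sketch of the time-integration step this describes: once eigenvalues and projected source terms are available, each modal coefficient obeys an ordinary differential equation that a Runge-Kutta solver can integrate. The eigenvalues, diffusivity and source profile below are synthetic placeholders, not values from the report.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical modal data: eigenvalues of the Helmholtz problem and the
# projection of a time-varying heat input onto each eigenfunction.
lam = np.array([1.0, 4.0, 9.0])            # eigenvalues (synthetic)
alpha = 0.5                                 # thermal diffusivity (synthetic)

def q(t):
    """Projected source term for each mode (synthetic heating pulse)."""
    return np.array([1.0, 0.3, 0.1]) * np.exp(-((t - 2.0) / 0.5) ** 2)

def rhs(t, a):
    # Each modal coefficient decays at rate alpha*lambda_n and is driven by q_n(t).
    return -alpha * lam * a + q(t)

sol = solve_ivp(rhs, (0.0, 10.0), y0=np.zeros(3), method="RK45", max_step=0.05)
# The temperature at a point x is then sum_n a_n(t) * phi_n(x) over the eigenfunctions phi_n.
print("modal coefficients at t = 10:", sol.y[:, -1])
```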
Vergeynst, Lidewei L; Sause, Markus G R; Hamstad, Marvin A; Steppe, Kathy
2015-01-01
When drought occurs in plants, acoustic emission (AE) signals can be detected, but the actual causes of these signals are still unknown. By analyzing the waveforms of the measured signals, it should, however, be possible to trace the characteristics of the AE source and obtain information about the underlying physiological processes. A problem encountered during this analysis is that the waveform changes significantly from source to sensor, and lack of knowledge on wave propagation impedes research progress in this field. We used finite element modeling and the well-known pencil lead break source to investigate wave propagation in a branch. A cylindrical rod of polyvinyl chloride (PVC) was first used to identify the theoretical propagation modes. Two wave propagation modes could be distinguished, and we used the finite element model to interpret their behavior in terms of source position for both the PVC rod and a wooden rod. Both wave propagation modes were also identified in drying-induced signals from woody branches, and we used the obtained insights to provide recommendations for further AE research in plant science.
37 CFR 382.7 - Unknown copyright owners.
Code of Federal Regulations, 2010 CFR
2010-07-01
....7 Section 382.7 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND RECORDINGS AND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Subscription Services § 382.7 Unknown copyright owners...
37 CFR 382.7 - Unknown copyright owners.
Code of Federal Regulations, 2012 CFR
2012-07-01
....7 Section 382.7 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND RECORDINGS AND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Subscription Services § 382.7 Unknown copyright owners...
37 CFR 382.7 - Unknown copyright owners.
Code of Federal Regulations, 2011 CFR
2011-07-01
....7 Section 382.7 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND RECORDINGS AND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Subscription Services § 382.7 Unknown copyright owners...
Rowland, Mark S [Alamo, CA]; Snyderman, Neal J [Berkeley, CA]
2012-04-10
A neutron detector system for discriminating fissile material from non-fissile material wherein a digital data acquisition unit collects data at high rate, and in real-time processes large volumes of data directly into information that a first responder can use to discriminate materials. The system comprises counting neutrons from the unknown source and detecting excess grouped neutrons to identify fission in the unknown source.
NASA Astrophysics Data System (ADS)
Capozzoli, Amedeo; Curcio, Claudio; Liseno, Angelo; Savarese, Salvatore; Schipani, Pietro
2016-07-01
The communication presents an innovative method for the diagnosis of reflector antennas in radio astronomical applications. The approach is based on optimizing the number and the distribution of the far-field sampling points exploited to retrieve the antenna status in terms of feed misalignments, in order to drastically reduce the duration of the measurement process, minimize the effects of variable environmental conditions, and simplify the tracking of the source. The feed misplacement is modeled in terms of an aberration function of the aperture field. The relationship between the unknowns and the far-field pattern samples is linearized thanks to a Principal Component Analysis. The number and the position of the field samples are then determined by optimizing the behaviour of the singular values of the relevant operator.
Improved mapping of radio sources from VLBI data by least-square fit
NASA Technical Reports Server (NTRS)
Rodemich, E. R.
1985-01-01
A method is described for producing improved maps of radio sources from Very Long Baseline Interferometry (VLBI) data. The method described is more direct than existing Fourier methods, is often more accurate, and runs at least as fast. The visibility data are modeled here, as in existing methods, as a function of the unknown brightness distribution and the unknown antenna gains and phases. These unknowns are chosen so that the resulting function values are as near as possible to the observed values. If the rms deviation is used to measure the closeness of this fit to the observed values, one is led to the problem of minimizing a certain function of all the unknown parameters. This minimization problem cannot be solved directly, but it can be attacked by iterative methods which are shown to converge automatically to the minimum with no user intervention. The resulting brightness distribution furnishes the best fit to the data among all brightness distributions of given resolution.
The underlying philosophy of Unmix is to let the data speak for itself. Unmix seeks to solve the general mixture problem where the data are assumed to be a linear combination of an unknown number of sources of unknown composition, which contribute an unknown amount to each sample...
Kurtosis Approach for Nonlinear Blind Source Separation
NASA Technical Reports Server (NTRS)
Duong, Vu A.; Stubberud, Allen R.
2005-01-01
In this paper, we introduce a new algorithm for blind source signal separation for post-nonlinear mixtures. The mixtures are assumed to be linearly mixed from unknown sources first and then distorted by memoryless nonlinear functions. The nonlinear functions are assumed to be smooth and can be approximated by polynomials. Both the coefficients of the unknown mixing matrix and the coefficients of the approximated polynomials are estimated by the gradient descent method conditional on the higher order statistical requirements. The results of simulation experiments presented in this paper demonstrate the validity and usefulness of our approach for nonlinear blind source signal separation.
NASA Astrophysics Data System (ADS)
Huang, Ching-Sheng; Yeh, Hund-Der
2016-11-01
This study introduces an analytical approach to estimate drawdown induced by well extraction in a heterogeneous confined aquifer with an irregular outer boundary. The aquifer domain is divided into a number of zones according to the zonation method for representing the spatial distribution of a hydraulic parameter field. The lateral boundary of the aquifer can be considered under the Dirichlet, Neumann or Robin condition at different parts of the boundary. Flow across the interface between two zones satisfies the continuities of drawdown and flux. Source points, each of which has an unknown volumetric rate representing the boundary effect on the drawdown, are allocated around the boundary of each zone. The solution of drawdown in each zone is expressed as a series in terms of the Theis equation with unknown volumetric rates from the source points. The rates are then determined based on the aquifer boundary conditions and the continuity requirements. The drawdown estimated by the present approach agrees well with a finite element solution developed based on the Mathematica function NDSolve. As compared with the existing numerical approaches, the present approach has the merit of directly computing the drawdown at any given location and time and therefore takes much less computing time to obtain the required results in engineering applications.
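The series solution above is built from the Theis equation; the sketch below evaluates the classical single-well, homogeneous-aquifer Theis drawdown, the building block that the zoned superposition generalizes. Parameter values are illustrative only.

```python
import numpy as np
from scipy.special import exp1

def theis_drawdown(r, t, Q, T, S):
    """Theis drawdown s [m] at radius r [m] and time t [s] for pumping rate
    Q [m^3/s], transmissivity T [m^2/s] and storativity S [-]:
    s = Q/(4*pi*T) * W(u), with u = r^2 S / (4 T t) and W the well function."""
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

# Illustrative values: 50 m from the well after one day of pumping.
print(theis_drawdown(r=50.0, t=86400.0, Q=0.01, T=1e-3, S=1e-4))
```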
An iterative method for the localization of a neutron source in a large box (container)
NASA Astrophysics Data System (ADS)
Dubinski, S.; Presler, O.; Alfassi, Z. B.
2007-12-01
The localization of an unknown neutron source in a bulky box was studied. This can be used for the inspection of cargo, to prevent the smuggling of neutron and α emitters. It is important to localize the source from the outside for safety reasons. Source localization is necessary in order to determine its activity. A previous study showed that, by using six detectors, three on each parallel face of the box (460×420×200 mm³), the location of the source can be found with an average distance of 4.73 cm between the real source position and the calculated one and a maximal distance of about 9 cm. Accuracy was improved in this work by applying an iteration method based on four fixed detectors and the successive iteration of positioning of an external calibrating source. The initial positioning of the calibrating source is the plane of detectors 1 and 2. This method finds the unknown source location with an average distance of 0.78 cm between the real source position and the calculated one and a maximum distance of 3.66 cm for the same box. For larger boxes, localization without iterations requires an increase in the number of detectors, while localization with iterations requires only an increase in the number of iteration steps. In addition to source localization, two methods for determining the activity of the unknown source were also studied.
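The calibrating-source iteration described above is not reproduced here; purely as an illustration of the underlying inverse problem, the sketch below fits a source position and strength to detector count rates under a bare inverse-square model. The geometry and counts are synthetic, and scattering or attenuation inside the cargo is ignored.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic detector positions [cm] on two parallel faces of a box.
detectors = np.array([[0, 10, 0], [0, 30, 0], [46, 10, 20], [46, 30, 20]], float)
true_src, true_A = np.array([20.0, 25.0, 10.0]), 5.0e4

def rates(src, A):
    """Count rate at each detector under a plain inverse-square model."""
    d2 = np.sum((detectors - src) ** 2, axis=1)
    return A / d2

measured = rates(true_src, true_A)          # noiseless synthetic measurements

def residuals(p):
    return rates(p[:3], p[3]) - measured

fit = least_squares(residuals, x0=[10.0, 10.0, 5.0, 1.0e4])
print("estimated source position [cm]:", fit.x[:3])
```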
Identifying known unknowns using the US EPA's CompTox Chemistry Dashboard.
McEachran, Andrew D; Sobus, Jon R; Williams, Antony J
2017-03-01
Chemical features observed using high-resolution mass spectrometry can be tentatively identified using online chemical reference databases by searching molecular formulae and monoisotopic masses and then rank-ordering of the hits using appropriate relevance criteria. The most likely candidate "known unknowns," which are those chemicals unknown to an investigator but contained within a reference database or literature source, rise to the top of a chemical list when rank-ordered by the number of associated data sources. The U.S. EPA's CompTox Chemistry Dashboard is a curated and freely available resource for chemistry and computational toxicology research, containing more than 720,000 chemicals of relevance to environmental health science. In this research, the performance of the Dashboard for identifying known unknowns was evaluated against that of the online ChemSpider database, one of the primary resources used by mass spectrometrists, using multiple previously studied datasets reported in the peer-reviewed literature totaling 162 chemicals. These chemicals were examined using both applications via molecular formula and monoisotopic mass searches followed by rank-ordering of candidate compounds by associated references or data sources. A greater percentage of chemicals ranked in the top position when using the Dashboard, indicating an advantage of this application over ChemSpider for identifying known unknowns using data source ranking. Additional approaches are being developed for inclusion into a non-targeted analysis workflow as part of the CompTox Chemistry Dashboard. This work shows the potential for use of the Dashboard in exposure assessment and risk decision-making through significant improvements in non-targeted chemical identification. Graphical abstract: Identifying known unknowns in the US EPA's CompTox Chemistry Dashboard from molecular formula and monoisotopic mass inputs.
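A minimal sketch of the rank-ordering step that both applications rely on: candidates matching a formula or monoisotopic-mass search are sorted by their number of associated data sources, so the most-referenced "known unknown" rises to the top. The candidate list here is invented.

```python
# Hypothetical candidates returned for one molecular-formula search.
candidates = [
    {"name": "candidate A", "formula": "C8H10N4O2", "data_sources": 5},
    {"name": "caffeine",    "formula": "C8H10N4O2", "data_sources": 142},
    {"name": "candidate C", "formula": "C8H10N4O2", "data_sources": 17},
]

# Rank-order by associated data sources (descending).
for hit in sorted(candidates, key=lambda c: c["data_sources"], reverse=True):
    print(hit["name"], hit["data_sources"])
```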
The Information Available to a Moving Observer on Shape with Unknown, Isotropic BRDFs.
Chandraker, Manmohan
2016-07-01
Psychophysical studies show motion cues inform about shape even with unknown reflectance. Recent works in computer vision have considered shape recovery for an object of unknown BRDF using light source or object motions. This paper proposes a theory that addresses the remaining problem of determining shape from the (small or differential) motion of the camera, for unknown isotropic BRDFs. Our theory derives a differential stereo relation that relates camera motion to surface depth, which generalizes traditional Lambertian assumptions. Under orthographic projection, we show differential stereo may not determine shape for general BRDFs, but suffices to yield an invariant for several restricted (still unknown) BRDFs exhibited by common materials. For the perspective case, we show that differential stereo yields the surface depth for unknown isotropic BRDF and unknown directional lighting, while additional constraints are obtained with restrictions on the BRDF or lighting. The limits imposed by our theory are intrinsic to the shape recovery problem and independent of choice of reconstruction method. We also illustrate trends shared by theories on shape from differential motion of light source, object or camera, to relate the hardness of surface reconstruction to the complexity of imaging setup.
Efficient Bayesian experimental design for contaminant source identification
NASA Astrophysics Data System (ADS)
Zhang, J.; Zeng, L.
2013-12-01
In this study, an efficient full Bayesian approach is developed for optimal sampling-well location design and source parameter identification of groundwater contaminants. An information measure, the relative entropy, is employed to quantify the information gain from indirect concentration measurements in identifying unknown source parameters such as the release time, strength and location. In this approach, the sampling location that gives the maximum relative entropy is selected as the optimal one. Once the sampling location is determined, a Bayesian approach based on Markov chain Monte Carlo (MCMC) is used to estimate the unknown source parameters. In both the design and the estimation, the contaminant transport equation must be solved many times to evaluate the likelihood. To reduce the computational burden, an interpolation method based on an adaptive sparse grid is utilized to construct a surrogate for the contaminant transport. The approximated likelihood can be evaluated directly from the surrogate, which greatly accelerates the design and estimation process. The accuracy and efficiency of our approach are demonstrated through numerical case studies. Compared with the traditional optimal design, which is based on the Gaussian linear assumption, the method developed in this study can cope with arbitrary nonlinearity. It can be used to assist in groundwater monitoring network design and in the identification of unknown contaminant sources. (Figures: contours of the expected information gain, whose maximum marks the optimal observing location; posterior marginal densities of the unknown parameters for the designed location versus seven randomly chosen locations, with the true values marked, showing that the parameters are estimated better with the designed location.)
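A minimal sketch of the design criterion, not of the paper's sparse-grid or MCMC machinery: for each candidate sampling design, the expected information gain is approximated by averaging the relative entropy between a grid-based posterior and the prior over data simulated from the prior. The forward model, noise level and candidate designs are all invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setting: one unknown source strength theta on a grid with a uniform prior;
# the candidate design d (a distance) changes how informative a noisy measurement is.
theta_grid = np.linspace(0.0, 10.0, 201)
prior = np.ones_like(theta_grid) / theta_grid.size
sigma = 0.5

def forward(theta, d):
    # Hypothetical transport surrogate: the signal decays with distance d.
    return theta * np.exp(-d)

def expected_information_gain(d, n_mc=400):
    gain = 0.0
    for _ in range(n_mc):
        theta_true = rng.choice(theta_grid, p=prior)          # draw from the prior
        y = forward(theta_true, d) + rng.normal(0.0, sigma)   # simulate a measurement
        like = np.exp(-0.5 * ((y - forward(theta_grid, d)) / sigma) ** 2)
        post = like * prior
        post /= post.sum()
        nz = post > 0
        gain += np.sum(post[nz] * np.log(post[nz] / prior[nz]))   # relative entropy
    return gain / n_mc

designs = [0.5, 1.0, 2.0, 4.0]
print({d: round(expected_information_gain(d), 3) for d in designs})
```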
Kurtosis Approach for Nonlinear Blind Source Separation
NASA Technical Reports Server (NTRS)
Duong, Vu A.; Stubberud, Allen R.
2005-01-01
In this paper, we introduce a new algorithm for blind source signal separation for post-nonlinear mixtures. The mixtures are assumed to be linearly mixed from unknown sources first and then distorted by memoryless nonlinear functions. The nonlinear functions are assumed to be smooth and can be approximated by polynomials. Both the coefficients of the unknown mixing matrix and the coefficients of the approximated polynomials are estimated by the gradient descent method conditional on the higher order statistical requirements. The results of simulation experiments presented in this paper demonstrate the validity and usefulness of our approach for nonlinear blind source signal separation. Keywords: Independent Component Analysis, Kurtosis, Higher-order statistics.
An almost-parameter-free harmony search algorithm for groundwater pollution source identification.
Jiang, Simin; Zhang, Yali; Wang, Pei; Zheng, Maohui
2013-01-01
The spatiotemporal characterization of unknown sources of groundwater pollution is frequently encountered in environmental problems. This study adopts a simulation-optimization approach that combines a contaminant transport simulation model with a heuristic harmony search algorithm to identify unknown pollution sources. In the proposed methodology, an almost-parameter-free harmony search algorithm is developed. The performance of this methodology is evaluated on an illustrative groundwater pollution source identification problem, and the results indicate that the proposed almost-parameter-free harmony-search-based optimization model can give satisfactory estimates, even when irregular geometry, erroneous monitoring data, and a shortage of prior information on potential source locations are considered.
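The "almost-parameter-free" variant is specific to the paper; the sketch below is the standard harmony search metaheuristic it builds on, minimizing a toy misfit function that stands in for the transport-model calibration error. The parameter names (hms, hmcr, par, bw) follow common harmony-search usage and the values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def objective(x):
    # Toy misfit standing in for the simulated-vs-observed concentration error.
    return np.sum((x - np.array([3.0, -1.5, 0.5])) ** 2)

def harmony_search(obj, lower, upper, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=5000):
    dim = len(lower)
    hm = rng.uniform(lower, upper, size=(hms, dim))   # harmony memory
    fit = np.array([obj(x) for x in hm])
    for _ in range(iters):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:                   # memory consideration
                new[j] = hm[rng.integers(hms), j]
                if rng.random() < par:                # pitch adjustment
                    new[j] += bw * (upper[j] - lower[j]) * rng.uniform(-1, 1)
            else:                                     # random consideration
                new[j] = rng.uniform(lower[j], upper[j])
        new = np.clip(new, lower, upper)
        f = obj(new)
        worst = np.argmax(fit)
        if f < fit[worst]:                            # replace the worst harmony
            hm[worst], fit[worst] = new, f
    best = np.argmin(fit)
    return hm[best], fit[best]

x_best, f_best = harmony_search(objective, lower=np.array([-5.0] * 3),
                                upper=np.array([5.0] * 3))
print(x_best, f_best)
```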
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexandrov, Boian S.; Iliev, Filip L.; Stanev, Valentin G.
This code is a toy (short) version of CODE-2016-83. From a general perspective, the code represents an unsupervised adaptive machine learning algorithm that allows efficient and high-performance de-mixing and feature extraction of a multitude of non-negative signals mixed and recorded by a network of uncorrelated sensor arrays. The code identifies the number of the mixed original signals and their locations. Further, the code also allows deciphering of signals that have been delayed with respect to the mixing process in each sensor. This code is highly customizable and can be used efficiently for fast macro-analysis of data. The code is applicable to a plethora of distinct problems: chemical decomposition, pressure transient decomposition, unknown source/signal allocation, and EM signal decomposition. An additional procedure for allocation of the unknown sources is incorporated in the code.
Berner, Christine L; Staid, Andrea; Flage, Roger; Guikema, Seth D
2017-10-01
Recently, the concept of black swans has gained increased attention in the fields of risk assessment and risk management. Different types of black swans have been suggested, distinguishing between unknown unknowns (nothing in the past can convincingly point to its occurrence), unknown knowns (known to some, but not to relevant analysts), or known knowns where the probability of occurrence is judged as negligible. Traditional risk assessments have been questioned, as their standard probabilistic methods may not be capable of predicting or even identifying these rare and extreme events, thus creating a source of possible black swans. In this article, we show how a simulation model can be used to identify previously unknown potentially extreme events that if not identified and treated could occur as black swans. We show that by manipulating a verified and validated model used to predict the impacts of hazards on a system of interest, we can identify hazard conditions not previously experienced that could lead to impacts much larger than any previous level of impact. This makes these potential black swan events known and allows risk managers to more fully consider them. We demonstrate this method using a model developed to evaluate the effect of hurricanes on energy systems in the United States; we identify hurricanes with potentially extreme impacts, storms well beyond what the historic record suggests is possible in terms of impacts. © 2016 Society for Risk Analysis.
Public Exposure from Indoor Radiofrequency Radiation in the City of Hebron, West Bank-Palestine.
Lahham, Adnan; Sharabati, Afefeh; ALMasri, Hussien
2015-08-01
This work presents the results of measured indoor exposure levels to radiofrequency (RF) radiation emitting sources in one of the major cities in the West Bank-the city of Hebron. Investigated RF emitters include FM, TV broadcasting stations, mobile telephony base stations, cordless phones [Digital Enhanced Cordless Telecommunications (DECT)], and wireless local area networks (WLAN). Measurements of power density were conducted in 343 locations representing different site categories in the city. The maximum total power density found at any location was about 2.3 × 10 W m with a corresponding exposure quotient of about 0.01. This value is well below unity, indicating compliance with the guidelines of the International Commission on Non-ionizing Radiation Protection (ICNIRP). The average total exposure from all RF sources was 0.08 × 10 W m. The relative contributions from different sources to the total exposure in terms of exposure quotient were evaluated and found to be 46% from FM radio, 26% from GSM900, 15% from DECT phones, 9% from WLAN, 3% from unknown sources, and 1% from TV broadcasting. RF sources located outdoors contribute about 73% to the population exposure indoors.
Identifying known unknowns using the US EPA's CompTox ...
Chemical features observed using high-resolution mass spectrometry can be tentatively identified using online chemical reference databases by searching molecular formulae and monoisotopic masses and then rank-ordering of the hits using appropriate relevance criteria. The most likely candidate “known unknowns,” which are those chemicals unknown to an investigator but contained within a reference database or literature source, rise to the top of a chemical list when rank-ordered by the number of associated data sources. The U.S. EPA’s CompTox Chemistry Dashboard is a curated and freely available resource for chemistry and computational toxicology research, containing more than 720,000 chemicals of relevance to environmental health science. In this research, the performance of the Dashboard for identifying “known unknowns” was evaluated against that of the online ChemSpider database, one of the primary resources used by mass spectrometrists, using multiple previously studied datasets reported in the peer-reviewed literature totaling 162 chemicals. These chemicals were examined using both applications via molecular formula and monoisotopic mass searches followed by rank-ordering of candidate compounds by associated references or data sources. A greater percentage of chemicals ranked in the top position when using the Dashboard, indicating an advantage of this application over ChemSpider for identifying known unknowns using data source ranking. Addition
What is What in the Nanoworld: A Handbook on Nanoscience and Nanotechnology
NASA Astrophysics Data System (ADS)
Borisenko, Victor E.; Ossicini, Stefano
2004-10-01
This introductory reference handbook summarizes the terms and definitions, the most important phenomena, and the regularities discovered in the physics, chemistry, technology, and application of nanostructures. These nanostructures are typically inorganic and organic structures at the atomic scale. Fast-progressing nanoelectronics and optoelectronics, molecular electronics and spintronics, and nanotechnology and quantum processing of information are of strategic importance for the information society of the 21st century. The short-form information, taken from textbooks, specialized encyclopedias, and recent original books and papers, provides fast support in understanding "old" and new terms of nanoscience and technology widely used in the scientific literature on recent developments. Such support is indeed important when one reads a scientific paper presenting new results in nanoscience. A representative collection of fundamental terms and definitions from quantum physics and quantum chemistry, special mathematics, organic and inorganic chemistry, solid state physics, and materials science and technology is accompanied by recommended secondary sources (books, reviews, websites) for an extended study of a subject. Each entry interprets the term or definition under consideration and briefly presents the main features of the phenomena behind it. Additional information in the form of notes ("First described in: ...", "Recognition: ...", "More details in: ...") supplements the entries and gives a historical retrospective of the subject with references to further sources. The handbook is ideal for answering questions related to unknown terms and definitions for undergraduate and Ph.D. students studying the physics of low-dimensional structures, nanoelectronics, and nanotechnology. It provides fast support when one wants to know, or be reminded of, the essence of a scientific term, especially when it contains a personal name in its title, as in "Anderson localization", "Aharonov-Bohm effect", or "Bose-Einstein condensate". More than 1000 entries, from a few sentences to a page in length.
Kassis, Hayah; Marnejon, Thomas; Gemmel, David; Cutrona, Anthony; Gottimukkula, Rajashree
2010-06-01
A 19-year-old male patient was diagnosed with S. sanguinis brain abscess of unknown etiopathology as a complication of subclinical endocarditis. While viridans streptococci are implicated in dental seeding to the heart, S. sanguinis brain abscesses are rare. Six previous cases of S. sanguinis brain abscess in the literature reported dental procedures and maxillofacial trauma. In our patient, there was no obvious source of infective endocarditis preceding the development of brain abscess. This demonstrates the importance of prompt diagnosis and initiation of antimicrobial therapy given the potential for long-term sequelae such as focal deficits and seizures.
41. Upstream end of emergency spillway excavation. Photographer unknown, 1929. ...
41. Upstream end of emergency spillway excavation. Photographer unknown, 1929. Source: Arizona Department of Water Resources (ADWR). - Waddell Dam, On Agua Fria River, 35 miles northwest of Phoenix, Phoenix, Maricopa County, AZ
Absolute nuclear material assay
Prasad, Manoj K [Pleasanton, CA]; Snyderman, Neal J [Berkeley, CA]; Rowland, Mark S [Alamo, CA]
2012-05-15
A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.
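The assay itself compares measured count distributions against analytically computed fission-chain models; the sketch below only illustrates the idea of spreading fission chains in time and histogramming counts per gate, with a made-up per-chain multiplicity distribution and a Poisson arrival process. None of the numbers come from the patent, and correlations within a chain are collapsed to a single arrival time.

```python
import numpy as np

rng = np.random.default_rng(3)

# Made-up distribution of neutrons detected per fission chain.
chain_counts = np.array([0, 1, 2, 3, 4])
chain_probs = np.array([0.40, 0.30, 0.15, 0.10, 0.05])

rate = 50.0          # fission chains per second (synthetic)
T = 200.0            # total measurement time [s]
gate = 0.5           # counting-gate width [s]

# Spread chains in time as a Poisson process; draw a detected count per chain.
n_chains = rng.poisson(rate * T)
times = rng.uniform(0.0, T, n_chains)
counts = rng.choice(chain_counts, size=n_chains, p=chain_probs)

# Histogram neutrons detected in consecutive gates to form the count distribution.
edges = np.arange(0.0, T + gate, gate)
per_gate, _ = np.histogram(times, bins=edges, weights=counts)
dist = np.bincount(per_gate.astype(int))
print(dist / dist.sum())     # empirical probability of k counts per gate
```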
Absolute nuclear material assay
Prasad, Manoj K [Pleasanton, CA]; Snyderman, Neal J [Berkeley, CA]; Rowland, Mark S [Alamo, CA]
2010-07-13
A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.
Research of mine water source identification based on LIF technology
NASA Astrophysics Data System (ADS)
Zhou, Mengran; Yan, Pengcheng
2016-09-01
To address the problem that traditional chemical methods for mine water source identification take a long time, a rapid source identification system for mine water inrush based on laser-induced fluorescence (LIF) technology is proposed. The basic principle of LIF technology is analyzed in detail, and the hardware composition of the LIF system is described together with the selection of the related modules. Fluorescence spectra of coal-mine water samples were obtained through fluorescence experiments with the LIF system. Traditional source identification relies mainly on the concentrations of ions representative of the water, but these concentrations are hard to infer from fluorescence spectra. This paper therefore proposes a simple and practical method for rapid identification from the fluorescence spectrum: the spectral distance between an unknown water sample and the standard samples is measured and, based on cluster analysis, the category of the unknown sample is determined. Source identification of unknown samples verified the reliability of the LIF system and addresses the current lack of real-time, online monitoring of water inrush in coal mines, which is of great significance for safe coal-mine production.
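A minimal sketch of the distance-based assignment described above: the unknown sample's fluorescence spectrum is compared with the spectra of standard samples and assigned to the nearest one. The spectra and source labels are synthetic stand-ins, and the full cluster analysis is reduced to a nearest-neighbour rule.

```python
import numpy as np

# Synthetic fluorescence spectra (intensity per wavelength channel) of
# standard water samples from known source layers.
standards = {
    "sandstone aquifer": np.array([0.1, 0.8, 1.0, 0.6, 0.2]),
    "limestone aquifer": np.array([0.4, 0.5, 0.9, 0.9, 0.5]),
    "old goaf water":    np.array([0.9, 0.7, 0.3, 0.2, 0.1]),
}

unknown = np.array([0.15, 0.75, 0.95, 0.55, 0.25])   # measured spectrum of the unknown sample

# Assign the unknown sample to the nearest standard in spectral space.
distances = {name: np.linalg.norm(unknown - spec) for name, spec in standards.items()}
print(min(distances, key=distances.get), distances)
```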
NASA Astrophysics Data System (ADS)
Neville, J.; Emanuel, R. E.
2017-12-01
In 2016 Hurricane Matthew brought immense flooding and devastation to the Lumbee (aka Lumber) River basin. Some impacts are obvious, such as deserted homes and businesses, but other impacts, including long-term environmental effects, are uncertain. Extreme flooding throughout the basin established temporary hydrologic connectivity between aquatic environments and upland sources of nutrients and other pollutants. Though 27% of the basin is covered by wetlands, hurricane-induced flooding was so intense that wetlands may have had no opportunity to mitigate delivery of nutrients into surface waters. As a result, how Hurricane Matthew affected nitrate retention and uptake in the Lumbee River remains uncertain. The unknown magnitude of nitrate transported into the Lumbee River from surrounding sources may have lingering impacts on nitrogen cycling in this stream. With these potential impacts in mind, we conducted a Lagrangian water quality sampling campaign to assess the ability of the Lumbee River to retain and process nitrogen following Hurricane Matthew. We collected samples before and after flooding and compared first-order nitrogen uptake kinetics between the two periods. The analysis and comparisons allow us to evaluate the long-term impacts of Hurricane Matthew on nitrogen cycling after floodwaters recede.
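As an illustration of the first-order uptake comparison (not the study's data), the sketch below fits a longitudinal first-order uptake coefficient to synthetic Lagrangian nitrate samples; repeating the fit for the pre- and post-flood campaigns would give the kinetics being compared.

```python
import numpy as np

# Synthetic Lagrangian samples: distance downstream [km] and nitrate [mg N/L].
distance_km = np.array([0.0, 5.0, 10.0, 20.0, 35.0])
nitrate = np.array([1.20, 1.05, 0.93, 0.72, 0.48])

# First-order longitudinal uptake: C(x) = C0 * exp(-kw * x), so ln C is linear in x.
slope, intercept = np.polyfit(distance_km, np.log(nitrate), 1)
kw = -slope                       # uptake coefficient per km
print(f"kw = {kw:.3f} 1/km, uptake length Sw = {1.0 / kw:.1f} km")
```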
Hanus, Robert; Vrkoslav, Vladimír; Hrdý, Ivan; Cvačka, Josef; Šobotník, Jan
2010-01-01
In 1959, P. Karlson and M. Lüscher introduced the term ‘pheromone’, broadly used nowadays for various chemicals involved in intraspecific communication. To demonstrate the term, they depicted the situation in termite societies, where king and queen inhibit the reproduction of nest-mates by an unknown chemical substance. Paradoxically, half a century later, neither the source nor the chemical identity of this ‘royal’ pheromone is known. In this study, we report for the first time the secretion of polar compounds of proteinaceous origin by functional reproductives in three termite species, Prorhinotermes simplex, Reticulitermes santonensis and Kalotermes flavicollis. Aqueous washes of functional reproductives contained sex-specific proteinaceous compounds, virtually absent in non-reproducing stages. Moreover, the presence of these compounds was clearly correlated with the age of reproductives and their reproductive status. We discuss the putative function of these substances in termite caste recognition and regulation. PMID:19939837
NASA Astrophysics Data System (ADS)
Tichý, Ondřej; Šmídl, Václav; Hofman, Radek; Stohl, Andreas
2016-11-01
Estimation of pollutant releases into the atmosphere is an important problem in the environmental sciences. It is typically formalized as an inverse problem using a linear model that can explain observable quantities (e.g., concentrations or deposition values) as a product of the source-receptor sensitivity (SRS) matrix obtained from an atmospheric transport model multiplied by the unknown source-term vector. Since this problem is typically ill-posed, current state-of-the-art methods are based on regularization of the problem and solution of a formulated optimization problem. This procedure depends on manual settings of uncertainties that are often very poorly quantified, effectively making them tuning parameters. We formulate a probabilistic model, that has the same maximum likelihood solution as the conventional method using pre-specified uncertainties. Replacement of the maximum likelihood solution by full Bayesian estimation also allows estimation of all tuning parameters from the measurements. The estimation procedure is based on the variational Bayes approximation which is evaluated by an iterative algorithm. The resulting method is thus very similar to the conventional approach, but with the possibility to also estimate all tuning parameters from the observations. The proposed algorithm is tested and compared with the standard methods on data from the European Tracer Experiment (ETEX) where advantages of the new method are demonstrated. A MATLAB implementation of the proposed algorithm is available for download.
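The variational Bayes updates themselves are given in the paper; as a reference point, for fixed observation and prior covariances the linear-Gaussian model described above has a closed-form posterior-mean source term, shown here with a synthetic SRS matrix and synthetic noise and prior variances.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic SRS matrix (40 observations, 12 source-term elements) and data.
M = np.abs(rng.normal(size=(40, 12)))
x_true = np.zeros(12)
x_true[3:6] = [2.0, 5.0, 1.0]
y = M @ x_true + rng.normal(0.0, 0.1, size=40)

# Fixed covariances (in the paper these are estimated, not pre-specified):
R_inv = np.eye(40) / 0.1**2          # observation precision
B_inv = np.eye(12) / 1.0**2          # prior precision of the source term

# Posterior mean / MAP estimate for the linear-Gaussian model y = M x + e:
# x_hat = (M^T R^-1 M + B^-1)^-1 M^T R^-1 y.
x_hat = np.linalg.solve(M.T @ R_inv @ M + B_inv, M.T @ R_inv @ y)
print(np.round(x_hat, 2))
```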
Absolute nuclear material assay using count distribution (LAMBDA) space
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prasad, Manoj K.; Snyderman, Neal J.; Rowland, Mark S.
A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.
Absolute nuclear material assay using count distribution (LAMBDA) space
Prasad, Manoj K [Pleasanton, CA]; Snyderman, Neal J [Berkeley, CA]; Rowland, Mark S [Alamo, CA]
2012-06-05
A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.
A multiwave range test for obstacle reconstructions with unknown physical properties
NASA Astrophysics Data System (ADS)
Potthast, Roland; Schulz, Jochen
2007-08-01
We develop a new multiwave version of the range test for shape reconstruction in inverse scattering theory. The range test [R. Potthast, et al., A `range test' for determining scatterers with unknown physical properties, Inverse Problems 19(3) (2003) 533-547] was originally proposed to obtain knowledge about an unknown scatterer when the far field pattern for only one plane wave is given. Here, we extend the method to the case of multiple waves and show that the full shape of the unknown scatterer can be reconstructed. We further clarify the relation between the range test methods, the potential method [A. Kirsch, R. Kress, On an integral equation of the first kind in inverse acoustic scattering, in: Inverse Problems (Oberwolfach, 1986), Internationale Schriftenreihe zur Numerischen Mathematik, vol. 77, Birkhauser, Basel, 1986, pp. 93-102] and the singular sources method [R. Potthast, Point sources and multipoles in inverse scattering theory, Habilitation Thesis, Gottingen, 1999]. In particular, we propose a new version of the Kirsch-Kress method using the range test and a new approach to the singular sources method based on the range test and the potential method. Numerical examples of reconstructions for all four methods are provided.
Adaptive suboptimal second-order sliding mode control for microgrids
NASA Astrophysics Data System (ADS)
Incremona, Gian Paolo; Cucuzzella, Michele; Ferrara, Antonella
2016-09-01
This paper deals with the design of adaptive suboptimal second-order sliding mode (ASSOSM) control laws for grid-connected microgrids. Due to the presence of the inverter, of unpredicted load changes, of switching among different renewable energy sources, and of electrical parameter variations, the microgrid model is usually affected by uncertain terms which are bounded, but with unknown upper bounds. To frame the control problem theoretically, the class of second-order systems in Brunovsky canonical form, characterised by the presence of matched uncertain terms with unknown bounds, is first considered. Four adaptive strategies are designed, analysed and compared to select the most effective ones to be applied to the microgrid case study. In the first two strategies, the control amplitude is continuously adjusted so as to dominate the effect of the uncertainty on the controlled system. When a suitable control amplitude is attained, the origin of the state space of the auxiliary system becomes attractive. In the other two strategies, a suitable blend between two components, one mainly working during the reaching phase, the other being predominant in a vicinity of the sliding manifold, is generated so as to reduce the control amplitude in steady state. The microgrid system in grid-connected operation, controlled via the selected ASSOSM control strategies, exhibits appreciable stability properties, as proved theoretically and shown in simulation.
Schiller, Q.; Tu, W.; Ali, A. F.; ...
2017-03-11
The most significant unknown regarding relativistic electrons in Earth's outer Van Allen radiation belt is the relative contribution of loss, transport, and acceleration processes within the inner magnetosphere. Detangling each individual process is critical to improve the understanding of radiation belt dynamics, but determining a single component is challenging due to sparse measurements in diverse spatial and temporal regimes. However, there are currently an unprecedented number of spacecraft taking measurements that sample different regions of the inner magnetosphere. With the increasing number of varied observational platforms, system dynamics can begin to be unraveled. In this work, we employ in-situ measurements during the 13-14 January 2013 enhancement event to isolate transport, loss, and source dynamics in a one dimensional radial diffusion model. We then validate the results by comparing them to Van Allen Probes and THEMIS observations, indicating that the three terms have been accurately and individually quantified for the event. Finally, a direct comparison is performed between the model containing event-specific terms and various models containing terms parameterized by geomagnetic index. Models using a simple 3/Kp loss timescale show deviation from the event specific model of nearly two orders of magnitude within 72 hours of the enhancement event. However, models using alternative loss timescales closely resemble the event specific model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiller, Q.; Tu, W.; Ali, A. F.
The most significant unknown regarding relativistic electrons in Earth's outer Van Allen radiation belt is the relative contribution of loss, transport, and acceleration processes within the inner magnetosphere. Detangling each individual process is critical to improve the understanding of radiation belt dynamics, but determining a single component is challenging due to sparse measurements in diverse spatial and temporal regimes. However, there are currently an unprecedented number of spacecraft taking measurements that sample different regions of the inner magnetosphere. With the increasing number of varied observational platforms, system dynamics can begin to be unraveled. In this work, we employ in-situ measurements during the 13-14 January 2013 enhancement event to isolate transport, loss, and source dynamics in a one dimensional radial diffusion model. We then validate the results by comparing them to Van Allen Probes and THEMIS observations, indicating that the three terms have been accurately and individually quantified for the event. Finally, a direct comparison is performed between the model containing event-specific terms and various models containing terms parameterized by geomagnetic index. Models using a simple 3/Kp loss timescale show deviation from the event specific model of nearly two orders of magnitude within 72 hours of the enhancement event. However, models using alternative loss timescales closely resemble the event specific model.
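A minimal sketch of how a loss term enters a one-dimensional radial diffusion model of the kind compared above: an explicit finite-difference step of df/dt = L^2 d/dL(D_LL/L^2 df/dL) - f/tau with the simple tau = 3/Kp (days) loss timescale. The diffusion-coefficient power law, initial profile and boundary treatment are illustrative only, not the event-specific terms derived in the paper.

```python
import numpy as np

# 1-D radial diffusion with loss: df/dt = L^2 d/dL (D_LL / L^2 df/dL) - f / tau.
L = np.linspace(3.0, 6.5, 71)
dL = L[1] - L[0]
Kp = 3.0
tau_days = 3.0 / Kp                          # simple 3/Kp loss timescale [days]
D_LL = 1e-8 * L**10                          # illustrative power-law D_LL [1/day]

f = np.exp(-((L - 4.0) / 0.4) ** 2)          # initial phase-space density (synthetic)
dt = 1e-4                                    # time step [days], small for explicit stability
for _ in range(int(3.0 / dt)):               # integrate for 3 days
    flux = D_LL[:-1] / L[:-1]**2 * np.diff(f) / dL        # D_LL/L^2 * df/dL at cell interfaces
    diffusion = L[1:-1]**2 * np.diff(flux) / dL
    f[1:-1] += dt * (diffusion - f[1:-1] / tau_days)
    f[0], f[-1] = 0.0, f[-2]                 # fixed inner, free outer boundary (illustrative)
print("peak phase-space density after 3 days:", f.max())
```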
Accelerating fissile material detection with a neutron source
Rowland, Mark S.; Snyderman, Neal J.
2018-01-30
A neutron detector system for discriminating fissile material from non-fissile material, wherein a digital data acquisition unit collects data at a high rate and in real time processes large volumes of data directly to count neutrons from the unknown source and detect excess grouped neutrons to identify fission in the unknown source. The system includes a Poisson neutron generator for in-beam interrogation of a possible fissile neutron source and a DC power supply that exhibits electrical ripple on the order of less than one part per million. Certain voltage multiplier circuits, such as Cockcroft-Walton voltage multipliers, are used to enhance the effectiveness of series resistor-inductor circuit components to reduce the ripple associated with traditional AC-rectified, high-voltage DC power supplies.
O'Brien, J K; Robeck, T R
2012-08-01
Bottlenose dolphins (Tursiops truncatus) undergoing natural breeding and artificial insemination (AI) were examined to characterize serum progesterone concentrations and determine relationships among age, parity, and reproductive outcome. Progesterone profiles of five cycle types (n = 119 total cycles from 54 animals) were characterized as follows: (i) conception and production of a live term calf (conceptive-term, n = 73); (ii) conception and abortion after Day 60 (conceptive-abortion, n = 12); (iii) unknown conception status with prolonged, elevated progesterone and absence of a fetus (conceptive-unknown, n = 14); (iv) conception failure with normal luteal phase progesterone concentrations (non-conceptive, n = 14, AI cycles only); and (v) conception failure with progesterone insufficiency occurring after spontaneous ovulation or owing to premature ovulation induction using GnRH (non-conceptive-PI, n = 6, AI cycles only). By Day 21 post-insemination (PI), progesterone concentrations were similar (P > 0.05) among conceptive-term, conceptive-abortion and conceptive-unknown, and higher (P < 0.05) for conceptive-term than non-conceptive and non-conceptive-PI cycles. Progesterone concentrations of known conceptive cycles peaked by Week 7 PI (P < 0.05) and remained elevated for the remainder of pregnancy (Weeks 8 up to 54, ≥ 5 days pre-partum). During midpregnancy (Days 121-240), conceptive-term cycles had higher (P > 0.05) progesterone concentrations than conceptive-abortion and unknown conception status cycles. Parity was not associated with reproductive outcome based on cycle type (P > 0.05). Age of females in conceptive-unknown (26.5 ± 10.1 yrs) and conceptive-abortion (22.1 ± 9.4 yrs) groups was higher (P < 0.05) than in conceptive-term (15.7 ± 7.2 yrs). The conceptive-unknown cycle type possibly represents undetected early embryonic loss occurring before Day 60 PI. Length of gestation using known conception dates was 376.1 ± 11.0 days and the range of this parameter (355-395 days) has implications for peri-parturient management procedures for the species. Copyright © 2012 Elsevier Inc. All rights reserved.
Viking-Age Sails: Form and Proportion
NASA Astrophysics Data System (ADS)
Bischoff, Vibeke
2017-04-01
Archaeological ship-finds have shed much light on the design and construction of vessels from the Viking Age. However, the exact proportions of their sails remain unknown due to the lack of fully preserved sails, or other definite indicators of their proportions. Key Viking-Age ship-finds from Scandinavia—the Oseberg Ship, the Gokstad Ship and Skuldelev 3—have all revealed traces of rigging. In all three finds, the keelson—with the mast position—is preserved, together with fastenings for the sheets and the tack, indicating the breadth of the sail. The sail area can then be estimated based on practical experience of how large a sail the specific ship can carry, in conjunction with hull form and displacement. This article presents reconstructions of the form and dimensions of rigging and sail based on the archaeological finds, evidence from iconographic and written sources, and ethnographic parallels with traditional Nordic boats. When these sources are analysed, not only do the similarities become apparent, but so too does the relative disparity between the archaeological record and the other sources. Preferential selection in terms of which source is given the greatest merit is therefore required, as it is not possible to afford them all equal value.
How to Decide? Multi-Objective Early-Warning Monitoring Networks for Water Suppliers
NASA Astrophysics Data System (ADS)
Bode, Felix; Loschko, Matthias; Nowak, Wolfgang
2015-04-01
Groundwater is a resource for drinking water and hence needs to be protected from contamination. However, many well catchments include an inventory of known and unknown risk sources, which cannot be eliminated, especially in urban regions. As a matter of risk control, all these risk sources should be monitored. A one-to-one monitoring situation for each risk source would lead to a cost explosion and is even impossible for unknown risk sources. However, smart optimization concepts could help to find promising low-cost monitoring network designs. In this work, we develop a concept to plan monitoring networks using multi-objective optimization. Our considered objectives are to maximize the probability of detecting all contaminations, to enhance the early warning time before detected contaminations reach the drinking water well, and to minimize the installation and operating costs of the monitoring network. Using multi-objective optimization, we avoid the problem of having to weight these objectives into a single objective function. These objectives are clearly competing, and it is impossible to know their mutual trade-offs beforehand - each catchment differs in many points and it is hardly possible to transfer knowledge between geological formations and risk inventories. To make our optimization results more specific to the type of risk inventory in different catchments, we perform risk prioritization of all known risk sources. Due to the lack of the required data, quantitative risk ranking is impossible. Instead, we use a qualitative risk ranking to prioritize the known risk sources for monitoring. Additionally, we allow for the existence of unknown risk sources that are totally uncertain in location and in their inherent risk. Therefore, they can neither be located nor ranked. Instead, we represent them by a virtual line of risk sources surrounding the production well. We classify risk sources into four different categories: severe, medium and tolerable for known risk sources and an extra category for the unknown ones. With that, early warning time and detection probability become individual objectives for each risk class. Thus, decision makers can identify monitoring networks valid for controlling the top risk sources, and evaluate the capabilities (or search for least-cost upgrades) to also cover moderate, tolerable and unknown risk sources. Monitoring networks that are valid for the remaining risk also cover all other risk sources, but only with a relatively poor early-warning time. The data provided for the optimization algorithm are calculated in a preprocessing step by a flow and transport model. It simulates which potential contaminant plumes from the risk sources would be detectable, where and when, by all possible candidate positions for monitoring wells. Uncertainties due to hydro(geo)logical phenomena are taken into account by Monte-Carlo simulations. These include uncertainty in ambient flow direction of the groundwater, uncertainty of the conductivity field, and different scenarios for the pumping rates of the production wells. To avoid numerical dispersion during the transport simulations, we use particle-tracking random walk methods when simulating transport.
Challenges/issues of NIS used in particle accelerator facilities
NASA Astrophysics Data System (ADS)
Faircloth, Dan
2013-09-01
High current, high duty cycle negative ion sources are an essential component of many high power particle accelerators. This talk gives an overview of the state-of-the-art sources used around the world. Volume, surface and charge exchange negative ion production processes are detailed. Cesiated magnetron and Penning surface plasma sources are discussed along with surface converter sources. Multicusp volume sources with filament and LaB6 cathodes are described before moving on to RF inductively coupled volume sources with internal and external antennas. The major challenges facing accelerator facilities are detailed. Beam current, source lifetime and reliability are the most pressing. The pros and cons of each source technology are discussed along with their development programs. The uncertainties and unknowns common to these sources are discussed. The dynamics of cesium surface coverage and the causes of source variability are still unknown. Minimizing beam emittance is essential to maximizing the transport of high current beams; space charge effects are very important. The basic physics of negative ion production is still not well understood; theoretical and experimental programs continue to improve this, but there are still many mysteries to be solved.
Zhang, Jian-Hua; Zeng, Xin; Chen, Xu-Sheng; Mao, Zhong-Gui
2018-04-21
The glucose-glycerol mixed carbon source remarkably reduced the batch fermentation time of ε-poly-L-lysine (ε-PL) production, leading to higher productivity of both biomass and ε-PL, which is of great significance in industrial microbial fermentation. Our previous study confirmed the positive influence of fast cell growth on ε-PL biosynthesis, while the direct influence of the mixed carbon source on ε-PL production was still unknown. In this work, chemostat culture was employed to study the capacity of ε-PL biosynthesis with different carbon sources at the same dilution rate of 0.05 h-1. The results indicated that the mixed carbon source could enhance ε-PL productivity beyond the effect of rapid cell growth alone. Analysis of key enzymes demonstrated that the activities of phosphoenolpyruvate carboxylase, citrate synthase, aspartokinase and ε-PL synthetase were all increased in chemostat culture with the mixed carbon source. In addition, the carbon fluxes were also improved with the mixed carbon source in terms of the tricarboxylic acid cycle and the anaplerotic and diaminopimelate pathways. Moreover, the mixed carbon source also accelerated energy metabolism, leading to higher levels of energy charge and a higher NADH/NAD+ ratio. The overall improvements of primary metabolism in chemostat culture with the glucose-glycerol combination provided sufficient carbon skeletons and ATP for ε-PL biosynthesis. Therefore, the significantly higher ε-PL productivity with the mixed carbon source was a combined effect of both the superior substrate group and rapid cell growth.
Genome-wide protein-protein interactions and protein function exploration in cyanobacteria
Lv, Qi; Ma, Weimin; Liu, Hui; Li, Jiang; Wang, Huan; Lu, Fang; Zhao, Chen; Shi, Tieliu
2015-01-01
Genome-wide network analysis is well established for studying proteins of unknown function. Here, we explored protein functions and biological mechanisms based on an inferred high-confidence protein-protein interaction (PPI) network in cyanobacteria. We integrated data from seven different sources and predicted 1,997 PPIs, which were evaluated by experiments on molecular mechanisms, text mining of the literature for direct/indirect evidence, and “interologs” for conservation. Combining the predicted PPIs with known PPIs, we obtained 4,715 non-redundant PPIs (involving 3,231 proteins and covering over 90% of the genome) to generate the PPI network. Based on the PPI network, Gene Ontology (GO) terms were assigned to proteins of unknown function. Functional modules were identified by dissecting the PPI network into sub-networks and analyzing pathway enrichment, with which we investigated novel functions of the underlying proteins in protein complexes and pathways. Examples of photosynthesis and DNA repair indicate that the network approach is a powerful tool in protein function analysis. Overall, this systems biology approach provides new insight into downstream functional analysis of PPIs in cyanobacteria. PMID:26490033
An evolutive real-time source inversion based on a linear inverse formulation
NASA Astrophysics Data System (ADS)
Sanchez Reyes, H. S.; Tago, J.; Cruz-Atienza, V. M.; Metivier, L.; Contreras Zazueta, M. A.; Virieux, J.
2016-12-01
Finite source inversion is a stepping stone to unveiling earthquake rupture. It is used in ground motion prediction and its results shed light on the seismic cycle for better tectonic understanding. It is not yet used for quasi-real-time analysis. Nowadays, significant progress has been made on approaches regarding earthquake imaging, thanks to new data acquisition and methodological advances. However, most of these techniques are posterior procedures once seismograms are available. Incorporating source parameter estimation into early warning systems would require updating the source build-up while data are being recorded. In order to go toward this dynamic estimation, we developed a kinematic source inversion formulated in the time domain, for which seismograms are linearly related to the slip distribution on the fault through convolutions with Green's functions previously estimated and stored (Perton et al., 2016). These convolutions are performed in the time domain as we progressively increase the time window of records at each station specifically. The selected unknowns are the spatio-temporal slip-rate distribution, which keeps the forward problem linear with respect to the unknowns, as promoted by Fan and Shearer (2014). Through the spatial extension of the expected rupture zone, we progressively build up the slip rate when adding new data by assuming rupture causality. This formulation is based on the adjoint-state method for efficiency (Plessix, 2006). The inverse problem is non-unique and, in most cases, underdetermined. While standard regularization terms are used for stabilizing the inversion, we avoid strategies based on parameter reduction that would lead to an unwanted non-linear relationship between parameters and seismograms for our progressive build-up. Rise time, rupture velocity and other quantities can be extracted later on as attributes from the slip-rate inversion we perform. Satisfactory results are obtained on a synthetic example (Figure 1) proposed by the Source Inversion Validation project (Mai et al. 2011). A real case application is currently being explored. Our specific formulation, combined with simple prior information, as well as numerical results obtained so far, yields interesting perspectives for a real-time implementation.
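Because the forward problem described above is linear (seismograms are convolutions of stored Green's functions with the slip rate on each fault patch), the core of the set-up can be sketched as a small regularized least-squares problem. The Python sketch below uses synthetic Green's functions, a Tikhonov term, and a non-negativity constraint as a stand-in for slip-rate positivity; it does not reproduce the adjoint-state machinery or the progressive, station-by-station data feeding of the actual method, and all sizes and weights are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.optimize import nnls

# Toy time-domain kinematic inversion: data = sum over patches of
# conv(Green's function, slip-rate), solved as Tikhonov-regularized
# non-negative least squares. Green's functions are synthetic placeholders.

nt, npatch, dt = 120, 3, 0.1
t = np.arange(nt) * dt
rng = np.random.default_rng(0)
greens = [np.exp(-t / 2.0) * np.sin(2 * np.pi * (0.5 + 0.2 * k) * t) for k in range(npatch)]

def conv_matrix(g, n):
    """Lower-triangular Toeplitz matrix C with C @ m == np.convolve(g, m)[:n]."""
    return toeplitz(np.r_[g, np.zeros(n)][:n], np.zeros(n))

A = np.hstack([conv_matrix(g, nt) * dt for g in greens])    # forward operator (nt x npatch*nt)

m_true = np.zeros(npatch * nt)                               # non-negative slip-rate pulses
for k in range(npatch):
    m_true[k * nt + 30 + 10 * k : k * nt + 50 + 10 * k] = 1.0
d = A @ m_true + 0.01 * rng.standard_normal(nt)              # noisy synthetic seismogram

lam = 0.05                                                   # Tikhonov weight (assumed)
A_aug = np.vstack([A, lam * np.eye(npatch * nt)])
d_aug = np.r_[d, np.zeros(npatch * nt)]
m_est, _ = nnls(A_aug, d_aug)                                # slip rate >= 0 enforced
```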
A new component of cosmic rays of unknown origin at a few MeV per nucleon
NASA Technical Reports Server (NTRS)
Gloecker, G.
1974-01-01
Recently discovered anomalies in the abundances and energy spectra of quiet time, extraterrestrial hydrogen, helium, carbon, nitrogen, and oxygen require serious revisions of origin theories to account for this new component of cosmic radiation. Abnormally large O/C and N/C ratios, long term intensity variations with time, and radial gradient measurements indicate a non-solar origin for these 2 to 30 MeV/nucleon particles. Ideas suggested to explain these measurements range from acceleration of galactic source material having an unusual composition to local acceleration of particles within the solar cavity. Observations are at present insufficient to choose between these alternate origin models.
Groundwater Pollution Source Identification using Linked ANN-Optimization Model
NASA Astrophysics Data System (ADS)
Ayaz, Md; Srivastava, Rajesh; Jain, Ashu
2014-05-01
Groundwater is the principal source of drinking water in several parts of the world. Contamination of groundwater has become a serious health and environmental problem today. Human activities, including industrial and agricultural activities, are generally responsible for this contamination. Identification of a groundwater pollution source is a major step in groundwater pollution remediation. Complete knowledge of the pollution source in terms of its source characteristics is essential for adopting an effective remediation strategy. A groundwater pollution source is said to be identified completely when the source characteristics - location, strength and release period - are known. Identification of an unknown groundwater pollution source is an ill-posed inverse problem. It becomes more difficult for real field conditions, when the lag time between the first reading at the observation well and the time at which the source becomes active is not known. We developed a linked ANN-Optimization model for complete identification of an unknown groundwater pollution source. The model comprises two parts - an optimization model and an ANN model. The decision variables of the linked ANN-Optimization model contain the source location and release period of the pollution source. An objective function is formulated using the spatial and temporal data of observed and simulated concentrations, and then minimized to identify the pollution source parameters. In the formulation of the objective function, we require the lag time, which is not known. An ANN model with one hidden layer is trained using the Levenberg-Marquardt algorithm to find the lag time. Different combinations of source locations and release periods are used as inputs and the lag time is obtained as the output. Performance of the proposed model is evaluated for two- and three-dimensional cases with error-free and erroneous data. Erroneous data were generated by adding uniformly distributed random error (error level 0-10%) to the analytically computed concentration values. The main advantage of the proposed model is that it requires only the upper half of the breakthrough curve and is capable of predicting source parameters when the lag time is not known. Linking the ANN model with the proposed optimization model reduces the dimensionality of the decision variables of the optimization model by one and hence reduces its complexity. The results show that our proposed linked ANN-Optimization model is able to predict the source parameters for the error-free data accurately. The proposed model was run several times to obtain the mean, standard deviation and interval estimate of the predicted parameters for observations with random measurement errors. It was observed that mean values as predicted by the model were quite close to the exact values. An increasing trend was observed in the standard deviation of the predicted values with increasing level of measurement error. The model appears to be robust and may be efficiently utilized to solve the inverse pollution source identification problem.
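The optimization half of such a linked model can be illustrated with a toy least-squares objective built from an analytical plume solution. The sketch below assumes a 1-D instantaneous point source with known velocity and dispersion and estimates location, mass, and release time with a global optimizer; the ANN lag-time predictor and the real field set-up are not reproduced, and all parameter values are invented for illustration.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Fit source location x0, mass M and release time t0 of an instantaneous
# point source to observed concentrations from a 1-D advection-dispersion
# analytical solution. Velocity v and dispersion D are assumed known here.

v, D = 1.0, 0.5                      # m/day, m^2/day (assumed aquifer parameters)
x_obs = np.array([30.0, 50.0])       # observation well positions (m)
t_obs = np.arange(5.0, 60.0, 5.0)    # sampling times (days)

def plume(params, x, t):
    x0, M, t0 = params
    tau = np.maximum(t - t0, 1e-6)                       # time since release
    return (M / np.sqrt(4 * np.pi * D * tau)
            * np.exp(-(x - x0 - v * tau) ** 2 / (4 * D * tau)))

true = (5.0, 100.0, 2.0)
rng = np.random.default_rng(1)
obs = np.array([plume(true, x, t_obs) for x in x_obs])
obs += 0.02 * obs.max() * rng.standard_normal(obs.shape)  # ~2% measurement error

def objective(params):                                    # least-squares misfit
    sim = np.array([plume(params, x, t_obs) for x in x_obs])
    return np.sum((sim - obs) ** 2)

result = differential_evolution(objective, bounds=[(0, 20), (10, 500), (0, 10)], seed=0)
print(result.x)                                           # estimated (location, mass, release time)
```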
Maslehaty, Homajoun; Petridis, Athanassios K; Barth, Harald; Doukas, Alexandros; Mehdorn, Hubertus Maximilian
2011-01-01
Spontaneous subarachnoid hemorrhage (SAH) without evidence of a bleeding source on the first digital subtraction angiogram (DSA) - also called SAH of unknown origin - is observed in up to 27% of all cases. Depending on the bleeding pattern on CT scanning, SAH can be differentiated into perimesencephalic (PM-SAH) and non-perimesencephalic SAH (NON-PM-SAH). The aim of our study was to investigate the effectiveness of magnetic resonance imaging (MRI) for detecting a bleeding source in SAH of unknown origin. We retrospectively reviewed 1,226 patients with spontaneous SAH between January 1991 and December 2008 in our department. DSA was performed in 1,068 patients, with negative results in 179 patients. Forty-seven patients were categorized as having PM-SAH and 132 patients as having NON-PM-SAH. MRI of the brain and the craniocervical region was performed within 72 h after diagnosis of SAH and demonstrated no bleeding sources in any of the PM-SAH and NON-PM-SAH patients (100% negative). In our experience MRI did not produce any additional benefit for detecting a bleeding source after SAH with a negative angiogram. The costs of this examination exceeded the clinical value. Despite our results MRI should be discussed on a case-by-case basis because rare bleeding sources are periodically diagnosed in cases of NON-PM-SAH.
Disturbance Source Separation in Shear Flows Using Blind Source Separation Methods
NASA Astrophysics Data System (ADS)
Gluzman, Igal; Cohen, Jacob; Oshman, Yaakov
2017-11-01
A novel approach is presented for identifying disturbance sources in wall-bounded shear flows. The method can prove useful for active control of boundary layer transition from laminar to turbulent flow. The underlying idea is to consider the flow state, as measured in sensors, to be a mixture of sources, and to use Blind Source Separation (BSS) techniques to recover the separate sources and their unknown mixing process. We present a BSS method based on the Degenerate Unmixing Estimation Technique. This method can be used to identify any (a priori unknown) number of sources by using the data acquired by only two sensors. The power of the new method is demonstrated via numerical and experimental proofs of concept. Wind tunnel experiments involving boundary layer flow over a flat plate were carried out, in which two hot-wire anemometers were used to separate disturbances generated by disturbance generators such as a single dielectric barrier discharge plasma actuator and a loudspeaker.
NASA Astrophysics Data System (ADS)
Lee, S. S.; Kim, H. J.; Kim, M. O.; Lee, K.; Lee, K. K.
2016-12-01
A study seeking evidence of remediation in monitoring data collected before and after in-situ intensive remedial action was performed with various quantitative evaluation methods, such as mass discharge analysis, tracer data, statistical trend analysis, and analytical solutions, at a DNAPL-contaminated site in Wonju, Korea. Remediation technologies such as soil vapor extraction, soil flushing, biostimulation, and pump-and-treat have been applied to eliminate the contaminant sources of trichloroethylene (TCE) and to prevent the migration of the TCE plume from remediation target zones. Prior to the remedial action, the concentration and mass discharges of TCE at all transects were affected by seasonal recharge variation and residual DNAPL sources. After the remediation, the effect of remediation was clearly apparent at the main source zone and the industrial complex. By tracing a time series of plume evolution, a greater variation in TCE concentrations was detected in the plumes near the source zones compared to the relatively stable plumes downstream. The amount of residual source mass removed during the intensive remedial action was estimated using an analytical solution to evaluate the efficiency of the action. From the results of this quantitative evaluation, it is assessed that the intensive remedial action was performed effectively, with a removal efficiency of 70% for the residual source mass during the remediation period. Analytical solutions that can consider and quantify the impacts of partial mass reduction have proven to be useful tools for quantifying unknown contaminant source mass, verifying dissolved concentrations at the DNAPL-contaminated site, and evaluating the efficiency of remediation using long-term monitoring data. Acknowledgement: This work was supported by the Korea Ministry of Environment under the "GAIA project (173-092-009) and (201400540010)" and the "R&D Project on Environmental Management of Geologic CO2 Storage" from the KEITI (Project number: 2014001810003).
CO2 Flux From Antarctic Dry Valley Soils: Determining the Source and Environmental Controls
NASA Astrophysics Data System (ADS)
Risk, D. A.; Macintyre, C. M.; Shanhun, F.; Almond, P. C.; Lee, C.; Cary, C.
2014-12-01
Soils within the McMurdo Dry Valleys are known to respire carbon dioxide (CO2), but considerable debate surrounds the contributing sources and mechanisms that drive temporal variability. While some of the CO2 is of biological origin, other known contributors to variability include geochemical sources within, or beneath, the soil column. The relative contribution from each of these sources will depend on seasonal and environmental drivers such as temperature and wind that exert influence on temporal dynamics. To supplement a long term CO2 surface flux monitoring station that has now recorded fluxes over three full annual cycles, in January 2014 an automated flux and depth concentration monitoring system was installed in the Spaulding Pond area of Taylor Valley, along with standard meteorological sensors, to assist in defining source contributions through time. During two weeks of data we observed marked diel variability in CO2 concentrations within the profile (~100 ppm CO2 above or below atmospheric), and of CO2 moving across the soil surface. The pattern at many depths suggested an alternating diel-scale transition from source to sink that seemed clearly correlated with temperature-driven changes in the solubility of CO2 in water films. This CO2 solution storage flux was very highly coupled to soil temperature. A small depth source of unknown origin also appeared to be present. A controlled laboratory soil experiment was conducted to confirm the magnitude of fluxes into and out of soil water films, and confirmed the field results and temperature dependence. Ultimately, this solution storage flux needs to be well understood if the small biological fluxes from these soils are to be properly quantified and monitored for change. Here, we present results from the 2013/2014 field season and these supplementary experiments, placed in the context of 3 year long term continuous measurement of soil CO2 flux within the Dry Valleys.
A Bayesian framework for infrasound location
NASA Astrophysics Data System (ADS)
Modrak, Ryan T.; Arrowsmith, Stephen J.; Anderson, Dale N.
2010-04-01
We develop a framework for location of infrasound events using backazimuth and infrasonic arrival times from multiple arrays. Bayesian infrasonic source location (BISL) developed here estimates event location and associated credibility regions. BISL accounts for unknown source-to-array path or phase by formulating infrasonic group velocity as random. Differences between observed and predicted source-to-array traveltimes are partitioned into two additive Gaussian sources, measurement error and model error, the second of which accounts for the unknown influence of wind and temperature on path. By applying the technique to both synthetic tests and ground-truth events, we highlight the complementary nature of back azimuths and arrival times for estimating well-constrained event locations. BISL is an extension to methods developed earlier by Arrowsmith et al. that provided simple bounds on location using a grid-search technique.
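The flavor of the approach can be conveyed with a small grid-search sketch: Gaussian misfits on back azimuth and travel time are summed over arrays and the maximum of the resulting log-posterior is taken as the location estimate. The array positions, observations, error levels, and nominal celerity below are invented, the origin time is assumed known, and the group-velocity uncertainty is folded into a single travel-time sigma, so this is only a schematic of the idea rather than BISL itself.

```python
import numpy as np

# Grid-search sketch of a Bayesian-style infrasound locator: Gaussian misfits
# on back azimuth and arrival time, planar (x, y) geometry in km. All numbers
# are illustrative assumptions.

arrays = np.array([[0.0, 0.0], [120.0, 10.0], [40.0, 90.0]])   # array positions (km)
obs_baz = np.array([45.0, 135.0, 170.0])                       # observed back azimuths (deg)
obs_t   = np.array([200.0, 180.0, 150.0])                      # observed arrivals (s after origin)
sig_baz, sig_t = 3.0, 20.0                                     # assumed 1-sigma errors
v_group = 0.30                                                 # nominal celerity (km/s)

x = np.linspace(-50, 150, 401)
y = np.linspace(-50, 150, 401)
X, Y = np.meshgrid(x, y)

logpost = np.zeros_like(X)
for (ax, ay), baz, t in zip(arrays, obs_baz, obs_t):
    dx, dy = X - ax, Y - ay
    pred_baz = np.degrees(np.arctan2(dx, dy)) % 360.0          # bearing from array to source
    dbaz = (pred_baz - baz + 180.0) % 360.0 - 180.0            # wrapped azimuth residual
    pred_t = np.hypot(dx, dy) / v_group
    logpost += -0.5 * (dbaz / sig_baz) ** 2 - 0.5 * ((pred_t - t) / sig_t) ** 2

i, j = np.unravel_index(np.argmax(logpost), logpost.shape)
print("MAP location (km):", X[i, j], Y[i, j])
```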
Deconvolution Methods and Systems for the Mapping of Acoustic Sources from Phased Microphone Arrays
NASA Technical Reports Server (NTRS)
Humphreys, Jr., William M. (Inventor); Brooks, Thomas F. (Inventor)
2012-01-01
Mapping coherent/incoherent acoustic sources as determined from a phased microphone array. A linear configuration of equations and unknowns is formed by accounting for a reciprocal influence of one or more cross-beamforming characteristics thereof at varying grid locations among the plurality of grid locations. An equation derived from the linear configuration of equations and unknowns can then be iteratively determined. The equation can be attained by the solution requirement of a constraint equivalent to the physical assumption that the coherent sources have only in-phase coherence. The size of the problem may then be reduced using zoning methods. An optimized noise source distribution is then generated over an identified aeroacoustic source region associated with a phased microphone array (microphones arranged in an optimized grid pattern including a plurality of grid locations) in order to compile an output presentation thereof, thereby removing beamforming characteristics from the resulting output presentation.
Deconvolution methods and systems for the mapping of acoustic sources from phased microphone arrays
NASA Technical Reports Server (NTRS)
Brooks, Thomas F. (Inventor); Humphreys, Jr., William M. (Inventor)
2010-01-01
A method and system for mapping acoustic sources determined from a phased microphone array. A plurality of microphones are arranged in an optimized grid pattern including a plurality of grid locations thereof. A linear configuration of N equations and N unknowns can be formed by accounting for a reciprocal influence of one or more beamforming characteristics thereof at varying grid locations among the plurality of grid locations. A full-rank equation derived from the linear configuration of N equations and N unknowns can then be iteratively determined. Full rank can be attained by the solution requirement of the positivity constraint, equivalent to the physical assumption of statistically independent noise sources at each of the N locations. An optimized noise source distribution is then generated over an identified aeroacoustic source region associated with the phased microphone array in order to compile an output presentation thereof, thereby removing the beamforming characteristics from the resulting output presentation.
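The iterative solution of the N-by-N system with a positivity constraint can be sketched with a projected Gauss-Seidel sweep, in which each grid source strength is updated in turn and clipped at zero. The point-spread matrix below is a synthetic, diagonally dominant placeholder rather than an actual cross-beamforming response, and the grid size and iteration count are arbitrary.

```python
import numpy as np

# Sketch of solving b = A x with x >= 0 by projected Gauss-Seidel sweeps,
# in the spirit of deconvolving a beamform map into grid source strengths.
# Column j of A would be the array response to a unit source at grid point j.

def deconvolve(A, b, n_iter=500):
    """Gauss-Seidel-style sweeps with x >= 0 enforced at every update."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(n_iter):
        for i in range(n):
            r = b[i] - A[i] @ x + A[i, i] * x[i]     # residual excluding the diagonal term
            x[i] = max(r / A[i, i], 0.0)             # positivity: source strengths >= 0
    return x

# toy example: 30 grid points, two true sources, synthetic point-spread matrix
n = 30
grid = np.linspace(0.0, 1.0, n)
A = np.exp(-((grid[:, None] - grid[None, :]) / 0.1) ** 2)   # smooth, diagonally dominant PSF
x_true = np.zeros(n); x_true[8] = 1.0; x_true[21] = 0.5
b = A @ x_true
x_est = deconvolve(A, b)
```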
Dowling, Sally; Pontin, David
2017-01-01
Breastmilk is widely considered as the optimum nutrition source for babies and an important factor in both improving public health and reducing health inequalities. Current international/national policy supports long-term breastfeeding. UK breastfeeding initiation rates are high but rapidly decline, and the numbers breastfeeding in the second year and beyond are unknown. This study used the concept of liminality to explore the experiences of a group of women breastfeeding long-term in the United Kingdom, building on Mahon-Daly and Andrews. Over 80 breastfeeding women were included within the study, which used micro-ethnographic methods (participant observation in breastfeeding support groups, face-to-face interviews and online asynchronous interviews via email). Findings about women's experiences are congruent with the existing literature, although it is mostly dated and from outside the United Kingdom. Liminality was found to be useful in providing insight into women's experiences of long-term breastfeeding in relation to both time and place. Understanding women's experience of breastfeeding beyond current usual norms can be used to inform work with breastfeeding mothers and to encourage more women to breastfeed for longer.
The low-frequency sound power measuring technique for an underwater source in a non-anechoic tank
NASA Astrophysics Data System (ADS)
Zhang, Yi-Ming; Tang, Rui; Li, Qi; Shang, Da-Jing
2018-03-01
In order to determine the radiated sound power of an underwater source below the Schroeder cut-off frequency in a non-anechoic tank, a low-frequency extension measuring technique is proposed. This technique is based on a unique relationship between the transmission characteristics of the enclosed field and those of the free field, which can be obtained as a correction term based on previous measurements of a known simple source. The radiated sound power of an unknown underwater source in the free field can thereby be obtained accurately from measurements in a non-anechoic tank. To verify the validity of the proposed technique, a mathematical model of the enclosed field is established using normal-mode theory, and the relationship between the transmission characteristics of the enclosed and free fields is obtained. The radiated sound power of an underwater transducer source is tested in a glass tank using the proposed low-frequency extension measuring technique. Compared with the free field, the radiated sound power level of the narrowband spectrum deviation is found to be less than 3 dB, and the 1/3 octave spectrum deviation is found to be less than 1 dB. The proposed testing technique can be used not only to extend the low-frequency applications of non-anechoic tanks, but also for measurement of radiated sound power from complicated sources in non-anechoic tanks.
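A simplified, band-by-band reading of the correction-term idea is sketched below: the known reference source gives, for each frequency band, the difference between its free-field sound power level and the level inferred in the tank, and that difference is applied to the unknown source measured in the same tank. All numbers are illustrative placeholders; the paper derives the correction rigorously from normal-mode theory rather than from a simple level difference.

```python
import numpy as np

# Band-by-band application of a tank correction term derived from a known
# reference source. Levels below are invented for illustration only.

bands_hz = np.array([100, 125, 160, 200, 250])               # 1/3-octave band centres
Lw_ref_free = np.array([92.0, 93.5, 95.0, 96.2, 97.0])       # known source, free field (dB)
Lw_ref_tank = np.array([98.3, 97.1, 99.8, 95.5, 99.2])       # known source, measured in tank (dB)
Lw_unk_tank = np.array([101.0, 100.2, 103.4, 99.1, 102.6])   # unknown source in tank (dB)

correction = Lw_ref_free - Lw_ref_tank                       # tank-to-free-field correction term
Lw_unk_free = Lw_unk_tank + correction                       # estimated free-field power levels
for f, lw in zip(bands_hz, Lw_unk_free):
    print(f"{f:4d} Hz : {lw:5.1f} dB")
```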
DEVELOPMENT AND EVALUATION OF PM 2.5 SOURCE APPORTIONMENT METHODOLOGIES
The receptor model called Positive Matrix Factorization (PMF) has been extensively used to apportion sources of ambient fine particulate matter (PM2.5), but the accuracy of source apportionment results currently remains unknown. In addition, air quality forecast model...
Fission meter and neutron detection using poisson distribution comparison
Rowland, Mark S; Snyderman, Neal J
2014-11-18
A neutron detector system and method for discriminating fissile material from non-fissile material wherein a digital data acquisition unit collects data at high rate, and in real-time processes large volumes of data directly into information that a first responder can use to discriminate materials. The system comprises counting neutrons from the unknown source and detecting excess grouped neutrons to identify fission in the unknown source. Comparison of the observed neutron count distribution with a Poisson distribution is performed to distinguish fissile material from non-fissile material.
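The Poisson-comparison idea can be sketched numerically: bin arrival times into short gates and compare the variance-to-mean ratio of the gate counts with the Poisson value of one, since correlated (grouped) neutrons from fission chains push the ratio above one. The arrival times below are simulated with arbitrary rates and a fixed burst multiplicity, not detector data.

```python
import numpy as np

# Compare the count distribution of simulated neutron arrivals against the
# Poisson expectation: excess variance ~ 0 for a random source, > 0 for a
# source emitting grouped (fission-chain-like) neutrons.

rng = np.random.default_rng(0)
T, rate, gate = 100.0, 50.0, 0.01          # total time (s), counts/s, gate width (s)

def excess_variance(times):
    counts, _ = np.histogram(times, bins=np.arange(0.0, T + gate, gate))
    return counts.var() / counts.mean() - 1.0   # 0 for Poisson, > 0 for correlated counts

# non-fissile-like source: pure Poisson arrivals
poisson_times = np.sort(rng.uniform(0.0, T, rng.poisson(rate * T)))

# fissile-like source: bursts of three neutrons arriving close together
n_bursts = rng.poisson(rate * T / 3)
burst_t = rng.uniform(0.0, T, n_bursts)
fissile_times = np.sort(np.repeat(burst_t, 3) + rng.exponential(1e-4, 3 * n_bursts))

print("Poisson-like excess:", excess_variance(poisson_times))   # ~0
print("Fissile-like excess:", excess_variance(fissile_times))   # clearly > 0
```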
Li, Yongming; Tong, Shaocheng
The problem of active fault-tolerant control (FTC) is investigated for the large-scale nonlinear systems in nonstrict-feedback form. The nonstrict-feedback nonlinear systems considered in this paper consist of unstructured uncertainties, unmeasured states, unknown interconnected terms, and actuator faults (e.g., bias fault and gain fault). A state observer is designed to solve the unmeasurable state problem. Neural networks (NNs) are used to identify the unknown lumped nonlinear functions so that the problems of unstructured uncertainties and unknown interconnected terms can be solved. By combining the adaptive backstepping design principle with the combination Nussbaum gain function property, a novel NN adaptive output-feedback FTC approach is developed. The proposed FTC controller can guarantee that all signals in all subsystems are bounded, and the tracking errors for each subsystem converge to a small neighborhood of zero. Finally, numerical results of practical examples are presented to further demonstrate the effectiveness of the proposed control strategy.
2004-06-23
[Table fragment: diseases listed with unknown source, including cholera (Vibrio cholerae), Salmonella Typhimurium, and typhoid fever (Salmonella Typhi), disseminated by contamination of food or drink. Cholera occurs in many of the developing countries of Africa and Asia. Partial references: diseaseinfo/cholera_g.htm; Health Canada Material Safety Data Sheet - Infectious Substances for Vibrio cholerae, http://www.hc-sc.gc.ca]
Organic carbon sources and sinks in San Francisco Bay: variability induced by river flow
Jassby, Alan D.; Powell, T.M.; Cloern, James E.
1993-01-01
Sources and sinks of organic carbon for San Francisco Bay (California, USA) were estimated for 1980. Sources for the southern reach were dominated by phytoplankton and benthic microalgal production. River loading of organic matter was an additional important factor in the northern reach. Tidal marsh export and point sources played a secondary role. Autochthonous production in San Francisco Bay appears to be less than the mean for temperate-zone estuaries, primarily because turbidity limits microalgal production and the development of seagrass beds. Exchange between the Bay and Pacific Ocean plays an unknown but potentially important role in the organic carbon balance. Interannual variability in the organic carbon supply was assessed for Suisun Bay, a northern reach subembayment that provides habitat for important fish species (delta smelt Hypomesus transpacificus and larval striped bass Morone saxatilus). The total supply fluctuated by an order of magnitude; depending on the year, either autochthonous sources (phytoplankton production) or allochthonous sources (riverine loading) could be dominant. The primary cause of the year-to-year change was variability of freshwater inflows from the Sacramento and San Joaquin rivers, and its magnitude was much larger than long-term changes arising from marsh destruction and point source decreases. Although interannual variability of the total organic carbon supply could not be assessed for the southern reach, year-to-year changes in phytoplankton production were much smaller than in Suisun Bay, reflecting a relative lack of river influence.
Association of Internet search trends with suicide death in Taipei City, Taiwan, 2004-2009.
Yang, Albert C; Tsai, Shi-Jen; Huang, Norden E; Peng, Chung-Kang
2011-07-01
Although the Internet has become an important source for affected people seeking suicide information, the connection between Internet searches for suicide information and suicide death remains largely unknown. This study aims to evaluate the association between suicide and Internet search trends for 37 suicide-related terms representing major known risks of suicide. This study analyzes suicide death data in Taipei City, Taiwan, and corresponding local Internet search trend data provided by Google Insights for Search during the period from January 2004 to December 2009. The investigation uses cross-correlation analysis to estimate the temporal relationship between suicide and Internet search trends and multiple linear regression analysis to identify significant factors associated with suicide from a pool of search trend data that either coincide with or precede the suicide deaths. Results show that a set of suicide-related search terms, the trends of which either temporally coincided with or preceded trends in the suicide data, was associated with suicide death. These search factors varied among different suicide samples. Searches for "major depression" and "divorce" accounted for, at most, 30.2% of the variance in suicide data. When considering only leading suicide trends, searches for "divorce" and the pro-suicide term "complete guide of suicide" accounted for 22.7% of the variance in suicide data. Appropriate filtering and detection of potentially harmful sources in keyword-driven search results by search engine providers may be a reasonable strategy to reduce suicide deaths. Copyright © 2011 Elsevier B.V. All rights reserved.
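The two analysis steps named above, lagged cross-correlation followed by regression on coincident or leading search terms, can be sketched on synthetic monthly series as below. The series, the two-month lead, and the noise level are invented stand-ins for the Google Insights and mortality data.

```python
import numpy as np

# (1) lagged Pearson correlation between a search-trend series and an outcome
# series, (2) ordinary least squares on the best leading trend term.
# Both series are synthetic placeholders.

rng = np.random.default_rng(0)
n = 72                                               # six years of monthly data
trend = rng.normal(size=n).cumsum()                  # synthetic search-trend series
suicide = 0.6 * np.roll(trend, 2) + rng.normal(scale=0.5, size=n)  # trend leads by 2 months

def lagged_corr(x, y, lag):
    """Pearson correlation of x leading y by `lag` months."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    return np.corrcoef(x, y)[0, 1]

for lag in range(0, 7):
    print(f"lag {lag} months: r = {lagged_corr(trend, suicide, lag):.2f}")

# simple OLS with the best-leading trend term as predictor
lag = 2
X = np.column_stack([np.ones(n - lag), trend[:-lag]])
beta, *_ = np.linalg.lstsq(X, suicide[lag:], rcond=None)
print("regression coefficients:", beta)
```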
First principles cable braid electromagnetic penetration model
Warne, Larry Kevin; Langston, William L.; Basilio, Lorena I.; ...
2016-01-01
The model for penetration of a wire braid is rigorously formulated. Integral formulas are developed from energy principles for both self and transfer immittances in terms of potentials for the fields. The detailed boundary value problem for the wire braid is also set up in a very efficient manner; the braid wires act as sources for the potentials in the form of a sequence of line multi-poles with unknown coefficients that are determined by means of conditions arising from the wire surface boundary conditions. Approximations are introduced to relate the local properties of the braid wires to a simplified infinite periodic planar geometry. Furthermore, this is used to treat nonuniform coaxial geometries including eccentric interior coaxial arrangements and an exterior ground plane.
Radiation hazard during a manned mission to Mars.
Jäkel, Oliver
2004-01-01
The radiation hazard of interplanetary flights is currently one of the major obstacles to manned missions to Mars. Highly energetic, heavy-charged particles from galactic cosmic radiation cannot be sufficiently shielded in space vehicles. The long-term radiation effects of these particles on humans are largely unknown. In addition, unpredictable storms of solar particles may expose the crew to doses that lead to acute radiation effects. A manned flight to Mars currently seems to be a high-risk adventure. This article provides an overview of the radiation sources and risks for a crew on a manned flight to Mars, as currently estimated by scientists of the US National Aeronautics and Space Administration (NASA) and the Space Studies Board (SSB) of the US National Research Council.
Dosso, Stan E; Wilmut, Michael J; Nielsen, Peter L
2010-07-01
This paper applies Bayesian source tracking in an uncertain environment to Mediterranean Sea data, and investigates the resulting tracks and track uncertainties as a function of data information content (number of data time-segments, number of frequencies, and signal-to-noise ratio) and of prior information (environmental uncertainties and source-velocity constraints). To track low-level sources, acoustic data recorded for multiple time segments (corresponding to multiple source positions along the track) are inverted simultaneously. Environmental uncertainty is addressed by including unknown water-column and seabed properties as nuisance parameters in an augmented inversion. Two approaches are considered: Focalization-tracking maximizes the posterior probability density (PPD) over the unknown source and environmental parameters. Marginalization-tracking integrates the PPD over environmental parameters to obtain a sequence of joint marginal probability distributions over source coordinates, from which the most-probable track and track uncertainties can be extracted. Both approaches apply track constraints on the maximum allowable vertical and radial source velocity. The two approaches are applied for towed-source acoustic data recorded at a vertical line array at a shallow-water test site in the Mediterranean Sea where previous geoacoustic studies have been carried out.
Johnson, Raymond H.; DeWitt, Ed; Wirt, Laurie; Arnold, L. Rick; Horton, John D.
2011-01-01
The National Park Service (NPS) seeks additional information to better understand the source(s) of groundwater and associated groundwater flow paths to Montezuma Well in Montezuma Castle National Monument, central Arizona. The source of water to Montezuma Well, a flowing sinkhole in a desert setting, is poorly understood. Water emerges from the middle limestone facies of the lacustrine Verde Formation, but the precise origin of the water and its travel path are largely unknown. Some have proposed artesian flow to Montezuma Well through the Supai Formation, which is exposed along the eastern margin of the Verde Valley and underlies the Verde Formation. The groundwater recharge zone likely lies above the floor of the Verde Valley somewhere to the north or east of Montezuma Well, where precipitation is more abundant. Additional data from groundwater, surface water, and bedrock geology are required for Montezuma Well and the surrounding region to test the current conceptual ideas, to provide new details on the groundwater flow in the area, and to assist in future management decisions. The results of this research will provide information for long-term water resource management and the protection of water rights.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bond, P.A.
1993-03-01
The global geochemical cycle for an element tracks its path from its various sources to its sinks via processes of weathering and transportation. The cycle may then be quantified in a necessarily approximate manner. The geochemical cycle (thus quantified) reveals constraints (known and unknown) on an element's behavior imposed by the various processes which act on it. In the context of a global geochemical cycle, a continent becomes essentially a source term. If, however, an element's behavior is examined in a local or regional context, sources and their related sinks may be identified. This suggests that small-scale geochemical cycles may be superimposed on global geochemical cycles. Definition of such sub-cycles may clarify the distribution of an element in the earth's near-surface environment. In Florida, phosphate minerals of the Hawthorn Group act as a widely distributed source of uranium. Uranium is transported by surface- and ground-waters. Florida is the site of extensive wetlands and peatlands. The organic matter associated with these deposits adsorbs uranium and may act as a local sink depending on its hydrogeologic setting. This work examines the role of organic matter in the distribution of uranium in the surface and shallow subsurface environments of central and north Florida.
Climate-driven spatial dynamics of plague among prairie dog colonies.
Snäll, T; O'Hara, R B; Ray, C; Collinge, S K
2008-02-01
We present a Bayesian hierarchical model for the joint spatial dynamics of a host-parasite system. The model was fitted to long-term data on regional plague dynamics and metapopulation dynamics of the black-tailed prairie dog, a declining keystone species of North American prairies. The rate of plague transmission between colonies increases with increasing precipitation, while the rate of infection from unknown sources decreases in response to hot weather. The mean annual dispersal distance of plague is about 10 km, and topographic relief reduces the transmission rate. Larger colonies are more likely to become infected, but colony area does not affect the infectiousness of colonies. The results suggest that prairie dog movements do not drive the spread of plague through the landscape. Instead, prairie dogs are useful sentinels of plague epizootics. Simulations suggest that this model can be used for predicting long-term colony and plague dynamics as well as for identifying which colonies are most likely to become infected in a specific year.
Electromagnetic torques in the core and resonant excitation of decadal polar motion
NASA Astrophysics Data System (ADS)
Mound, Jon E.
2005-02-01
Motion of the rotation axis of the Earth contains decadal variations with amplitudes on the order of 10 mas. The origin of these decadal polar motions is unknown. A class of rotational normal modes of the core-mantle system termed torsional oscillations are known to affect the length of day (LOD) at decadal periods and have also been suggested as a possible excitation source for the observed decadal polar motion. Torsional oscillations involve relative motion between the outer core and the surrounding solid bodies, producing electromagnetic torques at the inner-core boundary (ICB) and core-mantle boundary (CMB). It has been proposed that the ICB torque can explain the excitation of the approximately 30-yr-period polar motion termed the Markowitz wobble. This paper uses the results of a torsional oscillation model to calculate the torques generated at Markowitz and other decadal periods and finds, in contrast to previous results, that electromagnetic torques at the ICB can not explain the observed polar motion.
Whole-plant adjustments in coconut (Cocos nucifera) in response to sink-source imbalance.
Mialet-Serra, I; Clement-Vidal, A; Roupsard, O; Jourdan, C; Dingkuhn, M
2008-08-01
Coconut (Cocos nucifera L.) is a perennial tropical monocotyledon that produces fruit continuously. The physiological function of the large amounts of sucrose stored in coconut stems is unknown. To test the hypothesis that reserve storage and mobilization enable the crop to adjust to variable sink-source relationships at the scale of the whole plant, we investigated the dynamics of dry matter production, yield and yield components, and concentrations of nonstructural carbohydrate reserves in a coconut plantation on Vanuatu Island in the South Pacific. Two treatments were implemented continuously over 29 months (April 2002 to August 2004): 50% leaf pruning (to reduce the source) and 100% fruit and inflorescence pruning (to reduce the sink). The pruning treatments had little effect on carbohydrate reserves because they affected only petioles, not the main reserve pool in the stem. Both pruning treatments greatly reduced dry matter production of the reproductive compartment, but vegetative growth and development were negligibly affected by treatment and season. Leaf pruning increased radiation-use efficiency (RUE) initially, and fruit pruning greatly reduced RUE throughout the experiment. Changes in RUE were negatively correlated with leaflet soluble sugar concentration, indicating feedback inhibition of photosynthesis. We conclude that vegetative development and growth of coconut show little phenotypic plasticity, assimilate demand for growth being largely independent of a fluctuating assimilate supply. The resulting sink-source imbalances were partly compensated for by transitory reserves and, more importantly, by variable RUE in the short term, and by adjustment of fruit load in the long term. Possible physiological mechanisms are discussed, as well as modeling concepts that may be applied to coconut and similar tree crops.
2015-01-01
Targeted environmental monitoring reveals contamination by known chemicals, but may exclude potentially pervasive but unknown compounds. Marine mammals are sentinels of persistent and bioaccumulative contaminants due to their longevity and high trophic position. Using nontargeted analysis, we constructed a mass spectral library of 327 persistent and bioaccumulative compounds identified in blubber from two ecotypes of common bottlenose dolphins (Tursiops truncatus) sampled in the Southern California Bight. This library of halogenated organic compounds (HOCs) consisted of 180 anthropogenic contaminants, 41 natural products, 4 with mixed sources, 8 with unknown sources, and 94 with partial structural characterization and unknown sources. The abundance of compounds whose structures could not be fully elucidated highlights the prevalence of undiscovered HOCs accumulating in marine food webs. Eighty-six percent of the identified compounds are not currently monitored, including 133 known anthropogenic chemicals. Compounds related to dichlorodiphenyltrichloroethane (DDT) were the most abundant. Natural products were, in some cases, detected at abundances similar to anthropogenic compounds. The profile of naturally occurring HOCs differed between ecotypes, suggesting more abundant offshore sources of these compounds. This nontargeted analytical framework provided a comprehensive list of HOCs that may be characteristic of the region, and its application within monitoring surveys may suggest new chemicals for evaluation. PMID:25526519
Source rock geochemical and visual kerogen data from cuttings (2,520-8,837')
Alaska Division of Geological & Geophysical Surveys (DGGS), GMC 249
1995
Distributed control system for parallel-connected DC boost converters
Goldsmith, Steven
2017-08-15
The disclosed invention is a distributed control system for operating a DC bus fed by disparate DC power sources that service a known or unknown load. The voltage sources vary in v-i characteristics and have time-varying, maximum supply capacities. Each source is connected to the bus via a boost converter, which may have different dynamic characteristics and power transfer capacities, but are controlled through PWM. The invention tracks the time-varying power sources and apportions their power contribution while maintaining the DC bus voltage within the specifications. A central digital controller solves the steady-state system for the optimal duty cycle settings that achieve a desired power supply apportionment scheme for a known or predictable DC load. A distributed networked control system is derived from the central system that utilizes communications among controllers to compute a shared estimate of the unknown time-varying load through shared bus current measurements and bus voltage measurements.
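A steady-state reading of the apportionment problem is sketched below for ideal, lossless boost converters in continuous conduction, where the duty cycle follows from the standard relation V_bus = V_in/(1 - D) and the assigned power fraction fixes the source and bus currents. The source names, voltages, fractions, and load are illustrative; enforcing the apportionment in practice requires the closed-loop control the patent describes rather than this open-loop calculation.

```python
# Steady-state sketch for parallel boost converters on a shared DC bus,
# assuming ideal lossless converters in continuous conduction mode.
# All numbers below are illustrative assumptions.

V_BUS = 48.0                      # regulated bus voltage (V)
P_LOAD = 600.0                    # known load power (W)
sources = {                       # name: (terminal voltage V, apportionment fraction)
    "battery":   (24.0, 0.50),
    "pv":        (30.0, 0.30),
    "fuel_cell": (20.0, 0.20),
}

for name, (v_in, alpha) in sources.items():
    p_i = alpha * P_LOAD                      # power assigned to this source
    i_out = p_i / V_BUS                       # current delivered to the bus
    i_in = p_i / v_in                         # current drawn from the source (lossless)
    duty = 1.0 - v_in / V_BUS                 # ideal CCM boost relation V_bus = V_in/(1-D)
    print(f"{name:9s}  D = {duty:.2f}  I_in = {i_in:5.1f} A  I_out = {i_out:4.1f} A")
```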
NASA Astrophysics Data System (ADS)
Kovalets, Ivan V.; Efthimiou, George C.; Andronopoulos, Spyros; Venetsanos, Alexander G.; Argyropoulos, Christos D.; Kakosimos, Konstantinos E.
2018-05-01
In this work, we present an inverse computational method for the identification of the location, start time, duration and quantity of emitted substance of an unknown air pollution source of finite time duration in an urban environment. We considered a problem of transient pollutant dispersion under stationary meteorological fields, which is a reasonable assumption for the assimilation of available concentration measurements within 1 h from the start of an incident. We optimized the calculation of the source-receptor function by developing a method which requires integrating as many backward adjoint equations as the available measurement stations. This resulted in high numerical efficiency of the method. The source parameters are computed by maximizing the correlation function of the simulated and observed concentrations. The method has been integrated into the CFD code ADREA-HF and it has been tested successfully by performing a series of source inversion runs using the data of 200 individual realizations of puff releases, previously generated in a wind tunnel experiment.
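Once the source-receptor sensitivities are available from the backward adjoint runs, the identification step reduces to scanning candidate locations and start times for the best correlation with the observed concentrations and then scaling to recover the emission rate. The sketch below works on a synthetic sensitivity array with invented dimensions; release duration and all CFD specifics are ignored.

```python
import numpy as np

# srs[l, t0, s, k]: response at station s, measurement time k to a unit-rate
# release starting at time t0 from candidate cell l. Synthetic here; in
# practice it would come from the adjoint integrations (one per station).

rng = np.random.default_rng(0)
n_loc, n_t0, n_sta, n_obs = 20, 10, 4, 30
srs = rng.gamma(2.0, 1.0, size=(n_loc, n_t0, n_sta, n_obs))

true_loc, true_t0, true_q = 7, 3, 50.0
y = true_q * srs[true_loc, true_t0] + 0.5 * rng.standard_normal((n_sta, n_obs))

def corr(a, b):
    """Correlation between simulated and observed concentration series."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

best = max(((l, t0) for l in range(n_loc) for t0 in range(n_t0)),
           key=lambda lt: corr(srs[lt[0], lt[1]], y))
l_hat, t0_hat = best
c = srs[l_hat, t0_hat].ravel()
q_hat = (c @ y.ravel()) / (c @ c)                 # least-squares emission rate
print("location", l_hat, "start", t0_hat, "rate", round(q_hat, 1))
```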
Analysis of Ribosome Inactivating Protein (RIP): A Bioinformatics Approach
NASA Astrophysics Data System (ADS)
Jothi, G. Edward Gnana; Majilla, G. Sahaya Jose; Subhashini, D.; Deivasigamani, B.
2012-10-01
In spite of the medical advances in recent years, the world is in need of different sources to address certain health issues. Ribosome Inactivating Proteins (RIPs) were found to be one among them. In order to provide easy access to information about RIPs, there is a need to analyse RIPs towards constructing a database on RIPs. Also, multiple sequence alignment was carried out towards screening for homologues of significant RIPs from rare sources against RIPs from easily available sources in terms of similarity. Protein sequences were retrieved from SWISS-PROT and further analysed using pairwise and multiple sequence alignment. The analysis shows that 151 RIPs have been characterized to date. Amongst them, there are 87 type I, 37 type II, 1 type III and 25 unknown RIPs. Sequence length information for the various RIPs, indicating the availability of full or partial sequences, was also recorded. The multiple sequence alignment of 37 type I RIPs using the online server Multalin indicates the presence of 20 conserved residues. Pairwise alignment and multiple sequence alignment of certain selected RIPs in two groups, namely Group I and Group II, were carried out and the consensus levels were found to be 98%, 98% and 90%, respectively.
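The two quantities reported above, fully conserved alignment columns and pairwise percent identity, are simple to compute once sequences are aligned. The toy sketch below uses short placeholder aligned sequences rather than real RIP entries from SWISS-PROT or a Multalin alignment.

```python
# Toy calculation of conserved columns in a multiple alignment and pairwise
# percent identity. The aligned sequences below are invented placeholders.

aligned = [
    "MKV-LLTAGWSV",
    "MKVALLSAGWSV",
    "MKV-LLTAGYSV",
]

def conserved_columns(seqs):
    """Columns where every sequence has the same non-gap residue."""
    return [i for i in range(len(seqs[0]))
            if "-" not in {s[i] for s in seqs} and len({s[i] for s in seqs}) == 1]

def percent_identity(a, b):
    """Identity over columns where neither aligned sequence has a gap."""
    pairs = [(x, y) for x, y in zip(a, b) if x != "-" and y != "-"]
    return 100.0 * sum(x == y for x, y in pairs) / len(pairs)

print("conserved columns:", conserved_columns(aligned))
print("identity 1 vs 2 : %.1f%%" % percent_identity(aligned[0], aligned[1]))
```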
NASA Technical Reports Server (NTRS)
Lakota, Barbara Anne
1998-01-01
This thesis develops a method to model the acoustic field generated by a monopole source placed in a moving rectangular duct. The walls of the duct are assumed to be infinitesimally thin and the source is placed at the center of the duct. The total acoustic pressure is written in terms of the free-space pressure, or incident pressure, and the scattered pressure. The scattered pressure is the augmentation to the incident pressure due to the presence of the duct. It satisfies a homogeneous wave equation and is discontinuous across the duct walls. Utilizing an integral representation of the scattered pressure, a set of singular boundary integral equations governing the unknown jump in scattered pressure is derived. This equation is solved by the method of collocation after representing the jump in pressure as a double series of shape functions. The solution obtained is then substituted back into the integral representation to determine the scattered pressure, and the total acoustic pressure at any point in the field. A few examples are included to illustrate the influence of various geometric and kinematic parameters on the radiated sound field.
Photoprotection in sequestered plastids of sea slugs and respective algal sources
Cruz, Sónia; Cartaxana, Paulo; Newcomer, Rebecca; Dionísio, Gisela; Calado, Ricardo; Serôdio, João; Pelletreau, Karen N.; Rumpho, Mary E.
2015-01-01
Some sea slugs are capable of retaining functional sequestered chloroplasts (kleptoplasts) for variable periods of time. The mechanisms supporting the maintenance of these organelles in animal hosts are still largely unknown. Non-photochemical quenching (NPQ) and the occurrence of a xanthophyll cycle were investigated in the sea slugs Elysia viridis and E. chlorotica using chlorophyll fluorescence measurements and pigment analysis. The photoprotective capacity of kleptoplasts was compared to that observed in their respective algal source, Codium tomentosum and Vaucheria litorea. A functional xanthophyll cycle and a rapidly reversible NPQ component were found in V. litorea and E. chlorotica but not in C. tomentosum and E. viridis. To our knowledge, this is the first report of the absence of a functional xanthophyll cycle in a green macroalgae. The absence of a functional xanthophyll cycle in C. tomentosum could contribute to the premature loss of photosynthetic activity and relatively short-term retention of kleptoplasts in E. viridis. On the contrary, E. chlorotica displays one of the longest functional examples of kleptoplasty known so far. We speculate that different efficiencies of photoprotection and repair mechanisms of algal food sources play a role in the longevity of photosynthetic activity in kleptoplasts retained by sea slugs. PMID:25601025
Electromagnetic Field Penetration Studies
NASA Technical Reports Server (NTRS)
Deshpande, M.D.
2000-01-01
A numerical method is presented to determine the electromagnetic shielding effectiveness of a rectangular enclosure with apertures in its walls used for input and output connections, control panels, visual-access windows, ventilation panels, etc. Expressing EM fields in terms of the cavity Green's function inside the enclosure and the free-space Green's function outside the enclosure, integral equations with aperture tangential electric fields as unknown variables are obtained by enforcing the continuity of tangential electric and magnetic fields across the apertures. Using the Method of Moments, the integral equations are solved for the unknown aperture fields. From these aperture fields, the EM field inside the rectangular enclosure due to external electromagnetic sources is determined. Numerical results on electric field shielding of a rectangular cavity with a thin rectangular slot obtained using the present method are compared with the results obtained using a simple transmission line technique for code validation. The present technique is applied to determine field penetration inside a Boeing-757 by approximating its passenger cabin as a rectangular cavity filled with a homogeneous medium and its passenger windows by rectangular apertures. Preliminary results for two windows, one on each side of the fuselage, were considered. Numerical results for the Boeing-757 at frequencies of 26 MHz, 171-175 MHz, and 428-432 MHz are presented.
37 CFR 260.7 - Unknown copyright owners.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 37 Patents, Trademarks, and Copyrights 1 2013-07-01 2013-07-01 false Unknown copyright owners. 260.7 Section 260.7 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR PREEXISTING SUBSCRIPTION SERVICES' DIGITAL...
37 CFR 260.7 - Unknown copyright owners.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 37 Patents, Trademarks, and Copyrights 1 2012-07-01 2012-07-01 false Unknown copyright owners. 260.7 Section 260.7 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR PREEXISTING SUBSCRIPTION SERVICES' DIGITAL...
37 CFR 260.7 - Unknown copyright owners.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Unknown copyright owners. 260.7 Section 260.7 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR PREEXISTING SUBSCRIPTION SERVICES' DIGITAL...
37 CFR 260.7 - Unknown copyright owners.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Unknown copyright owners. 260.7 Section 260.7 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR PREEXISTING SUBSCRIPTION SERVICES' DIGITAL...
Li, Yongming; Tong, Shaocheng
2017-06-28
In this paper, an adaptive neural network (NN)-based decentralized control scheme with prescribed performance is proposed for uncertain switched nonstrict-feedback interconnected nonlinear systems. It is assumed that the nonlinear interconnected terms and nonlinear functions of the concerned systems are unknown, and that the switching signals are unknown and arbitrary. A linear state estimator is constructed to handle the unmeasured states. NNs are employed to approximate the unknown interconnected terms and nonlinear functions. A new output-feedback decentralized control scheme is developed using the adaptive backstepping design technique. The control design problem of nonlinear interconnected switched systems with unknown switching signals can be solved by the proposed scheme, and only one tuning parameter is needed for each subsystem. The proposed scheme ensures that all variables of the control systems are semi-globally uniformly ultimately bounded and that the tracking errors converge to a small residual set within the prescribed performance bound. The effectiveness of the proposed control approach is verified by simulation results.
NASA Astrophysics Data System (ADS)
Cantelli, A.; D'Orta, F.; Cattini, A.; Sebastianelli, F.; Cedola, L.
2015-08-01
A computational model is developed for retrieving the positions and emission rates of unknown pollution sources, under steady-state conditions, starting from measurements of the pollutant concentrations. The approach is based on the minimization of a fitness function using a genetic algorithm paradigm. The model is tested on both pollutant concentrations generated with a Gaussian model at 25 points in a 3-D test-case domain (1000 m × 1000 m × 50 m) and on experimental data, namely the Prairie Grass field experiment, in which about 600 receptors were located along five concentric semicircular arcs, and the Fusion Field Trials 2007. The results show that the computational model is capable of efficiently retrieving up to three different unknown sources.
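As a rough illustration of the optimization loop described above (and not the authors' implementation), the sketch below uses a hand-rolled genetic algorithm and a deliberately crude Gaussian-plume forward model as placeholders; every function name and numerical value here is hypothetical:

```python
# Minimal, illustrative sketch (not the authors' code): a plain genetic algorithm
# recovers one unknown source (x, y, q) by minimizing the misfit between
# measured and modelled concentrations. The crude Gaussian-plume forward model
# and all parameter values are placeholders for illustration only.
import numpy as np

rng = np.random.default_rng(0)
receptors = rng.uniform([0, 0], [1000, 1000], size=(25, 2))            # 25 receptor locations [m]
lo, hi = np.array([0.0, 0.0, 0.1]), np.array([1000.0, 1000.0, 50.0])   # bounds on (x, y, q)

def plume(src):
    """Very crude ground-level Gaussian-plume concentration (placeholder physics)."""
    x0, y0, q = src
    dx = np.maximum(receptors[:, 0] - x0, 1.0)      # downwind distance, clipped
    sy, sz, u = 0.08 * dx, 0.06 * dx, 3.0           # dispersion coefficients, wind speed
    dy = receptors[:, 1] - y0
    return q / (np.pi * u * sy * sz) * np.exp(-0.5 * (dy / sy) ** 2)

true_src = np.array([400.0, 600.0, 12.0])
measured = plume(true_src)                          # synthetic "measurements"

def fitness(src):
    return np.sum((plume(src) - measured) ** 2)

pop = rng.uniform(lo, hi, size=(60, 3))             # initial population
for gen in range(200):
    scores = np.array([fitness(p) for p in pop])
    elite = pop[np.argsort(scores)[:20]]            # selection: keep the best third
    parents = elite[rng.integers(0, 20, size=(40, 2))]
    w = rng.random((40, 1))
    children = w * parents[:, 0] + (1 - w) * parents[:, 1]         # blend crossover
    children += rng.normal(0.0, 0.02, children.shape) * (hi - lo)  # mutation
    pop = np.clip(np.vstack([elite, children]), lo, hi)

best = pop[np.argmin([fitness(p) for p in pop])]
print("recovered source (x, y, q):", best)
```

The forward model, domain, and genetic operators in the paper differ; the point here is only the structure of the approach: simulate candidate sources, score them against the measured concentrations, and evolve the population toward the minimum-misfit source parameters.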
1. Drop Structure on the Arizona Crosscut Canal. Photographer unknown, ...
1. Drop Structure on the Arizona Crosscut Canal. Photographer unknown, no date. Note that caption is incorrect: in relation to Camelback Mountain (rear), this can only be the Old Crosscut. Source: reprinted from the 13th Annual Report of the U.S. Geological Survey, 1893. - Old Crosscut Canal, North Side of Salt River, Phoenix, Maricopa County, AZ
The characteristics and impact of source of infection on sepsis-related ICU outcomes.
Jeganathan, Niranjan; Yau, Stephen; Ahuja, Neha; Otu, Dara; Stein, Brian; Fogg, Louis; Balk, Robert
2017-10-01
Source of infection is an independent predictor of sepsis-related mortality. To date, studies have failed to evaluate differences in septic patients based on the source of infection. Retrospective study of all patients with sepsis admitted to the ICU of a university hospital within a 12-month period. Sepsis due to an intravascular device or multiple sources had the highest rates of positive blood cultures and microbiology, whereas lung and abdominal sepsis had the lowest. The observed hospital mortality was highest for sepsis due to multiple sources or an unknown cause, and was lowest when due to abdominal, genitourinary (GU) or skin/soft tissue sources. Patients with sepsis due to lung, unknown and multiple sources had the highest rates of multi-organ failure, whereas those with sepsis due to GU and skin/soft tissue sources had the lowest rates. Those with multisource sepsis had a significantly higher median ICU length of stay and hospital cost. There are significant differences in patient characteristics, microbiology positivity, organs affected, mortality, length of stay and cost based on the source of sepsis. These differences should be considered in future studies to be able to deliver personalized care. Copyright © 2017 Elsevier Inc. All rights reserved.
Constrained Null Space Component Analysis for Semiblind Source Separation Problem.
Hwang, Wen-Liang; Lu, Keng-Shih; Ho, Jinn
2018-02-01
The blind source separation (BSS) problem extracts unknown sources from observations of their unknown mixtures. A current trend in BSS is the semiblind approach, which incorporates prior information on the sources or on how the sources are mixed. The constrained independent component analysis (ICA) approach has been studied as a way to impose constraints on the well-known ICA framework. We introduce an alternative approach based on the null space component analysis (NCA) framework, referred to as the c-NCA approach. We also present the c-NCA algorithm, which uses signal-dependent semidefinite operators (bilinear mappings) as signatures for operator design in the c-NCA approach. Theoretically, we show that the source estimation of the c-NCA algorithm converges, with a convergence rate that depends on the decay of the sequence obtained by applying the estimated operators to the corresponding sources. The c-NCA can be formulated as a deterministic constrained optimization method and can thus take advantage of solvers developed in the optimization community for solving the BSS problem. As examples, we demonstrate that electroencephalogram interference rejection problems can be solved by the c-NCA with proximal splitting algorithms, by incorporating a sparsity-enforcing separation model and considering the case when reference signals are available.
17. Photographic copy of photograph. Location unknown but assumed to ...
17. Photographic copy of photograph. Location unknown but assumed to be uper end of canal. Features no longer extant. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. MAIN (TITLED FLORENCE) CANAL, WASTEWAY, SLUICEWAY, & BRIDGE, 1/26/25. - San Carlos Irrigation Project, Marin Canal, Amhurst-Hayden Dam to Picacho Reservoir, Coolidge, Pinal County, AZ
... term Narrowed blood vessels; enlarged pupils; increased body temperature, heart rate, and blood pressure; headache; abdominal pain ... unconsciousness, slowed heart rate and breathing, lower body temperature, seizures, coma, death. Long-term Unknown. Other Health- ...
NASA Astrophysics Data System (ADS)
Kurudirek, M.; Medhat, M. E.
2014-07-01
An alternative approach is used to measure normalized mass attenuation coefficients (μ/ρ) of materials with unknown thickness and density. The adopted procedure is based on the use of simultaneous emission of Kα and Kβ X-ray lines as well as gamma peaks from radioactive sources in transmission geometry. 109Cd and 60Co radioactive sources were used for the purpose of the investigation. It has been observed that using the simultaneous X- and/or gamma rays of different energy allows accurate determination of relative mass attenuation coefficients by eliminating the dependence of μ/ρ on thickness and density of the material.
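In generic notation (not necessarily the authors'), the idea can be stated compactly: for a sample of unknown thickness t and density ρ, the transmitted intensity at each photon energy E obeys the Beer-Lambert law, so the unknown product ρt cancels when two energies transmitted simultaneously through the same sample are compared:

```latex
\[
  I_E = I_{0,E}\,\exp\!\big[-(\mu/\rho)_E\,\rho t\big]
  \quad\Longrightarrow\quad
  \frac{(\mu/\rho)_{E_1}}{(\mu/\rho)_{E_2}}
  = \frac{\ln\!\big(I_{0,E_1}/I_{E_1}\big)}{\ln\!\big(I_{0,E_2}/I_{E_2}\big)},
\]
```

which is why simultaneously emitted Kα/Kβ lines or gamma peaks allow relative (normalized) mass attenuation coefficients to be determined without knowing the thickness or density of the material.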
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warne, Larry K.; Langston, William L.; Basilio, Lorena I.
The model for penetration of a wire braid is rigorously formulated. Integral formulas are developed from energy principles and reciprocity for both self and transfer immittances in terms of potentials for the fields. The detailed boundary value problem for the wire braid is also set up in a very efficient manner; the braid wires act as sources for the potentials in the form of a sequence of line multipoles with unknown coefficients that are determined by means of conditions arising from the wire surface boundary conditions. Approximations are introduced to relate the local properties of the braid wires to a simplified infinite periodic planar geometry. This is used in a simplified application of reciprocity to be able to treat nonuniform coaxial geometries including eccentric interior coaxial arrangements and an exterior ground plane.
Huang, Yao-Ting; Chen, Jia-Min; Ho, Bing-Ching; Wu, Zong-Yen; Kuo, Rita C; Liu, Po-Yu
2018-01-01
Stenotrophomonas acidaminiphila is an aerobic, glucose non-fermentative, Gram-negative bacterium that has been isolated from various environmental sources, particularly aquatic ecosystems. Although resistance to multiple antimicrobial agents has been reported in S. acidaminiphila, the mechanisms are largely unknown. Here, for the first time, we report the complete genome and antimicrobial resistome analysis of a clinical isolate, S. acidaminiphila SUNEO, which is resistant to sulfamethoxazole. Comparative analysis among closely related strains identified common and strain-specific genes. In particular, comparison with a sulfamethoxazole-sensitive strain identified a mutation within the sulfonamide-binding site of folP in SUNEO, which may reduce the binding affinity of sulfamethoxazole. Selection pressure analysis indicated that folP in SUNEO is under purifying selection, which may be owing to long-term administration of sulfonamides against Stenotrophomonas.
Cardin, Jessica A; Raksin, Jonathan N; Schmidt, Marc F
2005-04-01
Sensorimotor integration in the avian song system is crucial for both learning and maintenance of song, a vocal motor behavior. Although a number of song system areas demonstrate both sensory and motor characteristics, their exact roles in auditory and premotor processing are unclear. In particular, it is unknown whether input from the forebrain nucleus interface of the nidopallium (NIf), which exhibits both sensory and premotor activity, is necessary for both auditory and premotor processing in its target, HVC. Here we show that bilateral NIf lesions result in long-term loss of HVC auditory activity but do not impair song production. NIf is thus a major source of auditory input to HVC, but an intact NIf is not necessary for motor output in adult zebra finches.
NASA Astrophysics Data System (ADS)
Cascio, David M.
1988-05-01
States of nature or observed data are often stochastically modelled as Gaussian random variables. At times it is desirable to transmit this information from a source to a destination with minimal distortion. Complicating this objective is the possible presence of an adversary attempting to disrupt this communication. In this report, solutions are provided to a class of minimax and maximin decision problems, which involve the transmission of a Gaussian random variable over a communications channel corrupted by both additive Gaussian noise and probabilistic jamming noise. The jamming noise is termed probabilistic in the sense that with nonzero probability 1-P, the jamming noise is prevented from corrupting the channel. We shall seek to obtain optimal linear encoder-decoder policies which minimize given quadratic distortion measures.
NASA Astrophysics Data System (ADS)
Kuhlman, K. L.; Neuman, S. P.
2006-12-01
Furman and Neuman (2003) proposed a Laplace Transform Analytic Element Method (LT-AEM) for transient groundwater flow. LT-AEM applies the traditional steady-state AEM to the Laplace-transformed groundwater flow equation, and back-transforms the resulting solution to the time domain using a Fourier series numerical inverse Laplace transform method (de Hoog et al., 1982). We have extended the method so it can compute hydraulic head and flow velocity distributions due to any two-dimensional combination and arrangement of point, line, circular and elliptical area sinks and sources, nested circular or elliptical regions having different hydraulic properties, and areas of specified head, flux or initial condition. The strengths of all sinks and sources, and the specified head and flux values, can all vary in both space and time in an independent and arbitrary fashion. Initial conditions may vary from one area element to another. A solution is obtained by matching heads and normal fluxes along the boundary of each element. The effect that each element has on the total flow is expressed in terms of generalized Fourier series which converge rapidly (<20 terms) in most cases. As there are more matching points than unknown Fourier terms, the matching is accomplished in Laplace space using least-squares. The method is illustrated by calculating the resulting transient head and flow velocities due to an arrangement of elements in both finite and infinite domains. The 2D LT-AEM elements already developed and implemented are currently being extended to solve the 3D groundwater flow equation.
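As a toy illustration of the numerical back-transform step (not the authors' code, and using the simpler Gaver-Stehfest algorithm as a stand-in for the de Hoog Fourier-series method cited above), the following sketch inverts a Laplace-space function with a known time-domain answer:

```python
# Illustrative sketch only: LT-AEM needs a numerical inverse Laplace transform to
# return Laplace-space heads to the time domain. As a simple stand-in for the
# de Hoog method, this implements Gaver-Stehfest and checks it on
# F(s) = 1/(s + 1)  <->  f(t) = exp(-t).
import math

def stehfest(F, t, N=12):
    """Gaver-Stehfest inversion of the Laplace transform F(s) at time t (N even)."""
    ln2 = math.log(2.0)
    total = 0.0
    for k in range(1, N + 1):
        Vk = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            Vk += (j ** (N // 2) * math.factorial(2 * j)
                   / (math.factorial(N // 2 - j) * math.factorial(j)
                      * math.factorial(j - 1) * math.factorial(k - j)
                      * math.factorial(2 * j - k)))
        Vk *= (-1) ** (k + N // 2)
        total += Vk * F(k * ln2 / t)
    return total * ln2 / t

for t in (0.5, 1.0, 2.0):
    print(t, stehfest(lambda s: 1.0 / (s + 1.0), t), math.exp(-t))
```

In LT-AEM the function F would be the Laplace-space head assembled from the analytic elements, evaluated at whichever s values the chosen inversion algorithm requires.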
Functional Annotation of the Arabidopsis Genome Using Controlled Vocabularies
Berardini, Tanya Z.; Mundodi, Suparna; Reiser, Leonore; Huala, Eva; Garcia-Hernandez, Margarita; Zhang, Peifen; Mueller, Lukas A.; Yoon, Jungwoon; Doyle, Aisling; Lander, Gabriel; Moseyko, Nick; Yoo, Danny; Xu, Iris; Zoeckler, Brandon; Montoya, Mary; Miller, Neil; Weems, Dan; Rhee, Seung Y.
2004-01-01
Controlled vocabularies are increasingly used by databases to describe genes and gene products because they facilitate identification of similar genes within an organism or among different organisms. One of The Arabidopsis Information Resource's goals is to associate all Arabidopsis genes with terms developed by the Gene Ontology Consortium that describe the molecular function, biological process, and subcellular location of a gene product. We have also developed terms describing Arabidopsis anatomy and developmental stages and use these to annotate published gene expression data. As of March 2004, we used computational and manual annotation methods to make 85,666 annotations representing 26,624 unique loci. We focus on associating genes to controlled vocabulary terms based on experimental data from the literature and use The Arabidopsis Information Resource-developed PubSearch software to facilitate this process. Each annotation is tagged with a combination of evidence codes, evidence descriptions, and references that provide a robust means to assess data quality. Annotation of all Arabidopsis genes will allow quantitative comparisons between sets of genes derived from sources such as microarray experiments. The Arabidopsis annotation data will also facilitate annotation of newly sequenced plant genomes by using sequence similarity to transfer annotations to homologous genes. In addition, complete and up-to-date annotations will make unknown genes easy to identify and target for experimentation. Here, we describe the process of Arabidopsis functional annotation using a variety of data sources and illustrate several ways in which this information can be accessed and used to infer knowledge about Arabidopsis and other plant species. PMID:15173566
41. Photocopy of progress photograph ca. 1974, photographer unknown. Original ...
41. Photocopy of progress photograph ca. 1974, photographer unknown. Original photograph Property of United States Air Force, 21" Space Command. This is the source for views 41 to 47. CAPE COD AIR STATION PAVE PAWS FACILITY - SHOWING BUILDING "RED IRON" STEEL STRUCTURE NEARING COMPLETION. - Cape Cod Air Station, Technical Facility-Scanner Building & Power Plant, Massachusetts Military Reservation, Sandwich, Barnstable County, MA
Vein networks in hydrothermal systems provide constraints for the monitoring of active volcanoes.
Cucci, Luigi; Di Luccio, Francesca; Esposito, Alessandra; Ventura, Guido
2017-03-10
Vein networks affect the hydrothermal systems of many volcanoes, and variations in their arrangement may precede hydrothermal and volcanic eruptions. However, the long-term evolution of vein networks is often unknown because data are lacking. We analyze two gypsum-filled vein networks affecting the hydrothermal field of the active volcanic island of Lipari (Italy) to reconstruct the dynamics of the hydrothermal processes. The older network (E1) consists of sub-vertical, N-S striking veins; the younger network (E2) consists of veins without a preferred strike and dip. E2 veins have larger aperture/length, fracture density, dilatancy, and finite extension than E1. The fluid overpressure of E2 is larger than that of E1 veins, whereas the hydraulic conductance is lower. The larger number of fracture intersections in E2 slows down the fluid movement, and favors fluid interference effects and pressurization. Depths of the E1 and E2 hydrothermal sources are 0.8 km and 4.6 km, respectively. The decrease in the fluid flux, the depth of the hydrothermal source, and the pressurization increase in E2 are likely associated with a magma reservoir. The decrease of fluid discharge in hydrothermal fields may reflect pressurization at depth potentially preceding hydrothermal explosions. This has significant implications for the long-term monitoring strategy of volcanoes.
Using an epiphytic moss to identify previously unknown sources of atmospheric cadmium pollution
Geoffrey H. Donovan; Sarah E. Jovan; Demetrios Gatziolis; Igor Burstyn; Yvonne L. Michael; Michael C. Amacher; Vicente J. Monleon
2016-01-01
Urban networks of air-quality monitors are often too widely spaced to identify sources of air pollutants, especially if they do not disperse far from emission sources. The objectives of this study were to test the use of moss bio-indicators to develop a fine-scale map of atmospherically-derived cadmium and to identify the sources of cadmium in a complex urban setting....
Lagrange constraint neural network for audio varying BSS
NASA Astrophysics Data System (ADS)
Szu, Harold H.; Hsu, Charles C.
2002-03-01
The Lagrange Constraint Neural Network (LCNN) is a statistical-mechanical ab initio model that does not assume an artificial neural network (ANN) model at all but is derived from first principles of the Hamilton and Lagrange methodology: H(S, A) = f(S) − λ·C(S, A(x,t)), which incorporates the measurement constraint λ·C(S, A(x,t)) = λ·([A]S − X) + (λ_0 − 1)(Σ_i s_i − 1) using the vector Lagrange multiplier λ and the a priori Shannon entropy f(S) = −Σ_i s_i log s_i as the contrast function of an unknown number of independent sources s_i. Szu et al. first solved, in 1997, the general Blind Source Separation (BSS) problem for a spatially and temporally varying mixing matrix in real-world remote sensing, where a large pixel footprint implies that the mixing matrix [A(x,t)] is necessarily filled with diurnal and seasonal variations. Because ground truth is difficult to ascertain in remote sensing, we illustrate in this paper each step of the LCNN algorithm for simulated spatially and temporally varying BSS in speech and music audio mixing. We review and compare LCNN with the popular a posteriori maximum entropy methodologies, defined by an ANN weight matrix [W] with sigmoid post-processing, H(Y = σ([W]X)), of Bell-Sejnowski, Amari and Oja (BSAO), known as Independent Component Analysis (ICA). The two are mirror-symmetric MaxEnt methodologies and both work for a constant unknown mixing matrix [A], but the major difference is whether the ensemble average is taken over neighborhood pixel data X (in BSAO) or over the a priori source variables S (in LCNN); this dictates which method works for a spatially and temporally varying [A(x,t)], which does not allow the neighborhood pixel average. We expect sharper de-mixing by the LCNN method, as examined in a controlled ground-truth experiment simulating a varying mixture of two pieces of music of similar kurtosis (15 seconds composed of Saint-Saëns' Swan and a Rachmaninoff cello concerto).
Ayvaz, M Tamer
2010-09-20
This study proposes a linked simulation-optimization model for solving unknown groundwater pollution source identification problems. In the proposed model, the MODFLOW and MT3DMS packages are used to simulate the flow and transport processes in the groundwater system. These models are then integrated with an optimization model based on the heuristic harmony search (HS) algorithm. In the proposed simulation-optimization model, the locations and release histories of the pollution sources are treated as the explicit decision variables and determined through the optimization model. In addition, an implicit solution procedure is proposed to determine the optimum number of pollution sources, which is an advantage of this model. The performance of the proposed model is evaluated on two hypothetical examples covering simple and complex aquifer geometries, measurement error conditions, and different HS solution parameter sets. The results indicate that the proposed simulation-optimization model is effective and may be used to solve inverse pollution source identification problems. Copyright (c) 2010 Elsevier B.V. All rights reserved.
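A minimal sketch of the harmony search heuristic itself is shown below; this is not the paper's linked MODFLOW/MT3DMS model, and the linear response matrix, bounds, and parameter values are placeholders used only to show how HS improvises and updates candidate release rates:

```python
# Illustrative harmony search (HS) sketch: HS minimizes a least-squares misfit
# between "observed" concentrations and a linear response model C = A @ q,
# where the unknown vector q plays the role of source release rates.
# A and the true q are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((30, 4))                       # synthetic source-receptor responses
q_true = np.array([5.0, 0.0, 2.5, 8.0])       # "true" release rates
c_obs = A @ q_true                            # observed concentrations

def misfit(q):
    return np.sum((A @ q - c_obs) ** 2)

lo, hi = 0.0, 10.0
HMS, HMCR, PAR, bw, iters = 20, 0.9, 0.3, 0.5, 5000
memory = rng.uniform(lo, hi, size=(HMS, 4))   # harmony memory
scores = np.array([misfit(h) for h in memory])

for _ in range(iters):
    new = np.empty(4)
    for i in range(4):
        if rng.random() < HMCR:               # pick from memory ...
            new[i] = memory[rng.integers(HMS), i]
            if rng.random() < PAR:            # ... with optional pitch adjustment
                new[i] += rng.uniform(-bw, bw)
        else:                                 # or draw a fresh random value
            new[i] = rng.uniform(lo, hi)
    new = np.clip(new, lo, hi)
    worst = np.argmax(scores)
    if misfit(new) < scores[worst]:           # replace the worst harmony
        memory[worst], scores[worst] = new, misfit(new)

print("recovered release rates:", memory[np.argmin(scores)])
```

The HMCR, PAR and bandwidth values here play the role of the "HS solution parameter sets" whose influence the study evaluates.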
Microseismic imaging using a source function independent full waveform inversion method
NASA Astrophysics Data System (ADS)
Wang, Hanchen; Alkhalifah, Tariq
2018-07-01
At the heart of microseismic monitoring is the task of estimating the locations of microseismic source events, as well as their ignition times. The accuracy of locating the sources is highly dependent on the velocity model. Conventional microseismic source-locating methods, moreover, often require manual picking of traveltime arrivals, which not only demands manual effort and human interaction but is also prone to errors. Using full waveform inversion (FWI) to locate and image microseismic events allows for an automatic, picking-free process that utilizes the full wavefield. However, FWI of microseismic events faces severe nonlinearity due to the unknown source locations (space) and source functions (time). We developed a source-function-independent FWI of microseismic events to invert for the source image, the source function and the velocity model. It is based on convolving reference traces with the observed and modelled data to mitigate the effect of an unknown source ignition time. The adjoint-state method is used to derive the gradients for the source image, source function and velocity updates. The extended image for the source wavelet along the Z axis is extracted to check the accuracy of the inverted source image and velocity model. Angle gathers are also calculated to assess the quality of the long-wavelength component of the velocity model. By inverting for the source image, source wavelet and velocity model simultaneously, the proposed method produces good estimates of the source location, ignition time and background velocity for the synthetic examples used here, such as those corresponding to the Marmousi model and the SEG/EAGE overthrust model.
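Source-independent objectives of this kind are commonly written in the following form (generic notation; the exact functional used in the paper may differ): with observed data d, modelled data u, and a chosen reference trace (subscript "ref") from each gather,

```latex
\[
  J \;=\; \tfrac{1}{2}\sum_{\mathrm{shots}}\;\sum_{\mathrm{receivers}}
  \big\lVert\, d \ast u_{\mathrm{ref}} \;-\; u \ast d_{\mathrm{ref}} \,\big\rVert_2^2 ,
\]
```

where ∗ denotes convolution in time. Because the unknown source time function enters both convolved terms in the same way, it cancels from the misfit, which is what removes the dependence on the unknown ignition time.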
CLOTHES AS A SOURCE OF PARTICLES CONTRIBUTING TO THE "PERSONAL CLOUD"
Previous studies such as EPA's PTEAM Study have documented increased personal exposures to particles compared to either indoor or outdoor concentrations--a finding that bas been characterized as a "personal cloud." The sources of the personal cloud are unknown, but co...
Identifying Attributes of CO2 Leakage Zones in Shallow Aquifers Using a Parametric Level Set Method
NASA Astrophysics Data System (ADS)
Sun, A. Y.; Islam, A.; Wheeler, M.
2016-12-01
Leakage through abandoned wells and geologic faults poses the greatest risk to CO2 storage permanence. For shallow aquifers, secondary CO2 plumes emanating from the leak zones may go undetected for a sustained period of time and have the greatest potential to cause large-scale and long-term environmental impacts. Identification of the attributes of leak zones, including their shape, location, and strength, is required for proper environmental risk assessment. This study applies a parametric level set (PaLS) method to characterize the leakage zone. Level set methods are appealing for tracking topological changes and recovering unknown shapes of objects. However, level set evolution using the conventional level set methods is challenging. In PaLS, the level set function is approximated using a weighted sum of basis functions, and the level set evolution problem is replaced by an optimization problem. The efficacy of PaLS is demonstrated by recovering the source zone created by CO2 leakage into a carbonate aquifer. Our results show that PaLS is a robust source identification method that can recover the approximate source locations in the presence of measurement errors, model parameter uncertainty, and inaccurate initial guesses of the source flux strengths. The PaLS inversion framework introduced in this work is generic and can be adapted for any reactive transport model by switching the pre- and post-processing routines.
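A typical PaLS parameterization (shown here in generic notation; the basis functions and parameters used in the paper may differ) replaces the full level-set field with a low-dimensional set of parameters:

```latex
\[
  \phi(\mathbf{x}) \;=\; \sum_{j=1}^{J} \alpha_j\,
  \psi\!\big(\lVert \beta_j\,(\mathbf{x}-\boldsymbol{\chi}_j) \rVert\big),
  \qquad
  \Omega_{\mathrm{leak}} \;=\; \{\mathbf{x} : \phi(\mathbf{x}) \ge c\},
\]
```

so the shape, location, and extent of the leakage zone are recovered by optimizing the weights α_j, widths β_j, and centres χ_j of the radial basis functions rather than by evolving φ directly, which is what turns level-set evolution into an ordinary optimization problem.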
Detection of X-ray flares from AX J1714.1-3912, the unidentified source near RX J1713.7-3946
NASA Astrophysics Data System (ADS)
Miceli, Marco; Bamba, Aya
2018-04-01
Context: Molecular clouds are predicted to emit nonthermal X-rays when they are close to particle-accelerating supernova remnants (SNRs), and the hard X-ray source AX J1714.1-3912, near the SNR RX J1713.7-3946, has long been considered a candidate for diffuse nonthermal emission associated with cosmic rays diffusing from the remnant to a nearby molecular cloud. Aims: We aim to ascertain the nature of this source by analyzing two dedicated X-ray observations performed with Suzaku and Chandra. Methods: We extracted images in various energy bands, spectra, and light curves from the data and studied the long-term evolution of the X-ray emission on the basis of the 4.5 yr time separation between the two observations. Results: We found that there is no diffuse emission associated with AX J1714.1-3912, which is instead the point-like source CXOU J171343.9-391205. We discovered rapid time variability (timescale ~10^3 s), together with high intrinsic absorption and a hard nonthermal spectrum (power law with photon index Γ ≈ 1.4). We also found that the X-ray flux of the source drops by 1-2 orders of magnitude on a timescale of a few years. Conclusions: Our results suggest a possible association between AX J1714.1-3912 and a previously unknown supergiant fast X-ray transient, although further follow-up observations are necessary to prove this association definitively.
The Stellar Cusp in the Galactic Center: Three-Dimensional Orbits of Stars
NASA Astrophysics Data System (ADS)
Chappell, Samantha; Ghez, Andrea M.; Boehle, Anna; Yelda, Sylvana; Sitarski, Breann; Witzel, Gunther; Do, Tuan; Lu, Jessica R.; Morris, Mark; Becklin, Eric E.
2015-01-01
We present new findings from our long-term study of the nuclear star cluster around the Galaxy's central supermassive black hole (SMBH). Measurements were made using speckle and laser-guided adaptive optics imaging and integral field spectroscopy on the Keck telescopes. We report 13 new measurable accelerating sources around the SMBH, down to ~17 mag in K band, only 4 of which are known to be young stars; the rest are either known to be old stars or have yet to be spectrally typed. Thus we more than double the number of measured accelerations for the known old stars and the population of unknown spectral type (increasing the number from 6 to 15). Previous observations suggest a flat density profile of late-type stars, contrary to the theorized Bahcall-Wolf cusp (Bahcall & Wolf 1976, 1977; Buchholz et al. 2009; Do et al. 2009; Bartko et al. 2010). With three-dimensional orbits of significantly accelerating sources, we will be able to better characterize the stellar cusp in the Galactic center, including the slope of the stellar density profile.
Fire in the vein: Heroin acidity and its proximal effect on users’ health
Ciccarone, Daniel; Harris, Magdalena
2016-01-01
The loss of functioning veins (venous sclerosis) is a root cause of suffering for long-term heroin injectors. In addition to perpetual frustration and loss of pleasure/esteem, venous sclerosis leads to myriad medical consequences including skin infections, for example, abscess, and possibly elevated HIV/HCV risks due to injection into larger jugular and femoral veins. The etiology of venous sclerosis is unknown and users’ perceptions of cause/meaning unexplored. This commentary stems from our hypothesis that venous sclerosis is causally related to heroin acidity, which varies by heroin source-form and preparation. We report pilot-study data on the first-ever in vivo measurements of heroin pH, as well as qualitative data on users’ concerns and perceptions regarding the caustic nature of heroin and its effects. Heroin pH testing in natural settings is feasible and a useful tool for further research. Our preliminary findings, for example, that different heroin source-forms and preparations have a two-log difference in acidity, have potentially broad, vital and readily implementable harm reduction implications. PMID:26077143
Fire in the vein: Heroin acidity and its proximal effect on users' health.
Ciccarone, Daniel; Harris, Magdalena
2015-11-01
The loss of functioning veins (venous sclerosis) is a root cause of suffering for long-term heroin injectors. In addition to perpetual frustration and loss of pleasure/esteem, venous sclerosis leads to myriad medical consequences including skin infections, for example, abscess, and possibly elevated HIV/HCV risks due to injection into larger jugular and femoral veins. The etiology of venous sclerosis is unknown and users' perceptions of cause/meaning unexplored. This commentary stems from our hypothesis that venous sclerosis is causally related to heroin acidity, which varies by heroin source-form and preparation. We report pilot-study data on the first-ever in vivo measurements of heroin pH, as well as qualitative data on users' concerns and perceptions regarding the caustic nature of heroin and its effects. Heroin pH testing in natural settings is feasible and a useful tool for further research. Our preliminary findings, for example, that different heroin source-forms and preparations have a two-log difference in acidity, have potentially broad, vital and readily implementable harm reduction implications. Copyright © 2015. Published by Elsevier B.V.
Inverse scattering for an exterior Dirichlet program
NASA Technical Reports Server (NTRS)
Hariharan, S. I.
1981-01-01
Scattering due to a metallic cylinder which is in the field of a wire carrying a periodic current is considered. The location and shape of the cylinder are obtained from a far-field measurement taken between the wire and the cylinder. The same analysis is applicable in acoustics in the situation where the cylinder is a soft-walled body and the wire is a line source. The associated direct problem in this situation is an exterior Dirichlet problem for the Helmholtz equation in two dimensions. An improved low-frequency estimate for the solution of this problem using integral equation methods is presented. The far-field measurements are related to the solutions of boundary integral equations in the low-frequency regime. These solutions are expressed in terms of a mapping function which maps the exterior of the unknown curve onto the exterior of a unit disk. The coefficients of the Laurent expansion of the conformal transformation are related to the far-field coefficients. The first far-field coefficient leads to the calculation of the distance between the source and the cylinder.
Campylobacteriosis - an overview.
Sarkar, S R; Hossain, M A; Paul, S K; Ray, N C; Sultana, S; Rahman, M M; Islam, A
2014-01-01
Campylobacteriosis is a collective term used for the infectious, emerging foodborne disease caused by Campylobacter species, which are Gram-negative, curved, microaerophilic pathogens. The true incidence of human campylobacteriosis is unknown for most countries of the world, including Bangladesh, but campylobacteriosis is not uncommon in our country. Due to its increasing incidence in many countries of the world, it is an important issue nowadays. Animals such as birds are the main sources of infection. Farm animals such as cattle and poultry are commonly infected from such sources, and raw milk and undercooked or poorly handled meat become contaminated. Transmission of campylobacteriosis to humans occurs through consumption of infected, unpasteurized animal milk and milk products, undercooked poultry, and contaminated drinking water. Contact with contaminated poultry, livestock or household pets, especially puppies, can also cause disease. Due to the variability of clinical features and the limited availability of laboratory facilities, the disease remains largely under-reported. Early and specific diagnosis is important to ensure a favourable outcome of this foodborne disease. Antibiotic treatment is controversial and has a benefit only on the duration of symptoms. Campylobacter infections can be prevented by some simple hygienic food-handling practices.
Fajardo, Geroncio C.; Posid, Joseph; Papagiotas, Stephen; Lowe, Luis
2015-01-01
There have been periodic electronic news media reports of potential bioterrorism-related incidents involving unknown substances (often referred to as “white powder”) since the 2001 intentional dissemination of Bacillus anthracis through the US Postal System. This study reviewed the number of unknown “white powder” incidents reported online by the electronic news media and compared them with unknown “white powder” incidents reported to the US Centers for Disease Control and Prevention (CDC) and the US Federal Bureau of Investigation (FBI) during a two-year period from June 1, 2009 and May 31, 2011. Results identified 297 electronic news media reports, 538 CDC reports, and 384 FBI reports of unknown “white powder.” This study showed different unknown “white powder” incidents captured by each of the three sources. However, the authors could not determine the public health implications of this discordance. PMID:25420771
ERIC Educational Resources Information Center
Geva, R.; Eshel, R.; Leitner, Y.; Fattal-Valevski, A.; Harel, S.
2008-01-01
Background: Recent reports showed that children born with intrauterine growth restriction (IUGR) are at greater risk of experiencing verbal short-term memory span (STM) deficits that may impede their learning capacities at school. It is still unknown whether these deficits are modality dependent. Methods: This long-term, prospective design study…
Discovering Peripheral Arterial Disease Cases from Radiology Notes Using Natural Language Processing
Savova, Guergana K.; Fan, Jin; Ye, Zi; Murphy, Sean P.; Zheng, Jiaping; Chute, Christopher G.; Kullo, Iftikhar J.
2010-01-01
As part of the Electronic Medical Records and Genomics Network, we applied, extended and evaluated an open source clinical Natural Language Processing system, Mayo’s Clinical Text Analysis and Knowledge Extraction System, for the discovery of peripheral arterial disease cases from radiology reports. The manually created gold standard consisted of 223 positive, 19 negative, 63 probable and 150 unknown cases. Overall accuracy agreement between the system and the gold standard was 0.93 as compared to a named entity recognition baseline of 0.46. Sensitivity for the positive, probable and unknown cases was 0.93–0.96, and for the negative cases was 0.72. Specificity and negative predictive value for all categories were in the 90’s. The positive predictive value for the positive and unknown categories was in the high 90’s, for the negative category was 0.84, and for the probable category was 0.63. We outline the main sources of errors and suggest improvements. PMID:21347073
8 CFR 341.2 - Examination upon application.
Code of Federal Regulations, 2010 CFR
2010-01-01
... claimed is precluded by reason of death, refusal to testify, unknown whereabouts, advanced age, mental or physical incapacity, or severe illness or infirmity, another witness or witnesses shall be produced. A... as the relationship between the claimant and the citizen source or sources; the citizenship of the...
8 CFR 341.2 - Examination upon application.
Code of Federal Regulations, 2011 CFR
2011-01-01
... claimed is precluded by reason of death, refusal to testify, unknown whereabouts, advanced age, mental or physical incapacity, or severe illness or infirmity, another witness or witnesses shall be produced. A... as the relationship between the claimant and the citizen source or sources; the citizenship of the...
(U) An Analytic Examination of Piezoelectric Ejecta Mass Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tregillis, Ian Lee
2017-02-02
Ongoing efforts to validate a Richtmyer-Meshkov instability (RMI) based ejecta source model [1, 2, 3] in LANL ASC codes use ejecta areal masses derived from piezoelectric sensor data [4, 5, 6]. However, the standard technique for inferring masses from sensor voltages implicitly assumes instantaneous ejecta creation [7], which is not a feature of the RMI source model. To investigate the impact of this discrepancy, we define separate “areal mass functions” (AMFs) at the source and sensor in terms of typically unknown distribution functions for the ejecta particles, and derive an analytic relationship between them. Then, for the case of single-shock ejection into vacuum, we use the AMFs to compare the analytic (or “true”) accumulated mass at the sensor with the value that would be inferred from piezoelectric voltage measurements. We confirm the inferred mass is correct when creation is instantaneous, and furthermore prove that when creation is not instantaneous, the inferred values will always overestimate the true mass. Finally, we derive an upper bound for the error imposed on a perfect system by the assumption of instantaneous ejecta creation. When applied to shots in the published literature, this bound is frequently less than several percent.
Reduced flight-to-light behaviour of moth populations exposed to long-term urban light pollution
Ebert, Dieter
2016-01-01
The globally increasing light pollution is a well-recognized threat to ecosystems, with negative effects on human, animal and plant wellbeing. The most well-known and widely documented consequence of light pollution is the generally fatal attraction of nocturnal insects to artificial light sources. However, the evolutionary consequences are unknown. Here we report that moth populations from urban areas with high, globally relevant levels of light pollution over several decades show a significantly reduced flight-to-light behaviour compared with populations of the same species from pristine dark-sky habitats. Using a common garden setting, we reared moths from 10 different populations from early-instar larvae and experimentally compared their flight-to-light behaviour under standardized conditions. Moths from urban populations had a significant reduction in the flight-to-light behaviour compared with pristine populations. The reduced attraction to light sources of ‘city moths' may directly increase these individuals' survival and reproduction. We anticipate that it comes with a reduced mobility, which negatively affects foraging as well as colonization ability. As nocturnal insects are of eminent significance as pollinators and the primary food source of many vertebrates, an evolutionary change of the flight-to-light behaviour thereby potentially cascades across species interaction networks. PMID:27072407
Reduced flight-to-light behaviour of moth populations exposed to long-term urban light pollution.
Altermatt, Florian; Ebert, Dieter
2016-04-01
The globally increasing light pollution is a well-recognized threat to ecosystems, with negative effects on human, animal and plant wellbeing. The most well-known and widely documented consequence of light pollution is the generally fatal attraction of nocturnal insects to artificial light sources. However, the evolutionary consequences are unknown. Here we report that moth populations from urban areas with high, globally relevant levels of light pollution over several decades show a significantly reduced flight-to-light behaviour compared with populations of the same species from pristine dark-sky habitats. Using a common garden setting, we reared moths from 10 different populations from early-instar larvae and experimentally compared their flight-to-light behaviour under standardized conditions. Moths from urban populations had a significant reduction in the flight-to-light behaviour compared with pristine populations. The reduced attraction to light sources of 'city moths' may directly increase these individuals' survival and reproduction. We anticipate that it comes with a reduced mobility, which negatively affects foraging as well as colonization ability. As nocturnal insects are of eminent significance as pollinators and the primary food source of many vertebrates, an evolutionary change of the flight-to-light behaviour thereby potentially cascades across species interaction networks. © 2016 The Author(s).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David C.; Hacke, Peter L.; Kempe, Michael D.
2015-06-14
Reduced optical transmittance of encapsulation resulting from ultraviolet (UV) degradation has frequently been identified as a cause of decreased PV module performance through the life of installations in the field. The present module safety and qualification standards, however, apply short UV doses only capable of examining design robustness or 'infant mortality' failures. Essential information that might be used to screen encapsulation through product lifetime remains unknown. For example, the relative efficacy of xenon-arc and UVA-340 fluorescent sources or the typical range of activation energy for degradation is not quantified. We have conducted an interlaboratory experiment to provide the understanding that will be used towards developing a climate- and configuration-specific (UV) weathering test. Five representative, known formulations of EVA were studied in addition to one TPU material. Replicate laminated silica/polymer/silica specimens are being examined at 14 institutions using a variety of indoor chambers (including Xe, UVA-340, and metal-halide light sources) or field aging. The solar-weighted transmittance, yellowness index, and the UV cut-off wavelength, determined from the measured hemispherical transmittance, are examined to provide understanding and guidance for the UV light source (lamp type) and temperature used in accelerated UV aging tests. Index Terms -- reliability, durability, thermal activation.
Pulmonary embolism as a complication of long-term total parenteral nutrition.
Mailloux, R J; DeLegge, M H; Kirby, D F
1993-01-01
Although much has been written concerning the complications of long-term total parenteral nutrition, little or no mention of pulmonary embolism is made in the literature. We present two patients maintained on home total parenteral nutrition who suffered pulmonary emboli, one while receiving standard heparin therapy. No potential source other than their indwelling total parenteral nutrition catheter was identified. Studies have revealed catheter-related thrombosis in up to 50% of patients with indwelling central venous catheters. Although early surgical literature suggested that upper extremity deep vein thromboses rarely embolize, more recent investigations have proven this false. In fact, the risk of pulmonary emboli appeared to be greatest in those thrombi that were catheter related. Because of this risk, we suggest a hypercoagulable work-up in any patient with a history of recurrent thrombosis. Heparin is central to the current preventive regimens; however, further study is needed to determine the most efficacious dose. Future development of less thrombogenic catheters will also be of assistance. Thrombolytic agents currently have an expanding role in the treatment of thrombotic complications. Whether they will have a future role in prevention remains unknown.
Park, Eun Sug; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford
2015-06-01
A major difficulty with assessing source-specific health effects is that source-specific exposures cannot be measured directly; rather, they need to be estimated by a source-apportionment method such as multivariate receptor modeling. The uncertainty in source apportionment (uncertainty in source-specific exposure estimates and model uncertainty due to the unknown number of sources and identifiability conditions) has been largely ignored in previous studies. Also, spatial dependence of multipollutant data collected from multiple monitoring sites has not yet been incorporated into multivariate receptor modeling. The objectives of this project are (1) to develop a multipollutant approach that incorporates both sources of uncertainty in source-apportionment into the assessment of source-specific health effects and (2) to develop enhanced multivariate receptor models that can account for spatial correlations in the multipollutant data collected from multiple sites. We employed a Bayesian hierarchical modeling framework consisting of multivariate receptor models, health-effects models, and a hierarchical model on latent source contributions. For the health model, we focused on the time-series design in this project. Each combination of number of sources and identifiability conditions (additional constraints on model parameters) defines a different model. We built a set of plausible models with extensive exploratory data analyses and with information from previous studies, and then computed posterior model probability to estimate model uncertainty. Parameter estimation and model uncertainty estimation were implemented simultaneously by Markov chain Monte Carlo (MCMC*) methods. We validated the methods using simulated data. We illustrated the methods using PM2.5 (particulate matter ≤ 2.5 μm in aerodynamic diameter) speciation data and mortality data from Phoenix, Arizona, and Houston, Texas. The Phoenix data included counts of cardiovascular deaths and daily PM2.5 speciation data from 1995-1997. The Houston data included respiratory mortality data and 24-hour PM2.5 speciation data sampled every six days from a region near the Houston Ship Channel in years 2002-2005. We also developed a Bayesian spatial multivariate receptor modeling approach that, while simultaneously dealing with the unknown number of sources and identifiability conditions, incorporated spatial correlations in the multipollutant data collected from multiple sites into the estimation of source profiles and contributions based on the discrete process convolution model for multivariate spatial processes. This new modeling approach was applied to 24-hour ambient air concentrations of 17 volatile organic compounds (VOCs) measured at nine monitoring sites in Harris County, Texas, during years 2000 to 2005. Simulation results indicated that our methods were accurate in identifying the true model and estimated parameters were close to the true values. The results from our methods agreed in general with previous studies on the source apportionment of the Phoenix data in terms of estimated source profiles and contributions. However, we had a greater number of statistically insignificant findings, which was likely a natural consequence of incorporating uncertainty in the estimated source contributions into the health-effects parameter estimation. 
For the Houston data, a model with five sources (that seemed to be Sulfate-Rich Secondary Aerosol, Motor Vehicles, Industrial Combustion, Soil/Crustal Matter, and Sea Salt) showed the highest posterior model probability among the candidate models considered when fitted simultaneously to the PM2.5 and mortality data. There was a statistically significant positive association between respiratory mortality and same-day PM2.5 concentrations attributed to one of the sources (probably industrial combustion). The Bayesian spatial multivariate receptor modeling approach applied to the VOC data led to a highest posterior model probability for a model with five sources (that seemed to be refinery, petrochemical production, gasoline evaporation, natural gas, and vehicular exhaust) among several candidate models, with the number of sources varying between three and seven and with different identifiability conditions. Our multipollutant approach assessing source-specific health effects is more advantageous than a single-pollutant approach in that it can estimate total health effects from multiple pollutants and can also identify emission sources that are responsible for adverse health effects. Our Bayesian approach can incorporate not only uncertainty in the estimated source contributions, but also model uncertainty that has not been addressed in previous studies on assessing source-specific health effects. The new Bayesian spatial multivariate receptor modeling approach enables predictions of source contributions at unmonitored sites, minimizing exposure misclassification and providing improved exposure estimates along with their uncertainty estimates, as well as accounting for uncertainty in the number of sources and identifiability conditions.
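Schematically, and in generic notation rather than the study's exact specification, the hierarchical structure couples a multivariate receptor model for the speciated PM2.5 data with a time-series health model through the latent source contributions:

```latex
\[
  \mathbf{y}_t \;=\; \Lambda\,\mathbf{f}_t + \boldsymbol{\varepsilon}_t,
  \qquad \boldsymbol{\varepsilon}_t \sim \mathcal{N}(\mathbf{0},\,\Sigma),
  \qquad \mathbf{f}_t \ge \mathbf{0},
\]
\[
  \log \mathbb{E}\big[\mathrm{deaths}_t\big]
  \;=\; \mathrm{offset}_t + \boldsymbol{\beta}^{\top}\mathbf{f}_t + \text{confounder terms},
\]
```

where y_t are the measured species concentrations, Λ holds the unknown source profiles (subject to identifiability conditions), f_t are the unknown source contributions shared by both sub-models, and β are the source-specific health effects; each choice of the number of sources and identifiability conditions defines a separate candidate model whose posterior model probability quantifies the model uncertainty described above.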
Testing the Hypothesis of a Homoscedastic Error Term in Simple, Nonparametric Regression
ERIC Educational Resources Information Center
Wilcox, Rand R.
2006-01-01
Consider the nonparametric regression model Y = m(X) + τ(X)ε, where X and ε are independent random variables, ε has a median of zero and variance σ², τ is some unknown function used to model heteroscedasticity, and m(X) is an unknown function reflecting some conditional measure of location associated…
Developing Zika vaccines: the lessons for disease X.
Barrett, Alan D T
2018-06-26
There is an urgent need to develop vaccines against emerging diseases, including those caused by pathogens that are currently unknown to cause human disease, termed 'disease X'. Here, Zika virus infection is considered as an example of disease X. The speed of Zika vaccine development provides optimism for our ability to prepare vaccines against unknown pathogens.
Manfredi, Marcello; Robotti, Elisa; Bearman, Greg; France, Fenella; Barberis, Elettra; Shor, Pnina; Marengo, Emilio
2016-01-01
Today the long-term conservation of cultural heritage is a big challenge: often the artworks were subjected to unknown interventions, which eventually were found to be harmful. The noninvasive investigation of the conservation treatments to which they were subjected to is a crucial step in order to undertake the best conservation strategies. We describe here the preliminary results on a quick and direct method for the nondestructive identification of the various interventions of parchment by means of direct analysis in real time (DART) ionization and high-resolution time-of-flight mass spectrometry and chemometrics. The method has been developed for the noninvasive analysis of the Dead Sea Scrolls, one of the most important archaeological discoveries of the 20th century. In this study castor oil and glycerol parchment treatments, prepared on new parchment specimens, were investigated in order to evaluate two different types of operations. The method was able to identify both treatments. In order to investigate the effect of the ion source temperature on the mass spectra, the DART-MS analysis was also carried out at several temperatures. Due to the high sensitivity, simplicity, and no sample preparation requirement, the proposed analytical methodology could help conservators in the challenging analysis of unknown treatments in cultural heritage.
Milman, Boris L
2005-01-01
A library consisting of 3766 MS(n) spectra of 1743 compounds, including 3126 MS2 spectra acquired mainly using ion trap (IT) and triple-quadrupole (QqQ) instruments, was composed of numerous collections/sources. Ionization techniques were mainly electrospray ionization and also atmospheric pressure chemical ionization and chemical ionization. The library was tested for the performance in identification of unknowns, and in this context this work is believed to be the largest of all known tests of product-ion mass spectral libraries. The MS2 spectra of the same compounds from different collections were in turn divided into spectra of 'unknown' and reference compounds. For each particular compound, library searches were performed resulting in selection by taking into account the best matches for each spectral collection/source. Within each collection/source, replicate MS2 spectra differed in the collision energy used. Overall, there were up to 950 search results giving the best match factors and their ranks in corresponding hit lists. In general, the correct answers were obtained as the 1st rank in up to 60% of the search results when retrieved with (on average) 2.2 'unknown' and 6.2 reference replicates per compound. With two or more replicates of both 'unknown' and reference spectra (the average numbers of replicates were 4.0 and 7.8, respectively), the fraction of correct answers in the 1st rank increased to 77%. This value is close to the performance of established electron ionization mass spectra libraries (up to 79%) found by other workers. The hypothesis that MS2 spectra better match reference spectra acquired using the same type of tandem mass spectrometer (IT or QqQ) was neither strongly proved nor rejected here. The present work shows that MS2 spectral libraries containing sufficiently numerous different entries for each compound are sufficiently efficient for identification of unknowns and suitable for use with different tandem mass spectrometers. 2005 John Wiley & Sons, Ltd.
Swift J181723.1-164300 is likely a new bursting neutron star low-mass X-ray binary
NASA Astrophysics Data System (ADS)
Parikh, Aastha; Wijnands, Rudy; Degenaar, Nathalie; Altamirano, Diego
2017-08-01
On 28 July 2017 Swift/BAT triggered (#00765081) on an event corresponding to a previously unknown source (Barthelmy et al. 2017, GCN #21369, #21385). Its properties suggested it was likely a Galactic source and not a gamma-ray burst.
ERIC Educational Resources Information Center
Soil Conservation Service (USDA), Washington, DC.
Nonpoint source pollution is both a relatively recent concern and a complex phenomenon with many unknowns. Knowing the extent to which agricultural sources contribute to the total pollutant load, the extent to which various control practices decrease this load, and the effect of reducing the pollutants delivered to a water body are basic to the…
The chemistry of poisons in amphibian skin.
Daly, J W
1995-01-01
Poisons are common in nature, where they often serve the organism in chemical defense. Such poisons either are produced de novo or are sequestered from dietary sources or symbiotic organisms. Among vertebrates, amphibians are notable for the wide range of noxious agents that are contained in granular skin glands. These compounds include amines, peptides, proteins, steroids, and both water-soluble and lipid-soluble alkaloids. With the exception of the alkaloids, most seem to be produced de novo by the amphibian. The skin of amphibians contains many structural classes of alkaloids previously unknown in nature. These include the batrachotoxins, which have recently been discovered to also occur in skin and feathers of a bird, the histrionicotoxins, the gephyrotoxins, the decahydroquinolines, the pumiliotoxins and homopumiliotoxins, epibatidine, and the samandarines. Some amphibian skin alkaloids are clearly sequestered from the diet, which consists mainly of small arthropods. These include pyrrolizidine and indolizidine alkaloids from ants, tricyclic coccinellines from beetles, and pyrrolizidine oximes, presumably from millipedes. The sources of other alkaloids in amphibian skin, including the batrachotoxins, the decahydroquinolines, the histrionicotoxins, the pumiliotoxins, and epibatidine, are unknown. While it is possible that these are produced de novo or by symbiotic microorganisms, it appears more likely that they are sequestered by the amphibians from as yet unknown dietary sources. PMID:7816854
NASA Astrophysics Data System (ADS)
Ubertini, Pietro; Sidoli, L.; Sguera, V.; Bazzano, A.
2009-12-01
Supergiant Fast X-ray Transients (SFXTs) are one of the most interesting (and unexpected) results of the INTEGRAL mission. They are a new class of HMXBs displaying short hard X-ray outbursts (duration less than a day) characterized by fast flares (few-hour timescale) and large dynamic range (10E3-10E4). The physical mechanism driving their peculiar behaviour is still unclear and highly debated: some models involve the structure of the supergiant companion donor wind (likely clumpy, in a spherical or non-spherical geometry) and the orbital properties (wide separation with eccentric or circular orbit), while others involve the properties of the neutron star compact object and invoke very low magnetic field values (B < 1E10 G) or alternatively very high ones (B > 1E14 G, magnetars). The picture is still highly unclear from the observational point of view as well: no cyclotron lines have been detected in the spectra, thus the strength of the neutron star magnetic field is unknown. Orbital periods have been measured in only 4 systems, spanning from 3.3 days to 165 days. Even the duty cycle seems to be quite different from source to source. The Energetic X-ray Imaging Survey Telescope (EXIST), with its hard X-ray all-sky survey and greatly improved limiting sensitivity, will allow us to get a clearer picture of SFXTs. A complete census of their number is essential to enlarge the sample. Long-term and as continuous as possible X-ray monitoring is crucial to (1) obtain the duty cycle, (2) investigate their unknown orbital properties (separation, orbital period, eccentricity), (3) completely cover the whole outburst activity, and (4) search for cyclotron lines in the high-energy spectra. EXIST observations will provide crucial information to test the different models and shed light on the peculiar behaviour of SFXTs.
Identifying Atmospheric Pollutant Sources Using Artificial Neural Networks
NASA Astrophysics Data System (ADS)
Paes, F. F.; Campos, H. F.; Luz, E. P.; Carvalho, A. R.
2008-05-01
The estimation of the area-source pollutant strength is a relevant issue for the atmospheric environment. This characterizes an inverse problem in atmospheric pollution dispersion. In the inverse analysis, an area-source domain is considered, where the strength of the area source term is assumed unknown. The inverse problem is solved by using a supervised artificial neural network: a multi-layer perceptron. The connection weights of the neural network are computed from the delta rule learning process. The neural network inversion is compared with results from a standard inverse analysis (regularized inverse solution). In the regularization method, the inverse problem is formulated as a non-linear optimization approach, whose objective function is given by the square difference between the measured pollutant concentration and the mathematical model, associated with a regularization operator. In our numerical experiments, the forward problem is addressed by a source-receptor scheme, where a regressive Lagrangian model is applied to compute the transition matrix. The second-order maximum entropy regularization is used, and the regularization parameter is calculated by the L-curve technique. The objective function is minimized employing a deterministic scheme (a quasi-Newton algorithm) [1] and a stochastic technique (PSO: particle swarm optimization) [2]. The inverse problem methodology is tested with synthetic observational data from six measurement points in the physical domain. The best inverse solutions were obtained with neural networks. References: [1] D. R. Roberti, D. Anfossi, H. F. Campos Velho, G. A. Degrazia (2005): Estimating Emission Rate and Pollutant Source Location, Ciencia e Natura, p. 131-134. [2] E.F.P. da Luz, H.F. de Campos Velho, J.C. Becceneri, D.R. Roberti (2007): Estimating Atmospheric Area Source Strength Through Particle Swarm Optimization. Inverse Problems, Design and Optimization Symposium (IPDO-2007), April 16-18, Miami (FL), USA, vol. 1, p. 354-359.
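A minimal sketch of the delta-rule idea described above, under invented assumptions: a linear layer trained with the LMS (delta) rule learns to map synthetic receptor concentrations y = Mx back to the area-source strength x; the transition matrix, training pairs, and learning rate are random stand-ins, not the study's Lagrangian model or network configuration.

```python
# Minimal sketch: delta-rule (LMS) training of a linear inverse mapping y -> x.
import numpy as np

rng = np.random.default_rng(0)
n_receptors, n_sources = 6, 3
M = rng.uniform(0.1, 1.0, size=(n_receptors, n_sources))    # synthetic source-receptor matrix

# Training pairs: known area-source strengths and the concentrations they would produce
X_train = rng.uniform(0.0, 5.0, size=(200, n_sources))
Y_train = X_train @ M.T + 0.01 * rng.normal(size=(200, n_receptors))

W = np.zeros((n_sources, n_receptors))      # weights of a linear layer mapping y -> x
eta = 0.005                                 # learning rate for the delta (LMS) rule
for epoch in range(300):
    for x_true, y in zip(X_train, Y_train):
        x_hat = W @ y
        W += eta * np.outer(x_true - x_hat, y)   # delta rule update

# "Inverse analysis": recover an unseen source strength from its receptor concentrations
x_unknown = np.array([1.0, 3.0, 0.5])
y_obs = M @ x_unknown
print("estimated source strength:", np.round(W @ y_obs, 2))   # should be close to x_unknown
```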
Wang, Zhe; Zhou, Xinmiao; Liu, Xin; Dong, Ying; Zhang, Jinlan
2017-01-01
Stanozolol is one of the most commonly abused anabolic androgenic steroids (AAS) among athletes and is usually detected via its parent drug and major metabolites. However, its metabolic pathway is complex, varied and individually different, so it is important to characterize its overall metabolic profile and to discover new and long-term metabolites with the aim of expanding detection windows. High performance liquid chromatography coupled with a triple quadrupole mass spectrometer (HPLC-MS/MS) was used to analyze human urine after oral administration of stanozolol. Multiple reaction monitoring (MRM), a highly sensitive scan mode of the triple quadrupole mass spectrometer, was used to develop a strategy for metabolic profile characterization and long-term metabolite detection based on typical precursor-to-product ion transitions of the parent drug and its major metabolites. By utilizing the characteristic fragment ions of stanozolol and its major metabolites as the product ions, and postulating unknown precursor ions based on the possible phase I and phase II metabolic reactions in the human body, the metabolite profile of stanozolol could be comprehensively discovered, especially for unknown and low-concentration metabolites in human urine. These metabolites were then structurally identified by targeted high-resolution MS/MS scans on a quadrupole-time-of-flight (Q-TOF) mass spectrometer. Applying this strategy, 27 phase I and 21 phase II metabolites of stanozolol were identified, of which 13 phase I and 14 phase II metabolites had not been reported previously. Nine of the 48 metabolites could be detected more than 15 days post drug administration. This strategy could be employed effectively to characterize AAS metabolic profiles and discover unknown and long-term metabolites in sports drug testing. Copyright © 2016 Elsevier B.V. All rights reserved.
Zhang, Mingfeng; Lin, Qing; Qi, Tong; Wang, Tiankun; Chen, Ching-Cheng; Riggs, Arthur D.; Zeng, Defu
2016-01-01
We previously reported that long-term administration of a low dose of gastrin and epidermal growth factor (GE) augments β-cell neogenesis in late-stage diabetic autoimmune mice after eliminating insulitis by induction of mixed chimerism. However, the source of β-cell neogenesis is still unknown. SRY (sex-determining region Y)-box 9+ (Sox9+) ductal cells in the adult pancreas are clonogenic and can give rise to insulin-producing β cells in an in vitro culture. Whether Sox9+ ductal cells in the adult pancreas can give rise to β cells in vivo remains controversial. Here, using lineage-tracing with genetic labeling of Insulin- or Sox9-expressing cells, we show that hyperglycemia (>300 mg/dL) is required for inducing Sox9+ ductal cell differentiation into insulin-producing β cells, and medium hyperglycemia (300–450 mg/dL) in combination with long-term administration of low-dose GE synergistically augments differentiation and is associated with normalization of blood glucose in nonautoimmune diabetic C57BL/6 mice. Short-term administration of high-dose GE cannot augment differentiation, although it can augment preexisting β-cell replication. These results indicate that medium hyperglycemia combined with long-term administration of low-dose GE represents one way to induce Sox9+ ductal cell differentiation into β cells in adult mice. PMID:26733677
NASA Technical Reports Server (NTRS)
Laster, Rachel M.
2004-01-01
Scientists in the Office of Life and Microgravity Sciences and Applications within the Microgravity Research Division oversee studies of important physical, chemical, and biological processes in the microgravity environment. Research is conducted in the microgravity environment because of the beneficial results it yields for experiments. When research is done in normal gravity, scientists are limited to results that are affected by the gravity of Earth. Microgravity provides an environment where solid, liquid, and gas can be observed in a natural state of free fall and where many different variables are eliminated. One challenge that NASA faces is that space flight opportunities need to be used effectively and efficiently in order to ensure that some of the most scientifically promising research is conducted. Different vibratory sources are continually active aboard the International Space Station (ISS). Some of the vibratory sources include crew exercise, experiment setup, machinery startup (life support fans, pumps, freezer/compressor, centrifuge), thruster firings, and some unknown events. The Space Acceleration Measurement System (SAMS), which acts as the hardware and is carefully positioned aboard the ISS, along with the Microgravity Environment Monitoring System (MEMS), which acts as the software and is located at NASA Glenn, is used to detect these vibratory sources aboard the ISS and recognize them as disturbances. The various vibratory disturbances can sometimes be harmful to the scientists' different research projects. Some vibratory disturbances are recognized by the MEMS database and some are not. Mainly, the unknown events that occur aboard the International Space Station are the ones of major concern. To better aid the research experiments, the unknown events are identified and verified as unknown events. Features such as frequency, acceleration level, and time and date of recognition of the new patterns are stored in an Excel database. My task is to carefully synthesize frequency and acceleration patterns of unknown events within the Excel database into a new file to determine whether or not certain information that is received is considered a real vibratory source. Once considered a vibratory source, further analysis is carried out. The resulting information is used to retrain the MEMS to recognize them as known patterns. These different vibratory disturbances are constantly monitored to observe whether, in any way, the disturbances have an effect on the microgravity environment that research experiments are exposed to. If the disturbance has little or no effect on the experiments, then research is continued. However, if the disturbance is harmful to the experiment, scientists act accordingly by either minimizing the source or terminating the research, so that neither NASA's time nor money is wasted.
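The pattern-matching step described above might look roughly like the following sketch; the catalogue of known disturbances, the tolerances, and the event values are all hypothetical.

```python
# Minimal sketch (hypothetical thresholds and catalogue): classify a detected vibration
# by comparing its dominant frequency and acceleration level against known ISS
# disturbance patterns; events with no close match are flagged as unknown.
known_patterns = [
    {"name": "crew exercise", "freq_hz": 2.5, "accel_g": 1.0e-3},
    {"name": "life support fan", "freq_hz": 60.0, "accel_g": 2.0e-4},
    {"name": "compressor start", "freq_hz": 120.0, "accel_g": 5.0e-4},
]

def classify(freq_hz, accel_g, freq_tol=0.1, accel_tol=0.5):
    """Return the best-matching known pattern, or None if the event is unknown."""
    best = None
    for p in known_patterns:
        close_freq = abs(freq_hz - p["freq_hz"]) / p["freq_hz"] <= freq_tol
        close_accel = abs(accel_g - p["accel_g"]) / p["accel_g"] <= accel_tol
        if close_freq and close_accel:
            best = p["name"]
    return best

event = {"freq_hz": 61.2, "accel_g": 2.3e-4}   # example detected event
label = classify(**event)
print(label if label else "unknown event - store features for retraining")
```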
Micro-seismic imaging using a source function independent full waveform inversion method
NASA Astrophysics Data System (ADS)
Wang, Hanchen; Alkhalifah, Tariq
2018-03-01
At the heart of micro-seismic event measurements is the task of estimating the locations of micro-seismic sources, as well as their ignition times. The accuracy of locating the sources is highly dependent on the velocity model. On the other hand, the conventional micro-seismic source locating methods in many cases require manual picking of traveltime arrivals, which not only demands manual effort and human interaction but is also prone to errors. Using full waveform inversion (FWI) to locate and image micro-seismic events allows for an automatic process (free of picking) that utilizes the full wavefield. However, full waveform inversion of micro-seismic events faces considerable nonlinearity due to the unknown source locations (space) and functions (time). We developed a source function independent full waveform inversion of micro-seismic events to invert for the source image, source function and the velocity model. It is based on convolving reference traces with the observed and modeled data to mitigate the effect of an unknown source ignition time. The adjoint-state method is used to derive the gradients for the source image, source function and velocity updates. The extended image for the source wavelet in the Z axis is extracted to check the accuracy of the inverted source image and velocity model. Also, angle gathers are calculated to assess the quality of the long-wavelength component of the velocity model. By inverting for the source image, source wavelet and the velocity model simultaneously, the proposed method produces good estimates of the source location, ignition time and the background velocity for the synthetic examples used here, such as those corresponding to the Marmousi model and the SEG/EAGE overthrust model.
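A minimal sketch of the convolution-based, source-independent misfit described above, on synthetic 1-D traces; the trace lengths and wavelets are invented, and the example only illustrates why the unknown source wavelet cancels when the earth response is correct.

```python
# Minimal sketch: observed data convolved with a modeled reference trace is compared
# against modeled data convolved with the observed reference trace, which cancels the
# unknown source wavelet when the earth responses agree.
import numpy as np

def misfit(d_obs, d_syn, ref_obs, ref_syn):
    """Least-squares misfit between cross-convolved observed and modeled traces."""
    lhs = np.convolve(d_obs, ref_syn)     # observed data * modeled reference
    rhs = np.convolve(d_syn, ref_obs)     # modeled data * observed reference
    return 0.5 * np.sum((lhs - rhs) ** 2)

# Synthetic demonstration: same earth response, different (unknown) source wavelets
rng = np.random.default_rng(1)
green = rng.normal(size=64)               # stand-in for the earth response at one receiver
ref_green = rng.normal(size=64)           # response at the reference receiver
w_true, w_guess = rng.normal(size=16), rng.normal(size=16)  # unknown vs assumed wavelets

d_obs, ref_obs = np.convolve(green, w_true), np.convolve(ref_green, w_true)
d_syn, ref_syn = np.convolve(green, w_guess), np.convolve(ref_green, w_guess)
print("misfit with correct earth model (should be ~0):",
      round(misfit(d_obs, d_syn, ref_obs, ref_syn), 6))
```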
[The benefits of doing exercise in the elderly].
Avila-Funes, José Alberto; García-Mayo, Emilio José
2004-01-01
Advanced age is associated with changes in body composition such as loss of muscular mass, which is defined as sarcopenia. Sarcopenia plays a key role in the frailty model, although its source is unknown. Myriad strategies have been used to improve and increase muscular mass and function in older persons. Muscle is a versatile system with a great capacity for adaptation in response to regular exercise programs. Aerobic exercise and resistance training improve muscular function and can minimize and even reverse sarcopenia in the elderly (healthy, very elderly or frail). The main difference in prescribing exercise for healthy adults and elderly individuals is that the intensity of the training program is lower for the latter. This review addresses the physiopathologic aspects and clinical implications of muscular mass loss and programs directed toward increasing strength and/or endurance in the elderly.
Maximum Likelihood and Restricted Likelihood Solutions in Multiple-Method Studies
Rukhin, Andrew L.
2011-01-01
A formulation of the problem of combining data from several sources is discussed in terms of random effects models. The unknown measurement precision is assumed not to be the same for all methods. We investigate maximum likelihood solutions in this model. By representing the likelihood equations as simultaneous polynomial equations, the exact form of the Groebner basis for their stationary points is derived when there are two methods. A parametrization of these solutions which allows their comparison is suggested. A numerical method for solving likelihood equations is outlined, and an alternative to the maximum likelihood method, the restricted maximum likelihood, is studied. In the situation when methods variances are considered to be known an upper bound on the between-method variance is obtained. The relationship between likelihood equations and moment-type equations is also discussed. PMID:26989583
NASA Technical Reports Server (NTRS)
Petit, Gerard; Thomas, Claudine; Tavella, Patrizia
1993-01-01
Millisecond pulsars are galactic objects that exhibit a very stable spinning period. Several tens of these celestial clocks have now been discovered, which opens the possibility that an average time scale may be deduced through a long-term stability algorithm. Such an ensemble average makes it possible to reduce the level of the instabilities originating from the pulsars or from other sources of noise, which are unknown but independent. The basis for such an algorithm is presented and applied to real pulsar data. It is shown that pulsar time could shortly become more stable than the present atomic time, for averaging times of a few years. Pulsar time can also be used as a flywheel to maintain the accuracy of atomic time in case of temporary failure of the primary standards, or to transfer the improved accuracy of future standards back to the present.
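A toy illustration of the ensemble idea, assuming simulated white timing noise: averaging residuals from several independent pulsars reduces the instability of the ensemble time scale roughly as 1/sqrt(N); the noise levels and the common signal are invented.

```python
# Minimal sketch: independent pulsar timing noise averages down in an ensemble time scale.
import numpy as np

rng = np.random.default_rng(2)
n_pulsars, n_epochs = 10, 500
# Independent white timing noise for each pulsar (arbitrary units), plus a common signal
common = 0.2 * np.sin(np.linspace(0, 4 * np.pi, n_epochs))
residuals = common + rng.normal(0.0, 1.0, size=(n_pulsars, n_epochs))

single = residuals[0]
ensemble = residuals.mean(axis=0)          # unweighted ensemble average
print("rms of one pulsar      :", round(np.std(single - common), 3))
print("rms of ensemble average:", round(np.std(ensemble - common), 3))
```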
Long-term infrared monitoring of stellar sources from earth orbit
NASA Technical Reports Server (NTRS)
Maran, S. P.; Heinsheimer, T. F.; Stocker, T. L.; Anand, S. P. S.; Chapman, R. D.; Hobbs, R. W.; Michalitsanos, A. G.; Wright, F. H.; Kipp, S. L.
1976-01-01
A program is discussed which involved monitoring the photometric activity of 18 bright variable IR stars at 2.7 microns with satellite- and rocket-borne instrumentation in the period from 1971 to 1975. The stellar sample includes 3 Lb variables, 8 semiregular variables, 5 Mira-type variables, and 2 previously unknown and unclassified IR variables. Detailed light curves of many of these stars were determined for intervals of 3 yr or more; spectra from 2.7 to 20 microns were constructed for nine of them using data obtained entirely with instruments above the atmosphere. Photometric IR light curves and other data are presented for SW Virginis, R Aquilae, S Scuti, IRC 00265, RT Hydrae, S Orionis, S Canis Minoris, Omicron Ceti, and R Leonis. Several hypotheses concerning the interpretation of the IR data are examined.
Inference of relativistic electron spectra from measurements of inverse Compton radiation
NASA Astrophysics Data System (ADS)
Craig, I. J. D.; Brown, J. C.
1980-07-01
The inference of relativistic electron spectra from spectral measurement of inverse Compton radiation is discussed for the case where the background photon spectrum is a Planck function. The problem is formulated in terms of an integral transform that relates the measured spectrum to the unknown electron distribution. A general inversion formula is used to provide a quantitative assessment of the information content of the spectral data. It is shown that the observations must generally be augmented by additional information if anything other than a rudimentary two or three parameter model of the source function is to be derived. It is also pointed out that since a similar equation governs the continuum spectra emitted by a distribution of black-body radiators, the analysis is relevant to the problem of stellar population synthesis from galactic spectra.
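As a rough illustration of the inversion problem discussed above, the sketch below discretizes an integral transform F = K n with a toy kernel and applies Tikhonov-regularized least squares; the kernel, grids, and noise level are invented and do not reproduce the paper's inversion formula.

```python
# Minimal sketch: regularized inversion of a discretized integral transform F = K n.
import numpy as np

rng = np.random.default_rng(3)
n_energies, n_freqs = 40, 30
gamma = np.linspace(1.0, 10.0, n_energies)          # electron Lorentz factors (arbitrary units)
nu = np.linspace(0.5, 20.0, n_freqs)                # observed photon frequencies

# Smooth, broad kernel standing in for the inverse Compton redistribution function
K = np.exp(-((nu[:, None] - 0.2 * gamma[None, :] ** 2) ** 2) / 8.0)

n_true = np.exp(-gamma / 3.0)                       # assumed electron spectrum
F = K @ n_true + 0.01 * rng.normal(size=n_freqs)    # noisy "observed" photon spectrum

lam = 1e-2                                          # Tikhonov regularization parameter
n_est = np.linalg.solve(K.T @ K + lam * np.eye(n_energies), K.T @ F)
print("relative error of inversion:",
      round(np.linalg.norm(n_est - n_true) / np.linalg.norm(n_true), 3))
```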
NASA Astrophysics Data System (ADS)
Torpin, Trevor; Boyd, Patricia T.; Smale, Alan P.
2015-01-01
The bright, unusual black-hole X-ray binary LMC X-3 has been monitored virtually continuously by the Japanese MAXI X-ray All-Sky Monitor aboard the International Space Station (Matsuoka, et al., PASJ, 2009) from August 2009 to the present. Comparison with RXTE PCA and ASM light curves during the ~2.33-year period of overlap demonstrates that, despite slight differences in energy-band boundaries, both the ASM and MAXI faithfully reproduce characteristics of the high-amplitude, nonperiodic long-term variability, on the order of 100-300 days, clearly seen in the more sensitive PCA monitoring. The mechanism for this variability at a timescale many times longer than the 1.7-day orbital period is still unknown. Models to explain the long-term variability invoke mechanisms such as changes in mass transfer rate and/or a precessing warped accretion disk. Observations of LMC X-3 have not definitively determined whether wind accretion or Roche-lobe overflow is the driver of the long-term variability. Recent MAXI monitoring of LMC X-3 includes excellent coverage of a rare anomalous low state (ALS), where the X-ray source cannot be distinguished from the background, as well as several normal low states, in which the source count rate passes smoothly through a low yet detectable value. Pointed Swift XRT and UVOT observations also sample this ALS and one normal low state well. We combine these data sets to study the correlations between the wavelength regimes observed during the ALS versus the normal low state. We also examine the behavior of the X-ray hardness ratios using XRT and MAXI monitoring data during the ALS versus the normal low state.
NASA Astrophysics Data System (ADS)
van Borm, Werner August
Electron probe X-ray microanalysis (EPXMA) in combination with an automation system and an energy-dispersive X-ray detection system was used to analyse thousands of microscopic particles originating from the ambient atmosphere. The huge amount of data was processed by a newly developed X-ray correction method and a number of data reduction procedures. A standardless ZAF procedure for EPXMA was developed for quick semi-quantitative analysis of particles, starting from simple corrections valid for bulk samples and modified to take into account the finite particle diameter, assuming a spherical shape. Tested on a limited database of bulk and particulate samples, the compromise between calculation speed and accuracy yielded, for elements with Z > 14, accuracies on concentrations of less than 10%, while absolute deviations remained below 4 weight%, thus being important only for low concentrations. Next, the possibilities for the use of supervised and unsupervised multivariate particle classification were investigated for source apportionment of individual particles. In a detailed study of the unsupervised cluster analysis technique, several aspects were considered that have a severe influence on the final cluster analysis results, i.e. data acquisition, X-ray peak identification, data normalization, scaling, variable selection, similarity measure, cluster strategy, cluster significance and error propagation. A supervised approach was developed using an expert system-like approach in which identification rules are built to describe the particle classes in a unique manner. Applications are presented for particles sampled (1) near a zinc smelter (Vieille-Montagne, Balen, Belgium), analyzed for heavy metals, (2) in an urban aerosol (Antwerp, Belgium), analyzed for over 20 elements, and (3) in a rural aerosol originating from a Swiss mountain area (Bern). Thus it was possible to pinpoint a number of known and unknown sources and characterize their emissions in terms of particle abundance and particle composition. Alternatively, the bulk analysis of filters (total, fine and coarse mode) using Particle-Induced X-Ray Emission (PIXE) and the application of a receptor modeling approach provided complementary information on a macroscopic level. A computer program was developed incorporating an absolute factor analysis based receptor modeling procedure. Source profiles and contributions are described by elemental concentrations, and an atmospheric mass balance is put forward. The latter method was applied in a two-year study of the Antwerp urban aerosol and for the Swiss aerosol, revealing a number of previously known and unknown sources. Both methods were successfully combined to increase the source resolution.
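A minimal sketch of unsupervised particle clustering in the spirit of the cluster-analysis approach described above, using a small hand-rolled k-means on invented, normalized elemental compositions; the element list, compositions, and cluster count are illustrative assumptions.

```python
# Minimal sketch: group individual particles by likely source via k-means clustering
# of their normalized elemental compositions (synthetic data).
import numpy as np

rng = np.random.default_rng(4)
# Each row: relative intensities of [Zn, Pb, Si, Ca] for one particle (synthetic)
smelter = rng.normal([0.6, 0.3, 0.05, 0.05], 0.03, size=(50, 4))   # Zn/Pb-rich
crustal = rng.normal([0.05, 0.05, 0.6, 0.3], 0.03, size=(50, 4))   # Si/Ca-rich
X = np.vstack([smelter, crustal])
X /= X.sum(axis=1, keepdims=True)                                  # normalize each particle

def kmeans(X, k=2, iters=50):
    centers = X[[0, -1]].copy()          # deterministic, well-separated initial centers
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(axis=-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

labels, centers = kmeans(X)
for j, c in enumerate(centers):
    print(f"cluster {j}: {np.count_nonzero(labels == j)} particles, "
          f"mean composition {np.round(c, 2)}")
```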
Significance of microbial asynchronous anabolism to soil carbon dynamics driven by litter inputs
Fan, Zhaosheng; Liang, Chao
2015-01-01
Soil organic carbon (SOC) plays an important role in the global carbon cycle. However, it remains largely unknown how plant litter inputs impact magnitude, composition and source configuration of the SOC stocks over long term through microbial catabolism and anabolism, mostly due to uncoupled research on litter decomposition and SOC formation. This limits our ability to predict soil system responses to changes in land-use and climate. Here, we examine how microbes act as a valve controlling carbon sequestrated from plant litters versus released to the atmosphere in natural ecosystems amended with plant litters varying in quantity and quality. We find that litter quality – not quantity – regulates long-term SOC dynamics under different plausible scenarios. Long-term changes in bulk SOC stock occur only when the quality of carbon inputs causes asynchronous change in a microbial physiological trait, defined as “microbial biosynthesis acceleration” (MBA). This is the first theoretical demonstration that the response of the SOC stocks to litter inputs is critically determined by the microbial physiology. Our work suggests that total SOC at an equilibrium state may be an intrinsic property of a given ecosystem, which ultimately is controlled by the asynchronous MBA between microbial functional groups. PMID:25849864
ERIC Educational Resources Information Center
Pfaffman, Jay
2008-01-01
Free/Open Source Software (FOSS) applications meet many of the software needs of high school science classrooms. In spite of the availability and quality of FOSS tools, they remain unknown to many teachers and utilized by fewer still. In a world where most software has restrictions on copying and use, FOSS is an anomaly, free to use and to…
Actinomyces cardiffensis sp. nov. from Human Clinical Sources
Hall, Val; Collins, Matthew D.; Hutson, Roger; Falsen, Enevold; Duerden, Brian I.
2002-01-01
Eight strains of a previously undescribed catalase-negative Actinomyces-like bacterium were recovered from human clinical specimens. The morphological and biochemical characteristics of the isolates were consistent with their assignment to the genus Actinomyces, but they did not appear to correspond to any recognized species. 16S rRNA gene sequence analysis showed the organisms represent a hitherto unknown species within the genus Actinomyces related to, albeit distinct from, a group of species which includes Actinomyces turicensis and close relatives. Based on biochemical and molecular genetic evidence, it is proposed that the unknown isolates from human clinical sources be classified as a new species, Actinomyces cardiffensis sp. nov. The type strain of Actinomyces cardiffensis is CCUG 44997T. PMID:12202588
Fajardo, Geroncio C; Posid, Joseph; Papagiotas, Stephen; Lowe, Luis
2015-01-01
There have been periodic electronic news media reports of potential bioterrorism-related incidents involving unknown substances (often referred to as "white powder") since the 2001 intentional dissemination of Bacillus anthracis through the U.S. Postal System. This study reviewed the number of unknown "white powder" incidents reported online by the electronic news media and compared them with unknown "white powder" incidents reported to the U.S. Centers for Disease Control and Prevention (CDC) and the U.S. Federal Bureau of Investigation (FBI) during a 2-year period from June 1, 2009 and May 31, 2011. Results identified 297 electronic news media reports, 538 CDC reports, and 384 FBI reports of unknown "white powder." This study showed different unknown "white powder" incidents captured by each of the three sources. However, the authors could not determine the public health implications of this discordance. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
Swift J1822.3-1606: A Probable New SGR in Ground Analysis of BAT Data
NASA Astrophysics Data System (ADS)
Cummings, J. R.; Burrows, D.; Campana, S.; Kennea, J. A.; Krimm, H. A.; Palmer, D. M.; Sakamoto, T.; Zan, S.
2011-07-01
On 2011-07-14 at 12:47:47.1 UTC, Swift-BAT triggered (#457261) on a previously unknown source, Swift J1822.3-1606. This was at the same time as Fermi-GBM trigger #332340476. Only a subthreshold source was detected onboard. There were two subsequent rate increases of similar size, probably from the same source, at about T+26 sec and T+308 sec, the latter also causing a rate trigger with no significant source found onboard (#457263).
Microbial source tracking (MST) assays have been mostly employed in temperate climates. However, their value as monitoring tools in tropical and subtropical regions is unknown since the geographic and temporal stability of the assays has not been extensively tested. The objective...
Musicians have better memory than nonmusicians: A meta-analysis.
Talamini, Francesca; Altoè, Gianmarco; Carretti, Barbara; Grassi, Massimo
2017-01-01
Several studies have found that musicians perform better than nonmusicians in memory tasks, but this is not always the case, and the strength of this apparent advantage is unknown. Here, we conducted a meta-analysis with the aim of clarifying whether musicians perform better than nonmusicians in memory tasks. Education Source; PEP (WEB)-Psychoanalytic Electronic Publishing; Psychology and Behavioral Science (EBSCO); PsycINFO (Ovid); PubMed; ScienceDirect-AllBooks Content (Elsevier API); SCOPUS (Elsevier API); SocINDEX with Full Text (EBSCO) and Google Scholar were searched for eligible studies. The selected studies involved two groups of participants: young adult musicians and nonmusicians. All the studies included memory tasks (loading long-term, short-term or working memory) that contained tonal, verbal or visuospatial stimuli. Three meta-analyses were run separately for long-term memory, short-term memory and working memory. We collected 29 studies, including 53 memory tasks. The results showed that musicians performed better than nonmusicians in terms of long-term memory, g = .29, 95% CI (.08-.51), short-term memory, g = .57, 95% CI (.41-.73), and working memory, g = .56, 95% CI (.33-.80). To further explore the data, we included a moderator (the type of stimulus presented, i.e., tonal, verbal or visuospatial), which was found to influence the effect size for short-term and working memory, but not for long-term memory. In terms of short-term and working memory, the musicians' advantage was large with tonal stimuli, moderate with verbal stimuli, and small or null with visuospatial stimuli. The three meta-analyses revealed a small effect size for long-term memory, and a medium effect size for short-term and working memory, suggesting that musicians perform better than nonmusicians in memory tasks. Moreover, the effect of the moderator suggested that the type of stimulus influences this advantage.
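For readers unfamiliar with how such pooled effect sizes are computed, the sketch below applies DerSimonian-Laird random-effects pooling to invented per-study Hedges' g values and variances; it is not the authors' analysis, and the numbers are illustrative only.

```python
# Minimal sketch: DerSimonian-Laird random-effects pooling of standardized mean differences.
import numpy as np

g = np.array([0.45, 0.62, 0.30, 0.71, 0.55])     # hypothetical per-study Hedges' g
v = np.array([0.04, 0.06, 0.05, 0.08, 0.03])     # hypothetical sampling variances

w_fixed = 1.0 / v
q = np.sum(w_fixed * (g - np.sum(w_fixed * g) / np.sum(w_fixed)) ** 2)   # heterogeneity Q
df = len(g) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)                     # between-study variance (DL estimator)

w = 1.0 / (v + tau2)                              # random-effects weights
g_pooled = np.sum(w * g) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
print(f"pooled g = {g_pooled:.2f}, 95% CI ({g_pooled - 1.96*se:.2f}, {g_pooled + 1.96*se:.2f})")
```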
ERIC Educational Resources Information Center
Vallès, Astrid; Granic, Ivica; De Weerd, Peter; Martens, Gerard J. M.
2014-01-01
Modulation of cortical network connectivity is crucial for an adaptive response to experience. In the rat barrel cortex, long-term sensory stimulation induces cortical network modifications and neuronal response changes of which the molecular basis is unknown. Here, we show that long-term somatosensory stimulation by enriched environment…
ERIC Educational Resources Information Center
Dintzner, Matthew R.; Kinzie, Charles R.; Pulkrabek, Kimberly A.; Arena, Anthony F.
2011-01-01
SIPCAn, an acronym for separation, isolation, purification, characterization, and analysis, is presented as a one-term, integrated project for the first-term undergraduate organic laboratory course. Students are assigned two mixtures of unknown organic compounds--a mixture of two liquid compounds and a mixture of two solid compounds--at the…
Shivaprakash, K. Nagaraju; Ramesha, B. Thimmappa; Uma Shaanker, Ramanan; Dayanandan, Selvadurai; Ravikanth, Gudasalamani
2014-01-01
Background and Question The harvesting of medicinal plants from wild sources is escalating in many parts of the world, compromising the long-term survival of natural populations of medicinally important plants and sustainability of sources of raw material to meet pharmaceutical industry needs. Although protected areas are considered to play a central role in conservation of plant genetic resources, the effectiveness of protected areas for maintaining medicinal plant populations subject to intense harvesting pressure remain largely unknown. We conducted genetic and demographic studies of Nothapodytes nimmoniana Graham, one of the extensively harvested medicinal plant species in the Western Ghats biodiversity hotspot, India to assess the effectiveness of protected areas in long-term maintenance of economically important plant species. Methodology/Principal Findings The analysis of adults and seedlings of N. nimmoniana in four protected and four non-protected areas using 7 nuclear microsatellite loci revealed that populations that are distributed within protected areas are subject to lower levels of harvesting and maintain higher genetic diversity (He = 0.816, Ho = 0.607, A = 18.857) than populations in adjoining non-protected areas (He = 0.781, Ho = 0.511, A = 15.571). Furthermore, seedlings in protected areas had significantly higher observed heterozygosity (Ho = 0.630) and private alleles as compared to seedlings in adjoining non-protected areas (Ho = 0.426). Most populations revealed signatures of recent genetic bottleneck. The prediction of long-term maintenance of genetic diversity using BOTTLESIM indicated that current population sizes of the species are not sufficient to maintain 90% of present genetic diversity for next 100 years. Conclusions/Significance Overall, these results highlight the need for establishing more protected areas encompassing a large number of adult plants in the Western Ghats to conserve genetic diversity of economically and medicinally important plant species. PMID:25493426
Zhu, Xiaoyan; Shen, Wenqiang; Huang, Junyang; Zhang, Tianquan; Zhang, Xiaobo; Cui, Yuanjiang; Sang, Xianchun; Ling, Yinghua; Li, Yunfeng; Wang, Nan; Zhao, Fangmin; Zhang, Changwei; Yang, Zhenglin; He, Guanghua
2018-03-01
Sugars are the most abundant organic compounds produced by plants, and can be used to build carbon skeletons and generate energy. The sugar accumulation 1 (OsSAC1) gene encodes a protein with an unknown function that exhibits four N-terminal transmembrane regions and two conserved domains of unknown function, DUF4220 and DUF594. OsSAC1 was found to be poorly and specifically expressed at the bottoms of young leaves and in the developing leaf sheaths. Subcellular location results showed that OsSAC1 was co-localized with ER:mCherry and targeted the endoplasmic reticulum (ER). OsSAC1 has been found to affect sugar partitioning in rice (Oryza sativa). I2/KI starch staining, ultrastructure observations and starch content measurements indicated that more and larger starch granules accumulated in ossac1 source leaves than in wild-type (WT) source leaves. Additionally, higher sucrose and glucose concentrations accumulated in the ossac1 source leaves than in WT source leaves, whereas lower sucrose and glucose concentrations were observed in the ossac1 young leaves and developing leaf sheaths than in those of the WT. Much greater expression of OsAGPL1 and OsAGPS1 (responsible for starch synthesis) and significantly less expression of OscFBP1, OscFBP2, OsSPS1 and OsSPS11 (responsible for sucrose synthesis) and OsSWEET11, OsSWEET14 and OsSUT1 (responsible for sucrose loading) occurred in ossac1 source leaves than in WT source leaves. A greater amount of the rice plasmodesmatal negative regulator OsGSD1 was detected in ossac1 young leaves and developing leaf sheaths than in those of the WT. These results suggest that ER-targeted OsSAC1 may indirectly regulate sugar partitioning in carbon-demanding young leaves and developing leaf sheaths.
Sensitivity Analysis for some Water Pollution Problem
NASA Astrophysics Data System (ADS)
Le Dimet, François-Xavier; Tran Thu, Ha; Hussaini, Yousuff
2014-05-01
Sensitivity analysis employs a response function and the variable with respect to which its sensitivity is evaluated. If the state of the system is retrieved through a variational data assimilation process, then the observations appear only in the Optimality System (OS). In many cases, observations have errors and it is important to estimate their impact. Therefore, sensitivity analysis has to be carried out on the OS, and in that sense sensitivity analysis is a second-order property. The OS can be considered as a generalized model because it contains all the available information. This presentation proposes a general method to carry out such sensitivity analysis. The method is demonstrated with an application to a water pollution problem. The model involves the shallow water equations and an equation for the pollutant concentration. These equations are discretized using a finite volume method. The response function depends on the pollutant source, and its sensitivity with respect to the source term of the pollutant is studied. Specifically, we consider: • identification of unknown parameters, and • identification of sources of pollution and sensitivity with respect to the sources. We also use a Singular Evolutive Interpolated Kalman Filter to study this problem. The presentation includes a comparison of the results from these two methods.
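A minimal sketch of adjoint-based sensitivity of a response function with respect to a source term, on an invented 1-D linear advection-diffusion toy model rather than the shallow-water system used in the study; the grid, velocity, diffusivity, and receptor location are assumptions.

```python
# Minimal sketch: adjoint sensitivity of J = c_final[receptor] with respect to the source
# term of a linear 1-D advection-diffusion model, checked against a finite difference.
import numpy as np

nx, nsteps, dx, dt = 50, 200, 1.0, 0.1
u, kappa = 1.0, 0.5                                   # advection speed, diffusivity

# Linear one-step operator (upwind advection + explicit diffusion); boundary cells are
# simply held fixed for this crude toy model.
L = np.eye(nx)
for i in range(1, nx - 1):
    L[i, i]     += -u * dt / dx - 2 * kappa * dt / dx**2
    L[i, i - 1] += u * dt / dx + kappa * dt / dx**2
    L[i, i + 1] += kappa * dt / dx**2

def forward(source):
    c = np.zeros(nx)
    for _ in range(nsteps):
        c = L @ c + dt * source
    return c

receptor = 40
# Adjoint run: accumulate dJ/dsource where J = c_final[receptor]
lam, grad = np.zeros(nx), np.zeros(nx)
lam[receptor] = 1.0
for _ in range(nsteps):
    grad += dt * lam
    lam = L.T @ lam

# Check one component against a finite-difference perturbation of the source
s0, eps, j = np.zeros(nx), 1e-6, 10
s1 = s0.copy(); s1[j] += eps
fd = (forward(s1)[receptor] - forward(s0)[receptor]) / eps
print("adjoint sensitivity:", grad[j], " finite difference:", fd)
```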
MIPS AGN and Galaxy Evolution Survey
NASA Astrophysics Data System (ADS)
Jannuzi, Buell; Armus, Lee; Borys, Colin; Brand, Kate; Brodwin, Mark; Brown, Michael; Cool, Richard; Desai, Vandana; Dey, Arjun; Dickinson, Mark; Dole, Herve; Eisenstein, Daniel; Kochanek, Christopher; Le Floc'h, Emeric; Morrison, Jane; Papovich, Casey; Perez-Gonzalez, Pablo; Rieke, George; Rieke, Marcia; Stern, Daniel; Weiner, Ben; Zehavi, Idit
2008-03-01
We propose a far-IR survey of the 9 square degree Bootes field of the NOAO Deep Wide-Field Survey (NDWFS) to 5-sigma flux limits of 0.2, 12.8 and 120 mJy to detect approximately 60000, 3000, and 400 sources at 24, 70 and 160 microns respectively. By combining observations at different roll angles, our maps will have excellent control of detector drifts, enabling precise fluctuation analyses in all three maps. In combination with the matching X-ray, UV, optical, near-IR, and mid-IR photometry, variability data, and the 22,000 spectroscopic redshifts for the field, we have three primary goals. First, we will survey the evolution of LIRGS/ULIRGS to redshifts of 0.6/1.3 at 24 microns and 0.4/0.8 at 70 microns. Over 500 0.6
"A Marriage on the Rocks": An Unknown Letter by William H. Kilpatrick about His Project Method
ERIC Educational Resources Information Center
Knoll, Michael
2010-01-01
William H. Kilpatrick is known worldwide as "Mr. Project Method." But the origin of his celebrated paper of 1918 has never been explored. The discovery of a hitherto unknown letter reveals that Kilpatrick was an educational entrepreneur who, without regard for language and tradition, adopted the term "project" and used it in a provocative new way…
Reder, Lynne M; Victoria, Lindsay W; Manelis, Anna; Oates, Joyce M; Dutcher, Janine M; Bates, Jordan T; Cook, Shaun; Aizenstein, Howard J; Quinlan, Joseph; Gyulai, Ferenc
2013-03-01
In two experiments, we provided support for the hypothesis that stimuli with preexisting memory representations (e.g., famous faces) are easier to associate to their encoding context than are stimuli that lack long-term memory representations (e.g., unknown faces). Subjects viewed faces superimposed on different backgrounds (e.g., the Eiffel Tower). Face recognition on a surprise memory test was better when the encoding background was reinstated than when it was swapped with a different background; however, the reinstatement advantage was modulated by how many faces had been seen with a given background, and reinstatement did not improve recognition for unknown faces. The follow-up experiment added a drug intervention that inhibited the ability to form new associations. Context reinstatement did not improve recognition for famous or unknown faces under the influence of the drug. The results suggest that it is easier to associate context to faces that have a preexisting long-term memory representation than to faces that do not.
ERIC Educational Resources Information Center
Meerschman, Iris; Van Lierde, Kristiane; Van Puyvelde, Caro; Bostyn, Astrid; Claeys, Sofie; D'haeseleer, Evelien
2018-01-01
Background: In contrast with most medical and pharmaceutical therapies, the optimal dosage for voice therapy or training is unknown. Aims: The aim of this study was to compare the effect of a short-term intensive voice training (IVT) with a longer-term traditional voice training (TVT) on the vocal quality and vocal capacities of vocally healthy…
NASA Astrophysics Data System (ADS)
Delle Monache, L.; Rodriguez, L. M.; Meech, S.; Hahn, D.; Betancourt, T.; Steinhoff, D.
2016-12-01
It is necessary to accurately estimate the initial source characteristics in the event of an accidental or intentional release of a Chemical, Biological, Radiological, or Nuclear (CBRN) agent into the atmosphere. Accurate estimation of the source characteristics is important because they are often unknown, and the Atmospheric Transport and Dispersion (AT&D) models rely heavily on these estimates to create hazard assessments. To correctly assess the source characteristics in an operational environment where time is critical, the National Center for Atmospheric Research (NCAR) has developed a Source Term Estimation (STE) method known as the Variational Iterative Refinement STE Algorithm (VIRSA). VIRSA consists of a combination of modeling systems. These systems include an AT&D model, its corresponding STE model, a Hybrid Lagrangian-Eulerian Plume Model (H-LEPM), and its mathematical adjoint model. In an operational scenario where we have information regarding the infrastructure of a city, the AT&D model used is the Urban Dispersion Model (UDM), and when using this model in VIRSA we refer to the system as uVIRSA. In all other scenarios where we do not have the city infrastructure information readily available, the AT&D model used is the Second-order Closure Integrated PUFF model (SCIPUFF), and the system is referred to as sVIRSA. VIRSA was originally developed using SCIPUFF 2.4 for the Defense Threat Reduction Agency and integrated into the Hazard Prediction and Assessment Capability and Joint Program for Information Systems Joint Effects Model. The results discussed here are the verification and validation of the upgraded system with SCIPUFF 3.0 and the newly implemented UDM capability. To verify uVIRSA and sVIRSA, synthetic concentration observation scenarios were created in urban and rural environments, and the results of this verification are shown. Finally, we validate the STE performance of uVIRSA using scenarios from the Joint Urban 2003 (JU03) experiment, which was held in Oklahoma City, and validate the performance of sVIRSA using scenarios from the FUsing Sensor Integrated Observing Network (FUSION) Field Trial 2007 (FFT07), held at Dugway Proving Grounds in rural Utah.
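A highly simplified sketch of the source-term estimation idea (not VIRSA itself): an unknown release location and rate are recovered by minimizing the misfit between observed and modeled concentrations, with a crude Gaussian footprint standing in for the AT&D model; all geometry and numbers are invented.

```python
# Minimal sketch: estimate an unknown release rate and location from synthetic observations
# by coarse search over location plus linear least squares for the rate.
import numpy as np

def plume(x_src, y_src, q, receptors, sigma=30.0):
    """Very crude Gaussian 'footprint' standing in for an AT&D model."""
    d2 = (receptors[:, 0] - x_src) ** 2 + (receptors[:, 1] - y_src) ** 2
    return q * np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(5)
receptors = rng.uniform(0, 200, size=(25, 2))            # sensor positions (m)
true_src, true_q = (120.0, 80.0), 3.0
obs = plume(*true_src, true_q, receptors) + 0.02 * rng.normal(size=len(receptors))

best = None
for xs in np.linspace(0, 200, 41):                        # coarse search over location
    for ys in np.linspace(0, 200, 41):
        f = plume(xs, ys, 1.0, receptors)                 # unit-rate footprint
        q_hat = (f @ obs) / (f @ f)                       # rate is a linear least-squares fit
        cost = np.sum((obs - q_hat * f) ** 2)
        if best is None or cost < best[0]:
            best = (cost, xs, ys, q_hat)
print("estimated location (%.0f, %.0f) m, rate %.2f" % best[1:])
```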
The lasting memory enhancements of retrospective attention
Reaves, Sarah; Strunk, Jonathan; Phillips, Shekinah; Verhaeghen, Paul; Duarte, Audrey
2016-01-01
Behavioral research has shown that spatial cues that orient attention toward task relevant items being maintained in visual short-term memory (VSTM) enhance item memory accuracy. However, it is unknown if these retrospective attentional cues (“retro-cues”) enhance memory beyond typical short-term memory delays. It is also unknown whether retro-cues affect the spatial information associated with VSTM representations. Emerging evidence suggests that processes that affect short-term memory maintenance may also affect long-term memory (LTM) but little work has investigated the role of attention in LTM. In the current event-related potential (ERP) study, we investigated the duration of retrospective attention effects and the impact of retrospective attention manipulations on VSTM representations. Results revealed that retro-cueing improved both VSTM and LTM memory accuracy and that posterior maximal ERPs observed during VSTM maintenance predicted subsequent LTM performance. N2pc ERPs associated with attentional selection were attenuated by retro-cueing suggesting that retrospective attention may disrupt maintenance of spatial configural information in VSTM. Collectively, these findings suggest that retrospective attention can alter the structure of memory representations, which impacts memory performance beyond short-term memory delays. PMID:27038756
Exposure to air pollution particles can be associated with increased human morbidity and mortality. The mechanism(s) of lung injury remains unknown. We tested the hypothesis that lung exposure to oil fly ash (an emission source air pollution particle) causes in vivo free radical ...
AGN classification for X-ray sources in the 105 month Swift/BAT survey
NASA Astrophysics Data System (ADS)
Masetti, N.; Bassani, L.; Palazzi, E.; Malizia, A.; Stephen, J. B.; Ubertini, P.
2018-03-01
We here provide classifications for 8 hard X-ray sources listed as 'unknown AGN' in the 105 month Swift/BAT all-sky survey catalogue (Oh et al. 2018, ApJS, 235, 4). The corresponding optical spectra were extracted from the 6dF Galaxy Survey (Jones et al. 2009, MNRAS, 399, 683).
Alfonse, Lauren E; Garrett, Amanda D; Lun, Desmond S; Duffy, Ken R; Grgicak, Catherine M
2018-01-01
DNA-based human identity testing is conducted by comparison of PCR-amplified polymorphic Short Tandem Repeat (STR) motifs from a known source with the STR profiles obtained from uncertain sources. Samples such as those found at crime scenes often result in signal that is a composite of incomplete STR profiles from an unknown number of unknown contributors, making interpretation an arduous task. To facilitate advancement in STR interpretation challenges we provide over 25,000 multiplex STR profiles produced from one to five known individuals at target levels ranging from one to 160 copies of DNA. The data, generated under 144 laboratory conditions, are classified by total copy number and contributor proportions. For the 70% of samples that were synthetically compromised, we report the level of DNA damage using quantitative and end-point PCR. In addition, we characterize the complexity of the signal by exploring the number of detected alleles in each profile. Copyright © 2017 Elsevier B.V. All rights reserved.
Identifying chemicals that are planetary boundary threats.
MacLeod, Matthew; Breitholtz, Magnus; Cousins, Ian T; de Wit, Cynthia A; Persson, Linn M; Rudén, Christina; McLachlan, Michael S
2014-10-07
Rockström et al. proposed a set of planetary boundaries that delimit a "safe operating space for humanity". Many of the planetary boundaries that have so far been identified are determined by chemical agents. Other chemical pollution-related planetary boundaries likely exist, but are currently unknown. A chemical poses an unknown planetary boundary threat if it simultaneously fulfills three conditions: (1) it has an unknown disruptive effect on a vital Earth system process; (2) the disruptive effect is not discovered until it is a problem at the global scale, and (3) the effect is not readily reversible. In this paper, we outline scenarios in which chemicals could fulfill each of the three conditions, then use the scenarios as the basis to define chemical profiles that fit each scenario. The chemical profiles are defined in terms of the nature of the effect of the chemical and the nature of exposure of the environment to the chemical. Prioritization of chemicals in commerce against some of the profiles appears feasible, but there are considerable uncertainties and scientific challenges that must be addressed. Most challenging is prioritizing chemicals for their potential to have a currently unknown effect on a vital Earth system process. We conclude that the most effective strategy currently available to identify chemicals that are planetary boundary threats is prioritization against profiles defined in terms of environmental exposure combined with monitoring and study of the biogeochemical processes that underlie vital Earth system processes to identify currently unknown disruptive effects.
[Kawasaki disease in children and adolescents].
Neudorf, U
2011-12-01
Kawasaki disease (KD) is a systemic vasculitis of unknown etiology. The diagnostic criteria are fulfilled with fever of unknown origin and 4 of the following 5 criteria: bilateral conjunctival injection, cervical lymphadenopathy, polymorphous rash, oral mucous membrane changes (injected lips, strawberry tongue) and peripheral extremity changes (erythema, edema, desquamation). If fewer than 4 criteria are found, incomplete KD can be diagnosed. The therapy is a single dose of 2 g/kg body weight of intravenous immunoglobulin plus acetylsalicylic acid (ASA). In the long-term follow-up the main focus is on the coronary arteries, because coronary changes play a key role in the intensity of long-term management. There is some evidence that KD is a risk factor for cardiovascular diseases in adults.
Newborn human brain identifies repeated auditory feature conjunctions of low sequential probability.
Ruusuvirta, Timo; Huotilainen, Minna; Fellman, Vineta; Näätänen, Risto
2004-11-01
Natural environments are usually composed of multiple sources for sounds. The sounds might physically differ from one another only as feature conjunctions, and several of them might occur repeatedly in the short term. Nevertheless, the detection of rare sounds requires the identification of the repeated ones. Adults have some limited ability to effortlessly identify repeated sounds in such acoustically complex environments, but the developmental onset of this finite ability is unknown. Sleeping newborn infants were presented with a repeated tone carrying six frequent (P = 0.15 each) and six rare (P approximately 0.017 each) conjunctions of its frequency, intensity and duration. Event-related potentials recorded from the infants' scalp were found to shift in amplitude towards positive polarity selectively in response to rare conjunctions. This finding suggests that humans are relatively hard-wired to preattentively identify repeated auditory feature conjunctions even when such conjunctions occur rarely among other similar ones.
Potential Implication of Residual Viremia in Patients on Effective Antiretroviral Therapy
2015-01-01
Abstract The current antiretroviral therapy (ART) has suppressed viremia to below the limit of detection of clinical viral load assays; however, it cannot eliminate viremia completely in the body even after prolonged treatment. Plasma HIV-1 loads persist at extremely low levels below the clinical detection limit. This low-level viremia (termed “residual viremia”) cannot be abolished in most patients, even after the addition of a new class of drug, i.e., viral integrase inhibitor, to the combined antiretroviral regimens. Neither the cellular source nor the clinical significance of this residual viremia in patients on ART remains fully clear at present. Since residual plasma viruses generally do not evolve with time in the presence of effective ART, one prediction is that these viruses are persistently released at low levels from one or more stable but yet unknown HIV-1 reservoirs in the body during therapy. This review attempts to emphasize the source of residual viremia as another important reservoir (namely, “active reservoir”) distinct from the well-known latent HIV-1 reservoir in the body, and why its elimination should be a priority in the effort for HIV-1 eradication. PMID:25428885
Rotating plasma structures in the cross-field discharge of Hall thrusters
NASA Astrophysics Data System (ADS)
Mazouffre, Stephane; Grimaud, Lou; Tsikata, Sedina; Matyash, Konstantin
2016-09-01
Rotating plasma structures, also termed rotating spokes, are observed in various types of low-pressure discharges with crossed electric and magnetic field configurations, such as Penning sources, magnetron discharges, negative ion sources and Hall thrusters. Such structures correspond to large-scale high-density plasma blocks that rotate in the E×B drift direction with a typical frequency on the order of a few kHz. Although such structures have been extensively studied in many communities, the mechanism at their origin and their role in electron transport across the magnetic field remain unknown. Here, we will present insights into the nature of spokes, gained from a combination of experiments and advanced particle-in-cell numerical simulations that aim at better understanding the physics and the impact of rotating plasma structures in the ExB discharge of the Hall thruster. As rotating spokes appear in the ionization region of such thrusters, and are therefore difficult to probe with diagnostics, experiments have been performed with a wall-less Hall thruster. In this configuration, the entire plasma discharge is pushed outside the dielectric cavity, through which the gas is injected, using the combination of specific magnetic field topology with appropriate anode geometry.
Rate/state Coulomb stress transfer model for the CSEP Japan seismicity forecast
NASA Astrophysics Data System (ADS)
Toda, Shinji; Enescu, Bogdan
2011-03-01
Numerous retrospective studies have found that the seismicity rate jumps (drops) in response to a coseismic Coulomb stress increase (decrease). The Collaboratory for the Study of Earthquake Predictability (CSEP) instead provides us with an opportunity for prospective testing of the Coulomb hypothesis. Here we adapt our stress transfer model, incorporating the rate- and state-dependent friction law, to the CSEP Japan seismicity forecast. We demonstrate how to compute the forecast rates of large shocks in 2009 using the large earthquakes during the past 120 years. The time-dependent impact of the coseismic stress perturbations explains qualitatively well the occurrence of the recent moderate-size shocks. Such ability is partly similar to that of statistical earthquake clustering models. However, our model differs from them as follows: the off-fault aftershock zones can be simulated using finite fault sources; the regional areal patterns of triggered seismicity are modified by the dominant mechanisms of the potential sources; and the imparted stresses due to large earthquakes produce stress shadows that lead to a reduction of the forecasted number of earthquakes. Although the model relies on several unknown parameters, it is the first physics-based model submitted to the CSEP Japan test center and has the potential to be tuned for short-term earthquake forecasts.
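A minimal sketch of the Dieterich-type rate-and-state response of seismicity rate to a Coulomb stress step, the basic ingredient of the forecast model described above; the background rate, A-sigma, and aftershock duration are invented values, not those used in the study.

```python
# Minimal sketch: seismicity rate after a coseismic Coulomb stress step in the
# rate-and-state framework (Dieterich-type response), with illustrative parameters.
import numpy as np

r_background = 1.0       # reference seismicity rate (events per year)
a_sigma = 0.05           # A*sigma, constitutive parameter (MPa)
t_a = 10.0               # aftershock relaxation time (years)
d_cfs = 0.2              # coseismic Coulomb stress change (MPa); negative gives a shadow

def rate(t, d_cfs):
    """Seismicity rate at time t after a stress step d_cfs."""
    gamma = (np.exp(-d_cfs / a_sigma) - 1.0) * np.exp(-t / t_a) + 1.0
    return r_background / gamma

for t in (0.1, 1.0, 5.0, 20.0):
    print(f"t = {t:4.1f} yr: rate = {rate(t, d_cfs):.2f} (stress increase), "
          f"{rate(t, -d_cfs):.2f} (stress shadow)")
```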
Determining the Intensity of a Point-Like Source Observed on the Background of AN Extended Source
NASA Astrophysics Data System (ADS)
Kornienko, Y. V.; Skuratovskiy, S. I.
2014-12-01
The problem of determining the time dependence of the intensity of a point-like source in the presence of atmospheric blur is formulated and solved using the Bayesian statistical approach. The point-like source is assumed to be observed against the background of an extended source whose brightness is constant in time but unknown. The equation system for optimal statistical estimation of the sequence of intensity values at the observation times is obtained. The problem is particularly relevant for studying gravitational mirages, which appear when a quasar is observed through the gravitational field of a distant galaxy.
Chandran, A; Mazumder, A
2015-12-01
The aims of this study were to investigate the temporal variation in Escherichia coli density and its sources at the drinking water intake of Comox Lake over a period of 3 years (2011-2013). Density of E. coli was assessed by the standard membrane filtration method. Source tracking of E. coli was done using the BOX-A1R-based rep-PCR DNA fingerprinting method. Over the years, the mean E. coli density ranged from nondetectable to 9·8 CFU 100 ml(-1) . The density of E. coli did not differ significantly among years (P > 0·05); however, a comparatively higher density was observed during the fall. Wildlife (64·28%, 153/238) was identified as the major contributing source of E. coli, followed by human (18·06%, 43/238) and unknown sources (17·64%, 42/238). Although the sources varied by year and season, overall the predominant contributing sources were black bear, human, unknown, elk, horse and gull. The findings of this investigation identified the multiple animal sources contributing faecal bacteria to the drinking water intake of Comox Lake and their varying temporal occurrence. The results of this study can reliably inform the authorities about the most vulnerable period (season) of faecal bacterial loading and the potential sources in the lake, for improving risk assessment and pollution mitigation. © 2015 The Society for Applied Microbiology.
NASA Astrophysics Data System (ADS)
Crane, P.; Silliman, S. E.; Boukari, M.; Atoro, I.; Azonsi, F.
2005-12-01
Deteriorating groundwater quality, as represented by high nitrates, in the Colline province of Benin, West Africa, was identified by the Benin national water agency, the Direction de l'Hydraulique. For unknown reasons, the Colline province had consistently higher nitrate levels than any other region of the country. In an effort to address this water quality issue, a collaborative team was created that incorporated professionals from the Universite d'Abomey-Calavi (Benin), the University of Notre Dame (USA), the Direction de l'Hydraulique (a government water agency in Benin), Centre Afrika Obota (an educational NGO in Benin), and the local population of the village of Adourekoman. The goals of the project were to: (i) identify the source of nitrates, (ii) test field techniques for long-term, local monitoring, and (iii) identify possible solutions to the high levels of groundwater nitrates. To accomplish these goals, the following methods were used: regional sampling of groundwater quality, field methods that allowed the local population to regularly monitor village groundwater quality, isotopic analysis, and sociological methods of surveys, focus groups, and observations. The combination of these multidisciplinary methods allowed all three goals to be addressed successfully, leading to a preliminary identification of the sources of nitrates in the village of Adourekoman, confirmation of the utility of the field techniques, and an initial assessment of possible solutions to the contamination problem.
Overweight and diabetes prevention: is a low-carbohydrate-high-fat diet recommendable?
Brouns, Fred
2018-03-14
In the past, different types of diet with a generally low carbohydrate content (<50 to <20 g/day) have been promoted for weight loss and diabetes, and the effectiveness of a very low dietary carbohydrate content has always been a matter of debate. A significant reduction in the amount of carbohydrate in the diet is usually accompanied by an increase in the amount of fat and, to a lesser extent, protein. Accordingly, the term "low carb-high fat" (LCHF) diet is most appropriate. Low or very low intakes of carbohydrate food sources may affect overall diet quality, and the long-term effects of such drastic dietary changes remain unknown at present. This narrative review highlights recent metabolic and clinical outcomes of studies as well as the practical feasibility of LCHF diets. A few relevant observations are as follows: (1) any diet type resulting in reduced energy intake will result in weight loss and related favorable metabolic and functional changes; (2) short-term LCHF studies show both favorable and less desirable effects; (3) sustained adherence to a ketogenic LCHF diet appears to be difficult, and a non-ketogenic diet supplying 100-150 g carbohydrate/day, under good control, may be more practical; (4) there is a lack of data supporting the long-term efficacy, safety and health benefits of LCHF diets, and any recommendation should be judged in this light; (5) lifestyle intervention in people at high risk of developing type 2 diabetes, while maintaining a relatively carbohydrate-rich diet, results in long-term prevention of progression to type 2 diabetes and is generally seen as safe.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Englbrecht, F; Lindner, F; Bin, J
2016-06-15
Purpose: To measure and simulate well-defined electron spectra using a linear accelerator and a permanent-magnet wide-angle spectrometer, in order to test the performance of a novel reconstruction algorithm for retrieval of unknown electron sources, in view of application to diagnostics of laser-driven particle acceleration. Methods: Six electron energies (6, 9, 12, 15, 18 and 21 MeV, 40cm × 40cm field size) delivered by a Siemens Oncor linear accelerator were recorded using a permanent-magnet wide-angle electron spectrometer (150mT) with a one-dimensional slit (0.2mm × 5cm). Two-dimensional maps representing beam energy and entrance position along the slit were measured using different scintillating screens, read by an online CMOS detector of high resolution (0.048mm × 0.048mm pixels) and large field of view (5cm × 10cm). Measured energy-slit position maps were compared to forward FLUKA simulations of electron transport through the spectrometer, starting from IAEA phase-spaces of the accelerator. The latter were validated against measured depth-dose and lateral profiles in water. Agreement of forward simulation and measurement was quantified in terms of the position and shape of the signal distribution on the detector. Results: Measured depth-dose distributions and lateral profiles in the water phantom showed good agreement with forward simulations of the IAEA phase-spaces, thus supporting usage of this simulation source in the study. Measured energy-slit position maps and those obtained by forward Monte-Carlo simulations showed satisfactory agreement in shape and position. Conclusion: Well-defined electron beams of known energy and shape will provide an ideal scenario to study the performance of a novel reconstruction algorithm using measured and simulated signal. Future work will increase the stability and convergence of the reconstruction algorithm for unknown electron sources, towards final application to the electrons which drive the interaction of TW-class laser pulses with nanometer-thin target foils to accelerate protons and ions to multi-MeV kinetic energy. Cluster of Excellence of the German Research Foundation (DFG) "Munich-Centre for Advanced Photonics".
NASA Technical Reports Server (NTRS)
Pindera, Marek-Jerzy; Bednarcyk, Brett A.
1997-01-01
An efficient implementation of the generalized method of cells micromechanics model is presented that allows analysis of periodic unidirectional composites characterized by repeating unit cells containing thousands of subcells. The original formulation, given in terms of Hill's strain concentration matrices that relate average subcell strains to the macroscopic strains, is reformulated in terms of the interfacial subcell tractions as the basic unknowns. This is accomplished by expressing the displacement continuity equations in terms of the stresses and then imposing the traction continuity conditions directly. The result is a mixed formulation wherein the unknown interfacial subcell traction components are related to the macroscopic strain components. Because the stress field throughout the repeating unit cell is piece-wise uniform, the imposition of traction continuity conditions directly in the displacement continuity equations, expressed in terms of stresses, substantially reduces the number of unknown subcell traction (and stress) components, and thus the size of the system of equations that must be solved. Further reduction in the size of the system of continuity equations is obtained by separating the normal and shear traction equations in those instances where the individual subcells are, at most, orthotropic. The reformulated version facilitates detailed analysis of the impact of the fiber cross-section geometry and arrangement on the response of multi-phased unidirectional composites with and without evolving damage. Comparison of execution times obtained with the original and reformulated versions of the generalized method of cells demonstrates the new version's efficiency.
Stochastic reduced order models for inverse problems under uncertainty
Warner, James E.; Aquino, Wilkins; Grigoriu, Mircea D.
2014-01-01
This work presents a novel methodology for solving inverse problems under uncertainty using stochastic reduced order models (SROMs). Given statistical information about an observed state variable in a system, unknown parameters are estimated probabilistically through the solution of a model-constrained, stochastic optimization problem. The point of departure and crux of the proposed framework is the representation of a random quantity using a SROM - a low dimensional, discrete approximation to a continuous random element that permits efficient and non-intrusive stochastic computations. Characterizing the uncertainties with SROMs transforms the stochastic optimization problem into a deterministic one. The non-intrusive nature of SROMs facilitates efficient gradient computations for random vector unknowns and relies entirely on calls to existing deterministic solvers. Furthermore, the method is naturally extended to handle multiple sources of uncertainty in cases where state variable data, system parameters, and boundary conditions are all considered random. The new and widely-applicable SROM framework is formulated for a general stochastic optimization problem in terms of an abstract objective function and constraining model. For demonstration purposes, however, we study its performance in the specific case of inverse identification of random material parameters in elastodynamics. We demonstrate the ability to efficiently recover random shear moduli given material displacement statistics as input data. We also show that the approach remains effective for the case where the loading in the problem is random as well. PMID:25558115
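A minimal sketch of the SROM idea, assuming a toy scalar forward model: the unknown random parameter is replaced by a small set of samples with probabilities, so the stochastic objective becomes a deterministic, probability-weighted sum over deterministic solver calls. The load value, the target statistics and the choice of fixed equal probabilities are all illustrative assumptions, not part of the cited framework.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical scalar forward model: displacement u = load / stiffness.
LOAD = 2.0
def forward(stiffness):
    return LOAD / stiffness

# "Observed" displacement statistics (e.g. estimated from measurements).
obs_mean, obs_var = 1.0, 0.04

# SROM of the unknown random stiffness: m samples with fixed, equal probabilities.
# (In general the probabilities are optimised too; they are held fixed here for brevity.)
m = 5
p = np.full(m, 1.0 / m)

def objective(x):
    u = forward(x)                          # one deterministic solve per SROM sample
    mean = np.sum(p * u)
    var = np.sum(p * (u - mean) ** 2)
    return (mean - obs_mean) ** 2 + (var - obs_var) ** 2

x0 = np.linspace(1.5, 2.5, m)               # initial SROM sample values
res = minimize(objective, x0, method="Nelder-Mead")
srom_samples = res.x                         # discrete approximation of the stiffness distribution
```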
The Swift-BAT Hard X-ray Transient Monitor
NASA Technical Reports Server (NTRS)
Krimm, Hans; Markwardt, C. B.; Sanwal, D.; Tueller, J.
2006-01-01
The Burst Alert Telescope (BAT) on the Swift satellite is a large field of view instrument that continually monitors the sky to provide the gamma-ray burst trigger for Swift. An average of more than 70% of the sky is observed on a daily basis. The survey mode data is processed on two sets of time scales: from one minute to one day as part of the transient monitor program, and from one spacecraft pointing (approx. 20 minutes) to the full mission duration for the hard X-ray survey program. The transient monitor has recently become public through the web site http://swift.gsfc.nasa.gov/docs/swift/results/transients/. Sky images are processed to detect astrophysical sources in the 15-50 keV energy band, and the detected flux or upper limit is calculated for >100 sources on time scales up to one day. Light curves are updated each time new BAT data become available (approx. 10 times daily). In addition, the monitor is sensitive to an outburst from a new or unknown source. Sensitivity as a function of time scale for catalog and unknown sources will be presented. The daily exposure for a typical source is approx. 1500-3000 seconds, with a 1-sigma sensitivity of approx. 4 mCrab. 90% of the sources are sampled at least every 16 days, but many sources are sampled daily. It is expected that the Swift-BAT transient monitor will become an important resource for the high energy astrophysics community.
Swift-BAT: Transient Source Monitoring
NASA Astrophysics Data System (ADS)
Barbier, L. M.; Barthelmy, S.; Cummings, J.; Gehrels, N.; Krimm, H.; Markwardt, C.; Mushotzky, R.; Parsons, A.; Sakamoto, T.; Tueller, J.; Fenimore, E.; Palmer, D.; Skinner, G.; Swift-BAT Team
2005-12-01
The Burst Alert Telescope (BAT) on the Swift satellite is a large field of view instrument that continually monitors the sky to provide the gamma-ray burst trigger for Swift. An average of more than 70% of the sky is observed on a daily basis. The survey mode data is processed on two sets of time scales: from one minute to one day as part of the transient monitor program, and from one spacecraft pointing (~20 minutes) to the full mission duration for the hard X-ray survey program. In the transient monitor program, sky images are processed to detect astrophysical sources in six energy bands covering 15-350 keV. The detected flux or upper limit in each energy band is calculated for >300 objects on time scales up to one day. In addition, the monitor is sensitive to an outburst from a new or unknown source. Sensitivity as a function of time scale for catalog and unknown sources will be presented. The daily exposure for a typical source is ~1500-3000 seconds, with a 1-sigma sensitivity of ~4 mCrab. 90% of the sources are sampled at least every 16 days, but many sources are sampled daily. The BAT team will soon make the results of the transient monitor public to the astrophysical community through the Swift mission web page. It is expected that the Swift-BAT transient monitor will become an important resource for the high energy astrophysics community.
Crop rotations and poultry litter impact dynamic soil chemical properties and soil biota long-term
USDA-ARS?s Scientific Manuscript database
Dynamic soil physiochemical interactions with conservation agricultural practices and soil biota are largely unknown. Therefore, this study aims to quantify long-term (12-yr) impacts of cover crops, poultry litter, crop rotations, and conservation tillage and their interactions on soil physiochemica...
Long-Term Care: Common Issues and Unknowns
ERIC Educational Resources Information Center
Swartz, Katherine; Miake, Naoko; Farag, Nadine
2012-01-01
All industrialized countries are grappling with a common problem--how to provide assistance of various kinds to their rapidly aging populations. The problem for countries searching for models of efficient and high-quality long-term care (LTC) policies is that fewer than a dozen countries have government-organized, formal LTC policies. Relatively…
High performance GPU processing for inversion using uniform grid searches
NASA Astrophysics Data System (ADS)
Venetis, Ioannis E.; Saltogianni, Vasso; Stiros, Stathis; Gallopoulos, Efstratios
2017-04-01
Many geophysical problems are described by redundant, highly non-linear systems of ordinary equations with constant terms deriving from measurements and hence representing stochastic variables. Solution (inversion) of such problems relies on numerical optimization methods based on Monte Carlo sampling or on exhaustive searches in cases of two or even three "free" unknown variables. Recently the TOPological INVersion (TOPINV) algorithm, a grid-search-based technique in the R^n space, has been proposed. TOPINV is not based on the minimization of a certain cost function and involves only forward computations, hence avoiding computational errors. The basic concept is to transform observation equations into inequalities on the basis of an optimization parameter k and of their standard errors, and through repeated "scans" of n-dimensional search grids for decreasing values of k to identify the optimal clusters of gridpoints which satisfy the observation inequalities and by definition contain the "true" solution. Stochastic optimal solutions and their variance-covariance matrices are then computed as first and second statistical moments. Such exhaustive uniform searches produce an excessive computational load and are extremely time consuming on common CPU-based computers. An alternative is to use a computing platform based on a GPU, which nowadays is affordable to the research community and provides much higher computing performance. Using the CUDA programming language to implement TOPINV allows the investigation of the attained speedup in execution time on such a high performance platform. Based on synthetic data we compared the execution time required for two typical geophysical problems, modeling magma sources and seismic faults, described with up to 18 unknown variables, on both CPU/FORTRAN and GPU/CUDA platforms. The same problems, for several different sizes of search grids (up to 10^12 gridpoints) and numbers of unknown variables, were solved on both platforms, and execution time as a function of the grid dimension for each problem was recorded. Results indicate an average speedup in calculations by a factor of 100 on the GPU platform; for example, problems with 10^12 gridpoints require less than two hours instead of several days on conventional desktop computers. Such a speedup encourages the application of TOPINV on high performance platforms, such as a GPU, in cases where nearly real-time decisions are necessary, for example finite fault modeling to identify possible tsunami sources.
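A minimal NumPy sketch of a TOPINV-style search, assuming a toy linear forward model: observation equations are turned into inequalities |f(x) - obs| <= k·σ, a uniform grid is scanned for decreasing k, and the surviving cluster is summarised by its first and second moments. The grid extents, k schedule and forward model are illustrative assumptions; the actual implementation discussed above runs on a GPU in CUDA.

```python
import numpy as np

def topinv_like_search(forward, obs, sigma, grid_axes, k_values):
    """Grid-search inversion in the spirit of TOPINV: keep gridpoints whose predictions
    satisfy |forward(x) - obs_i| <= k * sigma_i for all i, for progressively smaller k,
    then summarise the surviving cluster by its mean and covariance."""
    mesh = np.meshgrid(*grid_axes, indexing="ij")
    points = np.stack([m.ravel() for m in mesh], axis=1)        # (n_points, n_unknowns)
    pred = np.array([forward(p) for p in points])               # (n_points, n_obs)
    misfit = np.abs(pred - obs) / sigma                         # normalised residuals
    best = None
    for k in sorted(k_values, reverse=True):
        keep = np.all(misfit <= k, axis=1)
        if not keep.any():
            break                                               # k too small, stop shrinking
        cluster = points[keep]
        cov = np.cov(cluster.T) if len(cluster) > 1 else np.zeros((points.shape[1],) * 2)
        best = (k, cluster.mean(axis=0), cov)                   # 1st and 2nd moments
    return best

# Toy example: two unknowns observed through a linear model with noisy data.
A = np.array([[1.0, 2.0], [3.0, -1.0], [0.5, 0.5]])
x_true = np.array([1.0, -2.0])
obs = A @ x_true + np.random.default_rng(1).normal(0, 0.05, 3)
result = topinv_like_search(lambda p: A @ p, obs, sigma=np.full(3, 0.05),
                            grid_axes=[np.linspace(-3, 3, 241)] * 2,
                            k_values=np.linspace(1.0, 4.0, 7))
```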
Zhong, Zhixiong; Zhu, Yanzheng; Ahn, Choon Ki
2018-07-01
In this paper, we address the problem of reachable set estimation for continuous-time Takagi-Sugeno (T-S) fuzzy systems subject to unknown output delays. Based on the reachable set concept, a new controller design method is also discussed for such systems. An effective method is developed to attenuate the negative impact of the unknown output delays, which can degrade the performance and stability of such systems. First, an augmented fuzzy observer is proposed to enable synchronous estimation of the system state and of the disturbance term owing to the unknown output delays, which ensures that the reachable set of the estimation error is limited via the intersection operation of ellipsoids. Then, a compensation technique is employed to eliminate the influence on the system performance stemming from the unknown output delays. Finally, the effectiveness and correctness of the obtained theory are verified by the tracking control of autonomous underwater vehicles. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
The long hold: Storing data at the National Archives
NASA Technical Reports Server (NTRS)
Thibodeau, Kenneth
1991-01-01
A description of the information collection and storage needs of the National Archives and Records Administration (NARA) is presented, and the unique situation of NARA is detailed. Two aspects that make the issue of obsolescence especially complex and costly are dealing with incoherent data and satisfying unknown and unknowable requirements. The data is incoherent because it comes from a wide range of independent sources, covers unrelated subjects, and is organized and encoded in ways that are not only uncontrolled but often unknown until received. NARA's mission to preserve and provide access to records with enduring value makes NARA, in effect, the agent of future generations. NARA's responsibility to the future places it in a perpetual quandary of devotion to serving needs which are unknown.
Schindler, B K; Bruns, S; Lach, G
2015-03-15
Mushrooms have repeatedly been shown to contain nicotine. Speculation about the source of contamination has been widespread; however, the source of the nicotine remains unknown. Previous studies indicate that putrescine, an intermediate in nicotine biosynthesis, can be formed in mushrooms and might be metabolised to form nicotine. Thus, endogenous formation may be a possible cause of elevated nicotine levels in mushrooms. We present evidence from the literature that may support this hypothesis. Copyright © 2014 Elsevier Ltd. All rights reserved.
Gossner, C; Danielson, N; Gervelmeyer, A; Berthe, F; Faye, B; Kaasik Aaslav, K; Adlhoch, C; Zeller, H; Penttinen, P; Coulombier, D
2016-02-01
Middle East respiratory syndrome coronavirus (MERS-CoV) cases without documented contact with another human MERS-CoV case make up 61% (517/853) of all reported cases. These primary cases are of particular interest for understanding the source(s) and route(s) of transmission and for designing long-term disease control measures. Dromedary camels are the only animal species for which there is convincing evidence that they are a host species for MERS-CoV and hence a potential source of human infections. However, only a small proportion of the primary cases have reported contact with camels. Other possible sources and vehicles of infection include food-borne transmission through consumption of unpasteurized camel milk and raw meat, medicinal use of camel urine, and zoonotic transmission from other species. There are critical knowledge gaps around this new disease which can only be closed through traditional field epidemiological investigations and studies designed to test hypotheses regarding sources of infection and risk factors for disease. Since the 1960s, there has been a radical change in dromedary camel farming practices in the Arabian Peninsula, with an intensification of production and a concentration of production around cities. It is possible that the recent intensification of camel herding in the Arabian Peninsula has increased the virus's reproductive number and attack rate in camel herds, while the 'urbanization' of camel herding has increased the frequency of zoonotic 'spillover' infections from camels to humans. It is reasonable to assume, although difficult to measure, that the sensitivity of public health surveillance to detect previously unknown diseases is lower in East Africa than in Saudi Arabia and that sporadic human cases may have gone undetected there. © 2014 The Authors. Zoonoses and Public Health Published by Blackwell Verlag GmbH.
Contaminant desorption during long-term leaching of hydroxide-weathered Hanford sediments.
Thompson, Aaron; Steefel, Carl I; Perdrial, Nicolas; Chorover, Ion
2010-03-15
Mineral sorption/coprecipitation is thought to be a principal sequestration mechanism for radioactive (90)Sr and (137)Cs in sediments impacted by hyperalkaline, high-level radioactive waste (HLRW) at the DOE's Hanford site. However, the long-term persistence of neo-formed, contaminant-bearing phases after removal of the HLRW source is unknown. We subjected pristine Hanford sediments to hyperalkaline Na-Al-NO(3)-OH solutions containing Sr, Cs, and I at 10(-5), 10(-5), and 10(-7) molal, respectively, for 182 days with either <10 ppmv or 385 ppmv pCO(2). This resulted in the formation of feldspathoid minerals. We then leached these weathered sediments with dilute, neutral-pH solutions. After 500 pore volumes (PVs), effluent Sr, Cs, NO(3), Al, Si, and pH reached a steady state with concentrations elevated above those of the feedwater. Reactive transport modeling suggests that even after 500 PV, Cs desorption can be explained by ion exchange reactions, whereas Sr desorption is best described by dissolution of Sr-substituted, neo-formed minerals. While pCO(2) had no effect on Sr or Cs sorption, sediments weathered at <10 ppmv pCO(2) did desorb more Sr (66% vs 28%) and Cs (13% vs 8%) during leaching than those weathered at 385 ppmv pCO(2). Thus, the dissolution of neo-formed aluminosilicates may represent a long-term, low-level supply of (90)Sr at the Hanford site.
NASA Technical Reports Server (NTRS)
Bowman, Elizabeth M.; Carpenter, Joyce; Roy, Robert J.; Van Keuren, Steve; Wilson, Mark E.
2015-01-01
Since 2007, the Oxygen Generation System (OGS) on board the International Space Station (ISS) has been producing oxygen for crew respiration via water electrolysis. As water is consumed in the OGS recirculating water loop, make-up water is furnished by the ISS potable water bus. A rise in Total Organic Carbon (TOC) was observed beginning in February 2011, which continues through the present date. The increasing TOC is of concern because the organic constituents responsible for it had not been identified; hence their impacts on the operation of the electrolytic cell stack components and on microorganism growth rates and types were unknown. Identification of the compounds responsible for the TOC increase, their sources, and estimates of their loadings in the OGA, as well as possible mitigation strategies, are presented.
Multichannel myopic deconvolution in underwater acoustic channels via low-rank recovery
Tian, Ning; Byun, Sung-Hoon; Sabra, Karim; Romberg, Justin
2017-01-01
This paper presents a technique for solving the multichannel blind deconvolution problem. The authors observe the convolution of a single (unknown) source with K different (unknown) channel responses; from these channel outputs, the authors want to estimate both the source and the channel responses. The authors show how this classical signal processing problem can be viewed as solving a system of bilinear equations, and in turn can be recast as recovering a rank-1 matrix from a set of linear observations. Results of prior studies in the area of low-rank matrix recovery have identified effective convex relaxations for problems of this type and efficient, scalable heuristic solvers that enable these techniques to work with thousands of unknown variables. The authors show how a priori information about the channels can be used to build a linear model for the channels, which in turn makes solving these systems of equations well-posed. This study demonstrates the robustness of this methodology to measurement noises and parametrization errors of the channel impulse responses with several stylized and shallow water acoustic channel simulations. The performance of this methodology is also verified experimentally using shipping noise recorded on short bottom-mounted vertical line arrays. PMID:28599565
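A compact illustration of the bilinear structure, assuming noise-free data and using a classical cross-relation formulation rather than the rank-1 lifting and convex relaxation used by the authors: because each output is the source convolved with a channel, y1 * h2 - y2 * h1 = 0 is linear in the stacked channel responses, which can then be recovered (up to scale) from the null space of a convolution matrix.

```python
import numpy as np
from scipy.linalg import toeplitz, svd
from scipy.signal import fftconvolve

def conv_matrix(y, L):
    """Full-convolution matrix so that conv_matrix(y, L) @ h == np.convolve(y, h)."""
    col = np.r_[y, np.zeros(L - 1)]
    row = np.r_[y[0], np.zeros(L - 1)]
    return toeplitz(col, row)

rng = np.random.default_rng(2)
L, N = 8, 200
s = rng.normal(size=N)                        # unknown source
h = [rng.normal(size=L), rng.normal(size=L)]  # unknown channel impulse responses
y = [fftconvolve(s, hk) for hk in h]          # channel outputs (noise-free for clarity)

# Cross-relation: y1 * h2 - y2 * h1 = 0 is linear in the stacked channels [h1; h2].
A = np.hstack([-conv_matrix(y[1], L), conv_matrix(y[0], L)])
_, _, Vt = svd(A)
h_est = Vt[-1]                                # null-space vector, known only up to scale/sign
h1_est, h2_est = h_est[:L], h_est[L:]
```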
Side-emitting fiber optic position sensor
Weiss, Jonathan D [Albuquerque, NM
2008-02-12
A side-emitting fiber optic position sensor and a method of determining an unknown position of an object by using the sensor. In one embodiment, a concentrated beam from a light source illuminates the side of a side-emitting fiber optic at an unknown axial position along the fiber's length. Some of this side-illuminated light is in-scattered into the fiber and captured. As the captured light is guided down the fiber, its intensity decreases due to loss from side-emission away from the fiber and from bulk absorption within the fiber. By measuring the intensity of light emitted from one (or both) ends of the fiber with a photodetector(s), the axial position of the light source is determined by comparing the photodetector's signal to a calibrated response curve, a look-up table, or a mathematical model. Alternatively, the side-emitting fiber is illuminated at one end, while a photodetector measures the intensity of light emitted from the side of the fiber at an unknown position. As the photodetector moves further away from the illuminated end, the detector's signal strength decreases due to loss from side-emission and/or bulk absorption. As before, the detector's signal is correlated to a unique position along the fiber.
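A small sketch of the single-ended read-out described above, assuming an exponential attenuation model: the detector signal versus axial position is calibrated first, and an unknown position is then recovered by inverting the calibrated response curve. The attenuation coefficient and signal scale are hypothetical values, not taken from the patented device.

```python
import numpy as np

# Hypothetical single-ended read-out: light in-scattered at axial position z is attenuated
# by side-emission and bulk absorption before it reaches the end-face detector.
ALPHA = 0.8          # effective attenuation coefficient [1/m] (assumed)
I0 = 1.0             # signal that would reach the detector from z = 0 (assumed)

def detector_signal(z):
    return I0 * np.exp(-ALPHA * z)

# Calibration: record the detector signal at known positions along the fiber.
z_cal = np.linspace(0.0, 2.0, 21)
s_cal = detector_signal(z_cal)

def position_from_signal(signal):
    """Invert an unknown measurement against the calibrated response curve."""
    # np.interp needs increasing x values, and the signal decreases with z.
    return np.interp(signal, s_cal[::-1], z_cal[::-1])

z_unknown = 1.37
print(position_from_signal(detector_signal(z_unknown)))   # ~1.37 m
```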
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Pecha, Petr; Šmídl, Václav
2016-11-01
A stepwise sequential assimilation algorithm is proposed based on an optimisation approach for recursive parameter estimation and tracking of radioactive plume propagation in the early stage of a radiation accident. Predictions of the radiological situation in each time step of the plume propagation are driven by an existing short-term meteorological forecast and the assimilation procedure manipulates the model parameters to match the observations incoming concurrently from the terrain. Mathematically, the task is a typical ill-posed inverse problem of estimating the parameters of the release. The proposed method is designated as a stepwise re-estimation of the source term release dynamics and an improvement of several input model parameters. It results in a more precise determination of the adversely affected areas in the terrain. The nonlinear least-squares regression methodology is applied for estimation of the unknowns. The fast and adequately accurate segmented Gaussian plume model (SGPM) is used in the first stage of direct (forward) modelling. The subsequent inverse procedure infers (re-estimates) the values of important model parameters from the actual observations. Accuracy and sensitivity of the proposed method for real-time forecasting of the accident propagation is studied. First, a twin experiment generating noiseless simulated "artificial" observations is studied to verify the minimisation algorithm. Second, the impact of the measurement noise on the re-estimated source release rate is examined. In addition, the presented method can be used as a proposal for more advanced statistical techniques using, e.g., importance sampling. Copyright © 2016 Elsevier Ltd. All rights reserved.
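A minimal sketch of the inverse step, assuming a strongly simplified Gaussian plume in place of the segmented Gaussian plume model (SGPM): the release rate is re-estimated by nonlinear least squares against receptor observations using SciPy. The plume-width growth law, receptor layout, wind speed, effective release height and noise level are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def gaussian_plume(q, u, x, y, z=0.0, h=50.0):
    """Ground-level concentration of a crude Gaussian plume with reflecting ground;
    a much-reduced stand-in for the SGPM used in the study."""
    sigma_y = 0.10 * x                      # assumed linear growth of plume width with distance
    sigma_z = 0.05 * x
    lateral = np.exp(-0.5 * (y / sigma_y) ** 2)
    vertical = np.exp(-0.5 * ((z - h) / sigma_z) ** 2) + np.exp(-0.5 * ((z + h) / sigma_z) ** 2)
    return q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Receptor coordinates (downwind x, crosswind y) and synthetic observations.
rng = np.random.default_rng(3)
x_rec = np.array([500.0, 1000.0, 1500.0, 2000.0])
y_rec = np.array([0.0, 100.0, -150.0, 50.0])
u = 3.0                                      # assumed wind speed [m/s]
q_true = 1.0e9                               # "true" release rate (arbitrary units/s)
obs = gaussian_plume(q_true, u, x_rec, y_rec)
obs *= 1.0 + rng.normal(0.0, 0.1, obs.size)  # multiplicative measurement noise

# Re-estimate the release rate (log-parameterised to keep it positive).
def residuals(log_q):
    model = gaussian_plume(np.exp(log_q[0]), u, x_rec, y_rec)
    return np.log(model) - np.log(obs)

fit = least_squares(residuals, x0=[np.log(1.0e8)])
q_est = np.exp(fit.x[0])                     # should recover roughly q_true
```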
Deriving Earth Science Data Analytics Requirements
NASA Technical Reports Server (NTRS)
Kempler, Steven J.
2015-01-01
Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
[Immortalization of erythroid progenitors for in vitro large-scale red cell production].
Caulier, A; Guyonneau Harmand, L; Garçon, L
2017-09-01
Population ageing and the increase in cancer incidence may lead to a decreased availability of red blood cell units. Thus, finding an alternative source of red blood cells is a highly relevant challenge. The possibility of reproducing human erythropoiesis in vitro opens a new era, particularly since improvements in culture systems allow erythrocytes to be produced from induced pluripotent stem cells (iPSCs) or CD34+ hematopoietic stem cells (HSCs). iPSCs have the advantage of in vitro self-renewal, but lead to poor amplification and maturation defects (high persistence of nucleated erythroid precursors). Erythroid differentiation from HSCs allows far better amplification and adult-like hemoglobin synthesis, but the inability of these progenitors to self-renew in vitro remains a limit to their use as a source of stem cells. A major improvement would consist in immortalizing these erythroid progenitors so that they could expand indefinitely. Inducible transgenesis is the first way to achieve this goal. To date, the best immortalized-cell models involve induction of strong oncogenes, such as c-Myc, Bcl-xL, and mostly the E6/E7 HPV16 viral oncoproteins. However, the quality of terminal differentiation of erythroid progenitors generated by these oncogenes is not yet optimal, and the long-term stability of such systems is unknown. Moreover, viral transgenesis and inducible expression of oncogenes raise important safety concerns, since the enucleation rate is not 100% and no nucleated cells with replicative capacity should be present in the final product. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Towards the development of high temperature comparison artifacts for radiation thermometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teixeira, R. N.; Machin, G.; Orlando, A.
This paper describes the methodology and first results of the development of high temperature fixed point artifacts of unknown temperature suitable for scale comparison purposes. This study is being undertaken at the Thermal Metrology Division of Inmetro, Brazil, as part of PhD studies. In this initial phase of the study two identical cobalt-carbon eutectic cells were constructed and one was doped with a known amount of copper. This was an attempt to achieve a controlled change in the transition temperature of the alloy during melting. Copper was chosen due to the relatively simple phase diagram it forms with carbon and cobalt. The cobalt, in powder form, was supplied by Alfa Aesar at 99.998 % purity, and was mixed with carbon powder (1.9 % by weight) of 99.9999 % purity. Complete filling of the crucible took 6 steps and was performed in a vertical furnace with graphite heating elements, in an inert gas atmosphere. The temperature measurements were performed using a KE LP3 radiation thermometer, which was previously evaluated for spectral responsivity, linearity and size-of-source effect (SSE). During these measurements, the thermometer stability was periodically checked using a silver fixed point blackbody maintained in a three-zone furnace. The main purpose of the first part of this study is to dope a series of Co-C blackbodies with differing amounts of copper, in order to alter their temperatures whilst still retaining good melting plateau performance. The long-term stability of the adjusted transition temperatures will also be investigated. Other dopants will be studied as the research progresses, and thermochemical modeling will be performed in an attempt to understand the change in temperature with dopant concentration and so help select suitable dopants in the future. The overall objective is to construct comparison artifacts that have good performance, in terms of plateau shape and long-term temperature stability, but with unknown temperatures. These can then be used as comparison artifacts with no participant, except the pilot, knowing the temperature a priori.
Accuracy-preserving source term quadrature for third-order edge-based discretization
NASA Astrophysics Data System (ADS)
Nishikawa, Hiroaki; Liu, Yi
2017-09-01
In this paper, we derive a family of source term quadrature formulas for preserving third-order accuracy of the node-centered edge-based discretization for conservation laws with source terms on arbitrary simplex grids. A three-parameter family of source term quadrature formulas is derived, and as a subset, a one-parameter family of economical formulas is identified that does not require second derivatives of the source term. Among the economical formulas, a unique formula is then derived that does not require gradients of the source term at neighbor nodes, thus leading to a significantly smaller discretization stencil for source terms. All the formulas derived in this paper do not require a boundary closure, and therefore can be directly applied at boundary nodes. Numerical results are presented to demonstrate third-order accuracy at interior and boundary nodes for one-dimensional grids and linear triangular/tetrahedral grids over straight and curved geometries.
The lasting memory enhancements of retrospective attention.
Reaves, Sarah; Strunk, Jonathan; Phillips, Shekinah; Verhaeghen, Paul; Duarte, Audrey
2016-07-01
Behavioral research has shown that spatial cues that orient attention toward task relevant items being maintained in visual short-term memory (VSTM) enhance item memory accuracy. However, it is unknown if these retrospective attentional cues ("retro-cues") enhance memory beyond typical short-term memory delays. It is also unknown whether retro-cues affect the spatial information associated with VSTM representations. Emerging evidence suggests that processes that affect short-term memory maintenance may also affect long-term memory (LTM) but little work has investigated the role of attention in LTM. In the current event-related potential (ERP) study, we investigated the duration of retrospective attention effects and the impact of retrospective attention manipulations on VSTM representations. Results revealed that retro-cueing improved both VSTM and LTM memory accuracy and that posterior maximal ERPs observed during VSTM maintenance predicted subsequent LTM performance. N2pc ERPs associated with attentional selection were attenuated by retro-cueing suggesting that retrospective attention may disrupt maintenance of spatial configural information in VSTM. Collectively, these findings suggest that retrospective attention can alter the structure of memory representations, which impacts memory performance beyond short-term memory delays. Copyright © 2016 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul L. Wichlacz
2003-09-01
This source-term summary document is intended to describe the current understanding of contaminant source terms and the conceptual model for potential source-term release to the environment at the Idaho National Engineering and Environmental Laboratory (INEEL), as presented in published INEEL reports. The document presents a generalized conceptual model of the sources of contamination and describes the general categories of source terms, primary waste forms, and factors that affect the release of contaminants from the waste form into the vadose zone and Snake River Plain Aquifer. Where the information has previously been published and is readily available, summaries of the inventory of contaminants are also included. Uncertainties that affect the estimation of the source term release are also discussed where they have been identified by the Source Term Technical Advisory Group. Areas in which additional information is needed (i.e., research needs) are also identified.
Garrison, Virginia H.; Majewski, Michael S.; Foreman, William T.; Genualdi, Susan A.; Mohammed, Azad; Massey Simonich, Stacy L.
2014-01-01
Anthropogenic semivolatile organic compounds (SOCs) that persist in the environment, bioaccumulate, are toxic at low concentrations, and undergo long-range atmospheric transport (LRT) were identified and quantified in the atmosphere of a Saharan dust source region (Mali) and during Saharan dust incursions at downwind sites in the eastern Caribbean (U.S. Virgin Islands, Trinidad and Tobago) and Cape Verde. More organochlorine and organophosphate pesticides (OCPPs), polycyclic aromatic hydrocarbons (PAHs), and polychlorinated biphenyl (PCB) congeners were detected in the Saharan dust region than at downwind sites. Seven of the 13 OCPPs detected occurred at all sites: chlordanes, chlorpyrifos, dacthal, dieldrin, endosulfans, hexachlorobenzene (HCB), and trifluralin. Total SOCs ranged from 1.9–126 ng/m3 (mean = 25 ± 34) at source and 0.05–0.71 ng/m3 (mean = 0.24 ± 0.18) at downwind sites during dust conditions. Most SOC concentrations were 1–3 orders of magnitude higher in source than downwind sites. A Saharan source was confirmed for sampled air masses at downwind sites based on dust particle elemental composition and rare earth ratios, atmospheric back trajectory models, and field observations. SOC concentrations were considerably below existing occupational and/or regulatory limits; however, few regulatory limits exist for these persistent organic compounds. Long-term effects of chronic exposure to low concentrations of SOCs are unknown, as are possible additive or synergistic effects of mixtures of SOCs, biologically active trace metals, and mineral dust particles transported together in Saharan dust air masses.
Emerging Disparities in Dietary Sodium Intake from Snacking in the US Population.
Dunford, Elizabeth K; Poti, Jennifer M; Popkin, Barry M
2017-06-17
The US population consumes dietary sodium well in excess of recommended levels. It is unknown how the contribution of snack foods to sodium intake has changed over time, and whether disparities exist within specific subgroups of the US population. To examine short- and long-term trends in the contribution of snack food sources to dietary sodium intake for US adults and children over a 37-year period from 1977 to 2014, we used data collected from eight nationally representative surveys of food intake in 50,052 US children aged 2-18 years and 73,179 adults aged 19+ years between 1977 and 2014. Overall patterns of snack food consumption, trends in sodium intake from snack food sources, and trends in food and beverage sources of sodium from snack foods across race-ethnic, age, gender, body mass index, household education and income groups were examined. In all socio-demographic subgroups there was a significant increase in both per capita sodium intake and the proportion of sodium intake derived from snacks from 1977-1978 to 2011-2014 (p < 0.01). Those with the lowest household education, Non-Hispanic Black race-ethnicity, and the lowest income had the largest increase in sodium intake from snacks. While in 1977-1978 Non-Hispanic Blacks had a lower sodium intake from snacks compared to Non-Hispanic Whites (p < 0.01), in 2011-2014 they had a significantly higher intake. Conclusions: Important disparities are emerging in dietary sodium intake from snack sources among Non-Hispanic Blacks. Our findings have implications for future policy interventions targeting specific US population subgroups.
Bellaïche, L; Laredo, J D; Lioté, F; Koeger, A C; Hamze, B; Ziza, J M; Pertuiset, E; Bardin, T; Tubiana, J M
1997-11-01
A prospective multicenter study to evaluate the use of magnetic resonance imaging in the differentiation between monoclonal gammopathies of unknown significance and multiple myeloma. Although multiple myeloma has been studied extensively with magnetic resonance imaging, to the authors' knowledge, no study has evaluated the clinical interest of magnetic resonance imaging in the differentiation between monoclonal gammopathies of unknown significance and multiple myeloma. The magnetic resonance examinations of the thoracolumbar spine in 24 patients with newly diagnosed monoclonal gammopathies of unknown significance were compared with those performed in 44 patients with newly diagnosed, untreated multiple myeloma. All findings on magnetic resonance examinations performed in patients with monoclonal gammopathies of unknown significance were normal, whereas findings on 38 (86%) of the 44 magnetic resonance examinations performed in patients with multiple myeloma were abnormal. Magnetic resonance imaging can be considered an additional diagnostic tool in differentiating between monoclonal gammopathies of unknown significance and multiple myeloma, which may be helpful when routine criteria are not sufficient. An abnormal finding on magnetic resonance examination in a patient with a monoclonal gammopathy of unknown significance should suggest the diagnosis of multiple myeloma after other causes of marrow signal abnormalities are excluded. Magnetic resonance imaging may also be proposed in the long-term follow-up of monoclonal gammopathies of unknown significance when a new biologic or clinical event suggests the diagnosis of malignant monoclonal gammopathy.
A Robust Deconvolution Method based on Transdimensional Hierarchical Bayesian Inference
NASA Astrophysics Data System (ADS)
Kolb, J.; Lekic, V.
2012-12-01
Analysis of P-S and S-P conversions allows us to map receiver-side crustal and lithospheric structure. This analysis often involves deconvolution of the parent wave field from the scattered wave field as a means of suppressing source-side complexity. A variety of deconvolution techniques exist, including damped spectral division, Wiener filtering, iterative time-domain deconvolution, and the multitaper method. All of these techniques require estimates of noise characteristics as input parameters. We present a deconvolution method based on transdimensional hierarchical Bayesian inference in which both noise magnitude and noise correlation are used as parameters in calculating the likelihood probability distribution. Because the noise for P-S and S-P conversion analysis in terms of receiver functions is a combination of both background noise - which is relatively easy to characterize - and signal-generated noise - which is much more difficult to quantify - we treat measurement errors as an unknown quantity, characterized by a probability density function whose mean and variance are model parameters. This transdimensional hierarchical Bayesian approach has been successfully used previously in the inversion of receiver functions in terms of shear and compressional wave speeds of an unknown number of layers [1]. In our method we use a Markov chain Monte Carlo (MCMC) algorithm to find the receiver function that best fits the data while accurately assessing the noise parameters. In order to parameterize the receiver function, we model it as an unknown number of Gaussians of unknown amplitude and width. The algorithm takes multiple steps before calculating the acceptance probability of a new model, in order to avoid getting trapped in local misfit minima. Using both observed and synthetic data, we show that the MCMC deconvolution method can accurately obtain a receiver function as well as an estimate of the noise parameters given the parent and daughter components. Furthermore, we demonstrate that this new approach is far less susceptible to generating spurious features even at high noise levels. Finally, the method yields not only the most-likely receiver function, but also quantifies its full uncertainty. [1] Bodin, T., M. Sambridge, H. Tkalčić, P. Arroucau, K. Gallagher, and N. Rawlinson (2012), Transdimensional inversion of receiver functions and surface wave dispersion, J. Geophys. Res., 117, B02301
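A reduced sketch of the hierarchical part of the scheme, assuming a fixed number of Gaussian pulses and uncorrelated noise: a Metropolis-Hastings sampler explores the pulse parameters together with the unknown noise level, so the data themselves constrain the measurement-error magnitude. The transdimensional birth/death moves and the noise-correlation parameter of the full method are omitted, and all proposal widths, priors and synthetic-data settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 10.0, 400)

def gaussians(params):
    """Sum of Gaussian pulses; params = [amplitude, centre, width] repeated per pulse."""
    out = np.zeros_like(t)
    for a, c, w in params.reshape(-1, 3):
        out += a * np.exp(-0.5 * ((t - c) / w) ** 2)
    return out

# Synthetic "receiver function" data with an unknown noise level.
true_params = np.array([1.0, 2.0, 0.2, -0.4, 5.5, 0.3])
data = gaussians(true_params) + rng.normal(0.0, 0.05, t.size)

def log_likelihood(params, log_sigma):
    sigma = np.exp(log_sigma)
    r = data - gaussians(params)
    return -0.5 * np.sum(r ** 2) / sigma ** 2 - t.size * log_sigma

# Metropolis-Hastings over the pulse parameters AND the noise level (hierarchical part).
params = np.array([0.8, 2.5, 0.3, -0.2, 5.0, 0.4])      # starting model (2 pulses, fixed here)
log_sigma = np.log(0.2)
ll = log_likelihood(params, log_sigma)
chain = []
for _ in range(20000):
    prop_params = params + rng.normal(0.0, 0.01, params.size)
    prop_log_sigma = log_sigma + rng.normal(0.0, 0.02)
    prop_ll = log_likelihood(prop_params, prop_log_sigma)
    if np.log(rng.uniform()) < prop_ll - ll:             # flat priors assumed
        params, log_sigma, ll = prop_params, prop_log_sigma, prop_ll
    chain.append(np.r_[params, log_sigma])
chain = np.array(chain)                                   # posterior samples (discard burn-in)
```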
The fish short term reproduction assay (FSTRA) is a key component of the USEPA endocrine disruptor screening program (EDSP). The FSTRA considers several mechanistic and apical responses in fathead minnows (Pimephales promelas) to determine whether an unknown chemical is likely to...
The fish short term reproduction assay (FSTRA) is a key component of the USEPA endocrine disruptor screening program (EDSP). The FSTRA considers several mechanistic and apical responses in fathead minnows (Pimephales promelas) to determine whether an unknown chemical is likely t...
USDA-ARS?s Scientific Manuscript database
The long-term impact of burn trauma on skeletal muscle bioenergetics remains unknown. Here, we determined respiratory capacity and function of skeletal muscle mitochondria in healthy individuals and in burn victims for up to two years post-injury. Biopsies were collected from the m. vastus lateralis...
A Matter of Comparative Music Education? Community Music in Germany
ERIC Educational Resources Information Center
Kertz-Welzel, Alexandra
2009-01-01
In German music education, the term "community music" is almost unknown. There could be various reasons for this fact such as a lack of community music activities in Germany, terminological problems concerning the German translation, or an appropriate explanation of the term "community music." This paper will discuss some of…
NASA Astrophysics Data System (ADS)
Efthimiou, George C.; Kovalets, Ivan V.; Venetsanos, Alexandros; Andronopoulos, Spyros; Argyropoulos, Christos D.; Kakosimos, Konstantinos
2017-12-01
An improved inverse modelling method to estimate the location and the emission rate of an unknown point stationary source of passive atmospheric pollutant in a complex urban geometry is incorporated in the Computational Fluid Dynamics code ADREA-HF and presented in this paper. The key improvement in relation to the previous version of the method lies in a two-step segregated approach. At first only the source coordinates are analysed using a correlation function of measured and calculated concentrations. In the second step the source rate is identified by minimizing a quadratic cost function. The validation of the new algorithm is performed by simulating the MUST wind tunnel experiment. A grid-independent flow field solution is firstly attained by applying successive refinements of the computational mesh and the final wind flow is validated against the measurements quantitatively and qualitatively. The old and new versions of the source term estimation method are tested on a coarse and a fine mesh. The new method appeared to be more robust, giving satisfactory estimations of source location and emission rate on both grids. The performance of the old version of the method varied between failure and success and appeared to be sensitive to the selection of the model error magnitude that needs to be inserted in its quadratic cost function. The performance of the method depends also on the number and the placement of sensors constituting the measurement network. Of significant interest for the practical application of the method in urban settings is the number of concentration sensors required to obtain a "satisfactory" determination of the source. The probability of obtaining a satisfactory solution - according to specified criteria - by the new method has been assessed as a function of the number of sensors that constitute the measurement network.
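A minimal sketch of the two-step segregated idea, assuming unit-rate concentration fields are available (e.g. precomputed with the flow model) for a set of candidate source locations: the location is chosen by maximising the correlation between measured and computed concentrations, after which the emission rate follows in closed form from the quadratic cost. The candidate fields, sensor count and noise level below are illustrative assumptions.

```python
import numpy as np

def locate_and_rate(candidate_fields, obs):
    """Two-step source estimation:
    1) pick the candidate source location whose unit-rate concentration field
       correlates best with the observations,
    2) estimate the emission rate for that location by minimising the quadratic
       misfit, which has the closed-form solution q = (c . obs) / (c . c).

    candidate_fields : dict {location: unit-rate concentrations at the sensors}
    obs              : measured concentrations at the same sensors
    """
    def correlation(c):
        c, o = c - c.mean(), obs - obs.mean()
        return (c @ o) / (np.linalg.norm(c) * np.linalg.norm(o) + 1e-30)

    best_loc = max(candidate_fields, key=lambda loc: correlation(candidate_fields[loc]))
    c = candidate_fields[best_loc]
    q = (c @ obs) / (c @ c)                   # least-squares emission rate
    return best_loc, q

# Toy example with three candidate locations and five sensors.
fields = {
    "A": np.array([1.0, 0.5, 0.2, 0.1, 0.05]),
    "B": np.array([0.1, 0.3, 1.0, 0.4, 0.10]),
    "C": np.array([0.0, 0.1, 0.2, 0.6, 1.00]),
}
observations = 2.5 * fields["B"] + np.random.default_rng(5).normal(0, 0.02, 5)
print(locate_and_rate(fields, observations))   # expected: ('B', ~2.5)
```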
Avian Hepatitis E Virus in Chickens, Taiwan, 2013
Hsu, Ingrid W.-Y.
2014-01-01
A previously unidentified strain of avian hepatitis E virus (aHEV) is now endemic among chickens in Taiwan. Analysis showed that the virus is 81.5%–86.5% similar to other aHEVs. In Taiwan, aHEV infection has been reported in chickens without aHEV exposure, suggesting transmission from asymptomatic cases or repeated introduction through an unknown common source(s). PMID:24378180
13. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
13. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1926. Vol. I, Narrative and Photographs, RG 75, Entry 655, Box 29, National Archives, Washington, DC.) Photographer unknown. PIMA LATERAL DROP NEAR KENILWORTH, 5/10/26 - San Carlos Irrigation Project, Pima Lateral, Main Canal at Sacaton Dam, Coolidge, Pinal County, AZ
N. S. Wagenbrenner; S. H. Chung; B. K. Lamb
2017-01-01
Wind erosion of soils burned by wildfire contributes substantial particulate matter (PM) in the form of dust to the atmosphere, but the magnitude of this dust source is largely unknown. It is important to accurately quantify dust emissions because they can impact human health, degrade visibility, exacerbate dust-on-snow issues (including snowmelt timing, snow chemistry...
Publications - GMC 253 | Alaska Division of Geological & Geophysical
Reference Unknown, 1995, Source rock geochemical and visual kerogen data from cuttings of the following Copper River basin oil and gas exploratory wells: Amoco Production Company Ahtna Inc
Tsai, Jason S-H; Hsu, Wen-Teng; Lin, Long-Guei; Guo, Shu-Mei; Tann, Joseph W
2014-01-01
A modified nonlinear autoregressive moving average with exogenous inputs (NARMAX) model-based state-space self-tuner with fault tolerance is proposed in this paper for the unknown nonlinear stochastic hybrid system with a direct transmission matrix from input to output. Through the off-line observer/Kalman filter identification method, one obtains a good initial guess of the modified NARMAX model to reduce the on-line system identification process time. Then, based on the modified NARMAX-based system identification, a corresponding adaptive digital control scheme is presented for the unknown continuous-time nonlinear system, with an input-output direct transmission term, which also has measurement and system noises and inaccessible system states. In addition, an effective state-space self-tuner with a fault-tolerance scheme is presented for the unknown multivariable stochastic system. A quantitative criterion is suggested by comparing the innovation process error estimated by the Kalman filter estimation algorithm, so that a weighting-matrix resetting technique, which adjusts and resets the covariance matrices of the parameter estimate obtained by the Kalman filter estimation algorithm, is utilized to achieve parameter estimation for faulty system recovery. Consequently, the proposed method can effectively cope with partially abrupt and/or gradual system faults and input failures through fault detection. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
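A much-simplified sketch of the covariance-resetting idea, assuming a generic linear-in-the-parameters regressor model in place of the modified NARMAX structure: recursive least squares tracks the parameters, the normalised innovation is monitored as a quantitative criterion, and the parameter covariance is re-inflated when it exceeds a threshold so the estimator can re-converge after a fault. The threshold, forgetting factor and reset magnitude are illustrative assumptions.

```python
import numpy as np

def rls_with_reset(phi_seq, y_seq, n_params, lam=0.99, reset_threshold=5.0):
    """Recursive least-squares parameter estimation with covariance resetting.

    phi_seq : sequence of regressor vectors (stand-ins for modified-NARMAX regressors)
    y_seq   : sequence of scalar outputs
    When the normalised innovation exceeds reset_threshold, the parameter covariance
    is re-inflated so the estimator can re-converge after a suspected fault.
    """
    theta = np.zeros(n_params)
    P = 1e3 * np.eye(n_params)
    for phi, y in zip(phi_seq, y_seq):
        phi = np.asarray(phi, dtype=float)
        innov = y - phi @ theta                      # innovation (prediction error)
        s = phi @ P @ phi + lam                      # rough innovation-variance proxy
        if abs(innov) / np.sqrt(s) > reset_threshold:
            P = 1e3 * np.eye(n_params)               # weighting-matrix reset on suspected fault
        k = P @ phi / s
        theta = theta + k * innov
        P = (P - np.outer(k, phi) @ P) / lam         # standard RLS covariance update
    return theta, P
```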
Efficient Bayesian experimental design for contaminant source identification
NASA Astrophysics Data System (ADS)
Zhang, Jiangjiang; Zeng, Lingzao; Chen, Cheng; Chen, Dingjiang; Wu, Laosheng
2015-01-01
In this study, an efficient full Bayesian approach is developed for optimal sampling well location design and source parameter identification of groundwater contaminants. An information measure, i.e., the relative entropy, is employed to quantify the information gain from concentration measurements in identifying unknown parameters. In this approach, the sampling locations that give the maximum expected relative entropy are selected as the optimal design. After the sampling locations are determined, a Bayesian approach based on Markov Chain Monte Carlo (MCMC) is used to estimate the unknown parameters. In both the design and estimation, the contaminant transport equation must be solved many times to evaluate the likelihood. To reduce the computational burden, an interpolation method based on an adaptive sparse grid is utilized to construct a surrogate for the contaminant transport equation. The approximated likelihood can be evaluated directly from the surrogate, which greatly accelerates the design and estimation process. The accuracy and efficiency of our approach are demonstrated through numerical case studies. It is shown that the methods can be used to assist in both single sampling location and monitoring network design for contaminant source identification in groundwater.
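A toy sketch of the design step, assuming a one-dimensional source-location problem and a simple exponential-decay forward model in place of the contaminant transport equation: the expected relative entropy (the expected KL divergence from prior to posterior, i.e. the expected information gain) of each candidate sensor location is estimated by nested Monte Carlo, and the location with the largest value is selected. The forward model, prior range, noise level and sample sizes are illustrative assumptions; reusing the prior samples for the inner evidence average is a common, slightly biased simplification.

```python
import numpy as np

rng = np.random.default_rng(6)

def forward(source_x, sensor_x):
    """Toy steady-state concentration at a sensor from a unit-rate point source."""
    return np.exp(-np.abs(sensor_x - source_x))

NOISE = 0.05

def log_lik(y, source_x, sensor_x):
    return -0.5 * ((y - forward(source_x, sensor_x)) / NOISE) ** 2 - np.log(NOISE * np.sqrt(2 * np.pi))

def expected_information_gain(sensor_x, n=2000):
    """Nested Monte Carlo estimate of the expected relative entropy delivered
    by a single measurement taken at sensor_x."""
    theta = rng.uniform(0.0, 10.0, n)                          # prior samples of source location
    y = forward(theta, sensor_x) + rng.normal(0.0, NOISE, n)   # one simulated datum per sample
    eig = 0.0
    for yi, ti in zip(y, theta):
        log_evidence = np.log(np.mean(np.exp(log_lik(yi, theta, sensor_x))))
        eig += log_lik(yi, ti, sensor_x) - log_evidence
    return eig / n

candidates = np.linspace(0.0, 10.0, 11)
scores = [expected_information_gain(d) for d in candidates]
best_sensor = candidates[int(np.argmax(scores))]               # optimal sampling location
```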
Methane Leak Detection and Emissions Quantification with UAVs
NASA Astrophysics Data System (ADS)
Barchyn, T.; Fox, T. A.; Hugenholtz, C.
2016-12-01
Robust leak detection and emissions quantification algorithms are required to accurately monitor greenhouse gas emissions. Unmanned aerial vehicles (UAVs, `drones') could both reduce the cost and increase the accuracy of monitoring programs. However, aspects of the platform create unique challenges. UAVs typically collect large volumes of data that are close to source (due to limited range) and often lower quality (due to weight restrictions on sensors). Here we discuss algorithm development for (i) finding sources of unknown position (`leak detection') and (ii) quantifying emissions from a source of known position. We use data from a simulated leak and field study in Alberta, Canada. First, we detail a method for localizing a leak of unknown spatial location using iterative fits against a forward Gaussian plume model. We explore sources of uncertainty, both inherent to the method and operational. Results suggest this method is primarily constrained by accurate wind direction data, distance downwind from source, and the non-Gaussian shape of close range plumes. Second, we examine sources of uncertainty in quantifying emissions with the mass balance method. Results suggest precision is constrained by flux plane interpolation errors and time offsets between spatially adjacent measurements. Drones can provide data closer to the ground than piloted aircraft, but large portions of the plume are still unquantified. Together, we find that despite larger volumes of data, working with close range plumes as measured with UAVs is inherently difficult. We describe future efforts to mitigate these challenges and work towards more robust benchmarking for application in industrial and regulatory settings.
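A minimal sketch of the first step (leak localization by iteratively fitting a forward Gaussian plume), assuming a ground-level source, a power-law dispersion coefficient, and scipy's least-squares solver; the dispersion parameters and noise model are placeholders rather than the authors' field configuration:

```python
import numpy as np
from scipy.optimize import least_squares

def gaussian_plume(q, x0, y0, xs, ys, u=3.0, a=0.22, b=0.9):
    """Ground-level concentration from a ground-level point source.

    q: emission rate, (x0, y0): source location, (xs, ys): receptor locations,
    u: wind speed along +x, sigma_y = sigma_z = a * downwind**b (placeholder).
    """
    dx = np.maximum(xs - x0, 1e-3)            # upwind receptors are masked below
    dy = ys - y0
    sig = a * dx**b
    c = q / (np.pi * u * sig**2) * np.exp(-0.5 * (dy / sig) ** 2)
    return np.where(xs > x0, c, 0.0)

rng = np.random.default_rng(1)
true_q, true_x0, true_y0 = 2.0, 10.0, 5.0

# Simulated UAV transects downwind of the (unknown) source.
xs = rng.uniform(20, 120, 200)
ys = rng.uniform(-30, 40, 200)
obs = gaussian_plume(true_q, true_x0, true_y0, xs, ys)
obs = obs * (1 + 0.1 * rng.normal(size=obs.size))   # multiplicative noise

def residuals(p):
    q, x0, y0 = p
    return gaussian_plume(q, x0, y0, xs, ys) - obs

fit = least_squares(residuals, x0=[1.0, 0.0, 0.0],
                    bounds=([0, -50, -50], [10, 50, 50]))
print("recovered (q, x0, y0):", np.round(fit.x, 2))
```

Consistent with the abstract, a fit of this kind degrades quickly if the assumed wind direction is biased or if receptors sit so close to the source that the plume is no longer Gaussian.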
Contaminant source identification using semi-supervised machine learning
NASA Astrophysics Data System (ADS)
Vesselinov, Velimir V.; Alexandrov, Boian S.; O'Malley, Daniel
2018-05-01
Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical types. Numerous different geochemical constituents and processes may need to be simulated in these models which further complicates the analyses. In this paper, we propose a new contaminant source identification approach that performs decomposition of the observation mixtures based on Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the unknown number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. NMFk is tested on synthetic and real-world site data. The NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios).
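The key ingredients of NMFk, non-negative factorization plus a stability-based choice of the number of sources, can be sketched with off-the-shelf tools. The example below substitutes scikit-learn's NMF and a silhouette score over random restarts for the paper's custom semi-supervised clustering, and uses synthetic mixtures, so it illustrates the workflow rather than reproducing the published algorithm:

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(2)

# Synthetic mixtures: 3 hidden source signatures over 10 constituents,
# observed at 40 wells with unknown non-negative mixing ratios.
true_W = rng.uniform(0, 1, size=(40, 3))        # mixing ratios
true_H = rng.uniform(0, 5, size=(3, 10))        # source signatures
X = true_W @ true_H + 0.01 * rng.uniform(size=(40, 10))

for k in range(2, 6):
    signatures, errors = [], []
    for seed in range(10):                      # repeated random restarts
        model = NMF(n_components=k, init="random", random_state=seed,
                    max_iter=2000)
        W = model.fit_transform(X)
        H = model.components_
        H = H / np.linalg.norm(H, axis=1, keepdims=True)   # comparable rows
        signatures.append(H)
        errors.append(model.reconstruction_err_)
    S = np.vstack(signatures)                   # (10 restarts * k) signatures
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(S)
    stability = silhouette_score(S, labels)     # high = reproducible sources
    print(f"k={k}: mean error={np.mean(errors):.3f}, stability={stability:.3f}")
# In the NMFk spirit, choose the largest k whose solutions remain stable.
```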
Contaminant source identification using semi-supervised machine learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vesselinov, Velimir Valentinov; Alexandrov, Boian S.; O’Malley, Dan
Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical types. Numerous different geochemical constituents and processes may need to be simulated in these models which further complicates the analyses. In this paper, we propose a new contaminant source identification approach that performs decomposition of the observation mixtures based on Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the unknown number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. NMFk is tested on synthetic and real-world site data. Finally, the NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios).
NASA Astrophysics Data System (ADS)
Charrier, J. G.; Richards-Henderson, N. K.; Bein, K. J.; McFall, A. S.; Wexler, A. S.; Anastasio, C.
2015-03-01
Recent epidemiological evidence supports the hypothesis that health effects from inhalation of ambient particulate matter (PM) are governed by more than just the mass of PM inhaled. Both specific chemical components and sources have been identified as important contributors to mortality and hospital admissions, even when these end points are unrelated to PM mass. Sources may cause adverse health effects via their ability to produce reactive oxygen species in the body, possibly due to the transition metal content of the PM. Our goal is to quantify the oxidative potential of ambient particle sources collected during two seasons in Fresno, CA, using the dithiothreitol (DTT) assay. We collected PM from different sources or source combinations into different ChemVol (CV) samplers in real time using a novel source-oriented sampling technique based on single-particle mass spectrometry. We segregated the particles from each source-oriented mixture into two size fractions - ultrafine (Dp ≤ 0.17 μm) and submicron fine (0.17 μm ≤ Dp ≤ 1.0 μm) - and measured metals and the rate of DTT loss in each PM extract. We find that the mass-normalized oxidative potential of different sources varies by up to a factor of 8 and that submicron fine PM typically has a larger mass-normalized oxidative potential than ultrafine PM from the same source. Vehicular emissions, regional source mix, commute hours, daytime mixed layer, and nighttime inversion sources exhibit the highest mass-normalized oxidative potential. When we apportion DTT activity for total PM sampled to specific chemical compounds, soluble copper accounts for roughly 50% of total air-volume-normalized oxidative potential, soluble manganese accounts for 20%, and other unknown species, likely including quinones and other organics, account for 30%. During nighttime, soluble copper and manganese largely explain the oxidative potential of PM, while daytime has a larger contribution from unknown (likely organic) species.
Odum, Jackson K.; Williams, Robert; Stephenson, William J.; Tuttle, Martitia P.; Al-Shukri, Hadar
2016-01-01
We collected new high‐resolution P‐wave seismic‐reflection data to explore for possible faults beneath a roughly linear cluster of early to mid‐Holocene earthquake‐induced sand blows to the south of Marianna, Arkansas. The Daytona Beach sand blow deposits are located in east‐central Arkansas about 75 km southwest of Memphis, Tennessee, and about 80 km south of the southwestern end of the New Madrid seismic zone (NMSZ). Previous studies of these sand blows indicate that they were produced between 10,500 and 5350 yr B.P. (before A.D. 1950). The sand blows are large and similar in size to those in the heart of the NMSZ produced by the 1811–1812 earthquakes. The seismic‐reflection profiles reveal a previously unknown zone of near‐vertical faults imaged in the 100–1100‐m depth range that are approximately coincident with a cluster of earthquake‐induced sand blows and a near‐linear surface lineament composed of air photo tonal anomalies. These interpreted faults are expressed as vertical discontinuities with the largest displacement fault showing about 40 m of west‐side‐up displacement at the top of the Paleozoic section at about 1100 m depth. There are about 20 m of folding on reflections within the Eocene strata at 400 m depth. Increasing fault displacement with depth suggests long‐term recurrent faulting. The imaged faults within the vicinity of the numerous sand blow features could be a causative earthquake source, although it does not rule out the possibility of other seismic sources nearby. These newly located faults add to a growing list of potentially active Pleistocene–Holocene faults discovered over the last two decades that are within the Mississippi embayment region but outside of the historical NMSZ.
2014-06-01
A high-throughput method has utility for evaluating a diversity of natural materials with unknown complex odor blends that can then be down-selected for further...
Integrating data types to enhance shoreline change assessments
NASA Astrophysics Data System (ADS)
Long, J.; Henderson, R.; Plant, N. G.; Nelson, P. R.
2016-12-01
Shorelines represent the variable boundary between terrestrial and marine environments. Assessment of geographic and temporal variability in shoreline position and related variability in shoreline change rates are an important part of studies and applications related to impacts from sea-level rise and storms. The results from these assessments are used to quantify future ecosystem services and coastal resilience and guide selection of appropriate coastal restoration and protection designs. But existing assessments typically fail to incorporate all available shoreline observations because they are derived from multiple data types and have different or unknown biases and uncertainties. Shoreline-change research and assessments often focus on either the long-term trajectory using sparse data over multiple decades or shorter-term evolution using data collected more frequently but over a shorter period of time. The combination of data collected with significantly different temporal resolution is not often considered. Also, differences in the definition of the shoreline metric itself can occur, whether using a single or multiple data source(s), due to variation in the signal being detected in the data (e.g. instantaneous land/water interface, swash zone, wrack line, or topographic contours). Previous studies have not explored whether more robust shoreline change assessments are possible if all available data are utilized and all uncertainties are considered. In this study, we test the hypothesis that incorporating all available shoreline data will lead to both improved historical assessments and enhance the predictive capability of shoreline-change forecasts. Using over 250 observations of shoreline position at Dauphin Island, Alabama over the last century, we compare shoreline-change rates derived from individual data sources (airborne lidar, satellite, aerial photographs) with an assessment using the combination of all available data. Biases or simple uncertainties in the shoreline metric from different data types and varying temporal/spatial resolution of the data are examined. As part of this test, we also demonstrate application of data assimilation techniques to predict shoreline position by accurately including the uncertainty in each type of data.
Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P
2016-10-01
An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for the operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a key safety issue to be addressed in future reactor safety assessments, and the estimates currently available are not yet satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms starting from a broad information-gathering exercise. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.
12. Interior view of cement and aggregate batch plant showing ...
12. Interior view of cement and aggregate batch plant showing storage bins. Photographer unknown, c. 1926. Source: Ralph Pleasant. - Waddell Dam, On Agua Fria River, 35 miles northwest of Phoenix, Phoenix, Maricopa County, AZ
NASA Astrophysics Data System (ADS)
O'Donoghue, Aileen A.; Haynes, Martha P.; Koopmann, Rebecca A.; Jones, Michael G.; Hallenbeck, Gregory L.; Giovanelli, Riccardo; Hoffman, Lyle; Craig, David W.; Undergraduate ALFALFA Team
2017-01-01
We have completed three “Harvesting ALFALFA” Arecibo observing programs in the direction of the Pisces-Perseus Supercluster (PPS) since ALFALFA observations were finished in 2012. The first was to perform follow-up observations on high signal-to-noise (S/N > 6.5) ALFALFA detections needing confirmation and low S/N sources lacking optical counterparts. A few more high S/N objects were observed in the second program along with targets visually selected from the Sloan Digital Sky Survey (SDSS). The third program included low S/N ALFALFA sources having optical counterparts with redshifts that were unknown or differed from the ALFALFA observations. It also included more galaxies selected from SDSS by eye and by Structured Query Language (SQL) searches with parameters intended to select galaxies at the distance of the PPS (~6,000 km/s). We used pointed basic Total-Power Position-Switched Observations in the 1340 - 1430 MHz ALFALFA frequency range. For sources of known redshift, we used the Wideband Arecibo Pulsar Processors (WAPPs), while for sources of unknown redshift we utilized a hybrid/dual bandwidth Doppler tracking mode using the Arecibo Interim 50-MHz Correlator with 9-level sampling. Results confirmed that a few high S/N ALFALFA sources are spurious, as expected from the work of Saintonge (2007), that low S/N ALFALFA sources lacking an optical counterpart are all likely to be spurious, and that low S/N sources with optical counterparts are generally reliable. Of the optically selected sources, about 80% were detected and tended to be near the distance of the PPS. This work has been supported by NSF grant AST-1211005.
Stereoscopic augmented reality with pseudo-realistic global illumination effects
NASA Astrophysics Data System (ADS)
de Sorbier, Francois; Saito, Hideo
2014-03-01
Recently, augmented reality has become very popular and has appeared in our daily life with gaming, guiding systems or mobile phone applications. However, inserting objects in such a way that their appearance seems natural is still an issue, especially in an unknown environment. This paper presents a framework that demonstrates the capabilities of Kinect for convincing augmented reality in an unknown environment. Rather than pre-computing a reconstruction of the scene as proposed by most previous methods, we propose a dynamic capture of the scene that adapts to live changes in the environment. Our approach, based on the update of an environment map, can also detect the position of the light sources. Combining information from the environment map, the light sources and the camera tracking, we can display virtual objects using stereoscopic devices with global illumination effects such as diffuse and mirror reflections, refractions and shadows in real time.
Fungal diversity in the Atacama Desert.
Santiago, Iara F; Gonçalves, Vívian N; Gómez-Silva, Benito; Galetovic, Alexandra; Rosa, Luiz H
2018-03-07
Fungi are generally easily dispersed, able to colonise a wide variety of substrata and can tolerate diverse environmental conditions. However, despite these abilities, the diversity of fungi in the Atacama Desert is practically unknown. Most of the resident fungi in desert regions are ubiquitous. Some of them, however, seem to display specific adaptations that enable them to survive under the variety of extreme conditions of these regions, such as high temperature, low availability of water, osmotic stress, desiccation, low availability of nutrients, and exposure to high levels of UV radiation. For these reasons, fungal communities living in the Atacama Desert represent an unknown part of global fungal diversity and, consequently, may be a source of new species that could be potential sources of new biotechnological products. In this review, we focus on the current knowledge of the diversity, ecology, adaptive strategies, and biotechnological potential of the fungi reported in the different ecosystems of the Atacama Desert.
Weak signal transmission in complex networks and its application in detecting connectivity.
Liang, Xiaoming; Liu, Zonghua; Li, Baowen
2009-10-01
We present a network model of coupled oscillators to study how a weak signal is transmitted in complex networks. Through both theoretical analysis and numerical simulations, we find that the response of other nodes to the weak signal decays exponentially with their topological distance to the signal source, and that the coupling strength between two neighboring nodes can be inferred from the responses. This finding can be conveniently used to detect the topology of an unknown network, such as the degree distribution, clustering coefficient and community structure, etc., by repeatedly choosing different nodes as the signal source. Through four typical networks, i.e., the regular one-dimensional, small-world, random, and scale-free networks, we show that the features of a network can be approximately obtained by investigating many fewer nodes than the network size; thus our approach to detecting the topology of an unknown network may be efficient in practical situations with large network sizes.
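A minimal numerical illustration of the decay-with-distance effect, using diffusively coupled overdamped linear nodes as a stand-in for the paper's oscillator model; the network, coupling strength, and drive amplitude are arbitrary choices for the sketch:

```python
import numpy as np
import networkx as nx

G = nx.watts_strogatz_graph(n=60, k=4, p=0.1, seed=3)   # small-world network
A = nx.to_numpy_array(G)
L = np.diag(A.sum(axis=1)) - A                           # graph Laplacian

eps, omega, source = 0.3, 1.0, 0                         # coupling, drive frequency, source node
dt, steps = 0.01, 40000
x = np.zeros(len(G))
amp = np.zeros_like(x)

# Overdamped linear nodes: dx/dt = -x - eps*L x + weak drive at the source.
for k in range(steps):
    t = k * dt
    drive = np.zeros_like(x)
    drive[source] = 0.05 * np.sin(omega * t)
    x = x + dt * (-x - eps * (L @ x) + drive)
    if k > steps // 2:                                   # record steady-state amplitude
        amp = np.maximum(amp, np.abs(x))

dist = nx.single_source_shortest_path_length(G, source)
for d in range(5):
    nodes = [n for n, dd in dist.items() if dd == d]
    print(f"distance {d}: mean response amplitude {np.mean(amp[nodes]):.2e}")
```

The printed amplitudes fall off roughly exponentially with topological distance, which is the signature the authors exploit to probe unknown connectivity.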
The fish short term reproduction assay (FSTRA) is a key component of the USEPA endocrine disruptor screening program (EDSP). The FSTRA considers several mechanistic and apical responses in fathead minnows (Pimephales promelas) to determine whether an unknown chemical is likely to...
ERIC Educational Resources Information Center
Harry, Melissa L.; MacDonald, Lynn; McLuckie, Althea; Battista, Christina; Mahoney, Ellen K.; Mahoney, Kevin J.
2017-01-01
Background: Our aim was to explore previously unknown long-term outcomes of self-directed personal care services for young adults with intellectual disabilities and limitations in activities of daily living. Materials and Methods: The present authors utilized participatory action research and qualitative content analysis in interviewing 11 unpaid…
12-Month Follow-Up of Fluoxetine and Cognitive Behavioral Therapy for Binge Eating Disorder
ERIC Educational Resources Information Center
Grilo, Carlos M.; Crosby, Ross D.; Wilson, G. Terence; Masheb, Robin M.
2012-01-01
Objective: The longer term efficacy of medication treatments for binge-eating disorder (BED) remains unknown. This study examined the longer term effects of fluoxetine and cognitive behavioral therapy (CBT) either with fluoxetine (CBT + fluoxetine) or with placebo (CBT + placebo) for BED through 12-month follow-up after completing treatments.…
Erica A. H. Smithwick; Daniel M. Kashian; Michael G. Ryan; Monica G. Turner
2009-01-01
Long-term, landscape patterns in inorganic nitrogen (N) availability and N stocks following infrequent, stand-replacing fire are unknown but are important for interpreting the effect of disturbances on ecosystem function. Here, we present results from a replicated chronosequence study in the Greater Yellowstone Ecosystem (Wyoming, USA) directed at measuring inorganic N...
9. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
9. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1928 Vol I. Irrigation District #4, California and Southern Arizona, RG 75, BIA-Phoenix, Box 40, National Archives, Pacific Southwest Region) Photographer unknown. CASA BLANCA CANAL, HEADING AND FLUME, APRIL 10, 1928 - San Carlos Irrigation Project, Casa Blanca Canal, Gila River, Coolidge, Pinal County, AZ
10. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
10. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1928. Vol I. Irrigation District #4, California and Southern Arizona, RG 75, BIA-Phoenix, Box 40, National Archives, Pacific Southwest Region) Photographer unknown. CASA BLANCA CANAL, HEADING AND FLUME, APRIL 10, 1928 - San Carlos Irrigation Project, Casa Blanca Canal, Gila River, Coolidge, Pinal County, AZ
15. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
15. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1927. Vol. I, Narrative and Photographs, District #4, RG 75, Entry 655, Box 29, National Archives, Washington, DC.) Photographer unknown. PIMA LATERAL, MCCLELLAN WASH CONDUIT, LOOKING SOUTH-WEST, 4/16/27 - San Carlos Irrigation Project, Pima Lateral, Main Canal at Sacaton Dam, Coolidge, Pinal County, AZ
MTR BASEMENT. DOORWAY TO SOURCE STORAGE VAULT IS AT CENTER ...
MTR BASEMENT. DOORWAY TO SOURCE STORAGE VAULT IS AT CENTER OF VIEW; TO DECONTAMINATION ROOM, AT RIGHT. PART OF MAZE ENTRY IS VISIBLE INSIDE VAULT DOORWAY. INL NEGATIVE NO. 7763. Unknown Photographer, photo was dated as 3/30/1953, but this was probably an error. The more likely date is 3/30/1952. - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID
NASA Astrophysics Data System (ADS)
Schwemin, Friedhelm
2010-12-01
The appendices of the first 54 volumes (1776-1829) of the Berliner Astronomisches Jahrbuch (BAJ), edited by Johann Elert Bode, contain a plethora of biographically relevant notes, which are listed here, alphabetically sorted, in short versions. In parts, the listing possesses the quality of a primary source, and contains information on 771 persons. Many of them are poorly known or unknown.
Fletcher, Eugene; Feizi, Amir; Bisschops, Markus M M; Hallström, Björn M; Khoomrung, Sakda; Siewers, Verena; Nielsen, Jens
2017-01-01
Tolerance of yeast to acid stress is important for many industrial processes including organic acid production. Therefore, elucidating the molecular basis of long-term adaptation to acidic environments will be beneficial for engineering production strains to thrive under such harsh conditions. Previous studies using gene expression analysis have suggested that both organic and inorganic acids elicit similar responses during short-term exposure to acidic conditions. However, the biological mechanisms that lead to long-term adaptation of yeast to acidic conditions remain unknown, and whether these mechanisms are similar for tolerance to both organic and inorganic acids is yet to be explored. We therefore evolved Saccharomyces cerevisiae to acquire tolerance to HCl (inorganic acid) and to 0.3 M L-lactic acid (organic acid) at pH 2.8 and then isolated several low-pH tolerant strains. Whole genome sequencing and RNA-seq analysis of the evolved strains revealed different sets of genome alterations suggesting a divergence in adaptation to these two acids. An altered sterol composition and impaired iron uptake contributed to HCl tolerance whereas the formation of a multicellular morphology and rapid lactate degradation were crucial for tolerance to high concentrations of lactic acid. Our findings highlight the contribution of both the selection pressure and the nature of the acid as drivers directing the evolutionary path towards tolerance to low pH. The choice of carbon source was also an important factor in the evolutionary process since cells evolved on two different carbon sources (raffinose and glucose) generated a different set of mutations in response to the presence of lactic acid. Therefore, different strategies are required for a rational design of low-pH tolerant strains depending on the acid of interest. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
Embolic Strokes of Unknown Source and Cryptogenic Stroke: Implications in Clinical Practice
Nouh, Amre; Hussain, Mohammed; Mehta, Tapan; Yaghi, Shadi
2016-01-01
Up to a third of strokes are rendered cryptogenic or of undetermined etiology. This number is especially high in younger patients. At times, inadequate diagnostic workups, multiple causes, or an under-recognized etiology contribute to this statistic. Embolic stroke of undetermined source, a new clinical entity, refers particularly to patients with embolic stroke for whom the etiology of embolism remains unidentified despite thorough investigations ruling out established cardiac and vascular sources. In this article, we review current classification and discuss important clinical considerations in these patients, highlighting cardiac arrhythmias and structural abnormalities, patent foramen ovale, paradoxical sources, and potentially under-recognized vascular, inflammatory, autoimmune, and hematologic sources in relation to clinical practice. PMID:27047443
Interactions Between Mineral Dust, Climate, and Ocean Ecosystems
NASA Technical Reports Server (NTRS)
Gasso, Santiago; Grassian, Vicki H.; Miller, Ron L.
2010-01-01
Over the past decade, technological improvements in the chemical and physical characterization of dust have provided insights into a number of phenomena that were previously unknown or poorly understood. In addition, models are now incorporating a wider range of physical processes, which will allow us to better quantify the climatic and ecological impacts of dust. For example, some models include the effect of dust on oceanic photosynthesis and thus on atmospheric CO2 (Friedlingstein et al. 2006). The impact of long-range dust transport, with its multiple forcings and feedbacks, is a relatively new and complex area of research, where input from several disciplines is needed. So far, many of these effects have only been parameterized in models in very simple terms. For example, the representation of dust sources remains a major uncertainty in dust modeling and estimates of the global mass of airborne dust. This is a problem where Earth scientists could make an important contribution, by working with climate scientists to determine the type of environments in which easily erodible soil particles might have accumulated over time. Geologists could also help to identify the predominant mineralogical composition of dust sources, which is crucial for calculating the radiative and chemical effects of dust but is currently known for only a few regions. Understanding how climate and geological processes control source extent and characterizing the mineral content of airborne dust are two of the fascinating challenges in future dust research.
Howarth, Richard J; Evans, Graham; Croudace, Ian W; Cundy, Andrew B
2005-03-20
The Ensenada de San Simon is the inner part of the Ria de Vigo, one of the major mesotidal rias of the Galician coast, NW Spain. The geochemistry of its bottom sediments can be accounted for in terms of both natural and anthropogenic sources. Mixture-modelling enables much of the Cr, Ni, V, Cu, Pb and Zn concentrations of the bottom and subaqueous sediments to be explained by sediment input from the river systems and faecal matter from manmade mussel rafts. The compositions and relative contributions of additional, unknown, sources of anomalous heavy-metal concentrations are quantified using constrained nonlinear optimization. The pattern of metal enrichment is attributed to: material carried in solution and suspension in marine water entering the Ensenada from the polluted industrial areas of the adjacent Ria de Vigo; wind-borne urban dusts and/or vehicular emissions from the surrounding network of roads and a motorway road-bridge over the Estrecho de Rande; industrial and agricultural pollution from the R. Redondela; and waste from a former ceramics factory near the mouth of the combined R. Oitaben and R. Verdugo. Using (137)Cs dating, it is suggested that heavy metal build-up in the sediments since the late 1970s followed development of inshore fisheries and introduction of the mussel rafts (ca. 1960) and increasing industrialisation.
A nudging data assimilation algorithm for the identification of groundwater pumping
NASA Astrophysics Data System (ADS)
Cheng, Wei-Chen; Kendall, Donald R.; Putti, Mario; Yeh, William W.-G.
2009-08-01
This study develops a nudging data assimilation algorithm for estimating unknown pumping from private wells in an aquifer system using measured data of hydraulic head. The proposed algorithm treats the unknown pumping as an additional sink term in the governing equation of groundwater flow and provides a consistent physical interpretation for pumping rate identification. The algorithm identifies the unknown pumping and, at the same time, reduces the forecast error in hydraulic heads. We apply the proposed algorithm to the Las Posas Groundwater Basin in southern California. We consider the following three pumping scenarios: constant pumping rates, spatially varying pumping rates, and temporally varying pumping rates. We also study the impact of head measurement errors on the proposed algorithm. In the case study we seek to estimate the six unknown pumping rates from private wells using head measurements from four observation wells. The results show an excellent rate of convergence for pumping estimation. The case study demonstrates the applicability, accuracy, and efficiency of the proposed data assimilation algorithm for the identification of unknown pumping in an aquifer system.
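The core of the nudging scheme, carrying the unknown pumping as an extra sink term and updating it from the head misfit, can be sketched on a single-cell water-balance model. The reservoir parameters, noise level, and relaxation coefficients below are assumptions for illustration, not values from the Las Posas application:

```python
import numpy as np

rng = np.random.default_rng(4)

S_area = 1.0e4          # storage coefficient * cell area [m^2]
recharge = 800.0        # known recharge [m^3/day]
q_true = 500.0          # true (unknown) pumping [m^3/day]
dt, n_days = 1.0, 365

# "Truth" run generating synthetic head observations.
h_true, obs = 50.0, []
for _ in range(n_days):
    h_true += dt * (recharge - q_true) / S_area
    obs.append(h_true + rng.normal(scale=0.002))    # noisy heads [m]

# Nudging run: the model head is relaxed toward the observations and the
# pumping estimate (an extra sink term) is updated from the head misfit.
h, q_est = 50.0, 0.0
gamma_h = 0.3            # head-nudging coefficient [1/day] (assumed)
gamma_q = 0.2 * S_area   # pumping-update gain (assumed)
for k in range(n_days):
    misfit = obs[k] - h                     # model above obs => misfit < 0
    h += dt * ((recharge - q_est) / S_area + gamma_h * misfit)
    q_est -= gamma_q * misfit * dt          # misfit < 0 => pumping estimate raised

print(f"estimated pumping: {q_est:.0f} m^3/day (true value {q_true:.0f})")
```

Nudging the head reduces the forecast error immediately, while the slower feedback on the sink term lets the pumping estimate converge toward the value that explains the persistent part of the misfit.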
A nudging data assimilation algorithm for the identification of groundwater pumping
NASA Astrophysics Data System (ADS)
Cheng, W.; Kendall, D. R.; Putti, M.; Yeh, W. W.
2008-12-01
This study develops a nudging data assimilation algorithm for estimating unknown pumping from private wells in an aquifer system using measured hydraulic head data. The proposed algorithm treats the unknown pumping as an additional sink term in the governing equation of groundwater flow and provides a consistent physical interpretation for pumping rate identification. The algorithm identifies unknown pumping and, at the same time, reduces the forecast error in hydraulic heads. We apply the proposed algorithm to the Las Posas Groundwater Basin in southern California. We consider the following three pumping scenarios: constant pumping rates, spatially varying pumping rates, and temporally varying pumping rates. We also study the impact of head measurement errors on the proposed algorithm. In the case study, we seek to estimate the six unknown pumping rates from private wells using head measurements from four observation wells. The results show an excellent rate of convergence for pumping estimation. The case study demonstrates the applicability, accuracy, and efficiency of the proposed data assimilation algorithm for the identification of unknown pumping in an aquifer system.
NASA Astrophysics Data System (ADS)
Boddice, Daniel; Metje, Nicole; Tuckwell, George
2017-11-01
Geophysical surveying is widely used for the location of subsurface features. Current technology is limited in terms of its resolution (and thus the size of features it can detect) and penetration depth, and a suitable technique is needed to bridge the gap between shallow near-surface investigation using techniques such as EM conductivity mapping and GPR, commonly used to map the upper 5 m below ground surface, and large features at greater depths (> 5 m below ground surface) detectable using conventional microgravity. This will minimise the risks posed by unknown buried features and ground conditions during civil engineering work. Quantum technology (QT) gravity sensors potentially offer a step-change in technology for locating features which lie outside the currently detectable range in terms of size and depth, but that potential is currently unknown as field instruments have not been developed. To overcome this, a novel computer simulation was developed for a large range of different targets of interest. The simulation included realistic modelling of the instrumental, environmental and location sources of noise which limit the accuracy of current microgravity measurements, in order to assess the potential capability of the new QT instruments in realistic situations and determine some of the likely limitations on their implementation. The results of the simulations for near-surface features showed that the new technology is best employed in a gradiometer configuration, as opposed to the traditional single-sensor gravimeter used by current instruments, owing to common-mode rejection between the sensors, which suppresses vibrational environmental noise. A significant improvement in detection capability of 1.5-2 times was observed, putting targets such as mineshafts into the detectable range, which would be a major advantage for subsurface surveying. Thus this research, for the first time, has clearly demonstrated the benefits of QT gravity gradiometer sensors, thereby increasing industry's confidence in this new technology.
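The detection argument can be made concrete with a simple buried-sphere forward model: the vertical gravity anomaly is compared with a hypothetical vibration noise floor, and the vertical gradient seen by two stacked sensors is compared with a hypothetical differential noise floor. All numbers below are illustrative assumptions, not values from the simulations described above:

```python
import numpy as np

G = 6.674e-11                       # gravitational constant [m^3 kg^-1 s^-2]

def gz_sphere(x, depth, radius, drho):
    """Vertical gravity (m/s^2) on the surface above a buried sphere."""
    mass = 4.0 / 3.0 * np.pi * radius**3 * drho
    return G * mass * depth / (x**2 + depth**2) ** 1.5

x = np.linspace(-30, 30, 121)              # survey line [m]
depth, radius, drho = 15.0, 3.0, -2000.0   # air-filled void in soil (assumed)

g_lower = gz_sphere(x, depth, radius, drho)          # lower sensor
g_upper = gz_sphere(x, depth + 1.0, radius, drho)    # sensor 1 m above
gradient = (g_lower - g_upper) / 1.0                 # vertical gradient [1/s^2]

# Vibration acts (nearly) equally on both sensors, so it cancels in the
# difference; the noise floors below are hypothetical, for illustration only.
vibration_noise = 5e-8              # common-mode acceleration noise [m/s^2]
gradiometer_noise = 5e-9            # residual differential noise [1/s^2]

print(f"peak |g_z| signal:      {np.max(np.abs(g_lower)):.2e} m/s^2 "
      f"(vs vibration noise {vibration_noise:.0e})")
print(f"peak |gradient| signal: {np.max(np.abs(gradient)):.2e} 1/s^2 "
      f"(vs gradiometer noise {gradiometer_noise:.0e})")
```

The point of the gradiometer configuration is visible in the comparison: the differenced measurement gives up some signal but discards the common-mode vibration noise that dominates a single sensor.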
NASA Astrophysics Data System (ADS)
Cohen, J. B.; Lan, R.; Lin, C.; Ng, D. H. L.; Lim, A.
2017-12-01
A multi-instrument inverse modeling approach is employed to identify and quantify large-scale global biomass-burning and urban aerosol emission profiles. The approach uses MISR, MODIS, OMI and MOPITT, with data from 2006 to 2016, to generate spatial and temporal loads, as well as some information about composition. The method is able to identify regions impacted by stable urban sources, changing urban sources, intense fires, and linear combinations thereof. Subsequent quantification yields a unified field, leading to a less biased profile that does not require arbitrary scaling to match long-term means. Additionally, the result reasonably reproduces inter- and intra-annual variation. Both meso-scale (WRF-CHEM) and global (MIT-AERO, multi-mode, multi-mixing state aerosol model) models of aerosol transport, chemistry, and physics are used to generate the resulting 4D aerosol fields. Comparisons with CALIOP, AERONET, and surface chemical and aerosol networks provide unbiased confirmation, while column and vertical loadings provide additional feedback. There are three significant results. First, there is a reduction in sources over existing urban areas in East Asia. Second, there is an increase in sources over new urban areas in South, South East, and East Asia. Third, there is an increase in fire sources in South and South East Asia. There are other initial findings relevant to the global tropics, which have not been as deeply investigated. The results improve the model match with both the mean and the variation, which is essential if we hope to understand seasonal extremes. The results also quantify impacts of both local and long-range sources. This is of extreme urgency, in particular in developing nations, where there are considerable contributions from long-range or otherwise unknown sources that impact hundreds of millions of people throughout Asia. It is hoped that the approach provided here can help us to make critical decisions about total sources, as well as point out the many missing scientific and analytical issues that still need to be addressed.
Jin, Virginia L; Schmer, Marty R; Stewart, Catherine E; Sindelar, Aaron J; Varvel, Gary E; Wienhold, Brian J
2017-07-01
Over the last 50 years, most of the increase in cultivated land area globally has been due to a doubling of irrigated land. Long-term agronomic management impacts on soil organic carbon (SOC) stocks, soil greenhouse gas (GHG) emissions, and global warming potential (GWP) in irrigated systems, however, remain relatively unknown. Here, residue and tillage management effects were quantified by measuring soil nitrous oxide (N2O) and methane (CH4) fluxes and SOC changes (ΔSOC) at a long-term, irrigated continuous corn (Zea mays L.) system in eastern Nebraska, United States. Management treatments began in 2002, and measured treatments included no or high stover removal (0 or 6.8 Mg DM ha-1 yr-1, respectively) under no-till (NT) or conventional disk tillage (CT) with full irrigation (n = 4). Soil N2O and CH4 fluxes were measured for five crop-years (2011-2015), and ΔSOC was determined on an equivalent mass basis to ~30 cm soil depth. Both area- and yield-scaled soil N2O emissions were greater with stover retention compared to removal and for CT compared to NT, with no interaction between stover and tillage practices. Methane comprised <1% of total emissions, with NT being CH4 neutral and CT a CH4 source. Surface SOC decreased with stover removal and with CT after 14 years of management. When ΔSOC, soil GHG emissions, and agronomic energy usage were used to calculate system GWP, all management systems were net GHG sources. Conservation practices (NT, stover retention) each decreased system GWP compared to conventional practices (CT, stover removal), but pairing conservation practices conferred no additional mitigation benefit. Although cropping system, management equipment/timing/history, soil type, location, weather, and the depth to which ΔSOC is measured affect the GWP outcomes of irrigated systems at large, this long-term irrigated study provides valuable empirical evidence of how management decisions can impact soil GHG emissions and surface SOC stocks. © 2017 John Wiley & Sons Ltd.
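The global-warming-potential bookkeeping referred to above combines soil N2O and CH4 fluxes, energy-related CO2, and the CO2-equivalent of the soil-carbon change. A worked example with hypothetical annual values (and 100-year GWP factors from IPCC AR4) shows the arithmetic:

```python
# Hypothetical annual budget for one treatment, in CO2-equivalents per hectare.
GWP_N2O, GWP_CH4 = 298.0, 25.0      # 100-yr GWP factors (IPCC AR4)

n2o_n = 2.5          # soil N2O emission [kg N2O-N ha^-1 yr^-1] (hypothetical)
ch4_c = -0.8         # soil CH4 flux [kg CH4-C ha^-1 yr^-1], negative = uptake
energy_co2 = 900.0   # fuel, fertilizer, irrigation energy [kg CO2 ha^-1 yr^-1]
delta_soc = 200.0    # soil organic C change [kg C ha^-1 yr^-1], gain = sink

n2o_co2e = n2o_n * (44.0 / 28.0) * GWP_N2O   # N2O-N -> N2O -> CO2e
ch4_co2e = ch4_c * (16.0 / 12.0) * GWP_CH4   # CH4-C -> CH4 -> CO2e
soc_co2e = -delta_soc * (44.0 / 12.0)        # carbon gain removes CO2

gwp = n2o_co2e + ch4_co2e + energy_co2 + soc_co2e
print(f"system GWP: {gwp:.0f} kg CO2e ha^-1 yr^-1 "
      f"({'net source' if gwp > 0 else 'net sink'})")
```

With these placeholder numbers the N2O term dominates, which is consistent with the abstract's finding that all management systems remained net GHG sources despite SOC gains under conservation practices.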
Microbial Monitoring of Surface Water in South Africa: An Overview
Luyt, Catherine D.; Tandlich, Roman; Muller, Wilhelmine J.; Wilhelmi, Brendan S.
2012-01-01
Infrastructural problems force South African households to supplement their drinking water consumption from water resources of inadequate microbial quality. Microbial water quality monitoring is currently based on the Colilert®18 system which leads to rapidly available results. Using Escherichia coli as the indicator microorganism limits the influence of environmental sources on the reported results. The current system allows for understanding of long-term trends of microbial surface water quality and the related public health risks. However, rates of false positive for the Colilert®18-derived concentrations have been reported to range from 7.4% to 36.4%. At the same time, rates of false negative results vary from 3.5% to 12.5%; and the Colilert medium has been reported to provide for cultivation of only 56.8% of relevant strains. Identification of unknown sources of faecal contamination is not currently feasible. Based on literature review, calibration of the antibiotic-resistance spectra of Escherichia coli or the bifidobacterial tracking ratio should be investigated locally for potential implementation into the existing monitoring system. The current system could be too costly to implement in certain areas of South Africa where the modified H2S strip test might be used as a surrogate for the Colilert®18. PMID:23066390
Thermodynamic Modelling of Phase Transformation in a Multi-Component System
NASA Astrophysics Data System (ADS)
Vala, J.
2007-09-01
Diffusion in multi-component alloys can be characterized by the vacancy mechanism for substitutional components, by the existence of sources and sinks for vacancies and by the motion of atoms of interstitial components. The description of diffusive and massive phase transformation of a multi-component system is based on the thermodynamic extremal principle by Onsager; the finite thickness of the interface between both phases is respected. The resulting system of partial differential equations of evolution with integral terms for unknown mole fractions (and additional variables in case of non-ideal sources and sinks for vacancies) can be analyzed using the method of lines and the finite difference technique (or, alternatively, the finite element one) together with semi-analytic and numerical integration formulae and with a certain iteration procedure, making use of the spectral properties of linear operators. The original software code for the numerical evaluation of solutions of such systems, written in MATLAB, offers a chance to simulate various real processes of diffusional phase transformation. Some results for the (nearly) steady-state real processes in substitutional alloys have already been published. The aim of this paper is to demonstrate that the same approach can handle both substitutional and interstitial components even in case of a general system of evolution.
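The numerical strategy mentioned above, method of lines with finite differences in space and an ODE integrator in time, is easy to illustrate on a single-component 1-D diffusion equation with a source term (a stand-in for the multi-component transformation model; diffusivity, source, and boundary conditions are placeholders):

```python
import numpy as np
from scipy.integrate import solve_ivp

# u_t = D * u_xx + s(x): finite differences in space, ODE solver in time.
nx, L, D = 101, 1.0, 1e-2
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
source = 0.5 * np.exp(-0.5 * ((x - 0.5 * L) / 0.05) ** 2)   # placeholder source

def rhs(t, u):
    du = np.zeros_like(u)
    du[1:-1] = D * (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2 + source[1:-1]
    return du          # u[0] and u[-1] stay fixed (Dirichlet boundaries)

u0 = np.zeros(nx)
sol = solve_ivp(rhs, (0.0, 5.0), u0, method="LSODA", t_eval=[0.0, 1.0, 5.0])
print("peak concentration at t=1 and t=5:",
      np.round(sol.y[:, 1].max(), 3), np.round(sol.y[:, 2].max(), 3))
```

Replacing the spatial operator by finite differences turns the PDE into a (possibly stiff) ODE system, which is why an implicit or automatically switching integrator such as LSODA is the natural choice.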
A Ratiometric Method for Johnson Noise Thermometry Using a Quantized Voltage Noise Source
NASA Astrophysics Data System (ADS)
Nam, S. W.; Benz, S. P.; Martinis, J. M.; Dresselhaus, P.; Tew, W. L.; White, D. R.
2003-09-01
Johnson Noise Thermometry (JNT) involves the measurement of the statistical variance of a fluctuating voltage across a resistor in thermal equilibrium. Modern digital techniques make it now possible to perform many functions required for JNT in highly efficient and predictable ways. We describe the operational characteristics of a prototype JNT system which uses digital signal processing for filtering, real-time spectral cross-correlation for noise power measurement, and a digitally synthesized Quantized Voltage Noise Source (QVNS) as an AC voltage reference. The QVNS emulates noise with a constant spectral density that is stable, programmable, and calculable in terms of known parameters using digital synthesis techniques. Changes in analog gain are accounted for by alternating the inputs between the Johnson noise sensor and the QVNS. The Johnson noise power at a known temperature is first balanced with a synthesized noise power from the QVNS. The process is then repeated by balancing the noise power from the same resistor at an unknown temperature. When the two noise power ratios are combined, a thermodynamic temperature is derived using the ratio of the two QVNS spectral densities. We present preliminary results where the ratio between the gallium triple point and the water triple point is used to demonstrate the accuracy of the measurement system with a standard uncertainty of 0.04 %.
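The ratiometric principle can be checked numerically: because the analog gain enters both the sensor and QVNS measurements within each balance, it cancels in the sensor/QVNS power ratios, and the unknown temperature follows from the ratio of ratios scaled by the calculable QVNS densities. The sketch below uses white-noise surrogates and hypothetical gain and QVNS settings; residual scatter in the estimate is purely statistical:

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(5)
k_B, R = 1.380649e-23, 100.0          # Boltzmann constant, sensor resistance [ohm]
T_ref, T_x = 302.9146, 273.16         # Ga triple point (known), unknown temperature
fs, n = 1.0e6, 2**20                  # sample rate, record length

def band_power(sigma, gain):
    """Mean one-sided PSD of a white-noise record seen through an analog gain."""
    v = gain * rng.normal(scale=sigma, size=n)
    f, psd = welch(v, fs=fs, nperseg=4096)
    return psd[1:-1].mean()

def rms_from_psd(S):
    """RMS voltage of white noise with one-sided density S over bandwidth fs/2."""
    return np.sqrt(S * fs / 2.0)

S_qvns_ref = 4 * k_B * T_ref * R      # QVNS programmed to match the reference
S_qvns_x   = 4 * k_B * 270.0 * R      # QVNS setting near the unknown (calculable)

gain = 1.7e3                          # unknown analog gain, drifting between balances
p_sensor_ref = band_power(rms_from_psd(4 * k_B * T_ref * R), gain)
p_qvns_ref   = band_power(rms_from_psd(S_qvns_ref), gain)
p_sensor_x   = band_power(rms_from_psd(4 * k_B * T_x * R), gain * 1.002)
p_qvns_x     = band_power(rms_from_psd(S_qvns_x), gain * 1.002)

# Gain cancels within each sensor/QVNS ratio; the QVNS densities are calculable.
T_est = (T_ref * (p_sensor_x / p_qvns_x) * (S_qvns_x / S_qvns_ref)
         / (p_sensor_ref / p_qvns_ref))
print(f"estimated temperature: {T_est:.2f} K (true {T_x:.2f} K)")
```

The same cancellation is what makes the real system insensitive to slow changes in analog gain between the reference and unknown measurements.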
NASA Astrophysics Data System (ADS)
Charrier, J. G.; Richards-Henderson, N. K.; Bein, K. J.; McFall, A. S.; Wexler, A. S.; Anastasio, C.
2014-09-01
Recent epidemiological evidence supports the hypothesis that health effects from inhalation of ambient particulate matter (PM) are governed by more than just the mass of PM inhaled. Both specific chemical components and sources have been identified as important contributors to mortality and hospital admissions, even when these endpoints are unrelated to PM mass. Sources may cause adverse health effects via their ability to produce reactive oxygen species, possibly due to the transition metal content of the PM. Our goal is to quantify the oxidative potential of ambient particle sources collected during two seasons in Fresno, CA, using the dithiothreitol (DTT) assay. We collected PM from different sources or source combinations into different ChemVol (CV) samplers in real time using a novel source-oriented sampling technique based on single particle mass spectrometry. We segregated the particles from each source-oriented mixture into two size fractions - ultrafine (Dp ≤ 0.17 μm) and submicron fine (0.17 μm ≤ Dp ≤ 1.0 μm) - and measured metals and the rate of DTT loss in each PM extract. We find that the mass-normalized oxidative potential of different sources varies by up to a factor of 8 and that submicron fine PM typically has a larger mass-normalized oxidative potential than ultrafine PM from the same source. Vehicular Emissions, Regional Source Mix, Commute Hours, Daytime Mixed Layer and Nighttime Inversion sources exhibit the highest mass-normalized oxidative potential. When we apportion the volume-normalized oxidative potential, which also accounts for the source's prevalence, cooking sources account for 18-29% of the total DTT loss while mobile (traffic) sources account for 16-28%. When we apportion DTT activity for total PM sampled to specific chemical compounds, soluble copper accounts for roughly 50% of total air-volume-normalized oxidative potential, soluble manganese accounts for 20%, and other unknown species, likely including quinones and other organics, account for 30%. During nighttime, soluble copper and manganese largely explain the oxidative potential of PM, while daytime has a larger contribution from unknown (likely organic) species.
26. Evening view of concrete mixing plant, concrete placement tower, ...
26. Evening view of concrete mixing plant, concrete placement tower, cableway tower, power line and derrick. Photographer unknown, 1927. Source: MWD. - Waddell Dam, On Agua Fria River, 35 miles northwest of Phoenix, Phoenix, Maricopa County, AZ
Benefits and Risks of Cochlear Implants
... The cochlear implant stimulates the nerves directly with electrical currents. Although this stimulation appears to be safe, the long term effect of these electrical currents on the nerves is unknown. May not ...
Evaluation of nitrous acid sources and sinks in urban outflow
NASA Astrophysics Data System (ADS)
Gall, Elliott T.; Griffin, Robert J.; Steiner, Allison L.; Dibb, Jack; Scheuer, Eric; Gong, Longwen; Rutter, Andrew P.; Cevik, Basak K.; Kim, Saewung; Lefer, Barry; Flynn, James
2016-02-01
Intensive air quality measurements made from June 22-25, 2011, in the outflow of the Dallas-Fort Worth (DFW) metropolitan area are used to evaluate nitrous acid (HONO) sources and sinks. A two-layer box model was developed to assess the ability of established and recently identified HONO sources and sinks to reproduce observations of HONO mixing ratios. A baseline model scenario includes sources and sinks established in the literature and is compared to scenarios including three recently identified sources: volatile organic compound-mediated conversion of nitric acid to HONO (S1), biotic emission from the ground (S2), and re-emission from a surface nitrite reservoir (S3). For all mechanisms, ranges of parametric values span lower- and upper-limit values. Model outcomes for 'likely' estimates of sources and sinks generally show under-prediction of HONO observations, implying the need to evaluate additional sources and variability in estimates of parameterizations, particularly during daylight hours. Monte Carlo simulation is applied to model scenarios constructed with sources S1-S3 added independently and in combination, generally showing improved model outcomes. Adding sources S2 and S3 (scenario S2/S3) appears to best replicate observed HONO, as determined by the model coefficient of determination and residual sum of squared errors (r² = 0.55 ± 0.03, SSE = 4.6 × 10⁶ ± 7.6 × 10⁵ ppt²). In scenario S2/S3, source S2 is shown to account for 25% and 6.7% of the nighttime and daytime budget, respectively, while source S3 accounts for 19% and 11% of the nighttime and daytime budget, respectively. However, despite improved model fit, there remains significant underestimation of daytime HONO; on average, a 0.15 ppt/s unknown daytime HONO source, or 67% of the total daytime source, is needed to bring scenario S2/S3 into agreement with observation. Estimates of 'best fit' parameterizations across lower- to upper-limit values result in a moderate reduction of the unknown daytime source, from 0.15 to 0.10 ppt/s.
77 FR 19740 - Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant Accident
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-02
... NUCLEAR REGULATORY COMMISSION [NRC-2010-0249] Water Sources for Long-Term Recirculation Cooling... Regulatory Guide (RG) 1.82, "Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant... regarding the sumps and suppression pools that provide water sources for emergency core cooling, containment...
Fan, Quan-Yong; Yang, Guang-Hong
2016-01-01
This paper is concerned with the problem of integral sliding-mode control for a class of nonlinear systems with input disturbances and unknown nonlinear terms through the adaptive actor-critic (AC) control method. The main objective is to design a sliding-mode control methodology based on the adaptive dynamic programming (ADP) method, so that the closed-loop system with time-varying disturbances is stable and the nearly optimal performance of the sliding-mode dynamics can be guaranteed. In the first step, a neural network (NN)-based observer and a disturbance observer are designed to approximate the unknown nonlinear terms and estimate the input disturbances, respectively. Based on the NN approximations and disturbance estimations, the discontinuous part of the sliding-mode control is constructed to eliminate the effect of the disturbances and attain the expected equivalent sliding-mode dynamics. Then, the ADP method with AC structure is presented to learn the optimal control for the sliding-mode dynamics online. Reconstructed tuning laws are developed to guarantee the stability of the sliding-mode dynamics and the convergence of the weights of critic and actor NNs. Finally, the simulation results are presented to illustrate the effectiveness of the proposed method.
Improved Ambient Pressure Pyroelectric Ion Source
NASA Technical Reports Server (NTRS)
Beegle, Luther W.; Kim, Hugh I.; Kanik, Isik; Ryu, Ernest K.; Beckett, Brett
2011-01-01
The detection of volatile vapors of unknown species in a complex field environment is required in many different applications. Mass spectroscopic techniques require subsystems including an ionization unit and a sample transport mechanism. All of these subsystems must have low mass, small volume, and low power, and must be rugged. A volatile molecular detector that meets these requirements, the ambient pressure pyroelectric ion source (APPIS), was recently reported by Caltech researchers for use in in situ environments.
12. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
12. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. PIMA LATERAL, LINING EQUIPMENT, 5/13/25 - San Carlos Irrigation Project, Pima Lateral, Main Canal at Sacaton Dam, Coolidge, Pinal County, AZ
14. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
14. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1927. Vol. I, Narrative and Photographs, District #4, RG 75, Entry 655, Box 29, National Archives, Washington, DC.) Photographer unknown. PIMA LATERAL, MCCLELLAN CONDUIT, ENTRANCE BEFORE POURING THE CONDUIT, 4/30/27 - San Carlos Irrigation Project, Pima Lateral, Main Canal at Sacaton Dam, Coolidge, Pinal County, AZ
21. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
21. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1926. Vol. I, Narrative and Photographs, RG 75, Entry 655, Box 29, National Archives, Washington, DC.) Photographer unknown. SACATON DAM, UPSTREAM SIDE FROM SOUTH END, 8/29/25 - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
20. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
20. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1926. Vol. I, Narrative and Photographs, RG 75, Entry 655, Box 29, National Archives, Washington, DC.) Photographer unknown. SACATON DAM, BRIDGE FROM SOUTH END, 8/29/25 - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
NASA Astrophysics Data System (ADS)
Zlotnik, A. A.
2017-04-01
The multidimensional quasi-gasdynamic system written in the form of mass, momentum, and total energy balance equations for a perfect polytropic gas with allowance for a body force and a heat source is considered. A new conservative symmetric spatial discretization of these equations on a nonuniform rectangular grid is constructed (with the basic unknown functions—density, velocity, and temperature—defined on a common grid and with fluxes and viscous stresses defined on staggered grids). Primary attention is given to the analysis of entropy behavior: the discretization is specially constructed so that the total entropy does not decrease. This is achieved via a substantial revision of the standard discretization and applying numerous original features. A simplification of the constructed discretization serves as a conservative discretization with nondecreasing total entropy for the simpler quasi-hydrodynamic system of equations. In the absence of regularizing terms, the results also hold for the Navier-Stokes equations of a viscous compressible heat-conducting gas.
Miyahara, Atsushi; Kawashima, Hisashi; Okubo, Yukari; Hoshika, Akinori
2011-06-01
To review baboon syndrome (BS). Data sources were obtained from PubMed and Google Scholar; photographs of baboon syndrome were obtained from our patient. PubMed and Google Scholar were searched up to June 30, 2010. The search terms were "baboon syndrome", "SDRIFE" and "thimerosal allergy". Reverse references from relevant articles and Google Scholar were also used. As BS is a classical disease and many reports of offending agents are relatively old, some references were more than five years old. In order to gather as many cases of offending agents as possible, more than 50 references were collected. We divided BS into 4 groups: classical baboon syndrome, topical drug-induced baboon syndrome, systemic drug-induced baboon syndrome, and symmetrical drug-related intertriginous and flexural exanthema (SDRIFE). The pathomechanism of BS is still unknown. A delayed type of hypersensitivity reaction, a recall phenomenon, pharmacologic interaction with immune receptors, and anatomical factors may be involved in the causation of BS.
Wang, Tianyun; Lu, Xinfei; Yu, Xiaofei; Xi, Zhendong; Chen, Weidong
2014-01-01
In recent years, various applications involving sparse continuous signal recovery, such as source localization, radar imaging, and communication channel estimation, have been addressed from the perspective of compressive sensing (CS) theory. However, two major defects need to be tackled in any practical utilization. The first is the off-grid problem caused by the basis mismatch between arbitrarily located unknowns and the pre-specified dictionary, which makes conventional CS reconstruction methods degrade considerably. The second is the urgent demand for low-complexity algorithms, especially when real-time implementation is required. In this paper, to deal with these two problems, we present three fast and accurate sparse reconstruction algorithms, termed HR-DCD, Hlog-DCD and Hlp-DCD, which are based on homotopy, dichotomous coordinate descent (DCD) iterations and non-convex regularizations, combined with the grid refinement technique. Experimental results are provided to demonstrate the effectiveness of the proposed algorithms and the related analysis. PMID:24675758
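As a point of reference only, the sketch below shows plain cyclic coordinate descent with soft-thresholding for the convex l1-regularized least-squares problem; it is a simplified stand-in, not the HR-DCD, Hlog-DCD or Hlp-DCD algorithms of the paper, which combine homotopy, dichotomous coordinate descent, non-convex penalties and grid refinement.

    import numpy as np

    def soft(v, t):
        # soft-thresholding operator used in l1 coordinate updates
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def cd_lasso(A, y, lam, iters=200):
        # Cyclic coordinate descent for min_x 0.5*||y - A x||^2 + lam*||x||_1
        m, n = A.shape
        x = np.zeros(n)
        r = y.copy()                      # residual r = y - A x
        col_sq = (A ** 2).sum(axis=0)
        for _ in range(iters):
            for j in range(n):
                if col_sq[j] == 0.0:
                    continue
                r += A[:, j] * x[j]       # put coordinate j back into the residual
                x[j] = soft(A[:, j] @ r, lam) / col_sq[j]
                r -= A[:, j] * x[j]
        return x

    # Toy usage: recover a sparse vector from a random overcomplete dictionary
    rng = np.random.default_rng(0)
    A = rng.normal(size=(64, 256))
    x_true = np.zeros(256); x_true[[10, 70, 200]] = [1.0, -2.0, 0.5]
    y = A @ x_true + 0.01 * rng.normal(size=64)
    x_hat = cd_lasso(A, y, lam=0.1)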
Olsen, Ingar; Potempa, Jan
2014-01-01
Gingipains are the major virulence factors of Porphyromonas gingivalis, the main periodontopathogen. It is expected that inhibition of gingipain activity in vivo could prevent or slow down the progression of adult periodontitis. To date, several classes of gingipain inhibitors have been recognized. These include gingipain N-terminal prodomains, synthetic compounds, inhibitors from natural sources, antibiotics, antiseptics, antibodies, and bacteria. Several synthetic compounds are potent gingipain inhibitors but inhibit a broad spectrum of host proteases and have undesirable side effects. Synthetic compounds with high specificity for gingipains have unknown toxicity effects, making natural inhibitors more promising as therapeutic gingipain blockers. Cranberry and rice extracts interfere with gingipain activity and prevent the growth and biofilm formation of periodontopathogens. Although the ideal gingipain inhibitor has yet to be discovered, gingipain inhibition represents a novel approach to treat and prevent periodontitis. Gingipain inhibitors may also help treat systemic disorders that are associated with periodontitis, including cardiovascular disease, rheumatoid arthritis, aspiration pneumonia, pre-term birth, and low birth weight. PMID:25206939
A RuBisCO-mediated carbon metabolic pathway in methanogenic archaea
Kono, Takunari; Mehrotra, Sandhya; Endo, Chikako; Kizu, Natsuko; Matusda, Mami; Kimura, Hiroyuki; Mizohata, Eiichi; Inoue, Tsuyoshi; Hasunuma, Tomohisa; Yokota, Akiho; Matsumura, Hiroyoshi; Ashida, Hiroki
2017-01-01
Two enzymes are considered to be unique to the photosynthetic Calvin–Benson cycle: ribulose-1,5-bisphosphate carboxylase/oxygenase (RuBisCO), responsible for CO2 fixation, and phosphoribulokinase (PRK). Some archaea possess bona fide RuBisCOs, despite not being photosynthetic organisms, but are thought to lack PRK. Here we demonstrate the existence in methanogenic archaea of a carbon metabolic pathway involving RuBisCO and PRK, which we term ‘reductive hexulose-phosphate' (RHP) pathway. These archaea possess both RuBisCO and a catalytically active PRK whose crystal structure resembles that of photosynthetic bacterial PRK. Capillary electrophoresis-mass spectrometric analysis of metabolites reveals that the RHP pathway, which differs from the Calvin–Benson cycle only in a few steps, is active in vivo. Our work highlights evolutionary and functional links between RuBisCO-mediated carbon metabolic pathways in methanogenic archaea and photosynthetic organisms. Whether the RHP pathway allows for autotrophy (that is, growth exclusively with CO2 as carbon source) remains unknown. PMID:28082747
Unexpected seasonality in quantity and composition of Amazon rainforest air reactivity
Nölscher, A. C.; Yañez-Serrano, A. M.; Wolff, S.; de Araujo, A. Carioca; Lavrič, J. V.; Kesselmeier, J.; Williams, J.
2016-01-01
The hydroxyl radical (OH) removes most atmospheric pollutants from air. The loss frequency of OH radicals due to the combined effect of all gas-phase OH reactive species is a measureable quantity termed total OH reactivity. Here we present total OH reactivity observations in pristine Amazon rainforest air, as a function of season, time-of-day and height (0–80 m). Total OH reactivity is low during wet (10 s−1) and high during dry season (62 s−1). Comparison to individually measured trace gases reveals strong variation in unaccounted for OH reactivity, from 5 to 15% missing in wet-season afternoons to mostly unknown (average 79%) during dry season. During dry-season afternoons isoprene, considered the dominant reagent with OH in rainforests, only accounts for ∼20% of the total OH reactivity. Vertical profiles of OH reactivity are shaped by biogenic emissions, photochemistry and turbulent mixing. The rainforest floor was identified as a significant but poorly characterized source of OH reactivity. PMID:26797390
On 'Organized Crime' in the illicit antiquities trade: moving beyond the definitional debate.
Dietzler, Jessica
The extent to which 'organized crime' is involved in illicit antiquities trafficking is unknown and frequently debated. This paper explores the significance and scale of the illicit antiquities trade as a unique transnational criminal phenomenon that is often said to be perpetrated by and exhibit traits of so-called 'organized crime.' The definitional debate behind the term 'organized crime' is considered as a potential problem impeding our understanding of its existence or extent in illicit antiquities trafficking, and a basic progression-based model is then suggested as a new tool to move beyond the definitional debate for future research that may help to elucidate the actors, processes and criminal dynamics taking place within the illicit antiquities trade from source to market. The paper concludes that researchers should focus not on the question of whether organized criminals, particularly in a traditionally conceived, mafia-type stereotypical sense, are involved in the illicit antiquities trade, but instead on the structure and progression of antiquities trafficking itself, which embodies both organized and criminal dynamics.
Information content of contact-pattern representations and predictability of epidemic outbreaks
Holme, Petter
2015-01-01
Understanding the contact patterns of a population (who is in contact with whom, and when the contacts happen) is crucial for modeling outbreaks of infectious disease. Traditional theoretical epidemiology assumes that any individual can meet any other with equal probability. A more modern approach, network epidemiology, assumes people are connected into a static network over which the disease spreads. Newer yet, temporal network epidemiology includes time in the contact representations. In this paper, we investigate the effect of these successive inclusions of more information. Using empirical proximity data, we study outbreak sizes both from unknown sources and from known states of ongoing outbreaks. In the first case, there are large differences going from a fully mixed simulation to a network, and from a network to a temporal network. In the second case, differences are smaller. We interpret these observations in terms of the temporal network structure of the data sets. For example, a fast overturn of nodes and links seems to make the temporal information more important. PMID:26403504
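To make the comparison concrete, here is a minimal toy simulation (hypothetical contact list, susceptible-infectious dynamics only) contrasting spread over time-ordered contacts with spread over the statically aggregated network; it illustrates the kind of information lost when time ordering is discarded, not the empirical data sets or the models of the paper.

    import random

    random.seed(1)
    # Hypothetical contact list: (time, node_a, node_b) triples among 50 individuals
    contacts = [(t, random.randrange(50), random.randrange(50)) for t in range(2000)]
    contacts = [(t, a, b) for t, a, b in contacts if a != b]
    beta = 0.2                                   # per-contact transmission probability

    def outbreak_temporal(contacts, source):
        # Respect the time ordering of contacts
        infected = {source}
        for t, a, b in sorted(contacts):
            if a in infected and random.random() < beta:
                infected.add(b)
            elif b in infected and random.random() < beta:
                infected.add(a)
        return len(infected)

    def outbreak_static(contacts, source, rounds=2000):
        # Aggregate contacts into a static edge list, discarding time
        edges = sorted({tuple(sorted((a, b))) for _, a, b in contacts})
        infected = {source}
        for _ in range(rounds):
            a, b = random.choice(edges)
            if a in infected and random.random() < beta:
                infected.add(b)
            elif b in infected and random.random() < beta:
                infected.add(a)
        return len(infected)

    print(outbreak_temporal(contacts, 0), outbreak_static(contacts, 0))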
Hip Squeaking after Ceramic-on-ceramic Total Hip Arthroplasty
Wu, Guo-Liang; Zhu, Wei; Zhao, Yan; Ma, Qi; Weng, Xi-Sheng
2016-01-01
Objective: The present study aimed to review the characteristics and influencing factors of squeaking after ceramic-on-ceramic (CoC) total hip arthroplasty (THA) and to analyze the possible mechanisms of the audible noise. Data Sources: The data analyzed in this review were based on articles from PubMed and Web of Science. Study Selection: The articles selected for review were original articles and reviews found based on the following search terms: “total hip arthroplasty”, “ceramic-on-ceramic”, “hip squeaking”, and “hip noise.” Results: The mechanism of the squeaking remains unknown. The possible explanations included stripe wear, edge loading, a third body, fracture of the ceramic liner, and resonance of the prosthesis components. Squeaking occurrence is influenced by patient, surgical, and implant factors. Conclusions: Most studies indicated that squeaking after CoC THA was the consequence of increasing wear or impingement, caused by prosthesis design, patient characteristics, or surgical factors. However, as conflicts exist among different articles, the major reasons for the squeaking remain to be identified. PMID:27453238
Survival without sequelae after prolonged cardiopulmonary resuscitation after electric shock.
Motawea, Mohamad; Al-Kenany, Al-Sayed; Hosny, Mostafa; Aglan, Omar; Samy, Mohamad; Al-Abd, Mohamed
2016-03-01
"Electrical shock is the physiological reaction or injury caused by electric current passing through the human body. It occurs upon contact of a human body part with any source of electricity that causes a sufficient current through the skin, muscles, or hair causing undesirable effects ranging from simple burns to death." Ventricular fibrillation is believed to be the most common cause of death after electrical shock. "The ideal duration of cardiac resuscitation is unknown. Typically prolonged cardiopulmonary resuscitation is associated with poor neurologic outcomes and reduced long term survival. No consensus statement has been made and traditionally efforts are usually terminated after 15-30 minutes." The case under discussion seems worthy of the somewhat detailed description given. It concerns a young man who survived 65 minutes of resuscitation after electrical shock (ES), involving prolonged high-quality cardiopulmonary resuscitation (CPR), multiple defibrillations, and artificial ventilation, without any sequelae. Early start of adequate chest compressions and close adherence to advanced cardiac life support protocols played a vital role in successful CPR.
ERIC Educational Resources Information Center
Shields-Johnson, Maria E.; Hernandez, John S.; Torno, Cody; Adams, Katherine M.; Wainwright, Marcy L.; Mozzachiodi, Riccardo
2013-01-01
In "Aplysia," repeated trials of aversive stimuli produce long-term sensitization (LTS) of defensive reflexes and suppression of feeding. Whereas the cellular underpinnings of LTS have been characterized, the mechanisms of feeding suppression remained unknown. Here, we report that LTS training induced a long-term decrease in the excitability of…
Basu, Millie Nguyen; Johnsen, Iben Birgit Gade; Wehberg, Sonja; Sørensen, Rikke Guldberg; Barington, Torben; Nørgård, Bente Mertz
2018-02-23
We examined the causes of death amongst full term stillbirths and early neonatal deaths. Our cohort includes women in the Region of Southern Denmark, who gave birth at full term to a stillborn infant or a neonate who died within the first 7 days from 2010 through 2014. Demographic, biometric and clinical variables were analyzed to assess the causes of death using two classification systems: causes of death and associated conditions (CODAC) and a Danish system based on initial causes of fetal death (INCODE). A total of 95 maternal-infant cases were included. Using the CODAC and INCODE classification systems, we found that the causes of death were unknown in 59/95 (62.1%). The second most common cause of death in CODAC was congenital anomalies in 10/95 (10.5%), similar to INCODE with fetal, genetic, structural and karyotypic anomalies in 11/95 (11.6%). The majority of the mothers were healthy, primiparous, non-smokers, aged 20-34 years and with a normal body mass index (BMI). Based on an unselected cohort from an entire region in Denmark, the cause of stillbirth and early neonatal deaths among full term infants remained unknown for the vast majority.
Constraints on genes shape long-term conservation of macro-synteny in metazoan genomes.
Lv, Jie; Havlak, Paul; Putnam, Nicholas H
2011-10-05
Many metazoan genomes conserve chromosome-scale gene linkage relationships ("macro-synteny") from the common ancestor of multicellular animal life [1-4], but the biological explanation for this conservation is still unknown. Double cut and join (DCJ) is a simple, well-studied model of neutral genome evolution amenable to both simulation and mathematical analysis [5], but as we show here, it is not sufficient to explain long-term macro-synteny conservation. We examine a family of simple (one-parameter) extensions of DCJ to identify models and choices of parameters consistent with the levels of macro- and micro-synteny conservation observed among animal genomes. Our software implements a flexible strategy for incorporating various types of genomic context into the DCJ model ("DCJ-[C]"), and is available as open source software from http://github.com/putnamlab/dcj-c. A simple model of genome evolution, in which DCJ moves are allowed only if they maintain chromosomal linkage among a set of constrained genes, can simultaneously account for the level of macro-synteny conservation and for correlated conservation among multiple pairs of species. Simulations under this model indicate that a constraint on approximately 7% of metazoan genes is sufficient to constrain genome rearrangement to an average rate of 25 inversions and 1.7 translocations per million years.
The Strategic Dialogue on Tobacco Harm Reduction: a vision and blueprint for action in the US
Zeller, Mitchell; Hatsukami, Dorothy
2016-01-01
The issues related to tobacco harm reduction continue to challenge the tobacco control research and policy communities. The potential for combusting tobacco products to reduce exposure and risk remains largely unknown, but this has not stopped manufacturers from offering such products making these claims. The role of oral tobacco products in a harm reduction regimen has also been a source of dialogue and debate. Within the last few years, major cigarette manufacturing companies have begun selling smokeless products for the first time, claiming to target current cigarette smokers. Other cigarette manufacturers are also offering smokeless products in markets around the world. The harm reduction debate has at times been divisive. There has been no unifying set of principles or goals articulated to guide tobacco control efforts. In particular, the research needs are extraordinarily high in order to drive evidence-based policy in this area and avoid the mistakes made with “light” cigarettes. This paper discusses recommendations from a strategic dialogue held with key, mostly US-based tobacco control researchers and policy makers to develop a strategic vision and blueprint for research, policy and communications to reduce the harm from tobacco for the US. Short-term and long-term objectives are described. PMID:19240228
Regularized Semiparametric Estimation for Ordinary Differential Equations
Li, Yun; Zhu, Ji; Wang, Naisyin
2015-01-01
Ordinary differential equations (ODEs) are widely used in modeling dynamic systems and have ample applications in the fields of physics, engineering, economics and biological sciences. The ODE parameters often possess physiological meanings and can help scientists gain better understanding of the system. One key interest is thus to well estimate these parameters. Ideally, constant parameters are preferred due to their easy interpretation. In reality, however, constant parameters can be too restrictive such that even after incorporating error terms, there could still be unknown sources of disturbance that lead to poor agreement between observed data and the estimated ODE system. In this paper, we address this issue and accommodate short-term interferences by allowing parameters to vary with time. We propose a new regularized estimation procedure on the time-varying parameters of an ODE system so that these parameters could change with time during transitions but remain constants within stable stages. We found, through simulation studies, that the proposed method performs well and tends to have less variation in comparison to the non-regularized approach. On the theoretical front, we derive finite-sample estimation error bounds for the proposed method. Applications of the proposed method to modeling the hare-lynx relationship and the measles incidence dynamic in Ontario, Canada lead to satisfactory and meaningful results. PMID:26392639
MoCha: Molecular Characterization of Unknown Pathways.
Lobo, Daniel; Hammelman, Jennifer; Levin, Michael
2016-04-01
Automated methods for the reverse-engineering of complex regulatory networks are paving the way for the inference of mechanistic comprehensive models directly from experimental data. These novel methods can infer not only the relations and parameters of the known molecules defined in their input datasets, but also unknown components and pathways identified as necessary by the automated algorithms. Identifying the molecular nature of these unknown components is a crucial step for making testable predictions and experimentally validating the models, yet no specific and efficient tools exist to aid in this process. To this end, we present here MoCha (Molecular Characterization), a tool optimized for the search of unknown proteins and their pathways from a given set of known interacting proteins. MoCha uses the comprehensive dataset of protein-protein interactions provided by the STRING database, which currently includes more than a billion interactions from over 2,000 organisms. MoCha is highly optimized, performing typical searches within seconds. We demonstrate the use of MoCha with the characterization of unknown components from reverse-engineered models from the literature. MoCha is useful for working on network models by hand or as a downstream step of a model inference engine workflow and represents a valuable and efficient tool for the characterization of unknown pathways using known data from thousands of organisms. MoCha and its source code are freely available online under the GPLv3 license.
A Study of Clinically Related Open Source Software Projects
Hogarth, Michael A.; Turner, Stuart
2005-01-01
Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized nor formally studied so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056
Pharmaceuticals and Hormones in the Environment
Some of the earliest initial reports from Europe and the United States demonstrated that a variety of pharmaceuticals and hormones could be found in surface waters, source waters, drinking water, and influents and effluents from wastewater treatment plants (WWTPs). It is unknown...
Distributed Adaptive Neural Control for Stochastic Nonlinear Multiagent Systems.
Wang, Fang; Chen, Bing; Lin, Chong; Li, Xuehua
2016-11-14
In this paper, a consensus tracking problem of nonlinear multiagent systems is investigated under a directed communication topology. All the followers are modeled by stochastic nonlinear systems in nonstrict feedback form, where nonlinearities and stochastic disturbance terms are totally unknown. Based on the structural characteristic of neural networks (in Lemma 4), a novel distributed adaptive neural control scheme is put forward. The proposed control method not only effectively handles unknown nonlinearities in nonstrict feedback systems, but also copes with the interactions among agents and coupling terms. Based on the stochastic Lyapunov functional method, it is shown that all the signals of the closed-loop system are bounded in probability and all followers' outputs converge to a neighborhood of the output of the leader. Finally, the efficiency of the control method is demonstrated by a numerical example.
Lipid Quality in Infant Nutrition: Current Knowledge and Future Opportunities
Delplanque, Bernadette; Gibson, Robert; Koletzko, Berthold; Lapillonne, Alexandre; Strandvik, Birgitta
2015-01-01
Abstract Dietary lipids are key for infants to not only meet their high energy needs but also fulfill numerous metabolic and physiological functions critical to their growth, development, and health. The lipid composition of breast milk varies during lactation and according to the mother's diet, whereas the lipid composition of infant formulae varies according to the blend of different fat sources. This report compares the compositions of lipids in breast milk and infant formulae, and highlights the roles of dietary lipids in term and preterm infants and their potential biological and health effects. The major differences between breast milk and formulae lie in a variety of saturated fatty acids (such as palmitic acid, including its structural position) and unsaturated fatty acids (including arachidonic acid and docosahexaenoic acid), cholesterol, and complex lipids. The functional outcomes of these differences during infancy and for later child and adult life are still largely unknown, and some of them are discussed, but there is consensus that opportunities exist for improvements in the qualitative lipid supply to infants through the mother's diet or infant formulae. Furthermore, research is required in several areas, including the needs of term and preterm infants for long-chain polyunsaturated fatty acids, the sites of action and clinical effects of lipid mediators on immunity and inflammation, the role of lipids on metabolic, neurological, and immunological outcomes, and the mechanisms by which lipids act on short- and long-term health. PMID:25883056
Using an epiphytic moss to identify previously unknown sources of atmospheric cadmium pollution.
Donovan, Geoffrey H; Jovan, Sarah E; Gatziolis, Demetrios; Burstyn, Igor; Michael, Yvonne L; Amacher, Michael C; Monleon, Vicente J
2016-07-15
Urban networks of air-quality monitors are often too widely spaced to identify sources of air pollutants, especially if they do not disperse far from emission sources. The objectives of this study were to test the use of moss bio-indicators to develop a fine-scale map of atmospherically-derived cadmium and to identify the sources of cadmium in a complex urban setting. We collected 346 samples of the moss Orthotrichum lyellii from deciduous trees in December 2013 using a modified randomized grid-based sampling strategy across Portland, Oregon. We estimated a spatial linear model of moss cadmium levels and predicted cadmium on a 50 m grid across the city. Cadmium levels in moss were positively correlated with proximity to two stained-glass manufacturers, proximity to the Oregon-Washington border, and percent industrial land in a 500 m buffer, and negatively correlated with percent residential land in a 500 m buffer. The maps showed very high concentrations of cadmium around the two stained-glass manufacturers, neither of which were known to environmental regulators as cadmium emitters. In addition, in response to our findings, the Oregon Department of Environmental Quality placed an instrumental monitor 120 m from the larger stained-glass manufacturer in October 2015. The monthly average atmospheric cadmium concentration was 29.4 ng/m3, which is 49 times higher than Oregon's benchmark of 0.6 ng/m3, and high enough to pose a health risk from even short-term exposure. Both stained-glass manufacturers voluntarily stopped using cadmium after the monitoring results were made public, and the monthly average cadmium levels precipitously dropped to 1.1 ng/m3 for stained-glass manufacturer #1 and 0.67 ng/m3 for stained-glass manufacturer #2. Published by Elsevier B.V.
Li, Shanshan; Flint, Alan; Pai, Jennifer K; Forman, John P; Hu, Frank B; Willett, Walter C; Rexrode, Kathryn M; Mukamal, Kenneth J; Rimm, Eric B
2014-09-22
The healthiest dietary pattern for myocardial infarction (MI) survivors is not known. Specific long-term benefits of a low-carbohydrate diet (LCD) are unknown, whether from animal or vegetable sources. There is a need to examine the associations between post-MI adherence to an LCD and all-cause and cardiovascular mortality. We included 2258 women from the Nurses' Health Study and 1840 men from the Health Professional Follow-Up Study who had survived a first MI during follow-up and provided a pre-MI and at least 1 post-MI food frequency questionnaire. Adherence to an LCD high in animal sources of protein and fat was associated with higher all-cause and cardiovascular mortality (hazard ratios of 1.33 [95% CI: 1.06 to 1.65] for all-cause mortality and 1.51 [95% CI: 1.09 to 2.07] for cardiovascular mortality comparing extreme quintiles). An increase in adherence to an animal-based LCD prospectively assessed from the pre- to post-MI period was associated with higher all-cause mortality and cardiovascular mortality (hazard ratios of 1.30 [95% CI: 1.03 to 1.65] for all-cause mortality and 1.53 [95% CI: 1.10 to 2.13] for cardiovascular mortality comparing extreme quintiles). An increase in adherence to a plant-based LCD was not associated with lower all-cause or cardiovascular mortality. Greater adherence to an LCD high in animal sources of fat and protein was associated with higher all-cause and cardiovascular mortality post-MI. We did not find a health benefit from greater adherence to an LCD overall after MI. © 2014 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
Ziska, Lewis H; Pettis, Jeffery S; Edwards, Joan; Hancock, Jillian E; Tomecek, Martha B; Clark, Andrew; Dukes, Jeffrey S; Loladze, Irakli; Polley, H Wayne
2016-04-13
At present, there is substantive evidence that the nutritional content of agriculturally important food crops will decrease in response to rising levels of atmospheric carbon dioxide, Ca. However, whether Ca-induced declines in nutritional quality are also occurring for pollinator food sources is unknown. Flowering late in the season, goldenrod (Solidago spp.) pollen is a widely available autumnal food source commonly acknowledged by apiarists to be essential to native bee (e.g. Bombus spp.) and honeybee (Apis mellifera) health and winter survival. Using floral collections obtained from the Smithsonian Natural History Museum, we quantified Ca-induced temporal changes in pollen protein concentration of Canada goldenrod (Solidago canadensis), the most widespread Solidago taxon, from hundreds of samples collected throughout the USA and southern Canada over the period 1842-2014 (i.e. a Ca from approx. 280 to 398 ppm). In addition, we conducted a 2-year in situ trial of S. canadensis populations grown along a continuous Ca gradient from approximately 280 to 500 ppm. The historical data indicated a strong significant correlation between recent increases in Ca and reductions in pollen protein concentration (r2 = 0.81). Experimental data confirmed this decrease in pollen protein concentration, and indicated that it would be ongoing as Ca continues to rise in the near term, i.e. to 500 ppm (r2 = 0.88). While additional data are needed to quantify the subsequent effects of reduced protein concentration for Canada goldenrod on bee health and population stability, these results are the first to indicate that increasing Ca can reduce the protein content of a floral pollen source widely used by North American bees. © 2016 The Author(s).
Emerging Disparities in Dietary Sodium Intake from Snacking in the US Population
Dunford, Elizabeth K.; Poti, Jennifer M.; Popkin, Barry M.
2017-01-01
Background: The US population consumes dietary sodium well in excess of recommended levels. It is unknown how the contribution of snack foods to sodium intake has changed over time, and whether disparities exist within specific subgroups of the US population. Objective: To examine short and long term trends in the contribution of snack food sources to dietary sodium intake for US adults and children over a 37-year period from 1977 to 2014. Methods: We used data collected from eight nationally representative surveys of food intake in 50,052 US children aged 2–18 years, and 73,179 adults aged 19+ years between 1977 and 2014. Overall, patterns of snack food consumption, trends in sodium intake from snack food sources and trends in food and beverage sources of sodium from snack foods across race-ethnic, age, gender, body mass index, household education and income groups were examined. Results: In all socio-demographic subgroups there was a significant increase in both per capita sodium intake, and the proportion of sodium intake derived from snacks from 1977–1978 to 2011–2014 (p < 0.01). Those with the lowest household education, Non-Hispanic Black race-ethnicity, and the lowest income had the largest increase in sodium intake from snacks. While in 1977–1978 Non-Hispanic Blacks had a lower sodium intake from snacks compared to Non-Hispanic Whites (p < 0.01), in 2011–2014 they had a significantly higher intake. Conclusions: Important disparities are emerging in dietary sodium intake from snack sources in Non-Hispanic Blacks. Our findings have implications for future policy interventions targeting specific US population subgroups. PMID:28629146
NASA Astrophysics Data System (ADS)
Marques, Joana; Corby, Patricia M.; Barber, Cheryl A.; Abrams, William R.; Malamud, Daniel
2015-05-01
The field of "salivary diagnostics" includes studies utilizing samples obtained from a variety of sources within the oral cavity. These samples include: whole unstimulated saliva, stimulated whole saliva, duct saliva collected directly from the parotid, submandibular/sublingual glands or minor salivary glands, swabs of the buccal mucosa, tongue or tonsils, and gingival crevicular fluid. Many publications state "we collected saliva from subjects" without fully describing the process or source of the oral fluid. Factors that need to be documented in any study include the time of day of the collection, the method used to stimulate and collect the fluid, and how much fluid is being collected and for how long. The handling of the oral fluid during and post-collection is also critical and may include addition of protease or nuclease inhibitors, centrifugation, and cold or frozen storage prior to assay. In an effort to create a standard protocol for determining a biomarker's origin, we carried out a pilot study collecting oral fluid from 5 different sites in the mouth and monitoring the concentrations of pro- and anti-inflammatory cytokines detected using Meso Scale Discovery (MSD) electrochemiluminescence assays. Our data suggested that 3 of the cytokines are primarily derived from the submandibular gland, while 7 of the cytokines come from a source other than the major salivary glands, such as the minor salivary glands or cells in the oral mucosae. Here we review the literature on monitoring biomarkers in oral samples and stress the need for determining the blood/saliva ratio when a quantitative determination is needed, and suggest that the term oral diagnostic be used if the source of an analyte in the oral cavity is unknown.
Serebruany, Victor L; Cherepanov, Vasily; Kim, Moo Hyun; Litvinov, Oleg; Cabrera-Fuentes, Hector A; Marciniak, Thomas A
The US Food and Drug Administration Adverse Event Reporting System (FAERS) is a global passive surveillance database that relies on voluntary reporting by health care professionals and consumers as well as required mandatory reporting by pharmaceutical manufacturers. However, the initial filers and comparative patterns for oral P2Y12 platelet inhibitor reporting are unknown. We assessed who generated original FAERS reports for clopidogrel, prasugrel, and ticagrelor in 2015. From the FAERS database we extracted and examined adverse event cases coreported with oral P2Y12 platelet inhibitors. All adverse event filing originating sources were dichotomized into consumers, lawyers, pharmacists, physicians, other health care professionals, and unknown. Overall, 2015 annual adverse events were more commonly coreported with clopidogrel (n = 13,234) with known source filers (n = 12,818, or 96.9%) than with prasugrel (2,896; 98.9% out of 2,927 cases) or ticagrelor (2,163, or 82.3%, out of 2,627 cases, respectively). Overall, most adverse events were filed by consumers (8,336, or 44.4%), followed by physicians (5,290, or 28.2%), other health care professionals (2,997, or 16.0%), pharmacists (1,125, or 6.0%), and finally by lawyers (129, or 0.7%). The origin of 811 (4.7%) initial reports remains unknown. The adverse event filing sources differ among drugs. While adverse events coreported with clopidogrel and prasugrel were commonly originated by patients (40.4 and 84.3%, respectively), most frequently ticagrelor reports (42.5%) were filed by physicians. The reporting quality and initial sources differ among oral P2Y12 platelet inhibitors in FAERS. The ticagrelor surveillance in 2015 was inadequate when compared to clopidogrel and prasugrel. Patients filed most adverse events for clopidogrel and prasugrel, while physicians originated most ticagrelor complaints. These differences justify stricter compliance control for ticagrelor manufacturers and may be attributed to the confusion of treating physicians with unexpected fatal, cardiac, and thrombotic adverse events linked to ticagrelor. © 2017 S. Karger AG, Basel.
NASA Astrophysics Data System (ADS)
Krklec, Kristina; Domínguez-Villar, David; Lojen, Sonja
2018-06-01
The stable isotope composition of precipitation records processes taking place within the hydrological cycle. Potentially, moisture sources are important controls on the stable isotope composition of precipitation, but studies focused on this topic are still scarce. We studied the moisture sources contributing to precipitation at Postojna (Slovenia) from 2009 to 2013. Back trajectory analyses were computed for the days with precipitation at Postojna. The moisture uptake locations were identified along these trajectories using standard hydrometeorological formulation. The moisture uptake locations were integrated in eight source regions to facilitate its comparison to the monthly oxygen isotope composition (δ18O values) of precipitation. Nearly half of the precipitation originated from continental sources (recycled moisture), and >40% was from central and western Mediterranean. Results show that moisture sources do not have a significant impact on the oxygen isotope composition at this site. We suggest that the large proportion of recycled moisture originated from transpiration rather than evaporation, which produced water vapour with less negative δ18O values. Thus the difference between the oceanic and local vapour source was reduced, which prevented the distinction of the moisture sources based on their oxygen isotope signature. Nevertheless, δ18O values of precipitation are partially controlled by climate parameters, which is of major importance for paleoclimate studies. We found that the main climate control on Postojna δ18O values of precipitation is the surface temperature. Amount effect was not recorded at this site, and the winter North Atlantic Oscillation (NAO) does not impact the δ18O values of precipitation. The Western Mediterranean Oscillation (WeMO) was correlated to oxygen stable isotope composition, although this atmospheric pattern was not a control. Instead we found that the link to δ18O values results from synoptic scenarios affecting WeMO index as well as temperature. Therefore, interpretation of δ18O values of precipitation in terms of climate is limited to surface temperature, although at least half of the variability observed still depends on unknown controls of the hydrological cycle.
Blind source separation by sparse decomposition
NASA Astrophysics Data System (ADS)
Zibulevsky, Michael; Pearlmutter, Barak A.
2000-04-01
The blind source separation problem is to extract the underlying source signals from a set of their linear mixtures, where the mixing matrix is unknown. This situation is common, eg in acoustics, radio, and medical signal processing. We exploit the property of the sources to have a sparse representation in a corresponding signal dictionary. Such a dictionary may consist of wavelets, wavelet packets, etc., or be obtained by learning from a given family of signals. Starting from the maximum a posteriori framework, which is applicable to the case of more sources than mixtures, we derive a few other categories of objective functions, which provide faster and more robust computations, when there are an equal number of sources and mixtures. Our experiments with artificial signals and with musical sounds demonstrate significantly better separation than other known techniques.
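As an illustration of the sparsity idea in its simplest form, the sketch below separates two artificial sparse sources from two mixtures by whitening and then searching for the rotation that minimizes the l1 norm of the outputs; this is a toy stand-in for the dictionary-based maximum a posteriori objectives developed in the paper, and all signals and the mixing matrix are made up.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000
    # Two artificial sparse sources: mostly zero with occasional spikes
    S = rng.laplace(size=(2, n)) * (rng.random((2, n)) < 0.1)
    A = np.array([[0.8, 0.6], [0.3, 0.9]])       # unknown mixing matrix
    X = A @ S                                     # observed mixtures

    # Whitening leaves only an unknown rotation to resolve
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X

    def unmix(theta):
        R = np.array([[np.cos(theta), np.sin(theta)],
                      [-np.sin(theta), np.cos(theta)]])
        return R @ Z

    # Pick the rotation whose outputs are sparsest (smallest l1 norm)
    angles = np.linspace(0.0, np.pi / 2, 361)
    best = min(angles, key=lambda t: np.abs(unmix(t)).sum())
    S_hat = unmix(best)                           # recovered up to permutation and scaling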
Radiological analysis of plutonium glass batches with natural/enriched boron
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rainisch, R.
2000-06-22
The disposition of surplus plutonium inventories by the US Department of Energy (DOE) includes the immobilization of certain plutonium materials in a borosilicate glass matrix, also referred to as vitrification. This paper addresses source terms of plutonium masses immobilized in a borosilicate glass matrix where the glass components include both natural boron and enriched boron. The calculated source terms pertain to neutron and gamma source strength (particles per second), and source spectrum changes. The calculated source terms corresponding to natural boron and enriched boron are compared to determine the benefits (decrease in radiation source terms) of the use of enriched boron. The analysis of plutonium glass source terms shows that a large component of the neutron source terms is due to (a, n) reactions. The Americium-241 and plutonium present in the glass emit alpha particles (a). These alpha particles interact with low-Z nuclides like B-11, B-10, and O-17 in the glass to produce neutrons. The low-Z nuclides are referred to as target particles. The reference glass contains 9.4 wt percent B2O3. Boron-11 was found to strongly support the (a, n) reactions in the glass matrix. B-11 has a natural abundance of over 80 percent. The (a, n) reaction rates for B-10 are lower than for B-11, and the analysis shows that the plutonium glass neutron source terms can be reduced by artificially enriching natural boron with B-10. The natural abundance of B-10 is 19.9 percent. Boron enriched to 96-wt percent B-10 or above can be obtained commercially. Since lower source terms imply lower dose rates to radiation workers handling the plutonium glass materials, it is important to know the achievable decrease in source terms as a result of boron enrichment. Plutonium materials are normally handled in glove boxes with shielded glass windows and the work entails both extremity and whole-body exposures. Lowering the source terms of the plutonium batches will make the handling of these materials less difficult and will reduce radiation exposure to operating workers.
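The structure of that estimate can be sketched as below. The alpha emission rate and the per-alpha neutron yields are placeholder numbers, not evaluated nuclear data or values from the report; only the bookkeeping is illustrated, namely that the (a, n) component of the neutron source term scales with the fraction-weighted yields of the target nuclides, so shifting the boron inventory from B-11 toward B-10 lowers it.

    # Illustrative bookkeeping only; all numerical values are hypothetical placeholders.
    alpha_rate = 1.0e9                 # alpha emissions per second from Pu/Am-241 in the glass (hypothetical)
    yield_per_alpha = {                # neutrons produced per alpha on each target nuclide (hypothetical)
        "B-11": 2.0e-5,
        "B-10": 5.0e-6,
        "O-17": 1.0e-7,
    }

    def neutron_source(atom_fractions):
        # (a, n) neutron source strength [n/s] ~ alpha rate x sum of fraction-weighted yields
        return alpha_rate * sum(f * yield_per_alpha[t] for t, f in atom_fractions.items())

    natural_boron  = {"B-11": 0.801, "B-10": 0.199, "O-17": 0.0004}
    enriched_boron = {"B-11": 0.040, "B-10": 0.960, "O-17": 0.0004}
    print(neutron_source(natural_boron), neutron_source(enriched_boron))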
Personnel Dose Assessment during Active Interrogation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Thomas Martin; Akkurt, Hatice; Patton, Bruce W
A leading candidate in the detection of special nuclear material (SNM) is active interrogation (AI). Unlike passive interrogation, AI uses a source to enhance or create a detectable signal from SNM (usually fission), particularly in shielded scenarios or scenarios where the SNM has a low activity. The use of AI thus makes the detection of SNM easier or, in some scenarios, even enables previously impossible detection. During the development of AI sources, significant effort is put into determining the source strength required to detect SNM in specific scenarios. Usually during this process, but not always, an evaluation of personnel dose is also completed. In this instance personnel dose could involve any of the following: (1) personnel performing the AI; (2) unknown stowaways who are inside the object being interrogated; or (3) in clandestine interrogations, personnel who are known to be inside the object being interrogated but are unaware of the interrogation. In most instances, dose to anyone found smuggling SNM will be a secondary issue. However, for the organizations performing the AI, legal if not moral considerations should make dose to the personnel performing the AI, unknown stowaways, or innocent bystanders in clandestine interrogations a serious concern.
Tuning into Scorpius X-1: adapting a continuous gravitational-wave search for a known binary system
NASA Astrophysics Data System (ADS)
Meadors, Grant David; Goetz, Evan; Riles, Keith
2016-05-01
We describe how the TwoSpect data analysis method for continuous gravitational waves (GWs) has been tuned for directed sources such as the low-mass X-ray binary (LMXB), Scorpius X-1 (Sco X-1). A comparison of five search algorithms generated simulations of the orbital and GW parameters of Sco X-1. Whereas that comparison focused on relative performance, here the simulations help quantify the sensitivity enhancement and parameter estimation abilities of this directed method, derived from an all-sky search for unknown sources, using doubly Fourier-transformed data. Sensitivity is shown to be enhanced when the source sky location and period are known, because we can run a fully templated search, bypassing the all-sky hierarchical stage using an incoherent harmonic sum. The GW strain and frequency, as well as the projected semi-major axis of the binary system, are recovered and uncertainty estimated, for simulated signals that are detected. Upper limits for GW strain are set for undetected signals. Applications to future GW observatory data are discussed. Robust against spin-wandering and computationally tractable despite an unknown frequency, this directed search is an important new tool for finding gravitational signals from LMXBs.
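The "doubly Fourier-transformed data" at the heart of this method can be illustrated with a toy signal whose carrier frequency wanders with a fixed period, loosely analogous to orbital Doppler modulation; all parameters below are illustrative and are not Sco X-1 or TwoSpect values.

    import numpy as np

    rng = np.random.default_rng(3)
    fs, T = 64.0, 4096.0
    t = np.arange(0.0, T, 1.0 / fs)
    f0, fmod, df = 10.0, 1.0 / 256.0, 0.2        # carrier, modulation frequency, frequency excursion (made up)
    x = np.sin(2 * np.pi * (f0 * t + (df / fmod) * np.sin(2 * np.pi * fmod * t)))
    x += rng.normal(0.0, 2.0, t.size)            # bury the signal in noise

    # First Fourier transform: power spectra of short consecutive segments
    seg = 512
    nseg = t.size // seg
    spec = np.abs(np.fft.rfft(x[: nseg * seg].reshape(nseg, seg), axis=1)) ** 2

    # Second Fourier transform: across segments, per frequency bin; power concentrates
    # at the modulation frequency, exposing the periodically wandering carrier
    second = np.abs(np.fft.rfft(spec - spec.mean(axis=0), axis=0)) ** 2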
Inverse modeling of April 2013 radioxenon detections
NASA Astrophysics Data System (ADS)
Hofman, Radek; Seibert, Petra; Philipp, Anne
2014-05-01
Significant concentrations of radioactive xenon isotopes (radioxenon) were detected by the International Monitoring System (IMS) for verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) in April 2013 in Japan. Particularly, three detections of Xe-133 made between 2013-04-07 18:00 UTC and 2013-04-09 06:00 UTC at the station JPX38 are quite notable with respect to the measurement history of the station. Our goal is to analyze the data and perform inverse modeling under different assumptions. This work is useful with respect to nuclear test monitoring as well as for the analysis of and response to nuclear emergencies. Two main scenarios will be pursued: (i) Source location is assumed to be known (DPRK test site). (ii) Source location is considered unknown. We attempt to estimate the source strength and the source strength along with its plausible location compatible with the data in scenarios (i) and (ii), respectively. We also consider the possibility of a vertically distributed source. Calculations of source-receptor sensitivity (SRS) fields and the subsequent inversion are aimed at going beyond routine calculations performed by the CTBTO. For SRS calculations, we employ the Lagrangian particle dispersion model FLEXPART with high-resolution ECMWF meteorological data (grid cell sizes of 0.5, 0.25 and ca. 0.125 deg). This is important in situations where receptors or sources are located in complex terrain, which is the case for the likely source of the detections, the DPRK test site. SRS will be calculated with convection enabled in FLEXPART, which will also increase model accuracy. In the variational inversion procedure, attention will be paid not only to all significant detections and their uncertainties but also to non-detections, which can have a large impact on inversion quality. We try to develop and implement an objective algorithm for inclusion of relevant data, where samples from the temporal and spatial vicinity of significant detections are added in an iterative manner and the inversion is recalculated in each iteration. This procedure should gradually narrow down the set of hypotheses on the source term, where the source term is here understood as an emission in both the spatial and temporal domains. Especially in scenario (ii), we expect a strong impact of non-detections on the reduction of possible solutions. For these and also other purposes, like statistical quantification of typical background values, measurements from all IMS noble gas stations north of 30 deg S for the period from January to June 2013 were extracted from the vDEC platform. We would like to acknowledge the Preparatory Commission for the CTBTO for kindly providing limited access to the IMS data. This work contains only opinions of the authors, which cannot in any case establish legal engagement of the Provisional Technical Secretariat of the CTBTO. This work is partially financed through the project "PREPARE: Innovative integrated tools and platforms for radiological emergency preparedness and post-accident response in Europe" (FP7, Grant 323287).
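For readers unfamiliar with the linear setting behind such inversions, the sketch below solves a small synthetic problem of the form y = Mx by non-negative, Tikhonov-regularized least squares; the matrix, release rates and noise level are hypothetical, and the variational treatment of detections, non-detections and their uncertainties described above is not reproduced.

    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)
    n_obs, n_src = 120, 24                          # hypothetical numbers of samples and release intervals
    M = rng.random((n_obs, n_src)) * 1e-12          # stand-in SRS matrix (a dispersion model provides the real one)
    x_true = np.zeros(n_src)
    x_true[10:13] = [2e12, 5e12, 1e12]              # hypothetical release rates
    y = M @ x_true
    y += rng.normal(0.0, 0.01 * y.mean(), n_obs)    # measurement noise

    # min ||y - M x||^2 + lam*||x||^2  subject to x >= 0, via stacking sqrt(lam)*I under M
    lam = 1e-26                                     # hypothetical; chosen in practice by, e.g., an L-curve
    A = np.vstack([M, np.sqrt(lam) * np.eye(n_src)])
    b = np.concatenate([y, np.zeros(n_src)])
    x_hat, _ = nnls(A, b)                           # estimated source term, constrained to be non-negative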
NASA Astrophysics Data System (ADS)
Perez, Pedro B.; Hamawi, John N.
2017-09-01
Nuclear power plant radiation protection design features are based on radionuclide source terms derived from conservative assumptions that envelop expected operating experience. Two parameters that significantly affect the radionuclide concentrations in the source term are the failed fuel fraction and the effective fission product appearance rate coefficients. The failed fuel fraction may be a regulatory-based assumption, such as in the U.S. Appearance rate coefficients are not specified in regulatory requirements, but have been referenced to experimental data that is over 50 years old. No doubt the source terms are conservative, as demonstrated by operating experience that has included failed fuel, but they may be too conservative, leading, for example, to over-designed shielding for normal operations. Design basis source term methodologies for normal operations had not advanced until EPRI published in 2015 an updated ANSI/ANS 18.1 source term basis document. Our paper revisits the fission product appearance rate coefficients as applied in the derivation of source terms following the original U.S. NRC NUREG-0017 methodology. New coefficients have been calculated based on recent EPRI results, which demonstrate the conservatism in nuclear power plant shielding design.
9. Excavation work at Pleasant Dam (now called Waddell Dam). ...
9. Excavation work at Pleasant Dam (now called Waddell Dam). Photographer unknown, July 22, 1926. Source: Maricopa County Municipal Water Conservation District Number One (MWD). - Waddell Dam, On Agua Fria River, 35 miles northwest of Phoenix, Phoenix, Maricopa County, AZ
9. Upstream view showing diversion flume at lower left and ...
9. Upstream view showing diversion flume at lower left and mixing plant at left center. Photographer unknown, June 9, 1924. Source: Salt River Project. - Mormon Flat Dam, On Salt River, Eastern Maricopa County, east of Phoenix, Phoenix, Maricopa County, AZ
27. Evening view of downstream face of Pleasant Dam under ...
27. Evening view of downstream face of Pleasant Dam under construction. Part of construction camp housing is visible in foreground. Photographer unknown, 1927. Source: MWD. - Waddell Dam, On Agua Fria River, 35 miles northwest of Phoenix, Phoenix, Maricopa County, AZ
Evidence for foliar endophytic nitrogen fixation in a widely distributed subalpine conifer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moyes, Andrew B.; Kueppers, Lara M.; Pett-Ridge, Jennifer
Coniferous forest nitrogen (N) budgets indicate unknown sources of N. A consistent association between limber pine (Pinus flexilis) and potential N2-fixing acetic acid bacteria (AAB) indicates that native foliar endophytes may supply subalpine forests with N.
10. Downstream face of Mormon Flat Dam under construction. Cement ...
10. Downstream face of Mormon Flat Dam under construction. Cement storage shed is at center right. Photographer unknown, September 1924. Source: Salt River Project. - Mormon Flat Dam, On Salt River, Eastern Maricopa County, east of Phoenix, Phoenix, Maricopa County, AZ
Development of Rapid Canine Fecal Source Identification PCR-based Assays
The extent to which dogs contribute to aquatic fecal contamination is unknown despite the potential for zoonotic transfer of harmful human pathogens. We used Genome Fragment Enrichment (GFE) to identify novel non-ribosomal microbial genetic markers potentially useful for detectin...
A new population of very high energy gamma-ray sources in the Milky Way.
Aharonian, F; Akhperjanian, A G; Aye, K-M; Bazer-Bachi, A R; Beilicke, M; Benbow, W; Berge, D; Berghaus, P; Bernlöhr, K; Boisson, C; Bolz, O; Borgmeier, C; Braun, I; Breitling, F; Brown, A M; Gordo, J Bussons; Chadwick, P M; Chounet, L-M; Cornils, R; Costamante, L; Degrange, B; Djannati-Ataï, A; Drury, L O'C; Dubus, G; Ergin, T; Espigat, P; Feinstein, F; Fleury, P; Fontaine, G; Funk, S; Gallant, Y A; Giebels, B; Gillessen, S; Goret, P; Hadjichristidis, C; Hauser, M; Heinzelmann, G; Henri, G; Hermann, G; Hinton, J A; Hofmann, W; Holleran, M; Horns, D; de Jager, O C; Jung, I; Khélifi, B; Komin, Nu; Konopelko, A; Latham, I J; Le Gallou, R; Lemière, A; Lemoine, M; Leroy, N; Lohse, T; Marcowith, A; Masterson, C; McComb, T J L; de Naurois, M; Nolan, S J; Noutsos, A; Orford, K J; Osborne, J L; Ouchrif, M; Panter, M; Pelletier, G; Pita, S; Pühlhofer, G; Punch, M; Raubenheimer, B C; Raue, M; Raux, J; Rayner, S M; Redondo, I; Reimer, A; Reimer, O; Ripken, J; Rob, L; Rolland, L; Rowell, G; Sahakian, V; Saugé, L; Schlenker, S; Schlickeiser, R; Schuster, C; Schwanke, U; Siewert, M; Sol, H; Steenkamp, R; Stegmann, C; Tavernet, J-P; Terrier, R; Théoret, C G; Tluczykont, M; van der Walt, D J; Vasileiadis, G; Venter, C; Vincent, P; Visser, B; Völk, H J; Wagner, S J
2005-03-25
Very high energy gamma-rays probe the long-standing mystery of the origin of cosmic rays. Produced in the interactions of accelerated particles in astrophysical objects, they can be used to image cosmic particle accelerators. A first sensitive survey of the inner part of the Milky Way with the High Energy Stereoscopic System (HESS) reveals a population of eight previously unknown firmly detected sources of very high energy gamma-rays. At least two have no known radio or x-ray counterpart and may be representative of a new class of "dark" nucleonic cosmic ray sources.
Convergence Rates for Multivariate Smoothing Spline Functions.
1982-10-01
Cox. Technical Summary Report #2437, October 1982. ABSTRACT: Given data z_i = g(t_i) + e_i, 1 ≤ i ≤ n, where g is the unknown function, the t_i are unknown ... d-dimensional variables in a domain Ω, and the e_i are i.i.d. random errors, the smoothing spline estimate g_n is defined to be the ...
Lu, Feng; Matsushita, Yasuyuki; Sato, Imari; Okabe, Takahiro; Sato, Yoichi
2015-10-01
We propose an uncalibrated photometric stereo method that works with general and unknown isotropic reflectances. Our method uses a pixel intensity profile, which is a sequence of radiance intensities recorded at a pixel under unknown varying directional illumination. We show that for general isotropic materials and uniformly distributed light directions, the geodesic distance between intensity profiles is linearly related to the angular difference of their corresponding surface normals, and that the intensity distribution of the intensity profile reveals reflectance properties. Based on these observations, we develop two methods for surface normal estimation: one for a general setting that uses only the recorded intensity profiles, the other for the case where a BRDF database is available while the exact BRDF of the target scene is still unknown. Quantitative and qualitative evaluations are conducted using both synthetic and real-world scenes, which show state-of-the-art accuracy of less than 10 degrees without reference data and 5 degrees with reference data for all 100 materials in the MERL database.
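A toy numerical check of the profile-distance observation, under strong simplifying assumptions (a Lambertian reflectance rather than a general isotropic BRDF, and plain Euclidean distance between normalized profiles rather than the geodesic distance used in the paper):

    import numpy as np

    rng = np.random.default_rng(1)
    L = rng.normal(size=(200, 3))
    L /= np.linalg.norm(L, axis=1, keepdims=True)        # many unknown, roughly uniform light directions
    N = rng.normal(size=(60, 3))
    N[:, 2] = np.abs(N[:, 2])                            # surface normals in the upper hemisphere
    N /= np.linalg.norm(N, axis=1, keepdims=True)

    I = np.clip(N @ L.T, 0.0, None)                      # Lambertian pixel intensity profiles, one row per pixel
    P = I / np.linalg.norm(I, axis=1, keepdims=True)     # normalization removes albedo

    dists, angles = [], []
    for i in range(len(N)):
        for j in range(i + 1, len(N)):
            dists.append(np.linalg.norm(P[i] - P[j]))
            angles.append(np.degrees(np.arccos(np.clip(N[i] @ N[j], -1.0, 1.0))))
    print(np.corrcoef(dists, angles)[0, 1])              # strongly positive: profile distance tracks normal angle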
8. Photographic copy of photograph. (Source: Department of Interior. Bureau ...
8. Photographic copy of photograph. (Source: Department of Interior. Bureau of Reclamation. Bitterroot Project History 1931-1962. National Archives, Denver, RG 115, Accession #115-90-039, Box 243) Photographer unknown. View of original rock-fill crib diversion structure, September 13, 1949. Diversion and head works for big ditch on Rock Creek. - Bitter Root Irrigation Project, Rock Creek Diversion Dam, One mile east of Como Dam, west of U.S. Highway 93, Darby, Ravalli County, MT
10. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
10. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1919. Vol. I, RG 75, Entry 655, Box 25, National Archives, Washington, D.C.) Photographer unknown. SACATON DAM SITE LOOKING SOUTH SHOWING HEADWORKS OF SAN TAN FLOOD-WATER CANAL - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
23. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
23. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1926. Vol. I, Narrative and Photographs, RG 75, Entry 655, Box 29, National Archives, Washington, DC.) Photographer unknown. SACATON DAM, NORTH SIDE SIPHON AND INTAKE GATES, 2/23/26 - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
14. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
14. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. MAIN (TITLED FLORENCE) CANAL, PRISON WASH CROSSING, 3/9/25 - San Carlos Irrigation Project, Marin Canal, Amhurst-Hayden Dam to Picacho Reservoir, Coolidge, Pinal County, AZ
14. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
14. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. SACATON DAM AND BRIDGE, CONSTRUCTION BRIDGE PIERS - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
16. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
16. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. MAIN (TITLED FLORENCE) CANAL, CHECK AND PIMA, LATERAL TURNOUT, 3/9/25 - San Carlos Irrigation Project, Marin Canal, Amhurst-Hayden Dam to Picacho Reservoir, Coolidge, Pinal County, AZ
15. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
15. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. MAIN (TITLED FLORENCE) CANAL, BOGARD WASH INTAKE, 3/9/25 - San Carlos Irrigation Project, Marin Canal, Amhurst-Hayden Dam to Picacho Reservoir, Coolidge, Pinal County, AZ
22. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
22. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1926. Vol. I, Narrative and Photographs, RG 75, Entry 655, Box 29, National Archives, Washington, DC.) Photographer unknown. SACATON DAM, SOUTH END WITH CANAL AND ROADWAY, 8/29/25 - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
Ahmed, Anwar E
2017-09-01
Although the literature indicates that patient delays in seeking medical support for Middle East respiratory syndrome coronavirus (MERS-CoV) infections are associated with poor clinical outcomes, delays in the diagnosis itself remain poorly understood in these patients. This study aimed to determine the median time interval from symptom onset to a confirmed diagnosis and to identify the potential predictors of this interval in Saudi Arabian MERS patients. This was a retrospective study of patients with confirmed MERS who were publicly reported by the World Health Organization (WHO). Five hundred and thirty-seven symptomatic cases of MERS-CoV infection were included. The median time interval between symptom onset and confirmation of the MERS diagnosis was 4 days (interquartile range 2-7 days), ranging from 0 to 36 days. According to the negative binomial model, the unadjusted rate ratio (RR) of delays in the diagnosis was significantly higher in older patients (>65 years) (RR 1.42), non-healthcare workers (RR 1.74), patients with severe illness (RR 1.22), those with an unknown source of infection (RR 1.84), and those who had been in close contact with camels (RR 1.74). After accounting for confounders, the adjusted rate ratio (aRR) of delays in the diagnosis was independently associated with an unknown source of infection (aRR 1.68) and close contact with camels (aRR 1.58). The time interval from symptom onset to diagnosis was greater in older patients, non-healthcare workers, patients with severe illness, patients with an unknown source of infection, and patients who had been in close contact with camels. The findings warrant educational interventions to raise general public awareness of the importance of early symptom notification. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Wong, Winsy; Low, Sam-Po
2008-01-01
The present study investigated verbal recall of semantically preserved and degraded words and nonwords by taking into consideration the status of one's semantic short-term memory (STM). Two experiments were conducted on 2 Chinese individuals with aphasia. The first experiment showed that they had largely preserved phonological processing abilities…
Aaron D. Stottlemyer
2014-01-01
The long-term impacts of even-age forest management and excessive browsing by white-tailed deer (Odocoileus virginianus) on regeneration are unknown for hardwood forests in the eastern United States. In 1965, Gene Wood, a graduate student at Pennsylvania State University, initiated a study in a mixed-oak forest in central Pennsylvania to examine...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
26 CFR 1.737-3 - Basis adjustments; Recovery rules.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...
Method and apparatus for sensor fusion
NASA Technical Reports Server (NTRS)
Krishen, Kumar (Inventor); Shaw, Scott (Inventor); Defigueiredo, Rui J. P. (Inventor)
1991-01-01
A method and apparatus for fusion of data from optical and radar sensors by an error minimization procedure is presented. The method was applied to the problem of shape reconstruction of an unknown surface at a distance. The method involves deriving an incomplete surface model from an optical sensor. The unknown characteristics of the surface are represented by some parameter. The correct value of the parameter is computed by iteratively generating theoretical predictions of the radar cross sections (RCS) of the surface, comparing the predicted and the observed values of the RCS, and improving the surface model from the results of the comparison. The theoretical RCS may be computed from the surface model in several ways. One RCS prediction technique is the method of moments. The method of moments can be applied to an unknown surface only if some shape information is available from an independent source. The optical image provides the independent information.
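The abstract above describes an iterative fit of a single unknown shape parameter to observed radar cross sections. A minimal Python sketch of that loop is given below; the forward model `predicted_rcs` is a hypothetical stand-in (not a method-of-moments solver), and the function names and synthetic data are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def predicted_rcs(param, angles):
    """Placeholder forward model for the RCS of the optically derived
    surface completed with a candidate shape parameter (assumed form)."""
    return np.cos(angles) ** 2 * (1.0 + 0.1 * param)

def fuse(observed_rcs, angles):
    """Estimate the unknown shape parameter by minimizing the mismatch
    between predicted and observed RCS, as in the iterative procedure above."""
    def error(param):
        return np.sum((predicted_rcs(param, angles) - observed_rcs) ** 2)
    result = minimize_scalar(error, bounds=(-10.0, 10.0), method="bounded")
    return result.x

# Synthetic demonstration: recover the parameter that generated the data.
angles = np.linspace(0.0, np.pi / 2, 32)
observed = predicted_rcs(3.0, angles)
print(fuse(observed, angles))   # ~3.0
```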
Kaklamanos, James; Baise, Laurie G.; Boore, David M.
2011-01-01
The ground-motion prediction equations (GMPEs) developed as part of the Next Generation Attenuation of Ground Motions (NGA-West) project in 2008 are becoming widely used in seismic hazard analyses. However, these new models are considerably more complicated than previous GMPEs, and they require several more input parameters. When employing the NGA models, users routinely face situations in which some of the required input parameters are unknown. In this paper, we present a framework for estimating the unknown source, path, and site parameters when implementing the NGA models in engineering practice, and we derive geometrically-based equations relating the three distance measures found in the NGA models. Our intent is for the content of this paper not only to make the NGA models more accessible, but also to help with the implementation of other present or future GMPEs.
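The paper derives geometric relations among the NGA distance measures; as a hedged illustration (written here from general geometry, not quoted from the paper), the simplest case of a vertical fault, where the closest point on the rupture lies on its top edge, gives the following relation.

```latex
% Illustrative relation for a vertical fault (dip = 90 degrees):
\[
  R_{\mathrm{rup}} \;=\; \sqrt{R_{\mathrm{JB}}^{2} + Z_{\mathrm{TOR}}^{2}},
\]
% where R_JB is the Joyner-Boore distance, R_rup the closest distance to the
% rupture plane, and Z_TOR the depth to the top of rupture.  Dipping faults
% and sites off the rupture ends require the fuller case analysis developed
% in the paper.
```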
CHEMICAL CONTAMINATION AND TOXICITY ASSOCIATED WITH A COASTAL GOLF COURSE COMPLEX
The increasing density of golf courses represents a potential source of contamination to nearby coastal areas, the chemical and biological magnitude of which is almost unknown. The objective of this study was to compare the concentrations of contaminants and toxicities of sedime...
SEDIMENT CHEMICAL CONTAMINATION AND TOXICITY ASSOCIATED WITH A COASTAL GOLF COURSE COMPLEX.
The increasing density of golf courses represents a potential source of sediment contamination to nearby coastal areas, the chemical and biological magnitude of which is almost unknown. The objective of this study was to determine the concentrations of contaminants and toxicities...
Development and Testing of Novel Canine Fecal Source-Identification Assays
The extent to which dogs contribute to aquatic fecal contamination is unknown despite the potential for zoonotic transfer of harmful human pathogens. Recent method comparison studies have shown that available Bacteroidales 16S rRNA-based methods for the detection of canine fecal ...
The pointwise estimates of diffusion wave of the compressible micropolar fluids
NASA Astrophysics Data System (ADS)
Wu, Zhigang; Wang, Weike
2018-09-01
The pointwise estimates for the compressible micropolar fluids in dimension three are given, which exhibit the generalized Huygens' principle for the fluid density and fluid momentum, as in the compressible Navier-Stokes equations, while the micro-rotational momentum behaves like the fluid momentum of the Euler equation with damping. To circumvent the complexity of the 7 × 7 Green's matrix, we use the decomposition of the momentums into a fluid part and an electromagnetic part to study three smaller Green's matrices. A consequence of this decomposition is a new difficulty: the nonlinear terms contain nonlocal operators. We solve it by using the natural match between these new Green's functions and the nonlinear terms. Moreover, to derive different pointwise estimates for different unknown variables, such that the estimate of each unknown variable is in agreement with its Green's function, we develop some new estimates on the nonlinear interplay between different waves.
Physicians' Knowledge of and Attitudes Toward Use of Opioids in Long-Term Care Facilities.
Griffioen, Charlotte; Willems, Eva G; Kouwenhoven, Sanne M; Caljouw, Monique A A; Achterberg, Wilco P
2017-06-01
Insufficient pain management in vulnerable older persons living in long-term care facilities is common, and opiophobia might contribute to this. As opiophobia and its related factors have not been investigated in long-term care, this study evaluates the degree of knowledge of opioids among elderly-care physicians (ECPs) and ECP trainees, as well as their attitudes and other factors possibly influencing the clinical use of opioids in these facilities. A questionnaire was designed and distributed among ECPs and ECP trainees by email, regional symposia, and all three university training faculties for elderly-care medicine in the Netherlands. Respondents were 324 ECPs and 111 ECP trainees. Fear of addiction did not influence the prescription of opioids. Main barriers to the clinical use of opioids were patients' reluctance to take opioids (83.3%); unknown degree of pain (79.2%); and pain of unknown origin (51.4%). ECPs' average knowledge scores were sufficient: those who felt that their knowledge of opioids was poor scored lower than those who felt that their knowledge was good. Factors identified in this study may help provide better pain management for vulnerable older persons living in a long-term care facility. Also, more patient information on the pros and cons of opioid use is needed, as well as appropriate tools for better clinical assessment of pain in a long-term care population. © 2016 World Institute of Pain.
[Determination of unknown impurities in cefotiam hexetil by HPLC-MS/MS].
Tang, Qun-Xing; Liu, Ming-Dong; Yan, You-Yi; Ye, Yi; Wang, Zhi-Hui; Zhan, Lan-Fen; Liao, Lin-Chuan
2013-05-01
To detect unknown impurities in raw drug material of cefotiam hexetil, high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) was employed. An Agilent SB-C18 column (150 mm x 2.1 mm i.d., 3.5 microm particles) was used for chromatographic separations of cefotiam hexetil dissolved in deionized water, with a mobile phase consisting of (A) 0.1% formic acid and (B) acetonitrile and a timed gradient program T (min)/B (%): 0/3, 5/3, 15/20, 20/40, 30/60, 40/80. The flow rate was set at 0.3 mL/min with the DAD detector wavelength fixed at 254 nm. An electrospray ionization source was applied and operated in positive ion MRM mode. The source voltage was kept at 4 kV and the cone voltage was 100 V with the mass range m/z 50-1000. Nitrogen was used as nebulizing gas and the nebulizer pressure was 40 psi. The drying gas temperature was 350 degrees C and the drying gas flow was 10 L/min. Unknown impurities of cefotiam hexetil were identified. Substance 1 was the delta3-isomer of cefotiam hexetil. The structures of 3 other substances were also determined. The method is sensitive, rapid and credible for the analysis of cefotiam hexetil and its related impurities, and can be applied in quality control of cefotiam hexetil.
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2014 CFR
2014-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2014-01-01 2014-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2012 CFR
2012-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2012-01-01 2012-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2010 CFR
2010-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2010-01-01 2010-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2013 CFR
2013-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2013-01-01 2013-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
10 CFR 50.67 - Accident source term.
Code of Federal Regulations, 2011 CFR
2011-01-01
... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2011-01-01 2011-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...
A Strong Shallow Heat Source in the Accreting Neutron Star MAXI J0556-332
NASA Astrophysics Data System (ADS)
Deibel, Alex; Cumming, Andrew; Brown, Edward F.; Page, Dany
2015-08-01
An accretion outburst in an X-ray transient deposits material onto the neutron star primary; this accumulation of matter induces reactions in the neutron star’s crust. During the accretion outburst these reactions heat the crust out of thermal equilibrium with the core. When accretion halts, the crust cools to its long-term equilibrium temperature on observable timescales. Here we examine the accreting neutron star transient MAXI J0556-332, which is the hottest transient, at the start of quiescence, observed to date. Models of the quiescent light curve require a large deposition of heat in the shallow outer crust from an unknown source. The additional heat injected is ≈4-10 MeV per accreted nucleon; when the observed decline in accretion rate at the end of the outburst is accounted for, the required heating increases to ≈6-16 MeV. This shallow heating is still required to fit the light curve even after taking into account a second accretion episode, uncertainties in distance, and different surface gravities. The amount of shallow heating is larger than that inferred for other neutron star transients and is larger than can be supplied by nuclear reactions or compositionally driven convection; but it is consistent with stored mechanical energy in the accretion disk. The high crust temperature (T_b ≳ 10^9 K) makes its cooling behavior in quiescence largely independent of the crust composition and envelope properties, so that future observations will probe the gravity of the source. Fits to the light curve disfavor the presence of Urca cooling pairs in the crust.
Simon, J; Waldhecker, P; Brüggemann, N; Rennenberg, H
2010-05-01
To investigate the short-term consequences of direct competition between beech and sycamore maple on root N uptake and N composition, mycorrhizal seedlings of both tree species were incubated for 4 days (i.e. beech only, sycamore maple only or both together) in an artificial nutrient solution with low N availability. On the fourth day, N uptake experiments were conducted to study the effects of competition on inorganic and organic N uptake. For this purpose, multiple N sources were applied with a single label. Furthermore, fine roots were sampled and analysed for total amino acids, soluble protein, total nitrogen, nitrate and ammonium content. Our results clearly show that both tree species were able to use inorganic and organic N sources. Uptake of inorganic and organic N by beech roots was negatively affected in the presence of the competing tree species. In contrast, the presence of beech stimulated inorganic N uptake by sycamore maple roots. Both the negative effect of sycamore maple on N uptake of beech and the positive effect of beech on N uptake of sycamore maple led to an increase in root soluble protein in beech, despite an overall decrease in total N concentration. Thus, beech compensated for the negative effects of the tree competitor on N uptake by incorporating less N into structural N components, but otherwise exhibited the same strategy as the competitor, namely, enhancing soluble protein levels in roots when grown under competition. It is speculated that enhanced enzyme activities of so far unknown nature are required in beech as a defence response to inter-specific competition.
NASA Astrophysics Data System (ADS)
Cerovski-Darriau, C.; Stock, J. D.; Winans, W. R.
2016-12-01
Episodic storm runoff in West Maui (Hawai'i) brings plumes of terrestrially-sourced fine sediment to the nearshore ocean environment, degrading coral reef ecosystems. The sediment pollution sources were largely unknown, though suspected to be due to modern human disturbance of the landscape, and initially assumed to be from visibly obvious exposed soil on agricultural fields and unimproved roads. To determine the sediment sources and estimate a sediment budget for the West Maui watersheds, we mapped the geomorphic processes in the field and from DEMs and orthoimagery, monitored erosion rates in the field, and modeled the sediment flux using the mapped processes and corresponding rates. We found the primary source of fine sands, silts and clays to be previously unidentified fill terraces along the stream bed. These terraces, formed during legacy agricultural activity, are the banks along 40-70% of the streams where the channels intersect human-modified landscapes. Monitoring over the last year shows that a few storms erode the fill terraces 10-20 mm annually, contributing up to 100s of tonnes of sediment per catchment. Compared to the average long-term, geologic erosion rate of 0.03 mm/yr, these fill terraces alone increase the suspended sediment flux to the coral reefs by 50-90%. Stakeholders can use our resulting geomorphic process map and sediment budget to inform the location and type of mitigation effort needed to limit terrestrial sediment pollution. We compare our mapping, monitoring, and modeling (M3) approach to NOAA's OpenNSPECT model. OpenNSPECT uses empirical hydrologic and soil erosion models paired with land cover data to compare the spatially distributed sediment yield from different land-use scenarios. We determine the relative effectiveness of calculating a baseline watershed sediment yield from each approach, and the utility of calibrating OpenNSPECT with M3 results to better forecast future sediment yields from land-use or climate change scenarios.
A direct localization of a fast radio burst and its host.
Chatterjee, S; Law, C J; Wharton, R S; Burke-Spolaor, S; Hessels, J W T; Bower, G C; Cordes, J M; Tendulkar, S P; Bassa, C G; Demorest, P; Butler, B J; Seymour, A; Scholz, P; Abruzzo, M W; Bogdanov, S; Kaspi, V M; Keimpema, A; Lazio, T J W; Marcote, B; McLaughlin, M A; Paragi, Z; Ransom, S M; Rupen, M; Spitler, L G; van Langevelde, H J
2017-01-04
Fast radio bursts are astronomical radio flashes of unknown physical nature with durations of milliseconds. Their dispersive arrival times suggest an extragalactic origin and imply radio luminosities that are orders of magnitude larger than those of all known short-duration radio transients. So far all fast radio bursts have been detected with large single-dish telescopes with arcminute localizations, and attempts to identify their counterparts (source or host galaxy) have relied on the contemporaneous variability of field sources or the presence of peculiar field stars or galaxies. These attempts have not resulted in an unambiguous association with a host or multi-wavelength counterpart. Here we report the subarcsecond localization of the fast radio burst FRB 121102, the only known repeating burst source, using high-time-resolution radio interferometric observations that directly image the bursts. Our precise localization reveals that FRB 121102 originates within 100 milliarcseconds of a faint 180-microJansky persistent radio source with a continuum spectrum that is consistent with non-thermal emission, and a faint (twenty-fifth magnitude) optical counterpart. The flux density of the persistent radio source varies by around ten per cent on day timescales, and very long baseline radio interferometry yields an angular size of less than 1.7 milliarcseconds. Our observations are inconsistent with the fast radio burst having a Galactic origin or its source being located within a prominent star-forming galaxy. Instead, the source appears to be co-located with a low-luminosity active galactic nucleus or a previously unknown type of extragalactic source. Localization and identification of a host or counterpart has been essential to understanding the origins and physics of other kinds of transient events, including gamma-ray bursts and tidal disruption events. However, if other fast radio bursts have similarly faint radio and optical counterparts, our findings imply that direct subarcsecond localizations may be the only way to provide reliable associations.
Unaccounted source of systematic errors in measurements of the Newtonian gravitational constant G
NASA Astrophysics Data System (ADS)
DeSalvo, Riccardo
2015-06-01
Many precision measurements of G have produced a spread of results incompatible with measurement errors. Clearly an unknown source of systematic errors is at work. It is proposed here that most of the discrepancies derive from subtle deviations from Hooke's law, caused by avalanches of entangled dislocations. The idea is supported by deviations from linearity reported by experimenters measuring G, similarly to what is observed, on a larger scale, in low-frequency spring oscillators. Some mitigating experimental apparatus modifications are suggested.
19. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
19. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. SACATON DAM AND BRIDGE, CANAL BRIDGE, OPERATING HOUSE AND INTAKE, SOUTH END, 2/14/25 - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
12. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
12. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. SACATON DAM AND BRIDGE FROM QUARRY HILL, PRACTICALLY COMPLETED, 6/18/25 - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
16. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
16. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. SACATON DAM AND BRIDGE, SHOWING WEIR APRONS AND BRIDGE, 6/18/25. - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
24. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
24. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1928. Vol. I, Irrigation District #4, California and Southern Arizona, RG 75, BIA-Phoenix, Box 40, National Archives, Pacific Southwest Region) Photographer unknown. SACATON DAM, CONDUIT ANCHORING AND REINFORCING STEEL, APRIL 10, 1928 - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
17. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
17. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. SACATON DAM AND BRIDGE, CONSTRUCTION OF WEIR, 1/17/25 - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
15. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
15. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. SACATON DAM AND BRIDGE, CONSTRUCTION BRIDGE DECK, 4/5/25 - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
13. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
13. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. SACATON DAM AND BRIDGE, CONSTRUCTION OF MAIN APRON, 12/9/24 - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
11. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
11. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. SACATON DAM AND BRIDGE SITE FROM QUARRY HILL, 10/1/24 - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
Ongoing outbreak of invasive listeriosis, Germany, 2012 to 2015.
Ruppitsch, Werner; Prager, Rita; Halbedel, Sven; Hyden, Patrick; Pietzka, Ariane; Huhulescu, Steliana; Lohr, Dorothee; Schönberger, Katharina; Aichinger, Elisabeth; Hauri, Anja; Stark, Klaus; Vygen, Sabine; Tietze, Erhard; Allerberger, Franz; Wilking, Hendrik
2015-01-01
Listeriosis patient isolates in Germany have shown a new identical pulsed-field gel electrophoresis (PFGE) pattern since 2012 (n = 66). Almost all isolates (Listeria monocytogenes serotype 1/2a) belonged to cases living in southern Germany, indicating an outbreak with a so far unknown source. Case numbers in 2015 are high (n = 28). No outbreak cases outside Germany have been reported. Next generation sequencing revealed the unique cluster type CT1248 and confirmed the outbreak. Investigations into the source are ongoing.
Pulsar statistics and their interpretations
NASA Technical Reports Server (NTRS)
Arnett, W. D.; Lerche, I.
1981-01-01
It is shown that a lack of knowledge concerning interstellar electron density, the true spatial distribution of pulsars, the radio luminosity source distribution of pulsars, the real ages and real aging rates of pulsars, the beaming factor (and other unknown factors causing the known sample of about 350 pulsars to be incomplete to an unknown degree) is sufficient to cause a minimum uncertainty of a factor of 20 in any attempt to determine pulsar birth or death rates in the Galaxy. It is suggested that this uncertainty must impact on suggestions that the pulsar rates can be used to constrain possible scenarios for neutron star formation and stellar evolution in general.
Real time gamma-ray signature identifier
Rowland, Mark [Alamo, CA; Gosnell, Tom B [Moraga, CA; Ham, Cheryl [Livermore, CA; Perkins, Dwight [Livermore, CA; Wong, James [Dublin, CA
2012-05-15
A real time gamma-ray signature/source identification method and system using principal components analysis (PCA) for transforming and substantially reducing one or more comprehensive spectral libraries of nuclear materials types and configurations into a corresponding concise representation/signature(s) representing and indexing each individual predetermined spectrum in principal component (PC) space, wherein an unknown gamma-ray signature may be compared against the representative signature to find a match or at least characterize the unknown signature from among all the entries in the library with a single regression or simple projection into the PC space, so as to substantially reduce processing time and computing resources and enable real-time characterization and/or identification.
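As a rough illustration of the PCA-based identification described above (a sketch with synthetic stand-in data, not the patented system's code), the spectral library is compressed once into principal-component space and each unknown spectrum is matched with a single projection and nearest-neighbor lookup.

```python
import numpy as np
from sklearn.decomposition import PCA

# Stand-in library: rows are reference gamma-ray spectra (counts per energy bin).
rng = np.random.default_rng(0)
library_spectra = rng.poisson(50, size=(200, 1024)).astype(float)
labels = [f"source_{i}" for i in range(len(library_spectra))]

# Offline step: compress the library into a low-dimensional PC space.
pca = PCA(n_components=10)
library_pc = pca.fit_transform(library_spectra)

def identify(unknown_spectrum):
    """Project an unknown spectrum into PC space and return the closest
    library entry (a single projection, no per-entry spectral fitting)."""
    pc = pca.transform(unknown_spectrum.reshape(1, -1))
    distances = np.linalg.norm(library_pc - pc, axis=1)
    best = int(np.argmin(distances))
    return labels[best], float(distances[best])

print(identify(library_spectra[42] + rng.normal(0.0, 2.0, 1024)))
```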
Electromagnetic scattering from two-dimensional thick material junctions
NASA Technical Reports Server (NTRS)
Ricoy, M. A.; Volakis, John L.
1990-01-01
The problem of plane wave diffraction by an arbitrary symmetric two-dimensional junction is examined, where Generalized Impedance Boundary Conditions (GIBCs) and Generalized Sheet Transition Conditions (GSTCs) are employed to simulate the slabs. GIBCs and GSTCs are constructed for multilayer planar slabs of arbitrary thickness and the resulting GIBC/GSTC reflection coefficients are compared with exact counterparts to evaluate the GIBCs/GSTCs. The plane wave diffraction by a multilayer material slab recessed in a perfectly conducting ground plane is formulated and solved via the Generalized Scattering Matrix Formulation (GSMF) in conjunction with the dual integral equation approach. Various scattering patterns are computed and validated with exact results where possible. The diffraction by a material discontinuity in a thick dielectric/ferrite slab is considered by modelling the constituent slabs with GSTCs. A non-unique solution in terms of unknown constants is obtained, and these constants are evaluated for the recessed slab geometry by comparison with the solution obtained therein. Several other simplified cases are also presented and discussed. An eigenfunction expansion method is introduced to determine the unknown solution constants in the general case. This procedure is applied to the non-unique solution in terms of unknown constants, and scattering patterns are presented for various slab junctions and compared with alternative results where possible.
Zhang, Huaguang; Cui, Lili; Zhang, Xin; Luo, Yanhong
2011-12-01
In this paper, a novel data-driven robust approximate optimal tracking control scheme is proposed for unknown general nonlinear systems by using the adaptive dynamic programming (ADP) method. In the design of the controller, only available input-output data is required instead of known system dynamics. A data-driven model is established by a recurrent neural network (NN) to reconstruct the unknown system dynamics using available input-output data. By adding a novel adjustable term related to the modeling error, the resultant modeling error is first guaranteed to converge to zero. Then, based on the obtained data-driven model, the ADP method is utilized to design the approximate optimal tracking controller, which consists of the steady-state controller and the optimal feedback controller. Further, a robustifying term is developed to compensate for the NN approximation errors introduced by implementing the ADP method. Based on Lyapunov approach, stability analysis of the closed-loop system is performed to show that the proposed controller guarantees the system state asymptotically tracking the desired trajectory. Additionally, the obtained control input is proven to be close to the optimal control input within a small bound. Finally, two numerical examples are used to demonstrate the effectiveness of the proposed control scheme.
11. Buttress rising above stream bed elevation. Concrete mixing plant ...
11. Buttress rising above stream bed elevation. Concrete mixing plant is at right, west tower and placement tower boom are visible. Photographer unknown, November 24, 1926. Source: Ralph Pleasant. - Waddell Dam, On Agua Fria River, 35 miles northwest of Phoenix, Phoenix, Maricopa County, AZ
Ambient betatron motion and its excitation by ``ghost lines'' in Tevatron
Shiltsev, Vladimir; Stancari, Giulio; Valishev, Alexander
2011-08-02
Transverse betatron motion of the Tevatron proton beam is measured and analysed. It is shown that the motion is coherent and excited by external sources of unknown origin. Observations of time-varying "ghost lines" in the betatron spectra are reported.
Evaluation of Two PCR-based Swine-specific Fecal Source Tracking Assays (Abstract)
Several PCR-based methods have been proposed to identify swine fecal pollution in environmental waters. However, the utility of these assays in identifying swine fecal contamination on a broad geographic scale is largely unknown. In this study, we evaluated the specificity, distr...
Solving differential equations with unknown constitutive relations as recurrent neural networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hagge, Tobias J.; Stinis, Panagiotis; Yeung, Enoch H.
We solve a system of ordinary differential equations with an unknown functional form of a sink (reaction rate) term. We assume that the measurements (time series) of state variables are partially available, and use a recurrent neural network to “learn” the reaction rate from this data. This is achieved by including discretized ordinary differential equations as part of a recurrent neural network training problem. We extend TensorFlow’s recurrent neural network architecture to create a simple but scalable and effective solver for the unknown functions, and apply it to a fed-batch bioreactor simulation problem. Use of techniques from the recent deep learning literature enables training of functions with behavior manifesting over thousands of time steps. Our networks are structurally similar to recurrent neural networks, but differ in purpose, and require modified training strategies.
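A minimal sketch of the idea described above, assuming a scalar state, an explicit-Euler discretization, and a small dense network for the unknown sink term. This uses standard TensorFlow eager APIs and is not the authors' extended RNN architecture; the dynamics, data, and network sizes are assumptions for illustration.

```python
import numpy as np
import tensorflow as tf

# Hypothetical setup: states x(t) observed on a time grid; the sink term f(x)
# in dx/dt = u(t) - f(x) is unknown and represented by a small neural network.
dt, n_steps = 0.1, 200
t = np.arange(n_steps) * dt
u = 0.5 * np.ones(n_steps, dtype=np.float32)          # known feed/input
x_obs = (1.0 - np.exp(-t)).astype(np.float32)         # stand-in measurements

sink = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(16, activation="tanh"),
    tf.keras.layers.Dense(1),
])
optimizer = tf.keras.optimizers.Adam(1e-2)

def rollout(x0):
    """Unroll the explicit-Euler discretization; the recurrence plays the
    role of the recurrent network described in the abstract."""
    x, xs = tf.constant([[x0]]), []
    for k in range(n_steps):
        x = x + dt * (u[k] - sink(x))
        xs.append(x)
    return tf.concat(xs, axis=0)[:, 0]

for epoch in range(200):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean((rollout(0.0) - x_obs) ** 2)
    grads = tape.gradient(loss, sink.trainable_variables)
    optimizer.apply_gradients(zip(grads, sink.trainable_variables))
print(float(loss))
```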
Piecewise synonyms for enhanced UMLS source terminology integration.
Huang, Kuo-Chuan; Geller, James; Halper, Michael; Cimino, James J
2007-10-11
The UMLS contains more than 100 source vocabularies and is growing via the integration of others. When integrating a new source, the source terms already in the UMLS must first be found. The easiest approach to this is simple string matching. However, string matching usually does not find all concepts that should be found. A new methodology, based on the notion of piecewise synonyms, for enhancing the process of concept discovery in the UMLS is presented. This methodology is supported by first creating a general synonym dictionary based on the UMLS. Each multi-word source term is decomposed into its component words, allowing for the generation of separate synonyms for each word from the general synonym dictionary. The recombination of these synonyms into new terms creates an expanded pool of matching candidates for terms from the source. The methodology is demonstrated with respect to an existing UMLS source. It shows a 34% improvement over simple string matching.
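A toy sketch of the piecewise-synonym expansion described above: each multi-word source term is decomposed into words, per-word synonyms are substituted, and the combinations are recombined into an expanded pool of matching candidates. The synonym dictionary here is hypothetical (in the paper it is built from the UMLS itself).

```python
from itertools import product

# Toy general synonym dictionary (assumed values; the paper derives it from the UMLS).
synonyms = {
    "kidney": ["kidney", "renal"],
    "failure": ["failure", "insufficiency"],
    "acute": ["acute"],
}

def candidate_terms(source_term):
    """Decompose a multi-word source term, substitute per-word synonyms,
    and recombine into an expanded pool of string-matching candidates."""
    words = source_term.lower().split()
    options = [synonyms.get(w, [w]) for w in words]
    return {" ".join(combo) for combo in product(*options)}

print(candidate_terms("acute kidney failure"))
# e.g. {'acute kidney failure', 'acute renal failure',
#       'acute kidney insufficiency', 'acute renal insufficiency'}
```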
Chen, H.; Smith, G. J. D.; Li, K. S.; Wang, J.; Fan, X. H.; Rayner, J. M.; Vijaykrishna, D.; Zhang, J. X.; Zhang, L. J.; Guo, C. T.; Cheung, C. L.; Xu, K. M.; Duan, L.; Huang, K.; Qin, K.; Leung, Y. H. C.; Wu, W. L.; Lu, H. R.; Chen, Y.; Xia, N. S.; Naipospos, T. S. P.; Yuen, K. Y.; Hassan, S. S.; Bahri, S.; Nguyen, T. D.; Webster, R. G.; Peiris, J. S. M.; Guan, Y.
2006-01-01
Preparedness for a possible influenza pandemic caused by highly pathogenic avian influenza A subtype H5N1 has become a global priority. The spread of the virus to Europe and continued human infection in Southeast Asia have heightened pandemic concern. It remains unknown from where the pandemic strain may emerge; current attention is directed at Vietnam, Thailand, and, more recently, Indonesia and China. Here, we report that genetically and antigenically distinct sublineages of H5N1 virus have become established in poultry in different geographical regions of Southeast Asia, indicating the long-term endemicity of the virus, and the isolation of H5N1 virus from apparently healthy migratory birds in southern China. Our data show that H5N1 influenza virus has continued to spread from its established source in southern China to other regions through transport of poultry and bird migration. The identification of regionally distinct sublineages contributes to the understanding of the mechanism for the perpetuation and spread of H5N1, providing information that is directly relevant to control of the source of infection in poultry. It points to the necessity of surveillance that is geographically broader than previously supposed and that includes H5N1 viruses of greater genetic and antigenic diversity. PMID:16473931
Welvaert, Marijke; Caley, Peter
2016-01-01
Citizen science and crowdsourcing have been emerging as methods to collect data for surveillance and/or monitoring activities. They could be gathered under the overarching term "citizen surveillance". The discipline, however, still struggles to be widely accepted in the scientific community, mainly because these activities are not embedded in a quantitative framework. This results in an ongoing discussion on how to analyze and make useful inference from these data. When considering the data collection process, we illustrate how citizen surveillance can be classified according to the nature of the underlying observation process, measured in two dimensions: the degree of observer reporting intention and the control in observer detection effort. By classifying the observation process in these dimensions we distinguish between crowdsourcing, unstructured citizen science and structured citizen science. This classification helps determine the data processing and statistical treatment of these data for making inference. Using our framework, it is apparent that published studies are overwhelmingly associated with structured citizen science, and there are well developed statistical methods for the resulting data. In contrast, methods for making useful inference from purely crowd-sourced data remain under development, with the challenges of accounting for the unknown observation process considerable. Our quantitative framework for citizen surveillance calls for an integration of citizen science and crowdsourcing and provides a way forward to solve the statistical challenges inherent to citizen-sourced data.
The Riemann-Lanczos equations in general relativity and their integrability
NASA Astrophysics Data System (ADS)
Dolan, P.; Gerber, A.
2008-06-01
The aim of this paper is to examine the Riemann-Lanczos equations and how they can be made integrable. They consist of a system of linear first-order partial differential equations that arise in general relativity, whereby the Riemann curvature tensor is generated by an unknown third-order tensor potential field called the Lanczos tensor. Our approach is based on the theory of jet bundles, where all field variables and all their partial derivatives of all relevant orders are treated as independent variables alongside the local manifold coordinates (x^a) on the given space-time manifold M. This approach is adopted in (a) Cartan's method of exterior differential systems, (b) Vessiot's dual method using vector field systems, and (c) the Janet-Riquier theory of systems of partial differential equations. All three methods allow for the most general situations under which integrability conditions can be found. They give equivalent results, namely, that involutivity is always achieved at all generic points of the jet manifold M after a finite number of prolongations. Two alternative methods that appear in the general relativity literature to find integrability conditions for the Riemann-Lanczos equations generate new partial differential equations for the Lanczos potential that introduce a source term, which is nonlinear in the components of the Riemann tensor. We show that such sources do not occur when any of methods (a), (b), or (c) is used.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Xibing; Dong, Longjun, E-mail: csudlj@163.com; Australian Centre for Geomechanics, The University of Western Australia, Crawley, 6009
This paper presents an efficient closed-form solution (ECS) for acoustic emission (AE) source location in three-dimensional structures using time difference of arrival (TDOA) measurements from N receivers, N ≥ 6. The nonlinear location equations of TDOA are simplified to linear equations. The unique analytical solution of AE sources for an unknown-velocity system is obtained by solving the linear equations. The proposed ECS method successfully solves the problems of location errors resulting from measured deviations of velocity, as well as the existence and multiplicity of solutions induced by calculations of square roots in existing closed-form methods.
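For illustration only, one standard way to linearize the TDOA equations with unknown velocity (not necessarily the exact ECS derivation of the paper) treats the source coordinates, v², and v²·t₀ as unknowns: each squared range equation is differenced against the first receiver, which removes the quadratic terms and leaves an overdetermined linear system for N ≥ 6 receivers.

```python
import numpy as np

def locate(receivers, arrival_times):
    """Estimate source position, wave speed and origin time from arrival
    times at N >= 6 receivers.  Unknowns: x0 (3), a = v**2, b = v**2 * t0.
    Each |r_i - x0|^2 = v^2 (t_i - t0)^2 is differenced against the first
    receiver's equation, leaving a linear system in (x0, a, b)."""
    r0, t0 = receivers[0], arrival_times[0]
    dt2 = arrival_times[1:, None] ** 2 - t0 ** 2
    dt1 = arrival_times[1:, None] - t0
    A = np.hstack([2.0 * (receivers[1:] - r0), dt2, -2.0 * dt1])
    rhs = np.sum(receivers[1:] ** 2, axis=1) - np.sum(r0 ** 2)
    sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    x0, a, b = sol[:3], sol[3], sol[4]
    return x0, np.sqrt(a), b / a          # position, speed, origin time

# Synthetic check with 8 receivers and noise-free arrivals.
rng = np.random.default_rng(1)
receivers = rng.uniform(0.0, 10.0, size=(8, 3))
src, v_true, t_org = np.array([4.0, 5.0, 2.0]), 3.2, 0.7
arrivals = t_org + np.linalg.norm(receivers - src, axis=1) / v_true
print(locate(receivers, arrivals))        # recovers src, 3.2, 0.7
```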
Some practical universal noiseless coding techniques
NASA Technical Reports Server (NTRS)
Rice, R. F.
1979-01-01
Some practical adaptive techniques for the efficient noiseless coding of a broad class of such data sources are developed and analyzed. Algorithms are designed for coding discrete memoryless sources which have a known symbol probability ordering but unknown probability values. A general applicability of these algorithms to solving practical problems is obtained because most real data sources can be simply transformed into this form by appropriate preprocessing. These algorithms have exhibited performance only slightly above all entropy values when applied to real data with stationary characteristics over the measurement span. Performance considerably under a measured average data entropy may be observed when data characteristics are changing over the measurement span.
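As a loose illustration of adaptive coding for sources with a known probability ordering but unknown probability values (a simplification inspired by, not identical to, the Rice algorithms), a coder can rank-map symbols so that small integers are the most probable and then pick, block by block, the Golomb-Rice parameter that minimizes the coded length. All names and the block format below are assumptions for illustration.

```python
def rice_encode(value, k):
    """Rice-code a non-negative integer: unary quotient, then k remainder bits."""
    q, r = value >> k, value - ((value >> k) << k)
    bits = "1" * q + "0"
    return bits + format(r, f"0{k}b") if k > 0 else bits

def encode_block(symbols, k_candidates=range(0, 8)):
    """Choose the k that minimizes the coded length of this block (symbols are
    assumed already rank-mapped), then emit a small k header plus the codewords."""
    best_k = min(k_candidates,
                 key=lambda k: sum(len(rice_encode(s, k)) for s in symbols))
    payload = "".join(rice_encode(s, best_k) for s in symbols)
    return format(best_k, "03b") + payload

print(encode_block([0, 1, 0, 3, 2, 0, 5, 1]))
```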
Travels of airborne pollen. Final report, 1 Oct 1970--31 Dec 1974
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The following studies were conducted on the transport and dispersion of airborne pollen: (a) development and evaluation of sampling devices for pollen; (b) development and evaluation of techniques for tagging pollen in living plants with dyes and radioisotopes; (c) dispersion and deposition of pollen from known sources of various configurations; (d) effects of forested areas on the removal of pollen from the atmosphere; (e) concentration variations of pollen from natural sources with distance, height, time and other variables; (f) feasibility of predicting ragweed pollen concentrations from unknown sources; (g) measurements of ragweed pollen concentrations in a large source-free area; and (h) comparisons of the ragweed pollen concentrations before and after ragweed eradication efforts. (GRA)
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Alexandrov, B.
2014-12-01
The identification of the physical sources causing spatial and temporal fluctuations of state variables such as river stage levels and aquifer hydraulic heads is challenging. The fluctuations can be caused by variations in natural and anthropogenic sources such as precipitation events, infiltration, groundwater pumping, barometric pressures, etc. The source identification and separation can be crucial for conceptualization of the hydrological conditions and characterization of system properties. If the original signals that cause the observed state-variable transients can be successfully "unmixed", decoupled physics models may then be applied to analyze the propagation of each signal independently. We propose a new model-free inverse analysis of transient data based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS) coupled with the k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, and the physical mechanisms and properties controlling the signal propagation through the system. A classical BSS conundrum is the so-called "cocktail-party" problem, where several microphones are recording the sounds in a ballroom (music, conversations, noise, etc.). Each of the microphones is recording a mixture of the sounds. The goal of BSS is to "unmix" and reconstruct the original sounds from the microphone records. Similarly to the "cocktail-party" problem, our model-free analysis only requires information about state-variable transients at a number of observation points, m, where m > r, and r is the number of unknown unique sources causing the observed fluctuations. We apply the analysis to a dataset from the Los Alamos National Laboratory (LANL) site. We identify the sources as barometric-pressure and water-supply pumping effects and estimate their impacts. We also estimate the location of the water-supply pumping wells based on the available data. The possible applications of the NMFk algorithm are not limited to hydrology problems; NMFk can be applied to any problem where temporal system behavior is observed at multiple locations and an unknown number of physical sources are causing these fluctuations.
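A compact sketch of the NMFk idea using off-the-shelf scikit-learn components (an interpretation of the abstract, not the LANL implementation): for each candidate number of sources r, NMF is run from many random starts, the resulting source signatures are pooled and clustered with k-means, and the reproducibility of the clusters (silhouette score) selects r.

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def nmfk(X, r_candidates=range(2, 6), n_runs=20, seed=0):
    """X: (n_observation_points, n_times) non-negative array of transients.
    Returns the most reproducible number of sources and the per-r scores."""
    rng = np.random.default_rng(seed)
    scores = {}
    for r in r_candidates:
        signatures = []
        for _ in range(n_runs):
            model = NMF(n_components=r, init="random",
                        random_state=int(rng.integers(1_000_000)), max_iter=500)
            model.fit_transform(X)                       # mixing weights (unused here)
            H = model.components_                        # candidate source signatures
            H = H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-12)
            signatures.append(H)
        S = np.vstack(signatures)                        # (n_runs * r, n_times)
        labels = KMeans(n_clusters=r, n_init=10).fit_predict(S)
        scores[r] = silhouette_score(S, labels)
    return max(scores, key=scores.get), scores

# Tiny synthetic demo: 10 observation points mixing 3 non-negative sources.
rng = np.random.default_rng(42)
true_H = np.abs(rng.normal(size=(3, 200)))
X = np.abs(rng.normal(size=(10, 3))) @ true_H
print(nmfk(X))
```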
Low birth weight and air pollution in California: Which sources and components drive the risk?
Laurent, Olivier; Hu, Jianlin; Li, Lianfa; Kleeman, Michael J; Bartell, Scott M; Cockburn, Myles; Escobedo, Loraine; Wu, Jun
2016-01-01
Intrauterine growth restriction has been associated with exposure to air pollution, but there is a need to clarify which sources and components are most likely responsible. This study investigated the associations between low birth weight (LBW, <2500g) in term born infants (≥37 gestational weeks) and air pollution by source and composition in California, over the period 2001-2008. Complementary exposure models were used: an empirical Bayesian kriging model for the interpolation of ambient pollutant measurements, a source-oriented chemical transport model (using California emission inventories) that estimated fine and ultrafine particulate matter (PM2.5 and PM0.1, respectively) mass concentrations (4km×4km) by source and composition, a line-source roadway dispersion model at fine resolution, and traffic index estimates. Birth weight was obtained from California birth certificate records. A case-cohort design was used. Five controls per term LBW case were randomly selected (without covariate matching or stratification) from among term births. The resulting datasets were analyzed by logistic regression with a random effect by hospital, using generalized additive mixed models adjusted for race/ethnicity, education, maternal age and household income. In total 72,632 singleton term LBW cases were included. Term LBW was positively and significantly associated with interpolated measurements of ozone but not total fine PM or nitrogen dioxide. No significant association was observed between term LBW and primary PM from all sources grouped together. A positive significant association was observed for secondary organic aerosols. Exposure to elemental carbon (EC), nitrates and ammonium were also positively and significantly associated with term LBW, but only for exposure during the third trimester of pregnancy. Significant positive associations were observed between term LBW risk and primary PM emitted by on-road gasoline and diesel or by commercial meat cooking sources. Primary PM from wood burning was inversely associated with term LBW. Significant positive associations were also observed between term LBW and ultrafine particle numbers modeled with the line-source roadway dispersion model, traffic density and proximity to roadways. This large study based on complementary exposure metrics suggests that not only primary pollution sources (traffic and commercial meat cooking) but also EC and secondary pollutants are risk factors for term LBW. Copyright © 2016 Elsevier Ltd. All rights reserved.
2012-12-01
… commercial feasibility; involved materials of unknown biocompatibility that could not be readily substituted; or were low yield in practice. Some approaches required relatively expensive capital … possible micro-biocompatibility issues was a tradeoff that had to be made. The following was determined: 1) Use a low molecular weight silicone …
NASA Astrophysics Data System (ADS)
Saikia, C. K.; Roman-nieves, J. I.; Woods, M. T.
2013-12-01
Source parameters of nuclear and chemical explosions are often estimated by matching either the corner frequency and spectral level of a single event or the spectral ratio when spectra from two events are available with known source parameters for one. In this study, we propose an alternative method in which waveforms from two or more events can be simultaneously equalized by setting the differential of the processed seismograms at one station from any two individual events to zero. The method involves convolving the equivalent Mueller-Murphy displacement source time function (MMDSTF) of one event with the seismogram of the second event and vice versa, and then computing their difference seismogram. The MMDSTF is computed at the elastic radius including both near-field and far-field terms. For this method to yield accurate source parameters, an inherent assumption is that the Green's functions for any paired events from the source to a receiver are the same. In the frequency limit of the seismic data, this is a reasonable assumption, based on the comparison of Green's functions computed for flat-earth models at various source depths ranging from 100 m to 1 km. Frequency-domain analysis of the initial P wave is, however, sensitive to the depth-phase interaction and, if tracked meticulously, can help estimate the event depth. We applied this method to the local waveforms recorded from the three SPE shots and precisely determined their yields. These high-frequency seismograms exhibit significant lateral path effects in spectrogram analysis and 3D numerical computations, but the source equalization technique is independent of any variation as long as their instrument characteristics are well preserved. We are currently estimating the uncertainty in the derived source parameters assuming the yields of the SPE shots as unknown. We also collected regional waveforms from 95 NTS explosions at regional stations ALQ, ANMO, CMB, COR, JAS, LON, PAS, PFO and RSSD. We are currently employing a station-based analysis using the equalization technique to estimate depths and yields of many relative to those of the announced explosions, and to develop their relationship with the Mw and Mo for the NTS explosions.
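In symbols, the equalization condition described above can be written as follows (a hedged paraphrase of the abstract, with w_i the MMDSTFs, s_i the recorded seismograms, and g the shared Green's function).

```latex
% For two co-located events recorded at the same station,
%   s_i(t) = w_i(t) * g(t),  i = 1, 2,
% so cross-convolving and differencing eliminates the common path term g(t):
\[
  d(t) \;=\; w_2(t) * s_1(t) \;-\; w_1(t) * s_2(t)
        \;=\; \bigl(w_2 * w_1 \;-\; w_1 * w_2\bigr) * g(t) \;=\; 0 ,
\]
% and the source parameters entering w_1 and w_2 are adjusted until d(t)
% vanishes within the data noise.
```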
NASA Astrophysics Data System (ADS)
Singh, Sarvesh Kumar; Kumar, Pramod; Rani, Raj; Turbelin, Grégory
2017-04-01
The study highlights a theoretical comparison and various interpretations of a recent inversion technique, called renormalization, developed for the reconstruction of unknown tracer emissions from their measured concentrations. The comparative interpretations are presented in relation to the other inversion techniques based on principle of regularization, Bayesian, minimum norm, maximum entropy on mean, and model resolution optimization. It is shown that the renormalization technique can be interpreted in a similar manner to other techniques, with a practical choice of a priori information and error statistics, while eliminating the need of additional constraints. The study shows that the proposed weight matrix and weighted Gram matrix offer a suitable deterministic choice to the background error and measurement covariance matrices, respectively, in the absence of statistical knowledge about background and measurement errors. The technique is advantageous since it (i) utilizes weights representing a priori information apparent to the monitoring network, (ii) avoids dependence on background source estimates, (iii) improves on alternative choices for the error statistics, (iv) overcomes the colocalization problem in a natural manner, and (v) provides an optimally resolved source reconstruction. A comparative illustration of source retrieval is made by using the real measurements from a continuous point release conducted in Fusion Field Trials, Dugway Proving Ground, Utah.
NASA Technical Reports Server (NTRS)
Hansen, James R. (Editor); Taylor, D. Bryan; Kinney, Jeremy; Lee, J. Lawrence
2003-01-01
This first volume, plus the succeeding five now in preparation, covers the impact of aerodynamic development on the evolution of the airplane in America. As the six-volume series will ultimately demonstrate, just as the airplane is a defining technology of the twentieth century, aerodynamics has been the defining element of the airplane. Volumes two through six will proceed in roughly chronological order, covering such developments as the biplane, the advent of commercial airliners, flying boats, rotary aircraft, supersonic flight, and hypersonic flight. This series is designed as an aeronautics companion to the Exploring the Unknown: Selected Documents in the History of the U.S. Civil Space Program (NASA SP-4407) series of books. As with Exploring the Unknown, the documents collected during this research project were assembled from a diverse number of public and private sources. A major repository of primary source materials relative to the history of the civil space program is the NASA Historical Reference Collection in the NASA Headquarters History Office. Historical materials housed at NASA field centers, academic institutions, and Presidential libraries were other sources of documents considered for inclusion, as were papers in the archives of private individuals and corporations.
Achievement and School Behavior among Children with Epilepsy.
ERIC Educational Resources Information Center
Matthews, Wendy S.; And Others
1983-01-01
Compared the school behavior of 15 epileptic children with that of diabetic and healthy children. The epileptic children were more likely to attribute the success or failure of their school performance to unknown sources of control, and to hold less positive feelings about school and their own self-worth. (Author)
ENVIRONMENTAL EFFECTS OF A GOLF COMPLEX ON COASTAL WETLANDS IN THE GULF OF MEXICO
The increasing density of golf courses represents a potential source of contamination to nearby coastal wetlands and other near-shore areas. The chemical and biological magnitude of the problem is almost unknown. To provide perspective on this issue, the effects of golf complex r...
USDA-ARS?s Scientific Manuscript database
Plasmodium parasites are known to manipulate the behaviour of their vectors so as to enhance their transmission. However, it is unknown if this vector manipulation also affects mosquito-plant interaction and sugar uptake. Dual-choice olfactometer and probing assays were used to study plant seeking b...
Timing Studies of X-Ray Binary Orbits
2003-01-01
It is a remarkable coincidence that there are two unrelated X-ray pulsars in Centaurus with nearly identical spin periods separated by only 15… …associated with a B-type supergiant companion (V830 Centaurus). Until our study, this source had a rather sparse observational history and an unknown…
12. Close up view of construction on the downstream face. Track at lower center conveyed aggregate from the stream bed to the mixing plant. Photographer unknown, October 15, 1924. Source: Salt River Project. - Mormon Flat Dam, On Salt River, Eastern Maricopa County, east of Phoenix, Phoenix, Maricopa County, AZ
Why environmental scientists are becoming Bayesians
James S. Clark
2005-01-01
Advances in computational statistics provide a general framework for the high dimensional models typically needed for ecological inference and prediction. Hierarchical Bayes (HB) represents a modelling structure with capacity to exploit diverse sources of information, to accommodate influences that are unknown (or unknowable), and to draw inference on large numbers of...
X-Ray Fluorescence Spectroscopy for Analysis of Explosive-Related Materials and Unknowns
2017-08-01
locally sourced baking soda) were added for further investigation. [Section 3.2.5, Sample Cups and Film] The sample cups used for this work were Chemplex… [figure caption] …from Na and Cl; the photograph was taken after irradiation, which induced the tan coloring. [sample] Sodium bicarbonate: generic, store-brand baking soda.
Threshold friction velocity of soils within the Columbia Plateau
USDA-ARS?s Scientific Manuscript database
Wind erosion only occurs when the friction velocity exceeds the threshold friction velocity (TFV) of the surface. The TFV of loessial soils commonly found across the Columbia Plateau region of the U.S. Pacific Northwest is virtually unknown even though these soils are highly erodible and a source of...
Understanding Our Changing World through Mapping and Geotechnologies
ERIC Educational Resources Information Center
Kerski, Joseph
2008-01-01
People have always been fascinated with investigating their home--the Earth. For centuries, maps have stirred imaginations and inspired explorations of the unknown. Maps are a rich source of information, showing spatial relationships between climate, vegetation, population, landforms, river systems, land use, soils, natural hazards, and much more.…
High-order scheme for the source-sink term in a one-dimensional water temperature model
Jing, Zheng; Kang, Ling
2017-01-01
The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method was assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in an excellent agreement with measured data. PMID:28264005
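A minimal sketch of the two-step operator splitting described above is given below, assuming a uniform vertical grid, constant diffusivity, a simple explicit source-sink update (not the authors' high-order scheme), and a made-up solar-heating profile.

```python
import numpy as np

def crank_nicolson_matrices(n, r):
    """Tridiagonal Crank-Nicolson matrices for T_t = D T_zz with
    zero-flux (insulated) boundaries, r = D*dt/dz^2."""
    A = np.zeros((n, n)); B = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 1 + r;  B[i, i] = 1 - r
        if i > 0:     A[i, i-1] = -r/2; B[i, i-1] = r/2
        if i < n-1:   A[i, i+1] = -r/2; B[i, i+1] = r/2
    A[0, 0] = A[-1, -1] = 1 + r/2      # ghost-point zero-flux closure
    B[0, 0] = B[-1, -1] = 1 - r/2
    return A, B

def step(T, dt, dz, D, source):
    """Operator splitting: (1) source-sink update, (2) CN diffusion."""
    T_star = T + dt * source                    # step 1 (explicit here)
    A, B = crank_nicolson_matrices(T.size, D * dt / dz**2)
    return np.linalg.solve(A, B @ T_star)       # step 2: diffusion

# Hypothetical solar heating decaying with depth (illustrative numbers only).
n, dz, dt, D = 50, 0.2, 60.0, 1e-4
z = dz * np.arange(n)
source = 2e-4 * np.exp(-z / 1.0)                # K per second, made up
T = 15.0 * np.ones(n)
for _ in range(100):
    T = step(T, dt, dz, D, source)
```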
A Mobile Anchor Assisted Localization Algorithm Based on Regular Hexagon in Wireless Sensor Networks
Rodrigues, Joel J. P. C.
2014-01-01
Localization is one of the key technologies in wireless sensor networks (WSNs), since it provides fundamental support for many location-aware protocols and applications. Constraints of cost and power consumption make it infeasible to equip each sensor node in the network with a global positioning system (GPS) unit, especially for large-scale WSNs. A promising method to localize unknown nodes is to use several mobile anchors which are equipped with GPS units, moving among unknown nodes and periodically broadcasting their current locations to help nearby unknown nodes with localization. This paper proposes a mobile anchor assisted localization algorithm based on regular hexagon (MAALRH) in two-dimensional WSNs, which can cover the whole monitoring area with a boundary compensation method. Unknown nodes calculate their positions by using trilateration. We compare the MAALRH with HILBERT, CIRCLES, and S-CURVES algorithms in terms of localization ratio, localization accuracy, and path length. Simulations show that the MAALRH can achieve high localization ratio and localization accuracy when the communication range is not smaller than the trajectory resolution. PMID:25133212
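The trilateration step that an unknown node performs once it has collected three (or more) anchor positions and range estimates can be sketched as follows; this is a generic least-squares formulation and does not reproduce the MAALRH trajectory or boundary compensation logic.

```python
import numpy as np

def trilaterate(anchors, distances):
    """Least-squares 2-D trilateration: linearize the circle equations
    by subtracting the first from the others and solve for (x, y)."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    x0, y0 = anchors[0]
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0]**2 - d[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - x0**2 - y0**2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Example: three anchor broadcast points (e.g., hexagon vertices) and ranges.
anchors = [(0.0, 0.0), (10.0, 0.0), (5.0, 8.66)]
true_pos = np.array([4.0, 3.0])
dists = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
print(trilaterate(anchors, dists))   # ~ [4.0, 3.0]
```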
... at birth; tumor of the hormonal and nervous systems (neuroblastoma); unknown causes. In some cases the cause of Horner syndrome cannot be identified. This is known as idiopathic Horner syndrome.
Near Real-Time Imaging of the Galactic Plane with BATSE
NASA Technical Reports Server (NTRS)
Harmon, B. A.; Zhang, S. N.; Robinson, C. R.; Paciesas, W. S.; Barret, D.; Grindlay, J.; Bloser, P.; Monnelly, C.
1997-01-01
The discovery of new transient or persistent sources in the hard X-ray regime with the BATSE Earth occultation technique has previously been limited to bright sources of about 200 mCrab or more. While monitoring known source locations is not a problem down to a daily limiting sensitivity of about 75 mCrab, the lack of a reliable background model forces us to use more intensive computational techniques to find weak, previously unknown emission from hard X-ray/gamma-ray sources. The combination of Radon-transform imaging of the galactic plane in 10 by 10 degree fields and the Harvard/CfA-developed Image Search (CBIS) allows us to straightforwardly search the sky for candidate sources in a +/- 20 degree latitude band along the plane. This procedure has been operating routinely on a weekly basis since spring 1997. We briefly describe the procedure, then concentrate on the performance aspects of the technique and candidate source results from the search.
Substance abuse treatment and services by criminal justice and other funding sources.
Arfken, Cynthia L; Kubiak, Sheryl Pimlott
2009-01-01
Studies have found that funding source, whether public or private, is associated with the treatment and services offered in community-based agencies. However, the association of criminal justice funding with community-based treatment and services is unknown. Using a mixed-method case study approach with 34 agencies within one state, we assessed administrators' perspectives on the most important funding source and the treatment and services offered. We found that agencies rely on multiple funding sources and that the source rated most important was associated with the treatment and services offered in the agency. Those agencies citing a criminal justice entity as the most important funder were more likely to offer specific ancillary services and to adopt motivational interviewing than those citing private funds. Although client characteristics or training opportunities may determine these services and practices, the agency's most important funding source may have implications for the services offered.
Puerta, Elena; Hervias, Isabel; Goñi-Allo, Beatriz; Zhang, Steven F; Jordán, Joaquín; Starkov, Anatoly A; Aguirre, Norberto
2010-01-01
Background and purpose: 3,4-methylenedioxymethamphetamine (MDMA) causes a persistent loss of dopaminergic cell bodies in the substantia nigra of mice. Current evidence indicates that such neurotoxicity is due to oxidative stress but the source of free radicals remains unknown. Inhibition of mitochondrial electron transport chain complexes by MDMA was assessed as a possible source. Experimental approach: Activities of mitochondrial complexes after MDMA were evaluated spectrophotometrically. In situ visualization of superoxide production in the striatum was assessed by ethidium fluorescence and striatal dopamine levels were determined by HPLC as an index of dopaminergic toxicity. Key results: 3,4-methylenedioxymethamphetamine decreased mitochondrial complex I activity in the striatum of mice, an effect accompanied by an increased production of superoxide radicals and the inhibition of endogenous aconitase. α-Lipoic acid prevented superoxide generation and long-term toxicity independent of any effect on complex I inhibition. These effects of α-lipoic acid were also associated with a significant increase of striatal glutathione levels. The relevance of glutathione was supported by reducing striatal glutathione content with L-buthionine-(S,R)-sulfoximine, which exacerbated MDMA-induced dopamine deficits, effects suppressed by α-lipoic acid. The nitric oxide synthase inhibitor, NG-nitro-L-arginine, partially prevented MDMA-induced dopamine depletions, an effect reversed by L-arginine but not D-arginine. Finally, a direct relationship between mitochondrial complex I inhibition and long-term dopamine depletions was found in animals treated with MDMA in combination with 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine. Conclusions and implications: Inhibition of mitochondrial complex I following MDMA could be the source of free radicals responsible for oxidative stress and the consequent neurotoxicity of this drug in mice. This article is commented on by Moncada, pp. 217–219 of this issue. To view this commentary visit http://dx.doi.org/10.1111/j.1476-5381.2010.00706.x and to view related papers in this issue by Pravdic et al. and Kurz et al. visit http://dx.doi.org/10.1111/j.1476-5381.2010.00698.x and http://dx.doi.org/10.1111/j.1476-5381.2010.00656.x PMID:20423338
Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Waithe, Kenrick A.
2004-01-01
A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
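The general idea of a vane source term can be sketched as below: a side force proportional to the local flow is added to the momentum residual of the cells occupied by the vane. This is a simplified placeholder under stated assumptions (the constant c_vg, the force model, and the data layout are all hypothetical), not the OVERFLOW implementation.

```python
import numpy as np

def vane_source_term(u, v, rho, vane_cells, vane_normal, area, c_vg=1.0):
    """Add a side-force source (per unit volume) to the x/y momentum
    residuals of cells tagged as containing a vortex-generator vane.
    The force is modeled as proportional to the local dynamic pressure and
    to the component of the velocity normal to the vane plane (a simple
    lifting-surface-like placeholder, not a calibrated model)."""
    S_x = np.zeros_like(u)
    S_y = np.zeros_like(v)
    nx, ny = vane_normal                       # unit normal of the vane plane
    for (i, j) in vane_cells:
        vel = np.array([u[i, j], v[i, j]])
        speed = np.linalg.norm(vel) + 1e-12
        v_n = vel[0] * nx + vel[1] * ny        # velocity normal to the vane
        q = 0.5 * rho[i, j] * speed**2         # local dynamic pressure
        force = c_vg * area * q * (v_n / speed)
        S_x[i, j] -= force * nx                # side force opposing the
        S_y[i, j] -= force * ny                # flow-through component
    return S_x, S_y
```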
Identification of "Known Unknowns" Utilizing Accurate Mass Data and ChemSpider
NASA Astrophysics Data System (ADS)
Little, James L.; Williams, Antony J.; Pshenichnov, Alexey; Tkachenko, Valery
2012-01-01
In many cases, an unknown to an investigator is actually known in the chemical literature, a reference database, or an internet resource. We refer to these types of compounds as "known unknowns." ChemSpider is a very valuable internet database of known compounds useful in the identification of these types of compounds in commercial, environmental, forensic, and natural product samples. The database contains over 26 million entries from hundreds of data sources and is provided as a free resource to the community. Accurate-mass mass spectrometry data are used to query the database by either elemental composition or a monoisotopic mass. Searching by elemental composition is the preferred approach. However, it is often difficult to determine a unique elemental composition for compounds with molecular weights greater than 600 Da. In these cases, searching by the monoisotopic mass is advantageous. In either case, the search results are refined by sorting the number of references associated with each compound in descending order. This raises the most useful candidates to the top of the list for further evaluation. These approaches were shown to be successful in identifying "known unknowns" noted in our laboratory and for compounds of interest to others.
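The mass-query-and-rank workflow can be sketched as follows; the candidate records and reference counts are hypothetical stand-ins for a database query result, and no real ChemSpider API call is made.

```python
def rank_known_unknowns(candidates, measured_mass, tol_ppm=5.0):
    """Filter candidate compounds by monoisotopic mass (ppm window)
    and sort the survivors by number of associated references,
    descending, so the most-reported compounds rise to the top."""
    tol = measured_mass * tol_ppm * 1e-6
    hits = [c for c in candidates
            if abs(c["monoisotopic_mass"] - measured_mass) <= tol]
    return sorted(hits, key=lambda c: c["references"], reverse=True)

# Hypothetical candidates as they might come back from a database query.
candidates = [
    {"name": "compound A", "monoisotopic_mass": 285.1365, "references": 412},
    {"name": "compound B", "monoisotopic_mass": 285.1367, "references": 3},
    {"name": "compound C", "monoisotopic_mass": 286.2000, "references": 90},
]
print(rank_known_unknowns(candidates, measured_mass=285.1366))
```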
A Well-Balanced Path-Integral f-Wave Method for Hyperbolic Problems with Source Terms
2014-01-01
Systems of hyperbolic partial differential equations with source terms (balance laws) arise in many applications where it is important to compute accurate time-dependent solutions modeling small perturbations of equilibrium solutions in which the source terms balance the hyperbolic part. The f-wave version of the wave-propagation algorithm is one approach, but requires the use of a particular averaged value of the source terms at each cell interface in order to be “well balanced” and exactly maintain steady states. A general approach to choosing this average is developed using the theory of path conservative methods. A scalar advection equation with a decay or growth term is introduced as a model problem for numerical experiments. PMID:24563581
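For the scalar model problem mentioned above (advection with a decay term, q_t + u q_x = -λq), a first-order f-wave-style update can be sketched as below; the arithmetic-mean source average used here is one simple choice among those the path-conservative framework makes precise, shown together with a check that a discrete equilibrium is preserved.

```python
import numpy as np

def fwave_step(q, u, lam, dx, dt):
    """One first-order f-wave update for q_t + u q_x = -lam * q (u > 0).
    The jump in flux minus the interface-averaged source forms the f-wave;
    when the two balance (discrete equilibrium) the wave vanishes and the
    steady state is preserved exactly. Inflow cell q[0] is held fixed."""
    qm, qp = q[:-1], q[1:]
    psi = -lam * 0.5 * (qm + qp)          # arithmetic-mean source average
    fwave = u * (qp - qm) - dx * psi      # f-wave at each interface
    q_new = q.copy()
    q_new[1:] -= dt / dx * fwave          # u > 0: fluctuations move right
    return q_new

# Start from the discrete equilibrium and verify it is maintained.
n, dx, u, lam = 100, 0.1, 1.0, 0.5
ratio = (u - lam * dx / 2) / (u + lam * dx / 2)
q = ratio ** np.arange(n)                 # discrete steady profile
q_new = fwave_step(q, u, lam, dx, dt=0.5 * dx / u)
print(np.max(np.abs(q_new - q)))          # ~ machine precision
```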
Insulin glulisine in the management of diabetes
Yamada, Satoru
2009-01-01
Insulin glulisine is appealing in principle, but the advantages of this drug over the other rapid-acting insulin analogs are still relatively unknown. The frequency of hypoglycemia, convenience in the timing of administration, and improvements in terms of HbA1c seem similar among the rapid-acting insulin analogs, including insulin glulisine. Only properly randomized long-term clinical studies with insulin glulisine will reveal the true value of this novel insulin analog. PMID:21437124
Rigorous analysis of thick microstrip antennas and wire antennas embedded in a substrate
NASA Astrophysics Data System (ADS)
Smolders, A. B.
1992-07-01
An efficient and rigorous method for the analysis of electrically thick rectangular microstrip antennas and wire antennas with a dielectric cover is presented. The method of moments is used in combination with the exact spectral-domain Green's function in order to find the unknown currents on the antenna. The microstrip antenna is fed by a coaxial cable. A proper model of the feeding coaxial structure is used. In addition, a special attachment mode was applied to ensure continuity of current at the patch-coax transition. The efficiency of the method of moments is improved by using the so-called source term extraction technique, in which a great part of the infinite integrals involved in the method-of-moments formulation is calculated analytically. Computation time can be saved by selecting a set of basis functions that describes the current distribution on the patch and probe in an accurate way using only a few terms of this set. Thick microstrip antennas have broadband characteristics. However, a proper match to 50 Ohms is often difficult. This matching problem can be avoided by using a slightly different excitation structure. The patch is now electromagnetically coupled to the feeding probe. A bandwidth of more than 40% can easily be obtained for this type of microstrip antenna. The price to be paid is a degradation of the radiation characteristics.
Audio-tactile integration and the influence of musical training.
Kuchenbuch, Anja; Paraskevopoulos, Evangelos; Herholz, Sibylle C; Pantev, Christo
2014-01-01
Perception of our environment is a multisensory experience; information from different sensory systems like the auditory, visual and tactile is constantly integrated. Complex tasks that require high temporal and spatial precision of multisensory integration put strong demands on the underlying networks but it is largely unknown how task experience shapes multisensory processing. Long-term musical training is an excellent model for brain plasticity because it shapes the human brain at functional and structural levels, affecting a network of brain areas. In the present study we used magnetoencephalography (MEG) to investigate how audio-tactile perception is integrated in the human brain and if musicians show enhancement of the corresponding activation compared to non-musicians. Using a paradigm that allowed the investigation of combined and separate auditory and tactile processing, we found a multisensory incongruency response, generated in frontal, cingulate and cerebellar regions, an auditory mismatch response generated mainly in the auditory cortex and a tactile mismatch response generated in frontal and cerebellar regions. The influence of musical training was seen in the audio-tactile as well as in the auditory condition, indicating enhanced higher-order processing in musicians, while the sources of the tactile MMN were not influenced by long-term musical training. Consistent with the predictive coding model, more basic, bottom-up sensory processing was relatively stable and less affected by expertise, whereas areas for top-down models of multisensory expectancies were modulated by training.
Nonuniformity correction of imaging systems with a spatially nonhomogeneous radiation source.
Gutschwager, Berndt; Hollandt, Jörg
2015-12-20
We present a novel method of nonuniformity correction of imaging systems in a wide optical spectral range by applying a radiation source with an unknown and spatially nonhomogeneous radiance or radiance temperature distribution. The benefit of this method is that it can be applied with radiation sources of arbitrary spatial radiance or radiance temperature distribution and requires only sufficient temporal stability of this distribution during the measurement process. The method is based on the recording of several (at least three) images of a radiation source and a purposeful row- and line-shift of these subsequent images in relation to the first primary image. The mathematical procedure is explained in detail. Its numerical verification with a source of a predefined nonhomogeneous radiance distribution and a thermal imager of a predefined nonuniform focal-plane-array responsivity is presented.
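Reduced to one dimension and a single one-pixel shift (a simplification of the several shifted recordings the method actually uses), the core idea can be sketched as follows: each recording is pixel responsivity times scene radiance, so ratios between shifted and unshifted recordings chain together the relative responsivities even though the radiance distribution itself stays unknown.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
resp = 1.0 + 0.1 * rng.standard_normal(n)                 # unknown responsivities
radiance = 1.0 + 0.5 * np.sin(np.linspace(0, 3, n + 1))   # unknown, nonuniform

img0 = resp * radiance[:n]        # recording of the source
img1 = resp * radiance[1:n + 1]   # recording after a one-pixel source shift

# Pixel i+1 in the first image and pixel i in the shifted image saw the same
# radiance, so their ratio is the ratio of responsivities; chain the ratios.
ratios = img0[1:] / img1[:-1]
rel_resp = np.concatenate(([1.0], np.cumprod(ratios)))
rel_resp /= rel_resp.mean()       # fix the arbitrary overall scale

print(np.max(np.abs(rel_resp - resp / resp.mean())))   # ~ machine precision
```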
Three studies using Ceriodaphnia to detect nonpoint sources of metals from mine drainage
Nimmo, Del Wayne R.; Dodson, Max H.; Davies, Patrick H.; Greene, Joseph C.; Kerr, Mark A.
1990-01-01
Since its introduction, Ceriodaphnia dubia, a small planktonic daphnid, has been widely used for biomonitoring point source discharges. This species was also used to determine nonpoint sources of metals and related contaminants in three trout streams in the west where mining activities have been widespread. Along Chalk Creek, Colo., specific tailings (and impacted tributaries) were sources of metals toxic to fish using the water in a hatchery. At stations below extensive mine tailings in the upper Clark Fork River, Mont., drainage was acutely and chronically toxic to daphnids and paralleled reduced or nonexistent populations of trout. In Whitewood Creek, S. Dak., reduced toxicity below a gold mine portended that fish could live in the stream segment previously impaired by the mine. Toxicity downstream revealed a previously unknown nonpoint source of chromium.
Inverse random source scattering for the Helmholtz equation in inhomogeneous media
NASA Astrophysics Data System (ADS)
Li, Ming; Chen, Chuchu; Li, Peijun
2018-01-01
This paper is concerned with an inverse random source scattering problem in an inhomogeneous background medium. The wave propagation is modeled by the stochastic Helmholtz equation with the source driven by additive white noise. The goal is to reconstruct the statistical properties of the random source such as the mean and variance from the boundary measurement of the radiated random wave field at multiple frequencies. Both the direct and inverse problems are considered. We show that the direct problem has a unique mild solution by a constructive proof. For the inverse problem, we derive Fredholm integral equations, which connect the boundary measurement of the radiated wave field with the unknown source function. A regularized block Kaczmarz method is developed to solve the ill-posed integral equations. Numerical experiments are included to demonstrate the effectiveness of the proposed method.
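A minimal sketch of a relaxed, randomized Kaczmarz iteration for an ill-posed linear system is given below; the block structure and the stochastic-source specifics of the paper are not reproduced, and the test problem is illustrative only.

```python
import numpy as np

def kaczmarz(A, b, n_sweeps=50, relax=0.5, x0=None, seed=None):
    """Relaxed Kaczmarz iteration: project the iterate onto one row
    equation at a time, x <- x + relax * (b_i - a_i.x) / ||a_i||^2 * a_i.
    Under-relaxation and early stopping act as simple regularization."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.copy()
    row_norms = np.sum(A**2, axis=1)
    for _ in range(n_sweeps):
        for i in rng.permutation(m):
            if row_norms[i] > 0:
                x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Small ill-conditioned test problem (illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 30)) @ np.diag(np.logspace(0, -6, 30))
x_true = rng.standard_normal(30)
b = A @ x_true + 1e-6 * rng.standard_normal(30)
x_rec = kaczmarz(A, b, n_sweeps=200)
```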
Constraining the global bromomethane budget from carbon stable isotopes
NASA Astrophysics Data System (ADS)
Bahlmann, Enno; Wittmer, Julian; Greule, Markus; Zetzsch, Cornelius; Seifert, Richard; Keppler, Frank
2016-04-01
Despite intense research in the last two decades, the global bromomethane (CH3Br) budget remains unbalanced, with the known sinks exceeding the known sources by about 25%. The reaction with OH is the largest sink for CH3Br. We have determined the kinetic isotope effects for the reactions of CH3Br with the OH and Cl radicals in order to better constrain the global CH3Br budget from an isotopic perspective. The isotope fractionation experiments were performed at 20±1°C in a 3500 L Teflon smog chamber with initial CH3Br mixing ratios of about 2 and 10 ppm and perfluorohexane (25 ppb) as internal standard. Atomic chlorine (Cl) was generated via photolysis of molecular chlorine (Cl2) using a solar simulator with an actinic flux comparable to that of the sun in mid-summer in Germany. OH radicals were generated via the photolysis of ozone (O3) at 253.7 nm in the presence of water vapor (RH = 70%). The mixing ratios of CH3Br and perfluorohexane were monitored by GC-MS with a time resolution of 15 minutes throughout the experiments. From each experiment, 10 to 15 subsamples were taken at regular time intervals for subsequent carbon isotope ratio determinations by GC-IRMS, performed at two independent laboratories in parallel. We found a kinetic isotope effect (KIE) of 17.6±3.3‰ for the reaction of CH3Br with OH and a KIE of 9.8±1.4‰ for the reaction with Cl. We used these fractionation factors, together with new data on the isotopic composition of CH3Br in the troposphere (-34±7‰) and the surface ocean (-26±7‰) and with reported source signatures, to constrain the unknown source from an isotopic perspective. The largest uncertainty in estimating the isotopic composition of the unknown source arises from the soil sink. Microbial degradation in soils is the second largest sink and is assigned a large fractionation factor of about 50‰. However, field experiments revealed substantially smaller apparent fractionation factors, ranging from 11 to 22‰. In addition, simple model studies suggest that the soil uptake of CH3Br, and hence its isotopic effect, is largely controlled by diffusion, resulting in an even smaller apparent isotopic fractionation. As a consequence, the estimated source signature for the unknown source is discussed with respect to the assumptions made for the soil sink.
26 CFR 1.737-1 - Recognition of precontribution gain.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Property A1 and Property A2 is long-term, U.S.-source capital gain or loss. The character of gain on Property A3 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real... long-term, U.S.-source capital gain ($10,000 gain on Property A1 and $8,000 loss on Property A2) and $1...
Source term model evaluations for the low-level waste facility performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yim, M.S.; Su, S.I.
1995-12-31
The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.
Schleimer, Anna; Araujo, Gonzalo; Penketh, Luke; Heath, Anna; McCoy, Emer; Labaja, Jessica; Lucey, Anna; Ponzo, Alessandro
2015-01-01
While shark-based tourism is a rapidly growing global industry, there is ongoing controversy about the effects of provisioning on the target species. This study investigated the effect of feeding on whale sharks (Rhincodon typus) at a provisioning site in Oslob, Cebu, in terms of arrival time, avoidance and feeding behaviour using photo-identification and focal follows. Additionally, compliance to the code of conduct in place was monitored to assess tourism pressure on the whale sharks. Newly identified sharks gradually arrived earlier to the provisioning site after their initial sighting, indicating that the animals learn to associate the site with food rewards. Whale sharks with a long resighting history showed anticipatory behaviour and were recorded at the site on average 5 min after the arrival of feeder boats. Results from a generalised linear mixed model indicated that animals with a longer resighting history were less likely to show avoidance behaviour to touches or boat contact. Similarly, sequential data on feeding behaviour was modelled using a generalised estimating equations approach, which suggested that experienced whale sharks were more likely to display vertical feeding behaviour. It was proposed that the continuous source of food provides a strong incentive for the modification of behaviours, i.e., learning, through conditioning. Whale sharks are large opportunistic filter feeders in a mainly oligotrophic environment, where the ability to use novel food sources by modifying their behaviour could be of great advantage. Non-compliance to the code of conduct in terms of minimum distance to the shark (2 m) increased from 79% in 2012 to 97% in 2014, suggesting a high tourism pressure on the whale sharks in Oslob. The long-term effects of the observed behavioural modifications along with the high tourism pressure remain unknown. However, management plans are traditionally based on the precautionary principle, which aims to take preventive actions even if data on cause and effect are still inconclusive. Hence, an improved enforcement of the code of conduct coupled with a reduction in the conditioning of the whale sharks through provisioning were proposed to minimise the impacts on whale sharks in Oslob. PMID:26644984
Broadband Phase Retrieval for Image-Based Wavefront Sensing
NASA Technical Reports Server (NTRS)
Dean, Bruce H.
2007-01-01
A focus-diverse phase-retrieval algorithm has been shown to perform adequately for the purpose of image-based wavefront sensing when (1) broadband light (typically spanning the visible spectrum) is used in forming the images by use of an optical system under test and (2) the assumption of monochromaticity is applied to the broadband image data. Heretofore, it had been assumed that in order to obtain adequate performance, it is necessary to use narrowband or monochromatic light. Some background information, including definitions of terms and a brief description of pertinent aspects of image-based phase retrieval, is prerequisite to a meaningful summary of the present development. Phase retrieval is a general term used in optics to denote estimation of optical imperfections or aberrations of an optical system under test. The term image-based wavefront sensing refers to a general class of algorithms that recover optical phase information, and phase-retrieval algorithms constitute a subset of this class. In phase retrieval, one utilizes the measured response of the optical system under test to produce a phase estimate. The optical response of the system is defined as the image of a point-source object, which could be a star or a laboratory point source. The phase-retrieval problem is characterized as image-based in the sense that a charge-coupled-device camera, preferably of scientific imaging quality, is used to collect image data where the optical system would normally form an image. In a variant of phase retrieval, denoted phase-diverse phase retrieval [which can include focus-diverse phase retrieval (in which various defocus planes are used)], an additional known aberration (or an equivalent diversity function) is superimposed as an aid in estimating unknown aberrations by use of an image-based wavefront-sensing algorithm. Image-based phase retrieval differs from other wavefront-sensing methods, such as interferometry, shearing interferometry, curvature wavefront sensing, and Shack-Hartmann sensing, all of which entail disadvantages in comparison with image-based methods. The main disadvantages of these non-image-based methods are the complexity of test equipment and the need for a wavefront reference.
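The forward model behind focus-diverse phase retrieval can be sketched as below: the point-source image is the squared modulus of the Fourier transform of the pupil function, each diversity setting adds a known defocus phase, and retrieval minimizes the mismatch between simulated and measured images over the unknown aberration. The pupil sampling, defocus coefficients, and aberration shape here are arbitrary illustrative choices, not the algorithm described above.

```python
import numpy as np

def psf(aberration, defocus, pupil):
    """Point-spread function for an unknown aberration (radians), a known
    defocus diversity phase, and a binary pupil mask."""
    field = pupil * np.exp(1j * (aberration + defocus))
    return np.abs(np.fft.fftshift(np.fft.fft2(field)))**2

def data_misfit(aberration, defocus_list, measured, pupil):
    """Sum-of-squares mismatch over all diversity planes; a phase-retrieval
    algorithm minimizes this over the unknown aberration."""
    return sum(np.sum((psf(aberration, d, pupil) - m)**2)
               for d, m in zip(defocus_list, measured))

# Illustrative setup: circular pupil, quadratic defocus diversity phases.
n = 64
y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
r2 = x**2 + y**2
pupil = (r2 <= 1.0).astype(float)
defocus_list = [c * r2 * pupil for c in (-2.0, 0.0, 2.0)]   # known diversities
true_aberration = 0.3 * (x**3 - y) * pupil                  # made-up phase
measured = [psf(true_aberration, d, pupil) for d in defocus_list]
print(data_misfit(np.zeros((n, n)), defocus_list, measured, pupil) > 0)
```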
Erythropoietin Levels in Elderly Patients with Anemia of Unknown Etiology
Sriram, Swetha; Martin, Alison; Xenocostas, Anargyros; Lazo-Langner, Alejandro
2016-01-01
Background In many elderly patients with anemia, a specific cause cannot be identified. This study investigates whether erythropoietin levels are inappropriately low in these cases of “anemia of unknown etiology” and whether this trend persists after accounting for confounders. Methods This study includes all anemic patients over 60 years old who had erythropoietin measured between 2005 and 2013 at a single center. Three independent reviewers used defined criteria to assign each patient’s anemia to one of ten etiologies: chronic kidney disease, iron deficiency, chronic disease, confirmed myelodysplastic syndrome (MDS), suspected MDS, vitamin B12 deficiency, folate deficiency, anemia of unknown etiology, other etiology, or multifactorial etiology. Iron deficiency anemia served as the comparison group in all analyses. We used linear regression to model the relationship between erythropoietin and the presence of each etiology, sequentially adding terms to the model to account for the hemoglobin concentration, estimated glomerular filtration rate (eGFR) and Charlson Comorbidity Index. Results A total of 570 patients met the inclusion criteria. Linear regression analysis showed that erythropoietin levels in chronic kidney disease, anemia of chronic disease and anemia of unknown etiology were lower by 48%, 46% and 27%, respectively, compared to iron deficiency anemia even after adjusting for hemoglobin, eGFR and comorbidities. Conclusions We have shown that erythropoietin levels are inappropriately low in anemia of unknown etiology, even after adjusting for confounders. This suggests that decreased erythropoietin production may play a key role in the pathogenesis of anemia of unknown etiology. PMID:27310832
Frie, Jakob; Padilla, Nelly; Ådén, Ulrika; Lagercrantz, Hugo; Bartocci, Marco
2016-05-01
To compare cortical hemodynamic responses to known and unknown facial stimuli between infants born extremely preterm and term-born infants, and to correlate the responses of the extremely preterm-born infants to regional cortical volumes at term-equivalent age. We compared 27 infants born extremely preterm (<28 gestational weeks) with 26 term-born infants. Corrected age (for the preterm group) and chronological age (for the term-born group) at testing were between 6 and 10 months. Both groups were exposed to a gray background, their mother's face, and an unknown face. Cerebral regional concentrations of oxygenated and deoxygenated hemoglobin were measured with near-infrared spectroscopy. In the preterm group, we also performed structural brain magnetic resonance imaging and correlated regional cortical volumes to hemodynamic responses. The preterm-born infants demonstrated different cortical face recognition processes than the term-born infants. They had a significantly smaller hemodynamic response in the right frontotemporal areas while watching their mother's face (0.13 μmol/L vs 0.63 μmol/L; P < .001). We also found a negative correlation between the magnitude of the oxygenated hemoglobin increase in the right frontotemporal cortex and regional gray matter volume in the left fusiform gyrus and amygdala (voxels, 25; r = 0.86; P < .005). At 6-10 months corrected age, the preterm-born infants demonstrated a different pattern in the maturation of their cortical face recognition process compared with term-born infants. Copyright © 2016 Elsevier Inc. All rights reserved.
An Efficient Solution Method for Multibody Systems with Loops Using Multiple Processors
NASA Technical Reports Server (NTRS)
Ghosh, Tushar K.; Nguyen, Luong A.; Quiocho, Leslie J.
2015-01-01
This paper describes a multibody dynamics algorithm formulated for parallel implementation on multiprocessor computing platforms using the divide-and-conquer approach. The system of interest is a general topology of rigid and elastic articulated bodies with or without loops. The algorithm divides the multibody system into a number of smaller sets of bodies in chain or tree structures, called "branches" at convenient joints called "connection points", and uses an Order-N (O (N)) approach to formulate the dynamics of each branch in terms of the unknown spatial connection forces. The equations of motion for the branches, leaving the connection forces as unknowns, are implemented in separate processors in parallel for computational efficiency, and the equations for all the unknown connection forces are synthesized and solved in one or several processors. The performances of two implementations of this divide-and-conquer algorithm in multiple processors are compared with an existing method implemented on a single processor.
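As a toy illustration of the divide-and-conquer idea (two single-body "branches" joined at one connection point, not the general O(N) articulated-body formulation), each branch independently reduces its dynamics to an affine relation between its connection-point acceleration and the unknown connection force; equating the two accelerations then yields that force, after which each branch can be integrated on its own.

```python
import numpy as np

def branch_relation(mass, f_ext):
    """Each branch reduces its dynamics to a = c + G * F, where F is the
    unknown force the joint applies to this branch. For a single rigid
    body: a = (f_ext + F) / m, so c = f_ext / m and G = 1 / m.
    These reductions are independent and could run on separate processors."""
    return f_ext / mass, 1.0 / mass

# Branch 1 feels +F at the joint, branch 2 feels the reaction -F.
c1, G1 = branch_relation(mass=2.0, f_ext=np.array([4.0, 0.0]))
c2, G2 = branch_relation(mass=3.0, f_ext=np.array([0.0, 6.0]))

# Synthesis step: the joint is rigid, so the connection-point accelerations
# match:  c1 + G1*F = c2 - G2*F  =>  F = (c2 - c1) / (G1 + G2).
F = (c2 - c1) / (G1 + G2)
a = c1 + G1 * F
print(F, a)
```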
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Sen; Zhang, Wei; Lian, Jianming
This two-part paper considers the coordination of a population of Thermostatically Controlled Loads (TCLs) with unknown parameters to achieve group objectives. The problem involves designing the bidding and market clearing strategy to motivate self-interested users to realize efficient energy allocation subject to a peak power constraint. The companion paper (Part I) formulates the problem and proposes a load coordination framework using the mechanism design approach. To address the unknown parameters, Part II of this paper presents a joint state and parameter estimation framework based on the expectation maximization algorithm. The overall framework is then validated using real-world weather data and price data, and is compared with other approaches in terms of aggregated power response. Simulation results indicate that our coordination framework can effectively improve the efficiency of the power grid operations and reduce power congestion at key times.
Bounemeur, Abdelhamid; Chemachema, Mohamed; Essounbouli, Najib
2018-05-10
In this paper, an active fuzzy fault tolerant tracking control (AFFTTC) scheme is developed for a class of multi-input multi-output (MIMO) unknown nonlinear systems in the presence of unknown actuator faults, sensor failures and external disturbance. The developed control scheme deals with four kinds of faults for both sensors and actuators. The bias, drift, and loss of accuracy additive faults are considered along with the loss of effectiveness multiplicative fault. A fuzzy adaptive controller based on back-stepping design is developed to deal with actuator failures and unknown system dynamics. However, an additional robust control term is added to deal with sensor faults, approximation errors, and external disturbances. Lyapunov theory is used to prove the stability of the closed loop system. Numerical simulations on a quadrotor are presented to show the effectiveness of the proposed approach. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
Probabilistic and deterministic aspects of linear estimation in geodesy. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Dermanis, A.
1976-01-01
Recent advances in observational techniques related to geodetic work (VLBI, laser ranging) make it imperative that more consideration be given to modeling problems. Uncertainties in the effects of atmospheric refraction and of polar motion and precession-nutation parameters cannot be dispensed with in the context of centimeter-level geodesy. Even physical processes that have previously been altogether neglected (station motions) must now be taken into consideration. The problem of modeling functions of time or space, or at least their values at observation points (epochs), is explored. When the nature of the function to be modeled is unknown, the need to include a limited number of terms and to decide a priori upon a specific form may result in a representation which fails to sufficiently approximate the unknown function. An alternative approach of increasing application is the modeling of unknown functions as stochastic processes.
Li, Yongming; Ma, Zhiyao; Tong, Shaocheng
2017-09-01
The problem of adaptive fuzzy output-constrained tracking fault-tolerant control (FTC) is investigated for the large-scale stochastic nonlinear systems of pure-feedback form. The nonlinear systems considered in this paper possess the unstructured uncertainties, unknown interconnected terms and unknown nonaffine nonlinear faults. The fuzzy logic systems are employed to identify the unknown lumped nonlinear functions so that the problems of structured uncertainties can be solved. An adaptive fuzzy state observer is designed to solve the nonmeasurable state problem. By combining the barrier Lyapunov function theory, adaptive decentralized and stochastic control principles, a novel fuzzy adaptive output-constrained FTC approach is constructed. All the signals in the closed-loop system are proved to be bounded in probability and the system outputs are constrained in a given compact set. Finally, the applicability of the proposed controller is well carried out by a simulation example.
Observation-based source terms in the third-generation wave model WAVEWATCH
NASA Astrophysics Data System (ADS)
Zieger, Stefan; Babanin, Alexander V.; Erick Rogers, W.; Young, Ian R.
2015-12-01
Measurements collected during the AUSWEX field campaign at Lake George (Australia) resulted in new insights into the processes of wind-wave interaction and whitecapping dissipation, and consequently in new parameterizations of the input and dissipation source terms. The new nonlinear wind input term accounts for the dependence of growth on wave steepness, for airflow separation, and for negative growth rates under adverse winds. The new dissipation terms feature the inherent breaking term, a cumulative dissipation term, and a term due to production of turbulence by waves, which is particularly relevant for decaying seas and for swell. The latter is consistent with the observed decay rate of ocean swell. This paper describes these source terms as implemented in WAVEWATCH III® and evaluates their performance against existing source terms in academic duration-limited tests, against buoy measurements for windsea-dominated conditions, under conditions of extreme wind forcing (Hurricane Katrina), and against altimeter data in global hindcasts. Results show agreement by means of growth curves as well as integral and spectral parameters in the simulations and hindcasts.
Backstepping Design of Adaptive Neural Fault-Tolerant Control for MIMO Nonlinear Systems.
Gao, Hui; Song, Yongduan; Wen, Changyun
In this paper, an adaptive controller is developed for a class of multi-input and multioutput nonlinear systems with neural networks (NNs) used as a modeling tool. It is shown that all the signals in the closed-loop system with the proposed adaptive neural controller are globally uniformly bounded for any external input in . In our control design, the upper bound of the NN modeling error and the gains of external disturbance are characterized by unknown upper bounds, which is more rational to establish the stability in the adaptive NN control. Filter-based modification terms are used in the update laws of unknown parameters to improve the transient performance. Finally, fault-tolerant control is developed to accommodate actuator failure. An illustrative example applying the adaptive controller to control a rigid robot arm shows the validation of the proposed controller.
Wong, Winsy; Low, Sam-Po
2008-07-01
The present study investigated verbal recall of semantically preserved and degraded words and nonwords by taking into consideration the status of one's semantic short-term memory (STM). Two experiments were conducted on 2 Chinese individuals with aphasia. The first experiment showed that they had largely preserved phonological processing abilities accompanied by mild but comparable semantic processing deficits; however, their performance on STM tasks revealed a double dissociation. The second experiment found that the participant with more preserved semantic STM had better recall of known words and nonwords than of their unknown counterparts, whereas such effects were absent in the patient with severe semantic STM deficit. The results are compatible with models that assume separate phonological and semantic STM components, such as that of R. C. Martin, M. Lesch, and M. Bartha (1999). In addition, the distribution of error types was different from previous studies. This is discussed in terms of the methodology of the authors' experiments and current views regarding the nature of semantic STM and representations in the Chinese mental lexicon. (c) 2008 APA
NASA Astrophysics Data System (ADS)
Shoukat, Sobia; Naqvi, Qaisar A.
2016-12-01
In this manuscript, scattering from a perfectly electrically conducting strip located at the planar interface of a topological insulator (TI) and a chiral medium is investigated using the Kobayashi potential method. Longitudinal components of the electric and magnetic vector potentials, expressed in terms of unknown weighting functions, are considered. Use of the related set of boundary conditions yields two algebraic equations and four dual integral equations (DIEs). The integrands of two DIEs are expanded in terms of characteristic functions with expansion coefficients which must simultaneously satisfy the discontinuous properties of the Weber-Schafheitlin integrals and the required edge and boundary conditions. The resulting expressions are then combined with the algebraic equations to express the weighting functions in terms of the expansion coefficients; these expansion coefficients are then substituted into the remaining DIEs. The projection is applied using Jacobi polynomials. This treatment yields a matrix equation for the expansion coefficients which is solved numerically. The unknown expansion coefficients are used to find the scattered field. The far-zone scattering width is investigated with respect to different parameters of the geometry, i.e., the chirality of the chiral medium, the angle of incidence, and the size of the strip. Significant effects of different parameters, including the TI parameter, on the scattering width are noted.
Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code
NASA Technical Reports Server (NTRS)
Waithe, Kenrick A.
2005-01-01
A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparing with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared to the two-dimensional plate using a steady mass flow boundary condition, which was used to simulate a steady micro jet. The model was also compared to two three-dimensional flat plate cases using a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
Gunia, Sven; Koch, Stefan; May, Matthias
2013-02-01
Penile, vulvar and anal squamous cell carcinomas (SCCs) share histomorphological overlap and are prone to lymphatic dissemination into inguinal nodes. Anal SCCs might derive from the anorectal zone (ARZ), anal transitional zone, squamous zone or from perianal skin. These anatomically distinct zones differ in terms of their embryological development. We sought to investigate the role of caudal-related homeobox 2 (CDX2), a homeobox gene implicated in the development and anterior/posterior pattern specification from duodenum to rectum including the ARZ, in terms of narrowing the possible sites of origin to be considered in the setting of SCC with unknown primary presenting with histologically confirmed inguinal lymph node metastasis. By immunohistochemistry (IHC) employing a panel of antibodies directed against CK5/6, CK7, CK20, p63, p16, CEA and CDX2, we compared 89 penile, 11 vulvar and eight anal SCCs with respect to their staining profiles. Moreover, anal SCCs were subjected to in situ hybridisation (ISH) for high-risk human papillomavirus (HPV) subtypes. By IHC, CDX2 expression was observed in 2/8 anal SCCs (25%) while being absent from all penile and vulvar SCCs examined. High-risk HPV subtypes were detected by ISH in all anal SCCs examined, which were uniformly p16-positive by IHC. CDX2 might be valuable in terms of narrowing the possible sites of origin to be considered in the setting of SCC with unknown primary presenting with inguinal lymph node metastasis. However, despite its favourable specificity, the diagnostic benefit achieved by this observation is limited by the low sensitivity.
NASA Astrophysics Data System (ADS)
Akhtar, S. S.; Hussain, T.; Bokhari, A. H.; Khan, F.
2018-04-01
We provide a complete classification of static plane symmetric space-times according to conformal Ricci collineations (CRCs) and conformal matter collineations (CMCs) in both the degenerate and nondegenerate cases. In the case of a nondegenerate Ricci tensor, we find a general form of the vector field generating CRCs in terms of unknown functions of t and x subject to some integrability conditions. We then solve the integrability conditions in different cases depending upon the nature of the Ricci tensor and conclude that the static plane symmetric space-times have a 7-, 10- or 15-dimensional Lie algebra of CRCs. Moreover, we find that these space-times admit an infinite number of CRCs if the Ricci tensor is degenerate. We use a similar procedure to study CMCs in the case of a degenerate or nondegenerate matter tensor. We obtain the exact form of some static plane symmetric space-time metrics that admit nontrivial CRCs and CMCs. Finally, we present some physical applications of our obtained results by considering a perfect fluid as a source of the energy-momentum tensor.
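For reference, the defining conditions, in the form commonly used in this literature and consistent with the description above, are

```latex
\mathcal{L}_{X} R_{ab} = 2\,\psi(x^{c})\, R_{ab} \quad \text{(CRC)},
\qquad
\mathcal{L}_{X} T_{ab} = 2\,\psi(x^{c})\, T_{ab} \quad \text{(CMC)},
```

where X is the generating vector field and ψ is the conformal function; setting ψ = 0 recovers ordinary Ricci and matter collineations.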
Improving Predictions and Management of Hydrological Extremes through Climate Services
NASA Astrophysics Data System (ADS)
van den Hurk, Bart; Wijngaard, Janet; Pappenberger, Florian; Bouwer, Laurens; Weerts, Albrecht; Buontempo, Carlo; Doescher, Ralf; Manez, Maria; Ramos, Maria-Helena; Hananel, Cedric; Ercin, Ertug; Hunink, Johannes; Klein, Bastian; Pouget, Laurent; Ward, Philip
2016-04-01
The EU Roadmap on Climate Services can be seen as a result of convergence between the society's call for "actionable research", and the climate research community providing tailored data, information and knowledge. However, although weather and climate have clearly distinct definitions, a strong link between weather and climate services exists that is not explored extensively. Stakeholders being interviewed in the context of the Roadmap consider climate as a far distant long term feature that is difficult to consider in present-day decision taking, which is dominated by daily experience with handling extreme events. It is argued that this experience is a rich source of inspiration to increase society's resilience to an unknown future. A newly started European research project, IMPREX, is built on the notion that "experience in managing current day weather extremes is the best learning school to anticipate consequences of future climate". This paper illustrates possible ways to increase the link between information and services addressing weather and climate time scales by discussing the underlying concepts of IMPREX and its expected outcome.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Field, Scott E.; Hesthaven, Jan S.; Lau, Stephen R.
In the context of metric perturbation theory for nonspinning black holes, extreme mass ratio binary systems are described by distributionally forced master wave equations. Numerical solution of a master wave equation as an initial boundary value problem requires initial data. However, because the correct initial data for generic-orbit systems is unknown, specification of trivial initial data is a common choice, despite being inconsistent and resulting in a solution which is initially discontinuous in time. As is well known, this choice leads to a burst of junk radiation which eventually propagates off the computational domain. We observe another potential consequence of trivial initial data: development of a persistent spurious solution, here referred to as the Jost junk solution, which contaminates the physical solution for long times. This work studies the influence of both types of junk on metric perturbations, waveforms, and self-force measurements, and it demonstrates that smooth modified source terms mollify the Jost solution and reduce junk radiation. Our concluding section discusses the applicability of these observations to other numerical schemes and techniques used to solve distributionally forced master wave equations.
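For orientation, the distributionally forced master equations referred to here typically take the following schematic 1+1-dimensional form (a sketch in generic Regge-Wheeler-type notation; the particular potentials and forcing functions of this work are not given in the abstract):

```latex
% Mode (l,m) master equation with Dirac-delta and delta-prime forcing from
% a point particle at x_p(t); trivial (zero) initial data for \Psi is the
% common but inconsistent choice discussed in the abstract.
\left[-\partial_t^2 + \partial_x^2 - V_\ell(x)\right]\Psi_{\ell m}(t,x)
  = G_{\ell m}(t)\,\delta\!\big(x - x_p(t)\big)
  + F_{\ell m}(t)\,\partial_x\delta\!\big(x - x_p(t)\big).
```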
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, J. E.; Berry, J. A.; Seibt, U.
Growth in terrestrial gross primary production (GPP) may provide a feedback for climate change, but there is still strong disagreement on the extent to which biogeochemical processes may suppress this GPP growth at the ecosystem to continental scales. The consequent uncertainty in modeling of future carbon storage by the terrestrial biosphere constitutes one of the largest unknowns in global climate projections for the next century. Here we provide a global, measurement-based estimate of historical GPP growth using long-term atmospheric carbonyl sulfide (COS) records derived from ice core, firn, and ambient air samples. We interpret these records using a model that relates changes in the COS concentration to changes in its sources and sinks, the largest of which is proportional to GPP. The COS history was most consistent with simulations that assume a large historical GPP growth. Carbon-climate models that assume little to no GPP growth predicted trajectories of COS concentration over the anthropogenic era that differ from those observed. Continued COS monitoring may be useful for detecting ongoing changes in GPP, while extending the ice core record to glacial cycles could provide further opportunities to evaluate earth system models.
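To illustrate how such records constrain GPP, a one-box COS budget of the kind described can be sketched as below; this is a minimal, hypothetical example in which the source strength, uptake rate and GPP trajectory are placeholders rather than values from the study:

```python
import numpy as np

def cos_history(years, gpp_growth, s_other=1.0, k_uptake=1.0, c0=1.0):
    """Integrate a one-box COS budget dC/dt = S_other - k * GPP(t) * C,
    where the dominant sink (plant uptake) scales with GPP.
    All quantities are in arbitrary normalized units (illustrative only)."""
    dt = years[1] - years[0]
    gpp = 1.0 + gpp_growth * (years - years[0]) / (years[-1] - years[0])
    c = np.empty_like(years, dtype=float)
    c[0] = c0
    for i in range(1, len(years)):
        c[i] = c[i - 1] + (s_other - k_uptake * gpp[i - 1] * c[i - 1]) * dt
    return c

# Larger assumed GPP growth draws the simulated COS concentration down further,
# which is the qualitative signature compared against the ice-core record.
years = np.linspace(1900, 2000, 101)
for growth in (0.0, 0.15, 0.30):
    print(growth, round(float(cos_history(years, growth)[-1]), 3))
```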
Caldas, José; Gehlenborg, Nils; Kettunen, Eeva; Faisal, Ali; Rönty, Mikko; Nicholson, Andrew G; Knuutila, Sakari; Brazma, Alvis; Kaski, Samuel
2012-01-15
Genome-wide measurement of transcript levels is a ubiquitous tool in biomedical research. As experimental data continue to be deposited in public databases, it is becoming important to develop search engines that enable the retrieval of relevant studies given a query study. While retrieval systems based on meta-data already exist, data-driven approaches that retrieve studies based on similarities in the expression data itself have a greater potential of uncovering novel biological insights. We propose an information retrieval method based on differential expression. Our method deals with arbitrary experimental designs and performs competitively with alternative approaches, while making the search results interpretable in terms of differential expression patterns. We show that our model yields meaningful connections between biological conditions from different studies. Finally, we validate a previously unknown connection between malignant pleural mesothelioma and SIM2s suggested by our method, via real-time polymerase chain reaction in an independent set of mesothelioma samples. Supplementary data and source code are available from http://www.ebi.ac.uk/fg/research/rex.
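A minimal, hypothetical sketch of such data-driven retrieval (not the authors' actual model, which is described in the paper and at the URL above) might rank stored studies by the similarity of their differential expression signatures to a query study's signature:

```python
import numpy as np

def de_signature(case, control):
    """Simple per-gene differential expression signature: mean log-expression
    difference between case and control matrices of shape (genes, samples)."""
    return case.mean(axis=1) - control.mean(axis=1)

def retrieve(query_sig, database_sigs, top_k=5):
    """Rank database studies by cosine similarity of their DE signatures
    to the query signature; return the top_k (name, score) pairs."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    scores = [(name, cosine(query_sig, sig)) for name, sig in database_sigs.items()]
    return sorted(scores, key=lambda t: t[1], reverse=True)[:top_k]

rng = np.random.default_rng(0)
query = de_signature(rng.normal(1.0, 1.0, (500, 6)), rng.normal(0.0, 1.0, (500, 6)))
database = {f"study_{i}": rng.normal(0.0, 1.0, 500) for i in range(10)}
print(retrieve(query, database, top_k=3))
```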
Laser-mediated Photodynamic Therapy: An Alternative Treatment for Actinic Keratosis?
Kessels, Janneke P H M; Nelemans, Patty J; Mosterd, Klara; Kelleners-Smeets, Nicole W J; Krekels, Gertruud A M; Ostertag, Judith U
2016-03-01
Photodynamic therapy (PDT) with light-emitting diode (LED) illumination is a frequently used treatment modality for actinic keratosis (AK) with excellent cosmetic outcome. A major disadvantage, however, is the high pain score. Pulsed dye laser (PDL) illumination has been suggested as an alternative, but the long-term efficacy of this treatment is unknown. In this split-face study we prospectively treated 61 patients with AK with both LED-PDT and PDL-PDT. The mean change in the number of lesions between the end of follow-up and the start of therapy was -4.25 (95% confidence interval (95% CI) -5.07; -3.43) for LED-PDT and -3.88 (95% CI -4.76; -2.99) for PDL-PDT, a non-significant difference (p = 0.258) of -0.46 (95% CI -1.28; 0.35). The percentage decrease from baseline in the total number of AK was 55.8% and 47.8%, respectively, at 12-month follow-up. The visual analogue scale pain score was lower after PDL (mean 2.64) than after LED illumination (mean 6.47). These findings indicate that PDL-PDT is an effective alternative illumination source for PDT of AK that is considerably less painful.
Ensemble positive unlabeled learning for disease gene identification.
Yang, Peng; Li, Xiaoli; Chua, Hon-Nian; Kwoh, Chee-Keong; Ng, See-Kiong
2014-01-01
An increasing number of genes have been experimentally confirmed in recent years as causative genes to various human diseases. The newly available knowledge can be exploited by machine learning methods to discover additional unknown genes that are likely to be associated with diseases. In particular, positive unlabeled learning (PU learning) methods, which require only a positive training set P (confirmed disease genes) and an unlabeled set U (the unknown candidate genes) instead of a negative training set N, have been shown to be effective in uncovering new disease genes in the current scenario. Using only a single source of data for prediction can be susceptible to bias due to incompleteness and noise in the genomic data and a single machine learning predictor prone to bias caused by inherent limitations of individual methods. In this paper, we propose an effective PU learning framework that integrates multiple biological data sources and an ensemble of powerful machine learning classifiers for disease gene identification. Our proposed method integrates data from multiple biological sources for training PU learning classifiers. A novel ensemble-based PU learning method EPU is then used to integrate multiple PU learning classifiers to achieve accurate and robust disease gene predictions. Our evaluation experiments across six disease groups showed that EPU achieved significantly better results compared with various state-of-the-art prediction methods as well as ensemble learning classifiers. Through integrating multiple biological data sources for training and the outputs of an ensemble of PU learning classifiers for prediction, we are able to minimize the potential bias and errors in individual data sources and machine learning algorithms to achieve more accurate and robust disease gene predictions. In the future, our EPU method provides an effective framework to integrate the additional biological and computational resources for better disease gene predictions.
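As an illustration of the general idea (a bagging-style PU learning sketch under simplified assumptions, not the EPU algorithm itself), one can repeatedly treat random subsets of the unlabeled genes as provisional negatives and average the resulting classifiers:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def pu_bagging_scores(X_pos, X_unl, n_rounds=50, seed=0):
    """Score unlabeled examples by averaging classifiers trained on
    (positives vs. random unlabeled subsets treated as negatives).
    Higher scores indicate likely hidden positives (candidate disease genes)."""
    rng = np.random.default_rng(seed)
    n_pos, n_unl = len(X_pos), len(X_unl)
    scores, counts = np.zeros(n_unl), np.zeros(n_unl)
    for _ in range(n_rounds):
        idx = rng.choice(n_unl, size=n_pos, replace=False)   # provisional negatives
        X = np.vstack([X_pos, X_unl[idx]])
        y = np.r_[np.ones(n_pos), np.zeros(n_pos)]
        clf = LogisticRegression(max_iter=1000).fit(X, y)
        out = np.setdiff1d(np.arange(n_unl), idx)             # out-of-bag unlabeled genes
        scores[out] += clf.predict_proba(X_unl[out])[:, 1]
        counts[out] += 1
    return scores / np.maximum(counts, 1)

rng = np.random.default_rng(1)
X_pos = rng.normal(1.0, 1.0, (30, 8))    # hypothetical features of confirmed disease genes
X_unl = rng.normal(0.0, 1.0, (200, 8))   # hypothetical features of candidate genes
print(pu_bagging_scores(X_pos, X_unl)[:5].round(3))
```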
Review of magnetic and electric field effects near active faults and volcanoes in the U.S.A.
Johnston, M.J.S.
1989-01-01
Synchronized measurements of the geomagnetic field have been recorded along 800 km of the San Andreas fault and in the Long Valley caldera since 1974, and during eruptions on Mount St. Helens since 1980. For shorter periods of time, continuous geoelectric field measurements have been made on Mount St. Helens and near the San Andreas fault where moderate seismicity and fault slip frequently occur. Significant tectonic and volcanic events for which nearby magnetic and electric field data have been obtained include: (1) two moderate earthquakes (ML > 5.8) for which magnetometers were close enough to expect observable signals (about three source lengths), (2) one moderate earthquake (MS 7.3) for which magnetometers were installed as massive fluid outflow occurred during the post-seismic phase, (3) numerous fault creep events and moderate seismicity, (4) a major explosive volcanic eruption and numerous minor extrusive eruptions, and (5) an episode of aseismic uplift. For one of the two earthquakes with ML > 5.8, seismomagnetic effects of -1.3 and -0.3 nT were observed. For this event, magnetometers were optimally located near the epicenter and the observations obtained are consistent with simple seismomagnetic models of the event. Similar models for the other event indicate that the expected seismomagnetic effects are below the signal resolution of the nearest magnetometer. Precursive tectonomagnetic effects were recorded on two independent instruments at distances of 30 and 50 km from a ML 5.2 earthquake. Longer-term changes were recorded in one region in southern California where a moderate ML 5.9 earthquake has since occurred. Surface observations of fault creep events have no associated magnetic or electrical signature above the present measurement precision (0.25 nT and 0.01%, respectively) and are consistent with near-surface fault failure models of these events. Longer-term creep is sometimes associated with corresponding longer-term magnetic field perturbations. Correlated changes in gravity, magnetic field, areal strain, and uplift occurred during episodes of aseismic deformation in southern California, primarily between 1979 and 1983. Because the relationships between these parameters agree with those calculated from simple deformation and tectonomagnetic models, the preferred explanation appeals to short-term strain episodes independently detected in each data set. An unknown source of meteorologically generated noise in the strain, gravity, and uplift data and an unknown, but correlated, disturbance in the absolute magnetic data might also explain the data. No clear observations of seismoelectric or tectonoelectric effects have yet been reported. The eruption of Mount St. Helens generated large oscillatory fields and a 9 ± 2 nT offset on the only surviving magnetometer. A large-scale traveling magnetic disturbance passed through the San Andreas array from 1 to 2 h after the eruption. Subsequent extrusive eruptions generated small precursory magnetic changes in some cases. These data are consistent with a simple volcanomagnetic model, magneto-gas dynamic effects, and a blast-excited traveling ionospheric disturbance. Traveling ionospheric disturbances (TIDs), also generated by earthquake-related atmospheric pressure waves, may explain many electromagnetic disturbances apparently associated with earthquakes. Local near-fault magnetic field transients rarely exceed a few nT at periods of a few minutes and longer.
Assessment of Ethanol Trends on the ISS
NASA Technical Reports Server (NTRS)
Perry, Jay; Carter, Layne; Kayatin, Matthew; Gazda, Daniel; McCoy, Torin; Limero, Thomas
2016-01-01
The International Space Station (ISS) Environmental Control and Life Support System (ECLSS) provides a working environment for six crewmembers through atmosphere revitalization and water recovery systems. In the last year, elevated ethanol levels have presented a unique challenge for the ISS ECLSS. Ethanol is monitored on the ISS by the Air Quality Monitor (AQM). The source of this increase is currently unknown. This paper documents the credible sources for the increased ethanol concentration, the monitoring provided by the AQM, and the impact on the atmosphere revitalization and water recovery systems.
Green's function of radial inhomogeneous spheres excited by internal sources.
Zouros, Grigorios P; Kokkorakis, Gerassimos C
2011-01-01
The Green's function in the interior of penetrable bodies with inhomogeneous compressibility, excited by sources placed inside them, is evaluated through a Schwinger-Lippmann volume integral equation. In the case of a radially inhomogeneous sphere, the radial part of the unknown Green's function can be expanded in a double Dini's series, which allows analytical evaluation of the cumbersome integrals involved. The simple case treated here can be extended to more difficult situations involving inhomogeneous density, as well as to the corresponding electromagnetic or elastic problems. Finally, numerical results are given for various inhomogeneous compressibility distributions.
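For orientation, the Schwinger-Lippmann volume integral equation mentioned above can be written schematically in a generic acoustic form with a compressibility contrast (the authors' exact kernel and normalization are not given in the abstract):

```latex
% Volume integral equation for the Green's function G inside a penetrable
% body with inhomogeneous compressibility \kappa(r); G_0 is the background
% (free-space) Green's function and k the background wavenumber.
G(\mathbf{r},\mathbf{r}') = G_0(\mathbf{r},\mathbf{r}')
  + k^2 \int_{V} G_0(\mathbf{r},\mathbf{r}'')\,
      \gamma_\kappa(\mathbf{r}'')\, G(\mathbf{r}'',\mathbf{r}')\, \mathrm{d}V'',
\qquad
\gamma_\kappa(\mathbf{r}) = \frac{\kappa(\mathbf{r}) - \kappa_0}{\kappa_0}.
```

For a radially inhomogeneous sphere, the radial dependence of this unknown G is the quantity expanded in the double Dini series mentioned above.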
4. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
4. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. MAIN (TITLED FLORENCE) CANAL, CHINA WASH FLUME, 5/13/25 - San Carlos Irrigation Project, China Wash Flume, Main (Florence-Casa Grande) Canal at Station 137+00, T4S, R10E, S14, Coolidge, Pinal County, AZ
18. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
18. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. SACATON DAM AND BRIDGE, SOUTH END SLUICEWAY, INTAKE CANAL BRIDGE, OPERATING HOUSE, AND MAIN BRIDGE, 6/18/25 - San Carlos Irrigation Project, Sacaton Dam & Bridge, Gila River, T4S R6E S12/13, Coolidge, Pinal County, AZ
5. Photographic copy of photograph. (Source: U.S. Department of Interior. ...
5. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. MAIN (TITLED FLORENCE) CANAL, CHINA WASH FLUME, 5/13/25 - San Carlos Irrigation Project, China Wash Flume, Main (Florence-Casa Grande) Canal at Station 137+00, T4S, R10E, S14, Coolidge, Pinal County, AZ
Localisation of an Unknown Number of Land Mines Using a Network of Vapour Detectors
Chhadé, Hiba Haj; Abdallah, Fahed; Mougharbel, Imad; Gning, Amadou; Julier, Simon; Mihaylova, Lyudmila
2014-01-01
We consider the problem of localising an unknown number of land mines using concentration information provided by a wireless sensor network. A number of vapour sensors/detectors, deployed in the region of interest, are able to detect the concentration of the explosive vapours, emanating from buried land mines. The collected data is communicated to a fusion centre. Using a model for the transport of the explosive chemicals in the air, we determine the unknown number of sources using a Principal Component Analysis (PCA)-based technique. We also formulate the inverse problem of determining the positions and emission rates of the land mines using concentration measurements provided by the wireless sensor network. We present a solution for this problem based on a probabilistic Bayesian technique using a Markov chain Monte Carlo sampling scheme, and we compare it to the least squares optimisation approach. Experiments conducted on simulated data show the effectiveness of the proposed approach. PMID:25384008
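A minimal sketch of the first step, estimating the number of vapour sources from the network data by PCA, is given below; the dispersion model, thresholds and the Bayesian MCMC localization stage of the paper are not reproduced here, and all numbers are synthetic:

```python
import numpy as np

def estimate_num_sources(measurements, ratio=10.0):
    """PCA-style estimate of the number of independent sources: count the
    eigenvalues of the sensor covariance that stand clearly above the noise
    floor (approximated here by the median eigenvalue)."""
    centered = measurements - measurements.mean(axis=0)
    eigvals = np.sort(np.linalg.eigvalsh(np.cov(centered, rowvar=False)))[::-1]
    noise_floor = np.median(eigvals)
    return int(np.sum(eigvals > ratio * noise_floor))

rng = np.random.default_rng(0)
n_snapshots, n_sensors, n_true = 200, 20, 3
gains = rng.uniform(0.1, 1.0, (n_true, n_sensors))        # hypothetical source-to-sensor gains
emissions = rng.gamma(2.0, 1.0, (n_snapshots, n_true))    # hypothetical emission time series
data = emissions @ gains + 0.01 * rng.normal(size=(n_snapshots, n_sensors))
print(estimate_num_sources(data))   # recovers 3 for this low-noise synthetic example
```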
Extended gamma-ray sources around pulsars constrain the origin of the positron flux at Earth
NASA Astrophysics Data System (ADS)
Abeysekara, A. U.; Albert, A.; Alfaro, R.; Alvarez, C.; Álvarez, J. D.; Arceo, R.; Arteaga-Velázquez, J. C.; Avila Rojas, D.; Ayala Solares, H. A.; Barber, A. S.; Bautista-Elivar, N.; Becerril, A.; Belmont-Moreno, E.; BenZvi, S. Y.; Berley, D.; Bernal, A.; Braun, J.; Brisbois, C.; Caballero-Mora, K. S.; Capistrán, T.; Carramiñana, A.; Casanova, S.; Castillo, M.; Cotti, U.; Cotzomi, J.; Coutiño de León, S.; De León, C.; De la Fuente, E.; Dingus, B. L.; DuVernois, M. A.; Díaz-Vélez, J. C.; Ellsworth, R. W.; Engel, K.; Enríquez-Rivera, O.; Fiorino, D. W.; Fraija, N.; García-González, J. A.; Garfias, F.; Gerhardt, M.; González Muñoz, A.; González, M. M.; Goodman, J. A.; Hampel-Arias, Z.; Harding, J. P.; Hernández, S.; Hernández-Almada, A.; Hinton, J.; Hona, B.; Hui, C. M.; Hüntemeyer, P.; Iriarte, A.; Jardin-Blicq, A.; Joshi, V.; Kaufmann, S.; Kieda, D.; Lara, A.; Lauer, R. J.; Lee, W. H.; Lennarz, D.; Vargas, H. León; Linnemann, J. T.; Longinotti, A. L.; Luis Raya, G.; Luna-García, R.; López-Coto, R.; Malone, K.; Marinelli, S. S.; Martinez, O.; Martinez-Castellanos, I.; Martínez-Castro, J.; Martínez-Huerta, H.; Matthews, J. A.; Miranda-Romagnoli, P.; Moreno, E.; Mostafá, M.; Nellen, L.; Newbold, M.; Nisa, M. U.; Noriega-Papaqui, R.; Pelayo, R.; Pretz, J.; Pérez-Pérez, E. G.; Ren, Z.; Rho, C. D.; Rivière, C.; Rosa-González, D.; Rosenberg, M.; Ruiz-Velasco, E.; Salazar, H.; Salesa Greus, F.; Sandoval, A.; Schneider, M.; Schoorlemmer, H.; Sinnis, G.; Smith, A. J.; Springer, R. W.; Surajbali, P.; Taboada, I.; Tibolla, O.; Tollefson, K.; Torres, I.; Ukwatta, T. N.; Vianello, G.; Weisgarber, T.; Westerhoff, S.; Wisher, I. G.; Wood, J.; Yapici, T.; Yodh, G.; Younk, P. W.; Zepeda, A.; Zhou, H.; Guo, F.; Hahn, J.; Li, H.; Zhang, H.
2017-11-01
The unexpectedly high flux of cosmic-ray positrons detected at Earth may originate from nearby astrophysical sources, dark matter, or unknown processes of cosmic-ray secondary production. We report the detection, using the High-Altitude Water Cherenkov Observatory (HAWC), of extended tera–electron volt gamma-ray emission coincident with the locations of two nearby middle-aged pulsars (Geminga and PSR B0656+14). The HAWC observations demonstrate that these pulsars are indeed local sources of accelerated leptons, but the measured tera–electron volt emission profile constrains the diffusion of particles away from these sources to be much slower than previously assumed. We demonstrate that the leptons emitted by these objects are therefore unlikely to be the origin of the excess positrons, which may have a more exotic origin.
Alloying of steel and graphite by hydrogen in nuclear reactor
NASA Astrophysics Data System (ADS)
Krasikov, E.
2017-02-01
In traditional power engineering, hydrogen can be one of the primary sources of equipment damage. This problem is highly relevant for both nuclear and thermonuclear power engineering. The study of radiation-hydrogen embrittlement of reactor steel raises the question of an unknown source of hydrogen in reactors. Unexpectedly high hydrogen concentrations were later detected in irradiated graphite. It is necessary to look for this source of hydrogen, especially because hydrogen flakes have been detected in the reactor vessels of Belgian NPPs. As a possible initial hypothesis about this enigmatic source of hydrogen, one can propose proton generation during the beta decay of free neutrons, inasmuch as such protons have been detected by researchers at nuclear reactors as a signature of the beta decay of free neutrons.
NASA Astrophysics Data System (ADS)
Bonhoff, H. A.; Petersson, B. A. T.
2010-08-01
For the characterization of structure-borne sound sources with multi-point or continuous interfaces, substantial simplifications and physical insight can be obtained by incorporating the concept of interface mobilities. The applicability of interface mobilities, however, relies upon the admissibility of neglecting the so-called cross-order terms. Hence, the objective of the present paper is to clarify the importance and significance of cross-order terms for the characterization of vibrational sources. From previous studies, four conditions have been identified under which the cross-order terms can become more influential: non-circular interface geometries, structures with distinctly differing transfer paths, suppression of the zero-order motion, and cases where the contact forces are either in phase or out of phase. In a theoretical study, these four conditions are investigated with regard to the frequency range and magnitude of a possible strengthening of the cross-order terms. For an experimental analysis, two source-receiver installations are selected, suitably designed to obtain strong cross-order terms. The transmitted power and the source descriptors are predicted by the approximations of the interface mobility approach and compared with the complete calculations. Neglecting the cross-order terms can result in large misinterpretations at certain frequencies. On average, however, the cross-order terms are found to be insignificant and can be neglected to a good approximation. The general applicability of interface mobilities for structure-borne sound source characterization, and for the description of the transmission process, is thereby confirmed.
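For reference, in the interface-mobility framework the transmitted complex power can be written as an order expansion in which the equal-order and cross-order contributions separate as sketched below (schematic notation assumed here, not quoted from the abstract):

```latex
% Complex power transmitted across the interface, expanded in spatial
% Fourier orders p, q of the contact force distribution; Y_{pq} are the
% interface mobilities. The approximation examined in the paper retains
% only the equal-order (p = q) terms.
Q = \tfrac{1}{2} \sum_{p} \sum_{q} \hat{F}_p^{\ast}\, Y_{pq}\, \hat{F}_q
  = \underbrace{\tfrac{1}{2}\sum_{p} \hat{F}_p^{\ast}\, Y_{pp}\, \hat{F}_p}_{\text{equal-order terms}}
  \;+\; \underbrace{\tfrac{1}{2}\sum_{p}\sum_{q \neq p} \hat{F}_p^{\ast}\, Y_{pq}\, \hat{F}_q}_{\text{cross-order terms}}.
```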
Measurement method bias in games for health research
USDA-ARS?s Scientific Manuscript database
Studies of games for health often use self-report measures, which are subject to many known and unknown sources of error. When two self-report measures are used in the same study, they tend to be moderately to highly correlated. This is not because there is a true moderate to high correlation, but because both measures share common method bias.
Incidental L2 Vocabulary Acquisition "from" and "while" Reading: An Eye-Tracking Study
ERIC Educational Resources Information Center
Pellicer-Sánchez, Ana
2016-01-01
Previous studies have shown that reading is an important source of incidental second language (L2) vocabulary acquisition. However, we still do not have a clear picture of what happens when readers encounter unknown words. Combining offline (vocabulary tests) and online (eye-tracking) measures, the incidental acquisition of vocabulary knowledge…
The significance of the many wastewater discharges in the Gulf of Mexico region as sources of trace metal contamination to indigenous biota in nearby coastal areas is relatively unknown. The primary objective of this baseline survey was to provide some insight on this issue by d...
The Beaks of Finches & the Tool Analogy: Use with Care
ERIC Educational Resources Information Center
Milne, Catherine
2008-01-01
Analogies are an integral feature of scientific theories, like evolution. They are developed to support explanations, proposed on the basis of evidence collected from experimental studies, field studies, and other observational studies. They map a known source or process to an unknown or target with the goal of helping educators understand the…
God Forbids or Mom Disapproves? Religious Beliefs that Prevent Drug Use among Youth
ERIC Educational Resources Information Center
Sanchez, Zila M.; Opaleye, Emerita Satiro; Chaves, Tharcila V.; Noto, Ana R.; Nappo, Solange A.
2011-01-01
Research has emphasized religiosity as a protective factor against drug use, although the mechanism through which this protection operates is still unknown. This article aims to explore religious beliefs that could prevent drug use among youth. Three sources of qualitative data were used: participant observation in 21 religious institutions, semistructured…